Why digital acceleration has created more opportunities for deepfake fraud tactics like voice cloning and what businesses can do about it
Digital acceleration has placed information and services in the hands of the masses, connecting individuals on a global scale like never before and, in turn, making them increasingly dependent on devices in their daily lives.
The argument for technology as an equalizer in society is a strong one. Most people have a voice and a platform, producing millions of virtual interactions and recordings every day. But in this digital world of relative anonymity, it is difficult to know who is really on the other side of the connection. This uncertainty gives fraudsters an opening to threaten both businesses and consumers directly, especially in the realm of deepfakes.
What is a deepfake?
Deepfakes are artificially created images, video, and audio designed to emulate real human characteristics. Deepfakes use a form of artificial intelligence (AI) called deep learning. A deep learning algorithm can teach itself to solve problems using large sets of data, swapping out voices and faces wherever they appear in audio and video. This technology can deliver extraordinary outcomes across accessibility, criminal forensics, and entertainment, but it also opens a door for cybercriminals that did not exist before.
Deepfake fraud tactics
A principal deepfake fraud tactic is voice cloning: taking sample snippets of a person's recorded speech, then using AI to learn the speech patterns in those samples. Based on what the model learns, a fraudster can apply the cloned voice to new contexts, generating speech the actual voice owner never spoke.
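At a high level, the cloning workflow described above has three stages. The sketch below illustrates those stages only; the function and class names are hypothetical placeholders, and no actual audio model is implemented:

```python
# Conceptual sketch of the voice-cloning workflow: collect samples,
# learn the speaker's patterns, then synthesize speech never spoken.
# All names here are illustrative stubs, not a real library's API.

from dataclasses import dataclass


@dataclass
class VoiceModel:
    """Stands in for a trained speaker model (speech patterns, timbre)."""
    speaker: str
    n_samples: int


def extract_samples(recordings):
    # Stage 1: gather short snippets of the target's recorded speech,
    # discarding empty clips.
    return [r for r in recordings if len(r) > 0]


def train_voice_model(speaker, samples):
    # Stage 2: in a real attack, a deep-learning model would learn
    # the speaker's voice from the samples; here we just record counts.
    return VoiceModel(speaker=speaker, n_samples=len(samples))


def synthesize(model, text):
    # Stage 3: generate new speech in the cloned voice.
    return f"[cloned voice of {model.speaker}] {text}"


samples = extract_samples(["hi there", "meeting at nine", ""])
model = train_voice_model("target", samples)
print(synthesize(model, "Please transfer the funds today."))
```

The point of the sketch is that each stage needs only publicly available inputs: a few seconds of recorded speech from a call, voicemail, or social post can be enough to seed stage 1.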
For businesses, deepfake tactics such as voice cloning mean exposed points of vulnerability in authentication processes that can put organizations at risk; fraudsters may bypass biometric systems to access areas that would otherwise be restricted. For government leaders, they can mean the proliferation of misinformation, a growing area of concern with huge repercussions. For consumers, the risk of falling victim to scams targeting personal information or funds is particularly high when it comes to voice cloning.
How to prevent deepfake fraud
1. Vigilance: Stay on top of sensitive personal information that could be targeted. Fraudsters are always at work, relentlessly seeking out opportunities to take advantage of any loophole or weak spot. Pay close attention to suspicious voice messages or calls that may sound like someone familiar yet feel slightly off. In an era of remote work, it is important to question interactions that can impact business vulnerabilities – could it be a phishing or complex social engineering scam?
2. Machine learning and advanced analytics: Deepfake fraud is an emerging threat that evolves alongside the technology fueling it. The flip side is that businesses can use that same technology against fraudsters, fighting fire with fire by deploying deepfake detection and analysis.
3. Layered fraud prevention strategy: Machine learning and advanced analytics can only be effective against deepfake fraud within a layered strategy of defense, and most importantly at the first line of defense. Ensuring that only genuine users reach points of vulnerability means running identification checks simultaneously: identity verification, device ID and intelligence, behavioral analytics, and document verification. Together, these counter the ways fraudsters may deploy or distribute deepfakes within the ecosystem.
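The layered approach in step 3 can be sketched as a set of independent checks that must all pass before access is granted. This is a minimal illustration, assuming hypothetical check functions and session fields; a real deployment would call out to detection and verification services:

```python
# Minimal sketch of a layered verification decision. The check
# functions and session keys below are illustrative assumptions.

def device_check(session):
    # Device ID and intelligence: is this a known, trusted device?
    return session.get("device_known", False)


def behavior_check(session):
    # Behavioral analytics: score in [0, 1] from typing/navigation patterns.
    return session.get("behavior_score", 0.0) >= 0.7


def document_check(session):
    # Document verification result from an identity-proofing step.
    return session.get("document_valid", False)


def voice_liveness_check(session):
    # Deepfake detection: did the voice sample pass liveness analysis?
    return session.get("voice_live", False)


LAYERS = [device_check, behavior_check, document_check, voice_liveness_check]


def verify(session):
    # Every layer must pass; one failed layer blocks access, so a cloned
    # voice alone cannot succeed if device or behavior signals look wrong.
    return all(check(session) for check in LAYERS)


session = {
    "device_known": True,
    "behavior_score": 0.9,
    "document_valid": True,
    "voice_live": False,  # convincing clone, but liveness fails
}
print(verify(session))
```

The design choice is deliberate: because each layer observes a different signal, a fraudster who defeats the voice biometric still has to defeat the device, behavioral, and document layers simultaneously.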
As with many types of fraud, staying one step ahead of the fraudsters is critical. The technology and the tactics continually evolve, which may render today's countermeasures obsolete. However, the fundamentals of sound risk management, combined with the right layered approach and a flexible, dynamic solution set, can mitigate these emerging threats.