Voice cloning and celebrity impersonation are among the most common ways deepfakes are being used in scams.
While deepfake technology still seems to be in its infancy, there has been a recent uptick in malicious use of it around the world.
Deepfakes — videos that use artificial intelligence to create believable but fake depictions of real people — have become significantly more common online in recent months. Some easily accessible websites specialise in deepfake pornography of celebrities, often without the consent of the people depicted.
The first deepfake was made in 1997. However, the technology only became semi-believable and entered the public consciousness in 2019, when viral examples triggered major concerns about privacy and misinformation in a highly digital world.
Scammers have used these cutting-edge technologies to impersonate bankers, financial managers, and even close acquaintances, tricking people into making large money transfers and posing threats to businesses and individuals alike.
“The technology has become far more accessible to everyday users. Apps that can create moderately convincing deepfakes, often in real time, are available to anyone with a computer or a smartphone,” Subbarao Kambhampati, a professor of computer science at Arizona State University who has studied deepfake technology, told NBC.
“I think it’s pretty scary,” said one affected individual. “The problem is, I don’t know what you do about it. Do you just go underground and disappear?”
Voice-generated deepfakes
Voice-generated deepfakes are perhaps the fastest-growing threat. The technology and its user base have grown significantly because the data needed to clone a voice has become easier to collect: recordings of wealthy clients who make regular public appearances, including speeches, are often widely available on the internet.
“There’s a lot of audio content out there,” Vijay Balasubramaniyan, the chief executive and a founder of Pindrop, which reviews automatic voice-verification systems for eight of the 10 largest US lenders, told the NYTimes.
Pindrop is a service that businesses can use to verify the veracity of audio and track down fraudsters. The scams it has encountered have shifted from easy-to-recognise, robotic text-to-speech voices to genuinely human-sounding ones.
“Finding audio samples for everyday customers can also be as easy as conducting an online search — say, on social media apps like TikTok and Instagram — for the name of someone whose bank account information the scammers already have,” said Balasubramaniyan.
The sophistication of these impersonations is expected to increase over time.
“While scary deepfake demos are a staple of security conferences, real-life attacks are still extremely rare,” Brett Beranek, the general manager of security and biometrics at Nuance, a voice technology vendor that Microsoft acquired in 2021, told the NYTimes. The only successful breach of a Nuance customer, which occurred in October, took the attacker more than a dozen attempts to pull off.
Though businesses have to worry about such large-scale attacks, the main targets right now seem to be individuals. Attackers pair stolen private financial information with a target and impersonate the target’s banker to collect money.
Scammers can use any one of a handful of easily accessible programmes to spoof targets’ phone numbers, adding a layer of apparent legitimacy.
Video remains king
Meanwhile, video deepfakes have also grown more convincing and more capable of doing damage. Fake videos of celebrities hawking phoney services have begun to proliferate on Facebook, TikTok, X, and YouTube.
Most of the recent videos centre on blockchain investments and Elon Musk. They use his likeness and manipulate videos of several personalities, including Tucker Carlson and Bill Maher, to advertise and promote fraudulent investment platforms.
Musk, the owner of X, formerly known as Twitter, has promoted some cryptocurrencies in the past, making him extremely popular with scammers who use his image for their own gain.
Scammers can now also use real-time deepfake programmes to mimic celebrities on live video calls with potential victims.