AI can animate your face and apply faked audio to it quickly and easily, experts warn
AI can generate video from a single still image
(Web Desk) - Artificial "deepfake" videos of you take just minutes to make – and require only a single photo.
That's the warning from a security expert who says artificial intelligence apps can animate your face and apply faked audio to it very quickly and easily.
Deepfakes are artificial videos of people created using AI based on existing images.
And modern deepfakes can even include faked audio with artificial voiceovers, all created using AI.
AI is now so powerful that it can generate convincing video from a single still image, according to security expert Matt Sparrow.
"We are already there. I could take a picture right now, and within minutes could have you sing Metallica Ride the Lightning," said Matt, a senior intelligence operations analyst at Centripetal, speaking to The U.S. Sun.
Security experts are increasingly worried about how difficult it's becoming to tell deepfakes apart from the real thing.
It used to be easy to spot a fraudulent AI image by looking out for extra fingers or strange visual defects.
Now the advice is to consider whether the video actually makes sense, or whether anything about it seems suspicious.
If it's making a bold claim or asking you to make an urgent decision, that's a red flag.
"People have to trust but verify," Matt told us.
"If I don’t get some sort of communication from you prior to a big ask or request of assets, then I am going to reach out to you through a known and trusted contact method.
"Unfortunately, long pauses and voice conversations are subjective. It’s hard to spot and tell. The technology is too good."
Now that AI is so advanced and easily accessible, criminals are able to use it for a wide range of scams.
Crooks can clone your voice in a matter of seconds with AI tools – or use chatbots to quickly generate convincing scam content to hoodwink you.
"AI is being used to rewrite emails, malicious actors are using it for phishing and social engineering campaigns," the security expert explained.
"People in general should just be wary and stop giving up free information. There’s nothing wrong with you saying ‘no’ to things.
"Other scams that happen stem from data breaches. Just recently there was a data breach that involved criminal records.
"I wholeheartedly anticipate that those people and those around them will be getting some form of scam calls.
"They’ll get things like ‘hey you owe an outstanding balance for fines’ or maybe ‘this person said they could use you as a reference.’
"Anytime there is a data breach, people are constantly being bombarded with risk."