I deepfaked myself. It was terrifyingly easy

Author: ICAEW Insights

Published: 07 Oct 2024

Cyber Security Awareness Month: Deepfakes are becoming an increasingly prevalent part of the cyber-security lexicon. So how easy is it for criminals to create a clone of someone? ICAEW’s Head of Data Analytics and Tech, Ian Pay, decided to experiment with the technology.

The topic of deepfakes has never been so big. As the AI technology behind them evolves rapidly, they are becoming increasingly convincing. Engineering company Arup recently fell victim to a deepfake scam to the tune of £20m, demonstrating just how dangerous this technology can be to unsuspecting and unprepared organisations. 

Deepfakes are, in essence, video avatars made to look and sound like a real person. They’ve been around for a number of years, with examples going back decades. But where producing a deepfake once took days or even weeks of video editing, often with patchy results, one can now be generated in a matter of minutes – and be convincing enough to deceive even the most sceptical individuals.

Generative AI technology can synthesise a person’s voice through text-to-speech conversion, and even respond in near real-time to written or verbal inputs or other stimuli. In the same way you can have written conversations with AI through OpenAI’s ChatGPT, Google’s Gemini or Anthropic’s Claude, you can talk to a digital avatar and it will talk back to you. At a basic level, the technology is not especially new, but it is now far more lifelike and capable of responding to any prompt, not just a predefined set.

While this technology has immense potential – for example, to translate videos into different languages without clunky subtitles or dubbing, or to help deliver training video content – it also carries immense risks. 

An AI me in under 15 minutes

There’s no shortage of websites out there that will help you create your own ‘digital avatar’. Some are more trustworthy than others. After a bit of research, I went with a site recommended to me by IT experts, which also had a robust (and crucially UK GDPR compliant) data privacy policy. Its commercial model also gave me confidence that it would not take my video and use it for illegitimate purposes.

The process for creating my avatar was very straightforward. All I had to do was record a two-minute video of myself reading from a given script, provide my consent (also by video) and then sit back and wait. After about 10 minutes, my avatar was ready. And it’s fair to say I was shocked. There on screen, in front of me, was me, talking about avatars and encouraging me to “share my feedback” – words I had never actually said.

I could have spent hours honing the script, tweaking and trimming the video. But at a simple level I was keen to understand how quickly I could turn a short script into video and audio of me.

And to cut to the chase: it was terrifyingly easy. I was able to generate audio from script almost instantly. With the free version of the platform, a short 15-second video took about two minutes to generate. As for how realistic it was – well, see for yourself.

[Video: Ian Pay’s AI avatar in action, hosted on StreamAMG – available on the ICAEW website.]

Now, of course, I could make a video using my avatar to say anything I wanted.

Family and co-workers who saw this video could tell it wasn’t me. Despite it broadly looking and sounding like me, its mannerisms and vocal inflections are not quite right. The problem is, a deepfake only has to be good enough to fool someone for a few critical moments – enough to get them to click a link, approve a transfer or hand over login details. And criminals will spend a lot more than 15 minutes honing a deepfake to make it as realistic as possible.

From avatar to deepfake – a small leap?

It is worth stating that getting from a pre-recorded avatar to an interactive deepfake is no small task – currently there is still some processing time involved in turning a script into a video. But it is not beyond criminals and savvy cyber specialists with sufficient processing power at their disposal.

While I’m not inclined to go digging around on the dark web for criminals who would happily create a likeness of anyone I wanted for a fee, we can say with certainty that such services exist for those who wish to seek them out.

Audio fakery, on the other hand, seems almost too easy. In my own experiment, the text-to-speech time was basically immediate. 

There are AI tools offering speech-to-speech capabilities. It wouldn’t be a bold leap to suggest that anyone with access to a couple of minutes of audio of someone’s voice could clone that voice with a high degree of accuracy, then use it in real-time conversations. A recent video featuring actor James Nesbitt talking to himself is utterly convincing. And bear in mind that phone calls use lower-quality audio, so as long as the overall tone and inflections are close, an AI voice clone on a call is highly likely to deceive.

Avoiding being deepfaked

All this leads to one very big question: how do I avoid being tricked by a deepfake audio or video? When they are this realistic, it is hard. There are, however, some very simple techniques.

Firstly, deepfake scams will almost always contact you out of the blue and impose some sort of time pressure on a decision. Criminals want to put you under pressure so that you don’t engage the rational part of your brain. They also know that the longer you are on a call with a deepfake, the more likely you are to doubt it.

The most important thing you can do, if you receive a call unexpectedly, is to take steps to confirm who you are talking to. It might seem flippant, but getting them to do or say something unexpected, or asking an out-of-context question, will typically be enough to wrongfoot a deepfake. 

Never, ever, disclose any information until you are confident that the person you’re speaking to is who they say they are. If in doubt, offer to call back or get in touch by other means.

It's also very important to have appropriate controls in place, both personally and professionally. Multi-factor authentication, approval processes, robust security questions and basic hygiene of personal data will all make it harder for criminals to act.

Finally, be careful what you share online. While your role may make it hard to completely avoid an online presence, you can reduce the risk of being deepfaked by moderating what you share publicly and not giving too much of yourself away. 

A deepfake of you is more likely to be used to attack your friends, family or colleagues than to target you directly. So if you want to protect those you care about, the most important steps are the ones that prevent you from being deepfaked in the first place.

Cyber security awareness

Each year ICAEW marks global Cyber Security Awareness month with a series of resources addressing the latest issues and how to protect your business.