Expert Insight: Mark James on Voice Cloning

What is it, what are the risks and how can we protect ourselves?

Privacy consultant Mark James has a wealth of experience working with a range of organisations to help them achieve GDPR (General Data Protection Regulation) compliance. He is also trained in Cyber Essentials and ISO 27001, giving him detailed knowledge of the security aspects of compliance.

Mark has worked as a DPO (data protection officer) for various organisations, including Longleat Safari Park and the Salvation Army. As a DPO, he has undertaken gap analyses, provided documentation support, conducted DPIAs (data protection impact assessments), and more.

Recently, we’ve been hearing more about voice cloning, and how it’s becoming increasingly sophisticated. The US FTC [Federal Trade Commission] even launched a voice cloning challenge.

But what exactly is voice cloning? What are the associated risks? And what can organisations do to protect themselves?

We put these questions to Mark.


Thanks for your time! First, what is voice cloning?

Voice cloning is a technology that uses machine learning algorithms and neural networks to replicate specific human voices. It involves analysing audio recordings to mimic the tone, pitch and other characteristics of a person’s voice.
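
To make 'analysing tone and pitch' a little more concrete, here's a minimal, illustrative Python sketch – not part of any particular cloning system – showing how a speaker's pitch contour and timbre features might be extracted from a recording. It assumes the open-source librosa library and a hypothetical file called voice_sample.wav.

```python
# Minimal sketch: extracting the kind of voice characteristics a cloning
# model analyses (pitch contour and timbre). Assumes the librosa library
# is installed and 'voice_sample.wav' is a hypothetical recording.
import librosa
import numpy as np

y, sr = librosa.load("voice_sample.wav", sr=None)

# Estimate the pitch (fundamental frequency) contour of the speaker
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
)

# MFCCs summarise the timbre ('colour') of the voice
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

print(f"Median pitch: {np.nanmedian(f0):.1f} Hz")
print(f"Timbre feature matrix shape: {mfcc.shape}")
```

A cloning model trains on features like these, drawn from many recordings of the target speaker, to synthesise new speech in that voice.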

The technology is increasingly used across diverse sectors, including the music industry, where it's used to create virtual artists and AI-generated vocals. It also has many other applications, including enhancing personalised digital experiences and helping people with speech impairments.

However, the technology raises serious ethical concerns and other risks, particularly when it's used by threat actors.

How might threat actors use voice cloning in cyber attacks?

In a number of different ways. For example, threat actors could use the technology in social engineering attacks, such as vishing [voice phishing] and spear-phishing attacks [using voice notes] – phishers aren’t limited to emails anymore.

These techniques could be used to conduct fraudulent phone calls or record fake voicemail messages while retaining anonymity, with the scams themselves ranging from requesting sensitive information to making threats or demanding a ransom.

Voice cloning can also be used to impersonate trusted individuals or authorities, such as bank representatives or healthcare providers. By convincingly mimicking a familiar voice, an attacker can trick the victim into revealing confidential information or performing other actions that compromise their security.

We’re even seeing voice cloning as a service – or ‘VCaaS’ – emerging on the dark web, where one group provides voice cloning services for other malicious actors. This poses a particularly significant threat, as it allows for widespread misuse of the technology.

I can’t emphasise enough that as AI and machine learning technologies improve, the sophistication of these attacks will increase, making them more difficult to detect and prevent. For this reason, awareness and vigilance are critical in defending against these threats.

Can you give an example of a cyber attack that used voice cloning?

One notable example occurred in the UAE, where cyber criminals used voice cloning to execute a substantial heist.

The criminals used AI-based 'deep voice' technology to clone the voice of a company director. This allowed them to impersonate the director convincingly, leading a branch manager in Hong Kong to authorise transfers amounting to $35 million [about £28 million].

The scheme involved at least 17 individuals, and the stolen funds were sent to bank accounts across the world.

How might voice cloning attacks evolve?

Such attacks will likely become increasingly sophisticated due to advancements in AI-driven technology. One future threat is real-time voice masking, which would allow cyber criminals to engage in live conversations while using cloned voices, making detection even more challenging.

Also, as AI and voice cloning technologies evolve, they’ll become better at mimicking the unique characteristics of individual voices. This increases the potential for more convincing impersonations and more effective disinformation campaigns.

However, with continual developments in security measures, such as MFA [multifactor authentication] and voice biometrics, the risks associated with voice cloning could be managed.

What can organisations do to protect themselves from voice cloning attacks?

For a start, move away from relying on email alone! Organisations that rely solely on email to communicate can leave themselves vulnerable to attacks. Instead, consider adopting a multi-channel communication strategy that combines emails and other written messages with phone calls, video conferencing and, where possible, in-person meetings.

Using the right cloud-based services can also help reduce the risk of being targeted by voice cloning attacks, as they should have advanced security measures in place to protect users. To find out which services are 'right' for you, you need to conduct your own due diligence: can the service provider give the assurances you require? For example, ISO 27001 certification is a great indication of a robust security posture, as is being able to provide evidence of GDPR compliance. Depending on your requirements, you may also want to look out for other frameworks and standards.

Organisations could also invest in digital provenance – the process of authenticating digital information. By investing in technologies that verify the source and integrity of digital content, organisations can better protect themselves from voice cloning and similar threats.
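
As a rough illustration of the principle behind digital provenance – and not a description of any specific product – the sketch below shows content being signed by its publisher and verified by a recipient before it's trusted. It's a minimal Python example assuming the widely used cryptography package; the keys and content are placeholders.

```python
# Minimal sketch of the idea behind digital provenance: the publisher signs
# a piece of content, and recipients verify the signature before trusting it.
# Assumes the 'cryptography' package; keys and content here are illustrative.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Publisher side: sign the audio (or any digital content) before release
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

audio_bytes = b"...contents of an audio message..."  # placeholder content
signature = private_key.sign(audio_bytes)

# Recipient side: verify integrity and origin before acting on the message
try:
    public_key.verify(signature, audio_bytes)
    print("Content is authentic and unmodified.")
except InvalidSignature:
    print("Warning: content failed verification - treat as untrusted.")
```

In practice, the publisher's public key would be distributed through a trusted channel, so a cloned voice message without a valid signature would fail this check.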

Finally, organisations should proactively assess threats and establish appropriate response protocols. This includes training employees on how to identify and respond to potential voice cloning attacks.


How we can help

Our team can support you in preventing voice cloning attacks. Through gap analysis assessments, DPIAs and organisation-wide elearning, we can assess your position and educate your staff to improve your resistance to attacks.


We hope you enjoyed this week’s edition of our ‘Expert Insight’ series. We’ll be back next week, chatting to another expert within the Group.

In the meantime, if you missed it, check out last week’s Expert Insight blog, where Group CEO Alan Calder gave us his expert insights into how to maintain GDPR and data privacy compliance in 2024.
