DeepFake voice generation software presents a new frontier for fraudsters

Voice DeepFake being used to scam businesses out of thousands

With Artificial Intelligence (AI) technology evolving rapidly, we’re seeing it put to both good and bad use – sometimes leading to crime, and in particular fraud.

A recent case which has caught our attention involves a group of criminals using AI voice-mimicking software to impersonate the chief executive of the German parent company of a UK-based energy provider. Using this software, the fraudsters contacted the UK company’s CEO and requested a €220,000 transfer to a Hungarian supplier. Once the money had been transferred, the fraudsters called again, claiming it was about to be reimbursed; it was only on receiving a third call, this time from an Austrian number, that the CEO began to suspect foul play.

After arriving in the Hungarian account, the funds were moved on to Mexico and other locations, and authorities have yet to identify any suspects. It is clear the scam was intricately planned, drawing on an intimate knowledge of the company’s structure – which is how the fraudsters were able to extract such a sizeable sum before being detected.

In recent years, we’ve seen AI used as a tool to identify and prevent fraud, but now the tide is turning, with AI mimicry being used to fool our eyes (in the form of deepfake videos) and now our ears. This rise in AI-enabled fraud has been noted by authorities and leading experts, who have described AI as the ‘new frontier for fraud’. Pindrop, a company specialising in security software for call centres, recorded a 350% rise in voice fraud between 2013 and 2017, further indicating the growing popularity of this type of scam.

Don’t believe everything you hear…

By impersonating someone’s voice over the phone, fraudsters are able to obtain private information or authorise payments for criminal gain. This presents a particular threat to businesses, especially large companies with more capital on hand. With a number of convincing commercial voice-generating tools now available to the public, it’s highly likely we will continue to see a rise in voice impersonation fraud.

Addressing this issue, our Cyber & Financial Lines Underwriting Manager, Matt Drinkwater, said:

“With crime getting more sophisticated as technology develops, it’s so important for business owners to be alert and aware of the scams they could fall victim to. 

“There are various preventative steps businesses can take. Always be wary if a phone call sounds suspicious, especially if the caller is asking for private information or for a financial transaction to be made. If you are sceptical of any call, hang up and contact the person who supposedly phoned you directly, using details you already hold, to double-check the validity of their request. And insurance can offer peace of mind and protection should the worst happen.”

For more information about our CyberSafe Insurance or cyber crime protection, contact your local NMU branch or our Cyber & Financial Lines Underwriting Manager, Matt Drinkwater.

