TL;DR – don’t just take a recorded message (e.g. voicemail) as authority to take action; verify through another channel before moving those $$$ to an offshore account…
Aussie businesses are being warned to treat audio messages with caution after a series of ‘deep fake’ recordings were used to scam senior financial managers overseas earlier this month.
As first reported by the BBC, cyber security firm Symantec has tracked at least three successful attacks on private companies in which chief executives were impersonated to convince financial managers to transfer money.
In each case, the fraudsters employed so-called ‘deep fake’ technology, where artificial intelligence programs are used to manipulate audio or video for the purpose of impersonating someone.
Any accessible audio sources can be harvested for information, including conferences, keynotes, presentations and media appearances like podcasts.
‘Deep fake’ technology has grown in sophistication in recent years, and the scam activity has cyber security experts worried, particularly given that what may seem like a routine conversation with a colleague could be completely faked.
Nik Devidas, managing director of Rock IT, says that while compiling a deep fake still requires a substantial amount of accessible audio, prices are falling on the black market as the technology becomes more widely available.