[imagesource:pexels]
In an era where technological marvels seamlessly integrate into our daily lives, even the innocuous WhatsApp voice note has taken on a sinister guise as cybercriminals exploit generative Artificial Intelligence (AI) to clone voices.
AI is both an exciting new frontier and terrifying untested waters. Voice manipulation isn't far off from many of the other new explorations in this cutting-edge technology, like the AI orbs that can scan pedestrians' eyeballs or photo-editing AI that can help bring old photos back to life.
That being said, there’s always a risk of criminal activity as we try to figure out the boundaries of the technology itself.
A mere one-minute recording is all it takes for deep fake technology, fuelled by generative AI, to craft eerily convincing voice imitations.
Although generative AI holds the potential to revolutionise business operations, content creation, and data analysis, its transformative capabilities can be manipulated by malicious actors, leading to disconcertingly lifelike deep fakes and voice scams that amass significant financial losses.
The impact of voice cloning technology was starkly evident in incidents like the 2019 impersonation of a UK energy company CEO, and a $35-million (R668 million) bank scam in Hong Kong uncovered in 2021.
According to Stephen Osler, co-founder at Nclose, scammers can capture a person’s voice with a few seconds of recorded audio, exploiting the prevalent use of voice notes by busy professionals to orchestrate cybercrime. These manipulated voice notes, gleaned from platforms like WhatsApp, Facebook Messenger, phone calls, and social media posts, undergo AI-driven alterations that make them sound convincingly authentic.
While businesses can enhance their cybersecurity defences through processes like multi-level authentication, individuals remain vulnerable to vishing (voice phishing) scams.
Vigilance and awareness are paramount, and it's essential to be prudent when sharing snippets of personal information like addresses, birthdates, and phone numbers. All of this is making me nervous about how much we share online in general, given the growing risks. Be right back, just changing all my passwords…
Even more concerning is the reality that the voice of friends or family can be cloned, along with their caller IDs, amplifying the deception.
Matthew Wright, a computing security professor at the Rochester Institute of Technology, suggests 'knowing oneself' as the best defence strategy. While that may sound a bit cheesy, Wright is talking more about your ability to critically read deep fakes than your internal zen-ness.
“Specifically, know your intellectual and emotional biases and vulnerabilities. This is good life advice in general, but it is key to protect yourself from being manipulated … Scammers typically seek to suss out and then prey on your financial anxieties, your political attachments or other inclinations, whatever those may be.”
Next time you’re sharing your deepest darkest secrets with your bestie via a voice note, be aware that your voice is now out there in virtual space.
Though I doubt cybercriminals are looking to manipulate the latest gossip between you and your bestie. Unless it's really juicy.
[source:thecitizen]