My good friend Dr. Marty Jablow likes to say about AI, "Don't check your brain at the door." I couldn't agree more with him. As a matter of fact, I wish I had come up with that.
That quote really applies to today's post. AI voice scams are on the rise, so I highly advise people to set up a 'code word' with loved ones: an unusual word that can't be guessed and that instantly identifies the caller. For instance, your child might call and say, 'Hi mom. I need you to send me some money. ORANGE PEEL,' and then continue the conversation. No one is going to say 'orange peel' in the middle of a call, yet if you've set up that protocol, the mother in the example will immediately know it's her child on the phone. Why am I bringing this up? Because I recently got some info from the AI ad generator Bestever that I'd like to share...
The CEO of OpenAI, Sam Altman, recently warned of an impending “AI fraud crisis.” We have already seen more sophisticated scams come into play thanks to the evolution of AI, and that has sparked concern nationwide.
This is why experts have issued an urgent warning about an AI voice scam that is costing people thousands. Apoorva Govind of the AI ad generator Bestever has issued some urgent guidance on how to spot this latest scam.
In this elaborate new scam, fraudsters use AI technology to impersonate another person’s voice, whether that of a celebrity or even a work colleague.
Apoorva Govind commented: “One of the most concerning uses of AI voice scams is fraudsters utilizing technology to impersonate the voice of a family member, often faking an urgent emergency that requires funds to be sent over as soon as possible.
“Scammers can now impersonate family members’ voices by gathering samples from publicly available video clips, such as those that have been shared on social media.
“After using AI technology to generate realistic audio, scammers will then initiate a call to ask for funds to be transferred as soon as possible.”
In one recent case, a woman was scammed out of $15,000 due to an AI-cloned audio of her daughter’s voice. The scammer used AI technology to pretend that her daughter had been in a car crash, before urgently asking for money to be transferred over.
By setting up these extreme situations, scammers intend for their victims to abandon their sense of reason when they hear a “family member” in need, which is why this type of scam has become so prevalent in recent weeks.
How can I avoid falling for these scams?
If you receive a suspicious call from someone claiming to be a “family member,” hang up immediately and call back using a number you know to be legitimate.
Often, scammers will pretend that they’ve “lost their phone” to explain why they’re calling from an unknown number, but it’s crucial to verify the caller’s identity before sending any money. A quick text or call to the family member in question is often enough to expose these “emergency” calls as scams.
Apoorva Govind commented: “Criminals rely on creating a false sense of urgency to prevent you from taking this simple verification step, but don’t let this sway you from quickly calling back using a known number.
“It’s also important to watch out for any unusual payment requests. Scammers typically ask for actions that legitimate callers would not, such as requesting immediate payments through gift cards or cryptocurrency. Any unusual payment method should immediately raise concerns.
“Question the urgency behind any rushed financial request, and ask yourself: ‘Would an actual family member place this much pressure on me to send money?’
“It is also becoming increasingly prevalent for scammers to impersonate the voices of company bosses and CEOs. This is why every organization should have a protocol in place to prevent employees from falling for this type of scam.
“Ensure that all staff members are aware that managers and CEOs would only make contact through a known and secure channel, such as company messaging services, rather than calling from an unknown number.
“As scams continue to get more sophisticated, it’s important to keep family members in the loop. Reach out to your family and let them know about any new scams circulating to prevent scammers from targeting older relatives who may be less familiar with technology.
“Taking time to set up a ‘safe phrase’ with family members can also add an additional level of reassurance when receiving an unexpected phone call. This can be an emergency code that’s something only you and your loved ones know about.”
Scammers tend to use social media as their first port of call when it comes to cloning voices, which is why it’s so important to be mindful of what you share on your accounts.
While you may believe that you wouldn’t fall for one of these AI-voice scams, limiting the content you share publicly can ultimately protect your family members from this type of scam in the future.
Setting your social media accounts to private can also prevent fraudsters from obtaining personal video clips that could be used maliciously against family members.
Apoorva Govind, the Founder and CEO at Bestever, commented on the threat of rising AI voice scams:
“Recent statistics have shown that there has been an increase in AI voice cloning attacks, particularly targeting vulnerable people who are less familiar with technology.
“The technology to create these convincing voice clones is now widely available, with some apps costing as little as $5-10 per month and others being completely free.
“Having an awareness of these scams is often the first line of defense. Ensure that you’re keeping your family members in the loop and ensure that you’re being mindful of the content you share on social media to prevent personal information from being used maliciously by criminals.”
This information was provided by the AI Ad Generator, Bestever.
