Scammers can use AI tools to clone the voices of you and your family—how to protect yourself


A robocall impersonating President Joe Biden urged New Hampshire voters not to participate in Tuesday's presidential primary — and it probably won't be the last AI voice scam this election season.

"Of course, this will be used by foreign nation states just like the trolling farms they already have. This is just another weapon in the arsenal," Adrianus Warmenhoven, a cybersecurity expert at NordVPN, tells CNBC Make It.

The fraudulent robocall begins by saying, "What a bunch of malarkey," in a voice that sounds like President Biden's, according to NBC News. The call then instructs recipients to "save your vote for the November election" and sit out the nation's first presidential primary, in which voters choose their political party's nominee for the general election.

The New Hampshire Attorney General's office says it has launched an investigation following a series of complaints about the robocalls.

"Although the voice in the robocall sounds like the voice of President Biden, this message appears to be artificially generated based on initial indications," the attorney general's office said in a Jan. 22 statement. "These messages appear to be an unlawful attempt to disrupt the New Hampshire Presidential Primary Election and to suppress New Hampshire voters."

Thanks to the rapid development of the type of AI technology used to clone and mimic people's voices, these types of AI-powered schemes are becoming more common — and scammers aren't just spoofing well-known public figures.

In March, the Federal Trade Commission issued a consumer alert warning that scammers could use AI technology to clone a family member's voice in order to convince people to send them money.

How to protect yourself from AI voice scams

Although it's getting harder to differentiate between what's real and what may be an AI-generated deepfake, there are a couple of steps you can take to protect yourself.

1. Fact-check and verify

When it comes to AI voice scams like the one involving Biden, you should double-check that what you're hearing is actually true, Warmenhoven says.

Although the spoofed version of President Biden's voice told New Hampshire voters to "save" their votes for the November general election, a quick Google search reveals that registered Republicans and undeclared voters could cast their ballots in Tuesday's primary.

If you suspect you're being targeted by a cybercriminal, you can also check online to see if there have been any recent reports of AI-generated voice scams, Warmenhoven says.

2. End the call

If you decide to answer a call from an unknown number and it sounds like a panicked family member is asking you for money, try not to panic yourself.

Instead, end the call and try calling or texting the person using a number they've previously given you, rather than the one that just called, to check that they're OK, Warmenhoven says.

The same goes if you receive a call from a scammer pretending to be your bank. Don't immediately believe their claims. Instead, end the call and dial the number listed on your bank's website or on the back of your debit or credit card, he says.

It's especially important to get off the phone with these types of scammers quickly, because they don't need a long clip of your voice to clone it with AI.

"Even if you just say 'Hello? Is anybody there ?' or just mumble something, you'll give up lots of words and inflections in your voice and then [a scammer] can clone your voice," Warmenhoven says.

In May, McAfee researchers found that free online tools could clone someone's voice using just three seconds of audio, per the security software company's "The Artificial Imposter" report.

"Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person's voice and deceive a close contact into sending money," Steve Grobman, McAfee's chief technology officer, said in the report. "It's important to remain vigilant and to take proactive steps to keep you and your loved ones safe."
