Millions of iPhone and Android owners warned of growing AI scam that's secretly raiding bank accounts | The Sun

BRITS are losing thousands of pounds to voice fraud, which has been supercharged by recent advancements in artificial intelligence (AI).

Nearly a quarter of Brits say they or a friend have already been targeted, according to research by cybersecurity giant McAfee.

Most victims of AI voice scams have lost money – and a whopping 40 per cent have lost more than £1,000, the study found.

The "access and ease" of AI tools is helping cybercriminals to scale their efforts in "increasingly convincing ways", says Vonny Gamot, head of Europe, Middle East and Africa at McAfee.

So, here are five tips on how to protect yourself against AI voice fraud:

Don’t share your voice online

Scammers can use AI to clone your voice from videos you share on social media.


As little as three seconds of audio is enough, according to McAfee, for an AI voice generator to produce a whole array of sentences that the person never actually said.

This can be used to target your friends and family over the phone or via a voice note.

Make sure your privacy and security settings mean your social media profiles are accessible only to those who know you.

And be careful accepting any friend or follow requests from people you don't know.

Think before you click and share

It can be tricky to know who exactly is in your social media network and how wide that chain goes.

After all, 'it's a small world'.

Think about the connections you have online and how far your sharing of your own or a friend's post might travel.

The wider your connections and the more you share, the more you may be opening yourself and others up to the risk of identity theft.

Be wary of unsolicited calls

Be vigilant when it comes to random phone calls and voice notes – especially if they concern money.

Don’t trust that the person over the phone is who they say they are, unless you can verify it yourself.

Unsolicited calls requesting personal or financial information are a major red flag, particularly if they come from an unknown number.

Bear in mind that cybercriminals can spoof phone numbers – so both the caller ID and the voice may suggest you're speaking with a friend or family member when you're not.

Stop, pause and think.

If you're having doubts, hang up and call the person directly.

Alternatively, you can try to verify the information before responding and definitely before sending money.

High-pressure tactics are a major red flag

Emotional manipulation and high-pressure tactics are techniques often deployed by scammers.

These fake scenarios might involve a loved one needing money, or perhaps they've been injured and need your financial details for a taxi.

A growing number of victims have described a kidnapping scam in which family members are pressured into sending money as a ransom.

Just last month, a mother shared a chilling warning about an AI phone scam where the criminals stole her daughter's voice and pretended they had kidnapped her in an attempt to steal $50,000 for her "safe return".

"It was 100 percent her voice," the mother, Jennifer DeStefano, said at the time.

"It was never a question of 'who is this?' It was completely her voice, it was her inflection, it was the way she would have cried – I never doubted for one second it was her. That was the freaky part that really got me to my core."

This brings us to our final and most important tip.

Have a code word

McAfee's research has found that voice-cloning tools are capable of replicating how a person speaks with up to 95 per cent accuracy.

Spotting the difference between real and fake certainly isn’t easy, especially if a scammer is using a high-pressure tactic to startle you and ramp up your adrenaline levels.

This is why having a code word among kids, family members and trusted close friends is important.

A code word must be something only they know, and it will be the final arrow in your quiver when it comes to combating voice fraud.

Make a plan to always ask for it if they call, text or email to ask for help, particularly if they’re older or more vulnerable. 

In a testimonial to McAfee, one person called Phyllis explained that this trick stopped the scammers right in their tracks.

“After receiving several ‘Grandma’ calls, in conversations with my grandsons, I asked if they would call me," she says.

"Each one said they would not want to upset me and if they were in trouble they would contact their parents.

"But, we also established a code sentence.

"Now when I ask the question to the person who is supposed to be my grandson, they hang up.”

Artificial Intelligence explained

Here’s what you need to know

  • Artificial intelligence, also known as AI, is a type of computer software
  • Typically, a computer will do what you tell it to do
  • But artificial intelligence simulates the human mind, and can make its own deductions, inferences or decisions
  • A simple computer might let you set an alarm to wake you up
  • But an AI system might scan your emails, work out that you’ve got a meeting tomorrow, and then set an alarm and plan a journey for you
  • AI tech is often “trained” – which means it observes something (potentially even a human) then learns about a task over time
  • For instance, an AI system can be fed thousands of photos of human faces, then generate photos of human faces all on its own
  • Some experts have raised concerns that humans will eventually lose control of super-intelligent AI
  • But the tech world is still divided over whether or not AI tech will eventually kill us all in a Terminator-style apocalypse
