Woman warns that phone scammers are using AI to mimic relatives’ voices: ‘Beyond evil’

Phone scammers are getting increasingly crafty these days, making it harder than ever for everyday people to decipher what’s real and what’s fake.

Case in point: One woman’s recent TikTok shows just how low scammers are willing to go to swindle innocent people, and the results can be absolutely heartbreaking.

“So I look like an emotional wreck right now because I’ve been crying for the past two hours, because I thought my little brother was dead,” Brooke Bush (@babybushwhacked) confesses at the start of her recent TikTok.

But her reason for thinking this wasn’t due to random paranoia: It was because a phone scammer allegedly used artificial intelligence (AI) technology to trick her grandfather into thinking her brother had been in a wreck.


According to Bush, the caller used AI voice-cloning technology to mimic her brother’s voice, then called the elderly man pretending to be his grandson.

“Oh, I’m about to get in a wreck!” the caller allegedly said before the call cut out and the line went dead.

Naturally, this horrified Bush’s grandpa, who didn’t know what to do. Moments later, he called his granddaughter in a panic, which caused her to immediately think the worst: Her brother had died.

“I started driving, trying to find his location, he wasn’t picking up,” Bush recalls in the TikTok.

It wasn’t until a considerable amount of time had passed, and she had spiraled into full-blown panic, that she ultimately learned the truth: A scammer had used AI technology to pretend to be her little brother.

Turns out, it was all part of an elaborate hoax designed to panic the grandfather and force him to send money to the stranger on the other end of the phone.

As Bush goes on to explain, the scammer then called her grandfather a second time, claiming that Bush’s brother had gone to jail after killing someone in the car wreck and now needed bail money.

“All for money, he acted like my little brother almost died,” Bush says, while fighting back tears. “If you guys ever get called and it’s someone asking you for money that you know, they’re using a freaking AI machine to reenact their voice. How evil?”

Bush shared her story in the hope of warning others. And judging by the comments, this scam is already widespread.

“this is the second person on here that has said this happened to them,” one person shared.

“this has been happening to A LOT of people,” someone else added.

“this happened a few years ago to my grandpa,” another TikToker shared. “they called him & told him that my cousin was drinking (he doesn’t drink) & driving & need bail money!”

Most people were simply horrified that something this elaborate could be happening so frequently. Some called it “scary,” “cruel” and “beyond evil.”

“That’s insane,” one person commented. “I’m so sorry you had to go through that. A.I voice filters are seriously getting out of hand.”

Others tried to offer some advice on how to avoid something like this in the future.

“my family and I have created a code word in case of situations like this,” one person shared. “if they get a call and that word isn’t said then they’ll know it’s not real.”

“time for us to all have safe words,” another person added.
