FBI reports growing use of generative AI for fraud

Dec 17, 2024 - 03:53
ST. LOUIS - Grandfather Mike Capstick’s heart started beating faster when he heard the voice of his daughter on the other end of the phone earlier this year—the voice that told him she was in a crash with her kids and needed help. 

“It was scary,” he said.

Capstick determined his daughter was okay by texting her. He believes fraudsters cloned her voice by using artificial intelligence.

“You would not be able to tell the difference,” Capstick said. “I didn’t, and it was my daughter.”

Generative AI is an increasingly common tool for fraudsters, who use it for crimes ranging from voice cloning to email phishing schemes designed to obtain bank information, according to the Federal Bureau of Investigation.

“Entire websites can be built using this generative AI tool that unfortunately folks are using to perpetrate fraud,” said Gregg Heeb, assistant special agent in charge at the FBI’s St. Louis field office.

According to Heeb, these crimes have become commonplace, including in the St. Louis area. He hopes that warning the public will make people less likely to fall victim to fraudsters.

“Not only folks that are here in our region that might be operating out of their homes and their basements but also nation-state actors that have really well-organized operations to perpetrate these frauds,” he said.

The FBI encourages families to establish a code word they can use to verify any suspicious calls.

The FBI also recommends victims reach out to their bank immediately and report the crime to the Internet Crime Complaint Center.

“That’s the best way we can act to try and recover those funds,” Heeb said. 

Anyone who is not sure about a situation should remember the phrase, "Always doubt; check it out." 

That’s what Capstick did when he sent a text to his daughter to see if the voice was really her. He said his family has also since established a code word.

Capstick hopes that by sharing his story, others won’t feel ashamed if they do become victims.

“Listen to these kinds of stories and try to prepare in advance,” he said. 

