
Communications of the ACM

ACM TechNews

They Thought Loved Ones Were Calling for Help. It Was an AI Scam


In 2022, impostor scams were the second-most-common type of fraud in America, according to data from the Federal Trade Commission.

Technology is making it easier and cheaper for bad actors to mimic voices, convincing people, often the elderly, that their loved ones are in distress.

Credit: Elena Lacey/The Washington Post

More sophisticated artificial intelligence (AI) tools are being used to replicate a person's voice.

Fraudsters are increasingly using such tools for impostor scams, which often target the elderly, convincing them that loved ones are in trouble and urgently need cash.

Hany Farid of the University of California, Berkeley, said AI voice-generating software can recreate the pitch, timbre, and individual sounds of a person's voice from a short audio sample.

Said Farid, "If you have a Facebook page ... or if you've recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice."

Software from the startup ElevenLabs, for example, allows users to turn a short audio sample into a synthetically generated voice using a text-to-speech tool. The software is free or costs $5 to $330 per month, depending on the amount of audio generated.

From The Washington Post
View Full Article - May Require Paid Subscription

 

Abstracts Copyright © 2023 SmithBucklin, Washington, DC, USA


 
