
Preparing for the age of AI scams

A Wehead, an AI companion that can use ChatGPT, is seen during Pepcom's Digital Experience at the The Mirage resort during the Consumer Electronics Show (CES) in Las Vegas, Nevada.

Imagine a loved one calling you in a panic, asking for help: maybe they've just been arrested or kidnapped and need money immediately. What would you do?

Here’s the thing: the voice on the other end of the line might not be them. It could be AI.

Artificial intelligence is now making it possible to clone someone’s voice – and use it to trick family or friends. Scammers are taking advantage of the technology to con panicked loved ones out of hundreds and sometimes thousands of dollars. AI is also being used to devise more realistic romance scams and to create AI-generated videos, known as deepfakes. Recently, a Taylor Swift deepfake was used in a video to shill pots and pans to unwitting fans.

Washington has been watching. A bipartisan group of House lawmakers introduced the No AI Fraud Act this month. The bill would protect Americans’ likenesses and voices against AI-generated fakes. Earlier this month, the FTC created a competition with an award of $25,000 for the best ideas to protect consumers from these scams. And in November, the Senate Special Committee on Aging held a hearing on this kind of fraud and how to address it.

We learn more about these scams and what people can do to protect themselves from falling victim.

Some tips from our guests:

  • If you suspect a voice clone scam, try to interrupt the caller and ask a question
  • Ask a question only that person would know
  • Establish a password with family and friends
  • Don’t send money through untraceable means like gift cards or cryptocurrency
  • Report all instances of fraud here: ReportFraud.ftc.gov

Copyright 2024 WAMU 88.5

Michelle Harven