Imagine a loved one calls you in a panic asking for help: maybe they've just been arrested or kidnapped and need money immediately. What would you do?
Here's the thing: the voice on the other end of the line might not be theirs. It could be AI.
Artificial intelligence is now making it possible to clone someone's voice and use it to trick family or friends. Scammers are taking advantage of the technology to con panicked loved ones out of hundreds, and sometimes thousands, of dollars. AI is also being used to craft more convincing romance scams and AI-generated videos, known as deepfakes. Recently, a deepfake of Taylor Swift was used in a video to shill pots and pans to unwitting fans.
Washington has been watching. A bipartisan group of House lawmakers introduced the No AI Fraud Act this month. The bill would protect Americans' likenesses and voices against AI-generated fakes. Earlier this month, the FTC launched a competition offering a $25,000 prize for the best ideas to protect consumers from these scams. And in November, the Senate Special Committee on Aging held a hearing on this kind of fraud and how to address it.
How can we identify these scams? What can we do to protect ourselves from falling victim?
Find more of our programs online.