
Scammers Could Use Fake AI Kidnappings to Torment People

Photo: breakermaximus (Shutterstock)

Rapid advances in generative AI models capable of mimicking voices could enable a new genre of particularly horrific scams: fake AI kidnappings. Scammers and extortionists, the report warns, can use an AI-generated clone of a loved one's voice to make it seem like that person is in distress, then demand a ransom.

This particular threat isn't hypothetical. Earlier this year, an Arizona mother named Jennifer DeStefano testified at a Senate hearing, recounting in chilling detail how a scammer looking to make a quick buck used a deepfake clone of her teenage daughter's voice to make it appear as if the girl had been kidnapped and was in danger.

"Mom, I messed up," the deepfaked voice reportedly said between sobs. "Mom, these bad men have me, help me, help me." Attacks like these, the report warns, could become even more common as the technology evolves and the quality of audio clones improves.