Kyle Chua

Criminals Leverage AI To Clone Teen's Voice in U.S. Kidnapping Scam

Updated: Dec 19, 2023

Beware: modern artificial intelligence (AI) tools are now capable of convincingly cloning people's voices, down to the subtleties of how they sound and speak, and criminals are leveraging the technology for new scams.

Credit: Reuters

That was what one Arizona mom learned when she received a call from criminals claiming to have kidnapped her daughter.


"I pick up the phone, and I hear my daughter’s voice, and it says, 'Mom!' and she’s sobbing," Jennifer DeStefano told WKYT. "I said, 'What happened?' And she said, 'Mom, I messed up,' and she’s sobbing and crying." A man then took over the call, informing the distraught mother that her daughter had been kidnapped and demanding a ransom of US$1 million, which he later lowered to US$50,000 when she said she didn't have the funds. DeStefano's daughter was away on a skiing trip at the time, so she initially believed something had happened to her.


Fortunately for DeStefano, she happened to be at her other daughter's dance studio when she received the call, surrounded by worried mothers who offered to help. She reportedly kept the man on the line while one of the mothers called 911 and another called DeStefano's husband. Within four minutes, DeStefano was able to confirm her daughter was safe. The identity of the criminal, however, remains unknown.


DeStefano said the cloned voice was a dead ringer for her daughter's. "I never doubted for one second it was her," she said. "That’s the freaky part that really got me to my core."

Jennifer DeStefano talking to KPHO about the incident. Credit: AZFamily | Arizona News YouTube

The mother also said her daughter doesn't have any public social media accounts that feature her voice. The 15-year-old does, however, appear and speak in a number of school- and sports-related interviews, which the criminals could have used as training data for the AI tool.


"In the beginning, it would require a larger amount of samples," explained Arizona State University computer science professor Subbarao Kambhampati. "Now there are ways in which you can do this with just three seconds of your voice." He further warns that with a large enough sample size, AI tools can even recreate the "inflection" and "emotion" of people's voices.


DeStefano advises the public to stay aware of such scams to avoid becoming victims. She also suggests that families agree on an emergency word or question to help verify a loved one's identity should such an incident occur.

 
  • Beware, modern AI tools are now capable of convincingly cloning people's voices, down to the subtleties of the way they might sound or speak.

  • One mother in the U.S. recently reported to authorities that she almost fell victim to a kidnapping scam after criminals leveraged AI to clone her daughter's voice.

  • The mother, Jennifer DeStefano, said the cloned voice was a dead ringer for her daughter's, leading her to believe her daughter was indeed in danger.

