A mom from Scottsdale, Arizona, received a terrifying call that changed her life forever. A man told her he had her daughter and that she would never see the girl again if she didn't follow his instructions. Jennifer DeStefano could hear her 15-year-old daughter Brianna's voice and immediately panicked, wanting to help her daughter, who was supposed to be skiing with her father.
Over the next few minutes, DeStefano did what the caller asked while trying to contact her husband and the police. Amid her terror, she soon realized the whole thing was a scam: the caller had cloned Brianna's voice using artificial intelligence, hoping to convince her to send him money. Now, she's sharing her story so other parents don't fall victim to such an evil scam.
More from CafeMom: There's a Phone Scam Targeting Grandparents & 1 Couple Lost $55,000
It was her daughter's voice.
DeStefano told CNN that she and her younger daughter, Aubrey, 13, were headed to the dance studio when her phone rang. Although it was an unknown number, DeStefano answered because Brianna was away skiing and the concerned mom wanted to ensure everything was OK on her trip. When she answered, she could hear her teen daughter yelling and crying.
"Mom! I messed up!" screamed a girl's voice. DeStefano prodded, asking what happened.
"The voice sounded just like Brie's, the inflection, everything," she told CNN. "Then, all of a sudden, I heard a man say, 'Lay down, put your head back.' I'm thinking she's being gurnied off the mountain, which is common in skiing. So I started to panic."
Then, the man began to make demands. "Listen here. I have your daughter. You call the police, you call anybody, I'm gonna pop her so full of drugs. I'm gonna have my way with her then drop her off in Mexico, and you're never going to see her again," she recalled him saying.
He wanted money.
DeStefano explained that the caller demanded a $1 million ransom and that she went into the dance studio screaming in panic. She and Aubrey tried to reach her husband while another mother called 911. The dispatcher immediately recognized the scam, but DeStefano didn't believe her.
"My rebuttal to her was, 'That's Brie crying.' It sounded just like her," DeStefano said. "I didn't believe her, because … [my daughter's] voice was so real, and her crying and everything was so real."
Because no one could convince her to hang up, DeStefano negotiated with the caller, explaining that she didn't have $1 million; he eventually settled on $50,000.
He told her that he'd pick her up in a white van, cover her head with a bag, and take her to a location where she could give him the money, CNN reported.
"You better have all the cash, or both you and your daughter are dead," he said.
Then, Brianna called.
Aubrey texted her dad, who then had Brianna call to tell her mother she was OK. Someone handed DeStefano the phone with her real daughter on the other end. She told CNN she screamed at the man on the original call, who continued to insist he had Brianna.
"She's like, 'Mom, I'm in bed. I have no idea what's going on. I'm safe. It's OK.'" DeStefano told CNN.
It was a four-minute call that changed their lives.
The entire ordeal was brief, but the fear will be there forever. DeStefano truly believed the voice was Brianna's. "A mother knows her child," she told CNN. "You can hear your child cry across the building, and you know it's yours."
Sadly, this type of scam is common, and victims lose an average of $11,000 in fake kidnapping scams, Siobhan Johnson, a special agent with the Federal Bureau of Investigation in Chicago, told CNN. These scams have been around for a long time but have grown more sophisticated. Using AI to spoof something like a caller's voice makes them far more believable.
"A reasonably good clone can be created with under a minute of audio and some are claiming that even a few seconds may be enough," she added. "The trend over the past few years has been that less and less data is needed to make a compelling fake," Hany Farid, a computer sciences professor at the University of California, Berkeley, and a member of the Berkeley Artificial Intelligence Lab, told CNN.
This type of cloning can be done for about $5 a month, which is nothing compared to the ransoms some scam artists get away with.
More from CafeMom: Fifth Grade 'Teacher of Year' Uses ChatGPT To Create Learning Exercises, Sparking Debate
These scams often originate in Mexico.
While there is no data on how many people are targeted by fake kidnapping scams each year, Johnson said the calls often come from Mexico and target areas of the southern United States with larger Latino communities. There has been an uptick in fake kidnappings as AI has grown in popularity, but the threat has always been there.
"We don't want people to panic. We want people to be prepared. This is an easy crime to thwart if you know what to do ahead of time," Johnson said.
Johnson and the FBI shared some tips on how to avoid becoming the target of a virtual kidnapping. First, don't post about vacations or travel on social media. Next, establish a family password that is shared with someone outside the family only in a true emergency. Keep financial information safe, and do not share it with random callers. Don't always trust the voice on the phone; trust your gut. If you can't reach the person who has allegedly been kidnapped, have a friend or family member try to contact them.
"It was obviously the sound of her voice," she said. "It was the crying, it was the sobbing. What really got to me is that she's not a wailer. She's not a screamer. She's not a freak out. She's more of an internal, try-to-contain, try-to-manage person. That's what threw me off. It was the voice, matching with the crying," DeStefano told CNN.