The Guardian - UK
Jennifer DeStefano

Experience: scammers used AI to fake my daughter’s kidnap

Jennifer DeStefano at a Senate Judiciary Subcommittee on Human Rights and the Law hearing to examine artificial intelligence and human rights in Washington, D.C. Photograph: Shutterstock

“Mom?” repeated my daughter’s voice on my phone. “I’ve messed up.” My heart sank and I started trembling. I heard a man instructing her to lie down and put her head back. My 15-year-old daughter, Briana, was at a skiing competition with my husband two hours away, and I instantly thought she’d been badly hurt. I was in my car, picking up her sister Aubrey, who is 13, from dance class in Arizona. Over the phone, I heard Briana bawling and shouting, “Help me, help me.” My blood ran ice cold and my legs turned to jelly as the man on the phone began explaining that he had my daughter, and if I told anyone he would pump her stomach full of drugs, drop her in Mexico and I would never see her again.

I sprinted into the lobby of the building where Aubrey was having her dance class. Putting the phone on mute, I wailed for help from the other parents. By this point, the man was shouting threats down the phone. A couple of mothers started calling 911. Coming out of class to hear what the fuss was about, Aubrey was thrown into a panic too. I pleaded with her to try to contact the rest of the family, but she was frozen in terror. I’ll never forget her face as she tried to process that someone had taken her sister.

“You need to pay $1m if you want to see her again,” the man threatened. I didn’t have that kind of money, so I agreed to give him $50,000. He told me he would come and pick me up in a white van, and put a bag over my head so I couldn’t see where we were going. I was told that if I didn’t bring the money in cash, I’d be driven to Mexico and left there dead, along with my daughter. No part of me questioned whether this was actually real – every instinct within me was screaming out to do anything to save my baby’s life.

One of the mothers who had been calling 911 then burst into view, her voice seeming slow and distorted in my brain as I heard the words, “It’s AI, it’s an AI scam.” The police had been seeing a rise in cases recently with similar descriptions.

But I just couldn’t accept it. I kept thinking, my baby is out there and someone needs to save her now. It was my daughter’s voice, I would never have mistaken that. Another mother then said she had managed to reach my husband, who had found Briana. She was skiing and safe, and hadn’t got a clue what anyone was talking about.

The man on the phone was still arguing. Muting his call, I demanded to speak to my husband and daughter on the other line. I needed to hear their voices myself. Briana told me she had no clue what I was talking about and was safe with her dad at the ski resort. My brain felt like fuzz. I was relieved to hear her voice, but still reeling.

I unmuted the other line, where the man was still telling me all the things he’d do to us if we didn’t comply. I started calling him out, calling him a scammer, but he kept going, saying he had her. I hung up on him.

Afterwards, I was devastated to hear that the police only considered this a prank call, and wouldn’t investigate it because no one had been hurt and no money had been taken. They said it was probably an AI scam as there had been many reports of them recently; they assumed Briana’s voice would have been AI-generated and the man’s would have been real. When I heard that, I broke down, traumatised; I couldn’t get over how real her voice sounded. I was so sure they had been her cries.

Aubrey and I drove up to join the rest of the family the next day. I squeezed Briana tight, so grateful that we were all OK and back together again. The experience has really affected Aubrey. A guy her age tried to give her his number, and she ran over to me, worrying he was trying to kidnap her. It’s heartbreaking.

It’s said that all you need is three seconds of audio to replicate a person’s voice. Though all of Briana’s social media accounts are private, there are some videos of her being interviewed for sports and for school, but nothing of her crying out. I’m nervous about how new technology like AI can be used to harm children. Having seen first‑hand how scary these things can be, I want to do everything I can to protect my family and others from anything like this happening again.

• As told to Elizabeth McCafferty

Do you have an experience to share? Email experience@theguardian.com
