Daily Mirror
National
Nia Dalton

'I Googled myself and found explicit photos I didn't know existed - it was horrifying'

When Noelle Martin was 18 years old, she searched her name on Google out of curiosity and stumbled across objectifying photos of herself that she didn't even know existed.

Innocent images had been taken from her social media and edited onto the bodies of adult film stars, depicting her in fake graphic sexual scenarios.

It was "completely horrifying, dehumanising and degrading" for student Noelle, and at the time, there wasn't even a name for the crime - though we now know it as the vile phenomenon of deepfake porn.

In the last 10 years, artificial intelligence (AI) has allowed criminals to manipulate and distribute non-consensual photos online, with leading experts warning of an "epidemic" of image-based sexual abuse.

Noelle was 18 years old when she found deepfake porn of herself on the internet (AP)

Noelle, who is now 28, is a dedicated advocate for the cause and successfully helped to reform laws to criminalise non-consensual intimate images in Australia in 2018.

To raise more awareness, Noelle is appearing on the new SBS docuseries Asking For It, which explores the rise of consent culture and sexual violence.

She features alongside other high-profile sexual assault survivors to shed light on the impact of the crime and how hard it is to navigate the legal system.

Speaking to news.com.au about her experience, Noelle said: "This is something that you cannot escape from because it is a permanent, lifelong form of abuse.

Noelle discovered her face had been edited onto explicit content without her consent (SBS)

"They are literally robbing your right to self-determination, effectively, because they are misappropriating you, and your name and your image and violating you permanently."

The photoshopped pictures that falsely portrayed Noelle flooded multiple pornographic sites and impacted her in every way - from future romantic relationships to her wellbeing and employability.

Noelle said that many still don't understand the "toll" it takes on a person, and she is powerless to prevent more explicit snaps being published on the internet.

She added: "If we had greater consent education, then people would recognise other people's boundaries, what is acceptable and what is not."

She continues to appeal for change and raise awareness of image-based sexual abuse (AP)

In the UK, approximately one in 14 adults has experienced threats to share sexual images of them, with more than 28,000 reports of image-based sexual abuse recorded between 2015 and 2022.

An amendment to the Online Safety Bill could see people who share manipulated and explicit photos criminalised for the first time and face potential time behind bars.

Jemima Olchawski, Chief Executive of the Fawcett Society, told the Mirror that sexual deepfakes are "deeply harmful", and welcomed the change in the law.

Justice Secretary Dominic Raab commented: "We must do more to protect women and girls from people who take or manipulate intimate photos in order to hound or humiliate them."

Have you experienced image-based sexual abuse? Email nia.dalton@reachplc.com.
