When Noelle Martin was 18 years old, she searched her name on Google out of curiosity and stumbled across objectifying photos of herself that she didn't even know existed.
Innocent images had been taken from her social media and edited onto the bodies of adult film stars, depicting her in fake graphic sexual scenarios.
It was "completely horrifying, dehumanising and degrading" for student Noelle, and at the time, there wasn't even a name for the crime - though we now know it as the vile phenomenon of deepfake porn.
Over the last 10 years, artificial intelligence (AI) has allowed criminals to manipulate and distribute intimate images online without consent, with leading experts warning of an "epidemic" of image-based sexual abuse.
Noelle, now 28, is a dedicated advocate for the cause and helped reform Australian law in 2018 to criminalise the sharing of non-consensual intimate images.
To raise more awareness, Noelle is appearing on the new SBS docuseries Asking For It, which explores the rise of consent culture and sexual violence.
She features alongside other high-profile sexual assault survivors to offer insight into the impact of the crime and how hard it is to navigate the legal system.
Speaking to news.com.au about her experience, Noelle said: "This is something that you cannot escape from because it is a permanent, lifelong form of abuse.
"They are literally robbing your right to self-determination, effectively, because they are misappropriating you, and your name and your image and violating you permanently."
The photoshopped pictures falsely portraying Noelle flooded multiple pornographic sites and affected every aspect of her life - from her wellbeing and employability to future romantic relationships.
Noelle said that many still don't understand the "toll" it takes on a person, and she is powerless to prevent further explicit images from being published on the internet.
She added: "If we had greater consent education, then people would recognise other people's boundaries, what is acceptable and what is not."
In the UK, approximately one in 14 adults has experienced threats to share sexual images of them, with more than 28,000 reports of image-based sexual abuse recorded between 2015 and 2022.
An amendment to the Online Safety Bill could criminalise the sharing of manipulated explicit photos for the first time, with offenders facing potential time behind bars.
Jemima Olchawski, Chief Executive of the Fawcett Society, told the Mirror that sexual deepfakes are "deeply harmful", and welcomed the change in the law.
Justice Secretary Dominic Raab commented: "We must do more to protect women and girls from people who take or manipulate intimate photos in order to hound or humiliate them."
Have you experienced image-based sexual abuse? Email nia.dalton@reachplc.com.