Meta revoked a job offer to a prominent cyber-intelligence analyst immediately after he criticized Instagram for failing to protect children online.
Paul Raffile had been offered a job as a human exploitation investigator focusing on issues such as sextortion and human trafficking. He had participated in a 24 April webinar on safeguarding against financial sextortion schemes, during which he criticized Instagram for allowing children to fall prey to scammers and offered possible solutions.
“The only reason I can think of for the offer being rescinded is me trying to shine a light on this big issue of these crimes happening on Instagram, and Instagram doing little to prevent it so far,” said Raffile.
Raffile was a co-organizer of the webinar, which featured the parents of four children who had died after being scammed on Instagram. Among the 350 attendees were staffers from Meta, the National Center for Missing & Exploited Children (NCMEC), law enforcement agencies, the United Nations Office on Drugs and Crime, Visa, Google and Snap.
Raffile told the Guardian that his own contribution to the webinar was limited to brief introductory remarks, which took less than a minute to deliver.
With a contract already signed, Raffile was due to start his new $175,000-a-year role the following Monday, but he received the call rescinding the offer within hours of the webinar concluding. Meta’s hiring manager did not share the reason for the decision, stating only that the directive came from “many pay levels above us”, Raffile said.
Meta said in a statement: “It’s not accurate to imply the offer was rescinded because of the NCRI report, the webinar or the candidate’s expertise in this space.” The company was referring to a report on sextortion by the Network Contagion Research Institute (NCRI), and did not provide a reason for rescinding Raffile’s offer.
Raffile said: “It shows that Meta is not willing to take this issue seriously. I’ve brought up legitimate concerns and recommendations, and they’re potentially unwilling to be aggressive enough to tackle this issue.”
Financial sextortion schemes have soared in the past two years, with more than 26,700 cases of underage victims reported to NCMEC in 2023 alone. According to the FBI, sextortion is the fastest-growing cybercrime in the US.
The victims are mainly teenage boys, whom scammers approach while pretending to be attractive girls. After coercing victims into sending sexually explicit images of themselves, a scammer threatens to distribute the photos to their friends and family unless they pay a ransom.
A significant portion of these cases are the result of cybercriminals in Nigeria targeting teens abroad. Scammers refer to themselves as “Yahoo Boys”, and most commonly operate on Instagram and Snapchat. The crime can be deadly. Minors are often overwhelmed by scammers’ threats, and financial sextortion led to at least 20 teenage suicides between October 2021 and March 2023, the FBI has said.
Meta said in a statement it had strict rules against non-consensual sharing of intimate imagery.
Raffile questioned why Meta and other social media companies have failed to take effective action against financial sextortion.
“I had squared off against Yahoo Boys at previous employers, which were financial institutions and tech companies,” he said.
He previously held positions at the consulting firms Booz Allen Hamilton and Teneo.
He said: “We were able to eradicate them from our platforms in four to six months. Yet, the social media platforms have had two years to deal with this.”
A Meta spokesperson said its expert teams were aware that sextortion actors are disproportionately based in several countries, including in west Africa.
Raffile said several of Instagram’s design choices help facilitate these cybercrimes, including its plan to encrypt direct messages, which offers greater privacy but can hamper investigations. Another major issue, he said, is that users cannot keep their followers and following lists private, which lets a blackmailer identify the friends and family of a victim.
“They message the victim and say: ‘Hey, I have your nudes, and I’ve screenshotted all your friends and family, your followers.’ Meta isn’t taking teen privacy seriously enough,” Raffile said.
Raffile also criticized Meta’s April announcement that images detected as containing nudity would be blurred by default for under-18s, noting that teens can still opt to view them.
“It sounds illegal to allow minors to transmit these images on their platform,” he said. “Why not just block them?”
Meta said in a statement: “This feature aims to strike the balance between protecting people from seeing nude images and educating them about the risks of sharing them, while not preventing or interrupting people’s important conversations.”