The Street
Tony Owusu

Instagram is still plagued by a disturbing issue that Meta says it's making headway on solving

Social media algorithms and the way they work are among the most closely guarded secrets in Silicon Valley.

The algorithms are the engines that drive the user experience, and, as with artificial intelligence, the way they operate depends on the data fed into them.

Related: Elon Musk sues Media Matters, seeks damages from critical article

Over the weekend, an unsealed complaint in a lawsuit filed against Meta Platforms (META) by 33 states alleged that despite Meta's public position that Instagram is only for users 13 and older, the company not only allows kids under 13 to use the platform but has "coveted and pursued" that demographic for years.

On Monday, the Wall Street Journal published a report showing that Instagram's algorithm pushes doses of salacious content to adult accounts that follow mostly children, "including risqué footage of children as well as overtly sexual adult videos."

The Journal set up test accounts that followed only young gymnasts, cheerleaders, and teen and preteen influencers who are active on the platform.

The Journal decided to conduct the experiment after noticing that "large numbers of adult men" were prevalent among the followers of some of those accounts.

"In a stream of videos recommended by Instagram, an ad for the dating app Bumble appeared between a video of someone stroking the face of a life-size latex doll and a video of a young girl with a digitally obscured face lifting up her shirt to expose her midriff. In another, a Pizza Hut commercial followed a video of a man lying on a bed with his arm around what the caption said was a 10-year-old girl," the Journal reported.

This report follows a previous Journal article showing that Facebook and Instagram connect large communities of users interested in pedophilic content. For the most recent report, a Meta spokesman told the Journal that the company has expanded its automated systems for detecting offending users and has taken down thousands of those accounts each month.

“Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions,” said Samantha Stetson, a Meta vice president who handles relations with the advertising industry.

Media companies have taken to exposing social media companies for their failings, but that investigative journalism has come at a cost.

Elon Musk and X are suing Media Matters over an article that has cost the company about $75 million in advertising revenue (and counting), alleging that the story does not accurately depict the average user experience on the site.

Musk and X allege that, in its experiment showing advertising appearing next to offensive content, Media Matters followed only white supremacists and the accounts of some of X's largest advertisers.

The Journal's experiment was similar in that it curated an experience with the express intent of testing whether what it feared was true: that the algorithm pushes sexually suggestive content involving children to certain users.

