
If you spent any time this week mindlessly scrolling through Instagram Reels on your Explore page, there's a good chance you saw something that made you recoil in horror.
Don't be too alarmed, because you weren't the only one.
Instagram's Explore page showcases the company's algorithm in all its glory. The page is highly curated based on the accounts you follow, the photos and videos you've liked, and your connections on the platform, among other things, according to the company.
Related: Meta Platforms new push is raising red flags and strong skepticism
But a glitch this week forced users to view Reels (short-form videos on the Instagram platform) that their personalized algorithm would not normally pick for them, and often, these videos were extremely violent in nature.
Instagram apologizes for forcing shocking imagery on users

On Wednesday, users reported seeing a sudden uptick in the number of violent videos showing up in their feeds.
The violence ranged from high school bathroom fights to road rage incidents to outright gun violence leading to death.
Some of the videos were hidden behind the platform's "sensitive content" banner, but videos of that type can still be accessed almost instantaneously.
"We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended," an Instagram spokesperson told the Wall Street Journal. "We apologize for the mistake."
This so-called error comes after the company recently announced that it was adjusting its content moderation policies, focusing less on censoring speech and more on "high severity" violations, according to CEO Mark Zuckerberg.
The Meta spokesperson told the Journal that this week's issue is unrelated to the recent change in company philosophy.
Many more questions remain about Instagram violence surge
Anyone who spends a significant amount of time on Instagram knows it isn't unusual to see a Reel pop up in your feed that seems out of place.
As precise as the algorithm is, a user whose feed is geared toward sports content, for example, may occasionally see Reels that fall outside that realm.
Related: Meta’s recent layoffs take an unexpected turn
However, what is most concerning about the glitch that Meta apologized for and says it has fixed is that some of the offending videos should never have been on the platform in the first place.
Meta says that it removes "the most graphic content" regarding violence and puts a warning label on content that is violent but doesn't reach the "most graphic" threshold.
Meta did not respond to a request for comment about why videos that crossed the graphic threshold were not at least given a label before being shown in user feeds.
The amount of violent imagery on the internet, in general, is intractable, and Meta, to its credit, has been proactive in policing its own platform.
More tech
- China fires back at more than just Google after Trump tariffs
- Tech stock CEOs have surprising take on upstart AI rival
- Elon Musk shocks the world with government office changes
The company says it removed more than 10 million pieces of violent and graphic content from Instagram just between July and September of last year.
But if users' feeds this week are any indication, Meta has a lot more work to do to make its platform cleaner.
Related: Veteran fund manager unveils eye-popping S&P 500 forecast