Following yesterday’s intense testimony by Facebook whistleblower Frances Haugen, Mark Zuckerberg decided, at long last, to speak up about how he views the company’s still-growing scandal. He took to — where else? — his public Facebook page to post a lengthy letter. “I wanted to share a note I wrote to everyone at our company,” he begins.
He first addresses Monday’s widespread Facebook outage. His comments on the topic are brief, limited mostly to apologies and assurances that the company is working overtime to ensure it doesn’t happen again.
The bulk of the post is dedicated to Haugen’s testimony and the larger conversation around Facebook’s in-house research into Instagram’s toxicity for kids and teens. Right away he is dismissive of Haugen’s general characterization of Facebook’s internal workings. “I’m sure many of you have found the recent coverage hard to read because it just doesn’t reflect the company we know,” he writes. “We care deeply about issues like safety, well-being and mental health.”
We’re not sure what company Zuckerberg is referring to — Facebook employees have been sounding the alarm on the company culture for years now. And that’s only the post’s second full paragraph.
On Meaningful Social Interactions —
Much of Zuckerberg’s post attempts to undercut Haugen’s testimony at its core, insisting over and over that her arguments are “just not true.”
During her testimony, Haugen returned again and again to the algorithms that power Facebook’s News Feed and Instagram’s discovery features. Zuckerberg calls into question her view of the Meaningful Social Interactions (MSI) model Facebook has used for the past few years. He says that switching to MSI meant the News Feed “showed fewer viral videos and more content from friends and family.”
This assertion is, at a basic level, untrue. The MSI News Feed is, by Facebook’s own (non-public) admission, especially good at amplifying divisive content. It is meant to be more personal, and in practice that often translates to content that is more partisan, more charged and less “safe.”
Zuckerberg goes on to say that Facebook switched to this MSI model “knowing it would mean people spent less time on Facebook.” He says research suggested it was “the right thing for people’s well-being.” But we’ve seen the research now, Mark, and it’s painfully obvious that Facebook knows its algorithms are not good for people’s well-being. Why bother saying something so easily proven false by Facebook’s own internal documents?
How about that research, Mark? —
Perhaps the most frustrating thing about Zuckerberg’s statement is his repeated assertion that Facebook is committed to being open about its research. This is categorically untrue. Across the board, at every possible opportunity, Facebook gatekeeps its data and internal research. Had Haugen not combed through Facebook’s networks to collect these documents before leaving her position, we would never have known anything about Facebook’s research on teens’ mental health.
“If we’re going to have an informed conversation about the effects of social media on young people, it’s important to start with a full picture,” Zuckerberg writes. On this point we seem to agree, though Zuckerberg apparently thinks Facebook is already doing everything it needs to do to present that picture. This from the company that was, less than a month ago, caught feeding third-party researchers enormously incomplete data; the company with a research program so fickle it regularly sends experts packing.
Throughout his post, Zuckerberg attempts to argue that it would be “deeply illogical” for Facebook to operate in this manner; that prioritizing harmful content in the pursuit of profits could never make sense from a business perspective. But the tobacco industry operated under this exact model for years and made countless billions. The only thing illogical about it is the deep disregard for human life.