Elizabeth Nolan Brown

Musk Says He Bought Twitter 'To Help Humanity,' Pledges Not To Let It Become a 'Free-for-All Hellscape'

"The bird is freed." Elon Musk's purchase of Twitter is now official, and the Tesla CEO and world's richest man has already started making changes. Musk has reportedly fired some of the social media company's top executives.

"On Thursday night, Mr. Musk closed his $44 billion deal to buy the social media service," reports The New York Times. "He also began cleaning house, with at least four top Twitter executives — including the chief executive and chief financial officer — getting fired on Thursday."

"Twitter also is expected to become private Friday, dissolving its current board of directors and ending public trading of its stock," notes The Washington Post.

Musk has publicly pledged to make a number of changes, including restoring former President Donald Trump's account and making the site more open to diverging viewpoints.

Yesterday, Musk—who has been critical of the way Twitter "censored free speech"—tweeted "the bird is freed."

Musk also offered more in-depth commentary about his "motivation in acquiring Twitter," stating that the main reason "is because it is important to the future of civilization to have a common digital town square, where a wide range of beliefs can be debated in a healthy manner, without resorting to violence."

"There is currently great danger that social media will splinter into far right wing and far left wing echo chambers that generate more hate and divide our society," Musk continued. "In the relentless pursuit of clicks, much of traditional media has fueled and catered to those polarized extremes, as they believe that is what brings in the money, but, in doing so, the opportunity for dialogue is lost. That is why I bought Twitter."

Musk said he wasn't in it for the money but "to try to help humanity."

He promised not to let Twitter become a "free-for-all hellscape, where anything can be said with no consequences!"

It will be interesting to see how Musk manages to walk the line between free speech and "free-for-all hellscape." There is certainly room for social media platforms to do a better job of this than they have in recent years, as pressure from the government and activists has corresponded with crackdowns on a growing swath of content. But while many complaints about content moderation can surely be attributed to political pressure, part of the problem stems simply from the difficulty of policing a staggeringly massive volume of activity.

"Sometimes it's difficult to get across to people 'the scale' part when we talk about the impossibility of content moderation at scale," as Mike Masnick wrote at Techdirt last year. "It's massive. And this is why whenever there's a content moderation decision that you dislike or that you disagree with, you have to realize that it's not personal. It wasn't done because someone doesn't like your politics. It wasn't done because of some crazy agenda. It was done because a combination of thousands of people around the globe and still sketchy artificial intelligence are making an insane number of decisions every day. And they just keep piling up and piling up and piling up."

That might not be true in high-profile cases, such as the Trump ban or the suppression of the Hunter Biden laptop story. But in general, much of what gets attributed to deliberate action and animosity is more likely a product of technology, human error, or nonideological differences in human judgment at the lower levels of Twitter's power chain.

Musk can easily fix the high-profile cases, but the rest will be much more difficult.

And Musk's stated goal of making Twitter "the most respected advertising platform in the world" may also run into conflict with making it a haven for free speech. Companies are notoriously skittish about what content appears around their ads and have long tried to exert control over traditional media platforms (like magazines and TV stations) if content isn't to their liking.

Still, I'm excited—if not terribly optimistic about his chance of success—to see Musk try to effect some wide-scale change on Twitter.


FREE MINDS

Charlottesville cracks down on city employee speech. Charlottesville, Virginia, "effectively bars city employees from commenting as private citizens on a broad range of matters of public concern, including questions of community safety and governmental efficiency," warns the Foundation for Individual Rights and Expression (FIRE).

Under Charlottesville's new policy, employees must "refrain from conduct, on- and off-duty, that will undermine City government objectives or impair the proper performance of governmental functions." Barred conduct includes anything that interferes with "discipline or harmony among co-workers" or "undermines close working relationships that are essential to the effective performance of an employee's job duties."

These broad directives could allow city employees to be punished or fired for a wide range of speech and conduct. "The policy means that city employees now exercise their First Amendment rights at their own risk, even off the clock," suggests FIRE:

Under the policy's broad terms, city employees could now face discipline for speaking out on their own time—including criticizing dangerous or unsafe working conditions, disagreeing with biased or ineffective departmental decision-making, or simply voicing opinions their bosses don't like.

Let's say you're a city employee. Your boss and coworkers all have "Blue Lives Matter" bumper stickers on their cars. How comfortable will you be posting an invitation to a Black Lives Matter event on your Twitter account? Could that "impair harmony" with colleagues? It might, so you decide against it. The result: The policy has chilled your speech.

Or imagine you're a Charlottesville police officer. You're concerned that officers with little experience are receiving leadership roles. You voiced your concerns on Facebook—and now you've violated the personnel policy, even though you were discussing an issue that impacts community safety.


FREE MARKETS

Judge dismisses "blackout challenge" lawsuit against TikTok. A federal court has dismissed a lawsuit against TikTok brought by a parent whose daughter died of asphyxiation after attempting a "blackout challenge" she had seen people do in TikTok videos. The girl's mother, Tawainna Anderson, contended that TikTok and parent company ByteDance were liable because the app's "predatory and manipulative" algorithm "pushed exceedingly and unacceptably dangerous challenges."

Anderson's lawsuit echoes another suit against TikTok, which Reason's Joe Lancaster wrote about in July:

Earlier this month, the respective parents of two girls filed suit against social media platform TikTok and its parent company, ByteDance. According to the suit, the girls died while imitating behavior from videos that TikTok knew were dangerous and should have taken down. It asks for a jury trial and unspecified monetary damages….

Unique among platforms, TikTok features "challenges," in which users film themselves doing certain tasks or activities and then encourage others to do the same. Typically they will all use a unique hashtag to make the videos easily cataloged.

In this case, the parents allege that their children, an 8-year-old girl and a 9-year-old girl, each died while taking part in the "blackout challenge" in which participants film themselves holding their breath or asphyxiating until they pass out. In fact, in just over 18 months, at least seven children have died after apparently attempting the challenge.

Lancaster noted then:

Even if TikTok were directing dangerous content to people's FYPs, it's not clear whether the platform could be found legally liable at all. Section 230 of the Communications Decency Act protects online services from legal liability for content posted by its users. While TikTok may host the videos, the law states that it cannot "be treated as the publisher or speaker" of any content provided by a user.

The lawsuit tries to circumvent this restriction by claiming that by offering images, memes, and licensed music that users can incorporate into their videos, "TikTok becomes a co-publisher of such content." This is a common misreading of Section 230: In fact, there is no distinction between a "publisher" and a "platform" in the eyes of the law. Besides, a platform cannot be the same as a user solely because it supplies the user with tools to make videos.

Anderson's lawsuit also tried to circumvent Section 230, stating that it was not seeking to hold TikTok "liable as the speaker or publisher of third-party content" but instead "to hold [them] responsible for their own independent conduct as the designers, programmers, manufacturers, sellers, and/or distributors" of the app and its algorithms.

Judge Paul S. Diamond of the U.S. District Court for the Eastern District of Pennsylvania dismissed this argument. Anderson "cannot defeat Section 230 immunity by creatively labeling her claims," wrote the judge. "Although Anderson recasts her content claims by attacking Defendants' 'deliberate action' taken through their algorithm…courts have repeatedly held that such algorithms are 'not content in and of themselves.'" More here.


QUICK HITS

• Yes, you can yell "fire" in a crowded theater.

• The U.S. economy grew sluggishly over the summer, per the latest government data. Gross domestic product rose 0.6 percent, for a 2.6 percent annual rate of growth.

• LOL: "San Francisco is considering softening a ban on publicly funded contracts and travel in 30 states that don't share its liberal values on issues such as abortion and transgender rights, as officials question whether the prohibition is having any effect beyond likely costing the city tens of millions of dollars."

• Colorado voters will get to decide whether to decriminalize psychedelics. "Colorado's Proposition 122 would decriminalize noncommercial activities related to the use of 'natural medicine' by adults 21 or older," notes Reason's Jacob Sullum. "It defines 'natural medicine' to include psilocybin, psilocyn (another psychoactive component of 'magic mushrooms'), dimethyltryptamine (DMT, the active ingredient in ayahuasca), ibogaine (a psychedelic derived from the root bark of the iboga tree), and mescaline (the active ingredient in peyote)."

• Facebook's fortunes continue to plummet.

• We're siccing anti-terrorism units on people who pay for sex now.

• "California will soon decriminalize jaywalking, which will do away with enforcement of a policy that critics say allows police to punish pedestrians with needlessly expensive fines, often in racially-motivated ways," writes Reason's Emma Camp.

• Giving massages without a license can get you felony criminal charges in New York.
