The Guardian - US
Technology
Johana Bhuiyan

Social media and online video firms are conducting ‘vast surveillance’ on users, FTC finds

Twitter/X and Facebook are among the companies alleged to have engaged in mass surveillance, according to an FTC report. Composite: AP, AFP via Getty Images

Social media and online video companies are collecting huge troves of your personal information on and off their websites or apps and sharing it with a wide range of third-party entities, a new Federal Trade Commission (FTC) staff report on nine tech companies confirms.

The FTC report published on Thursday looked at the data-gathering practices of Facebook, WhatsApp, YouTube, Discord, Reddit, Amazon, Snap, TikTok and Twitter/X from January 2019 through 31 December 2020. The majority of the companies’ business models incentivized tracking how people engaged with their platforms, collecting their personal data and using it to determine what content and ads users see on their feeds, the report states.

The FTC’s findings validate years of reporting on the depth and breadth of these companies’ tracking practices and call out the tech firms for “vast surveillance of users”. The agency is recommending Congress pass federal privacy regulations based on what it has documented. In particular, the agency is urging lawmakers to recognize that the business models of many of these companies do little to incentivize effective self-regulation or protection of user data.

“Recognizing this basic fact is important for enforcers and policymakers alike because any efforts to limit or regulate how these firms harvest troves of people’s personal data will conflict with their primary business incentives,” FTC chair Lina Khan said in a statement. “To craft effective rules or remedies limiting this data collection, policymakers will need to ensure that violating the law is not more lucrative than abiding by it.”

The FTC is also asking that the companies mentioned in the report invest in “limiting data retention and sharing, restricting targeted advertising, and strengthening protections for teens”.

Notably, the report highlights how consumers have little control over how these companies use and share their personal details. Most companies collected or inferred demographic information about users such as age, gender and language. Some collected information about household income, education and parental and marital status. But even when this type of personal information was not explicitly collected, some companies could analyze user behavior on the platform to deduce the details of their personal lives without their knowledge. For instance, some companies’ user interest categories included “baby, kids and maternity”, which would reveal parental status, or “newlyweds” and “divorce support”, which would reveal marital status. Some companies then used this information to tailor the content people saw in order to increase engagement on their platforms. In some cases, that demographic information was shared with third-party entities to help target users with more relevant advertisements.

Whatever product was in use, it was not easy to opt out of data collection, according to the FTC. Nearly all the companies said they fed personal information to automated systems, most often to serve content and advertisements. On the flipside, almost none of them offered “a comprehensive ability to directly control or opt-out of use of their data by all Algorithms, Data Analytics, or AI”, per the report.

Several firms say it’s impossible to even compile a full list of who they share your data with. When the companies were asked to enumerate which advertisers, data brokers or other entities they shared consumer data with, none of these nine firms provided the FTC with a full inventory.

The FTC also found that, despite evidence that children and teens use many of these platforms, several of the tech companies reported that, because their platforms are not directed at children, they do not need different data-sharing practices for children under 13 years of age. According to the report, none of the companies reported data-sharing practices that treated the information collected about and from 13- to 17-year-olds via their sites and apps differently from adult data, even though data about minors is more sensitive.

The FTC called the companies’ data-minimization practices “woefully inadequate”, finding that some of the companies did not delete information when users requested it. “Even those Companies that actually deleted data would only delete some data, but not all,” the report stated.

“That is the most basic requirement,” said Mario Trujillo, a staff attorney at the Electronic Frontier Foundation. “The fact that some weren’t doing that even in the face of state privacy laws that require it proves that stronger enforcement is needed, especially from consumers themselves.”

Some of the firms have disputed the report’s findings. In a statement, Discord said the FTC report was an important step but lumped “very different models into one bucket”.

“Discord’s business model is very different – we are a real-time communications platform with strong user privacy controls and no feeds for endless scrolling. At the time of the study, Discord did not run a formal digital advertising service,” Kate Sheerin, Discord’s head of public policy in the US and Canada, said in a statement.

Google spokesperson José Castañeda said the company had the strictest privacy policies in the industry. “We never sell people’s personal information and we don’t use sensitive information to serve ads. We prohibit ad personalization for users under 18 and we don’t personalize ads to anyone watching ‘made for kids content’ on YouTube,” Castañeda said.

The other firms either did not provide an on-the-record comment or did not immediately respond to a request for comment.

However, if companies dispute the FTC’s findings, the onus is on them to provide evidence, says the Electronic Privacy Information Center (Epic), a Washington DC-based public interest research organization focused on privacy and free speech.

“I used to work in privacy compliance for companies, and let’s just say I believe absolutely nothing without documentation to back up claims,” said Epic global privacy counsel Calli Schroeder. “And I agree with the FTC’s conclusion that self-regulation is a failure. Companies have repeatedly shown that their priority is profit and they will only take consumer protection and privacy issues seriously when failing to do so affects that profit.”
