Top social media and video streaming companies are facing new scrutiny from the Federal Trade Commission (FTC), which released a report Thursday accusing the platforms of widespread violations of users’ privacy and of failing to provide safeguards for kids and teens.
The 129-page report, published Thursday morning, found that several social media and video streaming platforms carried out practices in the last four years that “did not consistently” prioritize consumers’ privacy.
FTC Chair Lina Khan said the report determined these platforms “harvest an enormous amount of Americans’ personal data and monetize it” for billions of dollars every year.
“While lucrative for the companies, these surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking,” she wrote in a release.
Those surveyed in the report included Meta Platforms, YouTube, X, Snapchat, Reddit, Discord, WhatsApp, Amazon — the owner of gaming platform Twitch — and ByteDance, the owner of TikTok. The Hill reached out to the companies for further comment.
To carry out the examination, the FTC in 2020 asked the nine companies for information on how they collect and track users’ personal and demographic information and whether they apply algorithms or data analytics to that information. The companies were also asked how they determine which ads and other content are shown to users, and how their platforms might affect children and teens.
The companies’ data management and retention practices were “woefully inadequate,” the FTC said, noting the companies have collected troves of data “in ways consumers might not expect.” This includes gathering data through online advertising and buying information from data brokers, the report found.
Some of the companies are increasingly using the data for their artificial intelligence systems, though users were often left in the dark when it came to how their data was involved in these products, the FTC report stated.
The FTC noted that not every finding applies to every company, describing the report instead as a general summary of nearly four years of research.
The agency’s report separately examined the impact of these practices on children and teens, finding they put such users at “unique risk.” FTC staff pointed to social media algorithms as a particular danger, noting they may push harmful content, such as dangerous online challenges, that can lead to negative health consequences for young users.
“Several firms’ failure to adequately protect kids and teens online is especially troubling. The Report’s findings are timely, particularly as state and federal policymakers consider legislation to protect people from abusive data practices,” Khan wrote Thursday.
The report comes as the privacy of users, especially children and teens, has captured the attention of lawmakers and child safety advocates on Capitol Hill. It was released a day after a House panel advanced the Kids Online Safety Act (KOSA), legislation intended to boost online privacy and safety for children.
KOSA would create regulations governing the kinds of features tech and social media companies can offer kids online and aims to reduce the addictive nature and mental health impact of these platforms.
While KOSA received overwhelming support in the Senate and advanced through the House committee, the legislation could face challenges on the House floor. Some Republicans are concerned the bill could give the FTC “sweeping authority” and lead to the censorship of conservative views, a House leadership source told The Hill this week.