Facebook parent company Meta is facing a massive lawsuit alleging the company is knowingly hurting young people, including by accepting underage use. Court documents suggest this is true — and that executives have been egregiously callous about harms to kids.
A twelve-year-old boy looks at an iPad screen. (Matt Cardy / Getty Images)
Last fall, forty-two US states plus the District of Columbia sued Meta, the corporate parent of Facebook and Instagram, for purposefully designing its popular apps to maximize user time while concealing harm to users, especially young people, to keep them using the platform. The states also allege that the company is aware of a large number of underage users and has tacitly accepted their use of the platform.
This action is separate from the other lawsuit brought against Meta by the Federal Trade Commission and forty-six states, for monopolizing social media through its acquisitions of WhatsApp and Instagram, which is currently awaiting a trial date. The penalties in this new case on harms to youth are likely in the range of hundreds of millions of dollars, possibly more, on top of damage to the company’s already battered reputation.
The states also want injunctions to force Meta to desist from some of its more disreputable practices. Tech giants have tended to prevail in legal settings, but the volume of cases and the hardening of bipartisan attitudes toward the social media behemoth suggest that the tide may be slowly turning.
As someone who has read and written extensively about the legal challenges to Big Tech, I can say these lawsuits have involved some of the most abjectly incriminating documents I have seen entered into the public record. The recently unredacted documents in particular are full of stunningly embarrassing conversations among the most senior Meta executives, with CEO Mark Zuckerberg himself coming off especially badly.
“Teens Are Insatiable”
The psychological downsides of heavy social media use cited by the states include insomnia, anxiety, and interference with daily life. Heavy use of notifications and the “infinite scroll” of the app feeds create what the states call “psychologically manipulative product features.”
The states allege that Meta has ensnared users unfairly and “addicted” them to the platform. That claim is somewhat unclear on the face of it — don’t we all use social media more than we probably should, and is that really Meta’s fault? But there’s an established understanding in the law that youth comes with certain vulnerabilities and that younger people should not be treated as fully autonomous decision-makers. This understanding is reflected in legal requirements ranging from minimum age restrictions for buying tobacco to age-of-consent laws.
Corporate America by now has a long record of trying to exploit these vulnerabilities, which the government has tried to remedy with measures like advertising bans for cigarettes. Social media designers, for their part, are notorious for building platforms around algorithms that quickly learn a user's interests and feed them an endless stream of that content, consequences be damned.
Perhaps most incriminating in this regard is a 2020 internal Meta presentation on product design arguing that the company should exploit the long-recognized psychological weaknesses of young users. Noting that teens are "predisposed to impulse, peer pressure, and potentially harmful risky behavior," the presentation argued that rather than designing its platforms to avoid harming minors, the company should take advantage of those vulnerabilities.
“Teens are insatiable when it comes to ‘feel good’ dopamine effects,” the presentation reads. It goes on to note that the legacy Facebook platform itself is already designed to regularly provide the “dopamine hit” of social gratification, building an enduring habit.
But it is unclear whether the states’ legal claim of user harm will prevail in court, since users technically give informed consent when they create accounts, and there is little legal precedent for holding companies responsible for overuse of their products by consumers.
The stronger part of the case is likely to be the allegation that Meta platforms are riddled with underage users. Despite being notified of over a million accounts set up by users under thirteen, the company has only disabled a fraction of them, and the states call underage use of Instagram "an open secret" within the company. The platform also continued to collect personal data on underage users, like gender and birth dates, a likely violation of the Children's Online Privacy Protection Act (COPPA), which requires companies to get parental permission before collecting data on minors.
Internal documents in the unsealed complaint are pretty frank, including a company diagram depicting how Meta monitored the proportion of users under thirteen on Instagram daily. The company documents describe an algorithm estimate of up to four million underage accounts and include internal diagrams showing strong use by eleven- and twelve-year-olds. Meta employs weak age-verification measures, requiring mere user attestation of age, and assigns only a small number of staff to deal with reported underage use. Instagram has a backlog of millions of reported underage accounts requiring review.
Not helping the case is recent reporting, led by the Wall Street Journal, on Instagram's video feature, Reels, which, like the rest of the platform, is designed to maximize user interaction. Starting with the simple observation that many accounts of young women influencers, like gymnasts or product reviewers, have large numbers of adult men among their followers, the Journal created dummy accounts to follow these same young women.
The dummy accounts were then encouraged by the algorithm to follow other adult users who post extensively about sex with adults and young people, and then shown video after video of adult sex, young people engaged in flirtatious activity, and ads for national brands. The paper’s reporters concluded that “while gymnastics might appear to be an innocuous topic, Meta’s behavioral tracking has discerned that some Instagram users following preteen girls will want to engage with videos sexualizing children, and then directs such content toward them.”
The radioactive nature of sexualizing youth is such that this specific subject will likely be addressed by the company, especially if it leads to liability under COPPA. Market forces, it seems, are unlikely to stop this behavior — the Journal read company documents showing that “the company’s safety staffers are broadly barred from making changes to the platform that might reduce daily active users by any measurable amount.”
Another interesting feature of the states’ complaint concerns the decision-making related to these problems, especially the abuse of notifications to keep users leashed to their devices and the platforms. The company’s own chief marketing officer supported a proposal to cut back on the platform’s heavy use of notifications, writing in an email thread, “Fundamentally I believe that we have abused the notifications channel as a company.”
The chief of product agreed — only to be overruled by Zuckerberg himself. Another executive said later that the company metric of daily usage “is a bigger concern for Mark right now than user experience.”
This imperious attitude comes from the platform's still-gigantic size and its apparent ability to absorb punishment without much consequence. We need only recall the infamous Cambridge Analytica personal-data-vacuuming scandal in 2018–19, when the FTC found that Facebook had violated the privacy-protection terms of its 2011 consent decree and hit the company with a giant multibillion-dollar fine. As was reported at the time, the fine amounted to a mere 9 percent of Facebook's revenue for that year alone.
Even as it fights multiple legal challenges under the weight of incriminating documents, the company isn’t exactly shying away from political controversy. Meta is currently blocking news in Canada in protest of the Online News Act, which would require social media companies to pay news organizations for news shared on their platforms. After four months, Canadian use of the platform has barely declined, but lawmakers and regulators are increasingly peeved at the company. In the United States, the company recently announced it will allow political ads to question the integrity of the 2020 presidential election, but not the coming 2024 race.
After years of declining growth and profits, partially due to the company’s spending an ocean of cash to develop its Metaverse flop, the company is back in the green. Revenue and profits are both up by double digits in recent quarters after Zuckerberg laid off over twenty-one thousand workers.
But even the affection of investors and Wall Street may not be enough for a company with so many incriminating remarks on the public record. Public lawsuits against the big platforms usually fizzle or yield only relatively modest economic penalties like billion-dollar fines. Whether Meta can continue that winning streak depends on just how much embarrassing corporate dirty laundry district judges are willing to go on ignoring.