Hi Quartz members!
This week, a federal judge in California ruled that big tech companies had to face the latest spate of litigation against them: lawsuits filed by families, schools, and state governments, all arguing that social media is deliberately addictive, in a way that harms children and young adults.
Yvonne Gonzalez Rogers, a US district judge in Oakland, issued a 52-page ruling (pdf) against four big companies: Meta, the parent of Facebook and Instagram; Alphabet, which owns YouTube and Google; ByteDance, the owner of TikTok; and Snap, which operates Snapchat. The companies were not responsible for content posted on their apps by third parties, Rogers said, nor could they be compelled to limit how much time people spent on those apps. Crucially, she also exempted the apps’ algorithms, which are designed to maximize engagement (or, as the plaintiffs claim, addiction), from the purview of the lawsuits.
But Rogers insisted that the litigation still addressed key “defects” built into the platforms: weak parental controls and age verification processes, for instance. “For example, parental notifications could plausibly empower parents to limit their children’s access to the platform or discuss platform use with them,” Rogers wrote. Her ruling pushes the tech giants further into battle against one of the most remarkable legal campaigns of the digital era.
ONE BIG NUMBER
42: The number of US states, in addition to the District of Columbia, that are suing Meta over its addictive social media features. These features, the litigants claim, were built in pursuit of profit and have “profoundly altered the psychological and social realities of a generation of young Americans.”
360-DEGREE PUSHBACK
While a huge majority of US states are suing Meta over the effects of its social media platforms on children and teens, nearly 200 school districts are suing Meta, ByteDance, Snap, and Alphabet for the same reason. More than 2,000 families are suing as well. The US Senate, meanwhile, is holding hearings on social media and teen mental health. And earlier this year, the US Surgeon General issued a health advisory about the deleterious effects of social media on children and adolescents.
Among the most remarkable testimonies to the Senate was that of Arturo Bejar, a former Facebook engineer. Bejar had left Facebook in 2015, believing that the company was on the right track to make its platform safe for children and teens. Then, as he told the Senate:
A few years later, my 14-year-old daughter joined Instagram. She and her friends began having awful experiences including repeated unwanted sexual advances, harassment. She reported these incidents to the company and it did nothing.
When Bejar returned to Meta as a consultant in 2019, he discovered that the safety features he had helped develop earlier had been suspended. The new features that had been built in, mostly in response to public demand, looked like placebos meant to placate regulators. He could only conclude three things, he said.
One: Meta knows the harm that kids experience on their platform and executives know that their measures fail to address it. Two, there are actionable steps that Meta could take to address the problem, and three, they’re deciding time and time again to not tackle these issues.
The combination of the Senate hearings and the thousands of lawsuits from school districts, states, and families represents the most concerted effort yet to get social media companies to work harder to protect their young users. But perhaps the most crucial question is how courts will treat social media algorithms. Rogers may have exempted the apps’ algorithms from the scope of the litigation for now, but it is becoming clearer that the algorithms themselves (black boxes that, as with AI models, even their parent companies may not fully understand) have too few guardrails.
The US may not be prepared to go as far as the Netherlands, which has established an algorithm regulator: a body that functions as the FDA does for drugs or the FAA does for airplanes. But as this raft of lawsuits shows, some sort of reckoning with powerful algorithms is in the offing for US authorities as well.
QUOTABLE
“We had a conference six months ago working on this, we had over 100 people there. That’s tobacco-suit level, opioid-suit level commitment.”
— Jonathan Skrmetti, the Republican attorney general of Tennessee, in an interview with the Wall Street Journal in late October, discussing the bipartisan alliance of states bringing lawsuits against tech companies
TRUST IN ANTITRUST
The same week Rogers ruled that social media companies had to face the lawsuits against them in court, Nepal banned TikTok. The government didn’t single out addiction as a reason, but its own rationale flowed from the same sense of unease about the effects of social media platforms on its citizens. The platform was “disrupting social harmony, goodwill and flow of indecent materials,” Narayan Prakash Saud, Nepal’s foreign minister, said.
Complete bans on any social media platform are rare; the only other countries to have outlawed TikTok altogether are India and Somalia. Numerous governments have barred TikTok from official devices used by their staff, but that has more to do with concerns over the security of data on an app owned by a Chinese firm.
While governments do set rules to protect the privacy of users online, they generally do not regulate social media apps for their potential to addict users or harm their mental health. In part, that’s because addiction and mental health are difficult to quantify and standardize. So the solution for governments concerned about the impact of social media may lie in another sphere: antitrust.
In a 2021 paper, Fiona Scott Morton, an economist at the Yale School of Management, argued that the world needs more social media platforms, not fewer. In other markets, such as cars or food, brands often flourish by presenting themselves as the safest option. In a competitive social media market, Scott Morton suggested, tech companies might try to set themselves apart the same way. “More social media sites means I can choose the site that offers me fewer ads, less addiction, more of the content that interests me,” she wrote. For that to happen, though, governments will have to break up monopolies and begin hacking away at anticompetitive practices, and that’s a different battle altogether.
Thanks for reading! And don’t hesitate to reach out with comments, questions, or topics you want to know more about.
Have a sociable weekend!
—Samanth Subramanian, Weekend Brief editor