Meta, the parent company behind Facebook and Instagram, is facing a new investigation from European authorities who say the company hasn’t done enough to protect children on its social media platforms.
The European Commission on Thursday said the probe will determine whether Meta has breached the European Union’s Digital Services Act (DSA), citing concerns that Facebook and Instagram’s algorithms “may stimulate behavioral addictions in children” and create what it calls “rabbit-hole effects.” The regulatory arm of the E.U. is also concerned that Menlo Park, California-based Meta isn’t doing enough to verify minors’ ages and prevent them from accessing inappropriate posts.
“We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram,” Thierry Breton, the commissioner for internal markets at the European Commission, said in a statement. “We are sparing no effort to protect our children.”
The investigation will determine Meta’s compliance with DSA obligations regarding the design of Facebook and Instagram, which the commission says may exploit “the weaknesses and inexperience of minors” and cause addictive behavior. The commission is also evaluating Meta’s obligations to ensure that minors have a high level of privacy, safety, and security on its social media platforms, expressing concern over the default settings enabled on minors’ accounts.
The investigation will allow the commission to advance its enforcement of the DSA and adopt interim measures or non-compliance decisions. It can also accept commitments made by Meta to remedy its concerns.
Meta said in a statement that it has spent a decade developing 50 tools and features designed to protect young people online and ensure they have safe experiences. The company also pointed to its age verification methods, which include requiring all users to share their age to sign up for an account, as well as tools that prevent teens from editing their birthdays to appear as adults.
“This is a challenge the whole industry is facing, which is why we’re continuing to advance industry-wide solutions to age-assurance that are applied to all apps teens access,” a Meta spokesperson said. “We look forward to sharing details of our work with the European Commission.”
Meta and child safety
Meta has faced myriad controversies over contributing to mental health issues among users, especially minors. Several researchers have published findings that Facebook and Instagram users — and, more broadly, users of social media as a whole — report more negative feelings about themselves.
The company has allegedly known about its impacts on the health of minors. A March 2020 presentation posted on Facebook’s internal message board, published by The Wall Street Journal in its sweeping 2021 Facebook Files, found that 32% of teen girls “said that when they felt bad about their bodies, Instagram made them feel worse.” A presentation from 2019 noted that “we make body image issues worse for one in three teen girls.”
Meta has faced litigation in the United States over these issues, too. In October 2023, more than thirty U.S. attorneys general sued Meta in federal district court in California, accusing the company of routinely collecting data on children under 13 without parental consent and harming minors’ mental health. Meta’s features, they alleged, are designed to prey on young users’ vulnerabilities, including algorithms designed to “encourage compulsive use.”
“Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem,” New York Attorney General Letitia James said in a statement at the time.
Lawsuits ahead
Thursday’s announcement marks the second probe opened into Meta in a matter of weeks. On April 30, the commission said it was opening formal proceedings into whether Meta breached the DSA by failing to adequately moderate advertisements and be transparent about so-called “shadowbanning,” along with its decision to phase out the popular social media monitoring tool CrowdTangle.
And in March, the commission began investigating Meta over its “pay or consent” model. The tech giant last year began charging users in the EU for an ad-free subscription to Facebook and Instagram unless they agreed to be tracked and profiled for its advertising business. Desktop users shell out about €10 (nearly $11) per month, while mobile users pay roughly €13 ($14) per month.
Meta Platforms stock fell more than 1% in trading Thursday.