Meta is introducing new parental supervision tools to Facebook and Instagram, as it looks to add privacy and security features for minors just one month after the US surgeon general said social media poses a “profound risk” to teenagers’ mental health.
However, as with Meta’s previous parental supervision updates, teenage users must opt into the new features before they take effect, making it unlikely that the protections will be widely adopted.
If a minor does opt in, the new features give parents the ability to track how much time their child spends on the platform and set time limits on usage. Parents will also be able to monitor who their child follows. They won’t be able to read private messages.
These updated tools will be part of the Meta Family Center, which launched in March 2022 with a similar set of opt-in supervision features for parents to monitor their children.
The update also introduces new rules for private DMs from accounts a user doesn’t follow: message requests are limited to text only, so senders can’t include photos or videos until the recipient has accepted the request.
In an email to Quartz, a spokesperson for Meta declined to say how many teenage users had opted into the supervision tools in the year and a half since they were first introduced.
A push to severely limit social media use for minors
The surgeon general’s report is not the only pressure Big Tech companies like Meta and Google have felt in recent months. Earlier this spring, a bipartisan group of senators attempted to push through legislation that would ban TikTok, the mega-popular social media app used by more than 150 million Americans.
While that legislation stalled, a number of state legislatures across the country have passed similar bans on the video-sharing platform and proposed wide-ranging restrictions on teenagers’ use of social media more broadly.
Frances Haugen—a former Meta employee who leaked internal documents proving that Meta was aware its products were contributing to the teen mental health crisis—said she understood the motivation behind the bans but cautioned against overreacting to the issue.
“The thing that I get really nervous about is we’re reaching a point where there’s enough kids being harmed that people are going to start making emotional decisions,” Haugen said in an interview with Quartz.
Instead, Haugen suggested a middle ground between Meta’s lukewarm approach and outright bans. She pointed to legislation in Europe meant to make social media use safer, as well as to design changes that discourage unhealthy behavior, such as slowing loading times for teen users late at night.
Teen social media use, by the numbers
95%: Share of US teenagers who report using a social media platform.
3.5 hours: Average time US teenagers spend on social media each day.
64%: Share of teens who report being exposed to hate-based content on the internet.
2/3: Portion of US teens who use TikTok, making it the second most popular social media platform among young people after YouTube.
16%: Percentage of teens who say that they use TikTok “almost constantly.”
95%: Percentage of American teens, ages 13-17, who report they have constant access to a smartphone.
1/3: Share of teen girls who felt worse about their bodies after using Instagram, according to leaked internal documents at Meta.
72%: Percentage of teenagers who report being cyberbullied.
Editorial note (6/28 11:25am): This article has been updated to include an additional feature of Meta’s new privacy protections.
Related stories:
⛏️ Arkansas says teens need parental permission to use social media, but not to get a job
🇺🇸 A US senator is slowing down the process to ban TikTok, citing free speech concerns
📱 Montana’s TikTok ban is unconstitutional and makes no sense