Meta’s (META) Mark Zuckerberg built his career on the idea that you have friends. Now he’s concerned that you don’t have enough, so he’d like to invent some for you.
The co-founder of Facebook recently told podcaster Dwarkesh Patel that “the average American has fewer than three friends.” He’s likely referring to a 2023 Pew Research study that found 40% of adult Americans have three or fewer friends — a figure that includes the 8% who said they had none — while 38% have five or more.
Zuckerberg thinks that AI chatbots are the solution to the Bowling Alone epidemic. It’s hard sometimes to connect with real people in real life; chatbots are always there.
The Meta CEO also touted the potential of AI therapists, an idea that’s been around for 60 years, ever since the development of the ELIZA program at MIT. Recent developments have been promising: a February 2025 academic paper in PLOS said that “responses written by ChatGPT were rated higher than the [human] therapist’s responses.”
The exceptions, though, can have fatal consequences. There are currently two lawsuits pending against Character.AI, both involving cases where a chatbot posed as a therapist: one involving a 14-year-old boy who died by suicide, the other a 17-year-old autistic boy who became violent toward his parents. One of the parents told the New York Times she believes these AI tools should be tested in clinical trials and overseen by the FDA.
Having Zuckerberg tout Meta’s chatbots as a psychological panacea is problematic for some researchers, who note the company’s dubious history with data privacy dating back to the Cambridge Analytica scandal and its user-tracking across various sites. Users already give Meta a wide swath of personal information — but the kind extracted by an inquisitive chatbot raises all kinds of red flags.
“Recent research shows we are as likely to share intimate information with a chatbot as we are with fellow humans,” wrote Uri Gal, professor of business information systems at the University of Sydney, in a column for The Conversation. Some users talk to the dead; others develop cult-like spiritual delusions, Rolling Stone reported this week.
“The personal nature of these interactions makes them a gold mine for a company whose revenue depends on knowing everything about you,” Gal wrote, so “when Meta AI says it is ‘built to get to know you,’ we should take it at its word and proceed with appropriate caution.”
Last week, OpenAI apologized for an update that made ChatGPT praise anything the user said, which had the effect of producing extremely bad advice. In a statement, OpenAI said it was “refining core training techniques and system prompts to explicitly steer the model away from sycophancy.”
The Wall Street Journal (NWSA) reported in April that Meta deliberately removed guardrails from its chatbot products in 2023 to allow for the kind of role-play that far less conservative sites were pushing. Meta staffers were concerned about the chatbots’ new capacity not just for engaging in fantasy sex, but for encouraging it. Most alarmingly, this could happen regardless of the age of the user — or the purported age of the chatbot, as in the “Submissive Schoolgirl” character.
While Meta dismissed the Journal’s reporting, it did enforce age restrictions and made other alterations to its products after being presented with the findings. In a statement to the publication, the company wrote: “We’ve now taken additional measures to help ensure other individuals who want to spend hours manipulating our products into extreme use cases will have an even more difficult time of it.”
Meghana Dhar, a former Instagram executive, said on social media last week that Meta AI’s chatbots are “a data goldmine” and a reaction to declining engagement. These apps need “dopamine hits” for their users, she said, and chatbots are an attempt to rebuild “Meta’s slipping grip on the attention economy.”
Yet Meta’s first quarter of 2025 brought in $42 billion in sales — growth of 16%, ahead of expectations — and showed no signs of slipping. Is it lonely at the top?