In a characteristically lengthy blog post, Facebook founder and CEO Mark Zuckerberg announced on Wednesday (March 6) what looks like a significant policy shift for his company.
“As I think about the future of the internet, I believe a privacy-focused communications platform will become even more important than today’s open platforms,” Zuckerberg writes.
He outlines his vision for a new kind of social network centered on messaging: one protected by end-to-end encryption, à la WhatsApp; more ephemeral, à la Stories; and interoperable, à la the way people use Messenger to send SMS on Android phones.
But despite the post's 3,200 words, there are still a lot of crucial questions Facebook has yet to answer. Here are just some of them:
- Zuckerberg barely mentions advertising, the company's core business, in the post. How is Facebook going to advertise on its encrypted products, if at all? How will this affect Facebook's revenue?
- What will remain public on the platform? If the News Feed stays as it is, how do you combine a “town square” model, in which people share publicly, with a “living room” model, in which people communicate more privately?
- “Private” features are not the same as “privacy,” as Nikhil Sonnad explains in a story about Facebook’s pivot. What does privacy mean to Mark Zuckerberg and Facebook?
- Will Facebook continue to collect the same amount of data from users? How will it use the data? Will it change the way it targets ads?
- If the new platform will be “simpler,” as Zuckerberg says, what will the company nix, or how will it strip down the existing products?
- Zuckerberg acknowledges that Facebook has a reputation problem. How is Facebook going to prove to people that it can be trusted?
- Facebook has focused on groups in recent years, and they are where a lot of the platform's daily activity happens. How will this pivot affect groups?
- If messages are encrypted, how will the company deal with law enforcement requests for information, and work with governments to prevent bad actors from communicating and spreading their messages on the platforms?
- Will the content that's supposed to be ephemeral actually be ephemeral—or will it be stored somehow, in case law enforcement demands it in the future?
- How, more generally, will the company police bad content if it is encrypted? The spread of misinformation has been a particularly insidious problem on WhatsApp, which seems to be the model for the new pivot. Zuckerberg said the company is “working to improve our ability to identify and stop bad actors across our apps by detecting patterns of activity or through other means, even when we can’t see the content of the messages.” What, exactly, does this mean? Is it only looking at metadata, and will that be enough?
- Zuckerberg says the company will focus on secure data storage. But where is it actually going to build the data centers, and which data is it going to store?