Mark Zuckerberg’s extraordinary announcement via an Instagram Reel has sent shockwaves through the online safety community. He has decided to overhaul how Facebook, Instagram, Threads, and WhatsApp handle content moderation and, in the case of the moderators themselves, to physically move them from (liberal) California to (less liberal) Texas. He is replacing fact-checkers with Community Notes, a crowd-sourced mechanism to “correct” misinformation that, of course, has its own issues.
While Meta’s CEO began with a jaunty “Hey, everyone!” it became clear that he was speaking to two individuals who will have much to say about the future of Facebook: the President-elect and Elon Musk. Zuckerberg gave a shout-out to X (formerly Twitter) and simply took the title of Musk’s Community Notes as his own. And, of course, there is a broader audience Meta is trying to appeal to: the Republican-held House and Senate, as well as the Justice Department.
It is this last group that might be of greatest concern to Zuckerberg, as the threat of antitrust lawsuits could well loom in the new Administration. Yes, he was making changes to a moderation system that many on the right feel unfairly censors their voices. But in sacrificing all the gains that have been made in filtering out mis- and disinformation, Zuckerberg hopes to garner favor with lawmakers and Trump himself and keep his empire intact.
While the idea of a crowd-sourced solution to the notoriously challenging task of moderating online speech—much like Wikipedia’s reliance on volunteer editors—may seem appealing, both X and now Meta are discovering that Community Notes introduce a whole new set of challenges. There is the risk of bias and groupthink, where a majority of loud and persistent voices drowns out a minority viewpoint that may well be the correct one. Notes can amplify misinformation simply by receiving more votes or thumbs-up than, say, a CDC statement on vaccines. And then there is the anonymous nature of the new system, which leads to a lack of accountability.
When FOSI began in 2007, our stated mission was to protect kids while protecting free speech on the Internet. We advocated for parental controls and online safety tools as a less intrusive way to filter out unwanted content than government censorship, leaving adults free to speak online. While the pendulum swings back and forth between the protection of kids and free expression, and between safety and privacy, we feel that the platforms have done a moderately good job of maintaining a difficult balancing act. It doesn’t come cheap. Many tens of thousands of content moderators have been hired over the past decade or so just to keep the sites from overflowing with hateful content, adult material, and misinformation.
But this retreat by Meta, which emulates the chaotic descent experienced on X, portends a much less safe and civil online experience, especially for kids. I hope that Zuckerberg’s “time to get back to our roots” is short-lived and that even he sees the need to moderate his platforms, if only for the sake of his three kids.