Meta has announced plans to launch Community Notes on its platforms, including Facebook, Instagram, and Threads, beginning in the U.S. on March 18, 2025. The company is adopting a crowdsourced fact-checking model similar to the one used by X, leveraging X’s open-source technology, including its algorithm, to power the feature. The move marks a shift away from Meta’s previous reliance on third-party fact-checkers, aiming to let users add context to potentially misleading content while emphasizing free expression.
However, Meta has not committed to making its own Community Notes system fully open source, though it says it may do so in the future. And while both X (formerly Twitter) and Meta are adopting Community Notes as a crowdsourced fact-checking mechanism, their content moderation policies and their implementations of the feature differ in several key ways.
X pioneered Community Notes, which launched as Birdwatch in 2021 and was expanded under Elon Musk’s leadership after his 2022 acquisition of the platform. The system relies on users to add context to potentially misleading posts, with notes rated for helpfulness by other contributors. X emphasizes transparency by making its Community Notes algorithm open source, allowing public scrutiny of how notes are ranked and displayed.
The platform also publishes Community Notes contribution data daily, enhancing accountability. X uses a “bridging-based” algorithm to ensure notes are rated by a diverse range of contributors, aiming to reduce bias by requiring agreement from users with differing perspectives. The intent is to prioritize notes that are helpful to a broad audience, not just a majority, as sketched in the example below.
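To make the bridging idea concrete, here is a minimal, hypothetical sketch in Python of the matrix-factorization approach described in X’s open-source Community Notes code: each rating is modeled as a global intercept plus user and note intercepts plus a dot product of latent “viewpoint” factors, and only the note intercept (the portion of positive ratings that the viewpoint factors cannot explain) counts toward helpfulness. The toy data, variable names, and parameter values below are illustrative assumptions, not Meta’s or X’s production pipeline.

```python
# Illustrative sketch of bridging-based note scoring via matrix factorization.
# All data and hyperparameters here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Toy ratings: (user_id, note_id, rating), where 1 = "helpful", 0 = "not helpful".
ratings = [
    (0, 0, 1), (1, 0, 1), (2, 0, 1), (3, 0, 1),   # note 0: rated helpful across viewpoints
    (0, 1, 1), (1, 1, 1), (2, 1, 0), (3, 1, 0),   # note 1: rated helpful by one "side" only
]

n_users, n_notes, k = 4, 2, 1              # k = dimensionality of the latent viewpoint factor
mu = 0.0                                   # global intercept
user_b = np.zeros(n_users)                 # user intercepts (general rating tendency)
note_b = np.zeros(n_notes)                 # note intercepts (viewpoint-independent helpfulness)
user_f = rng.normal(0, 0.1, (n_users, k))  # user viewpoint factors
note_f = rng.normal(0, 0.1, (n_notes, k))  # note viewpoint factors

# Heavier regularization on intercepts than on factors means a note only earns a
# high intercept when the factor term cannot explain its positive ratings, i.e.
# when raters with differing viewpoints agree it is helpful.
lr, reg_intercept, reg_factor = 0.05, 0.15, 0.03

for _ in range(2000):                      # plain SGD on squared error, for illustration
    for u, n, r in ratings:
        pred = mu + user_b[u] + note_b[n] + user_f[u] @ note_f[n]
        err = r - pred
        mu        += lr * err
        user_b[u] += lr * (err - reg_intercept * user_b[u])
        note_b[n] += lr * (err - reg_intercept * note_b[n])
        uf, nf = user_f[u].copy(), note_f[n].copy()
        user_f[u] += lr * (err * nf - reg_factor * uf)
        note_f[n] += lr * (err * uf - reg_factor * nf)

# A real system would compare the intercept against a publication threshold; here
# we simply report it. The cross-viewpoint note ends up with the higher intercept,
# while the one-sided note's support is absorbed by its viewpoint factor instead.
for n in range(n_notes):
    print(f"note {n}: helpfulness intercept = {note_b[n]:+.2f}, "
          f"viewpoint factor magnitude = {abs(note_f[n][0]):.2f}")
```

The key design point this illustrates is that agreement from like-minded raters alone does not raise a note’s helpfulness score; the model routes such one-sided support into the factor term, so only notes with cross-perspective agreement clear the bar.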
X does not allow users to appeal Community Notes directly; instead, the system is designed to self-correct through community ratings, and X says it does not write, rate, or moderate notes itself, positioning them as independent of corporate influence. X’s leadership, particularly Elon Musk, has championed Community Notes as a cornerstone of the platform’s “free speech” ethos, often framing it as a superior alternative to centralized fact-checking, which Musk has criticized as biased.
Meta has not detailed whether users can appeal Community Notes decisions, and it has not committed to X’s level of transparency, potentially leaving users with less recourse. This opacity could fuel skepticism about fairness, especially given Meta’s history of centralized moderation. Meta’s implementation of Community Notes is part of a broader policy shift to reduce reliance on third-party fact-checkers, whom Meta has criticized for perceived bias and overreach. The shift aligns with a stated goal of prioritizing “free expression,” but critics argue it may also reflect political pressures, particularly in the context of the new U.S. administration in 2025.
Historically, Meta has adopted a more interventionist approach to content moderation, employing thousands of moderators and partnering with over 80 third-party fact-checking organizations worldwide to label, demote, or remove misleading content. This approach was part of a broader effort to combat misinformation, particularly after events like the 2016 U.S. election and the COVID-19 pandemic, but it drew criticism from conservatives for alleged bias and censorship.
With the 2025 policy shift, Meta is moving toward a less restrictive stance, aiming to “restore free expression” by reducing proactive content removal and simplifying its content policies. This includes ending third-party fact-checking in the U.S., adopting Community Notes, and focusing automated systems on “high-severity violations” like terrorism, child exploitation, and illegal drugs, while dialing back on other areas like hate speech and political content.
Meta has framed the shift as a response to cultural and political changes, including user dissatisfaction with perceived over-censorship. Critics, however, argue it may also be a strategic move to align with political pressures, particularly from the incoming Trump administration, which has long characterized social media moderation as biased against conservative voices.
X places a strong emphasis on transparency in its Community Notes system, with its open-source algorithm and public data on contributions. This allows researchers, users, and critics to analyze how the system works and identify potential flaws, though some studies have questioned its effectiveness in consistently curbing misinformation, especially during rapid news cycles like elections.