Meta Starts Testing User Fact-Checking in the US

Meta Platforms plans to start testing crowd-sourced fact-checking in the US across Facebook, Threads, and Instagram. The move comes after the company ended its third-party fact-checking program in January.

The company’s community notes content-moderation tool will use X’s open-source algorithm to let users flag and rate false or misleading information across its social media sites. X relies on its members to add context to posts that may be misleading or incorrect, a model Meta CEO Mark Zuckerberg embraced as a replacement for third-party fact-checkers.
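
Meta has not published implementation details, but X's community notes ranking code is open source (github.com/twitter/communitynotes) and centers on a "bridging" matrix-factorization model: a note is scored highly only if raters who normally disagree both find it helpful. The sketch below is a simplified illustration of that idea only; the function name, hyperparameters, training loop, and toy data are assumptions for illustration, not Meta's or X's actual implementation.

```python
import numpy as np

# Simplified sketch of the matrix-factorization idea behind X's open-source
# community notes ranking. Each rating r[u, n] (1 = "helpful", 0 = "not
# helpful") is modeled as:
#   r[u, n] ~ mu + user_intercept[u] + note_intercept[n] + user_factor[u] . note_factor[n]
# The viewpoint factors absorb one-sided agreement, so a note's intercept
# stays high only when raters with differing viewpoints both rate it helpful.
# X's published code applies an intercept threshold (around 0.4) to decide
# which notes are shown; this sketch just returns the raw scores, and all
# hyperparameters here are illustrative.

def score_notes(ratings, n_users, n_notes, dim=1, lr=0.05, reg=0.03, epochs=200, seed=0):
    """ratings: list of (user_id, note_id, value) with value in {0.0, 1.0}."""
    rng = np.random.default_rng(seed)
    mu = 0.0
    b_u = np.zeros(n_users)                    # user intercepts (rater leniency)
    b_n = np.zeros(n_notes)                    # note intercepts (helpfulness scores)
    f_u = rng.normal(0, 0.1, (n_users, dim))   # user viewpoint factors
    f_n = rng.normal(0, 0.1, (n_notes, dim))   # note viewpoint factors
    for _ in range(epochs):
        for u, n, r in ratings:
            err = r - (mu + b_u[u] + b_n[n] + f_u[u] @ f_n[n])
            mu += lr * err
            b_u[u] += lr * (err - reg * b_u[u])
            b_n[n] += lr * (err - reg * b_n[n])
            # Simultaneous SGD update of both factor vectors.
            f_u[u], f_n[n] = (f_u[u] + lr * (err * f_n[n] - reg * f_u[u]),
                              f_n[n] + lr * (err * f_u[u] - reg * f_n[n]))
    return b_n  # higher intercept -> broader cross-viewpoint agreement

# Toy example: note 0 is rated helpful by raters on both "sides",
# note 1 only by one side, so note 0 should score higher.
ratings = [(0, 0, 1.0), (1, 0, 1.0), (2, 0, 1.0), (3, 0, 1.0),
           (0, 1, 1.0), (1, 1, 1.0), (2, 1, 0.0), (3, 1, 0.0)]
scores = score_notes(ratings, n_users=4, n_notes=2)
print({f"note_{i}": round(float(s), 3) for i, s in enumerate(scores)})
```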

The decision to ditch the fact-checkers is another example of Meta aligning its social media sites and board of directors with President Donald Trump’s agenda.

Meta's community notes will be limited to 500 characters and must include a supporting link. The notes will initially be available in English, Chinese, Spanish, Vietnamese, French, and Portuguese, and Meta stated it would add more languages over time.

Contributors must be over 18, have an account that is more than six months old and in good standing, and either have a verified phone number or be enrolled in two-factor authentication, Meta stated in a blog post. The company expects community notes to be less biased than the third-party fact-checking program and to operate at scale once fully up and running.

Around 200,000 US users have signed up to become potential community note contributors. Contributors will not be able to submit notes on advertisements, but nearly all other content is covered, including posts by Meta, its executives, politicians, and other public figures. When community notes go live, third-party fact-check labels will no longer appear in the US. Meta ultimately intends to roll out community notes globally.