To maintain the integrity of content across its family of apps, Facebook has pursued a strategy called “remove, reduce, and inform” since 2016. The strategy involves removing content that violates its policies, reducing the spread of problematic content that does not violate its policies, and giving users information so that they can make informed choices about what to click, read, or share.
Facebook’s Community Standards page sets out what content is and is not permitted, and is designed to foster a communication environment that embraces real-world safety, diversity, and equity.
Facebook recently introduced a new section on the Community Standards page where users are regularly informed of the latest updates. A feature named “Group Quality” has also been implemented to help admins of Facebook groups better understand how Facebook enforces its standards.
Some types of content, such as misinformation and clickbait, do not violate the Community Standards but are nonetheless unpopular among Facebook users. Using a combination of technology and human reviewers to fight photo- and video-based false news, Facebook has been gaining momentum in enforcing against fake accounts and coordinated inauthentic behavior. Measures have been introduced to help users identify false news, and certified fact-checking partners assess content in 24 languages.
Going forward, Facebook plans to continue consulting academics, fact-checking experts, journalists, and other organizations to find new ways to snuff out fake news stories. Groups found to have repeatedly shared false content will have their overall News Feed distribution reduced.
Features such as the Context Button have been launched to provide background information about News Feed content, helping users evaluate the credibility of its source.
As part of its upcoming plans, Facebook is set to expand the Context Button to images as well. Existing features such as the Page Quality tab will be enhanced, and new ones like Trust Indicators will be added to provide clarity on a news organization’s ethics and standards.
Facebook will also bring its Verified Badge to Messenger to help users avoid scammers. Messenger users will be given more control over who can reach them and whom to block. To curb the spread of false news, Messenger will also get its own Context Button and a Forward Indicator.
As scammers constantly evolve their techniques, Facebook is determined to keep stepping up its efforts in the war against misinformation.