
Facebook cracks down on users who share misinformation; EU tightens content rules

Facebook has said it will take action against users who repeatedly share misinformation on its platform by reducing the reach of their posts.

Posts spreading misleading content about Covid-19, vaccines, climate change, elections and other divisive topics will be fact-checked, and the fact-checkers’ conclusions will be displayed more prominently on the page.

Facebook said it already reduces the reach of individual posts judged to be misleading, but this restriction will now apply to entire accounts.

Users will see a notification on misleading posts linking to the fact-checker’s article that debunks the claim, along with a prompt to share that article with their followers.

The notification also warns that people who repeatedly share false information may have their posts moved lower in the News Feed, so that other people are less likely to see them. Facebook first brought its fact-checking initiative to the UK in 2019.

It partners with the independent charity Full Fact to review stories, images and videos flagged by users and rate them based on their accuracy.

The European Commission also released new guidelines this week for its code of practice on disinformation, which state that platforms such as Google and Facebook must not make money from ads linked to disinformation.

The Commission also plans to propose legislation this year to improve the transparency of political advertising, as social media plays an increasingly important role in elections around the world.

“Disinformation cannot remain a source of income. We need to see stronger commitments from online platforms, the entire ad ecosystem and fact-checker networks,” said Thierry Breton, the EU’s internal market commissioner, in a statement.

Vera Jourova, the Commission’s Vice-President for Values and Transparency, said the issue was urgent because of the rapidly evolving threats posed by disinformation:

“We need online platforms and other players to address the systemic risks of their services and algorithmic amplification, stop policing themselves alone, and no longer allow themselves to make money from disinformation, while fully preserving freedom of expression.”

Facebook said it supports the Commission’s focus on greater transparency for users and better collaboration between platforms and the ad ecosystem.

The code of practice was first introduced in 2018; its signatories include Google, Facebook, Twitter, Microsoft, Mozilla, TikTok and several advertising and tech lobby groups.

The Commission expects all signatories to set out how they will comply with the updated guidelines by the end of 2021 and to implement them early next year.

In 2019, EU officials expressed frustration at Facebook’s unwillingness to share key data regarding its efforts to curb disinformation campaigns.