Facebook is tightening the noose on users who continue to spread misinformation even after the posts have been flagged as false.
The social media network said it would limit the distribution of users’ subsequent posts if they repeatedly share content that its fact-checking partners have flagged as false.
Facebook also said it would launch new ways to alert users when they are interacting with content that has been flagged as misinformation.
The company has seen an influx of false information and conspiracy theories since the start of the Covid-19 pandemic.
“Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps,” the company said in a statement.
Earlier this year, Facebook announced that it had taken down 1.3 billion fake accounts between October and December 2020.