Facebook will send notifications to users who like, share, or comment on COVID-19 posts that violate the company's terms of service, according to a report from Fast Company.
The new feature works like this: if a user interacts with a post that's later removed, Facebook sends the user a notification telling them the post was taken down. If the user taps the notification, they're taken to a landing page with a screenshot of the post and a short explanation of why it was removed. The landing page will also feature links to COVID-19 educational resources and actions, like unfollowing the group that posted it.
This is an expansion of Facebook's earlier attempts to fight misinformation. Previously, the company displayed a banner on the News Feed urging users who had engaged with since-removed content to "Help Friends and Family Avoid False Information About Covid-19." But users were often confused about what the banner was referring to, a Facebook product manager told Fast Company. The hope is that the new approach is more direct than the banner, while still avoiding scolding users or re-exposing them to misinformation.
Facebook's revised approach is arriving almost a year into the pandemic, a little late. The notifications don't debunk the claims in removed posts. They also don't apply to posts that later have fact-checking labels placed on them, Fast Company writes. That means less-dangerous misinformation still has a chance to spread.
Facebook has been slow to act on misinformation the company doesn't consider dangerous. Though conspiracy theories about COVID-19 vaccines have circulated for months, Facebook only began removing COVID-19 vaccine misinformation in December. The question now is: is this too little, too late?