The growing number of fake posts related to the coronavirus has become a cause for concern for Facebook, as it could have damaging implications for the world’s largest social media platform.
The rapid spread of the COVID-19 outbreak has been accompanied by an enormous volume of fake news and misinformation about the virus across all social media platforms. If it goes unchecked, this misinformation could reach billions of people worldwide. This is why Facebook is now coming up with new ways to tackle the problem.
Facebook has recently announced new measures to tackle the spread of COVID-19 misinformation. Users of the platform will now be alerted whenever they may have interacted with fake or dangerous content.
The company’s Vice President of Integrity, Guy Rosen, revealed on Thursday that Facebook will start notifying users whenever they have reacted to, liked, or commented on coronavirus-related misinformation, and will point them toward content that debunks those myths.
He also mentioned that the company’s entire team of moderators is working overtime to ensure that misinformation related to the pandemic is removed immediately. This includes the ‘voodoo’ preventive measures some people are prescribing, conspiracy theories, fake statistics, and so on.
If Facebook’s algorithms detect that a user has interacted with a piece of information that has already been proven incorrect by the World Health Organisation (WHO), the user will be shown a shareable link to a page on WHO’s official website that debunks rumours of that kind.
Facebook’s Guy Rosen has said that the company aims to keep its users safe from any imminent danger posed by this virus-related misinformation.
The platform currently has over 2.5 billion monthly active users, so if left unchecked, this COVID-19-related fake content could have a catastrophic impact on a huge scale.
According to Facebook, in March alone, some 40 million posts containing misinformation related to the virus were given a warning label, and many more were removed from the platform entirely.
The spread of misinformation during the coronavirus outbreak has grown to the point where it is nearly impossible for the company’s moderators to handle it on their own. Facebook has therefore partnered with as many as 60 external organizations to help fact-check suspicious content.
The company now claims it has successfully redirected over two billion users to WHO’s official website, and that over 350 million users across Facebook and Instagram have clicked through to the company’s official COVID-19 Information Centre.
22 Days To Flag: Fast Enough?
A recent report by Avaaz, while commending Facebook’s efforts, points to significant delays in flagging and taking down coronavirus-related misinformation.
According to the report, it can take as long as 22 days for warning labels to appear on dubious content related to the COVID-19 outbreak.
Facebook has not yet responded to this report. From an outsider’s perspective, it is clear that the company is under considerable pressure to deal with this sudden surge in misinformation while short on manpower, as most of its workforce is now working from home. That said, it will be interesting to see how Facebook manages to shorten this alleged 22-day window in the near future. We will keep you updated on all future developments. Stay tuned.