Meta is rolling back its COVID-19 misinformation rules in countries like the US that have revoked the pandemic's national state of emergency, a change its independent Oversight Board recommended in April this year, The Washington Post reported Friday morning (via Engadget).
In an update to the July announcement in which it asked the Oversight Board to weigh in on the safety of doing so, Meta cited the end of the World Health Organization's global public health emergency declaration as the reason for the change:
Our COVID-19 misinformation rules will no longer be in effect globally because the global public health emergency declaration that triggered those rules has been lifted.
Now, the company says it will tailor its rules to each region. On the Transparency Center page addressing the board's recommendations, Meta says that, because the WHO has ended the pandemic's global emergency status, it will not directly act on some of the board's concerns.
Among those concerns are recommendations that Meta re-evaluate what misinformation it removes and take steps to increase transparency around government requests to remove COVID-19 content. Instead, Meta says its response to the board's fourth recommendation, which calls for a process to assess the risks posed by its misinformation moderation policies, addresses the spirit of the first. It says it will be "consulting with internal and external experts" to gauge the COVID-19 situation around the world and will share details about local enforcement in "future quarterly updates."
The WHO ended its global emergency declaration on May 5, 2023, about six months after Twitter stopped enforcing its own COVID-19 misinformation rules in the wake of Elon Musk's November 2022 takeover. Both TikTok and YouTube continue to maintain policies around COVID-19 misinformation, although YouTube recently changed its rules around election misinformation.