Meta oversight board to review policies on transgender nudity, COVID-19 misinformation

The oversight board of Facebook's parent has taken on two policy advisory opinions

The oversight board of Facebook and Instagram parent Meta Platforms announced it would take on new cases as well as review the company's COVID-19 misinformation policies. 


The new cases involve:

  • Gender identity and nudity,
  • Hate speech and Russia’s invasion of Ukraine, and
  • U.K. drill music.

The social media giant has requested a policy advisory opinion on its approach to COVID-19 misinformation, asking whether it should continue to remove content under the policy or alter its approach to make it "less restrictive."

While Meta's approach to tackling misinformation "mainly relies on contextualizing potentially false claims," since the pandemic it has moved toward the removal of "entire categories of misinformation" about coronavirus from its platforms.

Meta currently removes misinformation that is likely to directly contribute to the risk of imminent physical harm, and it labels, fact-checks and demotes misinformation that does not meet the "imminent physical harm" standard.


For content that does not meet that standard, the company relies on third-party fact-checking organizations. 

Content deemed "False," "Altered," or "Partly False" is demoted in users' feeds.

Meta implements a temporary emergency reduction measure when fact-checkers cannot keep up with rating misinformation about a particular crisis spiking on its platforms. 


In limited circumstances, the company may add a label to non-violating content on Covid-19 that directs users to its COVID-19 Information Center.

Meta pointed to the "changed landscape surrounding COVID-19" for its decision to reach out to the board, citing the development of vaccines and treatments, the severity of the virus and greater access to information.

It's asking for a review of labels, fact-checking and its removal of misinformation — including the decision to remove content that directly contributes to the risk of imminent physical harm. 


While the board's policy advisory opinion is not binding, Meta must provide a public response and follow-on actions within 60 days of receiving recommendations.


The board sends its recommendations through its official policy development process and provides regular updates.

To date, the board has taken on two policy advisory opinions within the last two years. One involved the sharing of private residential information; the other began a review of Meta’s cross-check system.

Two of the new cases involve gender identity and nudity, stemming from requests for two images with captions to be restored to Instagram. 

An account jointly run by a U.S.-based couple who identify as transgender and non-binary posted two images in 2021 and 2022. 

In the first image, both are nude from the waist up and have flesh-colored tape over their nipples. In the second image, one person is clothed, and the other person is bare-chested, covering their nipples with their hands. 

In the captions, the couple discusses the first bare-chested individual's upcoming top surgery, including fundraiser announcements.


Meta removed the posts under the Sexual Solicitation Community Standard because its automated systems identified the content as potential policy violations. 

Human moderators found the first image not to be a violation after three users reported the content for pornography and self-harm, but when it was reported a fourth time, another human reviewer found the post to be violating and removed it. 

In the second case, the post was identified as potentially violating by Meta’s automated systems twice, and found to be non-violating when reviewed by a human. Two users then reported the content, but each report was closed automatically without being reviewed. 

After the automated systems identified the content as potentially violating for a third time, a reviewer found the post to be violating and removed it.

The company maintained its removals, and the account owners appealed both decisions to the board. 


As a result of the board selecting these posts, Meta has identified the removals as "enforcement errors" and restored them. 

"In their statements to the Board, the couple express confusion about how their content violated Meta’s policies. They explain that the breasts in the photos are not those of women and that it is important that transgender bodies are not censored on the platform especially when trans rights and access to gender-affirming healthcare are being threatened in the United States," the board wrote in a post.  

The board is asking for public comments on both matters.