Meta Oversight Board calls on company to investigate how content moderation changes could impact human rights

Technology | The Hill

Meta's Oversight Board is calling on the company to evaluate how recent changes to its content moderation policies could impact the human rights of some users, including those in the LGBTQ community.

The Oversight Board published 11 case decisions overnight Wednesday, marking the first cases to take into account the policy and enforcement changes announced by the Facebook and Instagram parent company at the start of the year.

"Our decisions note concerns that Meta's January 7, 2025, policy and enforcement changes were announced hastily, in a departure from regular procedure, with no public information shared as to what, if any, prior human rights due diligence the company performed," the board wrote in a release.

The board points specifically to Meta's decision to drop some LGBTQ protections from its hate speech rules amid a wider overhaul of content moderation practices. Under the changes, Meta now allows users to accuse LGBTQ individuals of being mentally ill despite otherwise prohibiting such content.

“We do allow allegations of mental illness or abnormality when based on gender or sexual orientation, given political and religious discourse about transgenderism and homosexuality,” Meta’s policy now states.

"As the changes are being rolled out globally, the Board emphasizes it is now essential that Meta identifies and addresses adverse impacts on human rights that may result from them," the board wrote.

This includes investigating the potential negative effects on Global Majority nations, LGBTQ users, minors and immigrants, according to the release. The board recommended Meta update it on its progress every six months and report its findings publicly "very soon."

The 11 cases reviewed by the board related to freedom of expression issues and the board noted it has a "high threshold" for restricting speech under an international human rights framework.

In two cases involving gender identity debate videos, for example, the board upheld Meta's decision to allow two posts about transgender people's access to bathrooms and participation in athletic events in the U.S.

"Despite the intentionally provocative nature of the posts, which misgender identifiable trans people in ways many would find offensive, a majority of the Board found they related to matters of public concern and would not incite likely and imminent violence or discrimination," the board wrote.

The board also recommended Meta improve its enforcement of its bullying and harassment policies, including the rules requiring users to self-report content.

Meta CEO Mark Zuckerberg described the changes in January as an effort to "get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression."

In doing so, he also announced Meta's elimination of its fact-checking program. The system was replaced with a community-based process called Community Notes that relies on users to submit notes or corrections to posts that are potentially misleading or lack context.

The fact-checking program officially ended in the U.S. earlier this month, and Meta began testing the Community Notes feature last month. The feature uses X's open-source algorithm for the rating system that determines whether notes get published.

The board recommended Meta "continually assess the effectiveness of Community Notes compared to third-party fact-checking, particularly in situations where the rapid spread of false information creates risks to public safety."

For years, Meta has also used artificial intelligence technology to proactively detect and remove violating content before it is reported. The board said Meta should assess whether reducing its reliance on automated detection could have impacts across the globe, especially in countries facing crises.

The board is run independently from Meta and funded by a grant provided by the company. It can offer non-binding policy recommendations, which if adopted, can have far-reaching impacts for the company's social media platforms.
