
Meta’s content moderation efforts against violence

Facebook's now well-documented failure to control anti-Rohingya propaganda in Myanmar is detailed in a 60-page report by BSR, which was commissioned to carry out a human rights impact assessment (HRIA) of the company's presence in Myanmar, a failure that could yet lead to a $150 billion lawsuit.

In response to the crisis, Facebook hired more Burmese speakers, crafted new policies and, for the first time in its history, banned a slew of high-ranking government officials from the platform.

Fast-forward to the present. The platform now bans groups proclaiming hateful and violent missions from having a presence on its apps and removes content that represents, praises or supports them. Facebook has so far identified a range of groups across the globe as hate organizations because they engage in coordinated violence against others based on characteristics such as religion, race, ethnicity or national origin.

The platform routinely evaluates groups and individuals to determine whether they have violated its policies.

In the third quarter of 2021, as part of its quarterly reporting on efforts to curb offensive content such as nudity, terrorism and hate speech, the company said it had taken action against more than 28 million pieces of content on Facebook and Instagram that violated its hate speech policies.

There have been several incidents of violence in Bangladesh as well, after which fingers were pointed at the global social media giant for helping to incite them, Dr. Nawab Osman, Head of Counter-Terrorism and Dangerous Organizations at Meta in Asia Pacific, said during a virtual press briefing.

“Whatever we are seeing on our platform is a reflection, perhaps the most extreme reflection of offline realities, that lead to real tensions at ground levels and so we have been proactive in terms of identifying groups and individuals that potentially can use our platform to incite riots or carry out violence or torture,” he said.

Meta Platforms Inc now has 350 people whose primary job is countering terrorism and organized hate, one of the largest such teams in the industry, and another 350,000 people working on safety and security at Facebook, including 15,000 content reviewers, he told reporters.

The company also now has a formal human rights policy, rules around misinformation that can lead to violence and staff who speak the local language.

Meta also runs support programs at the community level to intervene, build peace and reduce tensions between these communities.

According to the platform, it remains vigilant in learning about and combating new ways people may try to abuse its apps, working with external partners to get the latest intelligence about adversarial behavior across the internet and commissioning independent research from academics and experts.

A Facebook delegation said earlier this year that the social media platform will, as far as possible, look at Bangladesh-related issues in light of the country's laws, traditions, culture, values, and rules and regulations.

Facebook currently responds to around 40% of the government's requests, a rate that was almost zero a few years ago.

The platform has said it will step up its responses to requests from the Bangladesh authorities regarding content relating to militancy, religious incitement and anti-state elements.

The assurance came when a delegation from the regional headquarters of Facebook in Singapore met Posts and Telecommunications Minister Mustafa Jabbar in Dhaka.

Facebook has also trained a group of Bangladeshi officials on how to report content they want removed.