Facebook touts beefed up hate speech detection ahead of Myanmar election

Facebook has offered a little detail on extra steps it’s taking to improve its ability to detect and remove hate speech and election disinformation ahead of Myanmar’s general election, scheduled to take place on November 8, 2020. The announcement comes close…

Facebook says it has now expanded its misinformation policy with the aim of combating voter suppression, and will remove information “that could lead to voter suppression or damage the integrity of the electoral process”, giving the example of a post that falsely claims a candidate is Bengali, not a Myanmar citizen, and thus ineligible to stand. “Working with local partners, between now and November 22, we will remove verifiable misinformation and unverifiable rumors that are assessed as having the potential to suppress the vote or damage the integrity of the electoral process,” it writes.

Facebook also says it’s working with two local partners to verify the official national Facebook Pages of political parties in Myanmar. “So far, more than 40 political parties have been given a verified badge,” it writes. “This provides a blue tick on the Facebook Page of a party and makes it easier for users to differentiate a real, official political party page from unofficial pages, which is important during an election campaign period.”

Facebook’s blog post offers a metric to imply progress, with the company stating that in Q2 2020 it took action against 280,000 pieces of content in Myanmar for violations of its Community Standards prohibiting hate speech, of which 97.8% were detected proactively by its systems before the content was reported to it. “This is up significantly from Q1 2020, when we took action against 51,000 pieces of content for hate speech violations, detecting 83% proactively,” it adds.
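Taken at face value, those figures imply a steep rise in the absolute number of proactive detections quarter on quarter. A quick back-of-the-envelope calculation, purely illustrative and using only the numbers Facebook reported:

```python
# Rough arithmetic from the figures in Facebook's post (illustrative only).
q2_total, q2_proactive_rate = 280_000, 0.978
q1_total, q1_proactive_rate = 51_000, 0.83

q2_proactive = q2_total * q2_proactive_rate  # ~273,840 pieces
q1_proactive = q1_total * q1_proactive_rate  # ~42,330 pieces

print(f"Q2 2020: ~{q2_proactive:,.0f} of {q2_total:,} actioned pieces detected proactively")
print(f"Q1 2020: ~{q1_proactive:,.0f} of {q1_total:,} actioned pieces detected proactively")
print(f"Quarter-on-quarter growth in proactive detections: {q2_proactive / q1_proactive:.1f}x")
```

By that reckoning, proactive detections grew roughly six-and-a-half-fold in a single quarter, though the post doesn’t say how much of that reflects better detection versus more violating content.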

In summing up the changes, Facebook says it’s “built a team that is dedicated to Myanmar”, which it notes includes people “who spend significant time on the ground working with civil society partners who are advocating on a range of human and digital rights issues across Myanmar’s diverse, multi-ethnic society” — though clearly this team is not operating out of Myanmar.

Another recent change it flags is an ‘image context reshare’ product, launched in June, which Facebook says alerts a user when they attempt to share an image that’s more than a year old and could be “potentially harmful or misleading” (such as an image that “may come close to violating Facebook’s guidelines on violent content”). “Out-of-context images are often used to deceive, confuse and cause harm.”
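Facebook doesn’t describe the mechanics, but the behaviour it outlines amounts to a simple two-condition gate. A minimal sketch of that logic, where the function names, the literal one-year threshold and the harm flag are all illustrative assumptions rather than anything Facebook has published:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the described "image context reshare" check.
# Names and the exact threshold are assumptions; Facebook has not
# published implementation details.
IMAGE_AGE_THRESHOLD = timedelta(days=365)

def should_show_context_alert(image_first_seen: datetime,
                              flagged_potentially_harmful: bool) -> bool:
    """True if resharing this image should trigger a context alert:
    the image is more than a year old and has been assessed as
    potentially harmful or misleading."""
    is_old = datetime.utcnow() - image_first_seen > IMAGE_AGE_THRESHOLD
    return is_old and flagged_potentially_harmful

# e.g. an image first seen two years ago that classifiers have flagged:
if should_show_context_alert(datetime(2018, 9, 1), True):
    print("Warn user: this image is over a year old and may be misleading.")
```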

On hate speech, which Facebook admits could suppress the vote in addition to leading to what it describes as “imminent, offline harm” (aka violence), the tech giant claims to have invested “significantly” in “proactive detection technologies” that it says help it “catch violating content more quickly”, albeit without quantifying the size of that investment or providing further details.

On coordinated election interference, the tech giant has nothing of substance to share beyond its customary claim that it’s “constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps”, including groups seeking to do so ahead of a major election. “Since 2018, we’ve identified and disrupted six networks engaging in Coordinated Inauthentic Behavior in Myanmar.”

It further claims engagement with key regional stakeholders will ensure Facebook’s business is “responsive to local needs”, something the company demonstrably failed at back in 2018. “We remain committed to advancing the social and economic benefits of Facebook in Myanmar.”

Earlier this month, Time reported that Facebook is using US law to try to block a request for information related to Myanmar military officials’ use of its platforms by the West African nation, The Gambia. “Facebook said the request is ‘extraordinarily broad’, as well as ‘unduly intrusive or burdensome’. Calling on the U.S. District Court for the District of Columbia to reject the application, the social media giant says The Gambia fails to ‘identify accounts with sufficient specificity’,” Time reported. “The Gambia was actually quite specific, going so far as to name 17 officials, two military units and dozens of pages and accounts,” it added. “Facebook also takes issue with the fact that The Gambia is seeking information dating back to 2012, evidently failing to recognize two similar waves of atrocities against Rohingya that year, and that genocidal intent isn’t spontaneous, but builds over time.”
