Should Social Media Be Held Responsible For The Atrocities And Deaths It Facilitates?

Facebook logo. (Jaap Arriens/NurPhoto via Getty Images)

Not a week goes by without another tragic story of how social media directly facilitated injury, death or suffering somewhere in the world. Earlier this month a young girl was sold in South Sudan, with Facebook used to drive up the bidding price, while two months ago a false rumor spreading on WhatsApp in Mexico led an angry mob to burn two innocent men to death. Perhaps most infamously, Facebook has been singled out as a primary facilitator of the Myanmar genocide. As social media is increasingly used to facilitate horrific activities across the world, should the companies behind those platforms bear responsibility for their misuse? In particular, companies make explicit decisions about how much to invest in the tools and staffing needed to counter misinformation and illegal use of their platforms, knowing full well the dangers posed by underinvestment. Should they be held accountable when they specifically declined to invest in protecting their users against a given threat, aware that the lack of investment could lead to serious harm to those users?

Every day around the world, bad actors misuse social platforms to commit horrific atrocities. Each time another atrocity makes headlines, the companies predictably issue statements expressing regret at the misuse of their platforms while insisting there was nothing more they could have done to prevent it.

In the case of Myanmar, the company had long refused to answer questions about the number of Burmese-speaking content moderators it employed, or to provide any other concrete details about its efforts to combat misuse of its platform to advance the violence there. The public and policymakers were left with nothing more than the company’s assurances that it was doing what needed to be done. That trust turned out to be misplaced: the company had invested insufficient resources until public outcry and the threat of legislative action forced it to boost its investment in the country. Yet even with these new resources, its total investment there pales in comparison with its investment in other countries that are not undergoing active genocide.

How does a company like Facebook decide precisely how many moderators to hire for each language and socio-cultural focus? For a company that declines even to offer rough estimates of how many moderators it employs per language, getting any clarity into how it determines those staffing levels is impossible.

In the case of Germany, the company had long claimed it had done everything possible, in terms of both staffing and technical solutions, to combat hate speech in the country. Yet after the passage of new hate speech laws and threats of further legislative action, Facebook changed its stance almost overnight, rapidly hiring a vast new army of moderators and deploying a number of new technical features: all the things it had long declined to do.
