Online Hate Speech Remains Unmoderated in Balkans
A study conducted by BIRN in 2021 showed that, across four Western Balkan countries including Bosnia and Herzegovina, social media platforms removed reported threats of violence in 60 per cent of cases and reported harassment in 50 per cent. Notably, up to half of the reported hate speech remained online, because platforms failed to detect harmful content in languages other than English.
To spot such dangers and moderate content appropriately, global companies need to pay closer attention to local conditions. This is vital, given the huge role social media plays in how information is disseminated across society. We should all care about, and engage with, how social media companies moderate the content uploaded to their platforms.
Illustration: Unsplash.com/Mika Baumeister.
Even in countries where these platforms are, for many users, practically synonymous with the Internet, as in Bosnia and Herzegovina where, according to one interviewee, "Facebook is the Internet", social media companies do very little to ensure that they understand the societies in which they operate.
ARTICLE 19's research in Bosnia and Herzegovina, Indonesia and Kenya has revealed a wide gap between local civil society organisations and social media companies. Content rules are not always available in local languages, and the mechanisms for appealing content moderation decisions are perceived as ineffective.
Local actors feel that they have little to no opportunity to engage with social media companies to discuss the moderation of content that directly impacts their society.
This gap can profoundly distort the information people read on social media, and therefore the kind of society they see and make sense of. And...