Analysing Algorithms: Bosnian Media Complain of Facebook Guessing Game

But editors and journalists in Bosnia tell BIRN they struggle with its inconsistency in content moderation and a lack of transparency about its algorithms. And when they call Facebook for clarification, too often they are left hanging.

"The same content is treated differently on two different Facebook profiles," said one in an anonymous response to a BIRN survey of newsrooms, referring to the same text posted on the Facebook pages of two different media. "On one, it's coloured orange [denoting semi-restricted content]. On another, it's green, without any warnings or restrictions."

"It's confusing, and the procedure lacks transparency," said another. "There's no explanation; analysing algorithms comes down to experience."

Analysing algorithms

Posting content deemed to violate Facebook's rules can have far-reaching consequences for small media outlets, which rely on the platform's sheer scale to reach an audience and attract advertisers. Repeated flagging of content as false or misleading can result in an outlet's visibility being reduced, or in the outlet being locked out altogether.

Meta's website states: "Pages, groups, accounts and websites that repeatedly share misinformation will face some restrictions, including having their distribution reduced. This includes content rated False or Altered by fact-checking partners; content that is nearly identical to what fact-checkers have debunked as False or Altered…"

But even content that passes the grade must negotiate complex algorithms that push, promote or suppress visibility, determining which feeds it reaches and how often. How these algorithms work exactly is kept under wraps, and they are constantly changing.

Experimentation is the only way to get...