A German project investigating whether Instagram is politically polarizing and promotes misinformation or hate speech has been shut down. Facebook pressured the researchers to comply or face legal action.
The Berlin-based organization Algorithm Watch investigated how content surfaces on Instagram in the run-up to the German elections. The aim was to determine whether the Facebook-owned platform amplifies certain trends and how disinformation or hateful messages are displayed.
The researchers worked via a browser add-on that about 1,500 users voluntarily installed, which monitored what they were shown on Instagram. The investigation had entered its second year, and Facebook was repeatedly asked to respond to the interim findings. But instead of responding, Facebook instructed the researchers to stop the project and delete the data.
That was not a casual request. If Algorithm Watch didn’t stop, Facebook would pursue a ‘more formal engagement’. Algorithm Watch says it doesn’t have the resources to fight a legal battle with Facebook, so it deleted the data. “We were bullied until we stopped our Instagram monitoring,” it said in an open letter to the EU asking for better protection of such projects.
In a conversation with the Finnish broadcaster YLE, researcher Nicolas Kayser-Bril explains that the research found, among other things, that text-heavy posts from politicians are shown less often than images.
“We were a long way from discovering the secrets of the algorithm, but Facebook attacked us anyway. It shows that they either have something to hide or are concerned about what could threaten their partial monopoly,” Kayser-Bril told YLE.
Added to this is last week’s revelation that Facebook knows very well how harmful its platforms are. Based on internal documents, the Wall Street Journal revealed that Facebook is well aware that Instagram damages the self-esteem of teenage girls and that the platform has been linked to suicide attempts. Yet the company never shared those findings with the world until the newspaper made them public.
In short, it still seems that Facebook prefers not to have independent watchdogs on its platform who use research and facts to expose abuses around disinformation or political bias in the algorithm. The company knows that much goes wrong internally and stays silent about it. And if data does reach researchers, it is treated as a mistake.