We are now all too familiar with echo chambers (ECs) and how they can produce people who are convinced of things that are demonstrably untrue. We all have our own theories about why people hold mistaken views, and these theories often boil down to “people are stupid”. Indeed, much of the psychological literature does blame cognitive biases and errors in thinking for people’s fallacious conclusions (1).
However, recent research from academics at Oxford University and UCL proposes that the fault might not lie with the individuals after all, but with the networks themselves (2). The team has built a Social Network Simulator to see how agents sort themselves around different beliefs and into binary groups of opinion.
What they have been able to demonstrate is that even perfectly rational and honest actors can form themselves into ECs, as people seek out opinions within a given range of their own and then prune from their social network anyone they deem to be wrong. All rational agents start with a level of uncertainty about their beliefs and a corresponding range of opinions they are willing to consider, given what they already know about the world. As their confidence increases, their tolerance for views outside their own narrows, so a view they would once have been open to comes to seem beyond the pale. Crucially, this is not a function of cognitive errors or ignorance, since all actors have the same cognitive ability and access to the same information. It is a product of the network itself, and it can happen regardless of how open-minded or well-informed the individuals are (3).
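The dynamic described above can be sketched as a toy bounded-confidence simulation. This is not the authors’ actual Social Network Simulator: the agent count, averaging update rule, and confidence decay rate below are illustrative assumptions, chosen only to show how pruning plus rising confidence can fragment a fully connected network into like-minded clusters.

```python
import random

def simulate(n_agents=50, n_steps=100, initial_tolerance=0.5,
             decay=0.98, seed=0):
    """Toy bounded-confidence model: each agent averages the beliefs of
    neighbours within its tolerance, prunes the rest, and grows more
    confident (narrower tolerance) every step."""
    rng = random.Random(seed)
    beliefs = [rng.random() for _ in range(n_agents)]   # opinions in [0, 1]
    tolerance = [initial_tolerance] * n_agents          # acceptable opinion range
    # start fully connected: everyone initially listens to everyone else
    neighbours = [set(range(n_agents)) - {i} for i in range(n_agents)]

    for _ in range(n_steps):
        new_beliefs = []
        for i in range(n_agents):
            # prune contacts whose views now fall outside i's tolerance
            neighbours[i] = {j for j in neighbours[i]
                             if abs(beliefs[j] - beliefs[i]) <= tolerance[i]}
            # rational, honest update: average over remaining contacts
            kept = [beliefs[j] for j in neighbours[i]] + [beliefs[i]]
            new_beliefs.append(sum(kept) / len(kept))
            tolerance[i] *= decay  # confidence rises, tolerance shrinks
        beliefs = new_beliefs
    return beliefs, neighbours
```

Running `simulate()` leaves each agent connected only to others whose beliefs it still tolerates; the key point the study makes is that no agent here is biased or dishonest, yet the pruning rule alone is enough to harden the network into closed clusters.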
What’s more, the study also suggests that the bigger the network, the bigger the problem: the larger the network, the more people there are to reinforce your pre-existing opinions, and so the further from the truth you can drift.
As the7stars’ own research with Newsworks shows, only 35% of users understand that the news they see on Facebook is matched to their profile (4). It’s reasonable to assume this naivety increases confidence in the news users are exposed to, since they do not recognise its targeted nature.
As the authors of the study point out, these findings make an even stronger case for social network owners to build their systems in ways that mitigate echo chambers and preserve people’s tolerance for new information (5).
Picture: Wonder net image of fake news spreading on Twitter – “the host of nodes on the head of the mushroom represent all the bots created during the Pizzagate scandal, all of which target a single giant node in the middle of the map – an influential person, who then slowly begins to believe the bots and spread the fake news out into the real-news ecosystem.”
1, 2, 3, 5: Jens Koed Madsen, Richard M. Bailey & Toby D. Pilditch, “Large networks of rational agents form persistent echo chambers”, Scientific Reports, 8, Article number: 12391 (2018). https://www.nature.com/articles/s41598-018-25558-7
4: The7stars + Newsworks, ‘Pop Goes The Filter Bubble’ research, 2017. https://www.newsworks.org.uk/News-and-Opinion/popping-the-filter-bubble-will-improve-the-online-brand-experience-