Myth #22: We all live in filter bubbles.
Myth: Filter bubbles run our whole lives. Political, social and economic problems such as the rise of populism, hate speech, fake news, growing capitalism and even depression are said to be caused by the personalization of search engines and social media platforms, as well as by micro-targeting. Filter bubbles and echo chambers separate users from one another by enclosing each of them in an invisible bubble.
Busted: Conceptually, filter bubbles exist (#21). In 2009, Google began personalizing its search results on the basis of individual user data. In 2011, Eli Pariser claimed that, as a result, no ‘standard’ (common) Google search outcome existed any longer. This is what he called a ‘filter bubble’: a space within search algorithms and social media platforms that uses data to build a personalized ‘bubble’ around every single user. Today, with the rise of large platforms like Facebook, Google, Alibaba and Baidu, models of data accumulation and micro-targeting have become part of a new data-driven capitalism (Srnicek 2016).
Today, Pariser’s filter bubble is invoked to explain diverse social, economic and digital phenomena such as the growth of populism, hate speech, fake news, growing capitalism and even depression. The older concept of the echo chamber, which existed long before the filter bubble and had already been employed by Marshall McLuhan to describe the resonating world of tribal cultures (McLuhan/Norden 1969: 72), is often misused to describe being (consciously or automatically) cut off from dissenting world views on social media platforms. Filter bubbles and echo chambers have thus become blurry concepts that oversimplify different and complex processes of decision-making and public opinion formation.
As numerous studies on public opinion formation have shown, network effects (#41) and other communicative structures arising from relations on social media have a much stronger impact on the formation of public opinion on platforms than algorithmic filtering does (Haim et al. 2018). Empirically, there is no direct proof of an effect of personalized filtering on networks and the formation of public opinion (Krafft et al. 2018), and even the direct effects of the algorithms on personalization itself are trivial (Feuz/Fuller/Stalder 2011). Political segmentation in populist debates on social media is mainly caused by the dynamics of populism itself, by network effects or by social bots, and not directly by filtering algorithms (Dreyer/Schulz 2019; Leistert 2017). Filter bubbles in search algorithms are not the ultimate cause of network effects, hate speech, populism or fake news – they have simply become a metaphor that oversimplifies these complex processes.
Truth: Filter bubbles do not run our lives. Personalized algorithmic filtering is not the cause of public opinion formation and has only trivial effects on the results of major search engines. The concept mainly serves as a metaphor that reduces the complexity of social, economic and technological dynamics on platforms and in public debates, and it is of little value beyond that.
Source: Mario Haim, Andreas Graefe, Hans-Bernd Brosius, Burst of the filter bubble? Effects of personalization on the diversity of Google News, Digital Journalism (2018) 6 (3), 330–343; Martin Feuz, Matthew Fuller and Felix Stalder, Personal Web Searching in the Age of Semantic Capitalism: Diagnosing the Mechanisms of Personalisation, First Monday (2011) 16 (2), doi:10.5210/fm.v16i2.3344.