
Filter Bubbles on the web – 4 things you can do about it

Hardly any international election has been followed as closely in Germany as the 2020 U.S. presidential election. Donald Trump, who was running for re-election, divided opinion, and not only in his own country. Some wondered: how can views differ so sharply when they are seemingly based on the same available information? The simple answer is that in our digital world, we don’t all see the same facts; we live in so-called filter bubbles.

Trump’s Inauguration (left) compared to Obama’s (right) | Source: The Guardian

Donald Trump’s very first day in office began with a lie about the size of his inauguration crowd. When his administration was later confronted with a correction, the false claim was simply defended as “alternative facts”. For the first time, a new kind of public debate had been put into words: false claims are legitimized as a means of public discourse, even against clear evidence. We experience this particularly often in the digital space, and Trump’s actions and use of social media during the election campaign fueled the debate about regulating these platforms.

What are Filter Bubbles?

Internet activist Eli Pariser coined the term filter bubble in his 2011 book of the same name. He had noticed that posts from his more conservative Facebook friends appeared in his feed far less often than those from his liberal contacts. He used the term to describe the phenomenon in which social media and search engines selectively filter the information presented to each user.

What Influence does Social Media have?

Social media are not the sole reason for the polarization of society, but they do enable group formation: better than ever, they let us find like-minded people. According to the theory of digital tribalism, people naturally form groups while excluding outsiders. At the same time, people search the web specifically for information that confirms their own identity, a phenomenon also known as the echo chamber effect. Social networks and their algorithms reinforce this. To hold users’ attention for as long as possible, the platforms show us primarily those people and posts that coincide with our own views, while dissenting opinions gradually disappear from view.
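To make this mechanism a bit more tangible, here is a minimal, deliberately simplified sketch of engagement-driven feed ranking: posts are scored by their similarity to what a user has clicked on before, so content from the other side of the spectrum sinks to the bottom of the feed. All profiles, posts, and numbers are invented for illustration; real platforms use far more complex, undisclosed models.

```python
import numpy as np

# Toy model: a user profile and each post are vectors over a handful of topics,
# e.g. [politics_left, politics_right, sports, tech].

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two topic vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_feed(user_profile: np.ndarray, posts: dict) -> list:
    """Order posts by similarity to what the user engaged with before."""
    return sorted(posts, key=lambda title: cosine(user_profile, posts[title]), reverse=True)

# Hypothetical user who has mostly clicked on left-leaning political content so far
user_profile = np.array([0.9, 0.1, 0.3, 0.4])

posts = {
    "Commentary that matches the user's political views": np.array([1.0, 0.0, 0.0, 0.1]),
    "Well-argued piece from the opposite camp":           np.array([0.0, 1.0, 0.0, 0.1]),
    "Neutral tech news":                                   np.array([0.1, 0.1, 0.2, 1.0]),
}

for title in rank_feed(user_profile, posts):
    print(title)
# The dissenting piece lands at the bottom – and over time drops out of the feed entirely.
```

Note that the objective here is engagement, not balance: nothing in such a ranking rewards showing the user something they might disagree with.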

While misinformation is relatively harmless on its own, in combination with the effects described it becomes a considerable danger to opinion formation. Algorithms make it difficult to question claims critically when users are only provided with one-sided sources of information. If we only receive news that confirms our own “facts,” we have no chance to correct misinformation and remain trapped in the filter bubble. As a consequence, false beliefs can take hold, and social media can cause people to disconnect from reality to the point where facts lose their credibility.

What can you do about Filter Bubbles?

  1. Obtain information offline: As we have seen, filter bubbles are a problem of the digital environment. So, where possible, turn to analog resources such as books or magazines when researching a topic. This may be the most drastic and costly step, but it is also the most effective. Of course, there are also things we can do online to escape filter bubbles.
  2. Read articles directly on the source site rather than on social media: Look up your favorite news portals directly instead of waiting for social media to suggest their articles to you. The goal of Facebook, Instagram, and co. is to keep you on their platform for as long as possible, and that rarely happens by suggesting posts from news portals you might not like. So check various news portals at regular intervals and skip the detour via social media (see the sketch after this list).
  3. Stay logged out when searching for information and delete cookies: If you want to research a specific topic, it is best not to browse while logged in, neither with your Google account nor on the news sites. If possible, alternate between different browsers, such as Google Chrome, Mozilla Firefox, or Opera. Let’s not kid ourselves: cookies still allow personal information to be passed between sites, so a portal may already have a good idea of who is visiting. But we don’t have to make it easy for them by arriving with a name tag on.
  4. Find out what the major providers know about you: Many people know the General Data Protection Regulation (GDPR, in Germany DSGVO) mainly because every page now asks them to accept or reject cookies. It also means, however, that every provider must disclose the information collected about a person and delete it on request. Check regularly with your providers to see what data they have accumulated about you and what they use it for.
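As a small illustration of point 2, the following sketch pulls the latest headlines straight from publishers’ RSS feeds instead of relying on a social feed to surface them. The feed URLs are placeholders for whichever news portals you actually read, it assumes a standard RSS 2.0 layout, and it uses only Python’s standard library.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder feed URLs – replace them with the RSS feeds of the portals you read.
FEEDS = [
    "https://example-news-portal.com/rss",
    "https://another-newspaper.example/feed.xml",
]

def latest_headlines(feed_url: str, limit: int = 5) -> list:
    """Fetch an RSS feed directly from the publisher and return its newest headlines."""
    with urllib.request.urlopen(feed_url, timeout=10) as response:
        tree = ET.parse(response)
    # Standard RSS 2.0 layout: <rss><channel><item><title>...</title></item></channel></rss>
    titles = [item.findtext("title", default="") for item in tree.iter("item")]
    return titles[:limit]

for url in FEEDS:
    print(url)
    for headline in latest_headlines(url):
        print("  -", headline)
```

Because an RSS feed is simply chronological, every reader who subscribes to it sees the same headlines, with no personalization in between.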

What do Filter Bubbles do in a Democracy?

For our liberal democracy and public debate, however, a common, fact-based basis for discussion is essential. If this basis is missing because everyone believes only their own (alternative) facts, then arguments are dismissed as illegitimate simply because they differ from one’s own information and views. This polarizes society and produces groups that each believe in their own truth and lose all understanding for one another. The echo chamber effect described above leads like-minded people to reinforce each other’s views online, and even to whip each other up, without any self-reflection taking place. This can quickly lead to radicalization in the digital space.

To counteract these effects and prevent a hyperpolarization of society – such as we are currently seeing in political events in the United States – we must act now. We need to regulate the algorithms of social networks more strongly and demand more transparency on the net. If users have insight into the personalization parameters that let an algorithm decide what content to display, a self-determined decision becomes possible. Google has announced that it will stop personalized tracking from 2022. This would mean that all users in our digital world would see the same facts – at least in theory. It remains to be seen whether social media companies will follow this example.

This is what you should take with you

  • Filter bubbles arise when information and posts are selectively filtered, so that certain content is never displayed to a user.
  • There are various ways to escape such filter bubbles or at least make them less likely.
  • As a society, we need to consider how to deal with algorithms that foster such filter bubbles, even if that was never their primary purpose.

