Filter Bubbles and Ethics in Technology

From New Media Business Blog


Filter Bubbles

Eli Pariser popularized the term "Filter Bubble" in his TED Talk [1]

A filter bubble is a situation in which an Internet user encounters only information and opinions that conform to and reinforce their own beliefs, caused by algorithms that personalize an individual’s online experience[2]. The term was coined by Eli Pariser in 2010[3]. Algorithms and personalized search results contribute to a state in which users become isolated from viewpoints and opinions that differ from their own, owing to their constant interaction with like-minded people and with information that conforms to their existing beliefs[4].

Algorithms


An algorithm in a social media sense is a set of rules or calculations used to deliver certain content to the user[6]. Nearly all social media platforms use algorithms, including Facebook, Instagram, Twitter, Snapchat[7], and Reddit.
Before algorithms, posts appeared in a user’s feed in reverse chronological order, with the most recent posts first. The switch to algorithmic feeds was initially received quite negatively, but over time, because users could see and interact with their friends’ and families’ posts more often, they actually ended up spending more time on each app. This benefits social media companies: the longer a user is on the app, the more advertisements the user sees and interacts with, and the more money the company makes.
Some key factors in determining the ranking of posts include (a toy Python scoring sketch follows the lists below):

  • Interest – Based on the content you have viewed, liked, or commented on, plus the users you follow, algorithms decide which posts a user should see first
  • Recency – Seemingly obvious, algorithms prioritize timely posts over posts that are a few days old
  • Relationship – Your relationship with the user who posted the content contributes to which posts the algorithm will show you – if a user frequently interacts (by liking, commenting, sharing) with another user’s content, this user’s content will be shown near the top of the feed[8]

Secondary factors include:

  • Frequency – how often a user visits the app, since algorithms will prioritize posts the user has not yet seen
  • Following – how many people a user follows impacts how often they will see multiple posts from the same user
  • Usage – The more time a user spends on an app, the more variety of content the user will see[9]
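
To make these factors concrete, here is a toy ranking function that combines interest, relationship, and recency into a single score. The weights and the recency formula are hypothetical illustrations, not any platform's actual algorithm.

    from datetime import datetime, timezone

    # Toy feed ranking: combine interest, relationship, and recency.
    # The weights below are hypothetical, chosen only for illustration.
    def rank_posts(posts, now=None):
        now = now or datetime.now(timezone.utc)

        def score(post):
            hours_old = (now - post["posted_at"]).total_seconds() / 3600
            recency = 1.0 / (1.0 + hours_old)         # newer posts score higher
            return (2.0 * post["interest"]            # inferred topical interest, 0..1
                    + 1.5 * post["relationship"]      # strength of tie to the author, 0..1
                    + 1.0 * recency)

        return sorted(posts, key=score, reverse=True)  # highest score appears first

    posts = [
        {"posted_at": datetime.now(timezone.utc), "interest": 0.2, "relationship": 0.1},
        {"posted_at": datetime.now(timezone.utc), "interest": 0.9, "relationship": 0.8},
    ]
    print(rank_posts(posts)[0])  # the high-interest, close-relationship post wins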

Echo Chambers

An echo chamber reinforces one's own opinions [10]

An echo chamber is a phenomenon that occurs due to filter bubbles, in which a person only encounters beliefs or opinions that coincide with their own, which results in alternative ideas and viewpoints not being considered[11].
The Internet has increased access to all kinds of different information, but this wide access may lead to selective exposure to channels that support a user’s ideologies[12].
The Wikipedia page on echo chambers gives an important example of this:
"Participants in online discussions may find their opinions constantly echoed back to them, which reinforces their individual belief systems. However, individuals who participate in echo chambers often do so because they feel more confident that their opinions will be more readily accepted by others in the echo chamber. This happens because the Internet has provided access to a wide range of readily available information. People are increasingly receiving their news online through nontraditional sources, such as Facebook, Google, and Twitter, that have established personalization algorithms that cater specific information to individuals’ online feeds."[13]
It is easier for people to have their beliefs validated than to have them challenged. Unfortunately, the more time a person spends on social media sites, the more accurately the algorithm can predict what they want to see – which is most likely content that conforms to their own beliefs.
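
This feedback loop can be illustrated with a minimal simulation. The sketch below is purely illustrative (not any platform's model): the feed learns the user's position on an opinion axis while the user drifts toward the content the feed serves, so the feed's predictions grow more accurate over time.

    import random

    user_belief = 0.2      # the user's position on a -1..+1 opinion axis
    feed_estimate = 0.0    # the algorithm's estimate of that position

    for _ in range(50):
        shown = feed_estimate + random.uniform(-0.1, 0.1)     # content near the estimate
        user_belief += 0.1 * (shown - user_belief)            # user drifts toward the feed
        feed_estimate += 0.5 * (user_belief - feed_estimate)  # feed tracks the user

    # The two values converge: the feed predicts the user well, and the user
    # mostly sees content that matches their own position.
    print(round(user_belief, 2), round(feed_estimate, 2))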


Impact on Society

Businesses

Filter bubbles can also bring benefits, such as personalized algorithms that tailor business marketing to specific users. Businesses leverage filter bubbles so that their products are marketed to demographics of users likely to align with the product or brand. Automated systems gather past search and browsing data across platforms such as YouTube, Amazon, and other websites containing marketable content. The main purpose is to drive website traffic, and advertisers can target a specific geography. This saves cost, time, and resources for companies trying to get their products out. Viewed this way, filter bubbles are actually an effective tool for businesses today, helping to promote growth in the economy. Marketing to and capturing customers through algorithms allows businesses to grow their brand and sell their products effectively based on the data they capture. When used properly, algorithms that learn specific patterns, such as consumer buying behaviour, can help a business grow and scale effectively.

Facebook’s feed algorithm is an example of one that businesses leverage on social media. The following are factors that businesses need to consider in order to drive traffic (a hypothetical scoring sketch follows the list): [14]

  • Comments: Content that will initiate conversation among users
  • Reactions: Users should be active in "reacting" to posts to drive numerical statistics
  • Comment replies: Ongoing conversation
  • Shares on Messenger: Easier to share content when messages are sent directly
  • Engagement on shares: Demonstrate the popularity of posts
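
As a rough illustration, these signals could be folded into a single engagement score as below. The weights are hypothetical, not Facebook's actual values; the point is only that conversation-driving signals tend to be weighted more heavily.

    # Hypothetical engagement score built from the signals listed above.
    def engagement_score(post):
        return (3.0 * post["comments"]            # conversation-starting content
                + 2.0 * post["comment_replies"]   # ongoing conversation
                + 2.5 * post["messenger_shares"]  # direct shares on Messenger
                + 1.5 * post["share_engagement"]  # engagement on shared copies
                + 1.0 * post["reactions"])        # plain reactions count least

    print(engagement_score({"comments": 12, "comment_replies": 5,
                            "messenger_shares": 3, "share_engagement": 7,
                            "reactions": 40}))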

The above criteria build rapport for businesses and drive engagement on their pages. When algorithms are tailored toward consumers, people gain exposure to different pages based on reviews and engagement. Businesses now leverage the power of algorithms to deliver the specific types of information that meet consumer demand, creating benefits on both sides of a marketplace exchange.

Politics

These effects have been called “computational propaganda”: people on the internet are swayed by the type of political content they see through their social media feeds or news outlets. [15] The internet has also encouraged young people to be lazy with their education and to default to a “quick Google search.” In reality, the majority of content from differing political sources rarely appears in front of users, and this skew can swing political votes.

Not only does the system create a filter bubble around political articles by default, but politicians themselves can leverage social media platforms to their advantage. Donald Trump reportedly used bots and “fake fans” to boost the popularity of his Twitter and Facebook accounts. [16] From the clout he built, other users were artificially persuaded to support him, believing there was genuine ongoing political support for his stance in the election.

Democracy is also affected, since democracy requires that everyone be equally informed about the candidates they are voting for. [17] Without context and a full view of how politics works in the digital age, people are misinformed by the information they take in about candidates on social media. Behind the scenes, politicians leverage the power of the internet to build an even greater following based on their internet clout rather than on their capabilities as representatives of the country.


Impact on People: Pros/Cons

Awareness and education are key to limiting the effects of the filter bubble on how modern users consume the internet. It is often noted that “people are more likely to consume news sources that confirm their ideological beliefs.” [18] People are increasingly driven by consumerism, as filter bubbles promote a cycle of purchasing more of what they already desire and seek, within reach of a few clicks. Filter bubbles can negatively affect “news” by limiting and biasing the information consumers read. It is up to people to educate themselves about the natural existence of filter bubbles and to take steps to explore news sources that differ in viewpoint.

A side that is not often addressed is the ways in which filter bubbles can be beneficial. One of their strongest benefits is the ability to connect like-minded “creatives” on platforms that serve a variety of different art. People can leverage the filter bubble to find communities, blogs, or portals that share the same or similar passions. When the system knows the type of work a user is interested in, this tailoring can be effective for surfacing job postings related to the content people are already viewing and researching. This accelerates the process and even highlights content people would never have found had the algorithm not provided it. In a world where people have recognized the problems of the filter bubble, there are also more ways to leverage the side of it that connects similar people.

The filter bubble has become a natural part of the evolution of technology and convenience. People have always wanted society to keep evolving, and technology has become a medium through which educational growth can occur. Many people today leverage the internet and popular search engines to learn more about topics they are passionate about. There will never be a perfect system that satisfies everyone, so the filter bubble effect will need to be accepted on both ends of the spectrum.

Pros

Filter Bubble use [19]
  • Connect with other like-minded individuals
  • Better marketing for businesses
  • Reduced search times for products or education online
  • Convenient and fast access to information
  • Gain insight into customer demographics

Cons

  • Exposes users only to information they are already familiar with
  • Unethically leveraged by politicians for clout
  • Allows businesses to take advantage of customers by “selling” products
  • “Fast food” education
  • Encourages a consumerist culture

Case Studies

This section looks at two studies conducted with regards to filter bubbles and their effects.

Google Searches Study

Google has claimed to have fixed its search algorithm to minimize the filter bubble effect on its platform; a 2018 study by DuckDuckGo put this claim to the test. [20]

The study asked 87 volunteers to search for the following terms on Sunday, June 24, 2018 at 9 PM (ET):

  • “Gun control”
  • “Immigration”
  • “Vaccinations”

Result #1: Users saw unique results (even in incognito mode)

Google study actual search results [21]

Most of the links that appeared differed among participants, and even links common to several testers were arranged in different orders. The study notes that the order in which links appear matters: the first link is roughly twice as likely to be clicked as the one immediately after it. Users also found no difference in search results between browsing in normal mode and in incognito mode.

Result #2: Only some users were provided certain links

When users searched for “gun control,” they collectively received 19 domains, ordered in 31 different ways. Once again, not only did the links themselves differ, but the arrangements in which they appeared varied as well. Users within this test group were given notably different sets of domains.
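
A minimal sketch of how such variation can be quantified from collected result lists is shown below; this illustrates the idea only, and is not the study's actual code or data.

    # Count distinct domains and distinct orderings across participants.
    results = {
        "participant_1": ["site-a.com", "site-b.org", "site-c.net"],
        "participant_2": ["site-b.org", "site-a.com", "site-d.gov"],
        "participant_3": ["site-a.com", "site-b.org", "site-c.net"],
    }

    domains = {d for links in results.values() for d in links}
    orderings = {tuple(links) for links in results.values()}
    print(len(domains), "distinct domains appeared across participants")
    print(len(orderings), "distinct orderings were observed")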

Result #3: News and Videos show variance in results

Not only were the top domains arranged differently; the news and video results also appeared in widely varying arrangements. Users pay less attention to these sections of the results page, but the filter bubble effects there are the same.

Result #4: Incognito mode provided almost no filter bubble protection

It is a misconception that private browsing prevents tracking: websites can use IP addresses and other signals to track people, regardless of the browsing mode they use.

Overall, this case study provided evidence that no matter what companies may “communicate” to the public about their plans to remove the filter bubble, people ultimately remain stuck within the system. Most studies show that the filter bubble affects all media platforms, regardless of the steps taken to "pop" the bubble.

Using The Filter Bubble to Create a Teachable Moment

(This section, unless otherwise stated, is entirely cited from [22])
This case study is based on an article by Allyson Valentine and Laura Wukovitz, titled Using The Filter Bubble to Create a Teachable Moment.
In spring 2012, teaching librarians (qualified teachers and librarians who improve information skills and information resources[23]) at York University decided to make a radical change to the curriculum of their information literacy classes, after receiving negative feedback from students who noted that they did not find the course content particularly useful or engaging.
Previously, the course content was based on the course’s electronic text, among other supplementary materials, one of which was Eli Pariser’s TED Talk, “Beware Online Filter Bubbles.” Pariser coined the term filter bubble and wrote a book on the topic in 2011. Students particularly enjoyed this material, and when the curriculum was changed, the course text was switched from the electronic text to Pariser’s book.
The teaching librarians aimed to make the content relevant to students’ daily lives. They wanted to teach students how to evaluate information and research current events, and to inform them about information ethics, all while ensuring that the way they delivered this information made its importance clear.

Literature Review

Despite consistently high ratings of the instructors’ performance in course evaluations from Fall 2009 to Fall 2010, only 50% of students rated the course as being of overall value to them.
Teaching librarians found that a major issue in increasing engagement was that it was difficult for students to relate to the material. Students had no context, and thus the instructors aimed to create a “real-world” experience to pique their interest.
Furthermore, the speed with which we can find information creates bad habits: students are very likely to use the first or second source shown to them when searching. The authors refer to this as “‘googlitis’, [which] describes an over reliance on simplistic search techniques using Internet search engines and the extension of these poor searching skills to the use of library resources”.

Using The Filter Bubble in the Classroom

As was previously mentioned, the course text was changed to Eli Pariser’s The Filter Bubble. The book examines the personalized content that is now overabundant on the internet. As noted in the Google case study above, virtually all search engines (unless specifically designed otherwise) use an algorithm, making the information shown to each person different and biased toward information the user has looked at before. Pariser focuses on how this creates a unique filter bubble for each of us, which can be damaging when it limits exposure to information and ideas different from those we are used to. Furthermore, he emphasizes that those who create the algorithms do not always have our best interests at heart, and that, since one does not enter a filter bubble voluntarily, these bubbles can contribute to poor decision-making.
Students were asked to keep a reflective journal as they read Pariser’s book, and the instructors found the reflections emotional and deeply engaged with the material. Students began to realize the importance of the topics being taught and were able to connect the material to other coursework.

Valuing Viewpoints

Instructors chose to use Facebook as a tool to teach students about web personalization, due to its relevance. Students learned about Facebook’s personalization algorithm, “EdgeRank”, and how it affects whose posts appear in their newsfeeds. Upon learning that Facebook may be hiding posts from friends with differing viewpoints, students began to advocate for the value of exposing oneself to opposing beliefs.
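
EdgeRank was publicly described as summing, over every “edge” (like, comment, and so on) attached to a story, the product of user affinity, edge weight, and time decay. The sketch below mirrors that published form; the exponential decay curve, the numbers, and the field names are illustrative assumptions, not Facebook's actual values.

    import math
    import time

    # Toy EdgeRank-style score: sum of affinity * edge weight * time decay.
    def edgerank(edges, now=None):
        now = now or time.time()
        return sum(
            e["affinity"]                                   # closeness of viewer and creator
            * e["weight"]                                   # e.g., a comment outweighs a like
            * math.exp(-(now - e["created_at"]) / 86400.0)  # older edges count less
            for e in edges
        )

    story_edges = [{"affinity": 0.8, "weight": 2.0, "created_at": time.time() - 3600}]
    print(edgerank(story_edges))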

Confirmation Bias and Cognitive Dissonance

In his book, Pariser describes confirmation bias as “a tendency to believe things that reinforce our existing views, to see what we want to see” (p. 86). Learning theories suggest that development occurs when people expose themselves to different thought patterns. Students may instead choose to reject new information rather than challenge themselves to alter their thinking: development is not inevitable.

Web Personalization vs. Relevancy Ranking

Students compared their Google results with their classmates’ as part of an in-class activity. They were asked to hypothesize why results might differ for each student, taking into consideration their personal viewpoints. In addition to their results, students were also asked to examine the ads shown to them.
Students were made aware of the close ties between personalization and advertising, and they noticed that the ads they were shown were directly related to their web browsing history and what they interacted with on Facebook. A student commented: “Pariser brings up a good point when he says that Google puts us in our own filter bubbles, alone. We don't make the choice to have our searches filter to something we would ‘rather see.’”
The students’ expectations for research using search engines or databases changed: they began to expect results specifically tailored to them. An interesting point was raised: when a student starts researching a new topic, algorithms struggle to provide the same degree of personalization. Academic databases have also begun to present a single “search box” like that of a search engine, instead of multiple advanced filters, because of students’ comfort with these simplistic interfaces.

Evaluating Sources and Using Rhetorical Analysis

Mastering the skills of critically evaluating information and its sources is difficult for students. The instructors quote Salibrici (1999), who indicates that “reading and writing should be ‘critical’ activities that push students beyond the stage of comprehension and interpretation to a higher level of evaluation or critical consciousness …”. To teaching librarians, this combination of literacy, comprehension, and evaluation is nearly impossible without critical evaluation of sources.
Teaching librarians introduced the topic of critical evaluation of content students find online by using a rhetorical triangle as a framework. This framework allowed students to represent the interdependent relationships of author, purpose and audience.
This all contributed to students’ general knowledge and critical thinking skills relating to filter bubbles, how to avoid them, and their general awareness of how they receive information.

Conclusions

The instructors had only one semester (spring 2012) to explore the impact of their new curriculum. Although past reviews of the information literacy courses had been mixed, with many students unable to see the relevance of the course content to their everyday lives, at the end of the spring 2012 semester students praised the material. They considered it engaging, relevant, and even “enlightening”.
Students enjoyed learning about web personalization, although at times they found the implications of living in such a personalized filter bubble disturbing.
The instructors succeeded in providing context to the material, and taught in ways that made students understand and connect to the content.

Ethical Implications of Filter Bubbles

As the definitions above state, filter bubbles are built and designed specifically for each user. This creates an environment that users are more likely to enjoy and return to. As this environment persists, echo chambers form and the user’s existing thoughts and opinions are reinforced.

Companies have taken advantage of filter bubbles for years. Their algorithms use data from all aspects of a user’s behaviour to tailor a newsfeed or environment for that user; in turn, users spend more and more time on the company’s platform. Modern social media companies in particular are known for exploiting this technology. The business model of most social media companies revolves around selling advertisements by leveraging a large network of users. By providing a (typically) free platform and engaging features, social media companies attract users of all ages. The more time users spend on the platform, the more opportunities there are to sell ads and generate revenue from those ad sales. This gives companies an incentive to keep users on their platforms as long as possible. This can prove mutually beneficial: by providing relevant information to users at the right times, platforms can perhaps help users make better decisions.

However, this can also be quite detrimental. When the purpose is simply to keep users on the platform, companies often do not care about the impact of their actions. For example, a recent movement of people who believe that vaccines cause more harm than good has been dubbed “anti-vaxx” (short for anti-vaccination), and its popularity is thought to have been amplified by the rise of news consumption through social media. One unofficial study in 2018 created a brand-new YouTube account and searched for vaccination videos. [24] After one click on a video spreading false information about the supposed harms of vaccination, the YouTube algorithm recommended more and more “anti-vaxx” videos. The overwhelming scientific evidence indicates that the claims of the “anti-vaxxers” (those in the “anti-vaxx” movement) are false, yet this study shows that platforms such as YouTube continue to propagate potentially misleading information.
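
The dynamic described in that study can be sketched with a toy content-based recommender (this is not YouTube's system; the titles and tags are invented): items are ranked by tag overlap with what the user has already watched, so a single click tilts all future recommendations toward similar content.

    videos = [
        {"title": "Vaccine science explained", "tags": {"vaccines", "science"}},
        {"title": "Hidden vaccine dangers?",   "tags": {"vaccines", "conspiracy"}},
        {"title": "More vaccine 'truths'",     "tags": {"conspiracy"}},
    ]

    watched_tags = set()

    def recommend():
        # Rank videos by overlap with tags the user has already watched.
        return sorted(videos, key=lambda v: len(v["tags"] & watched_tags), reverse=True)

    watched_tags |= videos[1]["tags"]          # the user clicks one misleading video
    print([v["title"] for v in recommend()])   # similar content now ranks first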

Along with numerous other examples, this shows that social media companies often have not recognized, or have not cared about, the adverse impacts of their filter bubbles on users and society. For a further perspective on the ethical impacts of filter bubbles, a video of Bill Gates discussing the negative effects of curated news environments can be found at the following link: https://qz.com/913114/bill-gates-says-filter-bubbles-are-a-serious-problem-with-news/

It is also important to realize that this growing trend of filter bubbles may continue to worsen. More and more people are spending a proportionately large amount of their time on social media, and many are using it as a source of news. A 2018 study cites a survey in the Reuters Institute Digital News Report, based on 53,000 news consumers in 26 countries, which found that 23% of respondents use online (digital) channels as their main news source, while another 44% consider digital and traditional sources equally. [25] Consumer trends clearly indicate an increased dependence on social media as a main news source. The study also indicates that, when consuming news online, roughly 40% of respondents discover news through search engines and approximately 33% via social networking sites. [26] These search engines and social media sites curate filter bubbles for each user, unlike traditional news sources, which present the same content to everyone. The potential negative effects of increased personalization will be amplified if current trends of consuming news through social media and search engines continue.

What is the Purpose of a Corporation?

When thinking about the concept of filter bubbles and the implementation of this technology in today’s business environment, it is helpful to step back and come to an understanding of the purpose of a corporation. What should a company be responsible for? What ethical or normative rules should corporations follow? What is the purpose of a company?

By looking to answer these guiding questions, one can better come to a conclusion on the use of filter bubble technology and its ethical implications. Two normative theories for companies will be discussed: the Shareholder Theory and the Stakeholder Theory.

Shareholder Theory

Milton Friedman [27]

The Shareholder Theory, or the Friedman Doctrine (also known as Shareholder Primacy), is a theory of business ethics that describes how a company ought to act. [28] Popularized by economist Milton Friedman, the shareholder theory argues that the main goal of a company is solely to maximize profits in order to increase shareholder value. As long as a company stays within the confines of the law, its managers are morally obligated to continue pursuing profits regardless of any other impacts. The theory also states that the company has “no social responsibility to the public and society”, a view Friedman himself justifies:

"In a free-enterprise, private-property system, a corporate executive is an employee of the owners of the business. He has direct responsibility to his employers. That responsibility is to conduct the business in accordance with their desires...the key point is that, in his capacity as a corporate executive, the manager is the agent of the individuals who own the corporation...and his primary responsibility is to them."[29]

While this theory may seem selfish, Friedman argues that it is up to shareholders to decide which socially responsible initiatives they would like to invest in. The responsibility is placed on individuals rather than on corporations.

Criticism

Criticism of the shareholder theory is wide-ranging, including claims that “it is financially wrong, economically wrong, legally wrong, socially wrong, or morally wrong”. [30] One of the main arguments against Friedman’s theory centres on the concept of the shareholder. In her book The Shareholder Value Myth, author Lynn Stout raises numerous objections, the primary one being the definition of “shareholder”. The shareholder theory assumes that all shareholders value share price above all else and that profits drive that value. However, this assumption is not true. Stout argues that a deeper analysis shows most shareholders are actually investing for the long term (e.g., retirement funds or long-term investments) rather than being the supposed short-term traders who intend to profit through active trading. [31] These long-term investors are more concerned with the sustainability of their investments. A company’s actions may maximize profits in the short term while being damaging in the long term. Shareholders therefore cannot be grouped as one, and it cannot be claimed that maximizing short-term profits is what all shareholders desire.

Stakeholder Theory

The stakeholder theory is another normative theory of business ethics; it agrees that shareholders should be valued highly, but holds that companies must also account for a multitude of stakeholders: those who affect, and are affected by, the company. [32] These groups of interest include shareholders, employees, suppliers, customers, local communities, and more. Originally detailed by philosopher R. Edward Freeman, the stakeholder theory aims to balance the differing interests of stakeholders and maintain the overall health of the organization, which means looking more at the long-term sustainability of the business. [33]

Criticism

While the stakeholder theory may sound appealing, it is often impractical in real scenarios. One main criticism is that it is simply impossible to balance the opposing interests of all parties (or even the major ones): pursuing profits for shareholders while lowering costs for customers, for instance, is contradictory. It is also difficult to determine how many stakeholders, and which ones, to account for; a business’s actions may affect millions of people, and managing all stakeholders is improbable. In addition, the process can be time-consuming and costly.[34]

Applying the Theories to Filter Bubbles

Understanding these two approaches to business ethics can give one a deeper grasp of the business use of filter bubble technology.

Under the shareholder theory, corporations are incentivized to maximize profits, and therefore to maximize user time on their platforms, since user time directly translates into ad impressions and revenue. Companies are not only incentivized to build filter bubbles; on this view, they are morally obligated to do so. Social media platforms and search engines must continue to feed users what they “want” to see and keep using algorithms to tailor content. As previous examples have shown, this can strengthen unhealthy opinions and create echo chambers, leading to potential harm to society. Under this theory, however, companies are simply not responsible for these negative effects as long as they stay within the law.

Looking at filter bubble technology through a stakeholder lens tells a different story. Social media companies must begin to ask themselves hard questions before freely pursuing profits. In considering the long-term health of the organization and the interests of various stakeholders, companies must investigate the effects of echo chambers. As indicated previously, the rise of “anti-vaxx” can be partly attributed to the increasing use of social media as a medium that quickly spreads disinformation, a spread perpetuated in part by social media companies themselves. Under the stakeholder theory, companies would have to re-evaluate their role in this unhealthy process. Additionally, social media companies would need to care more about the opinions of employees and customers, as these can affect the company negatively in the long run.

Businesses Take Action

Some large companies have realized that they are more than profit-pursuing entities: they are global stakeholders with a vested interest in, and a duty to, society. These companies have recognized the impact of filter bubbles and the spread of disinformation, and have begun to take action.

Twitter


One such example is Twitter. In an effort to reduce confusion for new users, and indirectly to attempt to break the filter bubble, Twitter announced in 2019 that it would implement a new feature on its platform: the ability for users to follow specific topics rather than simply getting tweets from the accounts they follow. [36] This allows users to diversify their information across a variety of sources rather than only those that have appealed to them before. The idea is to give users many points of view on a given topic.

Twitter is launching this feature with around 300 topics, such as sports, entertainment, and media. For now, however, the company is avoiding political topics such as gun control and immigration. This will allow Twitter to test the feature’s impact on more neutral topics before deciding whether political information will also be included.

Instagram


Some social media companies, such as Instagram, have recognized the urgency of combatting the spread of dangerous disinformation. In 2019, Instagram announced increased steps to combat “anti-vaxxer” content and vaccine misinformation. [38] The company, whose platform focuses on sharing photo and video content, intends to ban all hashtags that relate to “anti-vaxx” information. Instagram will review posts with these hashtags and develop its machine learning algorithm to flag such posts before they become too popular.

Other social media and technology companies, including Facebook, [39] Pinterest, [40] and Indiegogo, [41] also recognize the dangers of misinformation.

Prevention of Filter Bubbles

One might choose to avoid filter bubbles out of concern about their potential negative effects. There are many ways to do this, such as employing ad blockers or cookie blockers, or simply seeking out different types of media.

New Viewpoints

Pariser cautions against “overcorrecting” and suggests starting with authors whose views do not vary too much from one’s own[42]. If someone leans right politically, reading far-left articles will most likely do no one any good, and may cause upset and even further confirmation bias. Once comfortable with moderate viewpoints, a reader can venture further toward the other side of the spectrum[43].
Receiving email newsletters is another simple way to incorporate new viewpoints into one’s media exposure[44].

Read Across the Aisle

Read Across the Aisle is an app, launched February 21, 2017, that advertises itself as a Fitbit for your filter bubble[45]. Just as a Fitbit tells a user to get up and stretch, this app reminds them to broaden the viewpoints of the media they consume. A blue-to-red meter tracks how often a user reads left-biased or right-biased information[46].
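
A toy version of such a meter is easy to sketch. This is illustrative only, not the app's actual method: score each article a user reads on a -1 (left) to +1 (right) axis and track the running average.

    # Hypothetical bias scores for articles a user has read, from -1 to +1.
    reading_history = [-0.6, -0.4, -0.7, 0.1, -0.5]

    meter = sum(reading_history) / len(reading_history)
    print(f"Your reading leans {'left' if meter < 0 else 'right'} ({meter:+.2f})")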

Ad Blockers


Using an ad blocker will remove the majority of advertisements from the websites a user visits[48]. Being unable to see ads limits users’ chances of interacting with them, and therefore limits what sites can learn about which ads a user likes, interacts with, or scrolls past.
Ad blockers can also deliver faster loading times and better battery life, resulting in a more seamless browsing experience[49]. Some examples of ad blockers are listed below (cited from [50]), followed by a minimal sketch of how URL-based blocking works:

  • AdBlock
    • One of the most popular options
    • Customizable
    • Easy to install
  • CyberSec by NordVPN
    • Blocks communications from dangerous servers
    • Helpful to those who are already infected with malware
    • Not free
  • CleanWeb by Surfshark
    • Stops unwanted communication by blocked DNS requests from certain servers
    • Not free
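
At their core, most ad blockers work by matching outgoing requests against filter lists. Below is a minimal sketch of that idea; real blockers use far richer rule syntaxes (such as EasyList), and the hostnames here are hypothetical.

    from urllib.parse import urlparse

    BLOCKLIST = {"ads.example", "tracker.example"}   # hypothetical ad/tracking hosts

    def should_block(url: str) -> bool:
        host = urlparse(url).hostname or ""
        # Block the host itself or any of its subdomains.
        return any(host == b or host.endswith("." + b) for b in BLOCKLIST)

    print(should_block("https://ads.example/banner.js"))    # True  -> request dropped
    print(should_block("https://news.example/story.html"))  # False -> request allowed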

Cookies

Cookies are data that a web browser stores for a user, enabling easy logins and saved settings on certain sites[51]. If permission to use cookies is enabled, other sites can access the information the user’s browser stores and use it to shape which content the user sees[52].
To prevent this from occurring, users should delete their cookies frequently, which is quite simple in browsers such as Firefox, Google Chrome, Internet Explorer, and Safari.
Listed below are tutorials for each (a small cookie-parsing sketch follows the list):

  • Firefox[53]
  • Internet Explorer[54]
  • Google Chrome[55]
  • Safari[56]


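To see what such a cookie looks like on the wire, the sketch below parses a hypothetical tracking cookie with Python's standard library; the identifier and domain are made up for illustration.

    from http.cookies import SimpleCookie

    raw = "uid=abc123; Max-Age=31536000; Domain=.ads.example; Path=/"
    cookie = SimpleCookie()
    cookie.load(raw)

    morsel = cookie["uid"]
    print(morsel.value)        # abc123       -- a persistent user identifier
    print(morsel["max-age"])   # 31536000     -- kept for a year unless deleted
    print(morsel["domain"])    # .ads.example -- sent back on every visit there
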
Web History

Although erasing web history is not a perfect way for a user to receive completely unbiased Google search results, it still limits the information Google and other sites can gather about the user. The filter bubble is not only specific to online activity (the information sites gather via web history); an article on searchenginewatch.com notes that “[the filter bubble] also takes into [consideration] personal factors that are not dictated by the individual such as device and location.”[57]

Birthday

A less obvious technique for limiting the information sites can gather from web activity is altering the birthday a user displays online.
Leaving out or changing the year associated with a user’s birthday makes it significantly more difficult for ad companies to target that user based on their age[58].

Awareness

The most effective way for an internet user to prevent filter bubbles is to simply be aware that they exist. If a user is aware of the impact filter bubbles could be having on their daily life, they can take steps towards changing the media they consume, being more open towards new viewpoints, and creating a more balanced environment online.
It is crucial to educate others about filter bubbles. A major concern regarding filter bubbles is the erosion of democracy that stems from a lack of variety in information; if more users are aware of this phenomenon, they can remain conscious of the information they receive, and therefore make more educated and informed decisions.

Conclusion

Overall, the Filter Bubble group for the Fall 2019 semester has examined the technology of filter bubbles, from its pros and cons to its ethical implications and prevention tactics. We think that filter bubbles can provide tremendous value, delivering insightful information to users exactly when they need it and improving quality of life through greater efficiency. Filter bubbles also serve businesses well: they increase the time users spend on a platform by providing a useful service, creating a mutually beneficial relationship.

However, these benefits come with worrying implications, chief among them the ethical questions raised by the continued push of filter bubbles onto unsuspecting consumers. When businesses feed the filter bubble without regard for side effects such as the spread of disinformation, society can be harmed without noticing, as seen in the case of the “anti-vaxx” movement. The creation of echo chambers, along with confirmation bias, must be taken into consideration before filter bubbles are pushed any further.

Businesses are beginning to take notice of these long-term consequences, and some social media companies have taken steps to manage the concerns. Two notable examples highlighted here are Twitter and Instagram, alongside a number of other tech companies moving to ban dangerous misinformation. The trend seems to be toward global companies understanding that many stakeholders are key to the longevity of a business, and taking appropriate steps to manage competing interests.

We have also highlighted key methods for countering filter bubbles, chief among them the awareness to recognize when you might be under their influence. Such awareness while browsing social media builds up your mental defences and critical thinking skills, so that you are not easily swayed when disinformation appears. Awareness can also help you understand conflicting points of view, and searching out alternative opinions may broaden your horizons.

Ultimately, our team believes that this technology can continue to mature into a more balanced form, but we also recognize that filter bubbles are now woven into modern life and are unavoidable. It is therefore imperative for a 21st-century citizen to keep this awareness in mind.


Authors

Clement Chow, Gabriel Lim, Katarina Watt
Beedie School of Business
Simon Fraser University
Burnaby, BC, Canada


References

  1. Eli Pariser. Retrieved from https://www.google.com/search?q=eli+pariser&sxsrf=ACYBGNQDr4ngmhss2LDijedqOhYflLIyiw:1575274874550&source=lnms&tbm=isch&sa=X&ved=2ahUKEwjiu8iaxJbmAhUJs54KHZPmCMEQ_AUoAnoECBAQBA&biw=1280&bih=601&dpr=2#imgrc=SKBfi4YPlxoHPM:
  2. https://www.lexico.com/en/definition/filter_bubble
  3. https://en.wikipedia.org/wiki/Filter_bubble
  4. https://en.wikipedia.org/wiki/Filter_bubble
  5. 1*K2R0_nEeVEfrzLe8yTB59Q.png
  6. https://www.hopperhq.com/social-media-marketing-glossary-2018/algorithm/
  7. https://www.hopperhq.com/social-media-marketing-glossary-2018/algorithm/
  8. https://www.hopperhq.com/social-media-marketing-glossary-2018/algorithm/
  9. https://www.hopperhq.com/social-media-marketing-glossary-2018/algorithm/
  10. Retrieved from https://www.google.com/search?q=echo+chamber&sxsrf=ACYBGNQSMi5xqgcTz-_45u-MZVFPQs3TaQ:1575275478819&source=lnms&tbm=isch&sa=X&ved=2ahUKEwifi9q6xpbmAhVWu54KHaK-AncQ_AUoAXoECBEQAw&biw=1280&bih=552&dpr=2#imgrc=9-0AR_EHw6HEDM:
  11. https://www.lexico.com/en/definition/echo_chamber
  12. https://en.wikipedia.org/wiki/Echo_chamber_(media)
  13. https://en.wikipedia.org/wiki/Echo_chamber_(media)
  14. https://trillioncreative.com/how-algorithm-changes-will-affect-your-companys-social-media-accounts/
  15. https://www.theguardian.com/technology/2017/may/22/social-media-election-facebook-filter-bubbles
  16. https://qz.com/1422395/how-many-of-donald-trumps-twitter-followers-are-fake/
  17. https://fs.blog/2017/07/filter-bubbles/
  18. https://fs.blog/2017/07/filter-bubbles/
  19. Eli Pariser. Retrieved from https://www.google.com/search?biw=1280&bih=552&tbm=isch&sxsrf=ACYBGNSgy947f1ZCZKJvYOEbzkWzGiNkXQ%3A1575274876521&sa=1&ei=fMnkXeavH4OU-gSMi5PgAw&q=filter+bubble&oq=filter+bubble&gs_l=img.3..35i39j0l5j0i67j0l3.801582.803328..803425...2.0..0.80.826.15......0....1..gws-wiz-img.......0i7i30j0i24j0i5i30j0i131.sRMNjZy7BOc&ved=0ahUKEwjm3cCbxJbmAhUDip4KHYzFBDwQ4dUDCAc&uact=5#imgrc=T-slu91Oy1F9TM:
  20. https://spreadprivacy.com/google-filter-bubble-study/
  21. Retrieved from https://spreadprivacy.com/google-filter-bubble-study/
  22. https://pdfs.semanticscholar.org/2f38/564fdfadda7a6e672fc7e3c71e6dfc72e7dc.pdf
  23. https://asla.org.au/what-is-a-teacher-librarian
  24. https://globalnews.ca/news/4113403/youtube-filter-bubble/
  25. https://www.researchgate.net/publication/318256136_Burst_of_the_Filter_Bubble_Effects_of_personalization_on_the_diversity_of_Google_News
  26. https://www.researchgate.net/publication/318256136_Burst_of_the_Filter_Bubble_Effects_of_personalization_on_the_diversity_of_Google_News
  27. Retrieved from https://en.wikipedia.org/wiki/Milton_Friedman
  28. https://en.wikipedia.org/wiki/Friedman_doctrine
  29. https://web.archive.org/web/20060207060807/https://www.colorado.edu/studentgroups/libertarians/issues/friedman-soc-resp-business.html
  30. https://web.archive.org/web/20060207060807/https://www.colorado.edu/studentgroups/libertarians/issues/friedman-soc-resp-business.html
  31. https://scholarship.law.cornell.edu/cgi/viewcontent.cgi?article=2311&context=facpub
  32. https://en.wikipedia.org/wiki/Stakeholder_theory
  33. https://www.cambridge.org/core/elements/stakeholder-theory/1D970D2659D47C2FB7BCBAA7ADB61285
  34. https://en.wikipedia.org/wiki/Stakeholder_theory
  35. Retrieved from https://www.google.com/search?biw=1280&bih=601&tbm=isch&sxsrf=ACYBGNTNxdWfyvKZ14VShhmr1P1Fpe15pQ%3A1575275681030&sa=1&ei=oczkXaq0AZDB-wSZ4I_YCg&q=twitter+logo&oq=twitter&gs_l=img.3.0.35i39j0i67l9.465390.466678..467525...3.0..0.91.710.11......0....1..gws-wiz-img.......0.03CjqFHyai8#imgrc=oaT2g4OeBLnRAM:
  36. https://mashable.com/article/twitter-follow-topics-filter-bubbles/
  37. Retrieved from https://www.google.com/search?biw=1280&bih=552&tbm=isch&sxsrf=ACYBGNTVTK_R-VFalBVVCVf4nVr8KgLbHg%3A1575276149747&sa=1&ei=dc7kXY2bLYCt0PEPuayWqAc&q=instagram+logo&oq=instagram+logo&gs_l=img.3..0i67l10.15925.17484..17640...2.0..0.93.1017.16......0....1..gws-wiz-img.......35i39j0.9AcfqfU4SEI&ved=0ahUKEwjNqdD6yJbmAhWAFjQIHTmWBXUQ4dUDCAc&uact=5#imgrc=3cYapCIbmM18tM:
  38. https://www.globalcitizen.org/en/content/instagram-takes-more-steps-bans-anti-vaccination/
  39. https://www.wired.com/story/facebook-anti-vaccine-crack-down/
  40. https://www.fastcompany.com/90310970/the-tech-giant-fighting-anti-vaxxers-isnt-twitter-or-facebook-its-pinterest
  41. Indiegogo: https://techcrunch.com/2019/04/27/as-measles-returns-indiegogo-joins-other-tech-platforms-in-banning-anti-vaccine-campaigns/
  42. https://qz.com/896000/a-complete-guide-to-seeing-beyond-your-cozy-filter-bubble/
  43. https://qz.com/896000/a-complete-guide-to-seeing-beyond-your-cozy-filter-bubble/
  44. https://qz.com/896000/a-complete-guide-to-seeing-beyond-your-cozy-filter-bubble/
  45. http://www.readacrosstheaisle.com/
  46. https://qz.com/896000/a-complete-guide-to-seeing-beyond-your-cozy-filter-bubble/
  47. Retrieved from https://www.google.com/search?biw=1280&bih=552&tbm=isch&sxsrf=ACYBGNTVTK_R-VFalBVVCVf4nVr8KgLbHg%3A1575276149747&sa=1&ei=dc7kXY2bLYCt0PEPuayWqAc&q=adblocker&oq=adblocker&gs_l=img.3..0i10j0i10i67j0i10j0l2j0i10j0l4.86122.87052..87807...2.0..0.79.682.11......0....1..gws-wiz-img.......35i39j0i67.CQxkHyM0udI&ved=0ahUKEwjNqdD6yJbmAhWAFjQIHTmWBXUQ4dUDCAc&uact=5#imgrc=02g6R-cFo1mscM:
  48. https://fs.blog/2017/07/filter-bubbles/
  49. https://www.vpnmentor.com/blog/the-best-and-worst-ad-blockers/
  50. https://www.vpnmentor.com/blog/the-best-and-worst-ad-blockers/
  51. https://wit-ie.libguides.com/c.php?g=652919&p=4581472
  52. https://wit-ie.libguides.com/c.php?g=652919&p=4581472
  53. https://support.mozilla.org/en-US/kb/clear-cookies-and-site-data-firefox?redirectlocale=en-US&redirectslug=delete-cookies-remove-info-websites-stored
  54. https://support.microsoft.com/en-us/help/17442/windows-internet-explorer-delete-manage-cookies
  55. https://support.google.com/accounts/answer/32050?co=GENIE.Platform%3DDesktop&hl=en
  56. https://support.apple.com/en-ca/guide/safari/sfri11471/mac
  57. https://www.searchenginewatch.com/2017/08/18/how-to-escape-googles-filter-bubble/
  58. https://wit-ie.libguides.com/c.php?g=652919&p=4581472