V. E. Bozdag
Originally published in 2015
Online web services such as Google and Facebook have started using personalization algorithms. Because these services customize information per user, two users who issue the same search query or have the same friend list may receive different results. Online services argue that personalization lets them show each user the most relevant information, thereby increasing user satisfaction. Critics, however, argue that the opaque filters used by these services show users only agreeable political viewpoints, so that users are never challenged by opposing perspectives. Since users are already biased toward seeking like-minded perspectives, viewpoint diversity will diminish and users may become trapped in a “filter bubble”. This is undesirable behavior under almost all models of democracy. In this thesis we first analyzed the filter bubble phenomenon conceptually, identifying the internal processes and factors in online web services that might cause filter bubbles. We then analyzed the issue empirically. We first studied existing viewpoint diversity metrics in the computer science literature, and extended them with a new metric, minority access, drawn from media and communication studies. In an empirical study of Dutch and Turkish Twitter users, we showed that minorities cannot reach a large percentage of users in the Turkish Twittersphere. We also analyzed software tools and design attempts to combat filter bubbles, and showed that almost all of these tools implement norms required by two popular democracy models. We argue that democracy is an essentially contested concept, and that other, less popular democracy models should also be included in the design of such tools.
The book/thesis can be downloaded here.