Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Personalized Algorithms.

Tuesday, January 19, 2021

Escape the echo chamber

C Thi Nguyen
aeon.co
Originally published 9 April 2018

Here is an excerpt:

Let’s start with epistemic bubbles. They have been in the limelight lately, most famously in Eli Pariser’s The Filter Bubble (2011) and Cass Sunstein’s #Republic: Divided Democracy in the Age of Social Media (2017). The general gist: we get much of our news from Facebook feeds and similar sorts of social media. Our Facebook feed consists mostly of our friends and colleagues, the majority of whom share our own political and cultural views. We visit our favourite like-minded blogs and websites. At the same time, various algorithms behind the scenes, such as those inside Google search, invisibly personalise our searches, making it more likely that we’ll see only what we want to see. These processes all impose filters on information.
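
To make the filtering mechanism concrete, here is a minimal, hypothetical sketch of a personalized ranker of the kind the excerpt describes: items are scored by overlap with topics a user has already engaged with, so like-minded content rises to the top. The function names, data, and scoring rule are invented for illustration and do not represent any real platform's algorithm.

```python
# Hypothetical sketch of feed personalization: rank items by similarity to the
# user's past engagement, which is how a feed narrows toward agreeable content.
from collections import Counter

def personalize(items, user_history, top_k=5):
    """Rank items by overlap with topics the user has engaged with before."""
    preference = Counter(topic for item in user_history for topic in item["topics"])

    def score(item):
        # Items matching existing interests score higher; unfamiliar topics score zero.
        return sum(preference[t] for t in item["topics"])

    return sorted(items, key=score, reverse=True)[:top_k]

if __name__ == "__main__":
    history = [{"topics": ["opera", "classical-music"]}] * 3
    feed = [
        {"id": 1, "topics": ["opera"]},
        {"id": 2, "topics": ["local-politics"]},
        {"id": 3, "topics": ["classical-music", "opera"]},
    ]
    print(personalize(feed, history, top_k=2))  # opera items outrank the politics item
```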

Such filters aren’t necessarily bad. The world is overstuffed with information, and one can’t sort through it all by oneself: filters need to be outsourced. That’s why we all depend on extended social networks to deliver us knowledge. But any such informational network needs the right sort of broadness and variety to work. A social network composed entirely of incredibly smart, obsessive opera fans would deliver all the information I could want about the opera scene, but it would fail to clue me in to the fact that, say, my country had been infested by a rising tide of neo-Nazis. Each individual person in my network might be superbly reliable about her particular informational patch but, as an aggregate structure, my network lacks what Sanford Goldberg in his book Relying on Others (2010) calls ‘coverage-reliability’. It doesn’t deliver to me a sufficiently broad and representative coverage of all the relevant information.
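
As a rough illustration of the coverage-reliability idea (my sketch, not Goldberg's formalism): a network can be reliable on every patch it actually covers and still leave whole relevant topics unreported. The topics and sources below are invented.

```python
# Toy measure of 'coverage-reliability': what fraction of the relevant topics
# does my network of sources cover at all?
def coverage(network_topics, relevant_topics):
    """Return (fraction of relevant topics covered, topics nobody in the network covers)."""
    covered = relevant_topics & set().union(*network_topics.values())
    return len(covered) / len(relevant_topics), relevant_topics - covered

relevant = {"opera", "economy", "extremist-movements", "public-health"}
opera_fans = {"alice": {"opera"}, "bob": {"opera", "classical-music"}}

ratio, missing = coverage(opera_fans, relevant)
print(ratio)    # 0.25 -- each source may be superbly reliable, yet aggregate coverage is poor
print(missing)  # topics the network never delivers, e.g. 'extremist-movements'
```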

Epistemic bubbles also threaten us with a second danger: excessive self-confidence. In a bubble, we will encounter exaggerated amounts of agreement and suppressed levels of disagreement. We’re vulnerable because, in general, we actually have very good reason to pay attention to whether other people agree or disagree with us. Looking to others for corroboration is a basic method for checking whether one has reasoned well or badly. This is why we might do our homework in study groups, and have different laboratories repeat experiments. But not all forms of corroboration are meaningful. Ludwig Wittgenstein says: imagine looking through a stack of identical newspapers and treating each next newspaper headline as yet another reason to increase your confidence. This is obviously a mistake. The fact that The New York Times reports something is a reason to believe it, but any extra copies of The New York Times that you encounter shouldn’t add any extra evidence.
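
The Wittgenstein point can be put in toy quantitative terms: confidence should go up at most once per independent source, so duplicate copies of the same report add nothing. The likelihood-ratio update below is an illustrative assumption of mine, not anything from the article.

```python
# Sketch: corroboration counts only once per distinct source, so a stack of
# identical newspapers is no stronger than a single copy.
def corroborated_confidence(reports, prior=0.5, likelihood_ratio=3.0):
    """Update confidence once per distinct source, ignoring duplicate copies."""
    odds = prior / (1 - prior)
    for _source in set(r["source"] for r in reports):  # deduplicate sources
        odds *= likelihood_ratio
    return odds / (1 + odds)

stack_of_copies = [{"source": "NYT"}] * 10                      # ten copies of one paper
independent = [{"source": s} for s in ("NYT", "Reuters", "AP")]  # three distinct outlets

print(corroborated_confidence(stack_of_copies))  # 0.75 -- same as reading the NYT once
print(corroborated_confidence(independent))      # ~0.96 -- genuinely stronger support
```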

Saturday, February 8, 2020

Bursting the Filter Bubble: Democracy, Design, and Ethics

V. E. Bozdag
Book/Thesis
Originally published in 2015

Online web services such as Google and Facebook have started using personalization algorithms. Because information is customized per user by the algorithms of these services, two users who issue the same search query or have the same friend list may get different results. Online services argue that personalization algorithms let them show the most relevant information to each user, thereby increasing user satisfaction. Critics argue, however, that the opaque filters used by online services will show users only agreeable political viewpoints, so that users are never challenged by opposing perspectives. Since users are already biased toward seeking like-minded perspectives, viewpoint diversity will diminish and users may become trapped in a “filter bubble”. This is an undesirable outcome under almost all models of democracy. In this thesis we first analyzed the filter bubble phenomenon conceptually, by identifying internal processes and factors in online web services that might cause filter bubbles. We then analyzed the issue empirically. We first studied existing viewpoint-diversity metrics in the computer science literature and extended them with a new metric, minority access, drawn from media and communication studies. In an empirical study of Dutch and Turkish Twitter users, we showed that minorities cannot reach a large percentage of users in the Turkish Twittersphere. We also analyzed software tools and design attempts to combat filter bubbles, and showed that almost all of these tools implement norms required by two popular democracy models. We argue that democracy is an essentially contested concept, and that other, less popular democracy models should also be reflected in the design of such tools.
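
To give a feel for the kinds of measures the abstract refers to, here is a hedged sketch of a viewpoint-diversity score (normalized Shannon entropy over a user's exposure to viewpoints) and a minority-access score (the share of an audience that minority-viewpoint content reaches). These are standard illustrative formulas, not necessarily the exact metrics used in the thesis.

```python
# Illustrative viewpoint-diversity and minority-access metrics (assumed forms).
import math

def viewpoint_diversity(exposure_counts):
    """Normalized Shannon entropy of a user's exposure to each viewpoint (0 = bubble, 1 = balanced)."""
    total = sum(exposure_counts.values())
    probs = [c / total for c in exposure_counts.values() if c > 0]
    entropy = -sum(p * math.log2(p) for p in probs)
    return entropy / math.log2(len(exposure_counts)) if len(exposure_counts) > 1 else 0.0

def minority_access(reached_users, all_users):
    """Fraction of the audience that minority-viewpoint content actually reaches."""
    return len(reached_users & all_users) / len(all_users)

print(viewpoint_diversity({"pro": 48, "contra": 2}))   # ~0.24 -- low diversity, near a bubble
print(viewpoint_diversity({"pro": 25, "contra": 25}))  # 1.0 -- balanced exposure
print(minority_access({"u1", "u2"}, {f"u{i}" for i in range(1, 101)}))  # 0.02 -- poor reach
```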

The book/thesis can be downloaded here.