Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Filter Bubbles.

Tuesday, January 19, 2021

Escape the echo chamber

C Thi Nguyen
aeon.co
Originally published 9 April 2018

Here is an excerpt:

Let’s start with epistemic bubbles. They have been in the limelight lately, most famously in Eli Pariser’s The Filter Bubble (2011) and Cass Sunstein’s #Republic: Divided Democracy in the Age of Social Media (2017). The general gist: we get much of our news from Facebook feeds and similar sorts of social media. Our Facebook feed consists mostly of our friends and colleagues, the majority of whom share our own political and cultural views. We visit our favourite like-minded blogs and websites. At the same time, various algorithms behind the scenes, such as those inside Google search, invisibly personalise our searches, making it more likely that we’ll see only what we want to see. These processes all impose filters on information.
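
To make the excerpt's point about algorithmic filters concrete, here is a minimal Python sketch of a toy personalized re-ranker. The profile-building step, the scoring rule, and the example items are all hypothetical stand-ins; no real platform's ranking system is this simple.

```python
from collections import Counter

def build_profile(click_history):
    """Toy user profile: how often the user has engaged with each topic."""
    return Counter(item["topic"] for item in click_history)

def personalize(candidates, profile, top_k=2):
    """Rank candidates by affinity to the user's past topics and keep top_k.

    Topics the user has never engaged with score 0 and quietly fall off
    the end of the feed -- the 'filter' in 'filter bubble'.
    """
    ranked = sorted(candidates,
                    key=lambda item: profile.get(item["topic"], 0),
                    reverse=True)
    return ranked[:top_k]

# Illustrative data only.
history = [{"topic": "opera"}, {"topic": "opera"}, {"topic": "local politics"}]
candidates = [
    {"title": "New opera season announced", "topic": "opera"},
    {"title": "Council budget vote tonight", "topic": "local politics"},
    {"title": "Investigation into rising extremism", "topic": "national politics"},
]

for item in personalize(candidates, build_profile(history)):
    print(item["title"])
# The national-politics story never surfaces, however important it may be.
```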

Such filters aren’t necessarily bad. The world is overstuffed with information, and one can’t sort through it all by oneself: filters need to be outsourced. That’s why we all depend on extended social networks to deliver us knowledge. But any such informational network needs the right sort of broadness and variety to work. A social network composed entirely of incredibly smart, obsessive opera fans would deliver all the information I could want about the opera scene, but it would fail to clue me in to the fact that, say, my country had been infested by a rising tide of neo-Nazis. Each individual person in my network might be superbly reliable about her particular informational patch but, as an aggregate structure, my network lacks what Sanford Goldberg in his book Relying on Others (2010) calls ‘coverage-reliability’. It doesn’t deliver to me a sufficiently broad and representative coverage of all the relevant information.

Epistemic bubbles also threaten us with a second danger: excessive self-confidence. In a bubble, we will encounter exaggerated amounts of agreement and suppressed levels of disagreement. We’re vulnerable because, in general, we actually have very good reason to pay attention to whether other people agree or disagree with us. Looking to others for corroboration is a basic method for checking whether one has reasoned well or badly. This is why we might do our homework in study groups, and have different laboratories repeat experiments. But not all forms of corroboration are meaningful. Ludwig Wittgenstein says: imagine looking through a stack of identical newspapers and treating each next newspaper headline as yet another reason to increase your confidence. This is obviously a mistake. The fact that The New York Times reports something is a reason to believe it, but any extra copies of The New York Times that you encounter shouldn’t add any extra evidence.
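
Wittgenstein's newspaper example can be restated in Bayesian terms: each conditionally independent report licenses a further update in confidence, while extra copies of the same report should not move you at all. A small sketch, with made-up likelihoods used purely for illustration:

```python
def update(prior, p_report_if_true=0.9, p_report_if_false=0.3):
    """One Bayesian update on seeing a report that the claim is true.

    The two likelihoods are illustrative assumptions, not estimates of
    any real outlet's reliability.
    """
    numerator = p_report_if_true * prior
    return numerator / (numerator + p_report_if_false * (1 - prior))

prior = 0.5

# Three independent outlets reporting the same claim: each report counts.
p = prior
for _ in range(3):
    p = update(p)
print(f"after three independent reports: {p:.2f}")  # ~0.96

# Three identical copies of one newspaper: only the first carries information,
# so a careful reader updates once and stops.
print(f"after three identical copies:    {update(prior):.2f}")  # 0.75
```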

Saturday, February 8, 2020

Bursting the Filter Bubble: Democracy, Design, and Ethics

V. E. Bozdag
Book/Thesis
Originally published in 2015

Online web services such as Google and Facebook have started using personalization algorithms. Because these services' algorithms customize information for each user, two users who issue the same search query or have the same friend list may get different results. Online services argue that personalization lets them show the most relevant information to each user, thereby increasing user satisfaction. Critics argue, however, that the opaque filters used by online services show users only agreeable political viewpoints, so users are never challenged by opposing perspectives. Given that users are already biased toward seeking like-minded perspectives, viewpoint diversity will diminish and users may become trapped in a “filter bubble”. This is undesirable under almost all models of democracy. In this thesis we first analyzed the filter bubble phenomenon conceptually, identifying internal processes and factors in online web services that might cause filter bubbles. We then analyzed the issue empirically. We first studied existing metrics in the viewpoint-diversity research of the computer science literature and extended them with a new metric, minority access, drawn from media and communication studies. In an empirical study of Dutch and Turkish Twitter users, we showed that minorities cannot reach a large percentage of users in the Turkish Twittersphere. We also analyzed software tools and design attempts that aim to combat filter bubbles, and showed that almost all of these tools implement norms required by two popular democracy models. We argue that democracy is an essentially contested concept and that other, less popular democracy models should be included in the design of such tools as well.
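
As a rough illustration of what viewpoint-diversity and minority-access measurements could look like in code, here is a simplified sketch. It is a stand-in for the general idea, not the metrics actually defined in the thesis, and the data are invented.

```python
import math
from collections import Counter

def viewpoint_entropy(feed):
    """Shannon entropy (bits) of viewpoints in a feed; 0 means a single viewpoint."""
    counts = Counter(post["viewpoint"] for post in feed)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def minority_reach(exposures, users, minority_viewpoint):
    """Share of users whose feeds exposed them to the minority viewpoint at least once."""
    reached = sum(1 for u in users if minority_viewpoint in exposures.get(u, set()))
    return reached / len(users)

# Invented example data.
feed = [{"viewpoint": "majority"}] * 9 + [{"viewpoint": "minority"}]
exposures = {
    "alice": {"majority"},
    "bob": {"majority", "minority"},
    "cem": {"majority"},
}

print(f"feed entropy:   {viewpoint_entropy(feed):.2f} bits")   # ~0.47
print(f"minority reach: {minority_reach(exposures, list(exposures), 'minority'):.0%}")  # 33%
```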

The book/thesis can be downloaded here.

Tuesday, June 25, 2019

Truth by Repetition: Explanations and Implications

Unkelbach, C., Koch, A., Silva, R. R., & Garcia-Marques, T. (2019).
Current Directions in Psychological Science, 28(3), 247–253. https://doi.org/10.1177/0963721419827854

Abstract

People believe repeated information more than novel information; they show a repetition-induced truth effect. In a world of “alternative facts,” “fake news,” and strategic information management, understanding this effect is highly important. We first review explanations of the effect based on frequency, recognition, familiarity, and coherent references, and, building on the last of these, discuss how the explanations relate to one another. We then discuss implications of truth by repetition for the maintenance of false beliefs and for ways to change potentially harmful false beliefs (e.g., “Vaccination causes autism”), illustrating that the truth-by-repetition phenomenon not only is of theoretical interest but also has immediate practical relevance.

Here is a portion of the closing section:

No matter which mental processes may underlie the repetition-induced truth effect, on a functional level, repetition increases subjective truth. The effect’s robustness may be worrisome if one considers that information nowadays is not randomly but strategically repeated. For example, the phenomenon of the “filter bubble” (Pariser, 2011) suggests that people get verbatim and paraphrased repetition only of what they already know and believe. As discussed, logically, this should not strengthen information’s subjective truth. However, as discussed above, repetition does influence subjective truth psychologically. In combination with phenomena such as selective exposure (e.g., Frey, 1986), confirmation biases (e.g., Nickerson, 1998), or failures to consider the opposite (e.g., Schul, Mayo, & Burnstein, 2004), it becomes apparent how even blatantly false information may come “to fix itself in the mind in such a way that it is accepted in the end as a demonstrated truth” (Le Bon, 1895/1996). For example, within the frame of a referential theory, filter bubbles repeat information and thereby add supporting coherent references for existing belief networks, which makes them difficult to change once they are established. Simultaneously, people should also process such information more fluently. In the studies reviewed here, statement content was mostly trivia. Yet, even for this trivia, participants evaluated contradictory information as being less true compared with novel information, even when they were explicitly told that it was 100% false (Unkelbach & Greifeneder, 2018). If one considers how many corresponding references the information that “vaccination leads to autism” may instigate for parents who must decide whether to vaccinate or not, the relevance of the truth-by-repetition phenomenon becomes apparent.
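
As a caricature of the referential account sketched in this passage, the toy model below treats each repetition inside a bubble as one more supporting reference for a claim and judges truth by the balance of supporting versus conflicting exposures. It illustrates the general idea only; it is not the authors' model, and the numbers are arbitrary.

```python
from collections import defaultdict

class BeliefNetwork:
    """Toy referential account: judged truth tracks the balance of references."""

    def __init__(self):
        self.support = defaultdict(int)   # exposures supporting a claim
        self.conflict = defaultdict(int)  # exposures contradicting it

    def expose(self, claim, supports=True):
        if supports:
            self.support[claim] += 1
        else:
            self.conflict[claim] += 1

    def judged_truth(self, claim):
        """Subjective truth in [0, 1]; 0.5 means no references either way."""
        s, c = self.support[claim], self.conflict[claim]
        return 0.5 if s + c == 0 else s / (s + c)

net = BeliefNetwork()
claim = "vaccination causes autism"  # the false claim used as the paper's example

# Inside a filter bubble the claim is repeated often; corrections are rarely seen.
for _ in range(8):
    net.expose(claim, supports=True)
net.expose(claim, supports=False)

print(f"judged truth after biased exposure: {net.judged_truth(claim):.2f}")  # ~0.89
```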

Wednesday, May 29, 2019

The Problem with Facebook


Making Sense Podcast

Originally posted on March 27, 2019

In this episode of the Making Sense podcast, Sam Harris speaks with Roger McNamee about his book Zucked: Waking Up to the Facebook Catastrophe.

Roger McNamee has been a Silicon Valley investor for thirty-five years. He has cofounded successful venture funds, including Elevation with U2’s Bono. He is a former mentor to Facebook CEO Mark Zuckerberg and helped recruit COO Sheryl Sandberg to the company. He holds a B.A. from Yale University and an M.B.A. from the Tuck School of Business at Dartmouth College.

The podcast is here.

The discussion of the fundamental ethical problems with social media companies like Facebook and Google begins about 20 minutes into the podcast.

Tuesday, May 28, 2019

Values in the Filter Bubble: Ethics of Personalization Algorithms in Cloud Computing

Engin Bozdag and Job Timmermans
Delft University of Technology
Faculty of Technology, Policy and Management

Abstract

Cloud services such as Facebook and Google search have started to use personalization algorithms in order to deal with the growing amount of data online. This is often done to reduce “information overload”. The user’s interactions with the system are recorded under a single identity, and information is personalized for the user using this identity. However, as we argue, such filters often ignore the context of information and are never value neutral. These algorithms operate without the user’s knowledge or control, leading to a “filter bubble”. In this paper we use the Value Sensitive Design methodology to identify the values and value assumptions implicated in personalization algorithms. Building on existing philosophical work, we discuss three human values implicated in personalized filtering: autonomy, identity, and transparency.
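
One way to picture the transparency and autonomy concerns is a re-ranker whose personalization strength is an explicit, user-visible parameter rather than a hidden internal constant. The sketch below is hypothetical; the paper itself is a conceptual value analysis, not an implementation.

```python
def rerank(candidates, profile, personalization_weight=0.5):
    """Blend personal relevance with a bonus for viewpoints the user has not seen.

    `personalization_weight` is the kind of setting the paper suggests users
    should be able to see and adjust; here it is a documented parameter.
    """
    seen = set(profile)

    def score(item):
        relevance = profile.get(item["viewpoint"], 0)
        novelty = 0 if item["viewpoint"] in seen else 1
        return personalization_weight * relevance + (1 - personalization_weight) * novelty

    return sorted(candidates, key=score, reverse=True)

# Invented data: engagement counts per viewpoint and three candidate items.
profile = {"pro": 5, "neutral": 1}
candidates = [
    {"title": "Familiar take", "viewpoint": "pro"},
    {"title": "Neutral explainer", "viewpoint": "neutral"},
    {"title": "Opposing argument", "viewpoint": "anti"},
]

# With heavy personalization the opposing item sinks; with the dial turned down it rises.
for w in (0.9, 0.1):
    order = [item["title"] for item in rerank(candidates, profile, personalization_weight=w)]
    print(w, order)
```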

A copy of the paper is here.