Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Welcome to the nexus of ethics, psychology, morality, technology, health care, and philosophy
Showing posts with label Attention.

Monday, January 8, 2024

Human-Algorithm Interactions Help Explain the Spread of Misinformation

McLoughlin, K. L., & Brady, W. J. (2023).
Current Opinion in Psychology, 101770.

Abstract

Human attention biases toward moral and emotional information are as prevalent online as they are offline. When these biases interact with content algorithms that curate social media users’ news feeds to maximize attentional capture, moral and emotional information are privileged in the online information ecosystem. We review evidence for these human-algorithm interactions and argue that misinformation exploits this process to spread online. This framework suggests that interventions aimed at combating misinformation require a dual-pronged approach that combines person-centered and design-centered interventions to be most effective. We suggest several avenues for research in the psychological study of misinformation sharing under a framework of human-algorithm interaction.

Here is my summary:

This research highlights the crucial role of human-algorithm interactions in driving the spread of misinformation online. It argues that both human attentional biases and algorithmic amplification mechanisms contribute to this phenomenon.

Firstly, humans naturally gravitate towards information that evokes moral and emotional responses. This inherent bias makes us more susceptible to engaging with and sharing misinformation that leverages these emotions, such as outrage, fear, or anger.

Secondly, social media algorithms are designed to maximize user engagement, which often translates to prioritizing content that triggers strong emotions. This creates a feedback loop where emotionally charged misinformation is amplified, further attracting human attention and fueling its spread.
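
The feedback loop described here can be made concrete with a small toy simulation. This is purely illustrative and not a model from the paper; the engagement probabilities, exploration rate, and feed size below are invented for the sketch.

```python
import random

# Toy model (illustrative only; not from McLoughlin & Brady, 2023): posts carry
# an "emotional charge" score, people engage more with charged posts, and a
# ranking rule boosts whatever earned engagement before.

random.seed(0)
posts = [{"emotional_charge": random.random(), "engagement": 0} for _ in range(200)]

def engages(post):
    # Human side of the loop (assumed): emotionally charged content is more
    # likely to be clicked or shared.
    return random.random() < 0.1 + 0.6 * post["emotional_charge"]

for _ in range(2000):  # simulate many feed impressions
    # Algorithmic side of the loop: mostly show what already got engagement,
    # with a little random exploration so every post gets some exposure.
    if random.random() < 0.2:
        post = random.choice(posts)
    else:
        post = max(random.sample(posts, 20), key=lambda p: p["engagement"])
    if engages(post):
        post["engagement"] += 1

top_feed = sorted(posts, key=lambda p: p["engagement"], reverse=True)[:10]
print("mean charge, all posts  :", round(sum(p["emotional_charge"] for p in posts) / len(posts), 2))
print("mean charge, top of feed:", round(sum(p["emotional_charge"] for p in top_feed) / 10, 2))
```

Even this crude loop tends to leave the most emotionally charged posts at the top of the ranking, which is the amplification dynamic the summary describes; misinformation crafted to trigger outrage or fear rides the same mechanism.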

The research concludes that effectively combating misinformation requires a multifaceted approach. It emphasizes the need for interventions that address both human psychology and algorithmic design. This includes promoting media literacy, encouraging critical thinking skills, and designing algorithms that prioritize factual accuracy and diverse perspectives over emotional engagement.

Saturday, May 27, 2023

Costly Distractions: Focusing on Individual Behavior Undermines Support for Systemic Reforms

Hagmann, D., Liao, Y., Chater, N., & Loewenstein, G. (2023, April 22).

Abstract

Policy challenges can typically be addressed both through systemic changes (e.g., taxes and mandates) and by encouraging individual behavior change. In this paper, we propose that, while in principle complementary, systemic and individual perspectives can compete for the limited attention of people and policymakers. Thus, directing policies in one of these two ways can distract the public’s attention from the other—an “attentional opportunity cost.” In two pre-registered experiments (n = 1,800) covering three high-stakes domains (climate change, retirement savings, and public health), we show that when people learn about policies targeting individual behavior (such as awareness campaigns), they are more likely to themselves propose policies that target individual behavior, and to hold individuals rather than organizational actors responsible for solving the problem, than are people who learned about systemic policies (such as taxes and mandates, Study 1). This shift in attribution of responsibility has behavioral consequences: people exposed to individual interventions are more likely to donate to an organization that educates individuals rather than one seeking to effect systemic reforms (Study 2). Policies targeting individual behavior may, therefore, have the unintended consequence of redirecting attention and attributions of responsibility away from systemic change to individual behavior.

Discussion

Major policy problems likely require a realignment of systemic incentives and regulations, as well as measures aimed at individual behavior change. In practice, systemic reforms have been difficult to implement, in part due to political polarization and in part because concentrated interest groups have lobbied against changes that threaten their profits. This has shifted the focus to individual behavior. The past two decades, in particular, have seen increasing popularity of ‘nudges’: interventions that can influence individual behavior without substantially changing economic incentives (Thaler & Sunstein, 2008). For example, people may be defaulted into green energy plans (Sunstein & Reisch, 2013) or 401(k) contributions (Madrian & Shea, 2001), and restaurants may vary whether they place calorie labels on the left or the right side of the menu (Dallas, Liu, & Ubel, 2019). These interventions have enjoyed tremendous popularity, because they can often be implemented even when opposition to systemic reforms is too large to change economic incentives. Moreover, it has been argued that nudges incur low economic costs, making them extremely cost effective even when the gains are small on an absolute scale (Tor & Klick, 2022).

In this paper, we document an important and so far unacknowledged cost of such interventions targeting individual behavior, first postulated by Chater and Loewenstein (2022). We show that when people learn about interventions that target individual behavior, they shift their attention away from systemic reforms compared to those who learn about systemic reforms. Across two experiments, we find that this subsequently affects their attitudes and behaviors. Specifically, they become less likely to propose systemic policy reforms, hold governments less responsible for solving the policy problem, and are less likely to support organizations that seek to promote systemic reform.

The findings of this study may not be news to corporate PR specialists. Indeed, as would be expected according to standard political economy considerations (e.g., Stigler, 1971), organizations act in a way that is consistent with a belief in this attentional opportunity cost account. Initiatives that have captured the public’s attention, including recycling campaigns and carbon footprint calculators, have been devised by the very organizations that stood to lose from further regulation that might have hurt their bottom line (e.g., bottle bills and carbon taxes, respectively), potentially distracting individual citizens, policymakers, and the wider public debate from systemic changes that are likely to be required to shift substantially away from the status quo.

Sunday, March 19, 2023

The role of attention in decision-making under risk in gambling disorder: an eye-tracking study

Hoven, M., Hirmas, A., Engelmann, J. B., & van Holst, R. (2022, June 30).
https://doi.org/10.31234/osf.io/fxd3m

Abstract

Gambling disorder (GD) is a behavioural addiction characterized by impairments in decision-making, favouring risk- and reward-prone choices. One explanatory factor for this behaviour is a deviation in attentional processes, as increasing evidence indicates that GD patients show an attentional bias toward gambling stimuli. However, previous attentional studies have not directly investigated attention during risky decision-making. 25 patients with GD and 27 healthy matched controls (HC) completed a mixed gambles task combined with eye-tracking to investigate attentional biases for potential gains versus losses during decision-making under risk. Results indicate that compared to HC, GD patients gambled more and were less loss averse. GD patients did not show a direct attentional bias towards gains (or relative to losses). Using a recent (neuro)economics model that considers average attention and trial-wise deviations in average attention, we conducted fine-grained exploratory analyses of the attentional data. Results indicate that the average attention in GD patients moderated the effect of gain value on gambling choices, whereas this was not the case for HC. GD patients with high average attention for gains started gambling at less high gain values. A similar trend-level effect was found for losses, where GD patients with high average attention for losses stopped gambling with lower loss values. This study gives more insight into how attentional processes in GD play a role in gambling behaviour, which could have implications for the development of future treatments focusing on attentional training or for the development of interventions that increase the salience of losses.

From the Discussion section

We extend the current literature by investigating the role of attention in risky decision-making using eye-tracking, which has been underexplored in GD thus far. Consistent with previous studies in HCs, subjects’ overall relative attention toward gains decreased in favor of attention toward losses when loss values increased. We did not find group differences in attention to either gains or losses, suggesting no direct attentional biases in GD. However, while HCs increased their attention to gains with higher gain values, patients with GD did not. Moreover, while patients with GD displayed lower loss aversion, they did not show less attention to losses; rather, in both groups, increased trial-by-trial attention to losses resulted in less gambling.

The question arises whether attention modulates the effect of gains and losses on choice behavior differently in GD relative to controls. Our exploratory analyses that differentiated between two different channels of attention indeed indicated that the effect of gain value on gambling choices was modulated by the amount of average attention on gains in GD only. In other words, patients with GD who focused more on gains exhibited a greater gambling propensity at relatively low gain values. Notably, the strength of the effect of gain value on choice only significantly differed at average and high levels of attention to gains between groups, while patients with GD and HCs with relatively low levels of average attention to gains did not differ. Moreover, patients with GD who had relatively more average attention to losses showed a reduction in gambling propensity at relatively lower loss values, but note that this was at trend level. Since average attention relates to goal-directed or top-down attention, this measure likely reflects one’s preferences and beliefs. Hence, the current results suggest that gambling choices in patients with GD, relative to HCs, are more influenced by their preferences for gains. Future studies are needed to verify if and how top-down attentional processes affect decision-making in GD.
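
To make the idea of attention-moderated valuation concrete, here is a minimal sketch of the kind of specification the discussion points to. It is a simplification offered here for illustration, not the (neuro)economics model the authors actually fit; all coefficients and variable names are hypothetical.

```python
import math

def p_gamble(gain, loss, attn_gain, attn_loss,
             b0=-1.0, b_gain=0.08, b_loss=-0.12,
             w_gain=0.05, w_loss=-0.04):
    """Probability of accepting a 50/50 mixed gamble (illustrative sketch).

    Average attention to gains (attn_gain) scales up the weight placed on the
    gain amount, and average attention to losses (attn_loss) scales up the
    weight placed on the loss amount, mirroring the moderation effects
    described above. All coefficients are made up.
    """
    utility = (b0
               + (b_gain + w_gain * attn_gain) * gain
               + (b_loss + w_loss * attn_loss) * loss)
    return 1.0 / (1.0 + math.exp(-utility))

# A gain-focused chooser is noticeably more likely to accept the same modest
# gamble than a loss-focused chooser.
print(round(p_gamble(gain=20, loss=10, attn_gain=0.8, attn_loss=0.2), 2))
print(round(p_gamble(gain=20, loss=10, attn_gain=0.2, attn_loss=0.8), 2))
```

In a specification like this, loss aversion corresponds to the ratio of the loss and gain weights, and shifting average attention toward gains or losses shifts that ratio, which is the moderation pattern reported above.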


Editor's note: Apparently, patients with GD who focus primarily on gains continue to gamble, whereas both GD patients and HCs who focus on losses are more likely to stop. Therefore, psychologists treating people with impulse control difficulties may want to help patients focus on potential losses/harm, as opposed to imagined gains.

Sunday, February 26, 2023

Time pressure reduces misinformation discrimination ability but does not alter response bias

Sultan, M., Tump, A.N., Geers, M. et al. 
Sci Rep 12, 22416 (2022).
https://doi.org/10.1038/s41598-022-26209-8

Abstract

Many parts of our social lives are speeding up, a process known as social acceleration. How social acceleration impacts people’s ability to judge the veracity of online news, and ultimately the spread of misinformation, is largely unknown. We examined the effects of accelerated online dynamics, operationalised as time pressure, on online misinformation evaluation. Participants judged the veracity of true and false news headlines with or without time pressure. We used signal detection theory to disentangle the effects of time pressure on discrimination ability and response bias, as well as on four key determinants of misinformation susceptibility: analytical thinking, ideological congruency, motivated reflection, and familiarity. Time pressure reduced participants’ ability to accurately distinguish true from false news (discrimination ability) but did not alter their tendency to classify an item as true or false (response bias). Key drivers of misinformation susceptibility, such as ideological congruency and familiarity, remained influential under time pressure. Our results highlight the dangers of social acceleration online: People are less able to accurately judge the veracity of news online, while prominent drivers of misinformation susceptibility remain present. Interventions aimed at increasing deliberation may thus be fruitful avenues to combat online misinformation.

Discussion

In this study, we investigated the impact of time pressure on people’s ability to judge the veracity of online misinformation in terms of (a) discrimination ability, (b) response bias, and (c) four key determinants of misinformation susceptibility (i.e., analytical thinking, ideological congruency, motivated reflection, and familiarity). We found that time pressure reduced discrimination ability but did not alter the—already present—negative response bias (i.e., general tendency to evaluate news as false). Moreover, the associations observed for the four determinants of misinformation susceptibility were largely stable across treatments, with the exception that the positive effect of familiarity on response bias (i.e., response tendency to treat familiar news as true) was slightly reduced under time pressure. We discuss each of these findings in more detail next.

As predicted, we found that time pressure reduced discrimination ability: Participants under time pressure were less able to distinguish between true and false news. These results corroborate earlier work on the speed–accuracy trade-off, and indicate that fast-paced news consumption on social media is likely leading to people misjudging the veracity of not only false news, as seen in the study by Bago and colleagues, but also true news. Like in their paper, we stress that interventions aimed at mitigating misinformation should target this phenomenon and seek to improve veracity judgements by encouraging deliberation. It will also be important to follow up on these findings by examining whether time pressure has a similar effect in the context of news items that have been subject to interventions such as debunking.

Our results for the response bias showed that participants had a general tendency to evaluate news headlines as false (i.e., a negative response bias); this effect was similarly strong across the two treatments. From the perspective of the individual decision maker, this response bias could reflect a preference to avoid one type of error over another (i.e., avoiding accepting false news as true more than rejecting true news as false) and/or an overall expectation that false news are more prevalent than true news in our experiment. Note that the ratio of true versus false news we used (1:1) is different from the real world, which typically is thought to contain a much smaller fraction of false news. A more ecologically valid experiment with a more representative sample could yield a different response bias. It will, thus, be important for future studies to assess whether participants hold such a bias in the real world, are conscious of this response tendency, and whether it translates into (in)accurate beliefs about the news itself.
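
For readers less familiar with the signal detection quantities used in this post, here is a minimal sketch of how discrimination ability (d′) and the response criterion are commonly computed from hit and false-alarm rates. The rates below are made up for illustration and are not from Sultan et al.

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # probit (inverse standard normal CDF)

def sdt_measures(hit_rate, false_alarm_rate):
    """Common equal-variance signal detection measures.

    Treating "true" as the signal response: a hit is calling a true headline
    true, a false alarm is calling a false headline true. d' indexes
    discrimination ability; c is the response criterion, where c > 0 marks a
    conservative tendency to answer "false" (the negative response bias
    described above, under this sign convention).
    """
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, criterion

# Hypothetical rates: without vs. with time pressure.
print(sdt_measures(0.70, 0.20))  # higher d', modest conservative bias
print(sdt_measures(0.60, 0.30))  # lower d', similar bias
```

The pattern in these toy numbers mirrors the result reported in the paper: under time pressure, discrimination drops while the criterion stays roughly where it was.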

Thursday, February 17, 2022

Filling the gaps: Cognitive control as a critical lens for understanding mechanisms of value-based decision-making.

Frömer, R., & Shenhav, A. (2021, May 17). 
https://doi.org/10.31234/osf.io/dnvrj

Abstract

While often seeming to investigate rather different problems, research into value-based decision making and cognitive control have historically offered parallel insights into how people select thoughts and actions. While the former studies how people weigh costs and benefits to make a decision, the latter studies how they adjust information processing to achieve their goals. Recent work has highlighted ways in which decision-making research can inform our understanding of cognitive control. Here, we provide the complementary perspective: how cognitive control research has informed understanding of decision-making. We highlight three particular areas of research where this critical interchange has occurred: (1) how different types of goals shape the evaluation of choice options, (2) how people use control to adjust how they make their decisions, and (3) how people monitor decisions to inform adjustments to control at multiple levels and timescales. We show how adopting this alternate viewpoint offers new insight into the determinants of both decisions and control; provides alternative interpretations for common neuroeconomic findings; and generates fruitful directions for future research.

Highlights

•  We review how taking a cognitive control perspective provides novel insights into the mechanisms of value-based choice.

•  We highlight three areas of research where this critical interchange has occurred:

      (1) how different types of goals shape the evaluation of choice options,

      (2) how people use control to adjust how they make their decisions, and

      (3) how people monitor decisions to inform adjustments to control at multiple levels and timescales.

From Exerting Control Beyond Our Current Choice

We have so far discussed choices the way they are typically studied: in isolation. However, we don’t make choices in a vacuum, and our current choices depend on previous choices we have made (Erev & Roth, 2014; Keung, Hagen, & Wilson, 2019; Talluri et al., 2020; Urai, Braun, & Donner, 2017; Urai, de Gee, Tsetsos, & Donner, 2019). One natural way in which choices influence each other is through learning about the options, where the evaluations of the outcome of one choice refine the expected value (incorporating range and probability) assigned to that option in future choices (Fontanesi, Gluth, et al., 2019; Fontanesi, Palminteri, et al., 2019; Miletic et al., 2021). Here we focus on a different, complementary way, central to cognitive control research, where evaluations of the process of ongoing and past choices inform the process of future choices (Botvinick et al., 1999; Bugg, Jacoby, & Chanani, 2011; Verguts, Vassena, & Silvetti, 2015). In cognitive control research, these choice evaluations and their influence on subsequent adaptation are studied under the umbrella of performance monitoring (Carter et al., 1998; Ullsperger, Fischer, Nigbur, & Endrass, 2014). Unlike option-based learning, performance monitoring influences not only which options are chosen, but also how subsequent choices are made. It also informs higher-order decisions about strategy and task selection (Fig. 5A).
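
The contrast between option-based learning and performance monitoring can be sketched in a few lines of code. This is offered here as an illustration of the distinction drawn above, not anything from Frömer and Shenhav; the learning rates and the "caution" variable are hypothetical.

```python
def update_option_value(expected_value, outcome, learning_rate=0.1):
    """Option-based learning: the outcome of a choice refines the expected
    value assigned to that option for future choices (delta rule)."""
    return expected_value + learning_rate * (outcome - expected_value)

def update_control(control_setting, conflict, adjustment_rate=0.2):
    """Performance monitoring: evaluating *how* the choice went (e.g., how
    much conflict or error there was) adjusts how the next choice is made,
    rather than which option is valued."""
    return control_setting + adjustment_rate * conflict

value = 0.5        # expected value of one option
caution = 1.0      # e.g., a response threshold or level of attentional focus

value = update_option_value(value, outcome=1.0)    # reward was received
caution = update_control(caution, conflict=0.8)    # the choice felt hard

print(round(value, 2), round(caution, 2))   # 0.55 1.16
```

The first update changes which option looks best next time; the second changes how the next decision is made (how much evidence, caution, or attention it gets), which is the sense in which performance monitoring operates at a different level than option-value learning.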

Monday, April 20, 2020

How Becoming a Doctor Made Me a Worse Listener

Adeline Goss
JAMA. 2020;323(11):1041-1042.
doi:10.1001/jama.2020.2051

Here is an excerpt:

And I hadn’t noticed. Maybe that was because I was still connecting to patients. I still choked up when they cried, felt joy when they rejoiced, felt moved by and grateful for my work, and generally felt good about the care I was providing.

But as I moved through my next days in clinic, I began to notice the unconscious tricks I had developed to maintain a connection under time pressure. A whole set of expressions played out across my face during history taking—nonverbal concern, nonverbal gentleness, nonverbal apology—a time-efficient method of conveying empathy even when I was asking directed questions, controlling the type and volume of information I received, and, at times, interrupting. Sometimes I apologized to patients for my style of interviewing, explaining that I wanted to make sure I understood things clearly so that I could treat them. I apologized because I didn’t like communicating this way. I can’t imagine it felt good to them.

What’s strange is that, at the end of these visits, patients often thanked me for my concern and detail-orientedness. They may have interpreted my questioning as a sign that I was interested. But was I?

Interest is a multilayered concept in medicine. I care about patients, and I am interested in their stories in the sense that they contain the information I need to make the best possible decisions for their care. Interest motivates doctors to take a detailed history, review the chart, and analyze the literature. Interest leads to the correct diagnosis and treatment. Residency rewards this kind of interest. Perhaps as a result, looking around at my co-residents, it’s in abundant supply, even when time is tight.

The info is here.

Friday, September 20, 2019

Why Moral Emotions Go Viral Online

Ana P. Gantman, William J. Brady, & Jay Van Bavel
Scientific American
Originally posted August 20, 2019

Social media is changing the character of our political conversations. As many have pointed out, our attention is a scarce resource that politicians and journalists are constantly fighting to attract, and the online world has become a primary trigger of our moral outrage. These two ideas, it turns out, are fundamentally related. According to our forthcoming paper, words that appeal to one’s sense of right and wrong are particularly effective at capturing attention, which may help explain this new political reality.

It occurred to us that the way people scroll through their social media feeds is very similar to a classic method psychologists use to measure people’s ability to pay attention. When we mindlessly browse social media, we are rapidly presenting a stream of verbal stimuli to ourselves. Psychologists have been studying this issue in the lab for decades, displaying to subjects a rapid succession of words, one after another, in the blink of an eye. In the lab, people are asked to find a target word among a collection of other words. Once they find it, there’s a short window of time in which that word captures their attention. If there’s a second target word in that window, most people don’t even see it—almost as if they had blinked with their eyes open.

There is an exception: if the second target word is emotionally significant to the viewer, that person will see it. Some words are so important to us that they are able to capture our attention even when we are already paying attention to something else.

The info is here.

Monday, February 25, 2019

Information Processing Biases in the Brain: Implications for Decision-Making and Self-Governance

Sali, A.W., Anderson, B.A. & Courtney, S.M.
Neuroethics (2018) 11: 259.
https://doi.org/10.1007/s12152-016-9251-1

Abstract

To make behavioral choices that are in line with our goals and our moral beliefs, we need to gather and consider information about our current situation. Most information present in our environment is not relevant to the choices we need or would want to make and thus could interfere with our ability to behave in ways that reflect our underlying values. Certain sources of information could even lead us to make choices we later regret, and thus it would be beneficial to be able to ignore that information. Our ability to exert successful self-governance depends on our ability to attend to sources of information that we deem important to our decision-making processes. We generally assume that, at any moment, we have the ability to choose what we pay attention to. However, recent research indicates that what we pay attention to is influenced by our prior experiences, including reward history and past successes and failures, even when we are not aware of this history. Even momentary distractions can cause us to miss or discount information that should have a greater influence on our decisions given our values. Such biases in attention thus raise questions about the degree to which the choices that we make may be poorly informed and not truly reflect our ability to otherwise exert self-governance.

Here is part of the Conclusion:

In order to consistently make decisions that reflect our goals and values, we need to gather the information necessary to guide these decisions, and ignore information that is irrelevant. Although the momentary acquisition of irrelevant information will not likely change our goals, biases in attentional selection may still profoundly influence behavioral outcomes, tipping the balance between competing options when faced with a single goal (e.g., save the least competent swimmer) or between simultaneously competing goals (e.g., relieve drug craving and withdrawal symptoms vs. maintain abstinence). An important component of self-governance might, therefore, be the ability to exert control over how we represent our world as we consider different potential courses of action.

Monday, July 30, 2018

Biases Make People Vulnerable to Misinformation Spread by Social Media

Giovanni Luca Ciampaglia & Filippo Menczer
Scientific American
Originally published June 21, 2018

Here is an excerpt:

The third group of biases arises directly from the algorithms used to determine what people see online. Both social media platforms and search engines employ them. These personalization technologies are designed to select only the most engaging and relevant content for each individual user. But in doing so, it may end up reinforcing the cognitive and social biases of users, thus making them even more vulnerable to manipulation.

For instance, the detailed advertising tools built into many social media platforms let disinformation campaigners exploit confirmation bias by tailoring messages to people who are already inclined to believe them.

Also, if a user often clicks on Facebook links from a particular news source, Facebook will tend to show that person more of that site’s content. This so-called “filter bubble” effect may isolate people from diverse perspectives, strengthening confirmation bias.

The information is here.

Friday, July 28, 2017

I attend, therefore I am

Carolyn Dicey Jennings
Aeon.com
Originally published July 10, 2017

Here is an excerpt:

Following such considerations, the philosopher Daniel Dennett proposed that the self is simply a ‘centre of narrative gravity’ – just as the centre of gravity in a physical object is not a part of that object, but a useful concept we use to understand the relationship between that object and its environment, the centre of narrative gravity in us is not a part of our bodies, a soul inside of us, but a useful concept we use to make sense of the relationship between our bodies, complete with their own goals and intentions, and our environment. So, you, you, are a construct, albeit a useful one. Or so goes Dennett’s thinking on the self.

And it isn’t just Dennett. The idea that there is a substantive self is passé. When cognitive scientists aim to provide an empirical account of the self, it is simply an account of our sense of self – why it is that we think we have a self. What we don’t find is an account of a self with independent powers, responsible for directing attention and resolving conflicts of will.

There are many reasons for this. One is that many scientists think that the evidence counts in favour of our experience in general being epiphenomenal – something that does not influence our brain, but is influenced by it. In this view, when you experience making a tough decision, for instance, that decision was already made by your brain, and your experience is a mere shadow of that decision. So for the very situations in which we might think the self is most active – in resolving difficult decisions – everything is in fact already achieved by the brain.

The article is here.

Saturday, November 1, 2014

Are We Really Conscious?

By Michael Graziano
The New York Times Sunday Review
Originally published October 10, 2014

Here is an excerpt:

The brain builds models (or complex bundles of information) about items in the world, and those models are often not accurate. From that realization, a new perspective on consciousness has emerged in the work of philosophers like Patricia S. Churchland and Daniel C. Dennett. Here’s my way of putting it:

How does the brain go beyond processing information to become subjectively aware of information? The answer is: It doesn’t. The brain has arrived at a conclusion that is not correct. When we introspect and seem to find that ghostly thing — awareness, consciousness, the way green looks or pain feels — our cognitive machinery is accessing internal models and those models are providing information that is wrong. The machinery is computing an elaborate story about a magical-seeming property. And there is no way for the brain to determine through introspection that the story is wrong, because introspection always accesses the same incorrect information.

The entire article is here.

Friday, October 24, 2014

Can Our Brains Handle the Information Age?

An Interview with Daniel Levitin
By Bret S. Stetka
Medscape
Originally posted September 24, 2014

In his new book, The Organized Mind, best-selling author and neuroscientist Daniel Levitin, PhD, discusses our brain's ability—or lack thereof—to process the dizzying flow of information brought on us by the digital age. Dr Levitin also suggests numerous ways of organizing mass information to make it more manageable. Medscape recently spoke with Dr Levitin about the neuroscience of information processing as well as approaches potentially useful to overworked clinicians.

The Fear of Information

Medscape: Your new book discusses how throughout history humans have been suspicious of increased access to information, from the printing press back to the first Sumerian writings. But I think most would agree that these were positive advancements. Do you think the current digital age weariness expressed by many is more of the same and that today's rapid technological progression will end up being a positive development for humanity? Or has the volume of data out there just gotten too big for the human brain to handle?

Dr Levitin: I have two minds about this. On one hand, there is this "same as it ever was" kind of complaint cycle. Seneca complained at the time of the ancient Greeks about the invention of writing—that it was going to weaken men's minds because they would no longer engage in thoughtful conversation. You couldn't interrogate the person who was telling you something, meaning that lies could be promulgated more easily and passed from generation to generation.

(cut)

If we look back at our evolutionary history, the amount of information that existed in the world just a few thousand years ago was really just a small percentage of what exists now. By some estimates, the amount of scientific and medical information produced in the last 25 years is equal to all of the information in all of human history up to that point.

The human brain can really only attend to a few things at once, so I think we are reaching a point where we have to figure out how to filter information so that we can use it more intelligently and not be distracted by irrelevant information. Studies show that people who are given more information in certain situations tend to make poorer decisions because they become distracted or overwhelmed by the irrelevant information.

The entire interview is here.