Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Critical Thinking. Show all posts

Wednesday, December 13, 2023

Science and Ethics of “Curing” Misinformation

Freiling, I., Krause, N.M., & Scheufele, D.A.
AMA J Ethics. 2023;25(3):E228-237. 

Abstract

A growing chorus of academicians, public health officials, and other science communicators have warned of what they see as an ill-informed public making poor personal or electoral decisions. Misinformation is often seen as an urgent new problem, so some members of these communities have pushed for quick but untested solutions without carefully diagnosing ethical pitfalls of rushed interventions. This article argues that attempts to “cure” public opinion that are inconsistent with best available social science evidence not only leave the scientific community vulnerable to long-term reputational damage but also raise significant ethical questions. It also suggests strategies for communicating science and health information equitably, effectively, and ethically to audiences affected by it without undermining affected audiences’ agency over what to do with it.

My summary:

The authors explore the challenges and ethical considerations surrounding efforts to combat misinformation. They argue that framing these efforts as "curing" is problematic, because it casts misinformation as a disease that can be eradicated. In their view, this approach is overly simplistic and disregards the complex social and psychological factors that contribute to the spread of misinformation.

The authors identify several ethical concerns with current approaches to combating misinformation, including:
  • The potential for censorship and suppression of legitimate dissent.
  • The undermining of public trust in science and expertise.
  • The creation of echo chambers and further polarization of public opinion.
Instead of trying to "cure" misinformation, the authors propose a more nuanced and ethical approach that focuses on promoting critical thinking, media literacy, and civic engagement. They also emphasize the importance of addressing the underlying social and psychological factors that contribute to the spread of misinformation, such as social isolation, distrust of authority, and a desire for simple explanations.

Wednesday, August 30, 2023

Not all skepticism is “healthy” skepticism: Theorizing accuracy- and identity-motivated skepticism toward social media misinformation

Li, J. (2023). 
New Media & Society, 0(0). 

Abstract

Fostering skepticism has been seen as key to addressing misinformation on social media. This article reveals that not all skepticism is “healthy” skepticism by theorizing, measuring, and testing the effects of two types of skepticism toward social media misinformation: accuracy- and identity-motivated skepticism. A two-wave panel survey experiment shows that when people’s skepticism toward social media misinformation is driven by accuracy motivations, they are less likely to believe in congruent misinformation later encountered. They also consume more mainstream media, which in turn reinforces accuracy-motivated skepticism. In contrast, when skepticism toward social media misinformation is driven by identity motivations, people not only fall for congruent misinformation later encountered, but also disregard platform interventions that flag a post as false. Moreover, they are more likely to see social media misinformation as favoring opponents and intentionally avoid news on social media, both of which form a vicious cycle of fueling more identity-motivated skepticism.

Discussion

I have made the case that it is important to distinguish between accuracy-motivated skepticism and identity-motivated skepticism. They are empirically distinguishable constructs that cast opposing effects on outcomes important for a well-functioning democracy. Across the board, accuracy-motivated skepticism produces normatively desirable outcomes. Holding a higher level of accuracy-motivated skepticism makes people less likely to believe in congruent misinformation they encounter later, offering hope that partisan motivated reasoning can be attenuated. Accuracy-motivated skepticism toward social media misinformation also has a mutually reinforcing relationship with consuming news from mainstream media, which can serve to verify information on social media and produce potential learning effects.

In contrast, not all skepticism is “healthy” skepticism. Holding a higher level of identity-motivated skepticism not only increases people’s susceptibility to congruent misinformation they encounter later, but also renders content flagging by social media platforms less effective. This is worrisome as calls for skepticism and platform content moderation have been a crucial part of recently proposed solutions to misinformation. Further, identity-motivated skepticism reinforces perceived bias of misinformation and intentional avoidance of news on social media. These can form a vicious cycle of close-mindedness and politicization of misinformation.

This article advances previous understanding of skepticism by showing that beyond the amount of questioning (the tipping point between skepticism and cynicism), the type of underlying motivation matters for whether skepticism helps people become more informed. By bringing motivated reasoning and media skepticism into the same theoretical space, this article helps us make sense of the contradictory evidence on the utility of media skepticism. Skepticism in general should not be assumed to be "healthy" for democracy. When driven by identity motivations, skepticism toward social media misinformation is counterproductive for political learning; only when skepticism toward social media is driven by accuracy motivations does it inoculate people against favorable falsehoods and encourage consumption of credible alternatives.


Here are some additional thoughts on the research:
  • The distinction between accuracy-motivated skepticism and identity-motivated skepticism is a useful one. It helps to explain why some people are more likely to believe in misinformation than others.
  • The findings of the studies suggest that interventions that promote accuracy-motivated skepticism could be effective in reducing the spread of misinformation on social media.
  • It is important to note that the research was conducted in the United States. It is possible that the findings would be different in other countries.

Tuesday, August 8, 2023

Predictors and consequences of intellectual humility

Porter, T., Elnakouri, A., Meyers, E.A. et al. 
Nat Rev Psychol 1, 524–536 (2022).

Abstract

In a time of societal acrimony, psychological scientists have turned to a possible antidote — intellectual humility. Interest in intellectual humility comes from diverse research areas, including researchers studying leadership and organizational behaviour, personality science, positive psychology, judgement and decision-making, education, culture, and intergroup and interpersonal relationships. In this Review, we synthesize empirical approaches to the study of intellectual humility. We critically examine diverse approaches to defining and measuring intellectual humility and identify the common element: a meta-cognitive ability to recognize the limitations of one’s beliefs and knowledge. After reviewing the validity of different measurement approaches, we highlight factors that influence intellectual humility, from relationship security to social coordination. Furthermore, we review empirical evidence concerning the benefits and drawbacks of intellectual humility for personal decision-making, interpersonal relationships, scientific enterprise and society writ large. We conclude by outlining initial attempts to boost intellectual humility, foreshadowing possible scalable interventions that can turn intellectual humility into a core interpersonal, institutional and cultural value.

Importance of intellectual humility

The willingness to recognize the limits of one’s knowledge and fallibility can confer societal and individual benefits, if expressed in the right moment and to the proper extent. This insight echoes the philosophical roots of intellectual humility as a virtue. State and trait intellectual humility have been associated with a range of cognitive, social and personality variables (Table 2). At the societal level, intellectual humility can promote societal cohesion by reducing group polarization and encouraging harmonious intergroup relationships. At the individual level, intellectual humility can have important consequences for wellbeing, decision-making and academic learning.

Notably, empirical research has provided little evidence regarding the generalizability of the benefits or drawbacks of intellectual humility beyond the unique contexts of WEIRD (Western, educated, industrialized, rich and democratic) societies. With this caveat, below is an initial set of findings concerning the implications of possessing high levels of intellectual humility. Unless otherwise specified, the evidence below concerns trait-level intellectual humility. After reviewing these benefits, we consider attempts to improve an individual’s intellectual humility and confer associated benefits.

(cut)

Individual benefits

Intellectual humility might also have direct consequences for individuals’ wellbeing. People who reason about social conflicts in an intellectually humbler manner and consider others’ perspectives (components of wise reasoning) are more likely to report higher levels of life satisfaction and less negative affect compared to people who do not. Leaders who are higher in intellectual humility are also higher in emotional intelligence and receive higher satisfaction ratings from their followers, which suggests that intellectual humility could benefit professional life. Nonetheless, intellectual humility is not associated with personal wellbeing in all contexts: religious leaders who see their religious beliefs as fallible have lower wellbeing relative to leaders who are less intellectually humble in their beliefs.

Intellectual humility might also help people to make well informed decisions. Intellectually humbler people are better able to differentiate between strong and weak arguments, even if those arguments go against their initial beliefs. Intellectual humility might also protect against memory distortions. Intellectually humbler people are less likely to claim falsely that they have seen certain statements before. Likewise, intellectually humbler people are more likely to scrutinize misinformation and are more likely to intend to receive the COVID-19 vaccine.

Lastly, intellectual humility is positively associated with knowledge acquisition, learning and educational achievement. Intellectually humbler people are more motivated to learn and more knowledgeable about general facts. Likewise, intellectually humbler high school and university students expend greater effort when learning difficult material, are more receptive to assignment feedback and earn higher grades.

Despite evidence of individual benefits associated with intellectual humility, much of this work is correlational. Thus, associations could be the product of confounding factors such as agreeableness, intelligence or general virtuousness. Longitudinal or experimental studies are needed to address the question of whether and under what circumstances intellectual humility promotes individual benefits. Notably, philosophical theorizing about the situation-specific virtuousness of the construct suggests that high levels of intellectual humility are unlikely to benefit all people in all situations.


What is intellectual humility? Intellectual humility is the ability to recognize the limits of one's knowledge and to be open to new information and perspectives.

Predictors of intellectual humility: There are a number of factors that can predict intellectual humility, including:
  • Personality traits: People who are high in openness to experience and agreeableness are more likely to be intellectually humble.
  • Cognitive abilities: People who are better at thinking critically and evaluating evidence are also more likely to be intellectually humble.
  • Cultural factors: People who live in cultures that value open-mindedness and tolerance are more likely to be intellectually humble.
Consequences of intellectual humility: Intellectual humility has a number of positive consequences, including:
  • Better decision-making: Intellectually humble people are more likely to make better decisions because they are more open to new information and perspectives.
  • Enhanced learning: Intellectually humble people are more likely to learn from their mistakes and to grow as individuals.
  • Stronger relationships: Intellectually humble people are more likely to have strong relationships because they are more willing to listen to others and to consider their perspectives.

Overall, intellectual humility is a valuable trait that can lead to a number of positive outcomes.

Sunday, January 1, 2023

The Central Role of Lifelong Learning & Humility in Clinical Psychology

Washburn, J. J., Teachman, B. A., et al. 
(2022). Clinical Psychological Science, 0(0).
https://doi.org/10.1177/21677026221101063

Abstract

Lifelong learning plays a central role in the lives of clinical psychologists. As psychological science advances and evidence-based practices develop, it is critical for clinical psychologists to not only maintain their competencies but to also evolve them. In this article, we discuss lifelong learning as a clinical, ethical, and scientific imperative in the myriad dimensions of the clinical psychologist’s professional life, arguing that experience alone is not sufficient. Attitude is also important in lifelong learning, and we call for clinical psychologists to adopt an intellectually humble stance and embrace “a beginner’s mind” when approaching new knowledge and skills. We further argue that clinical psychologists must maintain and refresh their critical-thinking skills and seek to minimize their biases, especially as they approach the challenges and opportunities of lifelong learning. We intend for this article to encourage psychologists to think differently about how they approach lifelong learning.

Here is an excerpt:

Schwartz (2008) was specifically referencing the importance of teaching graduate students to embrace what they do not know, viewing it as an opportunity instead of a threat. The same is true, perhaps even more so, for psychologists engaging in lifelong learning.

As psychologists progress in their careers, they are told repeatedly that they are experts in their field and sometimes THE expert in their own tiny subfield. Psychologists spend their days teaching others what they know and advising students how to make their own discoveries. But expertise is a double-edged sword. Of course, it serves psychologists well in that they are less likely to repeat past mistakes, but it is a disadvantage if they become too comfortable in their expert role. The Egyptian mathematician Ptolemy devised a system based on the notion that the sun revolved around the earth that guided astronomers for centuries until Copernicus proved him wrong. Although Newton devised the laws of physics, Einstein showed that the principles of Newtonian physics were wholly bound by context and only "right" within certain constraints. Science is inherently self-correcting, and the only thing that one can count on is that most of what people believe today will be shown to be wrong in the not-too-distant future. One of the authors (S. D. Hollon) recalls that the two things that he knew for sure coming out of graduate school were that neural tissues do not regenerate and that you cannot inherit acquired characteristics. It turns out that both are wrong. Lifelong learning and the science it is based on require psychologists to continuously challenge their expertise. Before becoming experts, psychologists often experience impostor phenomenon during education and training (Rokach & Boulazreg, 2020). Embracing the self-doubt that comes with feeling like an impostor can motivate lifelong learning, even for areas in which one feels like an expert. This means not only constantly learning about new topics but also recognizing that as psychologists tackle tough problems and their associated research questions, complex and often interdisciplinary approaches are required to develop meaningful answers. It is neither feasible nor desirable to become an expert in all domains.
This means that psychologists need to routinely surround themselves with people who make them question or expand their expertise.

Here is the conclusion:

Lifelong learning should, like doctoral programs in clinical psychology, concentrate much more on thinking than training. Lifelong learning must encourage critical and independent thinking in the process of mastering relevant bodies of knowledge and the development of specific skills. Specifically, lifelong learning must reinforce the need for clinical psychologists to reflect carefully and critically on what they read, hear, and say and to think abstractly. Such abstract thinking is as relevant after one’s graduate career as before.

Wednesday, November 3, 2021

Maybe a free thinker but not a critical one: High conspiracy belief is associated with low critical thinking ability

Lantian, A., Bagneux, V., Delouvée, S., 
& Gauvrit, N. (2020, February 7). 
Applied Cognitive Psychology
https://doi.org/10.31234/osf.io/8qhx4

Abstract

Critical thinking is of paramount importance in our society. People regularly assume that critical thinking is a way to reduce conspiracy belief, although the relationship between critical thinking and conspiracy belief has never been tested. We conducted two studies (Study 1, N = 86; Study 2, N = 252), in which we found that critical thinking ability—measured by an open-ended test emphasizing several areas of critical thinking ability in the context of argumentation—is negatively associated with belief in conspiracy theories. Additionally, we did not find a significant relationship between self-reported (subjective) critical thinking ability and conspiracy belief. Our results support the idea that conspiracy believers have less developed critical thinking ability and stimulate discussion about the possibility of reducing conspiracy beliefs via the development of critical thinking.

From the General Discussion

The presumed role of critical thinking in belief in conspiracy theories is continuously discussed by researchers, journalists, and by lay people on social networks. One example is the capacity to exercise critical thinking ability to distinguish bogus conspiracy theories from genuine conspiracy theories (Bale, 2007), leading us to question when critical thinking ability could be used to support this adaptive function. Sometimes, it is not unreasonable to think that a form of rationality would help to facilitate the detection of dangerous coalitions (van Prooijen & Van Vugt, 2018). In that respect, Stojanov and Halberstadt (2019) recently introduced a distinction between irrational versus rational suspicion. Although the former focuses on the general tendency to believe in any conspiracy theories, the latter focuses on higher sensitivity to deception or corruption, which is defined as "healthy skepticism." These two aspects of suspicion can now be handled simultaneously thanks to a new scale developed by Stojanov and Halberstadt (2019). In our study, we found that critical thinking ability was associated with lower unfounded belief in conspiracy theories, but this does not answer the question as to whether critical thinking ability can be helpful for the detection of true conspiracies. Future studies could use this new measurement to address this specific question.

Thursday, January 14, 2021

'How Did We Get Here?' A Call For An Evangelical Reckoning On Trump

Rachel Martin
NPR.org
Originally posted January 13, 2021

Here is an excerpt:

You write that Trump has burned down the Republican Party. What has he done to the evangelical Christian movement?

If you asked today, "What's an evangelical?" to most people, I would want them to say: someone who believes Jesus died on the cross for our sin and in our place and we're supposed to tell everyone about it. But for most people they'd say, "Oh, those are those people who are really super supportive of the president no matter what he does." And I don't think that's what we want to be known for. That's certainly not what I want to be known for. And I think as this presidency is ending in tatters as it is, hopefully more and more evangelicals will say, "You know, we should have seen earlier, we should have known better, we should have honored the Lord more in our actions these last four years."

Should ministers on Sunday mornings be delivering messages about how to sort fact from fiction and discouraging their parishioners from seeking truth in these darkest corners of the Internet peddling lies?

Absolutely, absolutely. Mark Noll wrote years ago a book called The Scandal of the Evangelical Mind, and he was talking about the lack of intellectual engagement in some corners of evangelicalism.

I think the scandal of the evangelical mind today is the gullibility that so many have been brought into — conspiracy theories, false reports and more — and so I think the Christian responsibility is we need to engage in what we call in the Christian tradition, discipleship. Jesus says, "I am the way, the truth and the life." So Jesus literally identifies himself as the truth; therefore, if there ever should be a people who care about the truth, it should be people who call themselves followers of Jesus.

Monday, February 25, 2019

A philosopher’s life

Margaret Nagle
UMaineToday
Fall/Winter 2018

Here is an excerpt:

Mention philosophy and for most people, images of the bearded philosophers of Ancient Greece pontificating in the marketplace come to mind. Today, philosophers are still in public arenas, Miller says, but now that engagement with society is in K–12 education, medicine, government, corporations, environmental issues and so much more. Public philosophers are students of community knowledge, learning as much as they teach.

The field of clinical ethics, which helps patients, families and clinicians address ethical issues that arise in health care, emerged in recent decades as medical decisions became more complex in an increasingly technological society. Those questions can range from when to stop aggressive medical intervention to whether expressed breast milk from a patient who uses medical marijuana should be given to her baby in the neonatal intensive care unit.

As a clinical ethicist, Miller provides training and consultation for physicians, nurses and other medical personnel. She also may be called on to consult with patients and their family members. Unlike urban areas where a city hospital may have a whole department devoted to clinical ethics, rural health care settings often struggle to find such philosophy-focused resources.

That’s why Miller does what she does in Maine.

Miller focuses on “building clinical ethics capacity” in the state’s rural health care settings, providing training, connecting hospital personnel to readings and resources, and facilitating opportunities to maintain ongoing exploration of critical issues.

The article is here.

Friday, December 14, 2018

Don’t Want to Fall for Fake News? Don’t Be Lazy

Robbie Gonzalez
www.wired.com
Originally posted November 9, 2018

Here are two excerpts:

Misinformation researchers have proposed two competing hypotheses for why people fall for fake news on social media. The popular assumption—supported by research on apathy over climate change and the denial of its existence—is that people are blinded by partisanship, and will leverage their critical-thinking skills to ram the square pegs of misinformation into the round holes of their particular ideologies. According to this theory, fake news doesn't so much evade critical thinking as weaponize it, preying on partiality to produce a feedback loop in which people become worse and worse at detecting misinformation.

The other hypothesis is that reasoning and critical thinking are, in fact, what enable people to distinguish truth from falsehood, no matter where they fall on the political spectrum. (If this sounds less like a hypothesis and more like the definitions of reasoning and critical thinking, that's because they are.)

(cut)

All of which suggests susceptibility to fake news is driven more by lazy thinking than by partisan bias. Which on one hand sounds—let's be honest—pretty bad. But it also implies that getting people to be more discerning isn't a lost cause. Changing people's ideologies, which are closely bound to their sense of identity and self, is notoriously difficult. Getting people to think more critically about what they're reading could be a lot easier, by comparison.

Then again, maybe not. "I think social media makes it particularly hard, because a lot of the features of social media are designed to encourage non-rational thinking," Rand says. Anyone who has sat and stared vacantly at their phone while thumb-thumb-thumbing to refresh their Twitter feed, or closed out of Instagram only to re-open it reflexively, has experienced firsthand what it means to browse in such a brain-dead, ouroboric state. Default settings like push notifications, autoplaying videos, algorithmic news feeds—they all cater to humans' inclination to consume things passively instead of actively, to be swept up by momentum rather than resist it.

The info is here.

Monday, March 12, 2018

Train PhD students to be thinkers not just specialists

Gundula Bosch
nature.com
Originally posted February 14, 2018

Under pressure to turn out productive lab members quickly, many PhD programmes in the biomedical sciences have shortened their courses, squeezing out opportunities for putting research into its wider context. Consequently, most PhD curricula are unlikely to nurture the big thinkers and creative problem-solvers that society needs.

That means students are taught every detail of a microbe’s life cycle but little about the life scientific. They need to be taught to recognize how errors can occur. Trainees should evaluate case studies derived from flawed real research, or use interdisciplinary detective games to find logical fallacies in the literature. Above all, students must be shown the scientific process as it is — with its limitations and potential pitfalls as well as its fun side, such as serendipitous discoveries and hilarious blunders.

This is exactly the gap that I am trying to fill at Johns Hopkins University in Baltimore, Maryland, where a new graduate science programme is entering its second year. Microbiologist Arturo Casadevall and I began pushing for reform in early 2015, citing the need to put the philosophy back into the doctorate of philosophy: that is, the ‘Ph’ back into the PhD.

The article is here.

Wednesday, February 28, 2018

Can scientists agree on a code of ethics?

David Ryan Polgar
BigThink.com
Originally published January 30, 2018

Here is an excerpt:

Regarding the motivation for developing this Code of Ethics, Hug mentioned the threat of reduced credibility of research if the standards seem too loose. She mentioned the pressure that many young scientists face in being prolific with research, insinuating the tension between quantity and quality. "We want research to remain credible because we want it to have an impact on policymakers, research being turned into action." One of the goals of Hug's presentation about the Code of Ethics, she said, was to start having various research institutions endorse the document, and have those institutions start distributing the Code of Ethics within their networks.

“All these goals will conflict with each other," said Jodi Halpern, referring to the issues that may get in the way of adopting a code of ethics for scientists. "People need rigorous education in ethical reasoning, which is just as rigorous as science education...what I’d rather have as a requirement, if I’d like to put teeth anywhere. I’d like to have every doctoral student not just have one of those superficial IRB fake compliance courses, but I’d like to have them have to pass a rigorous exam showing how they would deal with certain ethical dilemmas. And everybody who will be the head of a lab someday will have really learned how to do that type of thinking.”

The article is here.

Monday, December 11, 2017

Epistemic rationality: Skepticism toward unfounded beliefs requires sufficient cognitive ability and motivation to be rational

Tomas Ståhl and Jan-Willem van Prooijen
Personality and Individual Differences
Volume 122, 1 February 2018, Pages 155-163

Abstract

Why does belief in the paranormal, conspiracy theories, and various other phenomena that are not backed up by evidence remain widespread in modern society? In the present research we adopt an individual difference approach, as we seek to identify psychological precursors of skepticism toward unfounded beliefs. We propose that part of the reason why unfounded beliefs are so widespread is because skepticism requires both sufficient analytic skills, and the motivation to form beliefs on rational grounds. In Study 1 we show that analytic thinking is associated with a lower inclination to believe various conspiracy theories, and paranormal phenomena, but only among individuals who strongly value epistemic rationality. We replicate this effect on paranormal belief, but not conspiracy beliefs, in Study 2. We also provide evidence suggesting that general cognitive ability, rather than analytic cognitive style, is the underlying facet of analytic thinking that is responsible for these effects.

The article is here.

To think critically, you have to be both analytical and motivated

John Timmer
Ars Technica
Originally published November 15, 2017

Here is an excerpt:

One of the proposed solutions to this issue is to incorporate more critical thinking into our education system. But critical thinking is more than just a skill set; you have to recognize when to apply it, do so effectively, and then know how to respond to the results. Understanding what makes a person effective at analyzing fake news and conspiracy theories has to take all of this into account. A small step toward that understanding comes from a recently released paper, which looks at how analytical thinking and motivated skepticism interact to make someone an effective critical thinker.

Valuing rationality

The work comes courtesy of the University of Illinois at Chicago's Tomas Ståhl and Jan-Willem van Prooijen at VU Amsterdam. This isn't the first time we've heard from Ståhl; last year, he published a paper on what he termed "moralizing epistemic rationality." In it, he looked at people's thoughts on the place critical thinking should occupy in their lives. The research identified two classes of individuals: those who valued their own engagement with critical thinking, and those who viewed it as a moral imperative that everyone engage in this sort of analysis.

The information is here.

The target article is here.

Saturday, December 9, 2017

Evidence-Based Policy Mistakes

Kausik Basu
Project Syndicate
Originally published November 30, 2017

Here is an excerpt:

Likewise, US President Donald Trump cites simplistic trade-deficit figures to justify protectionist policies that win him support among a certain segment of the US population. In reality, the evidence suggests that such policies will hurt the very people Trump claims to be protecting.

Now, the chair of Trump’s Council of Economic Advisers, Kevin Hassett, is attempting to defend Congressional Republicans’ effort to slash corporate taxes by claiming that, when developed countries have done so in the past, workers gained “well north of” $4,000 per year. Yet there is ample evidence that the benefits of such tax cuts accrue disproportionately to the rich, largely via companies buying back stock and shareholders earning higher dividends.

It is not clear whence Hassett is getting his data. But chances are that, at the very least, he is misinterpreting it. And he is far from alone in failing to reach accurate conclusions when assessing a given set of data.

Consider the oft-repeated refrain that, because there is evidence that virtually all jobs over the last decade were created by the private sector, the private sector must be the most effective job creator. At first glance, the logic might seem sound. But, on closer examination, the statement begs the question. Imagine a Soviet economist claiming that, because the government created virtually all jobs in the Soviet Union, the government must be the most effective job creator. To find the truth, one would need, at a minimum, data on who else tried to create jobs, and how.

The article is here.

Thursday, December 7, 2017

Social media threat: People learned to survive disease, we can handle Twitter

Glenn Harlan Reynolds
USA Today
Originally posted November 20, 2017

Here is an excerpt:

Hunters and gatherers were at far less risk for infectious disease because they didn’t encounter very many new people very often. Their exposure was low, and contact among such bands was sporadic enough that diseases couldn’t spread very fast.

It wasn’t until you crowded thousands, or tens of thousands of them, along with their animals, into small dense areas with poor sanitation that disease outbreaks took off.  Instead of meeting dozens of new people per year, an urban dweller probably encountered hundreds per day. Diseases that would have affected only a few people at a time as they spread slowly across a continent (or just burned out for lack of new carriers) would now leap from person to person in a flash.

Likewise, in recent years we’ve gone from an era when ideas spread comparatively slowly, to one in which social media in particular allow them to spread like wildfire. Sometimes that’s good, when they’re good ideas. But most ideas are probably bad; certainly 90% of ideas aren’t in the top 10%. Maybe we don’t know the mental disease vectors that we’re inadvertently unleashing.

It took three things to help control the spread of disease in cities: sanitation, acclimation and better nutrition. In early cities, after all, people had no idea how diseases spread, something we didn’t fully understand until the late 19th century. But rule-of-thumb sanitation made things a lot better over time. Also, populations eventually adapted:  Diseases became endemic, not epidemic, and usually less severe as people developed immunity. And finally, as Scott notes, surviving disease was always a function of nutrition, with better-nourished populations doing much better than malnourished ones.

The article is here.

Wednesday, January 25, 2017

The candy diet

Seth Godin
Seth's Blog
Originally posted January 4, 2017

The bestselling novel of 1961 was Allen Drury's Advise and Consent. Millions of people read this 690-page political novel. In 2016, the big sellers were coloring books.

Fifteen years ago, cable channels like TLC (the "L" stood for Learning), Bravo and the History Channel (the "History" stood for History) promised to add texture and information to the blighted TV landscape. Now these networks run shows about marrying people based on how well they kiss.

And of course, newspapers won Pulitzer prizes for telling us things we didn't want to hear. We've responded by not buying newspapers any more.

The decline of thoughtful media has been discussed for a century. This is not new. What is new: A fundamental shift not just in the profit-seeking gatekeepers, but in the culture as a whole.

"Everything should be made as simple as possible, but no simpler."*

[*Ironically, this isn't what Einstein actually said. It was this, "It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience." Alas, I've been seduced into believing that the shorter one now works better.]

Is it possible we've made things simpler than they ought to be, and established non-curiosity as the new standard?

The blog post is here.

Tuesday, June 7, 2016

Student Resistance to Thought Experiments

Regina A. Rini
APA Newsletter - Teaching Philosophy
Spring 2016, Volume 15 (2)

Introduction

From Swampmen to runaway trolleys, philosophers make routine use of thought experiments. But our students are not always so enthusiastic. Most teachers of introductory philosophy will be familiar with the problem: students push back against the use of thought experiments, and not for the reasons that philosophers are likely to accept. Rather than challenge whether the thought experiments actually support particular conclusions, students instead challenge their realism or their relevance.

In this article I will look at these sorts of challenges, with two goals in mind. First, there is a practical pedagogical goal: How do we guide students to overcome their resistance to a useful method? Second, there is something I will call “pedagogical bad faith.” Many of us actually do have sincere doubts, as professional philosophers, about the value of thought experiment methodology. Some of these doubts in fact correspond to our students’ naïve resistance. But we often decide, for pedagogical reasons, to avoid mentioning our own doubts to students. Is this practice defensible?

The article is here.

Editor's Note: I agree with this article in many ways.  After reading a philosophy article and listening to a podcast that relied on thought experiments, I offered the author critiques of the thought experiments' limitations. My criticisms were dismissed with an ad hominem attack on my lack of understanding of philosophy and of how philosophers work.  I was told I should read more philosophy, especially Derek Parfit.  I wish I had had this article several years ago.

Saturday, September 5, 2015

Singularitarians, AItheists, and Why the Problem with Artificial Intelligence is Humanity At Large

By Luciano Floridi
The American Philosophical Association Newsletter
Spring 2015, Volume 14, Number 2, pp 8-11.

Here is an excerpt:

The success of our technologies largely depends on the fact that, while we were speculating about the possibility of true AI, we increasingly enveloped the world in so many devices, applications, and data that it became an IT-friendly environment, where technologies can replace us without having any understanding or semantic skills. Memory (as in algorithms and immense datasets) outperforms intelligence when landing an aircraft, finding the fastest route from home to the office, or discovering the best price for your next fridge. The BBC has made a two-minute animation to introduce the idea of a fourth revolution that is worth watching.  Unfortunately, like John Searle, it made a mistake in the end, equating “better at accomplishing tasks” with “better at thinking.” I never argued that digital technologies think better than us, but that they can do more and more things better than us by processing increasing amounts of data. What’s the difference? The same as between you and the dishwasher when washing the dishes. What’s the consequence? That any apocalyptic vision of AI is just silly. The serious risk is not the appearance of some superintelligence, but that we may misuse our digital technologies, to the detriment of a large percentage of humanity and the whole planet. We are and shall remain for the foreseeable future the problem, not our technology. We should be worried about real human stupidity, not imaginary artificial intelligence. The problem is not HAL but H.A.L., Humanity At Large.

The entire article is here.

Sunday, July 6, 2014

Empirical neuroenchantment: from reading minds to thinking critically

Sabrina S. Ali, Michael Lifshitz, and Amir Raz
Front. Hum. Neurosci., 27 May 2014 | doi: 10.3389/fnhum.2014.00357

While most experts agree on the limitations of neuroimaging, the unversed public—and indeed many a scholar—often valorizes brain imaging without heeding its shortcomings. Here we test the boundaries of this phenomenon, which we term neuroenchantment. How much are individuals ready to believe when encountering improbable information through the guise of neuroscience? We introduced participants to a crudely-built mock brain scanner, explaining that the machine would measure neural activity, analyze the data, and then infer the content of complex thoughts. Using a classic magic trick, we crafted an illusion whereby the imaging technology seemed to decipher the internal thoughts of participants. We found that most students—even undergraduates with advanced standing in neuroscience and psychology, who have been taught the shortcomings of neuroimaging—deemed such unlikely technology highly plausible. Our findings highlight the influence neuro-hype wields over critical thinking.

The entire article is here.