Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Saturday, December 23, 2023

Folk Psychological Attributions of Consciousness to Large Language Models

Colombatto, C., & Fleming, S. M.
(2023, November 22). PsyArXiv

Abstract

Technological advances raise new puzzles and challenges for cognitive science and the study of how humans think about and interact with artificial intelligence (AI). For example, the advent of Large Language Models and their human-like linguistic abilities has raised substantial debate regarding whether or not AI could be conscious. Here we consider the question of whether AI could have subjective experiences such as feelings and sensations ("phenomenological consciousness"). While experts from many fields have weighed in on this issue in academic and public discourse, it remains unknown how the general population attributes phenomenology to AI. We surveyed a sample of US residents (N=300) and found that a majority of participants were willing to attribute phenomenological consciousness to LLMs. These attributions were robust, as they predicted attributions of mental states typically associated with phenomenology – but also flexible, as they were sensitive to individual differences such as usage frequency. Overall, these results show how folk intuitions about AI consciousness can diverge from expert intuitions – with important implications for the legal and ethical status of AI.


My summary:

The results of the study show that people are generally more likely to attribute consciousness to LLMs than to other non-human entities, such as animals, plants, and robots. However, the level of consciousness attributed to LLMs remains relatively low, with most participants rating them as less conscious than humans. The authors argue that these findings reflect the influence of folk psychology: the everyday tendency to explain the behavior of others in terms of mental states.

The authors also found that people's attributions of consciousness to LLMs were influenced by their beliefs about the nature of consciousness and their familiarity with LLMs. Participants who were more familiar with LLMs were more likely to attribute consciousness to them, and participants who believed that consciousness is a product of complex computation were also more likely to attribute consciousness to LLMs.

Overall, the study suggests that people are generally open to the possibility that LLMs may be conscious, while recognizing that LLMs are not as conscious as humans. These findings have implications for the development and use of LLMs, as they suggest that people may be more willing to trust and interact with LLMs they believe are conscious.