Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Wednesday, July 2, 2025

Realization of Empathy Capability for the Evolution of Artificial Intelligence Using an MXene(Ti3C2)-Based Memristor

Wang, Y., Zhang, Y., et al. (2024).
Electronics, 13(9), 1632.

Abstract

Empathy is the emotional capacity to feel and understand the emotions experienced by other human beings from within their frame of reference. As a unique psychological faculty, empathy is an important source of motivation to behave altruistically and cooperatively. Although human-like emotion should be a critical component in the construction of artificial intelligence (AI), the discovery of emotional elements such as empathy is subject to complexity and uncertainty. In this work, we demonstrated an interesting electrical device (i.e., an MXene (Ti3C2) memristor) and successfully exploited the device to emulate a psychological model of “empathic blame”. To emulate this affective reaction, MXene was introduced into memristive devices because of its interesting structure and ionic capacity. Additionally, depending on the number of rehearsal repetitions, the self-adaptive characteristic of the memristive weights corresponded to different levels of empathy. Moreover, an artificial neural system was designed to analogously realize a moral judgment with empathy. This work may indicate a breakthrough in making cool machines manifest real voltage-motivated feelings at the level of the hardware rather than the algorithm.

Here are some thoughts:

This research represents a critical step toward endowing machines with human-like emotional capabilities, particularly empathy. Traditionally, AI has been limited to algorithmic decision-making and pattern recognition, lacking the nuanced ability to understand or simulate human emotions. By using an MXene-based memristor to emulate "empathic blame," researchers have demonstrated a hardware-level mechanism that mimics how humans adjust their moral judgments based on repeated exposure to similar situations—an essential component of empathetic reasoning. This breakthrough suggests that future AI systems could be designed not just to recognize emotions but to adaptively respond to them in real time, potentially leading to more socially intelligent machines.
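To make the adaptation idea concrete, here is a minimal numerical sketch of the qualitative mechanism the abstract describes: a memristor-like weight that strengthens over repeated "rehearsals" of a scenario, and an empathy level derived from that weight which modulates a blame judgment. This is not the authors' device model or code; the saturating update rule, the `update_weight` and `blame_judgment` functions, and every constant are illustrative assumptions, and the choice to have empathy attenuate blame is an arbitrary modeling decision rather than a claim about the paper's psychological model.

```python
# Toy sketch (assumptions throughout): a memristor-like weight that potentiates
# with repeated rehearsal pulses, read out as an "empathy level" that modulates
# a simple blame judgment. Not the authors' device physics or neural system.

def update_weight(w, pulses, gain=0.3, w_max=1.0):
    """Saturating potentiation: each rehearsal pulse nudges the weight
    toward w_max, loosely mimicking conductance growth under repeated
    stimulation. gain and w_max are arbitrary illustrative constants."""
    for _ in range(pulses):
        w += gain * (w_max - w)
    return w

def blame_judgment(harm, empathy):
    """Hypothetical moral-judgment rule: blame scales with harm severity
    but is attenuated as the empathy weight grows. The 0.5 attenuation
    factor is an assumption, not a value from the paper."""
    return harm * (1.0 - 0.5 * empathy)

if __name__ == "__main__":
    w = 0.1      # initial, weakly adapted weight
    harm = 0.8   # severity of the observed transgression (arbitrary units)
    for rehearsal in range(1, 6):
        w = update_weight(w, pulses=1)
        print(f"rehearsal {rehearsal}: empathy weight {w:.2f}, "
              f"blame {blame_judgment(harm, w):.2f}")
```

Running this toy shows the pattern the commentary points to: with each repetition the weight saturates toward its maximum and the judgment of the same harmful act shifts accordingly, a software caricature of what the paper claims to achieve directly in hardware.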

For psychologists, this research raises profound questions about the nature of empathy, its role in moral judgment, and whether artificially created systems can truly embody these traits or merely imitate them. The ability to program empathy into AI could change how we conceptualize machine sentience and emotional intelligence, blurring the lines between biological and artificial cognition. Furthermore, as AI becomes more integrated into social, therapeutic, and even judicial contexts, understanding how machines might "feel" or interpret human suffering becomes increasingly relevant. The study also opens up new interdisciplinary dialogues between neuroscience, ethics, and AI development, emphasizing the importance of considering psychological principles in the design of emotionally responsive technologies. Ultimately, this work signals a shift from purely functional AI toward systems capable of engaging with humans on a deeper, more emotionally resonant level.