Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care


Wednesday, July 21, 2021

The Parliamentary Approach to Moral Uncertainty

Toby Newberry & Toby Ord
Future of Humanity Institute
University of Oxford, 2021

Abstract

We introduce a novel approach to the problem of decision-making under moral uncertainty, based
on an analogy to a parliament. The appropriate choice under moral uncertainty is the one that
would be reached by a parliament comprised of delegates representing the interests of each moral
theory, who number in proportion to your credence in that theory. We present what we see as the
best specific approach of this kind (based on proportional chances voting), and also show how the
parliamentary approach can be used as a general framework for thinking about moral uncertainty,
where extant approaches to addressing moral uncertainty correspond to parliaments with different
rules and procedures.

Here is an excerpt:

Moral Parliament

Imagine that each moral theory in which you have credence got to send delegates to an internal parliament, where the number of delegates representing each theory was proportional to your credence in that theory. Now imagine that these delegates negotiate with each other, advocating on behalf of their respective moral theories, until eventually the parliament reaches a decision by the delegates voting on the available options. This would provide a novel approach to decision-making under moral uncertainty that may avoid some of the problems that beset the others, and it may even provide a new framework for thinking about moral uncertainty more broadly.
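To make the voting stage concrete, here is a minimal, illustrative Python sketch of a moral parliament that uses proportional chances voting. It is not the authors' implementation: it skips the negotiation phase entirely and assumes each delegate simply votes for their theory's top-ranked option; the theory names, option labels, credences, and seat count are invented for the example.

import random

def moral_parliament(credences, rankings, seats=100, rng=None):
    """Toy moral parliament using proportional chances voting.

    credences: dict mapping theory name -> credence (summing to 1).
    rankings: dict mapping theory name -> list of options, best first.
    Each theory gets seats in proportion to its credence; every delegate
    votes for their theory's top option, and the winner is then drawn at
    random with probability proportional to each option's vote share.
    """
    rng = rng or random.Random()
    # Allocate delegates in proportion to credence (rounded to whole seats).
    delegates = {t: round(c * seats) for t, c in credences.items()}
    # Tally votes: each delegate backs their theory's most-preferred option.
    votes = {}
    for theory, n in delegates.items():
        top_option = rankings[theory][0]
        votes[top_option] = votes.get(top_option, 0) + n
    # Proportional chances voting: an option's chance of winning
    # equals its share of the total votes.
    options, counts = zip(*votes.items())
    return rng.choices(options, weights=counts, k=1)[0]

# Illustrative example (credences and options are made up):
credences = {"utilitarianism": 0.6, "deontology": 0.3, "virtue_ethics": 0.1}
rankings = {
    "utilitarianism": ["donate", "volunteer", "abstain"],
    "deontology": ["volunteer", "abstain", "donate"],
    "virtue_ethics": ["volunteer", "donate", "abstain"],
}
print(moral_parliament(credences, rankings))

One feature of proportional chances voting visible even in this toy version: a theory holding a minority of seats still has a proportional chance of carrying the decision, rather than being outvoted every time.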

(cut)

Here, we endorse a common-sense approach to the question of scale which has much in common with standard decision-theoretic conventions. The suggestion is that one should convene Moral Parliament for those decision-situations to which it is intuitively appropriate, such as those involving non-trivial moral stakes, where the possible options are relatively well-defined, and so on. Normatively speaking, if Moral Parliament is the right approach to take to moral uncertainty, then it may also be right to apply it to all decision-situations (however this is defined). But practically speaking, this would be very difficult to achieve. This move has essentially the same implications as the approach of sidestepping the question but comes with a positive endorsement of Moral Parliament’s application to ‘the kinds of decision-situations typically described in papers on moral uncertainty’. This is the sense in which the common-sense approach resembles standard decision-theoretic conventions. 

Friday, June 18, 2021

Wise teamwork: Collective confidence calibration predicts the effectiveness of group discussion

Silver, I., Mellers, B. A., & Tetlock, P. E.
Journal of Experimental Social Psychology
Volume 96, September 2021.

Abstract

‘Crowd wisdom’ refers to the surprising accuracy that can be attained by averaging judgments from independent individuals. However, independence is unusual; people often discuss and collaborate in groups. When does group interaction improve vs. degrade judgment accuracy relative to averaging the group's initial, independent answers? Two large laboratory studies explored the effects of 969 face-to-face discussions on the judgment accuracy of 211 teams facing a range of numeric estimation problems from geographic distances to historical dates to stock prices. Although participants nearly always expected discussions to make their answers more accurate, the actual effects of group interaction on judgment accuracy were decidedly mixed. Importantly, a novel, group-level measure of collective confidence calibration robustly predicted when discussion helped or hurt accuracy relative to the group's initial independent estimates. When groups were collectively calibrated prior to discussion, with more accurate members being more confident in their own judgment and less accurate members less confident, subsequent group interactions were likelier to yield increased accuracy. We argue that collective calibration predicts improvement because groups typically listen to their most confident members. When confidence and knowledge are positively associated across group members, the group's most knowledgeable members are more likely to influence the group's answers.
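The paper's specific group-level calibration measure is not reproduced in this excerpt, but the underlying idea can be sketched: a group is collectively calibrated to the extent that members' pre-discussion confidence tracks their accuracy. The toy Python example below assumes one way of operationalizing this (the correlation between self-rated confidence and negative absolute error), alongside the simple averaging that defines the crowd-wisdom baseline; the quantities and numbers are invented for illustration.

from statistics import mean

def calibration_score(confidences, errors):
    """Group-level calibration proxy: Pearson correlation between each
    member's confidence and their accuracy (negative absolute error).
    Positive values mean more confident members tend to be more accurate."""
    accuracy = [-abs(e) for e in errors]
    mc, ma = mean(confidences), mean(accuracy)
    cov = sum((c - mc) * (a - ma) for c, a in zip(confidences, accuracy))
    sd_c = sum((c - mc) ** 2 for c in confidences) ** 0.5
    sd_a = sum((a - ma) ** 2 for a in accuracy) ** 0.5
    return cov / (sd_c * sd_a) if sd_c and sd_a else 0.0

# Illustrative group estimating a quantity whose true value is 100.
true_value = 100
estimates  = [90, 104, 150, 98]      # independent pre-discussion answers
confidence = [0.8, 0.7, 0.3, 0.9]    # self-rated confidence, 0 to 1

crowd_average = mean(estimates)                    # "crowd wisdom" baseline
errors = [est - true_value for est in estimates]
print(crowd_average, calibration_score(confidence, errors))

In this example the least accurate member is also the least confident, so the calibration score is positive; on the paper's account, such a group would be more likely to improve on its independent average through discussion.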

Conclusion

People often display exaggerated beliefs about their skills and knowledge. We misunderstand and over-estimate our ability to answer general knowledge questions (Arkes, Christensen, Lai, & Blumer, 1987), save for a rainy day (Berman, Tran, Lynch Jr, & Zauberman, 2016), and resist unhealthy foods (Loewenstein, 1996), to name just a few examples. Such failures of calibration can have serious consequences, hindering our ability to set goals (Kahneman & Lovallo, 1993), make plans (Janis, 1982), and enjoy experiences (Mellers & McGraw, 2004). Here, we show that collective calibration also predicts the effectiveness of group discussions. In the context of numeric estimation tasks, poorly calibrated groups were less likely to benefit from working together, and, ultimately, offered less accurate answers. Group interaction is the norm, not the exception. Knowing what we know (and what we don't know) can help predict whether interactions will strengthen or weaken crowd wisdom.