Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label War. Show all posts

Monday, July 25, 2022

Morally Exhausted: Why Russian Soldiers are Refusing to Fight in the Unprovoked War on Ukraine

Timofei Rozhanskiy
Syndicated
Originally posted July 23, 2022

Here is an excerpt:

I Had To Refuse So I Could Stay Alive

Russia’s troops in Ukraine are largely made up of contract soldiers: volunteer personnel who sign fixed-term contracts for service. The range of experience varies. Other units include troops from private military companies like Vagner, or specialized, semiautonomous units overseen by Chechnya’s strongman leader, Ramzan Kadyrov.

The discontent in Kaminsky’s 11th Brigade is not an isolated case, and there are indications that Russian commanders are trying different tactics to keep the problem from spiraling out of control: for example, publicly shaming soldiers who are refusing to fight.

In Buryatia, where the 11th Brigade is based, dozens of personnel have sought legal assistance from local activists, seeking to break their contracts and get out of service in Ukraine, for various reasons.

In the southern Russian town of Budyonnovsk, on the home base for the 205th Cossack Motorized Rifle Brigade, commanders have erected a “wall of shame” with the names, ranks, and photographs of some 300 soldiers who have disobeyed orders in the Ukraine war.

“They forgot their military oaths, the ceremonial promise, their vows of duty to their Fatherland,” the board reads.

In conversations via the Russian social media giant VK, several soldiers from the brigade disputed the circumstances behind their inclusion on the wall of shame. All asked that their names be withheld for fear of further punishment or retaliation by commanders.

“I understand everything, of course. I signed a contract. I’m supposed to be ready for any situation; this war, this special operation,” one soldier wrote. “But I was thinking, I’m still young; at any moment, a piece of shrapnel, a bullet could fly into my head.”

The soldier said he broke his contract and resigned from the brigade before the February 24 invasion, once he realized it was in fact going forward.

“I thought a long time about it and came to the decision. I understood that I had to refuse so I could stay alive,” he said. “I don’t regret it one bit.”

Thursday, March 17, 2022

важное сообщение (An Important Message)


More for Russian friends

больше информации для вас (more information for you)

Sunday, March 6, 2022

Полонені росіяни дали пресконференцію українським (Full Russian Press Conference in Ukraine)

I know there are people in Russia who follow this site.

Я знаю, что в России есть люди, которые следят за этим сайтом.


-------------------------

This is an SOS.  There are 33 different Russian addresses that viewed my site in the past 7 days.  Please distribute safely.

Это SOS. За последние 7 дней мой сайт просматривали 33 разных российских адреса.


Monday, April 19, 2021

The Military Is Funding Ethicists to Keep Its Brain Enhancement Experiments in Check

Sarah Scoles
Medium.com
Originally posted April 1, 2021

Here is an excerpt:

The Department of Defense has already invested in a number of projects to which the Minerva research has relevance. The Army Research Laboratory, for example, has funded researchers who captured and transmitted a participant’s thoughts about a character’s movement in a video game, using magnetic stimulation to beam those neural instructions to another person’s brain and cause movement. And it has supported research using deep learning algorithms and EEG readings to predict a person’s “drowsy and alert” states.

Evans points to one project funded by the Defense Advanced Research Projects Agency (DARPA): Scientists tested a BCI that allowed a woman with quadriplegia to drive a wheelchair with her mind. Then, “they disconnected the BCI from the wheelchair and connected to a flight simulator,” Evans says, and she brainfully flew a digital F-35. “DARPA has expressed pride that their work can benefit civilians,” says Moreno. “That helps with Congress and with the public so it isn’t just about ‘supersoldiers.’”

Still, this was a civilian participant, in a Defense-funded study, with “fairly explicitly military consequences,” says Evans. And the big question is whether the experiment’s purpose justifies the risks. “There’s no obvious therapeutic reason for learning to fly a fighter jet with a BCI,” he says. “Presumably warfighters have a job that involves, among other things, fighter jets, so there might be a strategic reason to do this experiment. Civilians rarely do.”

It’s worth noting that warfighters are, says Moreno, required to take on more risks than the civilians they are protecting, and in experiments, military members may similarly be asked to shoulder more risk than a civilian participant.

DARPA has also worked on implants that monitor mood and boost the brain back to “normal” if something looks off, created prosthetic limbs animated by thought, and made devices that improve memory. While those programs had therapeutic aims, the applications and follow-on capabilities extend into the enhancement realm — altering mood, building super-strong bionic arms, generating above-par memory.

Friday, March 29, 2019

Artificial Morality

Robert Koehler
www.commondreams.org
Originally posted March 14, 2019

Artificial Intelligence is one thing. Artificial morality is another. It may sound something like this:

“First, we believe in the strong defense of the United States and we want the people who defend it to have access to the nation’s best technology, including from Microsoft.”

The words are those of Microsoft president Brad Smith, writing on a corporate blog last fall in defense of the company’s new contract with the U.S. Army, worth $479 million, to make augmented reality headsets for use in combat. The headsets, known as the Integrated Visual Augmentation System, or IVAS, are a way to “increase lethality” when the military engages the enemy, according to a Defense Department official. Microsoft’s involvement in this program set off a wave of outrage among the company’s employees, with more than a hundred of them signing a letter to the company’s top executives demanding that the contract be canceled.

“We are a global coalition of Microsoft workers, and we refuse to create technology for warfare and oppression. We are alarmed that Microsoft is working to provide weapons technology to the U.S. Military, helping one country’s government ‘increase lethality’ using tools we built. We did not sign up to develop weapons, and we demand a say in how our work is used.”

The info is here.

Monday, October 1, 2018

Moral Injury in International Relations

Jelena Subotic & Brent J. Steele
Journal of Global Security Studies
https://doi.org/10.1093/jogss/ogy021
Published: 28 August 2018

Abstract

The war in Iraq unleashed disastrous global instability—from the strengthening of Al-Qaeda, to the creation of ISIS, and civil war in Syria accompanied by a massive exodus of refugees. The war in Afghanistan is continuing in perpetuity, with no clear goals or objectives other than the United States’ commitment to its sunk cost. The so-called war on terror is a vague catch-all phrase for a military campaign against moving targets and goalposts, with no end date and no conceivable way to declare victory. The toll of these wars on civilians in Iraq and Afghanistan and elsewhere in the Middle East, on US troops, and on the US economy is staggering. But these ambiguous campaigns are also fundamentally changing US state identity—its view of itself, its role in the world, and its commitment to a liberal international order. They are producing profound anxiety in the US body politic and anxiety in US relationships with other international actors. To understand the sources and consequences of this anxiety, we adopt an ontological security perspective on state identity. We enrich ontological security scholarship by introducing the concept of moral injury and its three main consequences: loss of control, ethical anxiety, and relational harm. We demonstrate how the concept of moral injury illuminates some of the most central anxieties at the core of US identity, offering a new understanding of our global moment of crisis.

The info is here.

Friday, August 10, 2018

Is compassion fatigue inevitable in an age of 24-hour news?

Elisa Gabbert
The Guardian
Originally posted August 2, 2018

Here is an excerpt:

Not long after compassion fatigue emerged as a concept in healthcare, a similar concept began to appear in media studies – the idea that overexposure to horrific images, from news reports in particular, could cause viewers to shut down emotionally, rejecting information instead of responding to it. In her 1999 book Compassion Fatigue: How the Media Sell Disease, Famine, War and Death, the journalist and scholar Susan Moeller explored this idea at length. “It seems as if the media careen from one trauma to another, in a breathless tour of poverty, disease and death,” she wrote. “The troubles blur. Crises become one crisis.” The volume of bad news drives the public to “collapse into a compassion fatigue stupor”.

Susan Sontag grappled with similar questions in her short book Regarding the Pain of Others, published in 2003. By “regarding” she meant not just “with regard to”, but looking at: “Flooded with images of the sort that once used to shock and arouse indignation, we are losing our capacity to react. Compassion, stretched to its limits, is going numb. So runs the familiar diagnosis.” She implies that the idea was already tired: media overload dulls our sensitivity to suffering. Whose fault is that – ours or the media’s? And what are we supposed to do about it?

By Moeller’s account, compassion fatigue is a vicious cycle. When war and famine are constant, they become boring – we’ve seen it all before. The only way to break through your audience’s boredom is to make each disaster feel worse than the last. When it comes to world news, the events must be “more dramatic and violent” to compete with more local stories, as a 1995 study of international media coverage by the Pew Research Center in Washington found.

The information is here.

Tuesday, August 7, 2018

Google’s AI ethics won't curb war by algorithm

Phoebe Braithwaite
Wired.com
Originally published July 5, 2018

Here is an excerpt:

One of these programmes is Project Maven, which trains artificial intelligence systems to parse footage from surveillance drones in order to “extract objects from massive amounts of moving or still imagery,” writes Drew Cukor, chief of the Algorithmic Warfare Cross-Functional Team. The programme is a key element of the US army’s efforts to select targets. One of the companies working on Maven is Google. Engineers at Google have protested their company’s involvement; their peers at companies like Amazon and Microsoft have made similar complaints, calling on their employers not to support the development of the facial recognition tool Rekognition, for use by the military, police and immigration control. For technology companies, this raises a question: should they play a role in governments’ use of force?

The US government’s policy of using armed drones to hunt its enemies abroad has long been controversial. Gibson argues that the CIA and US military are using drones to strike “far from the hot battlefield, against communities that aren't involved in an armed conflict, based on intelligence that is quite frequently wrong”. Paul Scharre, director of the technology and national security programme at the Center for a New American Security and author of Army of None, says that the use of drones and computing power is making the US military a much more effective and efficient force that kills far fewer civilians than in previous wars. “We actually need tech companies like Google helping the military to do many other things,” he says.

The article is here.

Tuesday, May 15, 2018

Google code of ethics on military contracts could hinder Pentagon work

Brittany De Lea
FoxBusiness.com
Originally published April 13, 2018

Google is among the frontrunners for a lucrative, multibillion dollar contract with the Pentagon, but ethical concerns among some of its employees may pose a problem.

The Defense Department’s pending cloud storage contract, known as Joint Enterprise Defense Infrastructure (JEDI), could span a decade and will likely be its largest yet – valued in the billions of dollars. The department issued draft requests for proposals to host sensitive and classified information and will likely announce the winner later this year.

While Google, Microsoft, Amazon and Oracle are viewed as the major contenders for the job, Google’s employees have voiced concern about creating products for the U.S. government. More than 3,000 of the tech giant’s employees signed a letter, released this month, addressed to company CEO Sundar Pichai, protesting involvement in a Pentagon pilot program called Project Maven.

“We believe that Google should not be in the business of war. Therefore we ask that Project Maven be cancelled, and that Google draft, publicize and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology,” the letter, obtained by The New York Times, read.

The article is here.

Sunday, April 15, 2018

What If There Is No Ethical Way to Act in Syria Now?

Sigal Samuel
The Atlantic
Originally posted April 13, 2018

For seven years now, America has been struggling to understand its moral responsibility in Syria. For every urgent argument to intervene against Syrian President Bashar al-Assad to stop the mass killing of civilians, there were ready responses about the risks of causing more destruction than could be averted, or even escalating to a major war with other powers in Syria. In the end, American intervention there has been tailored mostly to a narrow perception of American interests in stopping the threat of terror. But the fundamental questions are still unresolved: What exactly was the moral course of action in Syria? And more urgently, what—if any—is the moral course of action now?

The war has left roughly half a million people dead—the UN has stopped counting—but the question of moral responsibility has taken on new urgency in the wake of a suspected chemical attack over the weekend. As President Trump threatened to launch retaliatory missile strikes, I spoke about America’s ethical responsibility with some of the world’s leading moral philosophers. These are people whose job it is to ascertain the right thing to do in any given situation. All of them suggested that, years ago, America might have been able to intervene in a moral way to stop the killing in the Syrian civil war. But asked what America should do now, they all gave the same startling response: They don’t know.

The article is here.

Sunday, March 25, 2018

Did Iraq Ever Become A Just War?

Matt Peterson
The Atlantic
Originally posted March 24, 2018

Here is an excerpt:

There’s a broader sense of moral confusion about the conduct of America’s wars. In Iraq, what started as a war of choice came to resemble much more a war of necessity. Can a war that started unjustly ever become righteous? Or does the stain permanently taint anything that comes after it?

The answers to these questions come from the school of philosophy called “just war” theory, which tries to explain whether and when war is permissible, and under what circumstances. It offers two big ways to think about the justice of war. One is whether it’s appropriate to go to war in the first place. Take North Korea, for example. Is there a cause worth killing thousands—millions—of North and South Korean civilians over? Invoking “national security” isn’t enough to make a war just. Kim Jong Un’s nuclear weapons pose an obvious threat to South Korea, Japan, and the United States. But that alone doesn’t make war an acceptable choice, given the lives at stake. The ethics of war require the public to assess how certain it is that innocents will be killed if the military doesn’t act (Will Kim really use his nukes offensively?), whether there’s any way to remove the threat without violence (Has diplomacy been exhausted?), and whether the scale of the deaths that would come from intervention is truly in line with the danger war is meant to avert (If the peninsula has to be burned down to be saved, is it really worth it?)—among other considerations.

The other questions to ask are about the nature of the combat. Are soldiers taking care to target only North Korea’s military? Once the decision has been made that Kim’s nuclear weapons pose an imminent threat, hypothetically, that still wouldn’t make it acceptable to firebomb Pyongyang to turn the population against him. Similarly, American forces could not, say, blow up a bus full of children just because one of Kim’s generals was trying to escape on it.

The article is here.

Thursday, November 9, 2017

Morality and Machines

Robert Fry
Prospect
Originally published October 23, 2017

Here is an excerpt:

It is axiomatic that robots are more mechanically efficient than humans; equally they are not burdened with a sense of self-preservation, nor is their judgment clouded by fear or hysteria. But it is that very human fallibility that requires the intervention of the defining human characteristic—a moral sense that separates right from wrong—and explains why the ethical implications of the autonomous battlefield are so much more contentious than the physical consequences. Indeed, an open letter in 2015 seeking to separate AI from military application included the signatures of such luminaries as Elon Musk, Steve Wozniak, Stephen Hawking and Noam Chomsky. For the first time, therefore, human agency may be necessary on the battlefield not to take the vital tactical decisions but to weigh the vital moral ones.

So, who will accept these new responsibilities and how will they be prepared for the task? The first point to make is that none of this is an immediate prospect and it may be that AI becomes such a ubiquitous and beneficial feature of other fields of human endeavour that we will no longer fear its application in warfare. It may also be that morality will co-evolve with technology. Either way, the traditional military skills of physical stamina and resilience will be of little use when machines will have an infinite capacity for physical endurance. Nor will the quintessential commander’s skill of judging tactical advantage have much value when cognitive computing will instantaneously integrate sensor information. The key human input will be to make the judgments that link moral responsibility to legal consequence.

The article is here.

Tuesday, September 26, 2017

The Influence of War on Moral Judgments about Harm

Hanne M Watkins and Simon M Laham
Preprint

Abstract

How does war influence moral judgments about harm? While the general rule is “thou shalt not kill,” war appears to provide an unfortunately common exception to the moral prohibition on intentional harm. In three studies (N = 263, N = 557, N = 793), we quantify the difference in moral judgments across peace and war contexts, and explore two possible explanations for the difference. Taken together, the findings of the present studies have implications for moral psychology researchers who use war-based scenarios to study broader cognitive or affective processes. If the war context changes judgments of moral scenarios by triggering group-based reasoning or altering the perceived structure of the moral event, using such scenarios to make “decontextualized” claims about moral judgment may not be warranted.

Here is part of the discussion.

A number of researchers have begun to investigate how social contexts may influence moral judgment, whether those social contexts are grounded in groups (Carnes et al., 2015; Ellemers & van den Bos, 2009) or relationships (Fiske & Rai, 2014; Simpson, Laham, & Fiske, 2015). The war context is another specific context which influences moral judgments: in the present study we found that the intergroup nature of war influenced people’s moral judgments about harm in war – even if they belonged to neither of the two groups actually at war – and that the usually robust difference between switch and footbridge scenarios was attenuated in the war context. One implication of these findings is that some caution may be warranted when using war-based scenarios for studying morality in general. As mentioned in the introduction, scenarios set in war are often used in the study of broad domains or general processes of judgment (e.g., Graham et al., 2009; Phillips & Young, 2011; Piazza et al., 2013). Given the interaction of war context with intergroup considerations and with the construed structure of the moral event in the present studies, researchers are well advised to avoid making generalizations to morality writ large on the basis of war-related scenarios (see also Bauman, McGraw, Bartels, & Warren, 2014; Bloom, 2011).

The preprint is here.

Saturday, December 31, 2016

The Wright Show: Against Empathy

Robert Wright interviews Paul Bloom on his book "Against Empathy."
The Wright Show
Originally published December 6, 2016


Sunday, April 10, 2016

The Paradox of Nonlethal Weapons

Fritz Allhoff
Law and Bioethics Blog
Originally published March 10, 2016

Here are two excerpts:

These are all examples of lethal weapons. Importantly, though, there are myriad restrictions on the use of nonlethal weapons as well. And this gives rise to what I’ll call the “paradox of nonlethal weapons.” The paradox is simply that, sometimes, international law allows soldiers to kill, but not to disable. Or, in other words, some nonlethal weapons may be prohibited, while, at the same time, some lethal weaponry is not. As Donald Rumsfeld put it, “in many instances, our forces are allowed to shoot somebody and kill them, but they’re not allowed to use a nonlethal riot control agent.”

(cut)

Regardless of the specific technologies, though, the general question is this: why should there be limits on nonlethal weapons at the same time that lethal weapons are allowed? This leads to the curious—and perhaps perverse—outcome that enemy combatants can be killed, but not even temporarily disabled.

The article is here.

Friday, October 16, 2015

The Dark Side of Empathy

By Paul Bloom
The Atlantic
Originally published on September 25, 2015

Here is an excerpt:

Our reaction to these atrocities can cloud our judgment, biasing us in favor of war. The benefits of war—including avenging those who have suffered—are made vivid, but the costs of war remain abstract and statistical. We see this same bias reflected in our criminal-justice system. The outrage that comes from empathy drives some of our most powerful punitive desires. It’s not an accident that so many statutes are named for dead girls—as in Megan’s Law, Jessica’s Law, and Caylee’s Law—and no surprise that there is now enthusiasm for “Kate’s Law.” The high incarceration rate in the United States, and our continued enthusiasm for the death penalty, is in part the product of fear and anger, but is also driven by the consumption of detailed stories of victims’ suffering.

Then there are victim-impact statements, where detailed descriptions of how victims are affected by a crime are used to help determine the sentence imposed on a criminal. There are arguments in favor of these statements, but given all the evidence that we are more prone to empathize with some individuals over others—with factors like race, sex, and physical attractiveness playing a powerful role—it’s hard to think of a more biased and unfair way to determine punishment.

The entire article is here.

Thursday, July 23, 2015

Healing a Wounded Sense of Morality

Many veterans are suffering from a condition similar to, but distinct from, PTSD: moral injury, in which the ethical transgressions of war can leave service members traumatized.

By Maggie Puniewska
The Atlantic
Originally published July 3, 2015

Here are two excerpts:

Identifying moral injury can be tricky for two reasons: First, it’s easily mistaken for PTSD, which shares many of the same symptoms. And second, because veterans may feel too ashamed to talk about their moral infractions, therapists might not even know to look for the signs of moral injury at all, says Joseph Currier, an assistant professor of psychology at the University of South Alabama. To help therapists better understand how to diagnose the condition, he and several colleagues have developed a 20-item questionnaire that screens patients for moral injury, asking them to rate their agreement with statements like “I did things in war that betrayed my personal values” and “I made mistakes in the war zone that led to injury and death.”

(cut)

But healing isn’t just confined to the individual. Emotions that guide morality, Currier explains, are rooted in social relationships: “The function of guilt is to reconcile a potentially damaged social bond, whereas with shame, the reaction is to withdraw so the social group can preserve its identity,” he says. For many veterans, therefore, recovery from moral injury depends in part on the civilian communities to which they return. “A part of feeling betrayed or distrusted or guilty by the practices of war is feeling alienated. It’s feeling like you can’t share your experiences because people will judge you or won’t understand,” Sherman says.

The entire article is here.

Sunday, July 19, 2015

Healing Moral Wounds of War

Religion and Ethics News Weekly
PBS
Originally posted June 26, 2015

In her book Afterwar: Healing the Moral Wounds of Our Soldiers, Georgetown University philosophy professor Nancy Sherman argues that many of the 2.6 million U.S. service members returning from our wars in Afghanistan and Iraq suffer from complex moral injuries that are more than post-traumatic stress and that have to do with feelings of guilt, anger, and “the shame of falling short of your lofty military ideals.” Citizens have “a sacred obligation,” says Sherman, to morally engage with those who have fought in our name and who feel moral responsibility for traumatic incidents they experienced. Managing editor Kim Lawton interviews Sherman about the moral aftermath of war and visits a former Marine and his wife to talk about the healing that comes through listening, trust, hope, and moral understanding.


Saturday, January 31, 2015

Ethics and the Enhanced Soldier of the Near Future

By Dave Shunk
Military Review
January-February 2015

Here are two excerpts:

The soldier of the future likely will be enhanced through neuroscience, biotechnology, nanotechnology, genetics, and drugs. According to Patrick Lin, writing in The Atlantic about the ethics of enhancing soldiers, “Soldier enhancements, through biological or technological augmentation of human capabilities, reduce warfighter risk by providing tactical advantages over the enemy.” Lin describes efforts to develop a “super-soldier” who can perform more like a machine.

(cut)

New ethical challenges are arising from the technological developments in stem cells, genetics, neurosciences, robotics, and information technology. Lawrence Hinman of the Center for Ethics in Science and Technology, University of San Diego, reports that “these developments have created ethical vacuums, situations in which our technology has outstripped our ethical framework.” This statement, although made in 2008, remains true. In fact, current military references to enhanced soldiers are very limited.

The entire article is here.

Friday, December 5, 2014

Moral Injury Is The 'Signature Wound' Of Today's Veterans

Interview with David Wood
NPR
Originally posted November 11, 2014

Here is an excerpt:

On the best therapy for treating this "bruise on the soul"

The biggest thing that [the veterans] told me was that they're carrying around this horrible idea that they are bad people because they've done something bad and they can't ever tell anybody about it — or they don't dare tell anybody about it — and may not even be able to admit it to themselves.

One of the most healing things they have found is to stand in a group of fellow veterans and say, "This is what happened. This is what I saw. This is what I did," and to have their fellow veterans nod and say, "I hear you. I hear you." And just accept it, without saying, "Well, you couldn't help it," or, "You're really a good person at heart."

But just hearing it and accepting it — and not being blamed or castigated for whatever it was that you're feeling bad about. It's that validating kind of listening that is so important to all the therapies that I've seen.

The entire article is here.