Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Welcome to the nexus of ethics, psychology, morality, technology, health care, and philosophy
Showing posts with label Law Enforcement.

Friday, September 22, 2023

Police are Getting DNA Data from People who Think They Opted Out

Jordan Smith
The Intercept
Originally posted August 18, 2023

Here is an excerpt:

The communications are a disturbing example of how genetic genealogists and their law enforcement partners, in their zeal to close criminal cases, skirt privacy rules put in place by DNA database companies to protect their customers. How common these practices are remains unknown, in part because police and prosecutors have fought to keep details of genetic investigations from being turned over to criminal defendants. As commercial DNA databases grow, and the use of forensic genetic genealogy as a crime-fighting tool expands, experts say the genetic privacy of millions of Americans is in jeopardy.

Moore did not respond to The Intercept’s requests for comment.

To Tiffany Roy, a DNA expert and lawyer, the fact that genetic genealogists have accessed private profiles — while simultaneously preaching about ethics — is troubling. “If we can’t trust these practitioners, we certainly cannot trust law enforcement,” she said. “These investigations have serious consequences; they involve people who have never been suspected of a crime.” At the very least, law enforcement actors should have a warrant to conduct a genetic genealogy search, she said. “Anything less is a serious violation of privacy.”

(cut)

Exploitation of the GEDmatch loophole isn’t the only example of genetic genealogists and their law enforcement partners playing fast and loose with the rules.

Law enforcement officers have used genetic genealogy to solve crimes that aren’t eligible for genetic investigation per company terms of service and Justice Department guidelines, which say the practice should be reserved for violent crimes like rape and murder only when all other “reasonable” avenues of investigation have failed. In May, CNN reported on a U.S. marshal who used genetic genealogy to solve a decades-old prison break in Nebraska. There is no prison break exception to the eligibility rules, Larkin noted in a post on her website. “This case should never have used forensic genetic genealogy in the first place.”

A month later, Larkin wrote about another violation, this time in a California case. The FBI and the Riverside County Regional Cold Case Homicide Team had identified the victim of a 1996 homicide using the MyHeritage database — an explicit violation of the company’s terms of service, which make clear that using the database for law enforcement purposes is “strictly prohibited” absent a court order.

“The case presents an example of ‘noble cause bias,’” Larkin wrote, “in which the investigators seem to feel that their objective is so worthy that they can break the rules in place to protect others.”


My take:

Forensic genetic genealogists have been skirting GEDmatch privacy rules by searching the profiles of users who explicitly opted out of sharing their DNA with law enforcement. In practice, this means police can access the DNA of people who believed they had protected their privacy by opting out of law enforcement searches.

The practice of forensic genetic genealogy has been used to solve a number of cold cases, but it has also raised concerns about privacy and civil liberties. Some people worry that the police could use DNA data to target innocent people or to build a genetic database of the entire population.

GEDmatch has since changed its privacy policy to make it more difficult for police to access DNA data from users who have opted out. However, the damage may already be done: police have used GEDmatch data to solve dozens of cases, and it is unclear how many people have had their DNA accessed without their knowledge or consent.

Tuesday, January 21, 2020

10 Years Ago, DNA Tests Were The Future Of Medicine. Now They’re A Social Network — And A Data Privacy Mess

Peter Aldhous
buzzfeednews.com
Originally posted December 11, 2019

Here is an excerpt:

But DNA testing can reveal uncomfortable truths, too. Families have been torn apart by the discovery that the man they call “Dad” is not the biological father of his children. Home DNA tests can also be used to show that a relative is a rapist or a killer.

That possibility burst into the public consciousness in April 2018, with the arrest of Joseph James DeAngelo, alleged to be the Golden State Killer responsible for at least 13 killings and more than 50 rapes in the 1970s and 1980s. DeAngelo was finally tracked down after DNA left at the scene of a 1980 double murder was matched to people in GEDmatch who were the killer's third or fourth cousins. Through months of painstaking work, investigators working with the genealogist Barbara Rae-Venter built family trees that converged on DeAngelo.

Genealogists had long realized that databases like GEDmatch could be used in this way, but had been wary of working with law enforcement — fearing that DNA test customers would object to the idea of cops searching their DNA profiles and rummaging around in their family trees.

But the Golden State Killer’s crimes were so heinous that the anticipated backlash initially failed to materialize. Indeed, a May 2018 survey of more than 1,500 US adults found that 80% backed police using public genealogy databases to solve violent crimes.

“I was very surprised with the Golden State Killer case how positive the reaction was across the board,” CeCe Moore, a genealogist known for her appearances on TV, told BuzzFeed News a couple of months after DeAngelo’s arrest.

The info is here.

Tuesday, December 24, 2019

DNA genealogical databases are a gold mine for police, but with few rules and little transparency

Paige St. John
The LA Times
Originally posted November 24, 2019

Here is an excerpt:

But law enforcement has plunged into this new world with little to no rules or oversight, with intense secrecy, and by forming unusual alliances with private companies that collect the DNA, often from people interested not in helping close cold cases but in learning their ethnic origins and ancestry.

A Times investigation found:
  • There is no uniform approach for when detectives turn to genealogical databases to solve cases. In some departments, they are to be used only as a last resort. Others are putting them at the center of their investigative process. Some, like Orlando, have no policies at all.
  • When DNA services were used, law enforcement generally declined to provide details to the public, including which companies detectives got the match from. The secrecy made it difficult to understand the extent to which privacy was invaded, how many people came under investigation, and what false leads were generated.
  • California prosecutors collaborated with a Texas genealogy company at the outset of what became a $2-million campaign to spotlight the heinous crimes they can solve with consumer DNA. Their goal is to encourage more people to make their DNA available to police matching.

There are growing concerns that the race to use genealogical databases will have serious consequences, from its inherent erosion of privacy to the implications of broadened police power.

In California, an innocent twin was thrown in jail. In Georgia, a mother was deceived into incriminating her son. In Texas, police met search guidelines by classifying a case as sexual assault but after an arrest only filed charges of burglary. And in the county that started the DNA race with the arrest of the Golden State killer suspect, prosecutors have persuaded a judge to treat unsuspecting genetic contributors as “confidential informants” and seal searches so consumers are not scared away from adding their own DNA to the forensic stockpile.

Friday, December 13, 2019

Conference warned of dangers of facial recognition technology

Colm Keena
The Irish Times
Originally posted November 13, 2019

Here is an excerpt:

Because of new technologies, “we are all monitored and recorded every minute of every day of our lives”, a conference has heard.

The potential of facial recognition technology to be used by oppressive governments and manipulative corporations was such that some observers have called for it to be banned. The suggestion should be taken seriously, Dr Danaher said.

The technology is “like a fingerprint of your face”, is cheap, and “normalises blanket surveillance”. This makes it “perfect” for oppressive governments and for manipulative corporations.

While the EU’s GDPR laws on the use of data applied here, Dr Danaher said Ireland should also introduce domestic law “to save us from the depredations of facial recognition technology”.

As well as facial recognition technology, he also addressed the conference about “deepfake” technology, which allows for the creation of highly convincing fake video content, and algorithms that assess risk, as other technologies that are creating challenges for the law.

In the US, the use of algorithms to predict a person’s likelihood of re-offending has raised significant concerns.

The info is here.

Thursday, May 16, 2019

It’s Our ‘Moral Responsibility’ to Give The FBI Access to Your DNA

Jennings Brown
www.gizmodo.com
Originally published April 3, 2019

A popular DNA-testing company seems to be targeting true crime fans with a new pitch to let them share their genetic information with law enforcement so cops can catch violent criminals.

Two months ago, FamilyTreeDNA raised privacy concerns after BuzzFeed revealed the company had partnered with the FBI and given the agency access to the genealogy database. Law enforcement’s use of DNA databases has been widely known since last April when California officials revealed genealogy website information was instrumental in determining the identity of the Golden State Killer. But in that case, detectives used publicly shared raw genetic data on GEDmatch. The recent news about FamilyTreeDNA marked the first known time a home DNA test company had willingly shared private genetic information with law enforcement.

Several weeks later, FamilyTreeDNA changed their rules to allow customers to block the FBI from accessing their information. “Users now have the ability to opt out of matching with DNA relatives whose accounts are flagged as being created to identify the remains of a deceased individual or a perpetrator of a homicide or sexual assault,” the company said in a statement at the time.

But now the company seems to be embracing this partnership with law enforcement with their new campaign called, “Families Want Answers.”

The info is here.

Saturday, November 24, 2018

Establishing an AI code of ethics will be harder than people think

Karen Hao
www.technologyreview.com
Originally posted October 21, 2018

Over the past six years, the New York City police department has compiled a massive database containing the names and personal details of at least 17,500 individuals it believes to be involved in criminal gangs. The effort has already been criticized by civil rights activists who say it is inaccurate and racially discriminatory.

"Now imagine marrying facial recognition technology to the development of a database that theoretically presumes you’re in a gang," Sherrilyn Ifill, president and director-counsel of the NAACP Legal Defense fund, said at the AI Now Symposium in New York last Tuesday.

Lawyers, activists, and researchers emphasize the need for ethics and accountability in the design and implementation of AI systems. But this often ignores a couple of tricky questions: who gets to define those ethics, and who should enforce them?

Not only is facial recognition imperfect, studies have shown that the leading software is less accurate for dark-skinned individuals and women. By Ifill’s estimation, the police database is between 95 and 99 percent African American, Latino, and Asian American. "We are talking about creating a class of […] people who are branded with a kind of criminal tag," Ifill said.

The info is here.

Wednesday, June 6, 2018

The LAPD’s Terrifying Policing Algorithm: Yes It’s Basically ‘Minority Report’

Dan Robitzski
Futurism.com
Originally posted May 11, 2018

The Los Angeles Police Department was recently forced to release documents about their predictive policing and surveillance algorithms, thanks to a lawsuit from the Stop LAPD Spying Coalition (which turned the documents over to In Justice Today). And what do you think the documents have to say?

If you guessed “evidence that policing algorithms, which require officers to keep a checklist of (and keep an eye on) 12 people deemed most likely to commit a crime, are continuing to propagate a vicious cycle of disproportionately high arrests of black Angelenos, as well as other racial minorities,” you guessed correctly.

Algorithms, no matter how sophisticated, are only as good as the information that’s provided to them. So when you feed an AI data from a city where there’s a problem of demonstrably, mathematically racist over-policing of neighborhoods with concentrations of people of color, and then have it tell you who the police should be monitoring, the result will only be as great as the process. And the process? Not so great!

The article is here.
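
The excerpt above makes a general point about how predictive policing goes wrong: a model trained on arrest records inherits whatever over-policing those records already contain. The minimal sketch below is purely illustrative and is not drawn from the LAPD documents; the neighborhood labels, numbers, and patrol-allocation rule are hypothetical assumptions chosen only to make the feedback loop concrete.

```python
# Hypothetical sketch of a predictive-policing feedback loop.
# Assumption: two neighborhoods with identical true crime rates,
# but neighborhood B starts out patrolled twice as heavily.

true_crime_rate = {"A": 10.0, "B": 10.0}   # offenses per 1,000 residents (equal)
patrol_intensity = {"A": 1.0, "B": 2.0}    # relative police presence (biased start)

# Recorded arrests track police presence as well as actual crime.
arrests = {n: true_crime_rate[n] * patrol_intensity[n] for n in true_crime_rate}

for step in range(3):
    # Allocate the next round of patrols in proportion to past arrest counts.
    total = sum(arrests.values())
    patrol_intensity = {n: 2.0 * arrests[n] / total for n in arrests}
    # New arrests again follow patrol intensity, so the 2:1 disparity
    # between B and A is reproduced every round rather than corrected.
    arrests = {n: true_crime_rate[n] * patrol_intensity[n] for n in arrests}
    print(f"round {step}: " + ", ".join(f"{n}={v:.1f}" for n, v in arrests.items()))
```

Even with identical underlying crime rates, the only signal the model ever sees is an arrest history shaped by where officers were already sent, so it keeps directing twice as much policing at neighborhood B.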

Saturday, March 19, 2016

How America's criminal justice system became the country's mental health system

By German Lopez
Vox.com
Originally published March 1, 2016

Here are two excerpts:

It's a terrifying statistic: Someone with an untreated mental illness is 16 times more likely to be killed by police than other civilians approached or stopped by law enforcement, according to a 2015 report by the Treatment Advocacy Center.

(cut)


If people were getting comprehensive care and support, police most likely would not need to get involved in many of the circumstances that end up in horrible tragedies. But very often in the US, that's not happening.

Before Kevin broke into a neighbor's house and was arrested by police, Pete tried to take steps that would have prevented the whole encounter. Kevin had just suffered a psychotic episode in 2002, and Pete raced him to emergency care in the hope of getting him into some form of long-term care, against Kevin's will if necessary.

But doctors said they couldn't do anything because Kevin, an adult, didn't appear to pose a threat to himself or others in the four hours they sat in an emergency room. So he was let free, and within 48 hours, he went through the episode in which he broke into the neighbor's house.

The article is here.

Friday, September 11, 2015

Moral Panic: Who Benefits From Public Fear?

By Scott Bonn
Psychology Today Blog
Originally published July 20, 2015

Here is an excerpt:

Moral panics arise when distorted mass media campaigns are used to create fear, reinforce stereotypes and exacerbate pre-existing divisions in the world, often based on race, ethnicity and social class.

Additionally, moral panics have three distinguishing characteristics.  First, there is a focused attention on the behavior, whether real or imagined, of certain individuals or groups that are transformed into what Cohen referred to as “folk devils” by the mass media. This is accomplished when the media strip these folk devils of all favorable characteristics and apply exclusively negative ones.

Second, there is a gap between the concern over a condition and the objective threat it poses. Typically, the objective threat is far less than popularly perceived due to how it is presented by authorities.

Third, there is a great deal of fluctuation over time in the level of concern over a condition. The typical pattern begins with the discovery of the threat, followed by a rapid rise and peak in public concern, which then subsides, often abruptly.

Finally, public hysteria over a perceived problem often results in the passing of legislation that is highly punitive, unnecessary, and serves to justify the agendas of those in positions of power and authority.

The entire article is here.

Tuesday, March 10, 2015

Identifying mentally ill 'frequent fliers' first step to reducing police contact

Press Release
Oregon State University
Originally published February 11, 2015

Identifying the population of people with mental illness who have frequent contact with police could help law enforcement officials and community agencies allocate limited resources to those with the highest needs, new research from Oregon State University indicates.

These individuals, often referred to as “frequent fliers” because of their repeated interaction with law enforcement, can consume a large amount of police time and resources, according to researchers in the School of Public Policy in OSU’s College of Liberal Arts.

Identifying and understanding the population can aid policymakers as they work to reduce the frequent and time-consuming interactions, sociologists Scott Akins and Brett Burkhardt said.

“This contact is rarely criminal in nature at the outset,” said Burkhardt, an assistant professor of sociology. “It’s usually a peace officer custody arrest, which is a type of arrest that occurs because a person is believed to be a danger to themselves or others due to a suspected mental illness. But there’s a limited amount of resources, so if we identify people with the highest needs, we can focus resources on those folks.”

Once a local region has identified its population of frequent fliers, community agencies and policymakers can use the information to change or implement policies to assist those with the highest needs, the researchers said.

“It’s a strategic way to create a more cost-effective and humane way to assist the mentally ill,” said Akins, an associate professor of sociology.

The entire press release is here.