Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Research Fraud. Show all posts

Sunday, August 20, 2023

When Scholars Sue Their Accusers. Francesca Gino is the Latest. Such Litigation Rarely Succeeds.

Adam Marcus and Ivan Oransky
The Chronicle of Higher Education
Originally posted 18 AUG 23

Francesca Gino has made headlines twice since June: once when serious allegations of misconduct involving her work became public, and again when she filed a $25-million lawsuit against her accusers, including Harvard University, where she is a professor at the business school.

The suit itself met with a barrage of criticism from those who worried that, as one scientist put it, it would have a “chilling effect on fraud detection.” A smaller number of people supported the move, saying that Harvard and her accusers had abandoned due process and that they believed in Gino’s integrity.

How the case will play out, of course, remains to be seen. But Gino is hardly the first researcher to sue her critics and her employer when faced with misconduct findings. As the founders of Retraction Watch, a website devoted to covering problems in the scientific literature, we’ve reported on many such cases since we launched our blog in 2010. Plaintiffs tend to claim defamation, but sometimes sue over wrongful termination or employment discrimination, and these cases typically end up in federal courts. A look at how some other suits fared might yield recommendations for how to limit the pain they can cause.

The first thing to know about defamation and employment suits is that most plaintiffs, but not all, lose. Mario Saad, a diabetes researcher at Brazil’s Unicamp, found that out when he sued the American Diabetes Association in the very same federal district court in Massachusetts where Gino filed her case.

Saad was trying to prevent Diabetes, the flagship research journal of the American Diabetes Association, from publishing expressions of concern about four of his papers following allegations of image manipulation. He lost that effort in 2015 and has since had 18 papers retracted.

(cut)

Such cases can be extremely expensive — not only for the defense, whether the costs are borne by institutions or insurance companies, but also for the plaintiffs. Ask Carlo Croce and Mark Jacobson.

Croce, a cancer researcher at Ohio State University, has at various points sued The New York Times, a Purdue University biologist named David Sanders, and Ohio State. He has lost all of those cases, including on appeal. The suits against the Times and Sanders claimed that a front-page story in 2017 that quoted Sanders had defamed Croce. His suit against Ohio State alleged that he had been improperly removed as department chair.

Croce racked up some $2 million in legal bills — and was sued for nonpayment. A judge has now ordered Croce’s collection of old masters paintings to be seized and sold for the benefit of his lawyers, and has also garnished Croce’s bank accounts. Another judgment means that his lawyers may now foreclose on his house to recoup their costs. Ohio State has been garnishing his wages since March by about $15,600 each month, or about a quarter of his paycheck. He continues to earn more than $800,000 per year from the university, even after a professorship and the chair were taken away from him.

When two researchers published a critique of the work of Mark Jacobson, an energy researcher at Stanford University, in the Proceedings of the National Academy of Sciences, Jacobson sued them along with the journal’s publisher for $10 million. He dropped the case just months after filing it.

But thanks to a so-called anti-SLAPP statute, “designed to provide for early dismissal of meritless lawsuits filed against people for the exercise of First Amendment rights,” a judge has ordered Jacobson to pay $500,000 in legal fees to the defendants. Jacobson wants Stanford to pay those costs, and California’s labor commissioner said the university had to pay at least some of them because protecting his reputation was part of Jacobson’s job. The fate of those fees, and who will pay them, is up in the air, with Jacobson once again appealing the judgment against him.

Sunday, June 25, 2023

Harvard Business School Professor Francesca Gino Accused of Committing Data Fraud

Rahem D. Hamid
Crimson Staff Writer
Originally published 24 June 23

Here is an excerpt:

But in a post on June 17, Data Colada wrote that they found evidence of additional data fabrication in that study in a separate experiment that Gino was responsible for.

Harvard has also been internally investigating “a series of papers” for more than a year, according to the Chronicle of Higher Education. Data Colada wrote last week that the University’s internal report may be around 1,200 pages.

The professors added that Harvard has requested that three other papers co-authored by Gino — which Data Colada flagged — also be retracted and that the 2012 paper’s retraction be amended to include Gino’s fabrications.

Last week, Bazerman told the Chronicle of Higher Education that he was informed by Harvard that the experiments he co-authored contained additional fraudulent data.

Bazerman called the evidence presented to him by the University “compelling,” but he denied to the Chronicle that he was at all involved with the data manipulation.

According to Data Colada, Gino was “the only author involved in the data collection and analysis” of the experiment in question.

“To the best of our knowledge, none of Gino’s co-authors carried out or assisted with the data collection for the studies in question,” the professors wrote.

In their second post on Tuesday, the investigators wrote that a 2015 study co-authored by Gino also contains manipulations to prove the paper’s hypothesis.

Observations in the paper, the three wrote, “were altered to produce the desired effect.”

“And if these observations were altered, then it is reasonable to suspect that other observations were altered as well,” they added.


Science is a part of a healthy society:
  • Scientific research relies on the integrity of the researchers. When researchers fabricate or falsify data, they undermine the trust that is necessary for scientific progress.
  • Data fraud can have serious consequences. It can lead to the publication of false or misleading findings, which can have a negative impact on public policy, business decisions, and other areas.

Saturday, September 18, 2021

Fraudulent data raise questions about superstar honesty researcher

Cathleen O'Grady
Sciencemag.com
Originally posted 24 Aug 21

Here is an excerpt:

Some time later, a group of anonymous researchers downloaded those data, according to last week’s post on Data Colada. A simple look at the participants’ mileage distribution revealed something very suspicious. Other data sets of people’s driving distances show a bell curve, with some people driving a lot, a few very little, and most somewhere in the middle. In the 2012 study, there was an unusually equal spread: Roughly the same number of people drove every distance between 0 and 50,000 miles. “I was flabbergasted,” says the researcher who made the discovery. (They spoke to Science on condition of anonymity because of fears for their career.)

Worrying that PNAS would not investigate the issue thoroughly, the whistleblower contacted the Data Colada bloggers instead, who conducted a follow-up review that convinced them the field study results were statistically impossible.

For example, a set of odometer readings provided by customers when they first signed up for insurance, apparently real, was duplicated to suggest the study had twice as many participants, with random numbers between one and 1000 added to the original mileages to disguise the deceit. In the spreadsheet, the original figures appeared in the font Calibri, but each had a close twin in another font, Cambria, with the same number of cars listed on the policy, and odometer readings within 1000 miles of the original. In 1 million simulated versions of the experiment, the same kind of similarity appeared not a single time, Simmons, Nelson, and Simonsohn found. “These data are not just excessively similar,” they write. “They are impossibly similar.”
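The "twin row" pattern the bloggers describe is straightforward to screen for. Here is a minimal, hypothetical sketch of that kind of check; the field names and data are invented for illustration and are not the actual insurance file:

```python
import random

# Flag pairs of records with the same number of cars on the policy and
# odometer readings within 1,000 miles of each other -- the "close twin"
# pattern reported in the Data Colada analysis.

def find_suspicious_twins(rows, max_gap=1000):
    """Return index pairs (i, j) that look like disguised duplicates."""
    twins = []
    for i in range(len(rows)):
        for j in range(i + 1, len(rows)):
            a, b = rows[i], rows[j]
            if a["cars"] == b["cars"] and abs(a["odometer"] - b["odometer"]) <= max_gap:
                twins.append((i, j))
    return twins

# Simulate the alleged scheme: copy every record, adding a random 1-1,000 offset.
random.seed(0)
originals = [{"cars": random.randint(1, 4),
              "odometer": random.randint(5000, 50000)} for _ in range(50)]
copies = [{"cars": r["cars"],
           "odometer": r["odometer"] + random.randint(1, 1000)} for r in originals]

pairs = find_suspicious_twins(originals + copies)
print(len(pairs))  # at least 50: every fabricated copy pairs with its original
```

In a genuine data set of this size, such twins would be rare; a twin for nearly every row is the kind of "impossibly similar" signature the bloggers flagged.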

Ariely calls the analysis “damning” and “clear beyond doubt.” He says he has requested a retraction, as have his co-authors, separately. “We are aware of the situation and are in communication with the authors,” PNAS Editorial Ethics Manager Yael Fitzpatrick said in a statement to Science.

Three of the authors say they were only involved in the two lab studies reported in the paper; a fourth, Boston University behavioral economist Nina Mazar, forwarded the Data Colada investigators a 16 February 2011 email from Ariely with an attached Excel file that contains the problems identified in the blog post. Its metadata suggest Ariely had created the file 3 days earlier.

Ariely tells Science he made a mistake in not checking the data he received from the insurance company, and that he no longer has the company’s original file. He says Duke’s integrity office told him the university’s IT department does not have email records from that long ago. His contacts at the insurance company no longer work there, Ariely adds, but he is seeking someone at the company who could find archived emails or files that could clear his name. His publication of the full data set last year showed he was unaware of any problems with it, he says: “I’m not an idiot. This is a very easy fraud to catch.”

Saturday, March 23, 2019

The Fake Sex Doctor Who Conned the Media Into Publicizing His Bizarre Research on Suicide, Butt-Fisting, and Bestiality

Jennings Brown
www.gizmodo.com
Originally published March 1, 2019

Here is an excerpt:

Despite Sendler’s claims that he is a doctor, and despite the stethoscope in his headshot, he is not a licensed doctor of medicine in the U.S. Two employees of the Harvard Medical School registrar confirmed to me that Sendler was never enrolled and never received an MD from the medical school. A Harvard spokesperson told me Sendler never received a PhD or any degree from Harvard University.

“I got into Harvard Medical School for MD, PhD, and Masters degree combined,” Sendler told me. I asked if he was able to get a PhD in sexual behavior from Harvard Medical School (Harvard Medical School does not provide any sexual health focuses) and he said “Yes. Yes,” without hesitation, then doubled down: “I assume that there’s still some kind of sense of wonder on campus [about me]. Because I can see it when I go and visit [Harvard], that people are like, ‘Wow you had the balls, because no one else did that,’” presumably referring to his academic path.

Sendler told me one of his mentors when he was at Harvard Medical School was Yi Zhang, a professor of genetics at the school. Sendler said Zhang didn’t believe in him when he was studying at Harvard. But, Sendler said, he met with Zhang in Boston just a month prior to our interview. And Zhang was now impressed by Sendler’s accomplishments.

Sendler said Zhang told him in January, “Congrats. You did what you felt was right... Turns out, wow, you have way more power in research now than I do. And I’m just very proud of you, because I have people that I really put a lot of effort, after you left, into making them the best and they didn’t turn out that well.”

The info is here.

This is a fairly bizarre story and worth the long read.

Friday, January 8, 2016

Peer-Review Fraud — Hacking the Scientific Publication Process

Charlotte J. Haug
N Engl J Med, December 17, 2015; 373(25)

Here is an excerpt:

How is it possible to fake peer review? Moon, who studies medicinal plants, had set up a simple procedure. He gave journals recommendations for peer reviewers for his manuscripts, providing them with names and email addresses. But these addresses were ones he created, so the requests to review went directly to him or his colleagues. Not surprisingly, the editor would be sent favorable reviews — sometimes within hours after the reviewing requests had been sent out. The fallout from Moon’s confession: 28 articles in various journals published by Informa were retracted, and one editor resigned.

Peter Chen, who was an engineer at Taiwan’s National Pingtung University of Education at the time, developed a more sophisticated scheme: he constructed a “peer review and citation ring” in which he used 130 bogus e-mail addresses and fabricated identities to generate fake reviews. An editor at one of the journals published by Sage Publications became suspicious, sparking a lengthy and comprehensive investigation, which resulted in the retraction of 60 articles in July 2014.

The article is here. 

Tuesday, February 17, 2015

How Diederik Stapel Became A Science Fraud

By Neuroskeptic
Discover Magazine Blog
Originally published January 20, 2015

Two years ago, Dutch science fraudster Diederik Stapel published a book, Ontsporing (“Derailment”), describing how he became one of the world’s leading social psychologists, before falling from grace when it emerged that he’d fabricated the data in dozens of papers.

The entire blog post is here.

Tuesday, November 25, 2014

Fabricating and plagiarising: when researchers lie

By Mark Israel
The Conversation
Originally published November 5, 2014

Here is an excerpt:

Systematic research into the causes of scientific misconduct is scarce. However, occasionally committees of investigation and research organisations have offered some comment. Some see the researcher as a “bad apple”. A researcher’s own ambition, vanity, desire for recognition and fame, and the prospect for personal gain may lead to behaviour that crosses the limits of what is admissible. Others point to the culture that may prevail in certain disciplines or research groups (“bad barrel”).

Again others identify the creation of a research environment overwhelmed by corrupting pressures (“bad barrel maker”). Many academics are under increasing pressure to publish – and to do so in English irrespective of their competence in that language – as their nation or institution seeks to establish or defend its placing in international research rankings.

The entire article is here.

Wednesday, September 24, 2014

Linguistic Traces of a Scientific Fraud: The Case of Diederik Stapel

By David Markowitz and Jeffrey Hancock
Published: August 25, 2014
DOI: 10.1371/journal.pone.0105937

Abstract

When scientists report false data, does their writing style reflect their deception? In this study, we investigated the linguistic patterns of fraudulent (N = 24; 170,008 words) and genuine publications (N = 25; 189,705 words) first-authored by social psychologist Diederik Stapel. The analysis revealed that Stapel's fraudulent papers contained linguistic changes in science-related discourse dimensions, including more terms pertaining to methods, investigation, and certainty than his genuine papers. His writing style also matched patterns in other deceptive language, including fewer adjectives in fraudulent publications relative to genuine publications. Using differences in language dimensions we were able to classify Stapel's publications with above chance accuracy. Beyond these discourse dimensions, Stapel included fewer co-authors when reporting fake data than genuine data, although other evidentiary claims (e.g., number of references and experiments) did not differ across the two article types. This research supports recent findings that language cues vary systematically with deception, and that deception can be revealed in fraudulent scientific discourse.
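As a rough illustration of the word-category comparison the abstract describes, here is a toy sketch. The "certainty" word list and sample sentences are invented stand-ins, not the authors' actual dictionaries or Stapel's text:

```python
# Count how often terms from a discourse category (here, certainty words)
# appear per 1,000 words in a text, then compare two texts. This mirrors
# the shape of the analysis, not its actual lexicons or corpus.

CERTAINTY_TERMS = {"clearly", "always", "certainly", "definitely", "never"}

def rate_per_thousand(text, terms):
    """Occurrences of `terms` per 1,000 words of `text`."""
    words = [w.strip(".,;:").lower() for w in text.split()]
    hits = sum(1 for w in words if w in terms)
    return 1000 * hits / len(words)

genuine = "results suggest a possible effect although further work may be needed"
fraudulent = "results clearly show the effect always appears and never fails"

print(rate_per_thousand(fraudulent, CERTAINTY_TERMS) >
      rate_per_thousand(genuine, CERTAINTY_TERMS))  # True for these toy texts
```

The study's actual classifier combined many such dimensions (methods terms, investigation terms, adjective rates) to separate fraudulent from genuine papers at above-chance accuracy.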

The entire article is here.

Friday, August 22, 2014

Retractions in the scientific literature: is the incidence of research fraud increasing?

By R. Grant Steen
J Med Ethics 2011; 37:249-253 doi:10.1136/jme.2010.040923

Abstract

Scientific papers are retracted for many reasons including fraud (data fabrication or falsification) or error (plagiarism, scientific mistake, ethical problems). Growing attention to fraud in the lay press suggests that the incidence of fraud is increasing.

Introduction

Accusations that research is tainted by bias have become commonplace in the news media. The ClimateGate scandal arose when climate change critics hacked into a research database at the University of East Anglia, evaluated the data without authorisation and went public with accusations that data had been selectively published and perhaps even falsified.1 More recently, a scientist at Harvard has been accused of biasing or falsifying data that show tamarin monkeys can learn algebraic rules.

The entire article is here.

Wednesday, July 23, 2014

Science Journal Pulls 60 Papers in Peer-Review Fraud

By Henry Fountain
The New York Times
Originally published July 10, 2014

A scientific journal has retracted 60 papers linked to a researcher in Taiwan, accusing him of “perverting the peer-review process” by creating fraudulent online accounts to judge the papers favorably and help get them published.

Sage Publications, publisher of The Journal of Vibration and Control, in which the papers appeared over the last four years, said the researcher, Chen-Yuan Chen, had established a “peer-review and citation ring” consisting of fake scientists as well as real ones whose identities he had assumed.

The entire story is here.

Tuesday, July 22, 2014

Crack Down on Scientific Fraudsters

By Adam Marcus and Ivan Oransky
The New York Times
Originally published July 10, 2014

DONG-PYOU HAN needed impressive lab results to help his team at Iowa State University move forward with its work on an AIDS vaccine — and to continue receiving millions of dollars in federal grants. So Dr. Han did what many scientists are probably tempted to do, but don’t: He faked the tests, spiking rabbit blood with human proteins to make it appear that the animals were responding to the vaccine to fight H.I.V.

The reason you’re reading about this story, and not about the glowing success of the therapy, is that Dr. Han was caught.

The entire story is here.

Wednesday, June 25, 2014

Harvard report shines light on ex-researcher’s misconduct

By Carolyn Y. Johnson
The Boston Globe
Originally published May 30, 2014

When former Harvard psychology professor Marc Hauser was found solely responsible in a series of six scientific misconduct cases in 2012, he distanced himself from the problems, portraying them as an unfortunate consequence of his heavy workload. He said he took responsibility, “whether or not I was directly involved.”

(cut)

The 85-page report details instances in which Hauser changed data so that it would show a desired effect. It shows that he more than once rebuffed or downplayed questions and concerns from people in his laboratory about how a result was obtained. The report also describes “a disturbing pattern of misrepresentation of results and shading of truth” and a “reckless disregard for basic scientific standards.”

The entire article is here.

Wednesday, May 21, 2014

Fresh Misconduct Charges Hit Dutch Social Psychology

By Frank van Kolfschooten
Science
Originally published May 6, 2014

Scientists here are still searching their souls about two previous scandals--involving Diederik Stapel of Tilburg University in 2011 and Dirk Smeesters of Erasmus University in Rotterdam a year later.

Now they have learned that a national research integrity panel has found evidence of data manipulation in the work of Jens Förster, a social psychologist at the University of Amsterdam (UvA).

The university has already announced that it will request the retraction of one of Förster's articles.

The case is drawing widespread international attention as well, in part because Förster, who's German and came to Amsterdam in 2007, enjoys a sterling reputation.

"He is among the most creative and influential social psychologists of his generation," says Jeffrey Sherman of the University of California, Davis.

The entire article is here, behind a paywall.

Thursday, May 15, 2014

The Reformation: Can Social Scientists Save Themselves?

By Jerry Adler
Pacific Standard: The Science of Society
Originally posted April 28, 2014

Here are two excerpts from a long, yet exceptional, article on research in the social sciences:

OUTRIGHT FAKERY IS CLEARLY more common in psychology and other sciences than we’d like to believe. But it may not be the biggest threat to their credibility. As the journalist Michael Kinsley once said of wrongdoing in Washington, so too in the lab: “The scandal is what’s legal.” The kind of manipulation that went into the “When I’m Sixty-Four” paper, for instance, is “nearly universally common,” Simonsohn says. It is called “p-hacking,” or, more colorfully, “torturing the data until it confesses.”

P is a central concept in statistics: It’s the mathematical factor that mediates between what happens in the laboratory and what happens in the real world. The most common form of statistical analysis proceeds by a kind of backwards logic: Technically, the researcher is trying to disprove the “null hypothesis,” the assumption that the condition under investigation actually makes no difference.
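The p-hacking described above can be made concrete with a small simulation (a sketch, not from the article): under a true null hypothesis a p-value is uniformly distributed on [0, 1], so a researcher who runs twenty tests on the same null data and reports whichever clears p < .05 will "find" an effect far more often than the nominal 5%:

```python
import random

random.seed(42)

TRIALS = 10_000
ALPHA = 0.05
TESTS = 20   # e.g. 20 subgroup analyses of data with no real effect

false_positive_runs = 0
for _ in range(TRIALS):
    # Under the null, each test's p-value is uniform on [0, 1].
    p_values = [random.random() for _ in range(TESTS)]
    if min(p_values) < ALPHA:   # report whichever test "worked"
        false_positive_runs += 1

rate = false_positive_runs / TRIALS
print(rate)  # close to 1 - 0.95**20, i.e. about 0.64, not the nominal 0.05
```

Torturing the data until it confesses, in other words, yields a "confession" nearly two-thirds of the time even when there is nothing to confess.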

(cut)

WHILE IT IS POSSIBLE to detect suspicious patterns in scientific data from a distance, the surest way to find out whether a study’s findings are sound is to do the study all over again. The idea that experiments should be replicable, producing the same results when run under the same conditions, was identified as a defining feature of science by Roger Bacon back in the 13th century. But the replication of previously published results has rarely been a high priority for scientists, who tend to regard it as grunt work. Journal editors yawn at replications. Honors and advancement in science go to those who publish new, startling results, not to those who confirm—or disconfirm—old ones.

The entire article is here.

Tuesday, October 22, 2013

Who's Afraid of Peer Review?

By John Bohannon
Science 4 October 2013:
Vol. 342 no. 6154 pp. 60-65
DOI: 10.1126/science.342.6154.60

On 4 July, good news arrived in the inbox of Ocorrafoo Cobange, a biologist at the Wassee Institute of Medicine in Asmara. It was the official letter of acceptance for a paper he had submitted 2 months earlier to the Journal of Natural Pharmaceuticals, describing the anticancer properties of a chemical that Cobange had extracted from a lichen.

In fact, it should have been promptly rejected. Any reviewer with more than a high-school knowledge of chemistry and the ability to understand a basic data plot should have spotted the paper's shortcomings immediately. Its experiments are so hopelessly flawed that the results are meaningless.

I know because I wrote the paper.

The entire story is here.

Tuesday, October 15, 2013

Professor faked 61 pieces of research: Volkskrant

Dutch News
Originally posted September 23, 2013

Here is an excerpt:

Mart Bax, who retired in 2002, was involved in fraud for at least 15 years, publishing invented research, recycling his work under other names and lying about awards and other work, the Volkskrant says.

The entire story is here.

Thanks to Gary Schoener for this information.

Wednesday, July 3, 2013

Diederik Stapel, Disgraced Dutch Psychologist, Accepts Punishment

Associated Press
Originally published June 28, 2013

A disgraced Dutch social psychologist who admitted faking or manipulating data in dozens of publications has agreed to do 120 hours of community service work and forfeit welfare benefits equivalent to 18 months' salary in exchange for not being prosecuted for fraud.

The entire story is here.

Friday, May 10, 2013

The Mind of a Con Man

By YUDHIJIT BHATTACHARJEE
The New York Times
Published: April 26, 2013

Here are some excerpts:

Stapel was an academic star in the Netherlands and abroad, the author of several well-regarded studies on human attitudes and behavior. That spring, he published a widely publicized study in Science about an experiment done at the Utrecht train station showing that a trash-filled environment tended to bring out racist tendencies in individuals. And just days earlier, he received more media attention for a study indicating that eating meat made people selfish and less social.

His enemies were targeting him because of changes he initiated as dean, Stapel replied, quoting a Dutch proverb about high trees catching a lot of wind. When Zeelenberg challenged him with specifics — to explain why certain facts and figures he reported in different studies appeared to be identical — Stapel promised to be more careful in the future. As Zeelenberg pressed him, Stapel grew increasingly agitated.

Finally, Zeelenberg said: “I have to ask you if you’re faking data.”

“No, that’s ridiculous,” Stapel replied. “Of course not.”

That weekend, Zeelenberg relayed the allegations to the university rector, a law professor named Philip Eijlander, who often played tennis with Stapel. After a brief meeting on Sunday, Eijlander invited Stapel to come by his house on Tuesday morning. Sitting in Eijlander’s living room, Stapel mounted what Eijlander described to me as a spirited defense, highlighting his work as dean and characterizing his research methods as unusual. The conversation lasted about five hours. Then Eijlander politely escorted Stapel to the door but made it plain that he was not convinced of Stapel’s innocence.

(cut)

And yet as part of a graduate seminar he taught on research ethics, Stapel would ask his students to dig back into their own research and look for things that might have been unethical. “They got back with terrible lapses,” he told me. “No informed consent, no debriefing of subjects, then of course in data analysis, looking only at some data and not all the data.” He didn’t see the same problems in his own work, he said, because there were no real data to contend with.

The entire story is here.

Friday, April 19, 2013

Scientific Articles Accepted (Personal Checks, Too)

By GINA KOLATA
The New York Times
Published: April 7, 2013

The scientists who were recruited to appear at a conference called Entomology-2013 thought they had been selected to make a presentation to the leading professional association of scientists who study insects.

But they found out the hard way that they were wrong. The prestigious, academically sanctioned conference they had in mind has a slightly different name: Entomology 2013 (without the hyphen). The one they had signed up for featured speakers who were recruited by e-mail, not vetted by leading academics. Those who agreed to appear were later charged a hefty fee for the privilege, and pretty much anyone who paid got a spot on the podium that could be used to pad a résumé.

“I think we were duped,” one of the scientists wrote in an e-mail to the Entomological Society.

Those scientists had stumbled into a parallel world of pseudo-academia, complete with prestigiously titled conferences and journals that sponsor them. Many of the journals and meetings have names that are nearly identical to those of established, well-known publications and events.

Steven Goodman, a dean and professor of medicine at Stanford and the editor of the journal Clinical Trials, which has its own imitators, called this phenomenon “the dark side of open access,” the movement to make scholarly publications freely available.

The entire story is here.

Monday, March 11, 2013

It's time for psychologists to put their house in order

BMC Psychology pledges 'to put less emphasis on interest levels' and publish repeat studies and negative results

By Keith Laws
The Guardian, Notes & Theories
Originally published February 27, 2013

In 2005, the epidemiologist John Ioannidis provocatively claimed that "most published research findings are false". In the field of psychology – where negative results rarely see the light of day – we have a related problem: there is the very real possibility that many unpublished, negative findings are true.

Psychologists have an aversion to some essential aspects of science that they perceive to be unexciting or less valuable. Historically, the discipline has done almost nothing to ensure the reliability of findings through the publication of repeat studies and negative ("null") findings.

Psychologists find significant statistical support for their hypotheses more frequently than any other science, and this is not a new phenomenon. More than 30 years ago, it was reported that psychology researchers are eight times as likely to submit manuscripts for publication when the results are positive rather than negative.

Unpublished, "failed" replications and negative findings stay in the file-drawer and therefore remain unknown to future investigators, who may independently replicate the null finding (each also unpublished), until by chance a spuriously significant effect turns up.
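That file-drawer dynamic is easy to simulate (a hedged sketch, not from the article): if labs keep independently testing a true-null effect and only a p < .05 result would ever be published, a spurious "success" is essentially guaranteed, arriving after about 20 attempts on average:

```python
import random

random.seed(7)

ALPHA = 0.05

def replications_until_spurious_hit():
    """Count independent replications of a true-null effect until one
    happens to reach p < .05 -- the only result that gets published."""
    attempts = 1
    # Under the null, each replication's p-value is uniform on [0, 1].
    while random.random() >= ALPHA:
        attempts += 1
    return attempts

runs = [replications_until_spurious_hit() for _ in range(5000)]
avg = sum(runs) / len(runs)
print(15 < avg < 25)  # True: on average about 1 / 0.05 = 20 attempts
```

The 19 or so null results in each run vanish into the file-drawer; the literature records only the one false positive.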

The entire story is here.