Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Code of Conduct. Show all posts

Monday, October 2, 2023

Research: How One Bad Employee Can Corrupt a Whole Team

Stephen Dimmock & William Gerken
Harvard Business Review
Originally posted March 5, 2018

Here is an excerpt:

In our research, we wanted to understand just how contagious bad behavior is. To do so, we examined peer effects in misconduct by financial advisors, focusing on mergers between financial advisory firms that each have multiple branches. In these mergers, financial advisors meet new co-workers from one of the branches of the other firm, exposing them to new ideas and behaviors.

We collected an extensive data set using the detailed regulatory filings available for financial advisors. We defined misconduct as customer complaints for which the financial advisor either paid a settlement of at least $10,000 or lost an arbitration decision. We observed when complaints occurred for each financial advisor, as well as for the advisor’s co-workers.

We found that financial advisors are 37% more likely to commit misconduct if they encounter a new co-worker with a history of misconduct. This result implies that misconduct has a social multiplier of 1.59 — meaning that, on average, each case of misconduct results in an additional 0.59 cases of misconduct through peer effects.

However, observing similar behavior among co-workers does not explain why this similarity occurs. Co-workers could behave similarly because of peer effects – in which workers learn behaviors or social norms from each other — but similar behavior could arise because co-workers face the same incentives or because individuals prone to making similar choices naturally choose to work together.

In our research, we wanted to understand how peer effects contribute to the spread of misconduct. We compared financial advisors across different branches of the same firm, because this allowed us to control for the effect of the incentive structure faced by all advisors in the firm. We also focused on changes in co-workers caused by mergers, because this allowed us to remove the effect of advisors choosing their co-workers. As a result, we were able to isolate peer effects.

Here is my summary: 

The article discusses a study finding that even otherwise honest employees are more likely to commit misconduct if they work alongside a dishonest individual. The study, conducted by Stephen Dimmock and William Gerken, found that financial advisors were 37% more likely to commit misconduct if they encountered a new co-worker with a history of misconduct.

The researchers believe that this is because people are more likely to learn bad behavior than good behavior. When we see someone else getting away with misconduct, it can make us think that it's okay to do the same thing. Additionally, when we're surrounded by people who are behaving badly, it can create a culture of acceptance for misconduct.
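A note on the arithmetic: the excerpt's jump from a 37% peer effect to a social multiplier of 1.59 is consistent with a standard geometric-series calculation, in which each induced case of misconduct can in turn induce further cases. The sketch below shows that reading of the figures; the derivation itself is an assumption, as the excerpt does not spell it out.

```python
# Hypothetical reconstruction of the social-multiplier figure.
# If each case of misconduct raises the chance of a further case by a
# rate r, and induced cases propagate the same way, total cases per
# initial case follow a geometric series: 1 + r + r^2 + ... = 1 / (1 - r)
r = 0.37  # peer-effect rate reported in the study

multiplier = 1 / (1 - r)     # total cases per initial case
additional = multiplier - 1  # extra cases induced through peer effects

print(f"social multiplier: {multiplier:.2f}")
print(f"additional cases per initial case: {additional:.2f}")
```

Rounded to two decimals, this yields the 1.59 multiplier and the 0.59 additional cases quoted in the article.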

Wednesday, October 23, 2019

Supreme Court Ethics Reform

Johanna Kalb and Alicia Bannon
Brennan Center for Justice
Originally published September 24, 2019

Today, the nine justices on the Supreme Court are the only U.S. judges — state or federal — not governed by a code of ethical conduct. But that may be about to change. Justice Elena Kagan recently testified during a congressional budget hearing that Chief Justice John Roberts is exploring whether to develop an ethical code for the Court. This was big news, given that the chief justice has previously rejected the need for a Supreme Court ethics code.

In fact, however, the Supreme Court regularly faces challenging ethical questions, and because of their crucial and prominent role, the justices receive intense public scrutiny for their choices. Over the last two decades, almost all members of the Supreme Court have been criticized for engaging in behaviors that are forbidden to other federal court judges, including participating in partisan convenings or fundraisers, accepting expensive gifts or travel, making partisan comments at public events or in the media, or failing to recuse themselves from cases involving apparent conflicts of interest, either financial or personal. Congress has also taken notice of the problem. The For the People Act, which was passed in March 2019 by the House of Representatives, included the latest of a series of proposals by both Republican and Democratic legislators to clarify the ethical standards that apply to the justices’ behavior.

The info is here.

Thursday, May 2, 2019

A Facebook request: Write a code of tech ethics

Mike Godwin
Originally published April 30, 2019

Facebook is preparing to pay a multi-billion-dollar fine and dealing with ongoing ire from all corners for its user privacy lapses, the viral transmission of lies during elections, and delivery of ads in ways that skew along gender and racial lines. To grapple with these problems (and to get ahead of the bad PR they created), Chief Executive Mark Zuckerberg has proposed that governments get together and set some laws and regulations for Facebook to follow.

But Zuckerberg should be aiming higher. The question isn’t just what rules should a reformed Facebook follow. The bigger question is what all the big tech companies’ relationships with users should look like. The framework needed can’t be created out of whole cloth just by new government regulation; it has to be grounded in professional ethics.

Doctors and lawyers, as they became increasingly professionalized in the 19th century, developed formal ethical codes that became the seeds of modern-day professional practice. Tech-company professionals should follow their example. An industry-wide code of ethics could guide companies through the big questions of privacy and harmful content.

The info is here.

Editor's note: Many social media companies engage in unethical behavior on a regular basis, typically involving lack of consent, weak privacy standards, filter-bubble effects from personalized algorithms, lack of accountability, lack of transparency, harmful content, and third-party use of data.

Tuesday, October 23, 2018

Why you need a code of ethics (and how to build one that sticks)

Josh Fruhlinger
Originally posted September 17, 2018

Here is an excerpt:

Most of us probably think of ourselves as ethical people. But within organizations built to maximize profits, many seemingly inevitably drift towards more dubious behavior, especially when it comes to user personal data. "More companies than not are collecting data just for the sake of collecting data, without having any reason as to why or what to do with it," says Philip Jones, a GDPR regulatory compliance expert at Capgemini. "Although this is an expensive and unethical approach, most businesses don’t think twice about it. I view this approach as one of the highest risks to companies today, because they have no clue where, how long, or how accurate much of their private data is on consumers."

This is the sort of organizational ethical drift that can arise in the absence of clear ethical guidelines—and it's the sort of drift that laws like the GDPR, the EU's stringent new framework for how companies must handle customer data, are meant to counter. And the temptation is certainly there to simply use such regulations as a de facto ethics policy. "The GDPR and laws like it make the process of creating a digital ethics policy much easier than it once was," says Ian McClarty, President and CEO of PhoenixNAP.  "Anything and everything that an organization does with personal data obtained from an individual must come with the explicit consent of that data owner. It’s very hard to subvert digital ethics when one’s ability to use personal data is curtailed in such a draconian fashion."

But companies cannot simply outsource their ethics codes to regulators and think that hewing to the letter of the law will keep their reputations intact. "New possibilities emerge so fast," says Mads Hennelund, a consultant at Nextwork, "that companies will be forced by market competition to apply new technologies before any regulator has been able to grasp them and impose meaningful rules or standards." He also notes that, if different silos within a company are left to their own devices and subject to their own particular forms of regulation and technology adoption, "the organization as a whole becomes ethically fragmented, consisting of multiple ethically autonomous departments."

The info is here.

Monday, October 15, 2018

ICP Ethics Code

Institute of Contemporary Psychoanalysis

Psychoanalysts strive to reduce suffering and promote self-understanding, while respecting human dignity. Above all, we take care to do no harm. Working in the uncertain realm of unconscious emotions and feelings, our exclusive focus must be on safeguarding and benefitting our patients as we try to help them understand their unconscious mental life. Our mandate requires us to err on the side of ethical caution. As clinicians who help people understand the meaning of their dreams and unconscious longings, we are aware of our power and sway. We acknowledge a special obligation to protect people from unintended harm resulting from our own human foibles.

In recognition of our professional mandate and our authority—and the private, subjective and influential nature of our work—we commit to upholding the highest ethical standards. These standards take the guesswork out of how best to create a safe container for psychoanalysis. These ethical principles inspire tolerant and respectful behaviors, which in turn facilitate the health and safety of our candidates, members and, most especially, our patients. Ultimately, ethical behavior protects us from ourselves, while preserving the integrity of our institute and profession.

Professional misconduct is not permitted, including, but not limited to dishonesty, discrimination and boundary violations. Members are asked to keep firmly in mind our core values of personal integrity, tolerance and respect for others. These values are critical to fulfilling our mission as practitioners and educators of psychoanalytic therapy. Prejudice is never tolerated whether on the basis of age, disability, ethnicity, gender, gender identity, race, religion, sexual orientation or social class. Institute decisions (candidate advancement, professional opportunities, etc.) are to be made exclusively on the basis of merit or seniority. Boundary violations, including, but not limited to sexual misconduct, undue influence, exploitation, harassment and the illegal breaking of confidentiality, are not permitted. Members are encouraged to seek consultation readily when grappling with any ethical or clinical concerns. Participatory democracy is a primary value of ICP. All members and candidates have the responsibility for knowing these guidelines, adhering to them and helping other members comply with them.

The ethics code is here.

Big Island considers adding honesty policy to ethics code

Associated Press
Originally posted September 14, 2018

Big Island officials are considering adding language to the county's ethics code requiring officers and employees to provide the public with information that is accurate and factual.

The county council voted last week in support of the measure, requiring county employees to provide honest information to "the best of each officer's or employee's abilities and knowledge," West Hawaii Today reported. It's set to go before council for final approval next week.

The current measure has changed from Puna Councilwoman Eileen O'Hara's original bill that simply stated "officers and employees should be truthful."

She introduced the measure in response to residents' concerns, but amended it to gain the support of her colleagues, she said.

The info is here.

Thursday, October 11, 2018

Does your nonprofit have a code of ethics that works?

Mary Beth West
USA Today Network - Tennessee
Originally posted September 10, 2018

Each year, the Public Relations Society of America recognizes September as ethics month.

Our present #FakeNews / #MeToo era offers a daily diet of news coverage and exposés about ethics shortfalls in business, media and government sectors.

One arena sometimes overlooked is that of nonprofit organizations.

I am currently involved in a national ethics-driven bylaw reform movement for PRSA itself, which is a 501(c)(6) nonprofit with 21,000-plus members globally, in the “business league” category.

While PRSA’s code of ethics has stood for decades as an industry standard for communications ethics – promoting members’ adherence to only truthful and honest practices – PRSA’s code is not enforceable.

Challenges with unenforced ethics codes

Unenforced codes of ethics are commonplace in the nonprofit arena, particularly for volunteer, member-driven organizations.

PRSA converted from its enforced code of ethics to one that is unenforced by design, nearly two decades ago.

The reason: enforcing code compliance and the adjudication processes inherent to it were a pain in the neck (and a pain in the wallet, due to litigation risks).

The info is here.

Wednesday, October 10, 2018

Psychologists Are Standing Up Against Torture at Gitmo

Rebecca Gordon
Originally posted September 11, 2018

Sometimes the good guys do win. That’s what happened on August 8 in San Francisco when the Council of Representatives of the American Psychological Association (APA) decided to extend a policy keeping its members out of the US detention center at Guantánamo Bay, Cuba.

The APA’s decision is important—and not just symbolically. Today we have a president who has promised to bring back torture and “load up” Guantánamo “with some bad dudes.” When healing professionals refuse to work there, they are standing up for human rights and against torture.

It wasn’t always so. In the early days of Guantánamo, military psychologists contributed to detainee interrogations there. It was for Guantánamo that Defense Secretary Donald Rumsfeld approved multiple torture methods, including among others excruciating stress positions, prolonged isolation, sensory deprivation, and enforced nudity. Military psychologists advised on which techniques would take advantage of the weaknesses of individual detainees. And it was two psychologists, one an APA member, who designed the CIA’s whole “enhanced interrogation program.”

The info is here.

Saturday, May 5, 2018

Deep learning: Why it’s time for AI to get philosophical

Catherine Stinson
The Globe and Mail
Originally published March 23, 2018

Here is an excerpt:

Another kind of effort at fixing AI’s ethics problem is the proliferation of crowdsourced ethics projects, which have the commendable goal of a more democratic approach to science. One example is DJ Patil’s Code of Ethics for Data Science, which invites the data-science community to contribute ideas but doesn’t build up from the decades of work already done by philosophers, historians and sociologists of science. Then there’s MIT’s Moral Machine project, which asks the public to vote on questions such as whether a self-driving car with brake failure ought to run over five homeless people rather than one female doctor. Philosophers call these “trolley problems” and have published thousands of books and papers on the topic over the past half-century. Comparing the views of professional philosophers with those of the general public can be eye-opening, as experimental philosophy has repeatedly shown, but simply ignoring the experts and taking a vote instead is irresponsible.

The point of making AI more ethical is so it won’t reproduce the prejudices of random jerks on the internet. Community participation throughout the design process of new AI tools is a good idea, but let’s not do it by having trolls decide ethical questions. Instead, representatives from the populations affected by technological change should be consulted about what outcomes they value most, what needs the technology should address and whether proposed designs would be usable given the resources available. Input from residents of heavily policed neighbourhoods would have revealed that a predictive policing system trained on historical data would exacerbate racial profiling. Having a person of colour on the design team for that soap dispenser should have made it obvious that a peachy skin tone detector wouldn’t work for everyone. Anyone who has had a stalker is sure to notice the potential abuses of selfie drones. Diversifying the pool of talent in AI is part of the solution, but AI also needs outside help from experts in other fields, more public consultation and stronger government oversight.

The information is here.

Thursday, April 26, 2018

Practical Tips for Ethical Data Sharing

Michelle N. Meyer
Advances in Methods and Practices in Psychological Science
Volume 1, Issue 1, pp. 131–144


This Tutorial provides practical dos and don’ts for sharing research data in ways that are effective, ethical, and compliant with the federal Common Rule. I first consider best practices for prospectively incorporating data-sharing plans into research, discussing what to say—and what not to say—in consent forms and institutional review board applications, tools for data de-identification and how to think about the risks of re-identification, and what to consider when selecting a data repository. Turning to data that have already been collected, I discuss the ethical and regulatory issues raised by sharing data when the consent form either was silent about data sharing or explicitly promised participants that the data would not be shared. Finally, I discuss ethical issues in sharing “public” data.

The article is here.

Saturday, March 10, 2018

Universities Rush to Roll Out Computer Science Ethics Courses

Natasha Singer
The New York Times
Originally posted February 12, 2018

Here is an excerpt:

“Technology is not neutral,” said Professor Sahami, who formerly worked at Google as a senior research scientist. “The choices that get made in building technology then have social ramifications.”

The courses are emerging at a moment when big tech companies have been struggling to handle the side effects — fake news on Facebook, fake followers on Twitter, lewd children’s videos on YouTube — of the industry’s build-it-first mind-set. They amount to an open challenge to a common Silicon Valley attitude that has generally dismissed ethics as a hindrance.

“We need to at least teach people that there’s a dark side to the idea that you should move fast and break things,” said Laura Norén, a postdoctoral fellow at the Center for Data Science at New York University who began teaching a new data science ethics course this semester. “You can patch the software, but you can’t patch a person if you, you know, damage someone’s reputation.”

Computer science programs are required to make sure students have an understanding of ethical issues related to computing in order to be accredited by ABET, a global accreditation group for university science and engineering programs. Some computer science departments have folded the topic into a broader class, and others have stand-alone courses.

But until recently, ethics did not seem relevant to many students.

The article is here.

Thursday, August 13, 2015

Meeting the Challenge of Change

By Ken Pope
Excerpted from Ethics in Psychotherapy and Counseling: A Practical Guide, 5th Ed. 
Forthcoming January 2016.

Here is an excerpt:

When complicity with torture, violations of human rights, misleading the public, and other vital matters are at stake, organizations must address not only personnel, policies, and procedures but also the powerful incentives from inside and outside the organization, sources of institutional resistance to change, conflicting ethical and political values within the organization, and issues of institutional character and culture that allowed the problems to flourish for years, protected by APA's denials.

Organizations facing ethical scandals often publicly commit to admirable values such as accountability, transparency, openness to criticism, strict enforcement of ethical standards, and so on. These institutional commitments so often meet the same fate as our own individual promises to a program of personal change. We make a firm New Year's resolution to lead a healthier life. We pour time, energy, and sometimes money into making sure the change happens. We buy jogging shoes and a cookbook of healthy meals. We take out a gym membership. We discuss endlessly what approaches yield the best results. We commit to eating only healthy foods and to getting up five days a week at 5 a.m. for an hour of stretching, aerobics, and resistance exercises. But one, two, and three months later, the commitment to change that had taken such fierce hold of us and promised such wanted, needed, and carefully planned improvement has loosened or lost its grip.

The entire article is here.

Wednesday, August 12, 2015

Thoughts on Psychologists, Ethics, and the Use of Torture in Interrogations

Zimbardo, P. G. (2007). Thoughts on Psychologists, Ethics, and the Use of Torture in Interrogations: Don't Ignore Varying Roles and Complexities.
Analyses of Social Issues and Public Policy (ASAP), Vol. 7, pp. 65–73.

Here is an excerpt:

Such considerations lead me to conclude that PENS has utilized the wrong model for its ethical deliberations about psychologists as consultants to military interrogations. The model featured in this task force report is that of a psychologist working for the military as an independent contractor, making rational moral decisions within a transparent setting, with full power to confront, challenge and expose unethical practices. It is left up to that individual to be alert, informed, perceptive, wise, and ready to act on principle when ethical dilemmas arise.

Instead, I will argue that those psychologists are "hired hands" working at the discretion of their military or government agency clients for as long as they provide valued service, which in the current war on terrorism is to assist by providing whatever information and advice is requested to gain "actionable intelligence" from those interrogated. PENS notes that psychologists often are part of a group of professionals, rarely acting alone. They can become part of an operational team, experiencing normative pressures to conform to the emerging standards of that group. They cannot make readily informed ethical decisions because they do not have full knowledge of how their personal contributions are being used in secret or classified missions. Their judgments and decisions may be made under conditions of uncertainty, and may include high stress. Moreover, definitions of basic terms are not constant, but shifting, so it becomes difficult or impossible to make a fully informed ethical judgment about any specific aspect of one's functions.

In addition, PENS does not recognize the reality that in field settings, the work of Ph.D./Psy.D. psychologists is often substituted by, or made operational by, numerous paraprofessionals, such as mental health counselors, personnel officers, psychological assistants and interns, and others trained in psychology. If they do not belong to professional associations, such as APA, they are relieved of the professional consequences of engaging in unethical actions. Thus, our concerns must extend to these psychologist paraprofessionals as well as those professionals within APA.

The entire article is here.

Sunday, August 9, 2015

What, exactly, does yesterday’s APA resolution prohibit?

By Marty Lederman
Just Security
Originally posted August 8, 2015

By an overwhelming vote of 156-1 (with seven abstentions and one recusal) – so lopsided that it stunned even its proponents – the American Psychological Association’s Council of Representatives yesterday approved a resolution that the APA describes as “prohibit[ing] psychologists from participating in national security interrogations.”

What does Approved Resolution No. 23B do, exactly?  As I read it, it does three principal things, in ascending order of importance:

1.  It reaffirms an existing APA ethical prohibition that psychologists “may not engage directly or indirectly in any act of torture or cruel, inhuman, or degrading treatment or punishment,” a prohibition that “applies to all persons (including foreign detainees) wherever they may be held”; and it “clarifies” that “cruel, inhuman, or degrading treatment or punishment” (CIDTP) should be understood not (or not only) as that term is defined in the U.S. Senate’s understandings of, and reservations to, the Convention Against Torture, but instead in accord with the broadest understanding of CIDTP adopted by any international legal body at the relevant time:  the definition “continues to evolve with international legal understandings of this term.”


3.  Finally, and most significantly, the Resolution establishes a new prohibition that “psychologists shall not conduct, supervise, be in the presence of, or otherwise assist any national security interrogations for any military or intelligence entities, including private contractors working on their behalf, nor advise on conditions of confinement insofar as these might facilitate such an interrogation.”

The entire article is here.

Friday, July 24, 2015

The ethics of multiple relationships: a clinical perspective

By Stephen Behnke
The Monitor on Psychology
July/August 2015, Vol 46, No. 7
Print version: page 84

APA members contact the Ethics Office on a daily basis to discuss the ethical aspects of their work. Receiving these calls is both interesting and gratifying, and educates the office about how psychologists across the country frame the ethical questions they encounter. One of the most frequent topics is multiple relationships. During the Ethics Code revision process that ended in 2002, the Ethics Code revision task force made clear that not all multiple relationships are unethical. The task force wrote a test for determining when a psychologist should refrain from entering a multiple relationship:

A psychologist refrains from entering into a multiple relationship if the multiple relationship could reasonably be expected to impair the psychologist's objectivity, competence or effectiveness in performing his or her functions as a psychologist, or otherwise risks exploitation or harm to the person with whom the professional relationship exists.

The language of ethical standard 3.05 requires the psychologist to determine when a particular relationship would impair the psychologist's objectivity, competence, or effectiveness in doing the work of a psychologist, or would otherwise risk exploitation or harm. The standard thus illustrates clinically driven ethics.

The entire article is here.

Friday, February 6, 2015

Insights for Writing a Code of Ethics or Conduct

Risk management, strategy, and analysis from Deloitte
via The Wall Street Journal

The heart of an organization is often expressed in its code of ethics or code of conduct. It tells the world what really matters to an organization and what it is all about. Companies that follow both the letter and the spirit of the law by taking a “value-based” approach to ethics and compliance may have a distinct advantage in the marketplace. Give the average employee a legalistic “thou shall not….” code, and a negative response is almost guaranteed. Give employees a document that states clearly and concisely the organization’s expectations, outlines acceptable behaviors and presents viable options for asking questions and voicing concerns, and the likelihood is much greater that they will meet those expectations and exhibit the desired behaviors. Make the contents of the code equally applicable to, and understood by, everyone in the organization—at all levels, across all business units and spanning the geographies—and you have a key ingredient for a code that becomes ingrained in the corporate culture, with all of the benefits.

The entire article is here.

Monday, January 12, 2015

Why there would have been no torture without the psychologists

By Steven Reisner
Originally published December 12, 2014

Here is an excerpt:

The psychologists were vital to the torture program for one additional reason: The Justice Department’s Office of Legal Counsel had determined that the presence of psychologists and physicians, monitoring the state and condition of the prisoner being tortured, afforded protection for the CIA leadership and the Bush administration from liability and potential prosecution for the torture. Later, the OLC applied the same rules to the Defense Department’s “enhanced interrogation program,” which, according to an investigation by the Senate Armed Services Committee, was created and overseen by a team led by a clinical psychologist, and eventually overseen exclusively by clinical psychologists.

The entire article is here.

Thursday, June 21, 2012

Editors With Ethics

By Scott Jaschik
Inside Higher Ed
Originally published June 12, 2012

Many of the public debates over ethics in scholarly journals focus on such questions as conflict of interest by biomedical researchers. And various federal regulations (and journal codes of conduct) attempt to prevent conflicts.

Now some journal editors -- primarily in the social sciences but extending to other fields -- are trying to use a new code of conduct to address ethical issues that arise in fields beyond the biological sciences (though there, too), but that also have the potential to tarnish the image of the research enterprise. In the past few months, 88 journal editors have signed on to the principles outlined by 5 other journal editors, and 71 associate editors have signed on.

The entire article is here.

Saturday, December 31, 2011

Anti-Gay Student's Suit Rejected

By Scott Jaschik
Inside Higher Ed

A federal appeals court has upheld the right of Augusta State University to enforce standards of its counseling graduate program -- even when a religious student objects to requirements to treat gay people in a nondiscriminatory manner.

While the ruling may be appealed, it represents a strong victory for advocates of counseling standards that require that students be trained to treat a range of clients in supportive, nonjudgmental ways. The student who sued Augusta State, and already lost in a lower court, maintained that her First Amendment rights were violated when the university required her to complete a "remediation plan" over her willingness to treat gay people.

She had stated her intent to recommend "conversion therapy" to gay clients and to tell them that they could choose to be straight. (A wide consensus among psychology and sexuality experts holds that people don't select their sexual orientation and that encouraging people to change their orientation can be seriously harmful to them.)

The student, Jennifer Keeton, argues that her religiously motivated beliefs are being challenged by Augusta State's policies -- and that a public university may not do so. Keeton was expelled when she declined to participate in the remediation plan, and she asked a federal district court and the appeals court to order her reinstatement in the program.

A three-judge panel of the U.S. Court of Appeals for the 11th Circuit found that Augusta State had legitimate, nondiscriminatory reasons to enforce its rules. The counseling program's accreditation depended in part on adhering to a code of conduct, and faculty members believed it was their responsibility to train students to work with a wide range of clients, the court found. The decision placed the counseling department's actions at Augusta State in the broader context of faculty members training professionals who must pay attention to the ethics of various fields.

"Just as a medical school would be permitted to bar a student who refused to administer blood transfusions for religious reasons from participating in clinical rotations, so ASU may prohibit Keeton from participating in its clinical practicum if she refuses to administer the treatment it has deemed appropriate," says the decision.

"Every profession has its own ethical codes and dictates. When someone voluntarily chooses to enter a profession, he or she must comply with its rules and ethical requirements. Lawyers must present legal arguments on behalf of their clients, notwithstanding their personal views.... So too, counselors must refrain from imposing their moral and religious views on their clients."

Read more here.

Thanks to Ken Pope for this article.