The Deception, Incentives and Behavior Conference at the Rady School of Management, University of California, San Diego, was attended by over 100 people, including me, indicating that there is real interest in studying deception and unethical behavior in the real world.
Several presentations had implications for health care, summarized below by theme (titles of presentations follow entries in parentheses).
Several presentations discussed how context affects people's inclination to be honest or dishonest.
Alexander Cappelen, Norwegian School of Economics, showed that evoking a market context, that is, asking people to think about buying or selling something, increased dishonesty, while a personal or intuitive context decreased it (When do people tell white lies?).
John List, University of Chicago, USA, gave the first plenary talk, suggesting that simply reminding charity fundraisers about the mission of their organization decreased their likelihood of stealing contributions.
Shahar Ayal, Interdisciplinary Center Herzliya, Israel, suggested that when people can present themselves as ethical, sometimes to an extreme or even harsh degree, in one context, they may then act unethically in another context. (Moral cleansing strategies for ethical dissonance.)
Erin Krupka, University of Michigan, suggested that the availability of voluntary regulation may increase cheating: those who do not cheat may volunteer for regulation, while those who do cheat may point to the availability of regulation. (License to cheat: voluntary regulation and ethical behavior.)
Chen-Bo Zhong, University of Toronto, Canada, suggested that being able to delegate increases the likelihood of deception (Deception: an unintended consequence of delegation.)
Keith Murnighan, Kellogg School of Management, suggested that priming people by asking them to perform mathematical calculations increases lying, while priming them by reminders about social awareness or family values does not. (The social and ethical consequences of a calculative mindset.)
Several presentations suggested various ways in which incentives may affect honesty.
David Cooper, Florida State University, USA, suggested that the need for long-term credibility may decrease lying. (Managing credibility: an experiment in leadership and coordination.)
Jan Potters, Tilburg University, Netherlands, suggested that increasing the size of incentives given to advisors increases the likelihood that they will give dishonest advice. (Disclosing advisors' interests neither helps nor hurts.)
Anastasia Danilov, University of Cologne, Germany, suggested that among people who receive team-based incentives, those who identify most strongly with the team are more likely to lie. (The dark side of team identity: experimental evidence from financial service professionals.)
Bill Neilson, University of Tennessee, USA, suggested that people will respond to short-term incentives even at the cost of long-term outcomes. (Temptation, learning and backward induction in sequential decision tasks.)
Michael Vlassopoulos, University of Southampton, UK, suggested that people who get random bonuses, that is, whose incentives are not clearly tied to particular outcomes, are more likely to cheat their employers. (The impact of bonuses on effort provision and cheating.)
Lisa Ordonez, University of Arizona, USA, suggested that basing incentives on achievement of short-term goals (pay for performance) increases lying about performance. (The impact of goals on ethical behavior.)
Conflicts of Interest
Two presentations discussed conflicts of interest and their effects.
Jan Potters (see above) also found that disclosure of payments to advisors did not lead them to give more honest advice.
Miriam Mezger, University of Cologne, Germany, presented a poster suggesting that when people must make decisions based on a mixture of advice from unbiased laypeople and from conflicted individuals who do not disclose their conflicts, the addition of advice from unbiased experts leads to better decisions. (Can experts reduce the impact of deceptive ratings on the internet? - an experiment on product choice with recommendations by managers and experts.)
The enthusiasm and level of intellectual engagement at the meeting suggested the importance of this field. However, it was disappointing, though not surprising, that the attendance at this meeting did not suggest the topic is yet of concern in health care. Although the opening remarks by the Dean of the Rady School included a call for participation by people from health care and other fields, such as environmental science, I may have been the only person attending with a health care affiliation.
Health care could benefit from increased attention to deception, dishonesty, and unethical conduct in the field, how it happens, and what can be done to prevent it. However, as we have often suggested, discussion of such topics is often anechoic because of its capacity to offend or threaten those who most benefit from the status quo in health care.
It was striking that presentations suggested that many practices in our current US commercialized, business-focused health care system may be leading to increased dishonesty and other unethical behavior. These include thinking about health care in business and quantitative contexts rather than in the context of the health care mission to help patients and the public; focus and incentives based on short-term goals, often revenue; and acceptance of conflicts of interest as inevitable and necessary to increase innovation.
Of course, these particular ideas may not make many of our current health care leaders happy.
They do suggest that we could improve honesty and ethics in health care by reminding people with decision-making authority of the centrality of the health care mission and the need to improve patients' and the public's health, and by de-emphasizing focus and incentives based on short-term goals, particularly financial ones. We also ought to try to minimize, not just tolerate and "manage," conflicts of interest.