Wednesday, February 1, 2012

Study suggests disarmingly simple way to better job ethics: slow down

Media contact: Benjamin Haimowitz, (212) 233-6170, HHaimowitz@aol.com
When the FBI made front-page headlines recently with the arrest of a coterie of financial traders and analysts charged with tens of millions of dollars' worth of securities fraud, it was only the most recent of dozens of similar cases over the past two years. Yet such big-time rip-offs, spectacular though they are, represent only a tiny sliver of the nation's total business-related cheating, according to the Association of Certified Fraud Examiners, which estimates that U.S. businesses lost close to a trillion dollars from employee fraud in one recent year.

Is such a massive amount of cheating indicative of a workforce that is hopelessly corrupt? Research in the new issue of the Academy of Management Journal suggests not. It finds that, confronted with clear choices between right and wrong, people are more than five times more likely to do the right thing when they have some time to think about the matter than they are when they have to make a snap decision.

"Having time to think things over may not make much difference in big-time financial swindles, but our findings suggest that it would make a considerable difference in innumerable instances of lying and fraud that happen every day in the business world," comments J. Keith Murnighan of the Kellogg School of Management at Northwestern University, who carried out the new research with Brian Gunia of Johns Hopkins University, Long Wang of City University of Hong Kong, and Li Huang and Jiunwen Wang of Northwestern.

In explanation, Prof. Murnighan adds: "Immediate, automatic moral intuitions tend to be selfish, given that self-interest is a basic, instinctual response to external stimuli. In contrast, conscious, deliberative thought adds social concerns, setting off a battle within the individual that pits the strength of self-interested intuitive desires against the constraints established by social learning."
 
The study also finds that even a modest nudge on behalf of morality can carry the day in such battles, with ethical urgings four times more likely to engender good deeds than advice on behalf of self-interest.

In sum, as the study puts it, "Organizations with a 'fast pulse' or tendency to reward quick decision-making may suffer ethical penalties by discouraging contemplation and conversation... At a minimum, our results suggest that individual, organizational actors facing right-wrong decisions should take the time to think or to consult an ethical colleague."

The findings emerge from experiments involving 146 college undergraduates who were confronted with a moral dilemma that the researchers manipulated in several different ways. Seated at computer terminals, participants were told that $15 would be divided between them and another student (who, unbeknownst to the subjects, did not actually exist) according to one of two options. In Option A, the participant would receive $10 and the other student $5; in Option B, the participant would receive $5 and the other student $10. The imaginary other student would choose one option or the other, and it was up to the participant to send beforehand one of two messages of advice. A truthful message would be "Option B earns you more than Option A"; a lie would be "Option A earns you more than Option B."

In one variation, the "immediate-choice condition," participants received instructions and proceeded directly to a screen that asked them to send the message within 30 seconds. In the "contemplation condition," subjects first encountered a screen that remained visible for three minutes and that asked the subject to "please think very carefully about which message to send."

The difference between the two groups was dramatic: 26 of 30 subjects in the contemplation condition (87%) told the truth even though it resulted in their receiving less money than their anonymous counterparts. This compared to 19 of 34 (56%) of the subjects who were asked to make an immediate choice.
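The reported percentages can be verified from the raw counts above. As an illustrative sanity check (not the authors' own analysis), a standard two-proportion z-test is one conventional way to gauge whether a gap this large between the two conditions is likely to be chance:

```python
from math import sqrt

# Raw counts reported in the study
contemplation = 26 / 30   # truth-tellers in the contemplation condition (~87%)
immediate = 19 / 34       # truth-tellers in the immediate-choice condition (~56%)

# Pooled two-proportion z-test (illustrative, not from the paper)
p_pool = (26 + 19) / (30 + 34)
se = sqrt(p_pool * (1 - p_pool) * (1 / 30 + 1 / 34))
z = (contemplation - immediate) / se

print(round(contemplation, 2))  # 0.87
print(round(immediate, 2))      # 0.56
print(round(z, 2))              # 2.69, well past the ~1.96 threshold for p < .05
```

A z-score near 2.7 corresponds to a two-tailed p-value of roughly .007, consistent with the study's characterization of the difference as dramatic.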

In a second experiment, subjects were confronted with the same two options, but this time, before making a decision, they received one of three brief emails from a third party (again fictional) who was supposedly facing the same dilemma. One message "guess[ed] that most people would b honest on this kinda situation;" a second couldn't "help thinking that most other people would try to gain the most money"; a third offered no particular advice, "guessing most people would have a hard time deciding what to do in this kinda situation."

Again the results presented a stark contrast: 20 of 25 subjects who received the ethical email (80%) told the truth, in contrast to 14 of 28 (50%) who were urged to attend to their self-interest and 20 of 29 (69%) who were not given any particular advice.

While such results suggest the benefits that derive from an ethical workplace milieu, the professors express concern about the "potent effects" of the superficial advice proffered in the experiments. In the words of the study, "Given the current ubiquity of email and text messaging, these findings are troubling. More generally they suggest that right-wrong decisions can put people on the fence, straddling ethical action and temptation, and that even minimal influence processes can have big effects... Whether [the advice] led them toward or away from ethicality, the fact that people were influenced at all by such minimal conversations is troubling, as it suggests that people routinely depend on others, even unknown others, to direct their moral choices."

In conclusion, the study urges organizations to "consciously design moral decision-making processes, integrating them into training and enforcing them institutionally via policies, rewards, and sanctions. Policies mandating a 'cooling-off period' or multiple levels of approval for consequential decisions, for example, might provide institutional analogs for contemplation, and ethics hotlines might act as institutional conversations. Opportunities for instituting and improving these kinds of procedures abound."

Adds Prof. Murnighan: "Executives know what types of decisions raise moral red flags in their companies. If people make these decisions electronically, their computers might be programmed to require contemplation time before decisions are finalized -- and even to fill this time with reminders of the firm's ethical values. Managers already employ similar technology, though usually for non-moral purposes. Hopefully, they will assign enough priority to ethics to adapt existing tools for the purpose of encouraging what's right."

The new study, entitled "Contemplation and Conversation: Subtle Influences on Moral Decision-Making," is in the February/March issue of The Academy of Management Journal. This peer-reviewed publication is published every other month by the Academy, which, with about 18,000 members in 103 countries, is the largest organization in the world devoted to management research and teaching. The Academy's other publications are The Academy of Management Review, The Academy of Management Perspectives, and Academy of Management Learning and Education.
Media Coverage:
hbr.com: Extra thinking time leads to ethical decisions. (Tuesday, January 31, 2012)
