Conduct & Ethics
Friday, August 18, 2023
By Aaron Brown
Risk managers are professionally inclined toward full honesty. After all, deception – whether intended for good or ill – is at the heart of many disasters. Telling people what they want to hear, or what gets them to do what you want, can backfire in a crisis because people may be working from different pictures of the world, and thereby act at cross-purposes.
Clearly, false or misleading information leads to suboptimal risk levels. But to what extent, if any, should risk managers restrain their inclinations to full honesty to avoid disrupting innovation, cooperation and efficiency?
We’ll get to this in a minute, but let’s first consider a dishonesty case that recently grabbed headlines.
Dishonesty vs. Limited Honesty: What’s the Difference?
You have probably seen news stories about Harvard honesty researcher Francesca Gino being accused of faking data in research papers. I’m not going to discuss the details of that story, only note that much of the popular news coverage focuses on the irony of an honesty researcher being accused of dishonesty.
If headline writers had looked more closely into the story, they would have realized that modern honesty researchers are not uncritical fans of honesty. Indeed, the subtitle of Professor Gino’s most popular book reads: “Why it pays to break the rules in work and life.” Business school ethics classes do not recommend “the truth, the whole truth and nothing but the truth,” and risk managers should understand the reasons why.
To get a better handle on the honesty debate, let’s consider a thought exercise that requires one to make tough decisions under the following scenarios:
1. One of your newly hired staff is obviously very talented, but not working out due to mistakes and problems with teamwork. You think she may have to be let go soon. She comes to you with an idea she wants to develop on her own time and present to senior management. It’s genuinely innovative and intriguing, but you think it’s unlikely to succeed. Do you: (A) Tell her it’s unlikely to work, and that she should give it up and concentrate on improving her performance at assigned work? or (B) Encourage her with praise and some useful feedback — suppressing your doubts — to maximize her chance of success?
2. An otherwise productive team member is causing trouble with unfair and misguided criticisms of other members’ work. Do you: (A) Tell him that he should bring his criticisms to you privately, for the sake of team harmony, without stressing that they’re mostly wrong? or (B) Tell him he’s wrong about his criticisms?
3. A senior person is in your office complaining about a situation he misunderstands, and your attempts to explain have just made him more upset. Do you: (A) Continue to try to explain? or (B) Thank him for bringing the problem to your attention, and assure him you’ll take care of it?
Whatever your answers, I think you can see that it’s not always obvious that the most honest choice is the best choice. Strict honesty can impede innovation, frustrate teamwork and stretch out pointless meetings.
All these are examples of limited honesty rather than dishonesty. You tell the truth, but just avoid the whole truth.
It may be best to shield newly conceived innovations from the full blast of withering honesty until they can grow strong enough to withstand it; but that does not excuse lying or fraud to attract support.
White lies and diplomatic silence can smooth social frictions and promote harmony and cooperation, but it’s important to avoid the “tangled web we weave, when first we practice to deceive.” When honesty is complex or hard to understand, simplifications can save effort and improve short-term outcomes, but oversimplification can lead to long-term problems.
A famously disastrous attempt at limited honesty was the U.S. military’s “don’t ask, don’t tell” policy toward homosexuality in the ranks. It’s a good cautionary example to keep in mind when weighing limits to honesty.
Gino and Mendel: Fudging Data Points
Should risk managers support the modern idea of MBA ethics that praises rule-breaking and limited honesty in some situations? Or should they be strict upholders of the whole truth, however disruptive that might be? Where should the line be drawn, for example, in the case of Professor Gino – which goes beyond white lies and silence to active lying?
Perhaps the most famous case of questionable research is that of Gregor Mendel, known as the “father of genetics” for his pathbreaking work on inheritance in the mid-1800s. Since the early 1900s, people have been complaining that his data conformed too closely to his model to have resulted from honest experiments.
Neither Gino nor Mendel – whose ideas were novel and extremely complex for 19th-century biologists – has been seriously accused of making up data. Rather, both are accused of fudging a few data points to make their results stronger, either deliberately or unconsciously. (There have also been some accusations of overzealous and honesty-challenged assistants.)
If Mendel hadn’t fudged the data to simplify and strengthen his argument, it might have taken even longer for his groundbreaking work on genetics to be accepted, or it might never have gained attention at all. That could have delayed progress in biology for decades. For all we know, in the decades that followed Mendel’s research, other, more honest experimenters were overlooked because of messier reported results, even if their findings were similar to Mendel’s.
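The statistical case against Mendel can be made concrete. The sketch below is illustrative, not from the article: the counts are Mendel’s published pea-shape totals (5,474 round vs. 1,850 wrinkled), while the function names and the Monte Carlo setup are my own. It computes a chi-square goodness-of-fit statistic against the predicted 3:1 ratio, then estimates how often an honest experiment of the same size would fit at least that well.

```python
import random

def chi_square_3to1(dominant, recessive):
    """Goodness-of-fit statistic for observed counts against a 3:1 Mendelian ratio."""
    n = dominant + recessive
    exp_dom, exp_rec = 0.75 * n, 0.25 * n
    return ((dominant - exp_dom) ** 2 / exp_dom
            + (recessive - exp_rec) ** 2 / exp_rec)

def prob_fit_this_good(observed_stat, n, trials=20000, seed=1):
    """Monte Carlo estimate of P(statistic <= observed) for an honest experiment
    of size n, using a normal approximation to the binomial for speed."""
    random.seed(seed)
    sd = (n * 0.75 * 0.25) ** 0.5
    hits = 0
    for _ in range(trials):
        dom = round(random.gauss(0.75 * n, sd))  # simulated honest count
        if chi_square_3to1(dom, n - dom) <= observed_stat:
            hits += 1
    return hits / trials

# Mendel's reported pea-shape counts: 5,474 round vs. 1,850 wrinkled.
stat = chi_square_3to1(5474, 1850)      # about 0.26 -- a very close fit
p_low = prob_fit_this_good(stat, 7324)  # roughly 0.4 for this one experiment
```

Any single result this close is unremarkable: an honest experiment lands at least this near the 3:1 ratio roughly 40% of the time. The classic critique (Fisher, 1936) is that nearly all of Mendel’s many experiments fit this well, and the probability of that happening across the whole series by honest sampling is vanishingly small.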
Balancing Usefulness and Honesty
Mendel’s dilemma is ubiquitous in research. In textbook versions, researchers do experiments that clearly demonstrate their ideas. In reality, experiments are never completely clear.
Scientific credibility comes from correctly judging when the data are strong enough to make a claim, and when more work is needed. Scientists who make premature claims that are later disproved lose respect, whether they reported their results with complete honesty or fudged them a bit by tricks like throwing away an outlier, ignoring a few experimental runs that failed or choosing a statistical method that exaggerated the certainty.
Scientists whose claims are supported by subsequent work gain prestige and credibility. Distilling results that are too complex to describe in full detail into comprehensible accounts of the key findings – without misleading anyone – is an important skill in research.
Researchers in the private sector face the same issues as academic scientists. Decision-makers lack the expertise and time to wade through the full details of an investigation. They cannot work with the whole truth. Rather, researchers must communicate what they know in simplified form and indicate accurately the uncertainty attached, balancing usefulness and honesty.
I’m willing to leave the ethical debate about limits to honesty to philosophers and psychologists, and the practical debate to top business decision-makers. Honesty is just one part of culture, and there are many different cultures — with different ideas about honesty limits — that can be productive.
Parting Thoughts
Risk managers, in short, should demand honesty about honesty.
Organizations sometimes make public statements suggesting zero tolerance for anything less than full honesty. If that is not what is practiced — and it almost certainly is not — the public statements should be toned down.
New employees, especially MBAs, should be asked their views on honesty — and told whether those views conflict with the organization's culture. Clear distinctions should be made among acceptable or even approved limitations on complete honesty, tolerated deviations from the whole truth, and fireable offenses.
Without a good risk culture, risk management is hopeless. Attitudes toward honesty — when the whole truth is required, when limited truth is acceptable and when the whole truth is actually bad — are a central aspect of risk culture.
There may not be a perfect balance, and if there is, I don’t know it. But openness and transparency about honesty rules are essential to risk management.
Aaron Brown has worked on Wall Street since the early 1980s as a trader, portfolio manager, head of mortgage securities and risk manager for several global financial institutions. Most recently he served for 10 years as chief risk officer of the large hedge fund AQR Capital Management. He was named the 2011 GARP Risk Manager of the Year. His books on risk management include The Poker Face of Wall Street, Red-Blooded Risk, Financial Risk Management for Dummies and A World of Chance (with Reuven and Gabriel Brenner). He currently teaches finance and mathematics as an adjunct and writes columns for Bloomberg.
© 2024 Global Association of Risk Professionals