As the African folktale goes, a major flood broke out during the rainy season and all the animals sought refuge on higher ground. Two monkeys climbed up to the treetops, from where they could observe fish swimming and jumping frenziedly out of the water in the overflowing river below. One of the monkeys shouted to the other: “Look at those poor, struggling creatures. They are going to drown!” The other monkey replied: “Yes, how terrible! They probably couldn’t escape to the hills in time because they have no legs. I think we must do something.” So the monkeys hurried to the shore and, with some difficulty, began catching the fish and depositing them on dry land, one by one. After some time there were hundreds of fish lying on the grass, motionless. One of the monkeys said enthusiastically: “Do you see? They were tired, but now they are just sleeping and resting. Had it not been for us, they all would have drowned.” The other monkey nodded in agreement: “They were trying to avoid being caught because they could not understand our good intentions, but when they wake up they will be very grateful.”
What could have driven the monkeys to behave the way they did? Innocence? Ignorance? Arrogance? Perhaps all of the above? Moreover, how would the monkeys explain themselves once they realized what they had done? Would they take responsibility, try to learn from their mistake, or seek refuge in the proverbial saying “see no evil, hear no evil, speak no evil”? Would they find someone else to blame? What if they not only had the power to decide what’s best for their fellow fish but also the means to control the narrative through Facebook, WhatsApp, Twitter and YouTube? How inclined would they be to twist historical accounts or manipulate the facts to suit their agenda? How easy would it be for them to form political alliances and drive public policies to protect their own perceived truth? For how long would they attempt to maintain their version of events? What would they do to other monkeys who disagreed? And who would eat all the bananas?
I recently had the chance to watch the HBO documentary “The Inventor: Out for Blood in Silicon Valley,” which explores the spectacular downfall of Elizabeth Holmes and her health technology startup Theranos. One of the main voices in the film, behavioral economist Dan Ariely, explains how people can justify dishonest behavior by convincing themselves (and others) that they are doing it for a good cause. He describes a psychological experiment in which participants, before rolling a standard six-sided die, silently choose whether the top or the bottom face will count. After the roll, they tell the moderator which side they picked and are paid according to the number on that side. Perhaps unsurprisingly, people tend to cheat at least some of the time in this task (or, as Ariely jokes, they are “extra lucky”), reporting whichever side of the die shows the higher number, regardless of which one they actually chose before rolling.
But here is where it gets interesting. In a twist on the experiment, participants were hooked up to a lie detector while playing the game. Ariely found that, generally, the lie detector could tell when they were dishonestly reporting the more favorable side of the die. Then he introduced another variant: instead of receiving the money themselves, participants were told that their winnings from the die game would go to a charity of their choice. In this version of the experiment, the lie detector could no longer reliably tell when participants were being dishonest. Why? When participants cheated for selfish reasons, the machine detected tension; when they cheated for charitable reasons, it did not. According to Ariely, this illustrates how a sense of serving the greater good can erode someone’s honesty and fool even a lie detector. In other words, people think cheating is wrong, but once they’ve convinced themselves it isn’t, they stop worrying. “If it’s for a good cause, you can still think you’re a good person,” he says.
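Ariely’s die task also lends itself to a quick back-of-the-envelope check: since opposite faces of a standard die sum to 7, an honest player’s expected payoff is 3.5, while someone who always reports the higher face averages 5.0. Here is a minimal simulation of that gap (the function name and the payoff rule of “one unit per pip” are my own illustrative assumptions, not details from the study):

```python
import random

def die_game(trials, cheat):
    """Simulate Ariely's die task. Opposite faces of a die sum to 7.
    An honest player commits to 'top' in advance and reports it;
    a cheater reports whichever face came up higher. Payoff is the
    number on the reported face."""
    total = 0
    for _ in range(trials):
        top = random.randint(1, 6)
        bottom = 7 - top
        total += max(top, bottom) if cheat else top
    return total / trials

random.seed(0)
honest = die_game(100_000, cheat=False)  # expected value: 3.5
lucky = die_game(100_000, cheat=True)    # expected value: 5.0
print(f"honest: {honest:.2f}, 'extra lucky': {lucky:.2f}")
```

The gap between roughly 3.5 and 5.0 per roll is why consistently “extra lucky” reporting stands out so clearly in the aggregate data.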
In this TED Talk, Dan Ariely explains how our preconceived notions “color our world”. More specifically, he demonstrates the huge gaps that can exist between what is actually happening, what we think is happening, and what we would like to see happening. One of the consequences of this, of course, is that corrective actions (including public policy) can’t possibly be effective when our understanding of the problem is wildly distorted.