Heuristics, Biases and Nudges

“Heuristics” is the ten-dollar word for the mental shortcuts human beings take when processing information and making decisions. Such shortcuts are helpful when speed or efficiency is paramount in decision making, but they leave us prone to numerous cognitive biases that skew our thinking in ways that are well documented and predictable. Being aware of such biases allows us to “nudge” others toward certain decisions or behaviours without taking away their freedom to choose.

In this article I list the biases I find to be most useful for ethically influencing the choices of clients and prospects in a sale without misleading them or impairing their ability to choose what’s right for them. It’s also helpful to watch for these biases in our own decision making.

Whether nudging people in this manner is in the domain of the dark arts or not depends entirely on your own use and motivations. So, with your solemn oath to nudge only for niceness instead of evilness, let us examine some of the more interesting cognitive biases and how to use them.

Anchoring

Anchoring is the tendency we all share to overweight the first piece of information we receive on a subject. I’ve already dedicated an entire article to this topic and have covered it in a few webcasts as well, but only as it applies to pricing, where “anchor high” is the rule for salespeople and “anchor low” the rule for buyers.

Anchoring doesn’t have to be limited to prices or even numbers, though. The first idea raised in a problem-solving discussion will skew the discussion to that idea.

The original, and more complete, term for anchoring is “anchoring & adjusting.” We quickly anchor to the first piece of information as a starting point in our decision making, then enlist a slower, deeper system of thinking to make adjustments and arrive at a decision using a balance of these two fast and slow systems. Our adjustments, however, typically fail to fully counter the effect of the initial anchor.

Getting your ideas out first has an effect that is not easily undone, even when the other party knows what you’re doing. Be the first one to offer a proposal in any negotiation and anchor with a position that’s more exaggerated than your desired final one.

If you’ve ever presented three creative options to a client and thought of or explained those options in the ascending order of safe, bold & audacious, then you’ve attempted to anchor. The mistake most people make is pitching the audacious option last. Instead, lead with the audacious option, properly anchoring it in the mind of the client. From there, watch it skew the discussion, increasing the likelihood that the client will choose your bold middle option.

Social Influence

When your mother exclaimed, “I suppose if Billy jumped off a bridge then you would too!?” she was trying to counteract the influence of others on your behaviour. There are different types of social influence, some based on overt peer pressure and others based simply on information about what others are doing, but all can have a profound effect on decision making.

The most common example of leveraging social influence in a sale is the label “most popular” placed next to an option. The implication is that those buyers know something you do not, so you’d better follow their lead. By telling your client that “most of our clients choose to hire us to do Y in addition to X,” you’re leveraging social influence. As you are when you show your client or prospect what their competitors are doing in their marketing that they are not.

Occasionally, a prospect will invite you to sell to them with a request like, “Tell me why we should hire you.” You should never accept such an invitation. Instead, counter with something like, “How about instead of trying to convince you, I tell you why our current clients hire us and you can see if those reasons make sense for you?” You’ve now dragged your client’s peers into the room and are speaking for them, explaining what they saw in you. You’ve just swapped your own self-serving bias for your prospect’s bias to be influenced by others.

A bias related to social influence is the bandwagon effect: the tendency to conform one’s behaviour more readily as the number of other people exhibiting that behaviour rises. Implied in the bandwagon effect is a sort of tipping point, where even a small imbalance builds on itself until the vast majority ultimately conform to the new belief or behaviour. The modern vernacular is “going viral,” which is apt, because the spread of infectious diseases was the field of study that first revealed the tipping point effect, since popularized and applied to other areas by Malcolm Gladwell.

Framing Effect

We all intuitively understand that how we frame a choice, to ourselves or others, has a significant impact on the decision that gets made. The framing effect sees people draw different conclusions from the same information based on how the information is presented.

Loss Aversion Bias

I’ll move quickly to loss aversion bias because one of the simplest ways of reframing a decision to your advantage is to frame the choice as giving something up rather than acquiring it.

Loss aversion bias, closely related to the endowment effect, is somewhat quantifiable. Research has repeatedly shown that people dislike giving up something they possess about twice as much as they like the idea of acquiring something they do not yet possess. This is just one reason why guarantees and offers to “try before you buy” are effective.
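To put rough numbers on that two-to-one ratio (a back-of-the-envelope sketch assuming a simple loss-aversion multiplier of about 2, not a precise model): consider a coin-flip gamble that risks losing $100 against winning some gain G. Weighting the potential loss twice as heavily as the gain, the bet only starts to feel acceptable when

0.5 × G − 0.5 × (2 × $100) ≥ 0, which means G ≥ $200.

In other words, most people need roughly double the upside before they’ll willingly put something they already own at risk.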

Loss aversion bias explains why you overpaid so much at that charity auction. Once you made a bid, the item was yours (in a part of your mind, anyway) and you valued it more than before you bid. If you were bid up by competitors multiple times, then your sense of ownership, and of potential loss, increased, so you paid more than you planned to. It happens every time, even at corporate auctions where we assume the bidders to be more rational. They’re not. They’re as susceptible to such biases as those of us with smaller chequebooks.

Don’t make the mistake of using loss aversion bias as your reason for working for free in the buying cycle. The client is committed to you only once they’ve parted with their money, so use a guarantee but don’t begin working on the engagement until they’ve paid for the first phase. That’s when the endowment effect becomes your friend.

Sunk Cost Bias

If you’ve ever increased your investment in a pitch in the face of information showing that the original investment was a poor one, then you, my friend, have been had by the sunk cost bias.

We all understand that mistakes in investing and hiring need to be ruthlessly corrected once the error of our ways becomes apparent, but most of us can rationalize why we should keep going for just a little while longer. And then longer still.

What expensive new business opportunity are you pursuing right now under the undue influence of a sunk cost bias? Now that you know what to do, will you do it or will you keep investing and rationalizing until the inevitable conclusion? We all know how this movie ends. (Pass the popcorn.)

Certainty Bias

At least one study has shown that when someone claims to be “99% certain,” they’re only 40% likely to be correct. So, when you hear someone make this claim, the smart thing is to bet against them.
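If you take that 40% figure at face value, the arithmetic of betting against such a person at even money is straightforward (a simple expected-value sketch, not a suggestion to start wagering with clients):

EV per $1 staked = 0.60 × (+$1) + 0.40 × (−$1) = +$0.20

You profit whenever they turn out to be wrong, which, by that study’s reckoning, is most of the time.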

An interesting related bias is the Dunning-Kruger effect, in which unskilled individuals tend to overestimate their abilities while highly skilled experts underestimate theirs. The short of it is that if someone appears overconfident in their own abilities, they probably are. Place your bets accordingly.

Confirmation Bias

Confirmation bias is the most common, obvious and powerful of all the biases. It is the tendency to look for and invite information that supports preconceptions and existing beliefs while ignoring information that challenges them. Pay attention and you will see it everywhere.

The book Immunity to Change, by psychologists Robert Kegan and Lisa Laskow Lahey, maps out three core stages of “mental complexity” in human beings that the authors claim have little to do with education or even measurable intelligence.

To paraphrase their work in simple terms that I’m sure would make them uncomfortable: the first level of mental complexity is where we embrace certain ideas of others and become followers; the second level is where we author our own ideas; and the third level is where we become aware of our biases toward our own ideas and are able to further them but also to rethink, reshape and even recant them if necessary. (Kegan and Laskow Lahey use the term “stages” rather than “levels” and they identify five of them in their model, but for the sake of simplicity and clarity I’m sticking to the truncated idea of three levels here.)

What’s interesting about the second level, where I believe most competent, successful people lie, is that once you start showing conviction for your ideas, those around you become aware of your biases and tend to bring you information that supports your point of view, while withholding information that challenges it. This feedback loop increases your certainty. The higher up you go in any organization, I believe, the more your certainty becomes entrenched. That’s why all dictators and most governmental leaders are delusional.

People at these first two levels also tend to seek out only information that confirms their biases. So, while someone might be well read on a subject, the breadth of their reading rarely allows for differing ideas, leading to the type of intransigence that makes the company Christmas party so eventful.

Global warming is a fantastic arena for demonstrating confirmation bias on both sides of the debate. Most of those who don’t believe the planet is warming at least in part because of human activity become more entrenched in their position as they accumulate more information, while most of those on the other side become ever more convinced of the certainty of the apocalyptic forecasts as they do the same. Both sides should be softening their positions based on the available information, but their confirmation biases see them cementing those positions instead.

It’s a mistake to assume that the only thing standing between a person and enlightenment is data. It’s almost never the case. The more someone thinks they know about a subject, the more susceptible they are to confirmation bias. This holds true for scientists (regardless of their protests that science is above bias) as much as for us muggles.

The third and highest level of mental complexity, where we are aware of our own biases, is rare air. How many successful people do you know who really invite challenges to their thinking or beliefs? How many of us truly consider viewpoints that are at odds with the ideas we’ve worked so hard to develop, or the ideas of others to which we’ve become so committed? Less than 1% of the adult population, according to Kegan and Laskow Lahey. Chew on that for a minute.

We’re Only Human. So, Let’s Take Advantage.

We are all rife with biases. They originate in the gap between our two systems of thinking: the quick, intuitive, shortcut-based system that lets us do complex things like drive a car while talking on the phone and drinking hot coffee, all without much conscious attention or cognitive cost, and the more considered, analytical system that lets us do long division.

These systems work together to allow us to parallel process our way to decisions in an incredible manner that even the world’s best computers cannot replicate. The intersection of these two systems just happens to leave us open to a bit of hacking, if one knows where to look. I’ve just told you some of the places to look.