Monday, September 14, 2020

See Last Piece for the Market Update -- This One is About Cognitive Bias

 “He who knows only his own side of the case knows little of that. His reasons may be good, and no one may have been able to refute them. But if he is equally unable to refute the reasons on the opposite side, if he does not so much as know what they are, he has no ground for preferring either opinion... Nor is it enough that he should hear the opinions of adversaries from his own teachers, presented as they state them, and accompanied by what they offer as refutations. He must be able to hear them from persons who actually believe them...he must know them in their most plausible and persuasive form.” 

John Stuart Mill, On Liberty 


What is "truth"? Truth is simply "that which actually is." Truth is reality, in all its glory and horror. "Not truth" can be defined as "that which is not reality" (sometimes called a "lie"). 

Let's imagine you have to negotiate a minefield. Which of the following would you prefer to be armed with? 
  1.  The truth of where the mines are located, or 
  2.  A belief system that may "sound better" but which is going to get you blown up directly?

Everyone, of course, immediately answers, "Option one."  This seems so self-evident that they may even add:  "Duh!"

But not so fast. Is that REALLY what you want? Are you acting in the world as if that's really what you want?  What if accepting the truth of that first map of the minefield means we have to give up something we love, because it's surrounded by deadly mines?  

So even though we all like to think we'd jump at the first option (at least when it's presented with no strings attached), very, very few people actually behave in a manner that will bring that option into being. 

 How about these next options? Would you prefer: 
  1. An easy-to-read map that shows you how to get safely through the entire minefield, or 
  2. A map that shows you the first three safe steps, then gives vague hints on how to figure out the next step from there?

Still option 1, right?  Duh, obviously.

Kaboom. 

Wait, what?!?

Turns out option one was a lie. Oh, not entirely a lie: It is indeed an easy-to-read map that depicts a path through the entire minefield -- just NOT "safely."  (The people who sold us the map neglected to mention this little detail, for some reason.) 

This is not unlike the choice we often face, both in trading and in life. 

And, sure, the consequences may not be as immediate and dramatic as a literal minefield, but the long-term result of opting for a "map of reality" that contains significant errors is indeed destruction. And while even the best map of reality can't guarantee success, it at least gives us a fighting chance.  (Here, think in terms of trading:  Speaking personally, there are times I've formed an excellent map of the future market, only to fail to make any money on it in execution.  But if my map had been flawed, I almost certainly would have lost money.) 

So how do we assure that we are working with an accurate map of reality? For starters, we must become aware of when we are engaging in motivated reasoning, which is: 

"a form of reasoning in which people access, construct, and evaluate arguments in a biased fashion to arrive at or endorse a preferred conclusion." 

Basically, it means starting with our conclusion ("our bias") and working backwards to try to support it.  What does "motivated reasoning" look like in operation? 

Imagine you're a coffee drinker and you come across a new study saying that coffee causes cancer. Would you accept that at face value? Probably not, if you like coffee. It's more likely you're going to dig deeper and try to find fault with the study ("motivated skepticism"). Whereas someone who isn't a coffee drinker (i.e., someone who isn't motivated by an underlying emotion) will often simply say, "Huh, I didn't know that." 

Or, if you're bearish on the market, you might give more weight to bearish ideas and dismiss or ignore bullish ideas and data (or vice versa).

And, in fact, the above is exactly what researchers have found happens in real-life testing. 

So this is "motivated reasoning" and "motivated skepticism": If we encounter an idea that we want to be true ("new study shows that coffee raises IQ!"), we are entirely too good at accepting that idea at face value with minimal skepticism. If we encounter an idea that we wish were NOT true, we are entirely too good at rejecting it out of hand ("Oh that's ridiculous! I can't believe that!") and finding (or imagining) ways to find fault with it. 

Those who are good at overcoming bias apply the same rigor and skepticism to ideas they like as to ideas they don't like. That's because, if we wish to become skilled at critical thinking (which is the precursor skill that aids us in finding reality/truth), we must set our goals higher than our base wants and desires: We must set our goal as, simply, finding the truth. Hand in hand with this, we must commit to accepting truth as we find it. 

And it ends there. That's the whole goal. 

We can't pervert that goal with any "ifs, ands, or buts."  ("But I'll only accept the truth if it backs up my other beliefs and helps prove me right!")

That is the only way to keep ourselves honest. Truth must be a goal in itself, because every approach will fail if truth becomes a secondary goal that functions in subservience to some ideology. 

In other words, if we value our beliefs more than we value truth, we will fail to properly align with reality.  And that's a sure-fire way to make life harder than it already is.

The quest for truth must be detached and isolated from other belief systems and stand, as it were, at the pinnacle of our internal hierarchy.  Put in terms of our opening minefield hypothetical:
  • Do we want our beliefs to flow from (and thus be based upon) truth?
  • Or, the alternative: Do we want our "truths" to filter down from (and thus be based around) our beliefs? 

Truth has to be at the top of the value structure, or we will reach it only when it happens to align with what we already believe.  ("Truth, but only randomly.")

Sadly, the vast majority of people opt for the alternative, knowingly or unknowingly. And frankly, ALL of us do it sometimes. It seems to be built into human nature.

And, to make matters much more complicated (ready for this?), here's the kicker to everything we just discussed:  Motivated reasoning/skepticism is not always a bad thing.  

There are times it is quite useful and valuable.  For example, if we know for a fact that 2+2=4, then when someone at Harvard tries to tell us that 2+2=5, we are immediately skeptical.  And so we go in looking to poke holes in their proposition, and we thus quickly identify that they are making a "cheat" argument, created by misrepresenting their initial equation.  (Short version:  Their argument is, essentially, that "2+2 can equal 5 in the presence of ADDITIONAL VARIABLES" -- which means the starting equation is not really "2+2" but some variation of "2x+2y" (or even a simple "2+2+x" or another variation) and, of course, we already knew that 2+2+something else can equal 5.)
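
To see that sleight of hand spelled out, here is the arithmetic in miniature (a minimal sketch of the argument described above; the variable x is simply my label for the smuggled-in term):

  Claimed:  2 + 2 = 5
  Actual:   2 + 2 + x = 5, which solves to x = 1

The equation only balances once that hidden term is included, so the proposition was never really about "2 + 2" in the first place.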

Anyway, point is, we keep engaging in motivated reasoning/skepticism because sometimes it serves us well.  But other times it can be quite damaging and literally block us from seeing reality -- so it's important to try to recognize and understand which instance we're engaged in, in real-time.

Especially since it's easy for humans to develop bias based on nothing more than repetition, a tendency the successful propagandists of history have often exploited.  "Repeat a lie often enough, and it becomes truth" is a line commonly attributed to Goebbels.  And experiments show that people tend to rate familiar things (things they've heard or seen before) as more likely to be true, regardless of whether they actually are.

In modern culture, it might not even be intentional propaganda; it may just be a simple error that's been echoed so often it becomes a subjective, widespread "truth" that nobody bothers to examine.

So how much of our bias is even backed by anything, other than the fact that we've heard it repeated often?  We really don't know until we start digging.

Thus I think the first step to overcoming this particularly deep-rooted bias is to become aware of it.  Then, once we've gotten good at identifying the tendency, we can start to sort the beneficial uses of this bias from the unproductive ones.  

And then, once we see it, we must hold our own intellectual feet to the fire and not allow ourselves to get away with it. 

This gives us the best chance at identifying reality as it actually is, and not how we wish it to be.

The most succinct way to express this as a rule might be: 

  "If it hurts to hear, then look for the truth in it. If it comforts to hear, then look for the lie in it."


So that's it.  I just wanted to touch on that briefly because:
  1. It's very valuable for traders to be aware of this tendency in themselves.
  2. I'm going to refer back to it in future articles, some of which may (will) go beyond trading.

Bias safe.  I mean... trade safe.  ;)
