Professional advising overcomes client biases and adds incremental value
“The first principle is that you must not fool yourself – and you are the easiest person to fool.”
– Richard Feynman
We live in a world of misinformation. We carry the world’s knowledge in our pockets on our smartphones, yet when we try to use this knowledge to our advantage, we often find misleading or biased answers instead of correct ones. We consume a staggering amount of information daily, and it is up to us to interpret and use it to make well-informed decisions; however, so much of what we digest is filtered and shaped through networks of influence that by the time we receive it, the meaning is often tainted. There’s also an ever-growing body of psychological research indicating that our minds are not so good at objectively analyzing information to arrive at the truth. Instead, we tend to make new information fit what we want to believe, with little regard for the evidence. This tendency to misjudge facts, combined with exposure to an overload of information in an increasingly complex world, puts us all at risk – especially when it comes to our financial lives. Whether it’s intentional or unintentional; whether it comes from the outside world or from inside our own minds; whether we want to accept it or not, one thing is certain: we’re all being fooled, many times every single day, and we’re terrible at detecting false information[i].
Considering these facts, you might be asking yourself a very important question: why are we so bad at this? The reasons are numerous, and a bit complicated, but well worth understanding. One reason we’re so easily tricked is that we use mental shortcuts, known as “heuristics”, to solve problems and make judgments quickly and efficiently. Heuristics are “good enough” solutions to complicated problems, and from an evolutionary standpoint they helped our species tremendously in navigating the world and surviving. We still rely on heuristics every day because they spare us from thinking hard about every single decision we make. For example, if I ask you what two plus two equals, the answer will come to you immediately. That’s because you’ve developed a heuristic for the problem of two plus two.
Heuristics are fantastically useful things; however, because our minds are hard-wired to use heuristics and other mental shortcuts so often, we are constantly exposed to errors of logic and irrationality. These errors stem from cognitive biases, which are the roots of irrational behavior. Some of the better-known cognitive biases are: Anchoring (over-reliance on the first piece of information we receive, which colors how we interpret everything that follows); Overconfidence (excessive faith in our own beliefs and abilities, which causes us to take greater risks); Survivorship Bias (the assumption that because we, or the winners we happen to see, succeeded at something, anyone else can succeed at the same thing); and, my personal favorite, Confirmation Bias (the tendency to seek out and favor information that supports our preexisting beliefs while dismissing evidence to the contrary) [ii].
Another reason we’re so bad at detecting misinformation, and so subject to psychological misjudgments, has to do with our egos. The ego is valuable because it gives us an important sense of self-esteem, but it can also block the rational part of our minds and make us think we know more than we do. This can lead to cognitive biases of all stripes. We’re also susceptible to self-serving bias (distorting the truth for the sake of our self-esteem) and motivated reasoning (reasoning our way toward the conclusions we want rather than the ones the evidence supports), emotional reactions that protect us from feeling like nitwits. Much like heuristics, these are survival tactics our minds use to keep us alive. They served us well during the Pleistocene Epoch, when we were hunter-gatherers, but they’re far less useful in the modern world as we sit in front of our computers.
If we want to cut through the clutter of our minds, we need to confront our own biases. This is much easier said than done, as our biases make us feel safe, comfortable, and familiar. In the classic sci-fi film The Matrix, Neo (the central character) must choose between a blue pill – which will allow him to return to his blissful ignorance – and a red pill – which will show him the real world as it truly is, which isn’t so pretty. Confronting our biases and admitting we might be wrong about some of our cherished beliefs is not unlike taking the red pill. It requires courage and the acceptance that we don’t have all the answers. We need to follow the evidence wherever it leads us and approach reality with a cold, objective lens. But it’s hard to do this alone, because our minds will always err on the side of what we want to believe. We need to depend on others whom we can trust, who aren’t invested in protecting our egos, and who aren’t afraid to tell it like it is. But how do we know whom we can trust, especially with our life savings?
In the finance industry, misinformation, deception, cognitive biases, information overload, and irrationality are everywhere. The misinformation is often a byproduct of the industry’s complexity, but sometimes there’s an intentional push to spread it. This misinformation (or, rather, disinformation) is not limited to consumers of financial products. Finance professionals are often the victims of false information, which in turn can make them its perpetuators. Yes, that’s right – finance professionals often pass misinformation along to their clients and others without even knowing they’re doing so!