When the spin ended up on black yet again, I knew the streak couldn’t last forever and placed a $20 chip on red. The ball immediately landed on black again. Undeterred, I decided to double down so I could at least break even. After all, the probability of black hitting again had to be minuscule … right?
Well, not exactly. I never got back to even that weekend.
After returning home with little cash and even less dignity, I did some research and found that I’d fallen victim to the Gambler’s Fallacy, the “mistaken belief that, if something happens more frequently than normal, it will happen less frequently in the future.” In other words, despite the fact that a roulette ball landing on black 13 consecutive times is a rare occurrence, each spin is still an independent event.

I also found that the Gambler’s Fallacy is just one example of what psychologists and behavioral economists have termed cognitive biases. In fact, the study¹ of irrational decision making caused by our brains falling into these logic “traps” has grown considerably since the 1970s. As you might expect, there are practical implications far beyond limiting gambling losses. So, for a special behavioral economics edition, we’ll examine three cognitive biases and how we might minimize their influence.
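For the skeptics, the independence claim is easy to sanity-check. Here’s a minimal simulation (my own back-of-the-envelope sketch, assuming an American wheel with two green pockets) showing that the odds of red are unchanged even after a long run of black:

```python
import random

# An American roulette wheel: 18 red, 18 black, 2 green pockets.
WHEEL = ["red"] * 18 + ["black"] * 18 + ["green"] * 2

STREAK = 5            # look at spins that follow 5 consecutive blacks
total_spins = total_reds = 0
streak_spins = streak_reds = 0
black_run = 0         # current run of consecutive blacks

for _ in range(2_000_000):
    result = random.choice(WHEEL)
    total_spins += 1
    total_reds += result == "red"
    if black_run >= STREAK:          # this spin follows a long black streak
        streak_spins += 1
        streak_reds += result == "red"
    black_run = black_run + 1 if result == "black" else 0

print(f"P(red) overall:          {total_reds / total_spins:.4f}")
print(f"P(red) after {STREAK} blacks:   {streak_reds / streak_spins:.4f}")
# Both hover around 18/38 = 0.4737 -- the wheel has no memory.
```

Two million simulated spins later, both numbers converge to 18/38, roughly 47.4%. My 13-black streak was rare, but it told the ball nothing.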
1) The Planning Fallacy
The Planning Fallacy is the idea that we almost always underestimate the time and cost associated with complex projects: planners are overly optimistic and too rarely anchor their estimates in actual past experience. Examples of the Planning Fallacy in government projects abound (look no further than the $400 billion+ F-35 fighter jet program), but the bias is by no means limited to that realm. It will sound familiar to any management team juggling a dozen or more complex projects with competing resource needs and tight deadlines.
Overcoming the Bias
Clear and practical benefit expectations should be agreed upon up front. With the finance team as a strategic partner, a true ROI process that incorporates conservative implementation estimates and “known unknowns” should be mandated. Just as importantly, a look-back process should be carried out where successes and failures can be honestly assessed. Whether it’s a system conversion or process overhaul, transparent objectives and a disciplined approach go a long way.
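To make that concrete, here’s a minimal sketch of what a conservative ROI estimate plus a look-back could look like. Every figure, including the 30% cost contingency and the 20% benefit haircut, is a hypothetical assumption for illustration, not an actual methodology:

```python
# A bare-bones sketch of a conservative ROI estimate plus a look-back.
# All numbers are hypothetical; tune the padding to your own track record.

def conservative_roi(annual_benefit, implementation_cost, years=3,
                     cost_contingency=0.30, benefit_haircut=0.20):
    """Pad costs for 'known unknowns' and haircut projected benefits
    before the project is approved."""
    cost = implementation_cost * (1 + cost_contingency)
    benefit = annual_benefit * (1 - benefit_haircut) * years
    return (benefit - cost) / cost

def look_back(projected_roi, actual_annual_benefit, actual_cost, years=3):
    """The honest post-mortem: compare promise to reality."""
    actual_roi = (actual_annual_benefit * years - actual_cost) / actual_cost
    return actual_roi - projected_roi   # negative = we were too optimistic

projected = conservative_roi(annual_benefit=800_000,
                             implementation_cost=1_200_000)
print(f"Conservative projected ROI: {projected:.1%}")        # ~23%
print(f"Look-back gap:              "
      f"{look_back(projected, 650_000, 1_900_000):.1%}")     # ~ -20%
```

Even with deliberately padded estimates, the look-back in this toy example comes in 20 points under plan, which is exactly why the post-mortem step can’t be skipped.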
(Bonus reading: Brooks’ Law, the idea that adding resources to a project that’s already behind schedule will make it even later.)
2) Confirmation Bias
If you’ve ever observed someone parse through information until he or she finds a source that aligns with pre-existing beliefs, then you know Confirmation Bias. This pitfall is common among bankers frustrated with a lack of system capabilities and integration.
Confirmation Bias goes something like this: a banker who has already decided the system is the problem seizes on every workaround and support ticket as proof, while discounting evidence that process design or training might be the real culprit. The resulting system changes are often costly while adding little, if any, functionality benefit.
Overcoming the Bias
Beyond an ROI analysis and a disciplined project approach, the inevitable tradeoffs between systems should be assessed up front with brutal honesty. As part of the selection process, sales-oriented PowerPoints from vendors should be 1) kept to a minimum with time spent on actual system use and navigation, and 2) focused on pre-scripted areas of interest. Inviting the incumbent provider to present a “refresh” demo under the same guidelines is a best practice. Finally, involving training resources early in the process to communicate benefits and changes to impacted staff members is key.
3) Mere-Exposure Effect
This is the theory that “mere repeated exposure of the individual to a stimulus is a sufficient condition for the enhancement of his attitude toward it.” Sound familiar? My colleague Eric Weikart recently wrote that a majority of banks are still using patchwork manual processes to originate big-dollar deals. It’s human nature to get comfortable with, even nostalgic about, familiar processes, however painful and time-consuming they may be.
The Mere-Exposure Effect is often the culprit (or at least a contributing factor) when stale processes are preserved as part of a new system rollout. With that in mind, the definition of implementation success can’t be merely “flipping the switch.”
Overcoming the Bias
Project teams consisting of subject matter experts and even management trainees should be empowered to drive key process changes so that system functionality and procedures stay in sync, and those teams should be held accountable. Ongoing, routine (ideally quarterly) tracking of benchmarks remains crucial to enabling fact-based assessments of progress.
Are we going to get every technology decision right? Or every process optimally designed overnight? Of course not. But putting practical building blocks in place to combat our cognitive blind spots is a no-brainer.
-RB
¹ Two wonderful books on this topic: Predictably Irrational by Dan Ariely and Thinking, Fast and Slow by Daniel Kahneman.
There is another cognitive bias that I have named “The Frequency Effect.” It occurs in most businesses that allow/rely on problematic issues getting “escalated” for resolution. This is not unusual, as the premise is that a “higher up” employee (i.e., a manager) has more authority and a better understanding of what the right answer is for the business. Predictably, the higher up you go, the greater the frequency and severity of the escalated issues you encounter. This leads to flawed conclusions about the character of the work being produced, i.e., it oversamples the problems and fails to recognize all the things that were just fine. So if all the vendor’s issues are brought to the manager for resolution, he/she will quickly conclude that we have a bad vendor on our hands. The solution is to ensure that those performance calculations have the proper denominator (i.e., all the units of work) so the resulting conclusions will have the appropriate context. Oh, and it may help to ask why all these issues have to go up the food chain to get solved. That may be a vendor issue or it may be a client issue, but it sure looks like an issue!
Pat, great example. This is a bias that applies to banking if I’ve ever heard one, and your denominator point is worth a quick illustration (see the sketch below). Thanks for your comment.
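A minimal sketch, with purely hypothetical numbers, of how an escalation-only view distorts the picture:

```python
# Hypothetical quarter of vendor work, illustrating the sampling problem.
units_of_work = 10_000   # everything the vendor processed
problem_items = 150      # items that actually had an issue
escalations = 120        # issues that landed on the manager's desk

# The manager's desk is a biased sample: it holds only problems, so the
# work "looks" terrible. The proper denominator is ALL units of work.
defect_rate = problem_items / units_of_work

print(f"Issues the manager saw:            {escalations}")
print(f"Clean items the manager never saw: {units_of_work - problem_items:,}")
print(f"Actual defect rate:                {defect_rate:.1%}")  # 1.5%
```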
“I reject your reality and substitute my own.” – Adam Savage
Great article Ryan! Bringing critical thought and some introduction to cognitive bias to the financial services world! Well done!
Thanks for the comment, Michael. Hope you’re doing well!
In addition to the Kahneman book, “How We Decide” by Jonah Lehrer is also quite good.