GUEST POST from Greg Satell
Early in my career I was working on a natural gas trading desk and found myself in Tulsa, Oklahoma visiting clients. These were genuine roughnecks, who had worked their way up from the fields to become physical gas traders. When the NYMEX introduced “paper” contracts and derivatives into the market, however, much would change.
They related to me how, when New York traders first came to town offering long-term deals, they were thrilled. For the first part of the contract, they were raking in money. Unfortunately, during the latter months, they got crushed, losing all their profits and then some. The truth was that the trade was pure arbitrage and they never had a chance.
My clients’ brains were working against them in two ways. First, availability bias caused them to value the information most familiar to them and dismiss other data. Second, confirmation bias made them look for information that would confirm their instincts. This, of course, isn’t at all unusual. It takes real effort to avoid believing the things we think.
Becoming a Square-Peg Business in a Round-Hole World
When I was researching my book, Mapping Innovation, I spoke to every great innovator I could find. Some were world class scientists, others were top executives at major corporations and still others were incredibly successful entrepreneurs. Each one shared with me how they were able to achieve incredible things.
What I found most interesting was that the story was different every time. For every one who told me that a particular approach was the secret to their success, I found someone else who was equally successful doing things completely differently. The fact is that there is no one “true path” to innovation; everybody does it in different ways.
Yet few organizations acknowledge that in any kind of serious way. Rather, they have a “way we do things around here,” and there are often significant institutional penalties for anyone who wants to do things differently. Usually these penalties are informal and unspoken, but they are very real and can threaten to derail even the most promising career.
You can see how the same cognitive biases that lost my gas trader friends money are at work here. In a profitable company, the most available information suggests things are being done the “right” way, and everybody who wants to get ahead in the organization is heavily incentivized to embrace evidence that supports that notion and disregard contrary data.
That’s how organizations get disrupted. They stick to what’s worked for them in the past and fail to notice that the nature of the problems they need to solve has fundamentally changed. They become better and better at things that people care about less and less. Before they realize what happened, they become square-peg businesses in a round-hole world.
Silicon Valley Jumps the Shark
Nobody can deny the incredible success that Silicon Valley has had over the past few decades. Still mostly a backwater in the 1970s and 80s, by the end of 2020 four out of the ten most valuable companies in the world came from the Bay Area (not including Microsoft and Amazon, which are based in Seattle). No other region has ever dominated so thoroughly.
Yet lately Silicon Valley’s model of venture-funded entrepreneurship seems to have jumped the shark. From massive fraud at Theranos and out-of-control founders at WeWork and Uber to, most recently, the incredible blow-up at Quibi, there is increasing evidence that the tech world’s “unicorn culture” is beginning to have a negative impact on the real economy.
One clue as to where things went wrong can be found in Eric Ries’s book, The Startup Way. Ries, whose earlier effort, The Lean Startup, was a runaway bestseller, was invited to implement his methods at General Electric and transform the 124-year-old company into a startup. Much as with the “unicorns,” it didn’t end well.
The fundamental fallacy of Silicon Valley is that a model that was developed for a relatively narrow set of businesses—essentially software and consumer electronics—could be applied to solve any problem. The truth is that, much like the industrial era before it, the digital era will soon end. We need to let go of old ways and set out in new directions.
Unfortunately, because of how brains are wired for availability bias and confirmation bias, that’s a whole lot easier said than done.
Breaking Out of the Container of Your Own Experience
In 1997, when I was still in my twenties, I took a job in Warsaw, Poland to work in the nascent media industry that was developing there. I had experience working in media in New York, so I was excited to share what I’d learned and was confident that my knowledge and expertise would be well received.
It wasn’t. Whenever I began to explain how a media business was supposed to work, people would ask me, “why?” That forced me to think about it and, when I did, I began to realize that many of the principles I had taken for granted were merely conventions. Things didn’t need to work that way and could be done differently.
I also began to realize that, working for a large corporation in the US, I had been trained to work within a system, to play a specific part in a greater whole. When a problem came up that was outside my purview, I went to someone down the hall who played another part. Yet in post-Communist Poland, there was no system and no one down the hall.
So I had to learn a new outlook and a new set of skills and I consider myself lucky to have had that experience. When you are forced to explore the unknown, you end up finding valuable things that you didn’t even know to look for and begin to realize that many perspectives can be brought to bear on similar problems with similar fact patterns.
Learning How to Not Fool Yourself
In one of my favorite essays, originally given as a speech, the great physicist Richard Feynman said, “The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that.” He goes on to say that simply being honest isn’t enough; you also need to “bend over backwards” to provide information so that others may prove you wrong.
So, the first step is to be hyper-vigilant and aware that your brain has a tendency to fool you. It will quickly latch onto the most readily available data and detect patterns that may or may not be there. Then it will seek out other evidence that confirms those initial hunches while disregarding contrary evidence.
Yet checking ourselves in this way isn’t nearly enough; we need to actively seek out and encourage dissent. Some of this can be done with formal processes such as pre-mortems and red teams, but a lot of it is cultural: hiring for diversity and running meetings in a way that encourages discussion by, for instance, having the most senior leaders speak last.
Perhaps most of all, we need to have a sense of humility. It’s far too easy to be impressed with ourselves and far too difficult to see how we’re being led astray. There is often a negative correlation between our level of certainty and the likelihood of us being wrong. We all need to make an effort to believe less of what we think.