GUEST POST from Greg Satell
In 1977, Ken Olsen, the founder and CEO of Digital Equipment Corporation, reportedly said, “There is no reason for any individual to have a computer in his home.” It was an amazingly foolish thing to say and, ever since, observers have pointed to Olsen’s comment to show how supposed experts can be wildly wrong.
The problem is that Olsen was misquoted. In fact, his company was actually in the business of selling personal computers and he had one in his own home. This happens more often than you would think. Other famous quotes, such as IBM CEO Thomas Watson’s prediction that there would be a global market for only five computers, are similarly false.
There is great fun in bashing experts, which is why so many inaccurate quotes get repeated so often. If the experts are always getting it wrong, then we are liberated from the constraints of expertise and the burden of evidence. That’s the hard thing about hard facts. They can be so elusive that it’s easy to doubt their existence. Yet they do exist and they matter.
The Search for Absolute Truth
In the early 20th century, science and technology emerged as a rising force in western society. The new wonders of electricity, automobiles and telecommunication were quickly shaping how people lived, worked and thought. Empirical verification, rather than theoretical musing, became the standard by which ideas were measured.
It was against this backdrop that Moritz Schlick formed the Vienna Circle, which became the center of the logical positivist movement and aimed to bring a more scientific approach to human thought. Throughout the 1920s and 1930s, the movement spread and became a symbol of the new technological age.
At the core of logical positivism was Ludwig Wittgenstein’s theory of atomic facts, the idea that the world could be reduced to a set of statements that could be verified as true or false, with no opinions or speculation allowed. Those statements, in turn, would be governed by a set of logical algorithms that would determine the validity of any argument.
It was, to the great thinkers of the day, both a grand vision and an exciting challenge. If all facts could be absolutely verified, then we could confirm ideas with absolute certainty. Unfortunately, the effort would fail so miserably that Wittgenstein himself would eventually disown it. Instead of building a world of verifiable objective reality, we would be plunged into uncertainty.
The Fall of Logic and the Rise of Uncertainty
Ironically, while the logical positivist movement was gaining steam, two seemingly obscure developments threatened to undermine it. The first was a hole at the center of logic called Russell’s Paradox, which produced a statement that is true if and only if it is false. The second was quantum mechanics, a strange new science in which even basic physical quantities could defy definite measurement.
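In modern set notation, the paradox arises from the set of all sets that are not members of themselves:

R = { x : x ∉ x }, from which it follows that R ∈ R if and only if R ∉ R

If R contains itself, its own definition excludes it; if it does not, that same definition includes it. Either way, the statement is true exactly when it is false, and the tidy machinery of logic offers no way out of the trap.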
Yet the battle for absolute facts would not go down without a fight. David Hilbert, the most revered mathematician of the time, created a program to resolve Russell’s Paradox. Albert Einstein, for his part, argued passionately against the probabilistic quantum universe, declaring that “God does not play dice with the universe.”
Alas, it was all for naught. Kurt Gödel would prove that any logical system powerful enough to be useful contains statements that can neither be proved nor disproved within it. Alan Turing would show that not all numbers are computable. The Einstein-Bohr debates would be resolved in Bohr’s favor, destroying Einstein’s vision of an objective physical reality and leaving us with an uncertain universe.
These developments weren’t all bad. In fact, they were what made modern computing possible. However, they left us with an uncomfortable uncertainty. Facts could no longer be absolutely verified; they would merely stand until they were falsified. We could, after thorough testing, become highly confident in our facts, but never completely sure.
Science, Truth and Falsifiability
In his 1974 commencement speech at Caltech, Richard Feynman recounted visiting a new-age resort where people were learning reflexology. A man was sitting in a hot tub rubbing a woman’s big toe and asking the instructor, “Is this the pituitary?” Unable to contain himself, the great physicist blurted out, “You’re a hell of a long way from the pituitary, man.”
His point was that it’s relatively easy to make something appear “scientific” by, for example, having people wear white coats or present charts and tables, but that doesn’t make it real science. True science is testable and falsifiable. You can’t merely state what you believe to be true, but must give others a means to test it and prove you wrong.
This is important because it’s very easy for things to look like the truth but actually be false. That’s why we need to be careful, especially when we believe something to be true. The burden is even greater when it is something that “everybody knows.” That’s when we need to redouble our efforts, dig in and verify our facts.
“We’ve learned from experience that the truth will out,” Feynman said. “The first principle is that you must not fool yourself—and you are the easiest person to fool.” Truth doesn’t reveal itself so easily, but it’s out there and we can find it if we are willing to make the effort.
The Lie of a Post-Truth World
Writing a non-fiction book can be a grueling process. You not only need to gather hundreds of pages of facts and mold them into a coherent story that interests the reader, but also to verify that those facts are true. For both of my books, Mapping Innovation and Cascades, I spent countless hours consulting sources and sending out fact checks.
Still, I lived in fear knowing that whatever I put on the page would permanently be there for anyone to discredit. In fact, I would later find two minor inaccuracies in my first book (ironically, both had been checked with primary sources). These were not, to be sure, material errors, but they wounded me. I’m sure, in time, others will be uncovered as well.
Yet I don’t believe that those errors diminish the validity of the greater project. In fact, I think those imperfections underline the larger truth that the search for knowledge is a journey toward something elusive, always just out of reach. We can struggle for a lifetime to grasp even a small part of it, but to shake free even a few seemingly insignificant nuggets can be a gift.
Yet all too often people value belief more than facts. That’s why they repeat things that aren’t factual: they believe those things point to some deeper truth that defies the facts in evidence. Yet that is not truth. It is just a way of fooling yourself and, if you’re persuasive, fooling others as well. Still, as Feynman pointed out long ago, “We’ve learned from experience that the truth will out.”
— Article courtesy of the Digital Tonto blog