
Hard Facts Are a Hard Thing

GUEST POST from Greg Satell

In 1977, Ken Olsen, the founder and CEO of Digital Equipment Corporation, reportedly said, “There is no reason for any individual to have a computer in his home.” It was an amazingly foolish thing to say and, ever since, observers have pointed to Olsen’s comment to show how supposed experts can be wildly wrong.

The problem is that Olsen was misquoted. In fact, his company was in the business of selling personal computers and he had one in his own home. This happens more often than you would think. Other famous quotes, such as IBM CEO Thomas Watson predicting that there would be a global market for only five computers, are similarly false.

There is great fun in bashing experts, which is why so many inaccurate quotes get repeated so often. If the experts are always getting it wrong, then we are liberated from the constraints of expertise and the burden of evidence. That’s the hard thing about hard facts. They can be so elusive that it’s easy to doubt their existence. Yet they do exist and they matter.

The Search for Absolute Truth

In the early 20th century, science and technology emerged as a rising force in western society. The new wonders of electricity, automobiles and telecommunication were quickly shaping how people lived, worked and thought. Empirical verification, rather than theoretical musing, became the standard by which ideas were measured.

It was against this backdrop that Moritz Schlick formed the Vienna Circle, which became the center of the logical positivist movement and aimed to bring a more scientific approach to human thought. Throughout the 1920s and 1930s, the movement spread and became a symbol of the new technological age.

At the core of logical positivism was Ludwig Wittgenstein’s theory of atomic facts, the idea that the world could be reduced to a set of statements that could be verified as being true or false—no opinions or speculation allowed. Those statements, in turn, would be governed by a set of logical algorithms which would determine the validity of any argument.

It was, to the great thinkers of the day, both a grand vision and an exciting challenge. If all facts could be absolutely verified, then we could confirm ideas with absolute certainty. Unfortunately, the effort would fail so miserably that Wittgenstein himself would eventually disown it. Instead of building a world of verifiable objective reality, we would be plunged into uncertainty.

The Fall of Logic and the Rise of Uncertainty

Ironically, while the logical positivist movement was gaining steam, two seemingly obscure developments threatened to undermine it. The first was a hole at the center of logic called Russell’s Paradox, which showed that the naive foundations of mathematics allowed statements that were simultaneously true and false. The second was quantum mechanics, a strange new science in which even physical objects could defy measurement.
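To see why the paradox was so troubling, here is a minimal sketch of Russell’s construction in standard set notation (the formulation below is an illustration added here, not something from the original article):

\[
R = \{\, x \mid x \notin x \,\}, \qquad R \in R \iff R \notin R.
\]

The set of all sets that do not contain themselves must contain itself exactly when it does not, so the claim that R contains itself can be made neither true nor false without contradiction.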

Yet absolute facts would not go down without a fight. David Hilbert, the most revered mathematician of the time, created a program to resolve Russell’s Paradox. Albert Einstein, for his part, argued passionately against the probabilistic quantum universe, declaring that “God does not play dice with the universe.”

Alas, it was all for naught. Kurt Gödel would prove that any logical system powerful enough to describe arithmetic must be either incomplete or inconsistent. Alan Turing would show that not all numbers are computable. The Einstein-Bohr debates would be resolved in Bohr’s favor, destroying Einstein’s vision of an objective physical reality and leaving us with an uncertain universe.

These developments weren’t all bad. In fact, they were what made modern computing possible. However, they left us with an uncomfortable uncertainty. Facts could no longer be absolutely verified, but would stand only until they were falsified. We could, after thorough testing, become highly confident in our facts, but never completely sure.

Science, Truth and Falsifiability

In his 1974 commencement speech at Caltech, Richard Feynman recounted going to a new-age resort where people were learning reflexology. A man was sitting in a hot tub rubbing a woman’s big toe and asking the instructor, “Is this the pituitary?” Unable to contain himself, the great physicist blurted out, “You’re a hell of a long way from the pituitary, man.”

His point was that it’s relatively easy to make something appear “scientific” by, for example, having people wear white coats or present charts and tables, but that doesn’t make it real science. True science is testable and falsifiable. You can’t merely state what you believe to be true, but must give others a means to test it and prove you wrong.

This is important because it’s very easy for things to look like the truth, but actually be false. That’s why we need to be careful, especially when we believe something to be true. The burden is even greater when it is something that “everybody knows.” That’s when we need to redouble our efforts, dig in and make sure we verify our facts.

“We’ve learned from experience that the truth will out,” Feynman said. “The first principle is that you must not fool yourself—and you are the easiest person to fool.” Truth doesn’t reveal itself so easily, but it’s out there and we can find it if we are willing to make the effort.

The Lie of a Post-Truth World

Writing a non-fiction book can be a grueling process. You not only need to gather hundreds of pages of facts and mold them into a coherent story that interests the reader, but also to verify that those facts are true. For both of my books, Mapping Innovation and Cascades, I spent countless hours consulting sources and sending out fact checks.

Still, I lived in fear knowing that whatever I put on the page would permanently be there for anyone to discredit. In fact, I would later find two minor inaccuracies in my first book (ironically, both had been checked with primary sources). These were not, to be sure, material errors, but they wounded me. I’m sure, in time, others will be uncovered as well.

Yet I don’t believe that those errors diminish the validity of the greater project. In fact, I think that those imperfections serve to underline the larger truth that the search for knowledge is always a journey, elusive and just out of reach. We can struggle for a lifetime to grasp even a small part of it, but to shake free even a few seemingly insignificant nuggets can be a gift.

Yet all too often people value belief more than facts. They repeat things that aren’t factual because they believe those things point to some deeper truth that defies the facts in evidence. But that is not truth. It is just a way of fooling yourself and, if you’re persuasive, fooling others as well. Still, as Feynman pointed out long ago, “We’ve learned from experience that the truth will out.”

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


We Were Wrong About What Drove the 21st Century

GUEST POST from Greg Satell

Every era contains a prism of multitudes. World War I gave way to the “Roaring 20s” and a 50-year boom in productivity. The Treaty of Versailles sowed the seeds of the Second World War, which in turn gave way to the peace and prosperity of the post-war era. Vietnam and the rise of the Baby Boomers unlocked a cultural revolution that created new freedoms for women and people of color.

Our current era began in the 1980s, with the rise of Ronald Reagan and a new confidence in the power of markets. Genuine achievements of the Chicago School of economics, led by Milton Friedman, along with the weakness of the Soviet system, led to an enthusiasm for market fundamentalism that dominated policy circles.

So it shouldn’t be that surprising that veteran Republican strategist Stuart Stevens wrote a book denouncing that orthodoxy as a lie. The truth is he has a point. But politicians can only convince us of things we already want to believe. We were fundamentally mistaken in our understanding of how the world works, and it’s time that we own up to it.

Mistake #1: The End Of The Cold War Would Strengthen Capitalism

When the Berlin Wall came down in 1989, the West was triumphant. Communism was shown to be a corrupt system bereft of any real legitimacy. A new ideology took hold, often called the Washington Consensus, that preached fiscal discipline, free trade, privatization and deregulation. The world was going to be remade in capitalism’s image.

Yet for anybody who was paying attention, communism had been shown to be bankrupt and illegitimate since the 1930s when Stalin’s failed collectivization effort and industrial plan led him to starve his own people. Economists have estimated that, by the 1970s, Soviet productivity growth had gone negative, meaning more investment actually brought less output. The system’s collapse was just a matter of time.

At the same time, there were early signs of serious problems with the Washington Consensus. Many complained that bureaucrats at the World Bank and the IMF were mandating policies for developing nations that citizens in their own countries would not accept. So-called “austerity programs” led to human costs that were both significant and real. In a sense, the error of the Soviets was being repeated—ideology was put before people.

Today, instead of a capitalist utopia and an era of peace and prosperity, we got a global rise in authoritarian populism, stagnant wages, reduced productivity growth and weaker competitive markets. In the United States in particular, capitalism has been weakened by almost every metric imaginable.

Mistake #2: Digital Technology Would Make Everything Better

In 1989, the same year that the Berlin Wall fell, Tim Berners-Lee proposed the World Wide Web and ushered in a new technological era of networked computing that we now know as the “digital revolution.” Much like the ideology of market fundamentalism that took hold around the same time, technology was seen as the determinant of a new, brighter age.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet by 2004, productivity growth had slowed again to its earlier lethargic pace. Today, despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

Digital technology was supposed to empower individuals and reduce the dominance of institutions, but just the opposite has happened. Income inequality in advanced economies has markedly increased. In America, wages have stagnated and social mobility has declined. At the same time, social media has been destroying our mental health.

When Silicon Valley told us they intended to “change the world,” is this what they meant?

Mistake #3: Medical Breakthroughs Would Automatically Make Us Healthier

Much like the fall of the Berlin Wall and the rise of the Internet, the completion of the Human Genome Project in 2003 promised great things. No longer would we be at the mercy of terrible diseases such as cancer and Alzheimer’s; instead, we would design genetic therapies that would rewire our bodies to fend off disease by themselves.

The advances since then have been breathtaking. The Cancer Genome Atlas, which began in 2005, helped enable doctors to develop therapies targeted at specific mutations, rather than where in the body a tumor happened to be found. Later, CRISPR revolutionized synthetic biology, bringing down costs exponentially.

The rapid development of Covid-19 vaccines has shown how effective these new technologies are. Scientists essentially engineered harmless delivery vehicles carrying just enough of the viral genome to produce a few proteins, enough to provoke an immune response but not nearly enough to make us sick. Twenty years ago, this would have been considered science fiction. Today, it’s a reality.

Yet we are not healthier. Worldwide obesity has tripled since 1975 and has become an epidemic in the United States. Anxiety and depression have as well. American healthcare costs continue to rise even as life expectancy declines. Despite the incredible advances in our medical capabilities, we seem to be less healthy and more miserable.

Worse Than A Crime, It Was A Blunder

Whenever I bring up these points among technology people, they vigorously push back. Surely, they say, you can see the positive effects all around you. Can you imagine what the global pandemic would be like without digital technologies? Without videoconferencing? Hasn’t there been a significant global decline in extreme poverty and violence?

Yes. There have absolutely been real achievements. As someone who spent roughly half my adult life in Eastern Bloc countries, I can attest to how horrible the Soviet system was. Digital technology has certainly made our lives more convenient and, as noted above, medical advances have been very real and very significant.

However, technology is a process that involves both revealing and building. Yes, we revealed the power of market forces and the bankruptcy of the Soviet system, but failed to build a more prosperous and healthy society. In much the same way, we revealed the power of the microchip, miracle cures and many other things, but failed to put them to use in such a way that would make us measurably better off.

When faced with a failure this colossal, people often look for a villain. They want to blame the greed of corporations, the arrogance of Silicon Valley entrepreneurs or the incompetence of government bureaucrats. The truth is, as the old saying goes, it was worse than a crime, it was a blunder. We simply believed that market forces and technological advancement would work their magic and all would be well.

By now we should know better. We need to hold ourselves accountable, make better choices and seek out greater truths.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay
