Author Archives: Greg Satell

About Greg Satell

Greg Satell is a popular speaker and consultant. His latest book, Cascades: How to Create a Movement That Drives Transformational Change, is available now. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.

Real Change Requires a Majority

GUEST POST from Greg Satell

“Don’t worry about people stealing your ideas,” said the computing pioneer Howard Aiken. “If your ideas are any good, you’ll have to ram them down people’s throats,” and truer words were scarcely ever spoken. We tend to think that if an idea has merit, everybody will immediately recognize its value, but that’s almost never true.

Ignaz Semmelweis, quite famously, advocated for hand washing at hospitals, but was ostracized, not celebrated, for it and would himself die of an infection contracted under care before his idea caught on. William Coley discovered cancer immunotherapy over a century ago, but was thought by many to be some sort of a quack.

Good ideas fail all the time. Part of the problem is that people who believe passionately in an idea feel compelled to win over the skeptics. That’s almost always a mistake. The truth is that the difference between success and failure often has nothing to do with the inherent value of an idea, but with where you choose to start. And the best place to start is with a majority.

The Fundamental Fallacy of Change Management

Pundits tell us that change is inevitable, so we need to create a sense of urgency about it. They say we must “innovate or die,” because those who don’t “get it” are dinosaurs and, much like their reptilian brethren, they are bound to die an awful, painful death once the asteroid hits (and, the implication is, they will deserve it too).

History, however, shows us exactly the opposite. People like Ignaz Semmelweis and William Coley had truly groundbreaking ideas that could have saved millions of lives had they been adopted earlier. Nevertheless, those in the medical establishment who thwarted their efforts thrived, while the innovators themselves suffered greatly, both professionally and personally.

It’s not just the medical profession either. Take a short tour through history and it becomes clear that unjust and incompetent regimes can have remarkable staying power. The status quo always has inertia on its side and rarely yields its power gracefully. A bad idea can last for decades, or even centuries.

The fundamental fallacy of change management is that it is essentially a communication exercise: that change fails because people don’t understand it well enough, and that if you explain it to them in sufficiently powerful terms, they will embrace it. The truth is that change fails because others oppose it in ways that are devious, underhanded and deceptive.

That needs to be your primary design constraint.

The Power of Local Majorities

Merely telling someone about change, no matter how artfully, is unlikely to be effective, but that doesn’t mean that people are immune to persuasion. In fact, there are decades of studies that show that people naturally conform to ideas that are widely held by others around them.

Consider this famous series of conformity experiments conducted by Solomon Asch in the 1950s. The design of the study was simple, but ingenious. Asch merely showed a group of people pairs of cards like these:

[Image: Asch Experiment]

Each person in the group was asked to match the line on the left with the line of the same length on the right. However, there was a catch: almost everyone in the room was a confederate who gave the wrong answer. When it came to the real subjects’ turn to answer, most conformed to the majority opinion, even when it was obviously wrong.

Clearly, most ideas are not nearly that unambiguous, which is why, despite having made breakthrough discoveries, Semmelweis and Coley had so much trouble getting traction for them. The majority of the medical establishment was resistant and Semmelweis and Coley found themselves in the minority. Majorities routinely push back against minorities.

The Threshold Model of Collective Action

One important aspect of Asch’s conformity studies was that the results were far from uniform. A quarter of the subjects never conformed, some always did, and others were somewhere in the middle. We all have different thresholds to adopt an idea or to partake in an action, based on factors like confidence in our ability to make judgments and expected punishments or rewards for getting it right or wrong.

The sociologist Mark Granovetter addressed this issue with his threshold model of collective behavior. As a thought experiment, he asks us to imagine a diverse group of people milling around in a square. Some are natural deviants, always ready to start trouble, most are susceptible to provocation in varying degrees and the remainder is made up of unusually solid citizens, almost never engaging in antisocial behavior.

[Image: Threshold Model]

You can see a graphic representation of how the model plays out above. In the example on the left, a miscreant throws a rock and breaks a window. That’s all it takes for his friend next to him to start and then others with slightly higher thresholds join in as well. Before you know it, a full-scale riot ensues.

The example on the right is slightly different. After the first few troublemakers start, there is no one around with a low enough threshold to join in. Rather than the contagion spreading, it fizzles out, the three miscreants are isolated and little note is made of the incident. Although the groups are outwardly similar, a slight change in conformity thresholds can make a big difference.
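To make the dynamic concrete, here is a minimal sketch of the threshold model in Python. The crowds and threshold values are illustrative assumptions chosen to mirror the two scenarios above, not figures from Granovetter’s work.

```python
# Minimal sketch of Granovetter's threshold model. Each person joins the
# riot once the number of people already rioting meets their personal
# threshold. All threshold values below are illustrative assumptions.

def run_cascade(thresholds):
    """Let anyone whose threshold is met join, until nobody new joins."""
    active = 0
    while True:
        joined = sum(1 for t in thresholds if t <= active)
        if joined == active:      # nobody new joined this round
            return active
        active = joined

# Left-hand crowd: thresholds 0, 1, 2, ..., 9. Each rioter triggers the next.
print(run_cascade(list(range(10))))                 # -> 10: a full-scale riot

# Right-hand crowd: identical, except nobody has a threshold of exactly 3.
print(run_cascade([0, 1, 2, 4, 4, 5, 6, 7, 8, 9]))  # -> 3: the riot fizzles
```

A one-person change in thresholds separates a full riot from an isolated incident, which is exactly the point: outwardly similar groups can produce wildly different outcomes.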

It’s a relatively simplistic example, but through another concept Granovetter developed, the strength of weak ties, we can see how the same dynamic can lead to large-scale change as an idea moves from group to group, as shown in the final graphic below.

[Image: From Thresholds to Cascades]

The top cluster is identical to the one in the first example and a local majority forms. However, no cluster is an island because people tend to belong to multiple groups. For example, we form relationships with people in our neighborhood, from work, religious communities and so on. So an idea that saturates one group soon spreads to others.

Notice how exposure to multiple groups can help overcome higher thresholds of resistance, because of the influence emanating from other groups through weak links. When you start with a majority, even if it is a small, local majority, an idea can gain traction, move from cluster to cluster and scale almost infinitely.
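Extending the sketch above, a few more lines show an idea saturating one tight-knit group and then traveling across weak ties to tip its neighbors. The clusters, ties and thresholds here are again illustrative assumptions, not data from the book.

```python
# Sketch: a threshold cascade spreading across clusters through weak ties.
# Here a person adopts once the number of adopting neighbors reaches their
# personal threshold. All groups, ties and thresholds are illustrative.

def clique(members):
    """Fully connect a tight-knit group."""
    return {(a, b) for a in members for b in members if a < b}

# Three tight-knit groups of five...
edges = clique(range(0, 5)) | clique(range(5, 10)) | clique(range(10, 15))
# ...joined by a handful of weak ties across group lines.
edges |= {(3, 5), (4, 6), (8, 10), (9, 11)}

neighbors = {v: set() for v in range(15)}
for a, b in edges:
    neighbors[a].add(b)
    neighbors[b].add(a)

# Most people need two adopting neighbors; each group has one eager member.
threshold = {v: 2 for v in range(15)}
threshold[5] = threshold[10] = 1

adopted = {0, 1, 2}          # a local majority: three people in a group of five
changed = True
while changed:
    changed = False
    for v in range(15):
        if v not in adopted and len(neighbors[v] & adopted) >= threshold[v]:
            adopted.add(v)
            changed = True

print(len(adopted))          # -> 15: the idea has moved from group to group
# Delete the weak ties above and the cascade never leaves the first group.
```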

As I explain in my book, Cascades, there is significant evidence that this is how ideas actually do spread in the real world. The crucial point here is that it makes a really big difference where you choose to start. If you start with people who are enthusiastic about your idea, you are much more likely to succeed than if you choose people who are resistant.

So rather than trying to convince everybody at once, you are much better off identifying people who are like-minded and working on a Keystone Change that can form the basis of a larger transformation.

Working to Attract, Rather Than Overpower

When we look at the stories of Semmelweis and Coley through the prism of local majorities and resistance thresholds, we can see the mistake that they made. Having made truly breakthrough discoveries, they naturally assumed that others would see value in them. Instead, they ran headlong into a highly resistant majority and got squashed.

In my work helping leaders drive organizational transformations, I see this happen all the time. People who believe passionately in an idea naturally assume that others will “see the light.” Not surprisingly, they want to move quickly and overpower any resistance. This is especially true if they feel that they have institutional power behind them.

Yet that is almost always a mistake. There is a reason why the vast majority of organizational transformations fail, even though they typically have big budgets and C-Suite support behind them. To drive meaningful, lasting change you can’t rely on overpowering resistance, but must work to attract and empower genuine support.

That means you need to start with a majority. In the beginning, that may mean starting with a small, local majority—say, three people in a room of five. You can always expand a majority out, but once you find yourself in the minority, you will immediately feel pushback. The secret to overcoming resistance to an idea and driving it forward is understanding that you get to choose where to start.

Revolutionary change always starts with the art of choosing wisely.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay, Greg Satell


Stop Fooling Yourself

GUEST POST from Greg Satell

Early in my career, I was working on a natural gas trading desk and found myself in Tulsa, Oklahoma, visiting clients. These were genuine roughnecks who had worked their way up from the fields to become physical gas traders. When the NYMEX introduced “paper” contracts and derivatives into the market, however, much would change.

They related to me how, when New York traders first came to town offering long-term deals, they were thrilled. For the first part of the contract, they were raking in money. Unfortunately, during the latter months, they got crushed, losing all their profits and then some. The truth was that the trade was pure arbitrage and they never had a chance.
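The article doesn’t spell out the structure of those deals, but a toy example shows how a “win early, lose late” contract can be pure arbitrage. Suppose, purely for illustration, that the New York desk offered a flat fixed price against an upward-sloping forward curve it could hedge; every number below is made up.

```python
# Toy illustration of a "win early, lose late" fixed-price deal.
# All prices are invented for illustration; the article gives no actual terms.
forwards = [2.00, 2.05, 2.10, 2.20, 2.35, 2.50,
            2.70, 2.90, 3.10, 3.30, 3.50, 3.70]  # monthly forward curve, $/MMBtu

fixed = 2.60   # flat price offered for every month of the deal

# Relative to the forward curve, the local trader gains in the early months
# (fixed > forward) and gets crushed in the later ones (fixed < forward):
monthly_pnl = [fixed - f for f in forwards]
print([round(p, 2) for p in monthly_pnl])

# Over the whole strip the local comes out behind, because the flat price
# sits below the average of the curve the desk could hedge against:
print(round(sum(monthly_pnl), 2))                       # -> -1.2
print(round(fixed - sum(forwards) / len(forwards), 2))  # -> -0.1 per month
```

In this sketch, the desk that can lock in the whole forward strip pockets the spread no matter what prices do; the side that only sees the familiar early months never had a chance.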

My clients’ brains were working against them in two ways. First, availability bias caused them to value the information most familiar to them and dismiss other data. Second, confirmation bias made them look for information that would confirm their instincts. This, of course, isn’t at all unusual. It takes real effort to avoid believing the things we think.

Becoming a Square-Peg Business in a Round-Hole World

When I was researching my book, Mapping Innovation, I spoke to every great innovator I could find. Some were world class scientists, others were top executives at major corporations and still others were incredibly successful entrepreneurs. Each one shared with me how they were able to achieve incredible things.

What I found most interesting was that the story was different every time. For every innovator who told me that a particular approach was the secret to their success, I found someone else who was equally successful doing things completely differently. The fact is that there is no one “true path” to innovation; everybody does it in different ways.

Yet few organizations acknowledge that in any kind of serious way. Rather, they have a “way we do things around here,” and there are often significant institutional penalties for anyone who wants to do things differently. Usually these penalties are informal and unspoken, but they are very real and can threaten to derail even the most promising career.

You can see how the same cognitive biases that lost my gas trader friends money are at work here. In a profitable company, the most available information suggests things are being done the “right” way, and everybody who wants to get ahead in the organization is heavily incentivized to embrace evidence that supports that notion and disregard contrary data.

That’s how organizations get disrupted. They stick to what’s worked for them in the past and fail to notice that the nature of the problems they need to solve has fundamentally changed. They become better and better at things that people care about less and less. Before they realize what happened, they become square-peg businesses in a round-hole world.

Silicon Valley Jumps the Shark

Nobody can deny the incredible success that Silicon Valley has had over the past few decades. Still mostly a backwater in the 1970s and 80s, by the end of 2020 four out of the ten most valuable companies in the world came from the Bay Area (not including Microsoft and Amazon, which are based in Seattle). No other region has ever dominated so thoroughly.

Yet lately Silicon Valley’s model of venture-funded entrepreneurship seems to have jumped the shark. From massive fraud at Theranos and out-of-control founders at WeWork and Uber to, most recently, the incredible blow-up at Quibi, there is increasing evidence that the tech world’s “unicorn culture” is beginning to have a negative impact on the real economy.

One clue to where things went wrong can be found in Eric Ries’s book, The Startup Way. Ries, whose earlier effort, The Lean Startup, was a runaway bestseller, was invited to implement his methods at General Electric and transform the 124-year-old company into a startup. Much like with the “unicorns,” it didn’t end well.

The fundamental fallacy of Silicon Valley is that a model that was developed for a relatively narrow set of businesses—essentially software and consumer electronics—could be applied to solve any problem. The truth is that, much like the industrial era before it, the digital era will soon end. We need to let go of old ways and set out in new directions.

Unfortunately, because our brains are wired for availability bias and confirmation bias, that’s a whole lot easier said than done.

Breaking Out of the Container of Your Own Experience

In 1997, when I was still in my twenties, I took a job in Warsaw, Poland to work in the nascent media industry that was developing there. I had experience working in media in New York, so I was excited to share what I’d learned and was confident that my knowledge and expertise would be well received.

It wasn’t. Whenever I began to explain how a media business was supposed to work, people would ask me, “Why?” That forced me to think about it and, when I did, I began to realize that many of the principles I had taken for granted were merely conventions. Things didn’t need to work that way and could be done differently.

I also began to realize that, working for a large corporation in the US, I had been trained to work within a system, to play a specific part in a greater whole. When a problem came up that was outside my purview, I went to someone down the hall who played another part. Yet in post-Communist Poland, there was no system and no one down the hall.

So I had to learn a new outlook and a new set of skills and I consider myself lucky to have had that experience. When you are forced to explore the unknown, you end up finding valuable things that you didn’t even know to look for and begin to realize that many perspectives can be brought to bear on similar problems with similar fact patterns.

Learning How to Not Fool Yourself

In one of my favorite essays, originally given as a speech, the great physicist Richard Feynman said, “The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that.” He goes on to say that simply being honest isn’t enough; you also need to “bend over backwards” to provide information so that others may prove you wrong.

So, the first step is to be hyper-vigilant and aware that your brain has a tendency to fool you. It will quickly seize on the most readily available data and detect patterns that may or may not be there. Then it will seek out other evidence that confirms those initial hunches while disregarding contrary evidence.

Yet checking ourselves in this way isn’t nearly enough; we need to actively seek out and encourage dissent. Some of this can be done with formal processes such as pre-mortems and red teams, but a lot of it is cultural: hiring for diversity and running meetings in a way that encourages discussion by, for instance, having the most senior leaders speak last.

Perhaps most of all, we need to have a sense of humility. It’s far too easy to be impressed with ourselves and far too difficult to see how we’re being led astray. There is often a negative correlation between our level of certainty and the likelihood of us being wrong. We all need to make an effort to believe less of what we think.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Five Immutable Laws of Change

GUEST POST from Greg Satell

When I first arrived in Poland in 1997, change was all around me. It was like watching a society transform itself through time-lapse photography. Everywhere you looked, the country was shaking off decades of post-communist rust and striving to make good on the promise of 1989’s historic Round Table Agreement.

Yet it wasn’t until the fall of 2004 that I truly understood the power of change. By then, I was living in Kyiv, Ukraine and the entire country erupted in protests now known as the Orange Revolution. While Warsaw in the 90s was like rebuilding after a tornado hit, Ukraine was like being in the eye of the storm itself.

That experience led to a 15-year-long journey of discovery and my book Cascades. What I found was that throughout history many have sought to create change and most have failed, but a few succeeded brilliantly. Starting out with very different challenges, philosophies and personalities, they eventually all arrived at the same principles that allowed them to prevail.

Law #1: The Status Quo Has Inertia On Its Side And Never Yields Gracefully

We tend to overvalue ideas. We think that if we have a good idea, people will immediately see its worth. Yet that’s hardly ever the case. As computer pioneer Howard Aiken put it, “Don’t worry about people stealing your ideas. If your ideas are any good, you’ll have to ram them down people’s throats.”

Consider the case of Ignaz Semmelweis, who first came up with the idea that medical staff in hospitals should wash their hands before operating on patients. You would think that would be an obviously good idea. Nevertheless, he was ostracized for it and ended up dying in an insane asylum, ironically from an infection he contracted under care.

Semmelweis’s plight was tragic, but it is also so amazingly common that the tendency of establishments to reject new ideas is referred to as the Semmelweis effect. In fact, while researching my book Mapping Innovation, I interviewed dozens of successful innovators and found that every single one had to overcome stiff resistance to transform their idea into something useful.

The fact that you will face opposition when protesting an authoritarian regime is obvious, but an organizational environment can be just as cutthroat. Make no mistake. If your idea is important and has real potential for impact, there will be some who will hate it and they will work to undermine it in ways that are dishonest, underhanded and deceptive.

That must be your primary design constraint.

Law #2: Small Groups, Loosely Connected, But United By Shared Purpose Drive Transformational Change

For decades, change consultants have been telling us that if we want to drive transformation, we should “start with a bang” and create a “sense of urgency” through a big communication campaign. The results have been atrocious. In fact, McKinsey has found that nearly three quarters of organizational transformations do not succeed.

It’s not hard to understand why. If there are people who are determined to see your change fail—and every significant change encounters resistance—then a “rally the troops” type of approach will only serve to alert those who oppose change that they had better get started undermining it, or it might actually happen.

Fortunately, science points to another way. The truth is that small groups, loosely connected, but united by a shared purpose drive transformational change. So instead of trying to convince everybody at once, identify those who are already enthusiastic about your idea, who want it to work as much as you do. Those are people you can empower to succeed and can help bring in others, who can bring in others still.

Yet identifying advocates is only part of the battle. You also need to imbue the effort with purpose and give it meaning. Unfortunately, all too often the quest for purpose is treated as a communication exercise. It isn’t. For change to be meaningful it has to actually solve a problem that people care about.

Law #3: Revolutions Begin With a Cause, Not A Slogan

Every change effort starts with a grievance. There’s something that people don’t like and they want it to be different. In a social or political movement that may be a corrupt leader or a glaring injustice. In an organizational context it’s usually something like falling sales, unhappy customers, low employee morale or technological disruption.

Whatever the case may be, the first step toward bringing change about is understanding that getting mired in grievance won’t get you anywhere. You can’t just complain about things you don’t like, but must come up with an affirmative vision for how you would want things to be.

The best place to start is by asking yourself, “If I had the power to change anything, what would it look like?” Martin Luther King Jr.’s vision for the civil rights movement was for a Beloved Community. Bill Gates’s vision for Microsoft was for a “computer on every desk and in every home.” A good vision should be aspirational, but not completely out of reach.

One of the things I found in my research is that successful change leaders don’t try to move from grievance to vision in one step. Rather, they bridge the gap by identifying a Keystone Change: one that focuses on a clear and tangible goal, includes multiple stakeholders and paves the way for future change.

For King, the Keystone Change was voting rights. For Gates it was an easy-to-use operating system. For your vision, it will undoubtedly be something different. The salient point here is that every successful transformation I found started out with a Keystone Change, so that’s where you will want to start as well.

Law #4: Design Tactics That Mobilize People to Influence Institutions

Organizational change consultants often recommend that changemakers prepare a stakeholder map. This isn’t necessarily a bad idea, but it is somewhat inadequate because it fails to distinguish between different kinds of stakeholders. Some stakeholders are targets for mobilization and others are targets for influence.

For example, both parents and school boards are important stakeholders in education, but for very different reasons. School boards wield institutional power that can effect change; parents do not. So we mobilize parents to influence school boards, not the other way around. We need to approach constituencies and institutions in very different ways.

One of the things we’ve consistently found in our work helping organizations to drive transformational change is that leaders construe stakeholders far too narrowly. Fortunately, decades of non-violent activism have given us powerful tools for both: the Spectrum of Allies for constituencies and the Pillars of Support for institutions.

A crucial point to remember is that you can’t dictate change by mandate. You can’t overpower but must instead attract people and empower them so that they can take ownership of the cause and make it their own. You need to accept that people will do things for their own reasons, not for yours.

Most of all, remember that every action has to have a clear purpose and be directed at influencing specific institutions. So before taking any action, ask two questions: Who are we mobilizing and to influence what?

Law #5: Every Revolution Inspires Its Own Counter-Revolution

In the aftermath of the Orange Revolution we thought we had won. After all, we had stood up to the injustice of a falsified election and prevailed. Unfortunately, it didn’t turn out that way. Five years later, Viktor Yanukovych, the same man whom we had taken to the streets to keep out of office, rose to power in an election that international observers deemed free and fair. His corrupt and incompetent rule would trigger a second Ukrainian Revolution.

We find a similar pattern with many of the executives we work with. They work for months—and sometimes years—to get a project off the ground. Yet just when they think they’re turning the corner, when they’ve won executive sponsorship, signed up key partners and procured enough financing to have a realistic budget, all of a sudden things seem to get bogged down.

That’s no accident. Just because you’ve won a few early battles doesn’t mean opposition to your idea has melted away. On the contrary, faced with the fact that change may actually succeed, those who oppose it have probably just begun to redouble their efforts to undermine it. These efforts are often not overt, but they are there and can easily derail an initiative.

That’s why every change effort must learn how to survive victory. The truth is that change is always a journey, never a particular destination, which is why lasting change is always built on common ground. That doesn’t mean that you need to win over your fiercest critics, but it does mean you need to try to empathize with their perspective.

There is a reason why some change leaders succeed while others fail. At some point everybody needs to decide whether they would rather make a point or make a difference and, in the end, those that prevail choose the latter.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


The Era of Moving Fast and Breaking Things is Over

GUEST POST from Greg Satell

On July 16th, 1945, when the world’s first nuclear explosion shook the plains of New Mexico, the leader of the Manhattan Project, J. Robert Oppenheimer, quoted the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.” Clearly, he was troubled by what he had unleashed, and for good reason. The world was never truly the same after that.

Today, however, we have lost much of that reverence for the power of technology. Instead of proceeding deliberately and with caution, tech entrepreneurs have prided themselves on their willingness to “move fast and break things” and, almost reflexively, casually deride anyone who questions the practice as those who “don’t get it.”

It’s hard to see how, by any tangible metric, any of this has made us better off. We set out to disrupt industries, but disrupted people instead. It wasn’t always like this. Throughout our history we have asked hard questions and made good choices about technological progress. As we enter a new era of innovation, we desperately need to recapture some of that wisdom.

How We Put the Nuclear Genie Back in the Bottle

The story of nuclear weapons didn’t start with Oppenheimer, not by a long shot. In fact, if we were going to attribute the Manhattan Project to a single person, it would probably be a Hungarian immigrant physicist named Leo Szilard, who was one of the first to conceive of the possibility of a nuclear chain reaction.

In 1939, upon hearing of the discovery of nuclear fission in Germany, he, along with fellow Hungarian émigré Eugene Wigner, decided that the authorities needed to be warned. Szilard then composed a letter warning of the possibility of a nuclear bomb that was eventually signed by Albert Einstein and sent to President Roosevelt. That’s what led to the American development program.

Yet after the explosions at Hiroshima and Nagasaki, many of the scientists who worked to develop the bomb wanted to educate the public about its dangers. In 1955, the philosopher Bertrand Russell issued a manifesto signed by a number of scientific luminaries. Based on this, a series of conferences was convened at Pugwash, Nova Scotia to discuss different approaches to protecting the world from weapons of mass destruction.

These efforts involved far more than talk; they helped shape the non-proliferation agenda and led to concrete achievements such as the Partial Test Ban Treaty. In fact, these contributions were so crucial that the organizers of the Pugwash conferences were awarded the Nobel Peace Prize in 1995, and the conferences continue even today.

Putting Limits On What We Do With the Code of Life

While the nuclear age started with a bang, the genetic age began with a simple article in the scientific journal Nature, written by two relatively unknown scientists named James Watson and Francis Crick, that described the structure of DNA. It was one of those few watershed moments when an entirely new branch of science arose from a single event.

The field progressed quickly and, roughly 20 years later, a brilliant researcher named Paul Berg discovered that you could merge human DNA with that from other living things, creating new genetic material that didn’t exist in nature. Much like Oppenheimer, Berg understood that, due to his work, humanity stood on a precipice and it wasn’t quite clear where the edge was.

He organized a conference at Asilomar State Beach in California to establish guidelines. Importantly, participation wasn’t limited to scientists. A wide swath of stakeholders were invited, including public officials, members of the media and ethical specialists. The result, now known as the Berg Letter, called for a moratorium on the riskiest experiments until the dangers were better understood. These norms were respected for decades.

Today, we’re undergoing another revolution in genomics and synthetic biology. New technologies, such as CRISPR and mRNA techniques, have opened up incredible possibilities, but also serious dangers. Yet here again, pioneers in the field like Jennifer Doudna are taking the lead in devising sensible guardrails and using the technology responsibly.

The New Economy Meets the New Era of Innovation

When Netscape went public in 1995, it hit like a bombshell. It was the first big Internet stock and, although originally priced at $14 per share, it opened at double that amount and quickly zoomed to $75. By the end of the day, it had settled back at $58.25. Still, a tiny enterprise with no profits was almost instantly worth $2.9 billion.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet today, it’s clear that the “new economy” was a mirage. Despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

The digital revolution has been a real disappointment. In fact, when you look at outcomes, if anything we’re worse off. Rather than a democratized economy, market concentration has markedly increased in most industries. Income inequality in advanced economies has soared. In America, wages have stagnated and social mobility has declined for decades. At the same time, social media has been destroying our mental health.

Now we’re entering a new era of innovation in which we will unleash technologies that are much more powerful. New computing architectures like quantum and neuromorphic technologies will power fields like synthetic biology and materials science to create things that would have seemed like science fiction a generation ago. We simply can no longer afford to be so reckless.

Shifting From Agility Toward Resilience

Moving fast and breaking things only seems like a good idea in a stable world. When you operate in a safe environment, it’s okay to take a little risk and see what happens. Clearly, we no longer live in such a world (if we ever did). Taking on more risk in financial markets led to the Great Recession. Being blasé about data security has nearly destroyed our democracy. Failure to prepare for a pandemic has nearly brought modern society to its knees.

Over the next decade, the dangers will only increase. We will undergo four major shifts in technology, resources, migration and demographics. To put that in perspective, a similar shift in demography was enough to make the 60s a tumultuous decade. We haven’t seen a confluence of so many disruptive forces since the 1920s and that didn’t end well.

Unfortunately, it’s far too easy to underinvest in mitigating the risk of a danger that may never come to fruition. Moving fast and breaking things can seem attractive because the costs are often diffuse. Although it has impoverished society as a whole and made us worse off in many ways, it has created a small cadre of fabulously wealthy plutocrats.

Yet history is not destiny. We have the power to shape our path by making better choices. We can abandon the cult of disruption and begin to invest in resilience. In fact, we have to. By this point there should be no doubt that the dangers are real. The only question is whether we will act now or simply wait for it to happen and accept the consequences.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


New Skills Needed for a New Era of Innovation

GUEST POST from Greg Satell

The late Clayton Christensen had a theory about “jobs to be done.” In his view, customers don’t buy products as much as they “hire” companies to do specific “jobs” for them. To be competitive, firms need to understand what that job is and how to do it well. In other words, no one wants a quarter-inch drill bit, they want a quarter-inch hole.

The same can be said for an entire society. We need certain jobs to be done and will pay handsomely for ones that we hold in high regard, even as we devalue others. Just as being the best blacksmith in town won’t earn you much of a living today, great coding skills wouldn’t do you much good in a medieval village.

This is especially important to keep in mind today as the digital revolution comes to an end and we enter a new era of innovation in which some tasks will be devalued and others will be increasingly in demand. Much like Christensen said about firms, we as a society need to learn to anticipate which skills will lose value in future years and which will be considered critical.

The Evolution of Economies

The first consumer product was most likely the Acheulean hand axe, invented by some enterprising stone age entrepreneur over 100,000 years ago. Evidence suggests that, for the most part, people made stone axes themselves, but as technology evolved, some began to specialize in different crafts, such as smithing, weaving, cobbling and so on.

Inventions like the steam engine, and then later electricity and the internal combustion engine, brought about the industrial revolution, which largely put craftsmen out of work and reshaped society around cities that could support factories. It also required new skills to organize work, leading to the profession of management and the knowledge economy.

The inventions of the microchip and the internet have led to an information economy in which even a teenager with a smartphone has better access to knowledge than a specialist working in a major institution a generation ago. Much like the industrial era automated physical tasks, the digital era has automated many cognitive tasks.

Now, as the digital era ends, we are entering a new era of innovation in which we will shift to post-digital computing architectures, such as quantum computing and neuromorphic chips, and enormous value will be created through bits powering atoms in fields like synthetic biology and materials science.

Innovation, Jobs and Wages

As economies evolved, some tasks became devalued as others increased in importance. When people could go to a smith for metal tools, they had no need to create stone axes. In much the same way, the industrial revolution put craft guilds out of business and technologies like tractors and combine harvesters drastically reduced the number of people working on farms.

Clearly replacing human labor with technology is disruptive, but it has historically led to dramatic increases in productivity. So labor displacement effects have been outweighed by greater wages and new jobs created by new industries. For the most part, innovation has made all of us better off, even, to a great extent, the workers who were displaced.

Consider the case of Henry Ford. Because technology replaced many tasks on the family farm, he didn’t need to work on it and found a job as an engineer for Thomas Edison, where he earned enough money and had enough leisure time to tinker with engines. That led him to create his own company, pioneer an industry and create good jobs for many others.

Unfortunately, there is increasing evidence that more recent innovations may not be producing comparable amounts of productivity and that’s causing problems. For example, when a company replaces a customer service agent with an automated system, it’s highly doubtful that the productivity gains will be enough to finance entire new industries that will train that call center employee to, say, design websites or run marketing campaigns.

Identifying New Jobs To Be Done

To understand the disconnect between technological innovation and productivity it’s helpful to look at some underlying economic data. In US manufacturing, for instance, productivity has skyrocketed, roughly doubling output in the 30 years between 1987 and 2017, even as employment in the sector decreased by roughly a third.
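As a back-of-the-envelope check (my arithmetic, not a figure from the article), those two numbers imply that output per manufacturing worker roughly tripled over the period:

```python
# Rough arithmetic implied by the figures above, 1987 -> 2017.
output_ratio = 2.0          # output roughly doubled
employment_ratio = 2 / 3    # employment fell by roughly a third

# Output per worker: doubled output produced by two-thirds of the workers.
print(round(output_ratio / employment_ratio, 1))   # -> 3.0x per worker
```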

It is the increased productivity growth in manufacturing that has fueled employment growth in the service sector. However, productivity gains in service jobs have been relatively meager and automation through technological innovation has not resulted in higher wages, but greater income inequality as returns to capital dwarf returns to labor.

Further economic analysis shows that the divide isn’t so much between “white collar” and “blue collar” jobs, but between routine and non-routine tasks. So warehouse workers and retail clerks have suffered, but designers and wedding planners have fared much better. In other words, technological automation is creating major shifts in the “jobs to be done.”

A recent analysis by the McKinsey Global Institute bears this out. It identified 56 “foundational skills” that are crucial to the future of work, yet don’t fall into traditional categories such as “engineering” or “sales,” but are rather things like self-awareness, emotional intelligence and critical thinking.

Collaboration Is The New Competitive Advantage

The industrial revolution drove a shift from animal power to machine power and from physical skills to cognitive skills. What we’re seeing now is a similar shift from cognitive skills to social skills. As automation takes over many routine cognitive tasks, the “job” that humans are increasingly valued for is relating to other humans.

There are some things a machine will never do. An algorithm will never strike out at a Little League game, see its child born or have a bad day at work. We can, of course, train computers to mimic these things by training them on data, but they will never actually have the experience and that limits their ability to fully relate to human emotions.

To see how this is likely to play out, simply go and visit your local Apple Store. It is a highly automated operation, without traditional checkout aisles or cash registers. Still, the first thing that catches your eye is a sea of blue shirts waiting to help you. They are not there to execute transactions, which you can easily do online, but to engage with you, understand what you’re trying to achieve and help you get it done.

We’ve seen similar trends at work even in highly technical fields. A study of 19.9 million scientific papers found that not only has the percentage of papers published by teams steadily increased over the past 50 years, but the size of those teams has also grown and their research is more highly cited. The journal Nature reported similar results and also found that the work being done is far more interdisciplinary and conducted at greater distances.

What’s becoming clear is that collaboration is increasingly becoming a competitive advantage. The ultimate skill is no longer knowledge or proficiency in a particular domain, but the ability to build a shared purpose with others who possess a diverse set of skills and perspectives, in order to solve complex problems. In other words, the most important jobs are the ones we do in the service of a common objective.

— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash


Parallels Between the 1920s and Today Are Frightening

GUEST POST from Greg Satell

It should be clear by now that we are entering a pivotal era. We are currently undergoing four profound shifts: changing patterns of demographics, migration, resources and technology. The stress lines are already beginning to show, with increasing tensions over race and class, as well as questions about the influence technology and institutions have over our lives.

The last time we faced anything like this kind of tumult was in the 1960s, which, much like today, saw the emergence of a new generation, the Baby Boomers, with very different values than their predecessors. Their activism achieved significant advances for women and minorities, but also, at times, led to tumult and riots.

Yet the changes we are undergoing today appear to be even more significant than what we experienced then. In fact, you would have to go back to the 1920s to find an era with as much potential for both prosperity and ruin. Unfortunately, that era led to economic upheaval, genocide and war on a scale never before seen in world history. We need to do better this time around.

Panics, Pandemics and War

A Wall Street crisis that threatened the broader economy and led to sweeping legislation reshaping government influence in the financial sector was a prelude to both the 1920s and the 2020s. The Bankers’ Panic of 1907 and the Great Recession that began in 2007 each resulted in landmark legislation: the Federal Reserve Act and Dodd-Frank, respectively.

Continuing in the same vein of eerie parallels, the 1918 flu pandemic killed between 20 million and 50 million people and raged for more than two years, until 1920, when it was finally brought under control. Much like today, there were social distancing guidelines, significant economic impacts and long-term effects on educational attainment.

Perhaps not surprisingly, there was no small amount of controversy about measures taken to control the pandemic a century ago. People were frustrated with isolation (it goes without saying that there was no Netflix in 1918). Organizations like the Anti-Mask League of San Francisco rose up in defiance.

The years leading up to the 1920s were also war-torn, with World War I ravaging Europe and the colonial order increasingly coming under pressure. Much like the “War on Terrorism” today, the organized violence, combined with the panics and pandemics, made for an overall feeling that society was unravelling, and many began to look for a scapegoat.

Migration, Globalization and Nativism

In 1892, Ellis Island opened its doors and America became a beacon to those around the world looking for a better life. New immigrants poured in and, by 1910, almost 15% of the US population were immigrants. As the 1920s approached, the strains in society were becoming steadily more obvious and more visceral.

The differences among the newcomers aroused suspicion, perhaps best exemplified by the Sacco and Vanzetti trial, in which two apparently innocent immigrants were convicted and executed for murder. Many believed that the new arrivals brought disease, criminality and “un-American” political and religious beliefs, especially with regard to Bolshevism.

Fears began to manifest themselves in growing nativism, and there were increasing calls to limit immigration. The Immigration Act of 1917 specifically targeted Asians and established a literacy test for new arrivals. The Immigration Act of 1924 established quotas that favored Northern and Western Europeans over those from Southern and Eastern Europe, as well as Jews. The film Birth of a Nation led to a resurgence of the Ku Klux Klan.

Scholars see many parallels between the run-up to the 1920s and today. Although nativism these days is primarily directed against Muslims and immigrants from South America, the same accusations of un-American political and religious beliefs, as well as outright criminality, are spurring on a resurgence of hate groups like the Proud Boys. Attorney General Merrick Garland has pledged to make prosecuting white supremacists a top priority.

A New Era of Innovation

As Robert Gordon explained in The Rise and Fall of American Growth, prosperity in the 20th century was largely driven by two technologies: electricity and the internal combustion engine. Neither was linear or obvious. Both were first invented in the 1880s but didn’t really begin to scale until the 1920s.

That’s not uncommon. In fact, it takes decades for a new discovery to make a measurable impact on the world. That’s how long it takes to first identify a useful application for a technology and then for ecosystems to form and secondary technologies to arise. Electricity and internal combustion would ignite a productivity boom that would last 50 years, from roughly 1920 until 1970.

For example, as economist Paul David explained in a highly cited paper, it was not through the light bulb, but by allowing managers to rearrange work in factories, that electricity first had a significant effect on society. Yet it was in the 1920s that things really began to take off. Refrigerated rail cars transformed diets and labor-saving appliances such as the vacuum cleaner would eventually pave the way for women in the workforce. The first radio stations appeared, revolutionizing entertainment.

Today, although the digital revolution itself has largely been a disappointment, there’s considerable evidence that we may be entering a new era of innovation as the emphasis shifts from bits to atoms. New computing architectures, such as quantum and neuromorphic computing, as well as synthetic biology and materials science, may help to reshape the economy for decades to come.

A Return to Normalcy?

Not surprisingly, by 1920 the American people were exhausted. Technological change, cultural disruption brought about by decades of mass immigration, economic instability and war made people yearn for calmer, gentler times. Warren G. Harding’s presidential campaign touted “a return to normalcy” and people bought in.

Yet while the “Roaring Twenties” are remembered as a golden age, they sowed the seeds of what came later. Although the stock market boomed, lack of regulation led to the crash of 1929 and the Great Depression. The harsh reparations imposed by the Treaty of Versailles made the rise of Hitler possible.

The 1930s brought almost unimaginable horror. Economic hardship in Europe paved the way for fascism. Failed collectivization in the Soviet Union led to massive famine and, later, Stalin’s great purges. Rising nativism, in the US and around the world, led to diminished trade as well as violence against Jews and other minorities. World War II was almost inevitable.

It would be foolish beyond belief to deny the potential of history repeating itself. Still, the past is not necessarily prologue. The 1930s were not the inevitable result of impersonal historical forces, but of choices consciously made. We could have made different ones and received the bounty of the prosperity that followed World War II without the calamity that preceded it.

What we have to come to terms with is that technology won’t save us. Markets won’t save us. Our future will be the product of the choices we make. We should endeavor to choose wisely.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Four Innovation Ecosystem Building Blocks

GUEST POST from Greg Satell

It’s hard to find anyone who wouldn’t agree that Microsoft’s 2001 antitrust case was a disaster for the company. Not only did it lose the case, but it wasted time, money and—perhaps most importantly—focus on its existing businesses, which could have been far better deployed on new technologies like search and mobile.

Today, Microsoft is a much different organization. Rather than considering open source software a cancer, it now says it loves Linux. Its cloud business is growing like wildfire and it is partnering widely to develop new quantum computers. What was previously a rapacious monopolist is now an enthusiastic collaborator.

That’s no accident. Today, we need to compete in an ecosystem-driven world in which nobody, not even a firm as big and powerful as Microsoft, can go it alone. Power no longer comes from the top of value chains, but emanates from the center of networks. That means that strategy needs to shift from dominating industries to building collaborative ecosystems.

1. Connect to Startups

In its heyday, Microsoft enthusiastically followed Michael Porter’s five forces model. It saw threats coming not only from direct competitors, but also suppliers, customers, substitute products and new market entrants. Startups, in particular, were targeted for either acquisition or destruction if they were seen as posing a potential threat.

Today, however, Microsoft actively supports startups. Take, for example, its quantum development effort, in which it is partnering with more than a dozen entrepreneurial companies. These firms also get free access to Microsoft technologies, such as its Azure cloud platform and go-to-market resources and advice, through its Microsoft for Startups program.

Another approach that many firms take is a corporate VC program that actively invests in promising new companies. Unlike a typical investor, corporations bring a wealth of market and technical expertise and can help with things like distribution, supply chain management and marketing. Corporations, for their part, get far more insight into new technologies than they could as a pure operating company.

Scott Lenet, President of Touchdown Ventures, which operates venture funds for corporations, told me that, “Startups thrive on new ideas and big firms know how to scale and improve those ideas. We’ve seen some of our investments really blossom based on that kind of partnership.”

2. Form Ties to the Academic World

When Sun Microsystems co-founder Bill Joy said, “no matter who you are, most of the smartest people work for someone else,” he was explicitly referring to Bill Gates’s assertion that Microsoft was an “IQ monopolist.” Joy’s position was that “It’s better to create an ecology that gets all the world’s smartest people toiling in your garden for your goals. If you rely solely on your own employees, you’ll never solve all your customers’ needs.”

Make no mistake. Innovation is never a single event. It is a process of discovery, engineering and transformation and those three things almost never happen in the same place or at the same time. That’s why the most innovative companies work hard to build links to the best minds in the academic world.

Today Microsoft has an extensive academic program that extends grants to graduate students and faculty members that are pursuing research that is of interest to the company. Google takes it even a step further, inviting dozens of the world’s top minds to work alongside its scientists and engineers for a sabbatical year.

Microsoft and Google are, of course, firms with enormous resources. However, just about any business can, for example, support the work of a young graduate student or postdoc at a local university. For even a senior researcher to collaborate with your staff is rarely prohibitively expensive. Researchers care far more about genuine support of their work than the size of your investment.

3. Leverage Domain-Specific Consortia

By the mid-1980s, the American semiconductor industry seemed doomed. To respond to what it saw as a national security threat, the American government created SEMATECH in 1986, a consortium of government agencies, research institutions and private firms focused on making the industry more competitive. By the mid-1990s, the US was once again dominating semiconductors.

Any significantly complex technology takes years—and often decades—to develop before it becomes mature enough to engineer into a marketable product. So there is great potential in collaborating, even with competitive firms, in the pre-competitive phase to figure out the basic principles of a nascent technology.

For example, Boeing and Airbus are arch-rivals in aviation, much like DowDuPont and BASF are in chemicals. Yet all of these companies, along with many others, collaborate at places like the Composites Institute (IACMI). They do this not out of any altruism, of course, but out of self-interest, because it is at places like the Composites Institute that they can collaborate with academic scientists, National Labs and startups working in the space.

As technology becomes more complex, domain specific consortia are becoming essential to any ecosystem strategy. The Composites Institute is just one node in the network of Manufacturing Institutes set up under the Obama Administration to support this type of collaboration. In areas ranging from advanced fabrics and biofabrication to additive manufacturing and wide-gap semiconductors, firms large and small are working with scientists to uncover new principles.

And the Manufacturing Institutes are just the start. The Internet of Things Consortium is helping bring computation to the physical world, while the Partnership on AI focuses on artificial intelligence and the Joint Center for Energy Storage Research is helping to develop advanced battery technology. All are open to the largest multinationals and the smallest startups.

4. Move From Hierarchies to Networks

Back in the 90s, when Microsoft still dominated the tech world, markets were still based on linear value chains dominated by one or two industry giants. Yet as I explain in Cascades, we are quickly moving from a world of hierarchies, to one dominated by networks and ecosystems. That changes how we need to develop and grow.

In a hierarchy-driven world, the optimal strategy was to build walls and moats to protect yourself against would-be invaders, which is why Microsoft fought tooth and nail to protect its operating system monopoly. Today, however, industry lines have blurred and technology moves too fast to be able to build effective barriers against disruption.

That’s why today “Microsoft loves Linux”, why it developed an academic program to collaborate with scientists at universities and why it often partners with startups instead of always trying to crush them. The technology being developed today is simply too complex for anyone to go it alone, which is why the only viable strategy is to actively connect to ecosystems of talent, technology and information.

Power today no longer sits at the top of hierarchies, but emanates from the center of ecosystems and you move to the center by widening and deepening connections. Closing yourself by erecting barriers will not protect you. In fact, it is an almost sure-fire way to hasten your demise.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Why Change Must Be Built on Common Ground

GUEST POST from Greg Satell

When Steve Jobs returned to Apple in 1997, one of the first things he did was develop a marketing campaign to rebrand the ailing enterprise. Leveraging IBM’s long-running “Think” campaign, Apple urged its customers to “Think Different.” The TV spots began, “Here’s to the crazy ones, the misfits, the rebels, the troublemakers, the round pegs in the square holes…”

Yet Jobs's actual product strategy did exactly the opposite. While other technology companies jammed as many features into their products as they could to impress the techies and the digerati, Jobs focused on making his products so ridiculously easy to use that they were accessible to everyone. Apple became the brand people would buy for their mothers.

The truth is that while people like the idea of being different, real change is always built on common ground. Differentiation builds devotion among adherents, but to bring new people in, you need to make an idea accessible, and that means focusing on the values you share with outsiders rather than on those that stir the passions of insiders. That's how you win.

Overcoming the Desire to Be Different

Apple’s ad campaign was effective because we are tribal in nature. Setting your idea apart is a great way to unlock tribal fervor among devotees, but it also sends a strong signal to others that they don’t belong. For example, for decades LGBTQ activists celebrated their difference with “Gay Pride,” which made gay people feel better, but didn’t resonate with others.

It’s not much different in the corporate world. Those who want to promote Agile development love to tout the Agile Manifesto and its customer focused ethos. It’s what they love about the Agile methodology. Yet for those outside the Agile community, it can seem more than a bit weird. They don’t want to join a cult, they just want to get their job done.

So, the first step to driving change forward is to make the shift from differentiating values, which make ardent fans passionate about an idea, to shared values, which invite people in. That doesn’t mean you’re abandoning your core values any more than making products accessible meant that Apple had to skimp on capability. But it does create an entry point.

This is a surprisingly hard shift to make, but you won’t be able to move forward until you do.

Identifying and Leveraging Your Opposition

Make no mistake. Change fails because people want it to fail. Any change that is important, that has the potential for real impact, will inspire fierce resistance. Some people will simply hate the idea and will try to undermine your efforts in ways that are dishonest, deceptive and underhanded. That is the chief design constraint of any significant change effort.

So, you’re going to want to identify your most active opposition because you want to know where the attacks are going to be coming from. However, you don’t want to directly engage with these people because it is unlikely to be an honest conversation. Most likely, it will devolve into something that just bogs you down and drains you emotionally.

However, you can listen. People who hate your idea are, in large part, trying to persuade many of the same people you are. Listening to which arguments they find effective helps you uncover shared values, and shared values hold the key to truly transformational change.

So, while your main focus should be on empowering those who are excited about change, you should pay attention to your most vocal opposition. In fact, with some effort, you can learn to love your haters. They can point out early flaws. And as you begin to gain traction, they will often lash out and overreach, undermining themselves and sending people your way.

Defining Shared Values

Your most active opposition, the people who hate your idea and want to undermine it, have essentially the same task that you do. They want to move people who are passive or neutral to support their position and will design their communication efforts to achieve that objective. If you listen carefully though, you can make their efforts work for you.

For example, when faced with President Woodrow Wilson's opposition to voting rights for women, Alice Paul's band of Silent Sentinels picketed the White House with phrases lifted from President Wilson's own book. How could he object, without appearing to be a tremendous hypocrite, to signs that read, "LIBERTY IS A FUNDAMENTAL DEMAND OF THE HUMAN SPIRIT"?

In a similar vein, those who opposed LGBTQ rights often did so on the basis of family values, and for decades it was a very effective strategy. That is, until LGBTQ activists used it against them. After all, shouldn't those of different sexual orientations be able to live in committed relationships and raise happy and healthy families? If you believe in the importance of family, how could you not support same-sex marriage?

The strategy works just as well in a corporate environment. In our Transformation & Change workshops, we ask executives what those who oppose their idea say about it. From there, we can usually identify the underlying shared value and then leverage it to make our case. Once you identify common ground, it’s much easier to move forward.

Surviving Victory

Steve Jobs, along with his co-founder Steve Wozniak, started Apple to make computers. But if that were all Apple ever did, it would never have become the world's most valuable company. What made Jobs the iconic figure he became had nothing to do with any one product; it was that he came to represent something more: the fusion of technology and design.

In his biography of Steve Jobs, Walter Isaacson noted that Jobs revolutionized six industries, ranging from music to animated movies, far afield from the computer industry. He was able to do that because he stayed focused on the core values of using technology and design to make products more accessible to ordinary people.

In other words, in every venture he undertook, he looked for common ground by asking himself, "How can we make this as easy as possible for those who are not comfortable with technology?" He didn't merely cater to the differences of his hard-core enthusiasts, but constantly looked to bring everybody else in.

Many companies have had hit products, but very few have had the continued success of Apple. In fact, success often breeds failure because it attracts new networks of competitors. Put another way, many entrepreneurs fail to survive victory because they focus on a particular product rather than the shared values that product was based on.

Jobs was different. He was passionate about his products, but his true calling was tapping into basic human desires. In other words, he understood that truly revolutionary change is always built on common ground.

— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash


We Were Wrong About What Drove the 21st Century

GUEST POST from Greg Satell

Every era contains multitudes. World War I gave way to the "Roaring 20s" and a 50-year boom in productivity. The Treaty of Versailles sowed the seeds of the Second World War, which in turn gave way to the peace and prosperity of the post-war era. Vietnam and the rise of the Baby Boomers unlocked a cultural revolution that created new freedoms for women and people of color.

Our current era began in the 80s, with the rise of Ronald Reagan and a new confidence in the power of markets. Genuine achievements of the Chicago School of economics led by Milton Friedman, along with the weakness of the Soviet system, led to an enthusiasm for market fundamentalism that came to dominate policy circles.

So it shouldn’t be that surprising that veteran Republican strategist Stuart Stevens wrote a book denouncing that orthodoxy as a lie. The truth is he has a point. But politicians can only convince us of things we already want to believe. The truth is that we were fundamentally mistaken in our understanding of how the world works. It’s time that we own up to it.

Mistake #1: The End Of The Cold War Would Strengthen Capitalism

When the Berlin Wall came down in 1989, the West was triumphant. Communism was shown to be a corrupt system bereft of any real legitimacy. A new ideology took hold, often called the Washington Consensus, that preached fiscal discipline, free trade, privatization and deregulation. The world was going to be remade in capitalism’s image.

Yet for anybody who was paying attention, communism had been shown to be bankrupt and illegitimate since the 1930s, when Stalin's failed collectivization effort and industrial plan led him to starve his own people. Economists have estimated that, by the 1970s, Soviet productivity growth had turned negative, meaning more investment actually brought less output. The system's collapse was just a matter of time.

At the same time, there were early signs of serious problems with the Washington Consensus. Many complained that bureaucrats at the World Bank and the IMF were mandating policies for developing nations that citizens in their own countries would not accept. So-called "austerity programs" exacted human costs that were both significant and real. In a sense, the error of the Soviets was being repeated: ideology was put before people.

Today, instead of a capitalist utopia and an era of peace and prosperity, we got a global rise in authoritarian populism, stagnant wages, slowing productivity growth and weaker competitive markets. In the United States in particular, capitalism has been weakened by almost every metric imaginable.

Mistake #2: Digital Technology Would Make Everything Better

In 1989, the same year that the Berlin Wall fell, Tim Berners-Lee proposed the World Wide Web and ushered in a new technological era of networked computing that we now know as the "digital revolution." Much like the market fundamentalism that took hold around the same time, technology was seen as the herald of a new, brighter age.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet by 2004, productivity growth had slowed again to its earlier lethargic pace. Today, despite very real advances in processing speed, broadband penetration, artificial intelligence and other technologies, we seem to be in the midst of a second productivity paradox, in which we see digital technology everywhere except in the economic statistics.

Digital technology was supposed to empower individuals and reduce the dominance of institutions, but just the opposite has happened. Income inequality in advanced economies has markedly increased. In America, wages have stagnated and social mobility has declined. At the same time, social media has been destroying our mental health.

When Silicon Valley told us they intended to “change the world,” is this what they meant?

Mistake #3: Medical Breakthroughs Would Automatically Make Us Healthier

Much like the fall of the Berlin Wall and the rise of the Internet, the completion of the Human Genome Project in 2003 promised great things. No longer would we be at the mercy of terrible diseases such as cancer and Alzheimer's; instead, we would design genetic therapies that would rewire our bodies to fight off disease by themselves.

The advances since then have been breathtaking. The Cancer Genome Atlas, which began in 2005, helped enable doctors to develop therapies targeted at specific mutations, rather than where in the body a tumor happened to be found. Later, CRISPR revolutionized synthetic biology, bringing down costs exponentially.

The rapid development of Covid-19 vaccines has shown how effective these new technologies are. Scientists essentially engineered vaccines that deliver just enough of the viral genome to produce a few proteins, enough to provoke an immune response but not nearly enough to make us sick. Twenty years ago, this would have been considered science fiction. Today, it's a reality.

Yet we are not healthier. Worldwide obesity has tripled since 1975 and has become an epidemic in the United States. Anxiety and depression have risen as well. American healthcare costs continue to climb even as life expectancy declines. Despite the incredible advances in our medical capability, we seem to be less healthy and more miserable.

Worse Than A Crime, It Was A Blunder

Whenever I bring up these points among technology people, they vigorously push back. Surely, they say, you can see the positive effects all around you. Can you imagine what the global pandemic would be like without digital technologies? Without videoconferencing? Hasn’t there been a significant global decline in extreme poverty and violence?

Yes. There have absolutely been real achievements. As someone who spent roughly half my adult life in Eastern Bloc countries, I can attest to how horrible the Soviet system was. Digital technology has certainly made our lives more convenient and, as noted above, medical advances have been very real and very significant.

However, technology is a process that involves both revealing and building. Yes, we revealed the power of market forces and the bankruptcy of the Soviet system, but failed to build a more prosperous and healthy society. In much the same way, we revealed the power of the microchip, miracle cures and many other things, but failed to put them to use in such a way that would make us measurably better off.

When faced with a failure this colossal, people often look for a villain. They want to blame the greed of corporations, the arrogance of Silicon Valley entrepreneurs or the incompetence of government bureaucrats. The truth is, as the old saying goes, it was worse than a crime, it was a blunder. We simply believed that market forces and technological advancement would work their magic and all would turn out well.

By now we should know better. We need to hold ourselves accountable, make better choices and seek out greater truths.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Not Everyone Can Transform Themselves

Here’s What Makes the Difference

GUEST POST from Greg Satell

The conservative columnist John Podhoretz recently took to the New York Post to denounce the plotline of Disney's new miniseries The Falcon and the Winter Soldier. In particular, he took umbrage at a subplot that invoked the Tuskegee experiments and other historical warts in a manner he termed "didactic anti-Americanism."

His point struck a chord with me because, in my many years living overseas, I always found that people in other countries were more than aware of America’s failures such as slavery, Jim Crow, foreign policy misadventures and so on. What they admire is our ability to take a hard look at ourselves and change course.

It also reminded me of something I’ve noticed in my work helping organizations transform themselves. Some are willing to take a hard look at themselves and make tough changes, while others are addicted to happy talk and try to wish problems away. Make no mistake. You can’t tackle the future without looking with clear eyes at how the present came into being.

A Pregnant Postcard

The genesis of shareholder capitalism and our modern outlook on how things are supposed to work can, in some sense, be traced back to Paris in 1900. It was there and then that an obscure graduate student named Louis Bachelier presented his thesis on speculation to a panel of judges including the great Henri Poincaré. It described the fluctuation of market prices as a random walk, a revolutionary, albeit unappreciated, idea at the time.

Unfortunately for Bachelier, his paper went mostly unnoticed and he vanished into obscurity. Then, in 1954, he was rediscovered by a statistician named Jimmie Savage, who sent a postcard to his friend, the eminent economist Paul Samuelson, asking “ever hear of this guy?” Samuelson hadn’t, but was intrigued.

In particular, Bachelier’s assertion that “the mathematical expectation of the speculator is zero,” was intriguing because it implied that market prices were essentially governed by bell curves that are, in many respects, predictable. If it were true, then markets could be tamed through statistical modeling and the economy could be managed much more effectively.

Samuelson, who was pioneering the field of mathematical finance at the time, thought the paper was brilliant and began to actively promote it. Later, Eugene Fama would build Bachelier’s initial work into a full-blown Efficient Market Hypothesis. It would unleash a flurry of new research into financial modeling and more than a few Nobel Prizes.

A Refusal to Reckon

By the 1960s, the revolution in mathematical finance began to gain steam. Much as had happened in physics earlier in the century, a constellation of new discoveries, such as efficient portfolios, the capital asset pricing model (CAPM) and, later, the Black-Scholes model for options pricing, created a "standard model" for thinking about economics and finance.
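
To give a flavor of what that "standard model" looked like, consider CAPM in its standard textbook form (the notation here is the usual one, quoted for illustration):

    \mathbb{E}[R_i] = R_f + \beta_i \left( \mathbb{E}[R_m] - R_f \right)

where R_f is the risk-free rate, R_m is the return of the market as a whole and beta_i measures how sensitive asset i is to market swings. An asset earns a higher expected return only by bearing more market risk.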

As the field gathered momentum, Samuelson's colleague at MIT, Paul Cootner, compiled the most promising papers into a 500-page tome, The Random Character of Stock Market Prices, which became an instant classic. The book became a basic reference for the new industries of financial engineering and risk management that were just beginning to emerge.

However, early warning signs were being ignored. Included in Cootner's book was a paper by Benoit Mandelbrot warning that something was seriously amiss. He showed, with very clear reasoning and analysis, that actual market data displayed far more volatility than the models predicted. In essence, he was pointing out that Samuelson and his colleagues were vastly underestimating risk in the financial system.
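
Mandelbrot's point is easy to see with a toy simulation. The sketch below is my illustration, not his analysis, and the parameters are arbitrary; it simply compares how often extreme "4-sigma" daily moves occur in a bell-curve world versus a heavy-tailed one of the kind he argued real markets resemble:

    import numpy as np

    # Toy comparison: extreme daily moves under a Gaussian model versus a
    # heavy-tailed (Student-t) stand-in for real market returns.
    rng = np.random.default_rng(42)
    n = 250 * 40  # roughly forty years of trading days

    gaussian = rng.standard_normal(n)     # the bell-curve world of the early models
    heavy = rng.standard_t(df=3, size=n)  # a fat-tailed alternative
    heavy = heavy / heavy.std()           # rescale to unit variance for a fair comparison

    threshold = 4.0  # a "4-sigma" day
    print("4-sigma days, Gaussian model:", int(np.sum(np.abs(gaussian) > threshold)))
    print("4-sigma days, heavy-tailed:  ", int(np.sum(np.abs(heavy) > threshold)))

In a run like this, the Gaussian model produces such a day about once in forty years, while the heavy-tailed series produces dozens. That gap between tidy theory and unruly data is exactly what Mandelbrot was warning about.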

In a response, Cootner wrote that Mandelbrot forced economists “to face up in a substantive way to those uncomfortable empirical observations that there is little doubt most of us have had to sweep under the carpet until now.” He then added, “but surely before consigning centuries of work to the ash pile, we should like to have some assurance that all of our work is truly useless.”

Think about that for a second. Another term for “empirical observations” is “facts in evidence,” and Cootner was admitting that these were being ignored! The train was leaving the station and everybody had to either get on or get left behind.

The Road to Shareholder Value

As financial engineering transformed Wall Street from a clubby, quiet industry to one in which dashing swashbucklers in power ties and red suspenders became “barbarians at the gate,” pressure began to build on managers. The new risk management products lowered the perceived cost of money and ushered in a new era of leveraged buyouts.

A new breed of "corporate raiders" could now gain control of companies with very little capital and demand that performance improve, where "performance" meant stock performance. They believed that society's interests were best determined by market forces and unabashedly pursued investment returns above all else. As Wall Street anti-hero Gordon Gekko put it, the overall sentiment was that "greed is good."

Managers were put on notice, and a flood of new theories poured in from business school professors and management consultants. Harvard's Michael Porter explained how actively managing value chains could lead to sustainable competitive advantage. New quantitative methods, such as Six Sigma, promised to transform management into, essentially, an engineering problem.

Today, the results are in and they are abysmal. In 2008, a systemic underestimation of risk, of exactly the type Mandelbrot warned us about, caused a financial meltdown. We are now in the midst of a second productivity paradox, in which technological advance does little to improve our well-being. Income inequality and racial strife are at historic levels, and mental health is deteriorating.

Since 1970, we have undergone three revolutions—financial, managerial and digital—and we are somehow worse off. It’s time to admit that we had the wrong theory of the case and chart a new course. Anything else is living in denial.

A Different Future Demands You Reject the Past

Underlying Mr. Podhoretz’s column is a sense of aggrievement that practically drips from each sentence. It’s hard to see the system in which you have succeeded as anything other than legitimate without tarnishing your own achievements. While he is clearly annoyed by what he sees as “didactic,” he seems unwilling to entertain the possibility that a large portion of the country desperately wants to come to terms with our history.

We often see the same thing with senior executives in our transformation work. Yet to chart a new path we must reject the past. As Thomas Kuhn pointed out in his classic, The Structure of Scientific Revolutions, every model is flawed. Some can be useful for decades or even centuries, but eventually circumstances change and they become untenable. After a period of tumult, they collapse and a new paradigm emerges.

What Podhoretz misses is that the Falcon and the Winter Soldier were able to make common cause around the values they shared, not the history that divided them, and partner on a shared mission. That's what separates those who are able to transform themselves from those who are not. You need to take a hard look at yourself and achieve a level of honesty and integrity before you can inspire trust in others.

In order to improve, we must first look with clear eyes at what needs to be corrected. To paraphrase President Kennedy, we don't do these things because they are easy; we do them because they are worthwhile.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay
