Author Archives: Greg Satell

About Greg Satell

Greg Satell is a popular speaker and consultant. His latest book, Cascades: How to Create a Movement That Drives Transformational Change, is available now. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.

We Were Wrong About What Drove the 21st Century

GUEST POST from Greg Satell

Every era contains multitudes. World War I gave way to the “Roaring 20s” and a 50-year boom in productivity. The Treaty of Versailles sowed the seeds of the Second World War, which in turn gave way to the peace and prosperity of the post-war era. Vietnam and the rise of the Baby Boomers unlocked a cultural revolution that created new freedoms for women and people of color.

Our current era began in the 1980s, with the rise of Ronald Reagan and a new confidence in the power of markets. Genuine achievements of the Chicago School of economics, led by Milton Friedman, along with the weakness of the Soviet system, led to an enthusiasm for market fundamentalism that came to dominate policy circles.

So it shouldn’t be that surprising that veteran Republican strategist Stuart Stevens wrote a book denouncing that orthodoxy as a lie. He has a point, but politicians can only convince us of things we already want to believe. The truth is that we were fundamentally mistaken in our understanding of how the world works. It’s time that we own up to it.

Mistake #1: The End Of The Cold War Would Strengthen Capitalism

When the Berlin Wall came down in 1989, the West was triumphant. Communism was shown to be a corrupt system bereft of any real legitimacy. A new ideology took hold, often called the Washington Consensus, that preached fiscal discipline, free trade, privatization and deregulation. The world was going to be remade in capitalism’s image.

Yet for anybody who was paying attention, communism had been shown to be bankrupt and illegitimate since the 1930s when Stalin’s failed collectivization effort and industrial plan led him to starve his own people. Economists have estimated that, by the 1970s, Soviet productivity growth had gone negative, meaning more investment actually brought less output. The system’s collapse was just a matter of time.

At the same time, there were early signs of serious problems with the Washington Consensus. Many complained that bureaucrats at the World Bank and the IMF were mandating policies for developing nations that citizens in their own countries would never accept. So-called “austerity programs” exacted human costs that were both significant and real. In a sense, the error of the Soviets was being repeated—ideology was put before people.

Today, instead of a capitalist utopia and an era of peace and prosperity, we got a global rise in authoritarian populism, stagnant wages, reduced productivity growth and weaker competitive markets. In the United States in particular, capitalism has been weakened by almost every metric imaginable.

Mistake #2: Digital Technology Would Make Everything Better

In 1989, the same year that the Berlin Wall fell, Tim Berners-Lee proposed the World Wide Web and ushered in a new technological era of networked computing that we now know as the “digital revolution.” Much like the market fundamentalism that took hold around the same time, technology was seen as the determinant of a new, brighter age.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet by 2004, productivity growth had slowed again to its earlier lethargic pace. Today, despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

Digital technology was supposed to empower individuals and reduce the dominance of institutions, but just the opposite has happened. Income inequality in advanced economies markedly increased. In America wages have stagnated and social mobility has declined. At the same time, social media has been destroying our mental health.

When Silicon Valley told us they intended to “change the world,” is this what they meant?

Mistake #3: Medical Breakthroughs Would Automatically Make Us Healthier

Much like the fall of the Berlin Wall and the rise of the Internet, the completion of the Human Genome Project in 2003 promised great things. No longer would we be at the mercy of terrible diseases such as cancer and Alzheimer’s; instead, we would design genetic therapies that would rewire our bodies to fight off disease by themselves.

The advances since then have been breathtaking. The Cancer Genome Atlas, which began in 2005, helped enable doctors to develop therapies targeted at specific mutations, rather than where in the body a tumor happened to be found. Later, CRISPR revolutionized synthetic biology, bringing down costs exponentially.

The rapid development of Covid-19 vaccines has shown how effective these new technologies are. Scientists essentially engineered vaccines that deliver just enough of the viral genome to produce a few proteins: enough to provoke an immune response, but not nearly enough to make us sick. Twenty years ago, this would have been considered science fiction. Today, it’s a reality.

Yet we are not healthier. Worldwide obesity has tripled since 1975 and has become an epidemic in the United States. Anxiety and depression have risen as well. American healthcare costs continue to rise even as life expectancy declines. Despite the incredible advances in our medical capability, we seem to be less healthy and more miserable.

Worse Than A Crime, It Was A Blunder

Whenever I bring up these points among technology people, they vigorously push back. Surely, they say, you can see the positive effects all around you. Can you imagine what the global pandemic would be like without digital technologies? Without videoconferencing? Hasn’t there been a significant global decline in extreme poverty and violence?

Yes. There have absolutely been real achievements. As someone who spent roughly half my adult life in Eastern Bloc countries, I can attest to how horrible the Soviet system was. Digital technology has certainly made our lives more convenient and, as noted above, medical advances have been very real and very significant.

However, technology is a process that involves both revealing and building. Yes, we revealed the power of market forces and the bankruptcy of the Soviet system, but failed to build a more prosperous and healthy society. In much the same way, we revealed the power of the microchip, miracle cures and many other things, but failed to put them to use in such a way that would make us measurably better off.

When faced with a failure this colossal, people often look for a villain. They want to blame the greed of corporations, the arrogance of Silicon Valley entrepreneurs or the incompetence of government bureaucrats. The truth is, as the old saying goes, it was worse than a crime, it was a blunder. We simply believed that market forces and technological advancement would work their magic and all would be well.

By now we should know better. We need to hold ourselves accountable, make better choices and seek out greater truths.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly
Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.

Not Everyone Can Transform Themselves

Here’s What Makes the Difference

GUEST POST from Greg Satell

The conservative columnist John Podhoretz recently took to the New York Post to denounce the plotline of Disney’s new miniseries The Falcon and the Winter Soldier. In particular, he took umbrage with a subplot that invoked the Tuskegee experiments and other historical warts in a manner that he termed “didactic anti-Americanism.”

His point struck a chord with me because, in my many years living overseas, I always found that people in other countries were more than aware of America’s failures such as slavery, Jim Crow, foreign policy misadventures and so on. What they admire is our ability to take a hard look at ourselves and change course.

It also reminded me of something I’ve noticed in my work helping organizations transform themselves. Some are willing to take a hard look at themselves and make tough changes, while others are addicted to happy talk and try to wish problems away. Make no mistake. You can’t tackle the future without looking with clear eyes at how the present came into being.

A Pregnant Postcard

The genesis of shareholder capitalism and our modern outlook on how things are supposed to work can, in some sense, be traced back to Paris in 1900. It was there and then that an obscure graduate student named Louis Bachelier presented his thesis on speculation to a panel of judges including the great Henri Poincaré. It described the fluctuation of market prices as a random walk, a revolutionary, albeit unappreciated, idea at the time.

Unfortunately for Bachelier, his paper went mostly unnoticed and he vanished into obscurity. Then, in 1954, he was rediscovered by a statistician named Jimmie Savage, who sent a postcard to his friend, the eminent economist Paul Samuelson, asking “ever hear of this guy?” Samuelson hadn’t, but was intrigued.

In particular, Bachelier’s assertion that “the mathematical expectation of the speculator is zero” was compelling because it implied that market prices were essentially governed by bell curves, which are, in many respects, predictable. If that were true, then markets could be tamed through statistical modeling and the economy could be managed much more effectively.
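Bachelier’s “expectation zero” claim is easy to see in a toy simulation. The sketch below is purely illustrative (the function and parameters are my own, not Bachelier’s actual model of bourse prices): averaged over many independent random walks, the speculator’s gains and losses wash out.

```python
import random

def random_walk(steps: int, seed: int) -> list[float]:
    """A symmetric random walk: each tick the price moves +1 or -1 with equal odds."""
    rng = random.Random(seed)
    price, path = 0.0, [0.0]
    for _ in range(steps):
        price += rng.choice([1.0, -1.0])
        path.append(price)
    return path

# Average the terminal value over many independent walks: gains and losses
# cancel, which is Bachelier's "mathematical expectation of the speculator
# is zero" in miniature.
finals = [random_walk(100, seed)[-1] for seed in range(2000)]
print(sum(finals) / len(finals))  # close to 0
```

The same averaging argument is why the terminal values cluster into a bell curve: each walk is a sum of many small independent moves, which is exactly the setting where the central limit theorem applies.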

Samuelson, who was pioneering the field of mathematical finance at the time, thought the paper was brilliant and began to actively promote it. Later, Eugene Fama would build Bachelier’s initial work into a full-blown Efficient Market Hypothesis. It would unleash a flurry of new research into financial modeling and more than a few Nobel Prizes.

A Refusal to Reckon

By the 1960s, the revolution in mathematical finance had begun to gain steam. Much as had happened in physics earlier in the century, a constellation of new discoveries such as efficient portfolios, the capital asset pricing model (CAPM) and, later, the Black-Scholes model for options pricing created a “standard model” for thinking about economics and finance.

As things gathered steam, Samuelson’s colleague at MIT, Paul Cootner, compiled the most promising papers in a 500-page tome, The Random Character of Stock Market Prices, which became an instant classic. The book would become a basic reference for the new industries of financial engineering and risk management that were just beginning to emerge at the time.

However, early signs of trouble were ignored. Included in Cootner’s book was a paper by Benoit Mandelbrot warning that something was seriously amiss. He showed, with very clear reasoning and analysis, that actual market data displayed far more volatility than the models predicted. In essence, he was pointing out that Samuelson and his colleagues were vastly underestimating risk in the financial system.
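Mandelbrot’s point can be illustrated with a crude toy comparison (this is a hypothetical volatility-mixture of my own devising, not the stable distributions Mandelbrot actually proposed): count how often supposedly “impossible” four-sigma days occur under a plain Gaussian versus a process with occasional high-volatility regimes.

```python
import random

rng = random.Random(42)
N = 100_000

# Thin-tailed model: daily "returns" drawn from a plain Gaussian.
gaussian = [rng.gauss(0, 1) for _ in range(N)]

# Crude heavy-tailed stand-in: the same Gaussian on most days, but a
# high-volatility regime roughly 1% of the time.
heavy = [rng.gauss(0, 10) if rng.random() < 0.01 else rng.gauss(0, 1)
         for _ in range(N)]

def extreme_days(xs: list[float], threshold: float = 4.0) -> int:
    """Count observations beyond four standard deviations of the thin-tailed model."""
    return sum(1 for x in xs if abs(x) > threshold)

# The Gaussian predicts only a handful of 4-sigma days per 100,000; the
# mixture produces hundreds. That gap is the kind of excess volatility
# Mandelbrot saw in real market data.
print(extreme_days(gaussian), extreme_days(heavy))
```

A risk model calibrated to the thin-tailed series would badly underprice the extreme moves of the second, which is the essence of the 2008 failure discussed later in this piece.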

In response, Cootner wrote that Mandelbrot forced economists “to face up in a substantive way to those uncomfortable empirical observations that there is little doubt most of us have had to sweep under the carpet until now.” He then added, “but surely before consigning centuries of work to the ash pile, we should like to have some assurance that all of our work is truly useless.”

Think about that for a second. Another term for “empirical observations” is “facts in evidence,” and Cootner was admitting that these were being ignored! The train was leaving the station and everybody had to either get on or get left behind.

The Road to Shareholder Value

As financial engineering transformed Wall Street from a clubby, quiet industry to one in which dashing swashbucklers in power ties and red suspenders became “barbarians at the gate,” pressure began to build on managers. The new risk management products lowered the perceived cost of money and ushered in a new era of leveraged buyouts.

A new breed of “corporate raiders” could now get control of companies with very little capital and demand that performance—and “performance” meant stock performance—improve. They believed that society’s interest was best determined by market forces and unabashedly pursued investment returns above all else. As Wall Street anti-hero Gordon Gekko put it, the overall sentiment was that “greed is good.”

Managers were put on notice, and a flood of new theories from business school professors and management consultants poured in. Harvard’s Michael Porter explained how actively managing value chains could lead to sustainable competitive advantage. New quantitative methods, such as Six Sigma, promised to transform management into, essentially, an engineering problem.

Today, the results are in and they are abysmal. In 2008, a systemic underestimation of risk—of exactly the type Mandelbrot warned us about—caused a financial meltdown. We are now in the midst of a second productivity paradox, in which technological advance does little to improve our well-being. Income inequality and racial strife are at historic highs, and mental health is deteriorating.

Since 1970, we have undergone three revolutions—financial, managerial and digital—and we are somehow worse off. It’s time to admit that we had the wrong theory of the case and chart a new course. Anything else is living in denial.

A Different Future Demands You Reject the Past

Underlying Mr. Podhoretz’s column is a sense of grievance that practically drips from each sentence. It’s hard to see the system in which you have succeeded as anything other than legitimate without tarnishing your own achievements. While he is clearly annoyed by what he sees as “didactic,” he seems unwilling to entertain the possibility that a large portion of the country desperately wants to come to terms with our history.

We often see the same thing with senior executives in our transformation work. Yet to chart a new path we must reject the past. As Thomas Kuhn pointed out in his classic, The Structure of Scientific Revolutions, every model is flawed. Some can be useful for decades or even centuries, but eventually circumstances change and they become untenable. After a period of tumult, they collapse and a new paradigm emerges.

What Podhoretz misses is that the Falcon and the Winter Soldier were able to make common cause around the values they shared, not the history that divided them, and partner on a shared mission. That’s what separates those who are able to transform themselves from those who are not. You need to take a hard look at yourself and achieve a level of honesty and integrity before you can inspire trust in others.

In order to improve, we first must look with clear eyes at what needs to be corrected. To paraphrase President Kennedy, we don’t do these things because they are easy, but because they are worthwhile.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Outsmarting Those Who Want to Kill Change

GUEST POST from Greg Satell

Look at anyone who has truly changed the world and you will find they encountered significant resistance. In fact, while researching my book Cascades, I found that every major change effort, whether it was a political revolution, a social movement or an organizational transformation, had people who worked to undermine it in ways that were dishonest, underhanded and deceptive.

Unfortunately, we often don’t realize that there is an opposition campaign underway until it’s too late. People rarely voice open hostility to change. Opponents might even profess some excitement at our idea conceptually, but once there is a possibility of real action moving forward, they dig in their heels.

None of this means that change can’t happen. What it does mean is that, if you expect to bring about meaningful change, planning to overcome resistance has to be a primary design constraint and an organizing principle. Once you understand that, you can begin to move forward, identify shared values, design effective tactics and, ultimately, create lasting change.

Start With a Local Majority

Consider a famous set of conformity studies performed by the psychologist Solomon Asch in the 1950s. The design was simple but ingenious. He merely showed people pairs of cards, asking them to match the length of a single line on one card with one of three lines on an adjacent card. The correct answer was meant to be obvious.

However, as the experimenter went around the room, one person after another gave the same wrong answer. By the time it reached the final person in the group (in truth the only real subject; the rest were confederates), the vast majority of the time that person conformed to the majority opinion, even when it was obviously wrong!

Majorities don’t just rule, they also influence, especially local majorities. The effect is even more powerful when the issue at hand is not as clear-cut as the length of a line on a card. Also, more recent research suggests that the effect applies not only to people we know well, but that we are also influenced even by second and third-degree relationships.

The key point here is that we get to choose who we expose an idea to. If you start with five people in a room, for example, you only need three advocates to start with a majority. That may not seem consequential, but consider that the movement that overthrew Serbian dictator Slobodan Milošević started with five kids in a cafe, and you can see how even the most inauspicious beginnings can lead to revolutionary outcomes.

You can always expand a majority out, but once you’re in the minority you are likely to get immediate pushback and will have to retrench. That’s why the best place to start is with those who are already enthusiastic about your idea. Then you can empower them to be successful and bring in others who can bring in others still.

Listen to Your Opposition, But Don’t Engage Them

People who are passionate about change often see themselves as evangelists. Much like Saint Paul in the Bible, they thrive on winning converts and seek out those who most adamantly oppose their idea in an attempt to change their minds. This is almost always a mistake. Directly engaging with staunch opposition is unlikely to achieve anything other than exhausting and frustrating you.

However, while you shouldn’t directly engage your fiercest critics, you obviously can’t act like they don’t exist. On the contrary, you need to pay close attention to them. In fact, by listening to people who hate your idea, you can identify early flaws, which gives you the opportunity to fix them before they can be used against you in any serious way.

One of the most challenging things about managing a change effort is balancing the need to focus on a small circle of dedicated enthusiasts while still keeping your eyes and ears open. Become too insular and you will quickly find yourself out of touch. It’s not enough to sing to the choir; you also need to get out of the church and mix with the heathens.

Perhaps the most important reason to listen to your critics is that they will help you identify shared values. After all, they are trying to convince the same people in the middle that you are. Very often you’ll find that, by deconstructing their arguments, you can use their objections to help you make your case.

Shift From Differentiating Values to Shared Values

Many revolutionaries, corporate and otherwise, are frustrated marketers. They want to differentiate themselves in the marketplace of ideas through catchy slogans that “cut through.” It is by emphasizing difference that they seek to gin up enthusiasm among their most loyal supporters.

That was certainly true of LGBTQ activists, who marched through city streets shouting slogans like “We’re here, we’re queer and we’d like to say hello.” They led a different lifestyle and wanted to demand that their dignity be recognized. More recently, Black Lives Matter activists made calls to “defund the police,” which many found to be shocking and anarchistic.

Corporate change agents tend to fall into a similar trap. They rant on about “radical” innovation and “disruption,” ignoring the fact that few like to be radicalized or disrupted. Proponents of agile development methods often tout their manifesto, oblivious to the reality that many outside the agile community find the whole thing a bit weird and unsettling.

While emphasizing difference may excite people who are already on board, it is through shared values that you bring people in. So it shouldn’t be a surprise that the fight for LGBTQ rights began to gain traction when activists started focusing on family values. Innovation doesn’t succeed because it’s “radical,” but when it solves a meaningful problem. The value of Agile methods isn’t a manifesto, but the fact that they can improve performance.

Create and Build On Success

Starting with a small group of enthusiastic apostles may seem insignificant. In fact, look at almost any popular approach to change management and the first thing on the to-do list is “create a sense of urgency around change” or “create an awareness of the need for change.” But if that really worked, the vast majority of organizational transformations wouldn’t fail, and we know that they do.

Once you accept that resistance to change needs to be your primary design constraint, it becomes clear that starting out with a massive communication campaign will only serve to alert your opponents that they better get started undermining you quickly or you might actually be successful in bringing change about.

That’s why we always advise organizations to focus on a small, but meaningful keystone change that can demonstrate success. For example, one initiative at Procter & Gamble started out with just three mid-level executives focused on improving one process. That kicked off a movement that grew to over 2500 employees in 18 months. Every successful large enterprise transformation we looked at had a similar pattern.

That, in truth, is the best way to outsmart the opponents of change. Find a way to make it successful, no matter how small that initial victory may be, then empower others to succeed as well. It’s easy to argue against an idea, you merely need to smother it in its cradle. Yet a concept that’s been proven to work and has inspired people to believe in it is an idea whose time has come.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Preparing for Organizational Transformation in a Post-COVID World

GUEST POST from Greg Satell

The Covid-19 pandemic demanded we transform across multiple planes. Businesses had to abruptly shift to empower remote work. Professionals were suddenly trading commutes and in-person meetings for home schooling and “Zoom fatigue.” Leaders needed to reimagine every system, from storefronts to supply chains to educational institutions.

It was a brutal awakening, but we can now see the light at the end of the tunnel. In fact, a recent McKinsey Global Survey found that 73% of executives believe conditions will be moderately or substantially better in the next year. Globally, the World Bank predicts 4% growth in 2021, a marked improvement over 2020’s 4.3% contraction.

Still, while the crisis may be ending, the need for fundamental change has not. Today leaders must reinvent their organizations on multiple fronts, including technological, environmental, social and skills-based transformations. These pose challenges for any organization and research suggests that traditional approaches are unlikely to succeed. Here’s what will:

Empowering Small Groups

In 1998 five friends met in a cafe in Belgrade and formed a revolutionary movement. Two years later the brutal Serbian dictator, Slobodan Milošević, was overthrown. In 2007, a lean manufacturing initiative at Wyeth Pharmaceuticals began with a single team in one plant. In 18 months it spread to more than 17,000 employees across 25 sites worldwide and resulted in a more than 25% reduction in costs across the company.

More recently, in 2017, three mid-level employees at Procter & Gamble decided to take it upon themselves, with no budget and no significant executive sponsorship, to transform a single process. It took them months, but they were able to streamline it from a matter of weeks to mere hours. Today, their PxG initiative for process improvement has become a movement for reinvention that encompasses thousands of their colleagues worldwide.

Traditionally, managers launching a new initiative have aimed to start with a bang. They work to gain approval for a sizable budget as a sign of institutional commitment, recruit high-profile executives, arrange a big “kick-off” meeting and look to move fast, gain scale and generate some quick wins. All of this is designed to create a sense of urgency and inevitability.

Yet that approach can backfire. Many change leaders who start with a “shock and awe” approach find that, while they have rallied some to their cause, they have also inspired an insurgency that bogs things down. For any significant change, there will always be some who will oppose the idea and they will resist it in ways that are often insidious and not immediately obvious.

The dangers of resistance are especially acute when, as is often the case today, you need to drive transformation on multiple fronts. That’s why it’s best to start with small groups of enthusiasts that you can empower to succeed, rather than try to push an initiative on the masses that you’ll struggle to convince.

Weaving A Network Of Loose Connections

The sociologist Mark Granovetter envisioned collective action as a series of resistance thresholds. For any idea or initiative, some will be naturally enthusiastic and have minimal or no resistance, some will have some level of skepticism and others will be dead set against it.

It’s not hard to see why focusing initial efforts on small groups with low resistance thresholds can be effective. In the examples above, the Serbian activists, the lean manufacturing pilot team at Wyeth and the three mid-level executives at Procter & Gamble were all highly motivated and willing to put in the hard work to overcome initial challenges and setbacks.
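Granovetter’s threshold idea can be sketched in a few lines of code (a hypothetical toy, with the function name and threshold values my own): each person joins once enough others already have, and the final size of the cascade is a fixed point of that process.

```python
def cascade_size(thresholds: list[int]) -> int:
    """Granovetter-style cascade: a person joins once at least `threshold`
    others have joined; a threshold of 0 marks a natural enthusiast."""
    joined = 0
    while True:
        willing = sum(1 for t in thresholds if t <= joined)
        if willing == joined:  # fixed point: nobody new is willing to join
            return joined
        joined = willing

# A smooth spread of resistance levels cascades all the way...
print(cascade_size([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]))  # 10

# ...but remove the threshold-1 follower and the chain stalls after the
# lone enthusiast, even though most people were persuadable.
print(cascade_size([0, 2, 2, 3, 4, 5, 6, 7, 8, 9]))  # 1
```

Choosing who first sees an idea, as the surrounding text advises, amounts to assembling a starting group whose thresholds are low enough for the cascade to get past that first fixed point.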

To scale, however, transformation efforts must be able to connect to those who have at least some level of reluctance. One highly effective strategy to scale change is to create “cooptable” resources in the form of workshops, training materials and other assets. For example, to scale a cloud transformation initiative at Experian, change leaders set up an “API Center of Excellence” to make it as easy as possible for product managers to try cloud-based offerings.

Another helpful practice is to update stakeholders about recent events and share best practices. In One Mission, Chris Fussell describes in detail the O&I forum he and General Stanley McChrystal used in Iraq. The Serbian activists held regular “network meetings” that served a similar purpose. More recently, Yammer groups, Zoom calls and other digital media have proven effective in this regard.

What’s most important is that people are allowed to take ownership of a change initiative and be able to define it for themselves, rather than being bribed or coerced with incentive schemes or mandates. You can’t force authentic change. Unless people see genuine value in it, it will never gain any real traction.

Instilling Shared Values And Shared Purpose

One of the biggest misconceptions about transformation efforts is that success begets more success. In practice, the opposite is often true. An initial success—especially a visible one—is likely to be met with a groundswell of opposition. We’ve seen this writ large with respect to political revolutions, in which initial victories in places like Egypt, the Maldives and Burma experienced reversals, but it is no less common in a corporate or organizational context.

In fact, we are often called into an engagement 6-12 months after an initiative starts because change leaders are bewildered that their efforts, which seemed so successful at first, have suddenly and mysteriously run aground. In actuality, it was those initial victories that activated latent opposition because it made what seemed unlikely change a real possibility.

The truth is that lasting change can never be built on any particular technology, program or policy, but must rest on shared values and a shared sense of mission. The Serbian activists focused not on any particular ideology, but on patriotism. At Wyeth, the change leaders made sure not to champion any specific technique, but tangible results. The leaders of the PxG initiative at Procter & Gamble highlighted the effect clunky and inefficient processes have on morale.

Irving Wladawsky-Berger, who was one of Lou Gerstner’s key lieutenants in IBM’s historic turnaround in the 90s, made a similar point to me. “Because the transformation was about values first and technology second, we were able to continue to embrace those values as the technology and marketplace continued to evolve,” he said.

Redefining Agility

In Built to Last, management guru Jim Collins suggested that leaders should develop a “big hairy audacious goal” (BHAG) to serve as a unifying vision for their enterprise. He pointed to examples such as Boeing’s development of the 707 commercial jetliner and Jack Welch’s vision that every GE business should be #1 or #2 in its category as inspiring “moonshots.”

Yet the truth is that we no longer have the luxury of focusing transformation in a single direction, but must bring about change along multiple axes simultaneously. Leaders today can’t choose whether to leverage cutting-edge technologies or become more sustainable, nor can we choose between a highly skilled workforce and one that is diverse and inclusive.

The kind of sustained, multifaceted brand of change we need today cannot be mandated from a mountaintop but must be inspired to take root throughout an enterprise. We need to learn how to empower small loosely connected groups with a shared sense of mission and purpose. To truly take hold, people need to embrace change and they do that for their own reasons, not for ours.

That’s what will be key to making the transformations ahead successful. The answer doesn’t lie in any specific strategy or initiative, but in how people are able to internalize the need for change and transfer ideas through social bonds. A leader’s role is no longer to plan and direct action, but to inspire and empower belief.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Digital Era Replaced by an Age of Molecular Innovation

GUEST POST from Greg Satell

It’s become strangely fashionable for digerati to mourn the death of innovation. “There’s nothing new,” has become a common refrain for which they blame venture capitalists, entrepreneurs and other digerati they consider to be less enlightened than themselves. They yearn for a lost age when things were better and more innovative.

What they fail to recognize is that the digital era is ending. After more than 50 years of exponential growth, the technology has matured and advancement has naturally slowed. While it is true that there are worrying signs that things in Silicon Valley have gone seriously awry and those excesses need to be curtailed, there’s more to the story.

The fact is that we’re on the brink of a new era of innovation and, while digital technology will be an enabling factor, it will no longer be center stage. The future will not be written in the digital language of ones and zeroes, but in that of atoms, molecules, genes and proteins. We do not lack potential or possibility, what we need is more imagination and wonder.

The End Of Moore’s Law

In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper which predicted that computing power would double about every two years. This idea, known as Moore's Law, has driven the digital revolution for a half century. It's what's empowered us to shrink computers from huge machines to tiny, but powerful, devices we carry in our pockets.

Yet there are limits for everything. The simple truth is that atoms are only so small and the speed of light is only so fast. That puts a limit on how many transistors we can cram onto a silicon wafer and how fast electrons can zip around the logic gates we set up for them. At this point, Moore’s Law is effectively over.
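The arithmetic behind that half-century climb is simple compound doubling, and it is easy to sketch. The snippet below is an illustration only; the starting figures and helper names are mine, not historical data:

```python
# Rough sketch of the compound growth implied by Moore's Law:
# a doubling of transistor counts roughly every two years.
# Starting values are illustrative round numbers, not real chip data.

def doublings(years, period=2):
    """Number of complete doublings over a span of years."""
    return years // period

def projected_count(start_count, years, period=2):
    """Projected transistor count after `years` of steady doubling."""
    return start_count * 2 ** doublings(years, period)

# Over 50 years at one doubling every two years, that is 25 doublings,
# a growth factor of more than 33 million.
factor = 2 ** doublings(50)
print(factor)  # 33554432
```

Run the numbers the other way and the limits become obvious: a few more decades of doubling would require features smaller than individual atoms, which is the wall the article describes.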

That doesn’t mean that advancement will stop altogether. There are other ways to speed up computing. The problem is that they all come with tradeoffs. New architectures, such as quantum and neuromorphic computing, for instance, require new programming languages, new logical approaches and very different algorithmic strategies than we’re used to.

So for the next decade or two we’re likely to see a heterogeneous computing environment emerge, in which we combine different architectures for different tasks. For example, we will be augmenting traditional AI systems with techniques like quantum machine learning. It is not only possible, but fairly likely, that these types of combinations will result in an exponential increase in capability.

A Biological Revolution

Moore's Law has become essentially shorthand for exponential improvement in any field. Anytime we see a continuous doubling of efficiency, we call it "the Moore's Law of X." Yet since the Human Genome Project was completed in 2003, advancement in genetic sequencing has far outpaced what has happened in the digital arena.

What is possibly an even bigger development occurred in 2012, when Jennifer Doudna and her colleagues discovered how CRISPR could revolutionize gene editing. Suddenly, the work of genetic engineers that would have taken weeks could be done in hours, at a fraction of the cost and with much greater accuracy. The era of synthetic biology had begun.

The most obvious consequence of this new era is the Covid-19 vaccine, which was designed in a matter of days rather than the years such work has traditionally taken. The mRNA technology used to create two of the vaccines also holds promise for cancer treatment, and CRISPR-based approaches have been applied to cure sickle cell and other diseases.

Yet as impressive as the medical achievements are, they make up only a fraction of the innovation that synthetic biology is making possible. Scientists are working on programming microorganisms to create new carbon-neutral biofuels and biodegradable plastics. It may very well revolutionize agriculture and help feed the world.

The truth is that the biological revolution is roughly where computers were in the 1970s or '80s, and we are just beginning to understand the potential. We can expect progress to accelerate for decades to come.

The Infinite World Of Atoms

Anyone who has regularly read the business press over the past 20 years or so would naturally conclude that we live in a digital economy. Certainly, tech firms dominate any list of the world's most valuable companies. Yet take a closer look and you will find that information and communication as a sector makes up only about 6% of GDP in advanced countries.

The truth is that we still live very much in a world of atoms and we spend most of our money on what we eat, wear, ride and live in. Any real improvement in our well-being depends on our ability to shape atoms to our liking. As noted above, reprogramming genetic material in cells to make things for us is one way we can do that, but not the only one.

In fact, there is a revolution in materials science underway. Much like in genomics, scientists are learning how to use computers to understand materials on a fundamental level and figure out how we can design them a lot better. In fact, in some cases researchers are able to discover new materials hundreds of times more efficiently than before.

Unlike digital or biological technologies this is largely a quiet revolution with very little publicity. Make no mistake, however, our newfound ability to create advanced materials will transform our ability to create and build everything from vastly more efficient solar panels to lighter, stronger and more environmentally friendly building materials.

The Next Big Thing Always Starts Out Looking Like Nothing At All

The origins of digital computing can be traced back at least a century, to the rise and fall of logical positivism, Turing’s “machine,” the invention of the transistor, the integrated circuit and the emergence of the first modern PC at Xerox PARC in the early 1970s. Yet there wasn’t a measurable impact from computing until the mid-1990s.

We tend to assume that we'll notice when something important is afoot, but that's rarely the case. The truth is that the next big thing always starts out looking like nothing at all. It doesn't appear fully bloomed, but is usually incubated for years, and often decades, by scientists quietly working in labs and by specialists debating at obscure conferences.

So, yes, after 50 years the digital revolution has run out of steam, but that shouldn’t blind us to the incredible opportunities that are before us. After all, a year ago very few people had heard of mRNA vaccines, but that didn’t make them any less powerful or important. There is no shortage of nascent technologies that can have just as big of an impact.

The simple fact is that innovation is not, and never has been, about what kind of apps show up on our smartphone screens. The value of a technology is not measured in how a Silicon Valley CEO can dazzle an audience on stage, but in our capacity to solve meaningful problems and, as long as there are meaningful problems to solve, innovation will live on.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


We Need a More Biological View of Technology

GUEST POST from Greg Satell

It’s no accident that Mary Shelley’s novel, Frankenstein, was published in the early 19th century, at roughly the same time as the Luddite movement was gaining momentum. It was in that moment that people first began to take stock of the technological advances that brought about the first Industrial Revolution.

Since then we have seemed to oscillate between techno-utopianism and dystopian visions of machines gone mad. For every “space odyssey” promising an automated, enlightened future, there seems to be a “Terminator” series warning of our impending destruction. Neither scenario has ever come to pass and it is unlikely that either ever will.

What both the optimists and the Cassandras miss is that technology is not something that exists independently from us. It is, in fact, intensely human. We don’t merely build it, but continue to nurture it through how we develop and shape ecosystems. We need to go beyond a simple engineering mindset and focus on a process of revealing, building and emergence.

1. Revealing

World War II brought the destructive potential of technology to the fore of human consciousness. As deadly machines ravaged Europe and bombs of unimaginable power exploded in Asia, the whole planet was engulfed in a maelstrom of human design. It seemed that the technology we had built had become a modern version of Frankenstein’s monster, destined from the start to turn on its master.

Yet the German philosopher Martin Heidegger saw things differently. In his 1954 essay, The Question Concerning Technology, he described technology as akin to art, in that it reveals truths about the nature of the world, brings them forth and puts them to some specific use. In the process, human nature and its capacity for good and evil are also revealed.

He offers the example of a hydroelectric dam, which uncovers a river’s energy and puts it to use making electricity. In much the same sense, Mark Zuckerberg did not so much “build” a social network at Facebook, but took natural human tendencies and channeled them in a particular way. That process of channeling, in turn, reveals even more.

That’s why, as I wrote in Mapping Innovation, innovation is not about coming up with new ideas, but identifying meaningful problems. It’s through exploring tough problems that we reveal new things and those new things can lead to important solutions. All who wander are not lost.

2. Building

The concept of revealing would seem to support the view of Shelley and the Luddites. It suggests that once a force is revealed, we are powerless to shape its trajectory. J. Robert Oppenheimer, upon witnessing the world’s first nuclear explosion as it shook the plains of New Mexico, expressed a similar view. “Now I am become Death, the destroyer of worlds,” he said, quoting the Bhagavad Gita.

Yet in another essay, Building Dwelling Thinking, Heidegger explains that what we build for the world is highly dependent on our interpretation of what it means to live in it. The relationship is, of course, reflexive. What we build depends on how we wish to dwell and that act, in and of itself, shapes how we build further.

Again, Mark Zuckerberg and Facebook are instructive. His insight into human nature led him to build his platform based on what he saw as The Hacker Way and resolved to “move fast and break things.” Unfortunately, that approach led to his enterprise becoming highly vulnerable to schemes by actors such as Cambridge Analytica and the Russian GRU.

Yet technology is not, by itself, determinant. Facebook is, to a great extent, the result of conscious choices that Mark Zuckerberg made. If he had a different set of experiences than that of a young, upper-middle-class kid who had never encountered a moment of true danger in his life, he may have been more cautious and chosen differently.

History has shown that those who build powerful technologies can play a vital role in shaping how they are used. Many of the scientists of Oppenheimer's day became activists, preparing a manifesto that highlighted the dangers of nuclear weapons, which helped lead to the Partial Test Ban Treaty. In much the same way, the Asilomar Conference, held in 1975, led to important constraints on recombinant DNA research.

3. Emergence

No technology stands alone, but combines with other technologies to form systems. That’s where things get confusing because when things combine and interact they become more complex. As complexity theorist Sam Arbesman explained in his book, Overcomplicated, this happens because of two forces inherent to the way that technologies evolve.

The first is accretion. A product such as an iPhone represents the accumulation of many different technologies, including microchips, programming languages, gyroscopes, cameras, touchscreens and lithium-ion batteries, just to name a few. As we figure out more tasks an iPhone can perform, more technologies are added, building on each other.

The second force is interaction. Put simply, much of the value of an iPhone is embedded in how it works with other technologies to make tasks easier. We want to use it to access platforms such as Facebook to keep in touch with friends, Yelp so that we can pick out a nice restaurant where we can meet them and Google Maps to help us find the place. These interactions, combined with accretion, create an onward march towards greater complexity.

It is through ever increasing complexity that we lose control. Leonard Read pointed out in his classic essay, I, Pencil, that even an object as simple as a pencil is far too complex for any single person to produce by themselves. A smartphone—or even a single microchip—is exponentially more complex.

People work their entire lives to become experts on even a minor aspect of a technology like an iPhone, a narrow practice of medicine or an obscure facet of a single legal code. As complexity increases, so does specialization, making it even harder for any one person to see the whole picture.

Shaping Ecosystems And Taking A Biological View

In 2013, I wrote that we are all Luddites now, because advances in artificial intelligence had become so powerful that anyone who wasn’t nervous didn’t really understand what was going on. Today, as we enter a new era of innovation and technologies become infinitely more powerful, we are entering a new ethical universe.

Typically, the practice of modern ethics has been fairly simple: Don’t lie, cheat or steal. Yet with many of our most advanced technologies, such as artificial intelligence and genetic engineering, the issue isn’t so much about doing the right thing, but figuring out what the right thing is when the issues are novel, abstruse and far reaching.

What's crucial to understand, however, is that it's not any particular invention, but ecosystems that create the future. The Luddites were right to fear textile mills, which did indeed shatter their way of life. Yet the mill was only one technology; when it was combined with other advances, such as improved agriculture, labor unions and modern healthcare, lives greatly improved.

Make no mistake, our future will be shaped by our own choices, which is why we need to abandon our illusions of control. We need to shift from an engineering mindset, where we try to optimize for a limited set of variables and take a more biological view, growing and shaping ecosystems of talent, technology, information and cultural norms.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


To Change the World You Must First Learn Something About It

GUEST POST from Greg Satell

Anybody who has waited for a traffic light to change, in the middle of the night at an empty intersection, knows the urge to rebel. There is always a tension between order and freedom. While we intuitively understand the need for order to constrain others, we yearn for the freedom to do what we want and to seek out a vision and sense of meaning in our lives.

Yet as we have seen over the past decade, attempts to overturn the existing order usually fail. The Tea Party erupted in 2009, but had mostly sputtered out by 2014. #Occupy protests and Black Lives Matter sent people into the streets, but achieved little, if anything. Silicon Valley “unicorns” like WeWork routinely go up in flames.

Not all revolutions flop, though. In fact, some succeed marvelously. What has struck me after researching transformational change over nearly two decades is how similar successful efforts are. They all experience failures along the way. What makes the difference is their ability to learn, adapt and change along the way. That’s what allows them to prevail.

Five Kids Meet In A Cafe

One day in 1998, a group of five friends met in a cafe in Belgrade. Although still in their 20s, they were already experienced activists, and most of what they had experienced was failure. They had taken part in student protests against the war in Bosnia in 1992, as well as in the larger uprisings in response to election fraud in 1996. Neither had achieved much.

Having had time to reflect on their successes and failures, they hatched a new plan. They knew from their earlier efforts that they could mobilize people and get them to the polls for the presidential election in 2000. They also knew that Slobodan Milošević, who ruled the country with an iron hand, would try and steal the election, just as he had in 1996.

So that’s what they planned for.

The next day, six friends joined the five from the previous day and, together, they formed the original 11 members of Otpor, the movement that would topple the Milošević regime. They began slowly at first, performing pranks and street theater. But within two years it grew to over 70,000 members, with chapters all over Serbia. Milošević was ousted in the Bulldozer revolution in 2000. He would die in his prison cell at The Hague in 2006.

What Otpor came to understand is that it takes small groups, loosely connected, but united by a shared purpose to drive transformational change. The organization was almost totally decentralized, with just a basic “network meeting” to share best practices every two weeks. Nevertheless, by empowering those smaller groups and giving them a shared sense of mission, they were able to prevail over seemingly impossible odds.

Three Mid-Level executives See A Problem That Needs Fixing

In 2017, John Gadsby and two colleagues in Procter & Gamble's research organization saw that there was a problem. Although cutting-edge products were being developed all around them, the processes at the 180-year-old firm were often antiquated, making it sometimes difficult to get even simple things done.

So they decided to do something about it. They chose a single process, which involved setting up experiments to test new product technologies. It usually took weeks and was generally considered a bottleneck. Utilizing digital tools, however, they were able to hone it down to just a few hours. It was a serious accomplishment and the three were recognized with a “Pathfinder” award by the company CTO.

Every change starts out with a grievance, such as the annoyance of being bogged down by inefficient processes. The first step forward is to come up with a vision for how you would like things to be different. However, you can never get there in a single step, which is why you need to identify a single keystone change to show others that change is really possible.

That’s exactly what the team at P&G did. Once they showed that one process could be dramatically improved, they were able to get the resources to start improving others. Today, more than 2,500 of their colleagues have joined their movement for process improvement, called PxG, and more than 10,000 have used their applications platform.

As PxG has grown it has also been able to effectively partner with other likeminded initiatives within the company, reinforcing not only its own vision, but those of others that share its values as well.

The One Engineer Who Simply Refused To Take “No” For An Answer

In the late 1960s, Gary Starkweather was in trouble with his boss. As an engineer in Xerox's long-range xerography unit, he saw that laser printing could be a huge business opportunity. Unfortunately, his manager at the company's research facility in upstate New York was focused on improving the current product line, not looking to start a new one.

The argument got so heated that Starkweather's job came to be in jeopardy. Fortunately, his rabble-rousing caught the attention of another division within the company, the Palo Alto Research Center (PARC), which was less interested in operational efficiency than in inventing an entirely new future. It welcomed Starkweather into its ranks with open arms.

Unlike his old lab, PARC's entire mission was to create the future. One of the technologies it had developed, bitmapping, would revolutionize computer graphics, but there was no way to print the images out. Starkweather's work was exactly what they were looking for and, with Xerox's copier business in decline, would eventually save the company.

The truth is that good ideas fail all the time and it often has little to do with the quality of the idea, the passion of those who hold it or its potential impact, but rather who you choose to start with. In the New York lab, few people bought into Starkweather’s idea, but in Palo Alto, almost everyone did. In that fertile ground, it was able to grow, mature and triumph.

When trying to get traction for an idea, you always want to be in the majority, even if it is only a local majority comprising a handful of people. You can always expand a small majority out, but once you are in the minority you will get immediate pushback and will need to retrench.

The Secret to Subversion

Through my work, I've gotten to know truly revolutionary people. My friend Srdja Popović was one of the original founders of Otpor and has gone on to train activists in more than 50 countries. Jim Allison won a Nobel Prize for discovering cancer immunotherapy. Yassmin Abdel-Magied has become an important voice for diversity, equity and inclusion. Many others I profiled in my books, Mapping Innovation and Cascades.

What has always struck me is how different real revolutionaries are from the mercurial, ego-driven stereotypes Hollywood loves to sell us. The truth is that all of those mentioned above are warm, friendly and genuinely nice people who are a pleasure to be around (or were; Gary Starkweather passed away recently).

What I’ve found over the years is that sense of openness helped them succeed where others failed. In fact, evidence suggests that generosity is often a competitive advantage for very practical reasons. People who are friendly and generous tend to build up strong networks of collaborators, who provide crucial support for getting an idea off the ground.

But most of all it was that sense of openness that allowed them to learn, adapt and identify a path to victory. Changing the world is hard, often frustrating work. Nobody comes to the game with all the answers. In the final analysis, it’s what you learn along the way—and your ability to change yourself in response to what you learn—that makes the difference between triumph and bitter, agonizing failure.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Competing in a New Era of Innovation

GUEST POST from Greg Satell

In 1998, the dotcom craze was going at full steam and it seemed like the entire world was turning upside down. So people took notice when economist Paul Krugman wrote that “by 2005 or so, it will become clear that the internet’s impact on the economy has been no greater than the fax machine’s.”

He was obviously quite a bit off base, but these types of mistakes are incredibly common. As the futurist Roy Amara famously put it, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” The truth is that it usually takes about 30 years for a technology to go from an initial discovery to a measurable impact.

Today, as we near the end of the digital age and enter a new era of innovation, Amara’s point is incredibly important to keep in mind. New technologies, such as quantum computing, blockchain and gene editing will be overhyped, but really will change the world, eventually. So we need to do more than adapt, we need to prepare for a future we can’t see yet.

Identify A “Hair-On-Fire” Use Case

Today we remember the steam engine for powering factories and railroads. In the process, it made the first industrial revolution possible. Yet that’s not how it started out. Its initial purpose was to pump water out of coal mines. At the time, it would have been tough to get people to imagine a factory that didn’t exist yet, but pretty easy for owners to see that their mine was flooded.

The truth is that innovation is never really about ideas, it's about solving problems. So when a technology is still nascent, it doesn't gain traction in a large, established market, which by definition is already fairly well served, but in a hair-on-fire use case — a problem that somebody needs solved so badly that they almost literally have their hair on fire.

Early versions of the steam engine, such as Thomas Newcomen's, didn't work well and were ill-suited to running factories or driving locomotives. Still, flooded mines were a major problem, so many mine owners were tolerant of glitches and flaws. Later, after James Watt perfected the steam engine, it became more akin to the technology we remember now.

We can see the same principle at work today. Blockchain has not had much impact as an alternative currency, but has gained traction optimizing supply chains. Virtual reality has not really caught on in the entertainment industry, but is making headway in corporate training. That’s probably not where those technologies will end up, but it’s how they make money now.

So in the early stages of a technology, don't try to imagine how a perfected version would fit in. Find a problem that somebody needs solved so badly right now that they are willing to put up with some inconvenience.

The truth is that the “next big thing” never turns out like people think it will. Putting a man on the moon, for example, didn’t lead to flying cars like in the Jetsons, but instead to satellites that bring events to us from across the world, help us navigate to the corner store and call our loved ones from a business trip.

Build A Learning Curve

Things that change the world always arrive out of context, for the simple reason that the world hasn't changed yet. So when a new technology first appears, we don't really know how to use it. It takes time to learn how to leverage its advantages to create an impact.

Consider electricity, which as the economist Paul David explained in a classic paper, was first used in factories to cut down on construction costs (steam engines were heavy and needed extra bracing). What wasn’t immediately obvious was that electricity allowed factories to be designed to optimize workflow, rather than having to be arranged around the power source.

We can see the same forces at work today. Consider Amazon’s recent move to offer quantum computing to its customers through the cloud, even though the technology is so primitive that it has no practical application. Nevertheless, it is potentially so powerful—and so different from digital computing—that firms are willing to pay for the privilege of experimenting with it.

The truth is that it’s better to prepare than it is to adapt. When you are adapting you are, by definition, already behind. That’s why it’s important to build a learning curve early, before a technology has begun to impact your business.

Beware Of Switching Costs

When we look back today, it seems incredible that it took decades for factories to switch from steam to electricity. Besides the added construction costs for bracing, steam engines were dirty and inflexible. Every machine in the factory needed to be tied to one engine, so if the engine broke down or needed maintenance, the whole factory had to be shut down.

However, when you look at the investment from the perspective of a factory owner, things aren’t so clear cut. While electricity was relatively more attractive when building a new factory, junking an existing facility to make way for a new technology didn’t make as much sense. So most factory owners kept what they had.

These types of switching costs still exist today. Consider neuromorphic chips, which are based on the architecture of the human brain and therefore highly suited to artificial intelligence. They are also potentially millions of times more energy efficient than conventional chips. However, existing AI chips also perform very well, can be manufactured in conventional fabs and run conventional AI algorithms, so neuromorphic chips haven’t caught on yet.

All too often, when a new technology emerges we only look at how its performance compares to what exists today and ignore the importance of switching costs—both real and imagined. That’s a big part of the reason we underestimate how long a technology takes to gain traction and underestimate how much impact it will have in the long run.

Find Your Place In The Ecosystem

We tend to see history through the lens of inventions: Watt and his steam engine. Edison and his light bulb. Ford and his assembly line. Yet building a better mousetrap is never enough to truly change the world. Besides the need to identify a use case, build a learning curve and overcome switching costs, every new technology needs an ecosystem to truly drive the future.

Ford’s automobiles needed roads and gas stations, which led to supermarkets, shopping malls and suburbs. Electricity needed secondary inventions, such as home appliances and radios, which created a market for skilled technicians. It is often in the ecosystem, rather than the initial invention, where most of the value is produced.

Today, we can see similar ecosystems beginning to form around emerging technologies. The journal Nature published an analysis which showed that over $450 million was invested in more than 50 quantum startups between 2012 and 2018, but only a handful are actually making quantum computers. The rest are helping to build out the ecosystem.

So for most of us, the opportunities in the post-digital era won’t be creating new technologies themselves, but in the ecosystems they create. That’s where we’ll see new markets emerge, new jobs created and new fortunes to be made.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Innovation Requires Going Fast, Slow and Meta

Innovation Requires Going Both Fast and Slow

GUEST POST from Greg Satell

In the regulatory filing for Facebook’s 2012 IPO, Mark Zuckerberg included a letter outlining his management philosophy. Entitled, The Hacker Way, it encapsulated much of the zeitgeist. “We have a saying,” he wrote. “‘Move fast and break things.’ The idea is that if you never break anything, you’re probably not moving fast enough.”

At around the same time, Katalin Karikó was quietly plodding away in her lab at the University of Pennsylvania. She had been working on an idea since the early 1990s and it hadn't amounted to much so far, but it was finally beginning to attract some interest. The next year she would join a small startup named BioNTech to commercialize her work and would continue to chip away at the problem.

Things would accelerate in early 2020, when Karikó’s mRNA technology was used to design a coronavirus vaccine in a matter of mere hours. Just as Daniel Kahneman explained that there are fast and slow modes of thinking, the same can be said about innovating. The truth is that moving slowly is often underrated and that moving fast can sometimes bog you down.

The Luxury Of Stability

Mark Zuckerberg had the luxury of being disruptive because he was working in a mature, stable environment. His “Hacker Way” letter showed a bias for action over deliberation in the form of “shipping code,” because he had little else to worry about. Facebook could be built fast, because it was built on top of technology that was slowly developed over decades.

The origins of modern computing are complex, with breakthroughs in multiple fields eventually converging into a single technology. Alan Turing and Claude Shannon provided much of the theoretical basis for digital computing in the 1930s and 40s. Yet the vacuum tube technology at the time only allowed for big, clunky machines that were very limited.

A hardware breakthrough came in 1947, when John Bardeen, William Shockley and Walter Brattain invented the transistor, followed by Jack Kilby and Robert Noyce's development of the integrated circuit in the late 1950s. The first computers were connected to the Internet a decade later and, a generation after that, Tim Berners-Lee invented the World Wide Web.

All of this happened very slowly but, by the time Mark Zuckerberg became aware of it all, it was just part of the landscape. Much like older generations grew up with the Interstate Highway System and took for granted that they could ride freely on it, Millennial hackers grew up in a period of technological, not to mention political, stability.

The Dangers Of Disruption

Mark Zuckerberg founded Facebook with a bold idea. “We believe that a more open world is a better world because people with more information can make better decisions and have a greater impact,” he wrote. That vision was central to how he built the company and its products. He believed that enabling broader and more efficient communication would foster a deeper and more complete understanding.

Yet the world looks much different when your vantage point is a technology company in Menlo Park, California than it does from, say, a dacha outside Moscow. If you are an aging authoritarian who is somewhat frustrated by your place in the world rather than a young, hubristic entrepreneur, you may take a dimmer view of things.

For many, if not most, people on earth, the world is often a dark and dangerous place and the best defense is often to go on offense. From that vantage point, an open information system is less an opportunity to promote better understanding and more of a vulnerability you can leverage to exploit your enemy.

In fact, the House of Representatives Committee on Intelligence found that agents of the Russian government used the open nature of Facebook and other social media outlets to spread misinformation and sow discord. That’s the problem with moving fast and breaking things. If you’re not careful, you inevitably end up breaking something important.

This principle will become even more important in the years ahead as the potential for serious disruption increases markedly.

The Four Disruptive Shifts Of The Next Decade

While the era that shaped millennials like Mark Zuckerberg was mostly stable, the next decade is likely to be one of the most turbulent in history, with massive shifts in demography, resources, technology and migration. Each one of these has the potential to be destabilizing, the confluence of all four courts disaster and demands that we tread carefully.

Consider the demographic shift caused by the Millennials and Gen Z’ers coming of age. The last time we had a similar generational transition was with the Baby Boomers in the 1960s, which saw more than its share of social and political strife. The shift in values that will take place over the next ten years or so is likely to be similar in scale and scope.

Yet that’s just the start. We will also be shifting in resources from fossil fuels to renewables, in technology from bits to atoms and in migration globally from south to north and from rural to urban areas. The last time we had so many important structural changes going on at once it was the 1920s and that, as we should remember, did not turn out well.

It’s probably no accident that today, much like a century ago, we seem to yearn for “a return to normalcy.” The past two decades have been exhausting, with global terrorism, a massive financial meltdown and now a pandemic fraying our nerves and heightening our sense of vulnerability.

Still, I can’t help feeling that the lessons of the recent past can serve us well in creating a better future.

We Need To Rededicate Ourselves To Tackling Grand Challenges

In Daniel Kahneman’s book, Thinking, Fast and Slow, he explained that we have two modes of thinking. The first is fast and intuitive. The second is slow and deliberative. His point wasn’t that one was better than the other, but that both have their purpose and we need to learn how to use both effectively. In many ways, the two go hand-in-hand.

One thing that is often overlooked is that to think fast effectively often takes years of preparation. Certain professions, such as surgeons and pilots, train for years to hone their instincts so that they will be able to react quickly and appropriately in an emergency. In many ways, you can’t think fast without first having thought slow.

Innovation is the same way. We were able to develop coronavirus vaccines in record time because of the years of slow, painstaking work by Katalin Karikó and others like her, much like how Mark Zuckerberg was able to “move fast and break things” because of the decades of breakthroughs it took to develop the technology that he “hacked.”

Today, as the digital era is ending, we need to rededicate ourselves to innovating slow. Just as our investment in things like the Human Genome Project has returned hundreds of times what we put into it, our investment in the grand challenges of the future will enable countless new (hopefully more modest) Zuckerbergs to wax poetic about "hacker culture."

Innovation is never a single event. It is a process of discovery, engineering and transformation and those things never happen in one place or at one time. That’s why we need to innovate fast and slow, build healthy collaborations and set our sights a bit higher.

— Article courtesy of the Digital Tonto blog
— Image credit: Wikimedia Commons


Change Management Needs to Change

Change Management Needs to Change

GUEST POST from Greg Satell

In 1983, McKinsey consultant Julien Phillips published a paper in the journal Human Resource Management that described an 'adoption penalty' for firms that didn't adapt to changes in the marketplace quickly enough. His ideas became McKinsey's first change management model, which it sold to clients.

But consider that research shows in 1975, during the period Phillips studied, 83% of the average US corporation’s assets were tangible assets, such as plant, machinery and buildings, while by 2015, 84% of corporate assets were intangible, such as licenses, patents and research. Clearly, that changes how we need to approach transformation.

When your assets are tangible, change is about making strategic decisions, such as building factories, buying new equipment and so on. Yet when your assets are intangible, change is connected to people—what they believe, how they think and how they act. That’s a very different matter and we need to reexamine how we approach transformation and change.

The Persuasion Model Of Change

Phillips' point of reference for his paper on organizational change was a comparison of two companies, NCR and Burroughs, and how they adapted to changes in their industry between 1960 and 1975. Phillips was able to show that during that time, NCR paid a high price for its inability to adapt to change while its competitor, Burroughs, prospered.

He then used that example to outline a general four-part model for change:

  • Creating a sense of concern
  • Developing a specific commitment to change
  • Pushing for major change
  • Reinforcing and consolidating the new course

Phillips' work kicked off a number of similar approaches, the most famous of which is probably Kotter's 8-step model. Yet despite the variations, they all follow a similar pattern. First you need to create a sense of urgency, then you devise a vision for change, communicate the need for it effectively and convince others to go along.

The fundamental assumption of these models is that if people understand the change that you seek, they will happily go along. Yet my research indicates exactly the opposite. In fact, it turns out that people don't like change and will often work actively to undermine it. Merely trying to be more persuasive is unlikely to get you very far.

This is even more true when the target of the change is people themselves than when the change involves some sort of strategic asset. That’s probably why more recent research from McKinsey has found that only 26% of organizational transformations succeed.

Shifting From Hierarchies To Networks

Clearly, the types of assets that make up an enterprise aren't the only thing that has changed over the past half-century. The structure of our organizations has also shifted considerably. The firms of Phillips' and Kotter's era were largely hierarchical. Strategic decisions were made at the top and carried out by others below.

Yet there is significant evidence that suggests that networks outperform hierarchies. For example, in Regional Advantage AnnaLee Saxenian explains that Boston-based technology firms, such as DEC and Data General, were vertically integrated and bound employees through non-compete contracts. Their Silicon Valley competitors such as Hewlett Packard and Sun Microsystems, on the other hand, embraced open technologies, built alliances and allowed their people to job hop.

The Boston-based companies, which dominated the minicomputer industry, were considered to be very well managed, highly efficient and innovative firms. However, when technology shifted away from minicomputers, their highly stable, vertically integrated structure left them completely cut off from the knowledge they would need to compete. The highly connected Silicon Valley firms, on the other hand, thrived.

Studies have found similar patterns in the German auto industry, among currency traders and even in Broadway plays. Wherever we see significant change today, it tends to happen side-to-side in networks rather than top-down in hierarchies.

Flipping The Model

When Barry Libenson first arrived at Experian as Global CIO in 2015, he knew that the job would be a challenge. At one of the world's largest data companies, with leading positions in the credit, automotive and healthcare markets, the CIO's role is especially crucial for driving the business. He was also new to the industry and needed to get up the learning curve quickly.

So he devoted his first few months at the firm to looking around, talking to people and taking the measure of the place. "I especially wanted to see what our customers had on their roadmap for the next 12-24 months," he told me, and everywhere he went he heard the same thing: they wanted access to real-time data.

As an experienced CIO, Libenson knew a cloud computing architecture could solve that problem, but there were concerns that would need to be addressed. First, many insiders worried that moving from batch-processed credit reports to real-time access would undermine Experian's business model. There were concerns about cybersecurity. The move would also necessitate a shift to agile product management, which would be controversial.

As CIO, Libenson had a lot of clout and could have, as traditional change management models suggest, created a “sense of urgency” among his fellow senior executives and then gotten a commitment to the change he sought. After the decision had been made, they then would have been able to design a communication campaign to persuade 16,000 employees that the change was a good one. The evidence suggests that effort would have failed.

Instead, he flipped the model and began working with a small team that was already enthusiastic about the move. He created an “API Center of Excellence” to help willing project managers to learn agile development and launch cloud-enabled products. After about a year, the program had gained significant traction and after three years the transformation to the cloud was complete.

Becoming The Change That You Want To See

The practice of change management got its start because businesses needed to adapt. The shift that Burroughs made to electronics was no small thing. Investments needed to be made in equipment, technology, training, marketing and so on. That required a multi-year commitment. Its competitor, NCR, was unable or unwilling to change and paid a dear price for it.

Yet change today looks much more like Experian's shift to the cloud than it does Burroughs' move into electronics. It's hard, if not impossible, to persuade a product manager to make a shift if she's convinced it will kill her business model, just as it's hard to get a project manager to adopt agile methodologies if she feels she's been successful with more traditional methods.

Libenson succeeded at Experian not because he was more persuasive, but because he had a better plan. Instead of trying to convince everyone at once, he focused his efforts on empowering those who were already enthusiastic. As their efforts became successful, others joined them and the program gathered steam. Those who couldn't keep up got left behind.

The truth is that today we can’t transform organizations unless we transform the people in them and that’s why change management has got to change. It is no longer enough to simply communicate decisions made at the top. Rather, we need to put people at the center and empower them to succeed.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels
