Author Archives: Greg Satell

About Greg Satell

Greg Satell is a popular speaker and consultant. His latest book, Cascades: How to Create a Movement That Drives Transformational Change, is available now. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.

Outsmarting Those Who Want to Kill Change

GUEST POST from Greg Satell

Look at anyone who has truly changed the world and you'll find that they encountered significant resistance. In fact, while researching my book Cascades, I found that every major change effort, whether a political revolution, a social movement or an organizational transformation, had people who worked to undermine it in ways that were dishonest, underhanded and deceptive.

Unfortunately, we often don’t realize that there is an opposition campaign underway until it’s too late. People rarely voice open hostility to change. Opponents might even profess some excitement at our idea conceptually, but once there is a possibility of real action moving forward, they dig in their heels.

None of this means that change can’t happen. What it does mean is that, if you expect to bring about meaningful change, planning to overcome resistance has to be a primary design constraint and an organizing principle. Once you understand that, you can begin to move forward, identify shared values, design effective tactics and, ultimately, create lasting change.

Start With a Local Majority

Consider a famous set of conformity studies performed by the psychologist Solomon Asch in the 1950s. The design was simple, but ingenious. He merely showed people pairs of cards, asking them to match the length of a single line on one card with one of three lines on an adjacent card. The correct answer was meant to be obvious.

However, as the experimenter went around the room, one person after another gave the same wrong answer. When it reached the final person in the group (in truth, the only real subject; the rest were confederates), that person conformed to the majority opinion the vast majority of the time, even when it was obviously wrong!

Majorities don't just rule, they also influence, especially local majorities. The effect is even more powerful when the issue at hand is not as clear-cut as the length of a line on a card. Also, more recent research suggests that the effect applies not only to people we know well, but that we are also influenced even by second- and third-degree relationships.

The key point here is that we get to choose who we expose an idea to. If you start with five people in a room, for example, you only need three advocates to start with a majority. That may not seem consequential, but consider that the movement that overthrew Serbian dictator Slobodan Milošević started with five kids in a cafe, and you can see how even the most inauspicious beginnings can lead to revolutionary outcomes.

You can always expand a majority out, but once you’re in the minority you are likely to get immediate pushback and will have to retrench. That’s why the best place to start is with those who are already enthusiastic about your idea. Then you can empower them to be successful and bring in others who can bring in others still.

Listen to Your Opposition, But Don’t Engage Them

People who are passionate about change often see themselves as evangelists. Much like Saint Paul in the Bible, they thrive on winning converts and seek out those who most adamantly oppose their idea in an attempt to change their minds. This is almost always a mistake. Directly engaging with staunch opposition is unlikely to achieve anything other than exhausting and frustrating you.

However, while you shouldn't directly engage your fiercest critics, you obviously can't act like they don't exist. On the contrary, you need to pay close attention to them. In fact, by listening to people who hate your idea, you can identify early flaws, which gives you the opportunity to fix them before they can be used against you in any serious way.

One of the most challenging things about managing a change effort is balancing the need to focus on a small circle of dedicated enthusiasts while still keeping your eyes and ears open. Once you become too insular, you will quickly find yourself out of touch. It's not enough to sing to the choir; you also need to get out of the church and mix with the heathens.

Perhaps the most important reason to listen to your critics is that they will help you identify shared values. After all, they are trying to convince the same people in the middle that you are. Very often you’ll find that, by deconstructing their arguments, you can use their objections to help you make your case.

Shift From Differentiating Values to Shared Values

Many revolutionaries, corporate and otherwise, are frustrated marketers. They want to differentiate themselves in the marketplace of ideas through catchy slogans that “cut through.” It is by emphasizing difference that they seek to gin up enthusiasm among their most loyal supporters.

That was certainly true of LGBTQ activists, who marched through city streets shouting slogans like “We’re here, we’re queer and we’d like to say hello.” They led a different lifestyle and wanted to demand that their dignity be recognized. More recently, Black Lives Matter activists made calls to “defund the police,” which many found to be shocking and anarchistic.

Corporate change agents tend to fall into a similar trap. They rant on about “radical” innovation and “disruption,” ignoring the fact that few like to be radicalized or disrupted. Proponents of agile development methods often tout their manifesto, oblivious to the reality that many outside the agile community find the whole thing a bit weird and unsettling.

While emphasizing difference may excite people who are already on board, it is through shared values that you bring people in. So it shouldn’t be a surprise that the fight for LGBTQ rights began to gain traction when activists started focusing on family values. Innovation doesn’t succeed because it’s “radical,” but when it solves a meaningful problem. The value of Agile methods isn’t a manifesto, but the fact that they can improve performance.

Create and Build On Success

Starting with a small group of enthusiastic apostles may seem insignificant. In fact, look at almost any popular approach to change management and the first thing on the to-do list is “create a sense of urgency around change” or “create an awareness of the need for change.” But if that really worked, the vast majority of organizational transformations wouldn't fail, and we know that they do.

Once you accept that resistance to change needs to be your primary design constraint, it becomes clear that starting out with a massive communication campaign will only serve to alert your opponents that they had better start undermining you quickly, or you might actually succeed in bringing change about.

That’s why we always advise organizations to focus on a small but meaningful keystone change that can demonstrate success. For example, one initiative at Procter & Gamble started out with just three mid-level executives focused on improving one process. That kicked off a movement that grew to over 2,500 employees in 18 months. Every successful large enterprise transformation we looked at had a similar pattern.

That, in truth, is the best way to outsmart the opponents of change. Find a way to make it successful, no matter how small that initial victory may be, then empower others to succeed as well. It’s easy to argue against an idea; you merely need to smother it in its cradle. Yet a concept that’s been proven to work and has inspired people to believe in it is an idea whose time has come.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Preparing for Organizational Transformation in a Post-COVID World

GUEST POST from Greg Satell

The Covid-19 pandemic demanded we transform across multiple planes. Businesses had to abruptly shift to empower remote work. Professionals were suddenly trading commutes and in-person meetings for home schooling and “Zoom fatigue.” Leaders needed to reimagine every system, from storefronts to supply chains to educational institutions.

It was a brutal awakening, but we can now see the light at the end of the tunnel. In fact, a recent McKinsey Global Survey found that 73% of executives believed that conditions would be moderately or substantially better in the next year. Globally, the World Bank predicts 4% growth in 2021, a marked improvement over 2020’s 4.3% drop.

Still, while the crisis may be ending, the need for fundamental change has not. Today leaders must reinvent their organizations on multiple fronts, including technological, environmental, social and skills-based transformations. These pose challenges for any organization and research suggests that traditional approaches are unlikely to succeed. Here’s what will:

Empowering Small Groups

In 1998 five friends met in a cafe in Belgrade and formed a revolutionary movement. Two years later the brutal Serbian dictator, Slobodan Milošević, was overthrown. In 2007, a lean manufacturing initiative at Wyeth Pharmaceuticals began with a single team in one plant. In 18 months it spread to more than 17,000 employees across 25 sites worldwide and resulted in a more than 25% reduction in costs across the company.

More recently, in 2017, three mid-level employees at Procter & Gamble decided to take it upon themselves, with no budget and no significant executive sponsorship, to transform a single process. It took them months, but they were able to streamline it from a matter of weeks to mere hours. Today, their PxG initiative for process improvement has become a movement for reinvention that encompasses thousands of their colleagues worldwide.

Traditionally, managers launching a new initiative have aimed to start with a bang. They work to gain approval for a sizable budget as a sign of institutional commitment, recruit high-profile executives, arrange a big “kick-off” meeting and look to move fast, gain scale and generate some quick wins. All of this is designed to create a sense of urgency and inevitability.

Yet that approach can backfire. Many change leaders who start with a “shock and awe” approach find that, while they have rallied some to their cause, they have also inspired an insurgency that bogs things down. For any significant change, there will always be some who will oppose the idea and they will resist it in ways that are often insidious and not immediately obvious.

The dangers of resistance are especially acute when, as is often the case today, you need to drive transformation on multiple fronts. That’s why it’s best to start with small groups of enthusiasts that you can empower to succeed, rather than try to push an initiative on the masses that you’ll struggle to convince.

Weaving A Network Of Loose Connections

The sociologist Mark Granovetter envisioned collective action as a series of resistance thresholds. For any idea or initiative, some will be naturally enthusiastic and have minimal or no resistance, some will have some level of skepticism and others will be dead set against it.

It’s not hard to see why focusing initial efforts on small groups with low resistance thresholds can be effective. In the examples above, the Serbian activists, the lean manufacturing pilot team at Wyeth and the three mid-level executives at Procter & Gamble were all highly motivated and willing to put in the hard work to overcome initial challenges and setbacks.
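Granovetter’s threshold idea is concrete enough to sketch in code. The toy simulation below is our illustration, not something from the article: each person has a personal threshold (how many others must already be participating before they join), and the cascade repeats until no one else tips. It shows why a movement that recruits its lowest-threshold enthusiasts first can sweep a whole population, while a single gap in the threshold chain stalls it at the first mover.

```python
def cascade(thresholds):
    """Iterate Granovetter's threshold model to a fixed point.

    thresholds[i] is the number of others who must already be
    participating before person i joins. Returns the final count
    of participants.
    """
    joined = set()
    changed = True
    while changed:
        changed = False
        for person, t in enumerate(thresholds):
            if person not in joined and len(joined) >= t:
                joined.add(person)
                changed = True
    return len(joined)

# Thresholds 0,1,2,...,9: each new recruit tips the next person,
# so the entire group of ten ends up participating.
print(cascade(list(range(10))))                   # 10

# Raise the second threshold from 1 to 2 and the chain breaks:
# only the lone zero-threshold enthusiast ever acts.
print(cascade([0, 2, 2] + list(range(3, 10))))    # 1
```

The second run is the interesting one: the distribution of thresholds barely changed, yet the outcome collapsed from a full cascade to a single participant, which is why starting with small groups of low-resistance enthusiasts matters so much.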

To scale, however, transformation efforts must be able to connect to those who have at least some level of reluctance. One highly effective strategy to scale change is to create “cooptable” resources in the form of workshops, training materials and other assets. For example, to scale a cloud transformation initiative at Experian, change leaders set up an “API Center of Excellence” to make it as easy as possible for product managers to try cloud-based offerings.

Another helpful practice is to update stakeholders about recent events and share best practices. In One Mission, Chris Fussell describes in detail the O&I forum he and General Stanley McChrystal used in Iraq. The Serbian activists held regular “network meetings” that served a similar purpose. More recently, Yammer groups, Zoom calls and other digital media have proven effective in this regard.

What’s most important is that people are allowed to take ownership of a change initiative and be able to define it for themselves, rather than being bribed or coerced with incentive schemes or mandates. You can’t force authentic change. Unless people see genuine value in it, it will never gain any real traction.

Indoctrinate Shared Values And Shared Purpose

One of the biggest misconceptions about transformation efforts is that success begets more success. In practice, the opposite is often true. An initial success—especially a visible one—is likely to be met with a groundswell of opposition. We’ve seen this writ large in political revolutions, where initial victories in places like Egypt, the Maldives and Burma were later reversed, but it is no less common in a corporate or organizational context.

In fact, we are often called into an engagement 6-12 months after an initiative starts because change leaders are bewildered that their efforts, which seemed so successful at first, have suddenly and mysteriously run aground. In actuality, it was those initial victories that activated latent opposition, because they made change that had seemed unlikely a real possibility.

The truth is that lasting change can never be built on any particular technology, program or policy, but rather must focus on shared values and a shared sense of mission. The Serbian activists focused not on any particular ideology, but on patriotism. At Wyeth, the change leaders made sure not to champion any specific technique, but tangible results. The leaders of the PxG initiative at Procter & Gamble highlighted the effect clunky and inefficient processes have on morale.

Irving Wladawsky-Berger, who was one of Lou Gerstner’s key lieutenants in IBM’s historic turnaround in the 90s, made a similar point to me. “Because the transformation was about values first and technology second, we were able to continue to embrace those values as the technology and marketplace continued to evolve,” he said.

Redefining Agility

In Built to Last, management guru Jim Collins suggested that leaders should develop a “big hairy audacious goal” (BHAG) to serve as a unifying vision for their enterprise. He pointed to examples such as Boeing’s development of the 707 commercial jetliner and Jack Welch’s vision that every GE business should be #1 or #2 in its category as inspiring “moonshots.”

Yet the truth is that we no longer have the luxury of focusing transformation in a single direction, but must bring about change along multiple axes simultaneously. Leaders today can’t choose whether to leverage cutting-edge technologies or become more sustainable, nor can they choose between a highly skilled workforce and one that is diverse and inclusive.

The kind of sustained, multifaceted brand of change we need today cannot be mandated from a mountaintop but must be inspired to take root throughout an enterprise. We need to learn how to empower small loosely connected groups with a shared sense of mission and purpose. To truly take hold, people need to embrace change and they do that for their own reasons, not for ours.

That’s what will be key to making the transformations ahead successful. The answer doesn’t lie in any specific strategy or initiative, but in how people are able to internalize the need for change and transfer ideas through social bonds. A leader’s role is no longer to plan and direct action, but to inspire and empower belief.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Digital Era Replaced by an Age of Molecular Innovation

GUEST POST from Greg Satell

It’s become strangely fashionable for digerati to mourn the death of innovation. “There’s nothing new,” has become a common refrain for which they blame venture capitalists, entrepreneurs and other digerati they consider to be less enlightened than themselves. They yearn for a lost age when things were better and more innovative.

What they fail to recognize is that the digital era is ending. After more than 50 years of exponential growth, the technology has matured and advancement has naturally slowed. While it is true that there are worrying signs that things in Silicon Valley have gone seriously awry and those excesses need to be curtailed, there’s more to the story.

The fact is that we’re on the brink of a new era of innovation and, while digital technology will be an enabling factor, it will no longer be center stage. The future will not be written in the digital language of ones and zeroes, but in that of atoms, molecules, genes and proteins. We do not lack potential or possibility, what we need is more imagination and wonder.

The End Of Moore’s Law

In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper which predicted that computing power would double about every two years. This idea, known as Moore’s Law, has driven the digital revolution for a half century. It’s what’s empowered us to shrink computers from huge machines to tiny, but powerful, devices we carry in our pockets.

Yet there are limits to everything. The simple truth is that atoms are only so small and the speed of light is only so fast. That puts a limit on how many transistors we can cram onto a silicon wafer and how fast electrons can zip around the logic gates we set up for them. At this point, Moore’s Law is effectively over.

That doesn’t mean that advancement will stop altogether. There are other ways to speed up computing. The problem is that they all come with tradeoffs. New architectures, such as quantum and neuromorphic computing, for instance, require new programming languages, new logical approaches and very different algorithmic strategies than we’re used to.

So for the next decade or two we’re likely to see a heterogeneous computing environment emerge, in which we combine different architectures for different tasks. For example, we will be augmenting traditional AI systems with techniques like quantum machine learning. It is not only possible, but fairly likely, that these types of combinations will result in an exponential increase in capability.

A Biological Revolution

Moore’s Law has become essentially shorthand for exponential improvement in any field. Anytime we see a continuous doubling of efficiency, we call it “the Moore’s Law of X.” Yet since the Human Genome Project was completed in 2003, advancement in genetic sequencing has far outpaced what has happened in the digital arena.

A possibly even bigger development occurred in 2012, when Jennifer Doudna and her colleagues discovered how CRISPR could revolutionize gene editing. Suddenly, work that would have taken genetic engineers weeks could be done in hours, at a fraction of the cost and with much greater accuracy, and the new era of synthetic biology had begun.

The most obvious consequence of this new era is the Covid-19 vaccine, which was designed in a matter of mere days rather than the years such work has traditionally taken. The mRNA technology used to create two of the vaccines also holds promise for cancer treatment, and CRISPR-based approaches have been applied to cure sickle cell and other diseases.

Yet as impressive as the medical achievements are, they make up only a fraction of the innovation that synthetic biology is making possible. Scientists are working on programming microorganisms to create new carbon-neutral biofuels and biodegradable plastics. It may very well revolutionize agriculture and help feed the world.

The truth is that the biological revolution is roughly where computing was in the 1970s and ’80s, and we are just beginning to understand the potential. We can expect progress to accelerate for decades to come.

The Infinite World Of Atoms

Anyone who has regularly read the business press over the past 20 years or so would naturally conclude that we live in a digital economy. Certainly, tech firms dominate any list of the world’s most valuable companies. Yet take a closer look and you will find that information and communication as a sector makes up only 6% of GDP in advanced countries.

The truth is that we still live very much in a world of atoms and we spend most of our money on what we eat, wear, ride and live in. Any real improvement in our well-being depends on our ability to shape atoms to our liking. As noted above, reprogramming genetic material in cells to make things for us is one way we can do that, but not the only one.

In fact, there is a revolution in materials science underway. Much as in genomics, scientists are learning how to use computers to understand materials on a fundamental level and figure out how to design them far better. In some cases, researchers are able to discover new materials hundreds of times more efficiently than before.

Unlike digital or biological technologies this is largely a quiet revolution with very little publicity. Make no mistake, however, our newfound ability to create advanced materials will transform our ability to create and build everything from vastly more efficient solar panels to lighter, stronger and more environmentally friendly building materials.

The Next Big Thing Always Starts Out Looking Like Nothing At All

The origins of digital computing can be traced back at least a century, to the rise and fall of logical positivism, Turing’s “machine,” the invention of the transistor, the integrated circuit and the emergence of the first modern PC at Xerox PARC in the early 1970s. Yet there wasn’t a measurable impact from computing until the mid-1990s.

We tend to assume that we’ll notice when something important is afoot, but that’s rarely the case. The truth is that the next big thing always starts out looking like nothing at all. It doesn’t appear fully bloomed, but usually incubates for years—and often decades—by scientists quietly working in labs and by specialists debating at obscure conferences.

So, yes, after 50 years the digital revolution has run out of steam, but that shouldn’t blind us to the incredible opportunities that are before us. After all, a year ago very few people had heard of mRNA vaccines, but that didn’t make them any less powerful or important. There is no shortage of nascent technologies that can have just as big of an impact.

The simple fact is that innovation is not, and never has been, about what kind of apps show up on our smartphone screens. The value of a technology is not measured in how a Silicon Valley CEO can dazzle an audience on stage, but in our capacity to solve meaningful problems and, as long as there are meaningful problems to solve, innovation will live on.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

We Need a More Biological View of Technology

GUEST POST from Greg Satell

It’s no accident that Mary Shelley’s novel, Frankenstein, was published in the early 19th century, at roughly the same time as the Luddite movement was gaining momentum. It was in that moment that people first began to take stock of the technological advances that brought about the first Industrial Revolution.

Since then we have seemed to oscillate between techno-utopianism and dystopian visions of machines gone mad. For every “space odyssey” promising an automated, enlightened future, there seems to be a “Terminator” series warning of our impending destruction. Neither scenario has ever come to pass and it is unlikely that either ever will.

What both the optimists and the Cassandras miss is that technology is not something that exists independently from us. It is, in fact, intensely human. We don’t merely build it, but continue to nurture it through how we develop and shape ecosystems. We need to go beyond a simple engineering mindset and focus on a process of revealing, building and emergence.

1. Revealing

World War II brought the destructive potential of technology to the fore of human consciousness. As deadly machines ravaged Europe and bombs of unimaginable power exploded in Asia, the whole planet was engulfed in a maelstrom of human design. It seemed that the technology we had built had become a modern version of Frankenstein’s monster, destined from the start to turn on its master.

Yet the German philosopher Martin Heidegger saw things differently. In his 1954 essay, The Question Concerning Technology, he described technology as akin to art, in that it reveals truths about the nature of the world, brings them forth and puts them to some specific use. In the process, human nature and its capacity for good and evil are also revealed.

He offers the example of a hydroelectric dam, which uncovers a river’s energy and puts it to use making electricity. In much the same sense, Mark Zuckerberg did not so much “build” a social network at Facebook, but took natural human tendencies and channeled them in a particular way. That process of channeling, in turn, reveals even more.

That’s why, as I wrote in Mapping Innovation, innovation is not about coming up with new ideas, but identifying meaningful problems. It’s through exploring tough problems that we reveal new things and those new things can lead to important solutions. Not all who wander are lost.

2. Building

The concept of revealing would seem to support the view of Shelley and the Luddites. It suggests that once a force is revealed, we are powerless to shape its trajectory. J. Robert Oppenheimer, upon witnessing the world’s first nuclear explosion as it shook the plains of New Mexico, expressed a similar view. “Now I am become Death, the destroyer of worlds,” he said, quoting the Bhagavad Gita.

Yet in another essay, Building Dwelling Thinking, Heidegger explains that what we build for the world is highly dependent on our interpretation of what it means to live in it. The relationship is, of course, reflexive. What we build depends on how we wish to dwell and that act, in and of itself, shapes how we build further.

Again, Mark Zuckerberg and Facebook are instructive. His insight into human nature led him to build his platform based on what he saw as The Hacker Way and resolved to “move fast and break things.” Unfortunately, that approach led to his enterprise becoming highly vulnerable to schemes by actors such as Cambridge Analytica and the Russian GRU.

Yet technology is not, by itself, determinant. Facebook is, to a great extent, the result of conscious choices that Mark Zuckerberg made. If he had a different set of experiences than that of a young, upper-middle-class kid who had never encountered a moment of true danger in his life, he may have been more cautious and chosen differently.

History has shown that those who build powerful technologies can play a vital role in shaping how they are used. Many of the scientists of Oppenheimer’s day became activists, preparing a manifesto that highlighted the dangers of nuclear weapons, which helped lead to the Partial Test Ban Treaty. In much the same way, the Asilomar Conference, held in 1975, led to important constraints on genomic technologies.

3. Emergence

No technology stands alone, but combines with other technologies to form systems. That’s where things get confusing because when things combine and interact they become more complex. As complexity theorist Sam Arbesman explained in his book, Overcomplicated, this happens because of two forces inherent to the way that technologies evolve.

The first is accretion. A product such as an iPhone represents the accumulation of many different technologies, including microchips, programming languages, gyroscopes, cameras, touchscreens and lithium ion batteries, just to name a few. As we figure out more tasks an iPhone can perform, more technologies are added, building on each other.

The second force is interaction. Put simply, much of the value of an iPhone is embedded in how it works with other technologies to make tasks easier. We want to use it to access platforms such as Facebook to keep in touch with friends, Yelp so that we can pick out a nice restaurant where we can meet them and Google Maps to help us find the place. These interactions, combined with accretion, create an onward march towards greater complexity.

It is through ever increasing complexity that we lose control. Leonard Read pointed out in his classic essay, I, Pencil, that even an object as simple as a pencil is far too complex for any single person to produce by themselves. A smartphone—or even a single microchip—is exponentially more complex.

People work their entire lives to become experts on even a minor aspect of a technology like an iPhone, a narrow practice of medicine or an obscure facet of a single legal code. As complexity increases, so does specialization, making it even harder for any one person to see the whole picture.

Shaping Ecosystems And Taking A Biological View

In 2013, I wrote that we are all Luddites now, because advances in artificial intelligence had become so powerful that anyone who wasn’t nervous didn’t really understand what was going on. Today, as we enter a new era of innovation and technologies become infinitely more powerful, we are entering a new ethical universe.

Typically, the practice of modern ethics has been fairly simple: Don’t lie, cheat or steal. Yet with many of our most advanced technologies, such as artificial intelligence and genetic engineering, the issue isn’t so much about doing the right thing, but figuring out what the right thing is when the issues are novel, abstruse and far reaching.

What’s crucial to understand, however, is that it’s not any particular invention, but ecosystems that create the future. The Luddites were right to fear textile mills, which did indeed shatter their way of life. However, the mill was only one technology; when it was combined with other advances, such as improvements in agriculture, labor unions and modern healthcare, lives improved greatly.

Make no mistake, our future will be shaped by our own choices, which is why we need to abandon our illusions of control. We need to shift from an engineering mindset, where we try to optimize for a limited set of variables and take a more biological view, growing and shaping ecosystems of talent, technology, information and cultural norms.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

To Change the World You Must First Learn Something About It

GUEST POST from Greg Satell

Anybody who has waited for a traffic light to change, in the middle of the night at an empty intersection, knows the urge to rebel. There is always a tension between order and freedom. While we intuitively understand the need for order to constrain others, we yearn for the freedom to do what we want and to seek out a vision and sense of meaning in our lives.

Yet as we have seen over the past decade, attempts to overturn the existing order usually fail. The Tea Party erupted in 2009, but had mostly sputtered out by 2014. #Occupy protests and Black Lives Matter sent people into the streets, but achieved little, if anything. Silicon Valley “unicorns” like WeWork routinely go up in flames.

Not all revolutions flop, though. In fact, some succeed marvelously. What has struck me after researching transformational change over nearly two decades is how similar successful efforts are. They all experience failures along the way. What makes the difference is their ability to learn, adapt and change. That’s what allows them to prevail.

Five Kids Meet In A Cafe

One day in 1998, a group of five friends met in a cafe in Belgrade. Although still in their 20s, they were already experienced activists, and most of what they had experienced was failure. They had taken part in student protests against the war in Bosnia in 1992, as well as in the larger uprisings in response to election fraud in 1996. Neither had achieved much.

Having had time to reflect on their successes and failures, they hatched a new plan. They knew from their earlier efforts that they could mobilize people and get them to the polls for the presidential election in 2000. They also knew that Slobodan Milošević, who ruled the country with an iron hand, would try to steal the election, just as he had in 1996.

So that’s what they planned for.

The next day, six friends joined the five from the previous day and, together, they formed the original 11 members of Otpor, the movement that would topple the Milošević regime. They began slowly at first, performing pranks and street theater. But within two years it grew to over 70,000 members, with chapters all over Serbia. Milošević was ousted in the Bulldozer revolution in 2000. He would die in his prison cell at The Hague in 2006.

What Otpor came to understand is that it takes small groups, loosely connected, but united by a shared purpose to drive transformational change. The organization was almost totally decentralized, with just a basic “network meeting” to share best practices every two weeks. Nevertheless, by empowering those smaller groups and giving them a shared sense of mission, they were able to prevail over seemingly impossible odds.

Three Mid-Level Executives See A Problem That Needs Fixing

In 2017, John Gadsby and two colleagues in Procter & Gamble’s research organization saw that there was a problem. Although cutting-edge products were being developed all around them, the processes at the 180-year-old firm were often antiquated, making it sometimes difficult to get even simple things done.

So they decided to do something about it. They chose a single process, which involved setting up experiments to test new product technologies. It usually took weeks and was generally considered a bottleneck. Using digital tools, however, they were able to cut it down to just a few hours. It was a serious accomplishment, and the three were recognized with a “Pathfinder” award by the company’s CTO.

Every change starts out with a grievance, such as the annoyance of being bogged down by inefficient processes. The first step forward is to come up with a vision for how you would like things to be different. However, you can never get there in a single step, which is why you need to identify a single keystone change to show others that change is really possible.

That’s exactly what the team at P&G did. Once they showed that one process could be dramatically improved, they were able to get the resources to start improving others. Today, more than 2,500 of their colleagues have joined their movement for process improvement, called PxG, and more than 10,000 have used their applications platform.

As PxG has grown it has also been able to effectively partner with other likeminded initiatives within the company, reinforcing not only its own vision, but those of others that share its values as well.

The One Engineer Who Simply Refused To Take “No” For An Answer

In the late 1960s, Gary Starkweather was in trouble with his boss. As an engineer in Xerox’s long-range xerography unit, he saw that laser printing could be a huge business opportunity. Unfortunately, his manager at the company’s research facility in upstate New York was focused on improving the current product line, not looking to start a new one.

The argument got so heated that Starkweather’s job was soon in jeopardy. Fortunately, his rabble-rousing caught the attention of another division within the company, the Palo Alto Research Center (PARC), which was less interested in operational efficiency than in inventing an entirely new future. They welcomed Starkweather into their ranks with open arms.

Unlike his old lab, PARC’s entire mission was to create the future. One of the technologies it had developed, bitmapping, would revolutionize computer graphics, but there was no way to print the images out. Starkweather’s work was exactly what they were looking for and, with Xerox’s copier business in decline, would eventually save the company.

The truth is that good ideas fail all the time and it often has little to do with the quality of the idea, the passion of those who hold it or its potential impact, but rather who you choose to start with. In the New York lab, few people bought into Starkweather’s idea, but in Palo Alto, almost everyone did. In that fertile ground, it was able to grow, mature and triumph.

When trying to get traction for an idea, you always want to be in the majority, even if it is only a local majority comprising a handful of people. You can always expand a small majority out, but once you are in the minority you will get immediate pushback and will need to retrench.

The Secret to Subversion

Through my work, I’ve gotten to know truly revolutionary people. My friend Srdja Popović was one of the original founders of Otpor and has gone on to train activists in more than 50 countries. Jim Allison won a Nobel Prize for discovering cancer immunotherapy. Yassmin Abdel-Magied has become an important voice for diversity, equity and inclusion. Many others I profiled in my books, Mapping Innovation and Cascades.

What has always struck me is how different real revolutionaries are from the mercurial, ego-driven stereotypes Hollywood loves to sell us. The truth is that all of those mentioned above are warm, friendly and genuinely nice people who are a pleasure to be around (or were; Gary Starkweather recently passed away).

What I’ve found over the years is that this sense of openness helped them succeed where others failed. In fact, evidence suggests that generosity is often a competitive advantage for very practical reasons. People who are friendly and generous tend to build up strong networks of collaborators, who provide crucial support for getting an idea off the ground.

But most of all it was that sense of openness that allowed them to learn, adapt and identify a path to victory. Changing the world is hard, often frustrating work. Nobody comes to the game with all the answers. In the final analysis, it’s what you learn along the way—and your ability to change yourself in response to what you learn—that makes the difference between triumph and bitter, agonizing failure.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Competing in a New Era of Innovation

Competing in a New Era of Innovation

GUEST POST from Greg Satell

In 1998, the dotcom craze was going at full steam and it seemed like the entire world was turning upside down. So people took notice when economist Paul Krugman wrote that “by 2005 or so, it will become clear that the internet’s impact on the economy has been no greater than the fax machine’s.”

He was obviously quite a bit off base, but these types of mistakes are incredibly common. As the futurist Roy Amara famously put it, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” The truth is that it usually takes about 30 years for a technology to go from an initial discovery to a measurable impact.

Today, as we near the end of the digital age and enter a new era of innovation, Amara’s point is incredibly important to keep in mind. New technologies, such as quantum computing, blockchain and gene editing will be overhyped, but really will change the world, eventually. So we need to do more than adapt, we need to prepare for a future we can’t see yet.

Identify A “Hair-On-Fire” Use Case

Today we remember the steam engine for powering factories and railroads. In the process, it made the first industrial revolution possible. Yet that’s not how it started out. Its initial purpose was to pump water out of coal mines. At the time, it would have been tough to get people to imagine a factory that didn’t exist yet, but pretty easy for owners to see that their mine was flooded.

The truth is that innovation is never really about ideas; it’s about solving problems. So when a technology is still nascent, it doesn’t gain traction in a large, established market, which by definition is already fairly well served, but in a hair-on-fire use case: a problem that somebody needs solved so badly that they almost literally have their hair on fire.

Early versions of the steam engine, such as Thomas Newcomen’s, didn’t work well and were ill-suited to running factories or driving locomotives. Still, flooded mines were a major problem, so many owners were tolerant of glitches and flaws. Later, after James Watt perfected the steam engine, it became more akin to the technology we remember now.

We can see the same principle at work today. Blockchain has not had much impact as an alternative currency, but has gained traction optimizing supply chains. Virtual reality has not really caught on in the entertainment industry, but is making headway in corporate training. That’s probably not where those technologies will end up, but it’s how they make money now.

So in the early stages of a technology, don’t try to imagine how a perfected version would fit in. Instead, find a problem that somebody needs solved so badly right now that they are willing to put up with some inconvenience.

The truth is that the “next big thing” never turns out like people think it will. Putting a man on the moon, for example, didn’t lead to flying cars like in the Jetsons, but instead to satellites that bring events to us from across the world, help us navigate to the corner store and call our loved ones from a business trip.

Build A Learning Curve

Things that change the world always arrive out of context, for the simple reason that the world hasn’t changed yet. So when a new technology first appears, we don’t really know how to use it. It takes time to learn how to leverage its advantages to create an impact.

Consider electricity, which as the economist Paul David explained in a classic paper, was first used in factories to cut down on construction costs (steam engines were heavy and needed extra bracing). What wasn’t immediately obvious was that electricity allowed factories to be designed to optimize workflow, rather than having to be arranged around the power source.

We can see the same forces at work today. Consider Amazon’s recent move to offer quantum computing to its customers through the cloud, even though the technology is so primitive that it has no practical application. Nevertheless, it is potentially so powerful—and so different from digital computing—that firms are willing to pay for the privilege of experimenting with it.

The truth is that it’s better to prepare than it is to adapt. When you are adapting you are, by definition, already behind. That’s why it’s important to build a learning curve early, before a technology has begun to impact your business.

Beware Of Switching Costs

When we look back today, it seems incredible that it took decades for factories to switch from steam to electricity. Besides the extra construction costs to build extra bracing, steam engines were dirty and inflexible. Every machine in the factory needed to be tied to one engine, so if one broke down or needed maintenance, the whole factory had to be shut down.

However, when you look at the investment from the perspective of a factory owner, things aren’t so clear cut. While electricity was relatively more attractive when building a new factory, junking an existing facility to make way for a new technology didn’t make as much sense. So most factory owners kept what they had.
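The owner’s logic becomes clearer if you run the numbers. A minimal sketch (all figures below are hypothetical, chosen purely to illustrate the switching-cost arithmetic, not taken from any historical source):

```python
# Hypothetical numbers, purely for illustration: why a factory owner
# might rationally stick with steam even when electricity is better.

def total_cost(build_cost, annual_operating, years):
    """Total cost of ownership over a planning horizon."""
    return build_cost + annual_operating * years

YEARS = 10

# Keeping the existing steam plant: no new construction, higher running costs.
keep_steam = total_cost(build_cost=0, annual_operating=100_000, years=YEARS)

# Junking the plant for a new electric factory: cheaper to run,
# but the switch itself is expensive.
switch_now = total_cost(build_cost=900_000, annual_operating=60_000, years=YEARS)

print(f"Keep steam:         {keep_steam:,}")
print(f"Switch to electric: {switch_now:,}")
```

Under these assumptions, keeping steam costs 1,000,000 over the decade against 1,500,000 for switching, so the seemingly stubborn owner is actually minimizing cost. The calculation only flips when the old plant wears out, or when operating savings grow large enough to swamp the cost of the switch.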

These types of switching costs still exist today. Consider neuromorphic chips, which are based on the architecture of the human brain and therefore highly suited to artificial intelligence. They are also potentially millions of times more energy efficient than conventional chips. However, existing AI chips also perform very well, can be manufactured in conventional fabs and run conventional AI algorithms, so neuromorphic chips haven’t caught on yet.

All too often, when a new technology emerges we only look at how its performance compares to what exists today and ignore the importance of switching costs—both real and imagined. That’s a big part of the reason we underestimate how long a technology takes to gain traction and underestimate how much impact it will have in the long run.

Find Your Place In The Ecosystem

We tend to see history through the lens of inventions: Watt and his steam engine. Edison and his light bulb. Ford and his assembly line. Yet building a better mousetrap is never enough to truly change the world. Besides the need to identify a use case, build a learning curve and overcome switching costs, every new technology needs an ecosystem to truly drive the future.

Ford’s automobiles needed roads and gas stations, which led to supermarkets, shopping malls and suburbs. Electricity needed secondary inventions, such as home appliances and radios, which created a market for skilled technicians. It is often in the ecosystem, rather than the initial invention, where most of the value is produced.

Today, we can see similar ecosystems beginning to form around emerging technologies. The journal Nature published an analysis which showed that over $450 million was invested in more than 50 quantum startups between 2012 and 2018, but only a handful are actually making quantum computers. The rest are helping to build out the ecosystem.

So for most of us, the opportunities in the post-digital era won’t be creating new technologies themselves, but in the ecosystems they create. That’s where we’ll see new markets emerge, new jobs created and new fortunes to be made.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Innovation Requires Going Both Fast and Slow

Innovation Requires Going Both Fast and Slow

GUEST POST from Greg Satell

In the regulatory filing for Facebook’s 2012 IPO, Mark Zuckerberg included a letter outlining his management philosophy. Entitled, The Hacker Way, it encapsulated much of the zeitgeist. “We have a saying,” he wrote. “‘Move fast and break things.’ The idea is that if you never break anything, you’re probably not moving fast enough.”

At around the same time, Katalin Karikó was quietly plodding away in her lab at the University of Pennsylvania. She had been working on an idea since the early 1990s and it hadn’t amounted to much so far, but it was finally beginning to attract some interest. The next year she would join a small startup named BioNTech to commercialize her work and would continue to chip away at the problem.

Things would accelerate in early 2020, when Karikó’s mRNA technology was used to design a coronavirus vaccine in a matter of mere hours. Just as Daniel Kahneman explained that there are fast and slow modes of thinking, the same can be said about innovating. The truth is that moving slowly is often underrated and that moving fast can sometimes bog you down.

The Luxury Of Stability

Mark Zuckerberg had the luxury of being disruptive because he was working in a mature, stable environment. His “Hacker Way” letter showed a bias for action over deliberation in the form of “shipping code,” because he had little else to worry about. Facebook could be built fast, because it was built on top of technology that was slowly developed over decades.

The origins of modern computing are complex, with breakthroughs in multiple fields eventually converging into a single technology. Alan Turing and Claude Shannon provided much of the theoretical basis for digital computing in the 1930s and 40s. Yet the vacuum tube technology at the time only allowed for big, clunky machines that were very limited.

A hardware breakthrough came in 1947, when John Bardeen, William Shockley and Walter Brattain invented the transistor, followed by Jack Kilby and Robert Noyce’s development of the integrated circuit in the late 1950s. The first computers were connected to the Internet a decade later and, a generation after that, Tim Berners-Lee invented the World Wide Web.

All of this happened very slowly but, by the time Mark Zuckerberg became aware of it all, it was just part of the landscape. Much like older generations grew up with the Interstate Highway System and took for granted that they could ride freely on it, Millennial hackers grew up in a period of technological, not to mention political, stability.

The Dangers Of Disruption

Mark Zuckerberg founded Facebook with a bold idea. “We believe that a more open world is a better world because people with more information can make better decisions and have a greater impact,” he wrote. That vision was central to how he built the company and its products. He believed that enabling broader and more efficient communication would foster a deeper and more complete understanding.

Yet the world looks much different when your vantage point is a technology company in Menlo Park, California than it does from, say, a dacha outside Moscow. If you are an aging authoritarian who is somewhat frustrated by your place in the world rather than a young, hubristic entrepreneur, you may take a dimmer view of things.

For many, if not most, people on earth, the world is often a dark and dangerous place and the best defense is often to go on offense. From that vantage point, an open information system is less an opportunity to promote better understanding and more of a vulnerability you can leverage to exploit your enemy.

In fact, the House of Representatives Committee on Intelligence found that agents of the Russian government used the open nature of Facebook and other social media outlets to spread misinformation and sow discord. That’s the problem with moving fast and breaking things. If you’re not careful, you inevitably end up breaking something important.

This principle will become even more important in the years ahead as the potential for serious disruption increases markedly.

The Four Disruptive Shifts Of The Next Decade

While the era that shaped millennials like Mark Zuckerberg was mostly stable, the next decade is likely to be one of the most turbulent in history, with massive shifts in demography, resources, technology and migration. Each one of these has the potential to be destabilizing, the confluence of all four courts disaster and demands that we tread carefully.

Consider the demographic shift caused by the Millennials and Gen Z’ers coming of age. The last time we had a similar generational transition was with the Baby Boomers in the 1960s, which saw more than its share of social and political strife. The shift in values that will take place over the next ten years or so is likely to be similar in scale and scope.

Yet that’s just the start. We will also be shifting in resources from fossil fuels to renewables, in technology from bits to atoms and in migration globally from south to north and from rural to urban areas. The last time we had so many important structural changes going on at once it was the 1920s and that, as we should remember, did not turn out well.

It’s probably no accident that today, much like a century ago, we seem to yearn for “a return to normalcy.” The past two decades have been exhausting, with global terrorism, a massive financial meltdown and now a pandemic fraying our nerves and heightening our sense of vulnerability.

Still, I can’t help feeling that the lessons of the recent past can serve us well in creating a better future.

We Need To Rededicate Ourselves To Tackling Grand Challenges

In Daniel Kahneman’s book, Thinking, Fast and Slow, he explained that we have two modes of thinking. The first is fast and intuitive. The second is slow and deliberative. His point wasn’t that one was better than the other, but that both have their purpose and we need to learn how to use both effectively. In many ways, the two go hand-in-hand.

One thing that is often overlooked is that to think fast effectively often takes years of preparation. Certain professions, such as surgeons and pilots, train for years to hone their instincts so that they will be able to react quickly and appropriately in an emergency. In many ways, you can’t think fast without first having thought slow.

Innovation is the same way. We were able to develop coronavirus vaccines in record time because of the years of slow, painstaking work by Katalin Karikó and others like her, much like how Mark Zuckerberg was able to “move fast and break things” because of the decades of breakthroughs it took to develop the technology that he “hacked.”

Today, as the digital era is ending, we need to rededicate ourselves to innovating slow. Just as our investment in things like the human genome project has returned hundreds of times what we put into it, our investment in the grand challenges of the future will enable countless new (hopefully more modest) Zuckerbergs to wax poetic about “hacker culture.”

Innovation is never a single event. It is a process of discovery, engineering and transformation and those things never happen in one place or at one time. That’s why we need to innovate fast and slow, build healthy collaborations and set our sights a bit higher.

— Article courtesy of the Digital Tonto blog
— Image credit: Wikimedia Commons


Change Management Needs to Change

Change Management Needs to Change

GUEST POST from Greg Satell

In 1983, McKinsey consultant Julien Phillips published a paper in the journal Human Resource Management that described an ‘adoption penalty’ for firms that didn’t adapt to changes in the marketplace quickly enough. His ideas became McKinsey’s first change management model that it sold to clients.

But consider that research shows in 1975, during the period Phillips studied, 83% of the average US corporation’s assets were tangible assets, such as plant, machinery and buildings, while by 2015, 84% of corporate assets were intangible, such as licenses, patents and research. Clearly, that changes how we need to approach transformation.

When your assets are tangible, change is about making strategic decisions, such as building factories, buying new equipment and so on. Yet when your assets are intangible, change is connected to people—what they believe, how they think and how they act. That’s a very different matter and we need to reexamine how we approach transformation and change.

The Persuasion Model Of Change

Phillips’ point of reference for his paper on organizational change was a comparison of two companies, NCR and Burroughs, and how they adapted to changes in their industry between 1960 and 1975. Phillips was able to show that during that time, NCR paid a high price for its inability to adapt to change while its competitor, Burroughs, prospered.

He then used that example to outline a general four-part model for change:

  • Creating a sense of concern
  • Developing a specific commitment to change
  • Pushing for major change
  • Reinforcing and consolidating the new course

Phillips’ work kicked off a number of similar approaches, the most famous of which is probably Kotter’s 8-step model. Yet despite the variations, they all follow a similar pattern. First you need to create a sense of urgency, then you devise a vision for change, communicate the need for it effectively and convince others to go along.

The fundamental assumption of these models is that if people understand the change that you seek, they will happily go along. Yet my research indicates exactly the opposite. In fact, it turns out that people don’t like change and will often work actively to undermine it. Merely trying to be more persuasive is unlikely to get you very far.

This is even more true when the target of the change is people themselves than when the change involves some sort of strategic asset. That’s probably why more recent research from McKinsey has found that only 26% of organizational transformations succeed.

Shifting From Hierarchies To Networks

Clearly, the types of assets that make up an enterprise aren’t the only thing that has changed over the past half-century. The structure of our organizations has also shifted considerably. The firms of Phillips’ and Kotter’s era were largely hierarchical. Strategic decisions were made at the top and carried out by others below.

Yet there is significant evidence that networks outperform hierarchies. For example, in Regional Advantage, AnnaLee Saxenian explains that Boston-based technology firms, such as DEC and Data General, were vertically integrated and bound employees through non-compete contracts. Their Silicon Valley competitors, such as Hewlett Packard and Sun Microsystems, on the other hand, embraced open technologies, built alliances and allowed their people to job hop.

The Boston-based companies, which dominated the minicomputer industry, were considered to be very well managed, highly efficient and innovative firms. However, when technology shifted away from minicomputers, their highly stable, vertically integrated structure was completely cut off from the knowledge they would need to compete. The highly connected Silicon Valley firms, on the other hand, thrived.

Studies have found similar patterns in the German auto industry, among currency traders and even in Broadway plays. Wherever we see significant change today, it tends to happen side-to-side in networks rather than top-down in hierarchies.

Flipping The Model

When Barry Libenson first arrived at Experian as Global CIO in 2015, he knew that the job would be a challenge. As one of the world’s largest data companies, with leading positions in the credit, automotive and healthcare markets, the CIO’s role is especially crucial for driving the business. He was also new to the industry and needed to build a learning curve quickly.

So he devoted his first few months at the firm to looking around, talking to people and taking the measure of the place. “I especially wanted to see what our customers had on their roadmap for the next 12-24 months,” he told me and everywhere he went he heard the same thing. They wanted access to real-time data.

As an experienced CIO, Libenson knew a cloud computing architecture could solve that problem, but there were concerns that would need to be addressed. First, many insiders worried that moving from batch-processed credit reports to real-time access would undermine Experian’s business model. There were also concerns about cybersecurity. And the move would necessitate a shift to agile product management, which would be controversial.

As CIO, Libenson had a lot of clout and could have, as traditional change management models suggest, created a “sense of urgency” among his fellow senior executives and then gotten a commitment to the change he sought. After the decision had been made, they then would have been able to design a communication campaign to persuade 16,000 employees that the change was a good one. The evidence suggests that effort would have failed.

Instead, he flipped the model and began working with a small team that was already enthusiastic about the move. He created an “API Center of Excellence” to help willing project managers to learn agile development and launch cloud-enabled products. After about a year, the program had gained significant traction and after three years the transformation to the cloud was complete.

Becoming The Change That You Want To See

The practice of change management got its start because businesses needed to adapt. The shift that Burroughs made to electronics was no small thing. Investments needed to be made in equipment, technology, training, marketing and so on. That required a multi-year commitment. Its competitor, NCR, was unable or unwilling to change and paid a dear price for it.

Yet change today looks much more like Experian’s shift to the cloud than Burroughs’ move into electronics. It’s hard, if not impossible, to persuade a product manager to make a shift if she’s convinced it will kill her business model, just as it’s hard to get a project manager to adopt agile methodologies if she feels she’s been successful with more traditional methods.

Libenson succeeded at Experian not because he was more persuasive, but because he had a better plan. Instead of trying to convince everyone at once, he focused his efforts on empowering those that were already enthusiastic. As their efforts became successful, others joined them and the program gathered steam. Those that couldn’t keep up got left behind.

The truth is that today we can’t transform organizations unless we transform the people in them and that’s why change management has got to change. It is no longer enough to simply communicate decisions made at the top. Rather, we need to put people at the center and empower them to succeed.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


Importance of Long-Term Innovation

Importance of Long-Term Innovation

GUEST POST from Greg Satell

Scientists studying data from Mars recently found that the red planet may have oceans worth of water embedded in its crust in addition to the ice caps at its poles. The finding is significant because, if we are ever to build a colony there, we will need access to water to sustain life and, eventually, to terraform the planet.

While it’s become fashionable for people to lament short-term thinking and “quarterly capitalism,” it’s worth noting that there are a lot of people working on—and a not insignificant amount of money invested in—colonizing another world. Many dedicate entire careers to a goal they do not expect to be achieved in their lifetime.

The truth is that there is no shortage of organizations that are willing to invest for the long term. In fact, nascent technologies that are unlikely to pay off for years are still able to attract significant investment. The challenge is to come up with a vision that is compelling enough to inspire others, while being practical enough that you can actually make it happen.

The Road to a Miracle Vaccine

When the FDA announced that it was granting an emergency use authorization for Covid-19 vaccines, everybody was amazed at how quickly they were developed. That sense of wonder only increased when it was revealed that they were designed in a mere matter of days. Traditionally, vaccines take years, if not decades to develop.

Yet appearances can be deceiving. What looked like a 10-month sprint to a miracle cure was actually the culmination of a three-decade effort that started in the 90s with a vision of a young researcher named Katalin Karikó, who believed that a molecule called mRNA could hold the key to reprogramming our cells to produce specific protein molecules.

The problem was that, although mRNA, once inside the cytoplasm, could theoretically instruct our cellular machinery to produce any protein we wanted, our bodies tend to reject it. However, working with her colleague Drew Weissman, Karikó figured out that they could slip it past our natural defenses by slightly modifying the mRNA molecule.

It was that breakthrough that led two startup companies, Moderna and BioNTech, to license the technology and investors to back it. Yet it would still take more than a decade, and a pandemic, before the bet paid off.

The Hard Road of Hard Tech

In the mid-90s when the Internet started to take off, companies with no profits soon began attracting valuations that seemed insane. Yet the economist W. Brian Arthur explained that under certain conditions—namely high initial investment, low or negligible marginal costs and network effects—firms could defy economic gravity and produce increasing returns.

Arthur’s insight paved the way for the incredible success of Silicon Valley’s brand of venture-funded capitalism. Before long, runaway successes such as Yahoo, Amazon and Google made those who invested in the idea of increasing returns a mountain of money.

Yet the Silicon Valley model only works for a fairly narrow slice of technologies, mostly software and consumer gadgets. For other, so-called “hard technologies,” such as biotech, clean tech, materials science and manufacturing 4.0, the approach isn’t effective. There’s no way to rapidly prototype a cure for cancer or a multimillion-dollar piece of equipment.

Still, over the last decade a new ecosystem has been emerging that specifically targets these technologies. Some, like the LEEP programs at the National Laboratories, are government funded. Others, such as Steve Blank’s I-Corps program, focus on training scientists to become entrepreneurs. There is also a growing number of investors who specialize in hard tech.

Look closely and you can see a subtle shift taking place. Traditionally, venture investors have been willing to take market risk but not technical risk. In other words, they wanted to see a working prototype, but were willing to take a flyer on whether demand would emerge. This new breed of investors is taking on technical risk in technologies, such as new sources of energy, for which there is little market risk if they can be made to work.

The Quantum Computing Ecosystem

At the end of 2019, Amazon announced Braket, a new quantum computing service that would utilize technologies from companies such as D-Wave, IonQ, and Rigetti. They were not alone. IBM had already been building its network of quantum partners for years which included high profile customers ranging from Goldman Sachs to ExxonMobil to Boeing.

Here’s the catch. Quantum computers can’t yet be used by anybody for any practical purpose. In fact, nobody on earth can even tell you definitively how quantum computing should work or exactly what types of problems it can be used to solve. There are, in fact, a number of different approaches being pursued, but none of them have proved out yet.

Nevertheless, an analysis by Nature found that private funding for quantum computing is surging and not just for hardware, but enabling technologies like software and services. The US government has created a $1 billion quantum technology plan and has set up five quantum computing centers at the national labs.

So if quantum computing is not yet a proven technology, why is it generating so much interest? The truth is that the smart players understand that the potential of quantum is so massive, and the technology itself so different from anything we’ve ever seen before, that it’s imperative to start early. Fall behind and you may never catch up.

In other words, they’re thinking for the long term.

A Plan Isn’t Enough, You Need To Have A Vision

It’s become fashionable to bemoan the influence of investors and blame them for short-term thinking and “quarterly capitalism,” but that’s just an excuse for failed leadership. If you look at the world’s most valuable companies—the ones investors most highly prize—you’ll find a very different story.

Apple’s Steve Jobs famously disregarded the opinions of investors (and just about everybody else as well). Amazon’s Jeff Bezos, who habitually keeps margins low in order to increase market share, has long been a Wall Street darling. Microsoft invested heavily in a research division aimed at creating technologies that may not pan out for years or even decades.

The truth is that it’s not enough to have a long-term plan, you have to have a vision to go along with it. Nobody wants to “wait” for profits, but everybody can get excited about a vision that inspires them. Who doesn’t get thrilled by the possibility of a colony on Mars, miracle cures, revolutionary new materials or a new era of computing?

Here’s the thing: Just because you’re not thinking long-term doesn’t mean somebody else isn’t and, quite frankly, if they are able to articulate a vision to go along with that plan, you don’t stand a chance. You won’t survive. So take some time to look around, to dream a little bit and, maybe, to be inspired to do something worthy of a legacy.

Not all those who wander are lost.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


Why Change Failure Occurs

Why Change Failure Occurs

GUEST POST from Greg Satell

Never has the need for transformation been so dire or so clear. Still, that’s no guarantee that we will muster the wisdom to make the changes we need to. After all, President Bush warned us about the risks of a global pandemic way back in 2005 and, in the end, we were left wholly vulnerable and exposed.

It’s not like pandemics are the only thing to worry about either. A 2018 climate assessment warns of major economic impacts unless we make some serious shifts. Public debt, already high before the current crisis, is now exploding upwards. Our electricity grid is insecure and vulnerable to cyberattack. The list goes on.

All too often, we assume that mere necessity can drive change forward, yet history has shown that not to be the case. There’s a reason why nations fail and businesses go bankrupt. The truth is that if a change is important, some people won’t like it and they will work to undermine it in underhanded and insidious ways. That’s what we need to overcome.

A Short History Of Change

For most of history, until the industrial revolution, people existed as they had for millennia and could live their entire lives without seeing much change. They farmed or herded for a living, used animals for power and rarely travelled far from home. Even in the 20th century, most people worked in an industry that changed little during their career.

In the 1980s, management consultants began to notice that industries were beginning to evolve more rapidly and firms that didn’t adapt would lose out in the marketplace. One famous case study showed how Burroughs moved aggressively into electronic computing and prospered while its competitor NCR lagged and faded into obscurity.

In 1983, McKinsey consultant Julien Phillips published a paper in the journal Human Resource Management that described an “adoption penalty” for firms that didn’t adapt to changes in the marketplace quickly enough. His ideas became McKinsey’s first change management model that it sold to clients.

Yet consider that research shows that in 1975, during the period Phillips studied, 83% of the average US corporation’s assets were tangible, such as plant, machinery and buildings, while by 2015, 84% of corporate assets were intangible, such as licenses, patents and human capital. In other words, change today involves people, their knowledge and their behaviors more than it does strategic assets.

Clearly, that changes the game entirely.

What Change Looks Like Today

Think about how America was transformed after World War II. We created the Interstate Highway System to tie our nation together. We established a new scientific infrastructure that made us a technological superpower. We built airports, shopping malls and department stores. We even sent a man to the moon.

Despite the enormous impact of these accomplishments, none of those things demanded that people had to dramatically change their behavior. Nobody had to drive on an Interstate highway, work in a lab, travel in space or move to the suburbs. Many chose to do those things, but others did not and paid little or no penalty for their failure to change with the times.

Today the story is vastly different. A crisis like Covid-19 required us to significantly alter our behavior and, not surprisingly, some people didn’t like it and resisted. We could, as individuals, choose to wear a mask, but if others didn’t follow suit the danger remained. We can, as a society, invest billions in a vaccine, but if a significant portion don’t take it, the virus will continue to mutate at a rapid rate, undermining the effectiveness of the entire enterprise.

Organizations face similar challenges. Sure they invest in tangible assets, such as plant and equipment, but any significant change will involve changing people’s beliefs and behaviors and that is a different matter altogether. Today, even technological transformations have a significant human component.

Making Room For Identity And Dignity

In the early 19th century, a movement of textile workers known as the Luddites smashed machines to protest the new, automated mode of work. As skilled workers, they saw their way of life being destroyed in the name of progress because the new technology could make fabrics faster and cheaper with fewer, less skilled workers.

Today, “Luddite” has become a pejorative term to describe people who are unable or unwilling to accept technological change. Many observers point out that the rise of industry created new and different jobs and increased overall prosperity. Yet that largely misses the point. Weavers were skilled artisans who worked for years to hone their craft. What they did wasn’t just a job, it was who they were and what they took pride in.

One of the great misconceptions of our modern age is that people make decisions based on rational calculations of utility and that, by engineering the right incentives, we can control behavior. Yet people are far more than economic entities. They crave dignity and recognition, to be valued, in other words, as ends in themselves rather than merely as means to an end.

That’s why changing behaviors can be such a tricky thing. While some may see being told to wear a mask or socially distance as simply doing what “science says,” for others it is an imposition on their identity and dignity from outside their community. Perhaps not surprisingly, they rebel and demand to have their right to choose be recognized.

Building Change On Common Ground

The biggest misconception about change is that once people understand it, they will embrace it, and so the best way to drive change forward is to explain the need for change in a very convincing and persuasive way. Change, in this view, is essentially a communication exercise, and the right combination of words and images is all that is required.

Yet as should be clear by now, that is simply not true. People will often oppose change because it asks them to alter their identity. The Luddites didn’t just oppose textile machinery on economic grounds, but because it failed to recognize their skills as weavers. People don’t necessarily oppose wearing masks because they are “anti-science,” but because they resent having their behavior mandated from outside their community.

In other words, change is always, at some level, about what people value. That’s why to bring change about you need to identify shared values that reaffirm, rather than undermine, people’s sense of identity. Recognition is often a more powerful incentive than even financial rewards. In the final analysis, lasting change always needs to be built on common ground.

Over the next decade, we will undergo some of the most profound shifts in history, encompassing technology, resources, migration patterns and demography and, if we are to compete, we will need to achieve enormous transformation in business and society. Whether we are able to do that or not depends less on economics or “science” than it does on our ability to trust each other again.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels
