Author Archives: Greg Satell

About Greg Satell

Greg Satell is a popular speaker and consultant. His latest book, Cascades: How to Create a Movement That Drives Transformational Change, is available now. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.

We Need a More Biological View of Technology

GUEST POST from Greg Satell

It’s no accident that Mary Shelley’s novel, Frankenstein, was published in the early 19th century, at roughly the same time as the Luddite movement was gaining momentum. It was in that moment that people first began to take stock of the technological advances that brought about the first Industrial Revolution.

Since then we have seemed to oscillate between techno-utopianism and dystopian visions of machines gone mad. For every “space odyssey” promising an automated, enlightened future, there seems to be a “Terminator” series warning of our impending destruction. Neither scenario has ever come to pass and it is unlikely that either ever will.

What both the optimists and the Cassandras miss is that technology is not something that exists independently from us. It is, in fact, intensely human. We don’t merely build it, but continue to nurture it through how we develop and shape ecosystems. We need to go beyond a simple engineering mindset and focus on a process of revealing, building and emergence.

1. Revealing

World War II brought the destructive potential of technology to the fore of human consciousness. As deadly machines ravaged Europe and bombs of unimaginable power exploded in Asia, the whole planet was engulfed in a maelstrom of human design. It seemed that the technology we had built had become a modern version of Frankenstein’s monster, destined from the start to turn on its master.

Yet the German philosopher Martin Heidegger saw things differently. In his 1954 essay, The Question Concerning Technology, he described technology as akin to art, in that it reveals truths about the nature of the world, brings them forth and puts them to some specific use. In the process, human nature and its capacity for good and evil are also revealed.

He offers the example of a hydroelectric dam, which uncovers a river’s energy and puts it to use making electricity. In much the same sense, Mark Zuckerberg did not so much “build” a social network at Facebook as take natural human tendencies and channel them in a particular way. That process of channeling, in turn, reveals even more.

That’s why, as I wrote in Mapping Innovation, innovation is not about coming up with new ideas, but identifying meaningful problems. It’s through exploring tough problems that we reveal new things and those new things can lead to important solutions. All who wander are not lost.

2. Building

The concept of revealing would seem to support the view of Shelley and the Luddites. It suggests that once a force is revealed, we are powerless to shape its trajectory. J. Robert Oppenheimer, upon witnessing the world’s first nuclear explosion as it shook the plains of New Mexico, expressed a similar view. “Now I am become Death, the destroyer of worlds,” he said, quoting the Bhagavad Gita.

Yet in another essay, Building Dwelling Thinking, Heidegger explains that what we build for the world is highly dependent on our interpretation of what it means to live in it. The relationship is, of course, reflexive. What we build depends on how we wish to dwell and that act, in and of itself, shapes how we build further.

Again, Mark Zuckerberg and Facebook are instructive. His insight into human nature led him to build his platform based on what he saw as The Hacker Way and resolved to “move fast and break things.” Unfortunately, that approach led to his enterprise becoming highly vulnerable to schemes by actors such as Cambridge Analytica and the Russian GRU.

Yet technology is not, by itself, deterministic. Facebook is, to a great extent, the result of conscious choices that Mark Zuckerberg made. If he had had a different set of experiences than that of a young, upper-middle-class kid who had never encountered a moment of true danger in his life, he might have been more cautious and chosen differently.

History has shown that those who build powerful technologies can play a vital role in shaping how they are used. Many of the scientists of Oppenheimer’s day became activists, preparing a manifesto that highlighted the dangers of nuclear weapons, which helped lead to the Partial Test Ban Treaty. In much the same way, the Asilomar Conference, held in 1975, led to important constraints on recombinant DNA research.

3. Emergence

No technology stands alone, but combines with other technologies to form systems. That’s where things get confusing because when things combine and interact they become more complex. As complexity theorist Sam Arbesman explained in his book, Overcomplicated, this happens because of two forces inherent to the way that technologies evolve.

The first is accretion. A product such as an iPhone represents the accumulation of many different technologies, including microchips, programming languages, gyroscopes, cameras, touchscreens and lithium ion batteries, just to name a few. As we figure out more tasks an iPhone can perform, more technologies are added, building on each other.

The second force is interaction. Put simply, much of the value of an iPhone is embedded in how it works with other technologies to make tasks easier. We want to use it to access platforms such as Facebook to keep in touch with friends, Yelp so that we can pick out a nice restaurant where we can meet them and Google Maps to help us find the place. These interactions, combined with accretion, create an onward march towards greater complexity.
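
To make the accretion and interaction argument concrete, here is a small illustrative sketch in Python. The component list and counts are hypothetical, not drawn from the article: components accumulate one at a time, but the number of possible pairwise interactions among them grows roughly with the square of the component count, which is one simple way to see why complexity outruns our ability to keep track of it.

```python
# Illustrative sketch: accretion adds components one at a time, but the
# number of possible pairwise interactions grows much faster.
# The component list is hypothetical, chosen only to make the point concrete.
from itertools import combinations

components = ["microchip", "camera", "gyroscope", "touchscreen",
              "battery", "modem", "GPS", "operating system"]

for n in range(2, len(components) + 1):
    pairs = len(list(combinations(components[:n], 2)))  # n * (n - 1) / 2
    print(f"{n} components -> {pairs} possible pairwise interactions")

# 2 components -> 1 interaction, 4 -> 6, 8 -> 28: parts accrete linearly,
# while potential interactions, and thus complexity, grow quadratically.
```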

It is through ever increasing complexity that we lose control. Leonard Read pointed out in his classic essay, I, Pencil, that even an object as simple as a pencil is far too complex for any single person to produce by themselves. A smartphone—or even a single microchip—is exponentially more complex.

People work their entire lives to become experts on even a minor aspect of a technology like an iPhone, a narrow practice of medicine or an obscure facet of a single legal code. As complexity increases, so does specialization, making it even harder for any one person to see the whole picture.

Shaping Ecosystems And Taking A Biological View

In 2013, I wrote that we are all Luddites now, because advances in artificial intelligence had become so powerful that anyone who wasn’t nervous didn’t really understand what was going on. Today, as we enter a new era of innovation and technologies become infinitely more powerful, we are entering a new ethical universe.

Typically, the practice of modern ethics has been fairly simple: Don’t lie, cheat or steal. Yet with many of our most advanced technologies, such as artificial intelligence and genetic engineering, the issue isn’t so much about doing the right thing, but figuring out what the right thing is when the issues are novel, abstruse and far reaching.

What’s crucial to understand, however, is that it’s not any particular invention, but ecosystems that create the future. The Luddites were right to fear textile mills, which did indeed shatter their way of life. However, the mill was only one technology; when it was combined with other developments, such as agricultural advances, labor unions and modern healthcare, lives greatly improved.

Make no mistake, our future will be shaped by our own choices, which is why we need to abandon our illusions of control. We need to shift from an engineering mindset, where we try to optimize for a limited set of variables, and take a more biological view, growing and shaping ecosystems of talent, technology, information and cultural norms.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

To Change the World You Must First Learn Something About It

GUEST POST from Greg Satell

Anybody who has waited for a traffic light to change, in the middle of the night at an empty intersection, knows the urge to rebel. There is always a tension between order and freedom. While we intuitively understand the need for order to constrain others, we yearn for the freedom to do what we want and to seek out a vision and sense of meaning in our lives.

Yet as we have seen over the past decade, attempts to overturn the existing order usually fail. The Tea Party erupted in 2009, but had mostly sputtered out by 2014. #Occupy protests and Black Lives Matter sent people into the streets, but achieved little, if anything. Silicon Valley “unicorns” like WeWork routinely go up in flames.

Not all revolutions flop, though. In fact, some succeed marvelously. What has struck me after researching transformational change over nearly two decades is how similar successful efforts are. They all experience failures along the way. What makes the difference is their ability to learn, adapt and change in response. That’s what allows them to prevail.

Five Kids Meet In A Cafe

One day in 1998, a group of five friends met in a cafe in Belgrade. Although still in their 20s, they were already experienced activists, and most of what they had experienced was failure. They had taken part in student protests against the war in Bosnia in 1992, as well as in the larger uprisings in response to election fraud in 1996. Neither had achieved much.

Having had time to reflect on their successes and failures, they hatched a new plan. They knew from their earlier efforts that they could mobilize people and get them to the polls for the presidential election in 2000. They also knew that Slobodan Milošević, who ruled the country with an iron hand, would try to steal the election, just as he had in 1996.

So that’s what they planned for.

The next day, six friends joined the five from the previous day and, together, they formed the original 11 members of Otpor, the movement that would topple the Milošević regime. They began slowly at first, performing pranks and street theater. But within two years it grew to over 70,000 members, with chapters all over Serbia. Milošević was ousted in the Bulldozer revolution in 2000. He would die in his prison cell at The Hague in 2006.

What Otpor came to understand is that it takes small groups, loosely connected, but united by a shared purpose to drive transformational change. The organization was almost totally decentralized, with just a basic “network meeting” to share best practices every two weeks. Nevertheless, by empowering those smaller groups and giving them a shared sense of mission, they were able to prevail over seemingly impossible odds.

Three Mid-Level Executives See A Problem That Needs Fixing

In 2017, John Gadsby and two colleagues in Procter & Gamble’s research organization saw that there was a problem. Although cutting-edge products were being developed all around them, the processes at the 180-year-old firm were often antiquated, making it sometimes difficult to get even simple things done.

So they decided to do something about it. They chose a single process, which involved setting up experiments to test new product technologies. It usually took weeks and was generally considered a bottleneck. Using digital tools, however, they were able to cut it down to just a few hours. It was a serious accomplishment, and the three were recognized with a “Pathfinder” award by the company’s CTO.

Every change starts out with a grievance, such as the annoyance of being bogged down by inefficient processes. The first step forward is to come up with a vision for how you would like things to be different. However, you can never get there in a single step, which is why you need to identify a single keystone change to show others that change is really possible.

That’s exactly what the team at P&G did. Once they showed that one process could be dramatically improved, they were able to get the resources to start improving others. Today, more than 2,500 of their colleagues have joined their movement for process improvement, called PxG, and more than 10,000 have used their applications platform.

As PxG has grown it has also been able to effectively partner with other likeminded initiatives within the company, reinforcing not only its own vision, but those of others that share its values as well.

The One Engineer Who Simply Refused To Take “No” For An Answer

In the late 1960s, Gary Starkweather was in trouble with his boss. As an engineer in Xerox’s long-range xerography unit, he saw that laser printing could be a huge business opportunity. Unfortunately, his manager at the company’s research facility in upstate New York was focused on improving the current product line, not looking to start a new one.

The argument got so heated that Starkweather’s job was soon in jeopardy. Fortunately, his rabble-rousing caught the attention of another division within the company, the Palo Alto Research Center (PARC), which was less interested in operational efficiency than in inventing an entirely new future. They welcomed Starkweather into their ranks with open arms.

Unlike his old lab, PARC existed to create the future. One of the technologies it had developed, bitmapping, would revolutionize computer graphics, but there was no way to print the images out. Starkweather’s work was exactly what they were looking for and, with Xerox’s copier business in decline, would eventually save the company.

The truth is that good ideas fail all the time, and it often has little to do with the quality of the idea, the passion of those who hold it or its potential impact, but rather with whom you choose to start. In the New York lab, few people bought into Starkweather’s idea, but in Palo Alto, almost everyone did. In that fertile ground, it was able to grow, mature and triumph.

When trying to get traction for an idea, you always want to be in the majority, even if it is only a local majority comprising a handful of people. You can always expand a small majority out, but once you are in the minority you will get immediate pushback and will need to retrench.

The Secret to Subversion

Through my work, I’ve gotten to know truly revolutionary people. My friend Srdja Popović was one of the original founders of Otpor and has gone on to train activists in more than 50 countries. Jim Allison won a Nobel Prize for pioneering cancer immunotherapy. Yassmin Abdel-Magied has become an important voice for diversity, equity and inclusion. Many others I profiled in my books, Mapping Innovation and Cascades.

What has always struck me is how different real revolutionaries are from the mercurial, ego-driven stereotypes Hollywood loves to sell us. The truth is that all of those mentioned above are warm, friendly and genuinely nice people who are a pleasure to be around (or were; Gary Starkweather passed away recently).

What I’ve found over the years is that this sense of openness helped them succeed where others failed. In fact, evidence suggests that generosity is often a competitive advantage for very practical reasons. People who are friendly and generous tend to build up strong networks of collaborators, who provide crucial support for getting an idea off the ground.

But most of all it was that sense of openness that allowed them to learn, adapt and identify a path to victory. Changing the world is hard, often frustrating work. Nobody comes to the game with all the answers. In the final analysis, it’s what you learn along the way—and your ability to change yourself in response to what you learn—that makes the difference between triumph and bitter, agonizing failure.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Competing in a New Era of Innovation

GUEST POST from Greg Satell

In 1998, the dotcom craze was going at full steam and it seemed like the entire world was turning upside down. So people took notice when economist Paul Krugman wrote that “by 2005 or so, it will become clear that the internet’s impact on the economy has been no greater than the fax machine’s.”

He was obviously quite a bit off base, but these types of mistakes are incredibly common. As the futurist Roy Amara famously put it, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” The truth is that it usually takes about 30 years for a technology to go from an initial discovery to a measurable impact.

Today, as we near the end of the digital age and enter a new era of innovation, Amara’s point is incredibly important to keep in mind. New technologies, such as quantum computing, blockchain and gene editing, will be overhyped, but they really will change the world, eventually. So we need to do more than adapt; we need to prepare for a future we can’t yet see.

Identify A “Hair-On-Fire” Use Case

Today we remember the steam engine for powering factories and railroads. In the process, it made the first industrial revolution possible. Yet that’s not how it started out. Its initial purpose was to pump water out of coal mines. At the time, it would have been tough to get people to imagine a factory that didn’t exist yet, but pretty easy for owners to see that their mine was flooded.

The truth is that innovation is never really about ideas; it’s about solving problems. So when a technology is still nascent, it doesn’t gain traction in a large, established market, which by definition is already fairly well served, but in a hair-on-fire use case: a problem that somebody needs solved so badly that they almost literally have their hair on fire.

Early versions of the steam engine, such as Thomas Newcomen’s, didn’t work well and were ill-suited to running factories or driving locomotives. Still, flooded mines were a major problem, so mine owners were more tolerant of glitches and flaws. Later, after James Watt perfected the steam engine, it became more akin to the technology we remember now.

We can see the same principle at work today. Blockchain has not had much impact as an alternative currency, but has gained traction optimizing supply chains. Virtual reality has not really caught on in the entertainment industry, but is making headway in corporate training. That’s probably not where those technologies will end up, but it’s how they make money now.

So in the early stages of a technology, don’t try to imagine how a perfected version would fit in; find a problem that somebody needs solved so badly right now that they are willing to put up with some inconvenience.

The truth is that the “next big thing” never turns out like people think it will. Putting a man on the moon, for example, didn’t lead to flying cars like in the Jetsons, but instead to satellites that bring events to us from across the world, help us navigate to the corner store and call our loved ones from a business trip.

Build A Learning Curve

Things that change the world always arrive out of context, for the simple reason that the world hasn’t changed yet. So when a new technology first appears, we don’t really know how to use it. It takes time to learn how to leverage its advantages to create an impact.

Consider electricity, which, as the economist Paul David explained in a classic paper, was first used in factories to cut down on construction costs (steam engines were heavy and needed extra bracing). What wasn’t immediately obvious was that electricity allowed factories to be designed to optimize workflow, rather than having to be arranged around the power source.

We can see the same forces at work today. Consider Amazon’s recent move to offer quantum computing to its customers through the cloud, even though the technology is so primitive that it has no practical application. Nevertheless, it is potentially so powerful—and so different from digital computing—that firms are willing to pay for the privilege of experimenting with it.
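
For readers who want to see what that kind of early experimentation looks like, here is a minimal sketch assuming the Amazon Braket Python SDK and its free local simulator; the SDK and its API are my assumption for the sake of illustration, since the article itself doesn’t point to any specific tooling. It builds and runs a two-qubit Bell-state circuit, which is about as basic as quantum experiments get.

```python
# Minimal sketch of early quantum experimentation, assuming the Amazon
# Braket Python SDK and its local simulator (no managed quantum hardware
# or AWS account involved); an illustration, not a production recipe.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

# A two-qubit Bell-state circuit: Hadamard on qubit 0, then CNOT from 0 to 1.
bell = Circuit().h(0).cnot(0, 1)

device = LocalSimulator()
result = device.run(bell, shots=1000).result()

# Counts should split roughly evenly between '00' and '11',
# the signature of an entangled pair.
print(result.measurement_counts)
```

The point is less the circuit itself than the habit: running even toy experiments like this one is how a firm starts building its learning curve before the technology is ready to matter.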

The truth is that it’s better to prepare than it is to adapt. When you are adapting you are, by definition, already behind. That’s why it’s important to build a learning curve early, before a technology has begun to impact your business.

Beware Of Switching Costs

When we look back today, it seems incredible that it took decades for factories to switch from steam to electricity. Besides the extra construction costs to build extra bracing, steam engines were dirty and inflexible. Every machine in the factory needed to be tied to one engine, so if one broke down or needed maintenance, the whole factory had to be shut down.

However, when you look at the investment from the perspective of a factory owner, things aren’t so clear cut. While electricity was relatively more attractive when building a new factory, junking an existing facility to make way for a new technology didn’t make as much sense. So most factory owners kept what they had.

These types of switching costs still exist today. Consider neuromorphic chips, which are based on the architecture of the human brain and therefore highly suited to artificial intelligence. They are also potentially millions of times more energy efficient than conventional chips. However, existing AI chips also perform very well, can be manufactured in conventional fabs and run conventional AI algorithms, so neuromorphic chips haven’t caught on yet.

All too often, when a new technology emerges we only look at how its performance compares to what exists today and ignore the importance of switching costs—both real and imagined. That’s a big part of the reason we underestimate how long a technology takes to gain traction and underestimate how much impact it will have in the long run.

Find Your Place In The Ecosystem

We tend to see history through the lens of inventions: Watt and his steam engine. Edison and his light bulb. Ford and his assembly line. Yet building a better mousetrap is never enough to truly change the world. Besides the need to identify a use case, build a learning curve and overcome switching costs, every new technology needs an ecosystem to truly drive the future.

Ford’s automobiles needed roads and gas stations, which led to supermarkets, shopping malls and suburbs. Electricity needed secondary inventions, such as home appliances and radios, which created a market for skilled technicians. It is often in the ecosystem, rather than the initial invention, where most of the value is produced.

Today, we can see similar ecosystems beginning to form around emerging technologies. The journal Nature published an analysis which showed that over $450 million was invested in more than 50 quantum startups between 2012 and 2018, but only a handful are actually making quantum computers. The rest are helping to build out the ecosystem.

So for most of us, the opportunities in the post-digital era won’t be creating new technologies themselves, but in the ecosystems they create. That’s where we’ll see new markets emerge, new jobs created and new fortunes to be made.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Innovation Requires Going Both Fast and Slow

GUEST POST from Greg Satell

In the regulatory filing for Facebook’s 2012 IPO, Mark Zuckerberg included a letter outlining his management philosophy. Entitled, The Hacker Way, it encapsulated much of the zeitgeist. “We have a saying,” he wrote. “‘Move fast and break things.’ The idea is that if you never break anything, you’re probably not moving fast enough.”

At around the same time, Katalin Karikó was quietly plodding away in her lab at the University of Pennsylvania. She had been working on an idea since the early 1990s and it hadn’t amounted to much so far, but it was finally beginning to attract some interest. The next year she would join a small startup named BioNTech to commercialize her work and would continue to chip away at the problem.

Things would accelerate in early 2020, when Karikó’s mRNA technology was used to design a coronavirus vaccine in a matter of mere hours. Just as Daniel Kahneman explained that there are fast and slow modes of thinking, the same can be said about innovating. The truth is that moving slowly is often underrated and that moving fast can sometimes bog you down.

The Luxury Of Stability

Mark Zuckerberg had the luxury of being disruptive because he was working in a mature, stable environment. His “Hacker Way” letter showed a bias for action over deliberation in the form of “shipping code,” because he had little else to worry about. Facebook could be built fast, because it was built on top of technology that was slowly developed over decades.

The origins of modern computing are complex, with breakthroughs in multiple fields eventually converging into a single technology. Alan Turing and Claude Shannon provided much of the theoretical basis for digital computing in the 1930s and 40s. Yet the vacuum tube technology at the time only allowed for big, clunky machines that were very limited.

A hardware breakthrough came in 1947, when John Bardeen, William Shockley and Walter Brattain invented the transistor, followed by Jack Kilby and Robert Noyce’s development of the integrated circuit in the late 1950s. The first computers were connected to the Internet a decade later and, a generation after that, Tim Berners-Lee invented the World Wide Web.

All of this happened very slowly but, by the time Mark Zuckerberg became aware of it all, it was just part of the landscape. Much like older generations grew up with the Interstate Highway System and took for granted that they could ride freely on it, Millennial hackers grew up in a period of technological, not to mention political, stability.

The Dangers Of Disruption

Mark Zuckerberg founded Facebook with a bold idea. “We believe that a more open world is a better world because people with more information can make better decisions and have a greater impact,” he wrote. That vision was central to how he built the company and its products. He believed that enabling broader and more efficient communication would foster a deeper and more complete understanding.

Yet the world looks much different when your vantage point is a technology company in Menlo Park, California than it does from, say, a dacha outside Moscow. If you are an aging authoritarian who is somewhat frustrated by your place in the world rather than a young, hubristic entrepreneur, you may take a dimmer view of things.

For many, if not most, people on earth, the world is often a dark and dangerous place and the best defense is often to go on offense. From that vantage point, an open information system is less an opportunity to promote better understanding and more of a vulnerability you can leverage to exploit your enemy.

In fact, the House of Representatives Committee on Intelligence found that agents of the Russian government used the open nature of Facebook and other social media outlets to spread misinformation and sow discord. That’s the problem with moving fast and breaking things. If you’re not careful, you inevitably end up breaking something important.

This principle will become even more important in the years ahead as the potential for serious disruption increases markedly.

The Four Disruptive Shifts Of The Next Decade

While the era that shaped millennials like Mark Zuckerberg was mostly stable, the next decade is likely to be one of the most turbulent in history, with massive shifts in demography, resources, technology and migration. Each one of these has the potential to be destabilizing; the confluence of all four courts disaster and demands that we tread carefully.

Consider the demographic shift caused by the Millennials and Gen Z’ers coming of age. The last time we had a similar generational transition was with the Baby Boomers in the 1960s, which saw more than its share of social and political strife. The shift in values that will take place over the next ten years or so is likely to be similar in scale and scope.

Yet that’s just the start. We will also be shifting in resources from fossil fuels to renewables, in technology from bits to atoms and in migration globally from south to north and from rural to urban areas. The last time we had so many important structural changes going on at once it was the 1920s and that, as we should remember, did not turn out well.

It’s probably no accident that today, much like a century ago, we seem to yearn for “a return to normalcy.” The past two decades have been exhausting, with global terrorism, a massive financial meltdown and now a pandemic fraying our nerves and heightening our sense of vulnerability.

Still, I can’t help feeling that the lessons of the recent past can serve us well in creating a better future.

We Need To Rededicate Ourselves To Tackling Grand Challenges

In Daniel Kahneman’s book, Thinking, Fast and Slow, he explained that we have two modes of thinking. The first is fast and intuitive. The second is slow and deliberative. His point wasn’t that one was better than the other, but that both have their purpose and we need to learn how to use both effectively. In many ways, the two go hand-in-hand.

One thing that is often overlooked is that to think fast effectively often takes years of preparation. Certain professions, such as surgeons and pilots, train for years to hone their instincts so that they will be able to react quickly and appropriately in an emergency. In many ways, you can’t think fast without first having thought slow.

Innovation is the same way. We were able to develop coronavirus vaccines in record time because of the years of slow, painstaking work by Katalin Karikó and others like her, much like how Mark Zuckerberg was able to “move fast and break things” because of the decades of breakthroughs it took to develop the technology that he “hacked.”

Today, as the digital era is ending, we need to rededicate ourselves to innovating slow. Just as our investment in things like the Human Genome Project has returned hundreds of times what we put into it, our investment in the grand challenges of the future will enable countless new (hopefully more modest) Zuckerbergs to wax poetic about “hacker culture.”

Innovation is never a single event. It is a process of discovery, engineering and transformation and those things never happen in one place or at one time. That’s why we need to innovate fast and slow, build healthy collaborations and set our sights a bit higher.

— Article courtesy of the Digital Tonto blog
— Image credit: Wikimedia Commons

Change Management Needs to Change

GUEST POST from Greg Satell

In 1983, McKinsey consultant Julien Phillips published a paper in the journal, Human Resource Management, that described an ‘adoption penalty’ for firms that didn’t adapt to changes in the marketplace quickly enough. His ideas became McKinsey’s first change management model that it sold to clients.

But consider that research shows that in 1975, during the period Phillips studied, 83% of the average US corporation’s assets were tangible assets, such as plant, machinery and buildings, while by 2015, 84% of corporate assets were intangible, such as licenses, patents and research. Clearly, that changes how we need to approach transformation.

When your assets are tangible, change is about making strategic decisions, such as building factories, buying new equipment and so on. Yet when your assets are intangible, change is connected to people—what they believe, how they think and how they act. That’s a very different matter and we need to reexamine how we approach transformation and change.

The Persuasion Model Of Change

Phillips’ point of reference for his paper on organizational change was a comparison of two companies, NCR and Burroughs, and how they adapted to changes in their industry between 1960 and 1975. Phillips was able to show that during that time, NCR paid a high price for its inability to adapt to change while its competitor, Burroughs, prospered.

He then used that example to outline a general four-part model for change:

  • Creating a sense of concern
  • Developing a specific commitment to change
  • Pushing for major change
  • Reinforcing and consolidating the new course

Phillips’ work kicked off a number of similar approaches, the most famous of which is probably Kotter’s 8-step model. Yet despite the variations, they all follow a similar pattern. First you need to create a sense of urgency, then you devise a vision for change, communicate the need for it effectively and convince others to go along.

The fundamental assumption of these models is that if people understand the change that you seek, they will happily go along. Yet my research indicates exactly the opposite. In fact, it turns out that people don’t like change and will often work actively to undermine it. Merely trying to be more persuasive is unlikely to get you very far.

This is even more true when the target of the change is people themselves than when the change involves some sort of strategic asset. That’s probably why more recent research from McKinsey has found that only 26% of organizational transformations succeed.

Shifting From Hierarchies To Networks

Clearly, the types of assets that make up an enterprise aren’t the only thing that has changed over the past half-century. The structure of our organizations has also shifted considerably. The firms of Phillips’ and Kotter’s era were largely hierarchical. Strategic decisions were made at the top and carried out by others below.

Yet there is significant evidence that suggests that networks outperform hierarchies. For example, in Regional Advantage AnnaLee Saxenian explains that Boston-based technology firms, such as DEC and Data General, were vertically integrated and bound employees through non-compete contracts. Their Silicon Valley competitors such as Hewlett Packard and Sun Microsystems, on the other hand, embraced open technologies, built alliances and allowed their people to job hop.

The Boston-based companies, which dominated the minicomputer industry, were considered to be very well managed, highly efficient and innovative firms. However, when technology shifted away from minicomputers, their highly stable, vertically integrated structure was completely cut off from the knowledge they would need to compete. The highly connected Silicon Valley firms, on the other hand, thrived.

Studies have found similar patterns in the German auto industry, among currency traders and even in Broadway plays. Wherever we see significant change today, it tends to happen side-to-side in networks rather than top-down in hierarchies.

Flipping The Model

When Barry Libenson first arrived at Experian as Global CIO in 2015, he knew that the job would be a challenge. As one of the world’s largest data companies, with leading positions in the credit, automotive and healthcare markets, the CIO’s role is especially crucial for driving the business. He was also new to the industry and needed to build a learning curve quickly.

So he devoted his first few months at the firm to looking around, talking to people and taking the measure of the place. “I especially wanted to see what our customers had on their roadmap for the next 12-24 months,” he told me, and everywhere he went he heard the same thing. They wanted access to real-time data.

As an experienced CIO, Libenson knew a cloud computing architecture could solve that problem, but there were concerns that would need to be addressed. First, many insiders worried that moving from batch-processed credit reports to real-time access would undermine Experian’s business model. There were also concerns about cybersecurity. And the move would necessitate a shift to agile product management, which would be controversial.

As CIO, Libenson had a lot of clout and could have, as traditional change management models suggest, created a “sense of urgency” among his fellow senior executives and then gotten a commitment to the change he sought. After the decision had been made, they then would have been able to design a communication campaign to persuade 16,000 employees that the change was a good one. The evidence suggests that effort would have failed.

Instead, he flipped the model and began working with a small team that was already enthusiastic about the move. He created an “API Center of Excellence” to help willing project managers to learn agile development and launch cloud-enabled products. After about a year, the program had gained significant traction and after three years the transformation to the cloud was complete.

Becoming The Change That You Want To See

The practice of change management got its start because businesses needed to adapt. The shift that Burroughs made to electronics was no small thing. Investments needed to be made in equipment, technology, training, marketing and so on. That required a multi-year commitment. Its competitor, NCR, was unable or unwilling to change and paid a dear price for it.

Yet change today looks much more like Experian’s shift to the cloud than it does Burroughs’ move into electronics. It’s hard, if not impossible, to persuade a product manager to make a shift if she’s convinced it will kill her business model, just as it’s hard to get a project manager to adopt agile methodologies if she feels she’s been successful with more traditional methods.

Libenson succeeded at Experian not because he was more persuasive, but because he had a better plan. Instead of trying to convince everyone at once, he focused his efforts on empowering those that were already enthusiastic. As their efforts became successful, others joined them and the program gathered steam. Those that couldn’t keep up got left behind.

The truth is that today we can’t transform organizations unless we transform the people in them and that’s why change management has got to change. It is no longer enough to simply communicate decisions made at the top. Rather, we need to put people at the center and empower them to succeed.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels

Importance of Long-Term Innovation

GUEST POST from Greg Satell

Scientists studying data from Mars recently found that the red planet may have oceans worth of water embedded in its crust in addition to the ice caps at its poles. The finding is significant because, if we are ever to build a colony there, we will need access to water to sustain life and, eventually, to terraform the planet.

While it’s become fashionable for people to lament short-term thinking and “quarterly capitalism,” it’s worth noting that there are a lot of people working on—and a not insignificant amount of money invested in—colonizing another world. Many dedicate entire careers to a goal they do not expect to be achieved in their lifetime.

The truth is that there is no shortage of organizations that are willing to invest for the long term. In fact, nascent technologies that are unlikely to pay off for years are still able to attract significant investment. The challenge is to come up with a vision that is compelling enough to inspire others, while being practical enough that you can actually make it happen.

The Road to a Miracle Vaccine

When the FDA announced that it was granting an emergency use authorization for Covid-19 vaccines, everybody was amazed at how quickly they were developed. That sense of wonder only increased when it was revealed that they were designed in a mere matter of days. Traditionally, vaccines take years, if not decades, to develop.

Yet appearances can be deceiving. What looked like a 10-month sprint to a miracle cure was actually the culmination of a three-decade effort that started in the 90s with a vision of a young researcher named Katalin Karikó, who believed that a molecule called mRNA could hold the key to reprogramming our cells to produce specific protein molecules.

The problem was that, although, once inside the cytoplasm, mRNA could theoretically instruct our cell machinery to produce any protein we wanted, our bodies tend to reject it. However, working with her colleague Drew Weissman, Karikó figured out that they could slip it past our natural defenses by slightly modifying the mRNA molecule.

It was that breakthrough that led two startup companies, Moderna and BioNTech, to license the technology and investors to back it. Still, it would take more than a decade and a pandemic before the bet paid off.

The Hard Road of Hard Tech

In the mid-90s when the Internet started to take off, companies with no profits soon began attracting valuations that seemed insane. Yet the economist W. Brian Arthur explained that under certain conditions—namely high initial investment, low or negligible marginal costs and network effects—firms could defy economic gravity and produce increasing returns.
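
Arthur’s conditions can be made concrete with a little arithmetic. The sketch below uses Python and entirely made-up numbers, which come from me rather than from Arthur or the article: a large fixed cost gets spread over a growing user base, the marginal cost of serving one more user is negligible, and the value each user gets rises with the size of the network, so the margin per user improves as the business grows instead of shrinking.

```python
# Illustrative arithmetic for increasing returns; every number here is
# hypothetical, chosen only to show the shape of the curve.
FIXED_COST = 100_000_000        # up-front investment to build the platform
MARGINAL_COST = 0.10            # cost of serving one additional user
NETWORK_VALUE_COEFF = 0.000005  # per-user value grows with the user count

for users in (100_000, 1_000_000, 10_000_000, 100_000_000):
    avg_cost = FIXED_COST / users + MARGINAL_COST  # falls as users grow
    value = NETWORK_VALUE_COEFF * users            # rises as users grow
    margin = value - avg_cost
    print(f"{users:>11,} users: cost/user ${avg_cost:,.2f}, "
          f"value/user ${value:,.2f}, margin/user ${margin:,.2f}")

# Margins are deeply negative at small scale and strongly positive at large
# scale: returns increase rather than diminish as the network grows.
```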

Arthur’s insight paved the way for the incredible success of Silicon Valley’s brand of venture-funded capitalism. Before long, runaway successes such as Yahoo, Amazon and Google made those who invested in the idea of increasing returns a mountain of money.

Yet the Silicon Valley model only works for a fairly narrow slice of technologies, mostly software and consumer gadgets. For other, so-called “hard technologies,” such as biotech, clean tech, materials science and manufacturing 4.0, the approach isn’t effective. There’s no way to rapidly prototype a cure for cancer or a multimillion-dollar piece of equipment.

Still, over the last decade a new ecosystem has been emerging that specifically targets these technologies. Some, like the LEEP programs at the National Laboratories, are government funded. Others, such as Steve Blank’s I-Corps program, focus on training scientists to become entrepreneurs. There are also increasingly investors who specialize in hard tech.

Look closely and you can see a subtle shift taking place. Traditionally, venture investors have been willing to take market risk but not technical risk. In other words, they wanted to see a working prototype, but were willing to take a flyer on whether demand would emerge. This new breed of investors is taking on technical risk in technologies, such as new sources of energy, for which there is little market risk if they can be made to work.

The Quantum Computing Ecosystem

At the end of 2019, Amazon announced Braket, a new quantum computing service that would utilize technologies from companies such as D-Wave, IonQ, and Rigetti. They were not alone. IBM had already been building its network of quantum partners for years which included high profile customers ranging from Goldman Sachs to ExxonMobil to Boeing.

Here’s the catch. Quantum computers can’t be used by anybody for any practical purpose. In fact, there’s nobody on earth who can even tell you definitively how quantum computing should work or exactly what types of problems it can be used to solve. There are, in fact, a number of different approaches being pursued, but none of them have proved out yet.

Nevertheless, an analysis by Nature found that private funding for quantum computing is surging and not just for hardware, but enabling technologies like software and services. The US government has created a $1 billion quantum technology plan and has set up five quantum computing centers at the national labs.

So if quantum computing is not yet a proven technology why is it generating so much interest? The truth is that the smart players understand that the potential of quantum is so massive, and the technology itself so different from anything we’ve ever seen before, that it’s imperative to start early. Get behind and you may never catch up.

In other words, they’re thinking for the long-term.

A Plan Isn’t Enough, You Need To Have A Vision

It’s become fashionable to bemoan the influence of investors and blame them for short-term and “quarterly capitalism,” but that’s just an excuse for failed leadership. If you look at the world’s most valuable companies—the ones investors most highly prize—you’ll find a very different story.

Apple’s Steve Jobs famously disregarded the opinions of investors (and just about everybody else as well). Amazon’s Jeff Bezos, who habitually keeps margins low in order to increase market share, has long been a Wall Street darling. Microsoft invested heavily in a research division aimed at creating technologies that won’t pay off for years or even decades.

The truth is that it’s not enough to have a long-term plan, you have to have a vision to go along with it. Nobody wants to “wait” for profits, but everybody can get excited about a vision that inspires them. Who doesn’t get thrilled by the possibility of a colony on Mars, miracle cures, revolutionary new materials or a new era of computing?

Here’s the thing: Just because you’re not thinking long-term doesn’t mean somebody else isn’t and, quite frankly, if they are able to articulate a vision to go along with that plan, you don’t stand a chance. You won’t survive. So take some time to look around, to dream a little bit and, maybe, to be inspired to do something worthy of a legacy.

All who wander are not lost.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels

Why Change Failure Occurs

GUEST POST from Greg Satell

Never has the need for transformation been so dire or so clear. Still, that’s no guarantee that we will muster the wisdom to make the changes we need to. After all, President Bush warned us about the risks of a global pandemic way back in 2005 and, in the end, we were left wholly vulnerable and exposed.

It’s not like pandemics are the only thing to worry about either. A 2018 climate assessment warns of major economic impacts unless we make some serious shifts. Public debt, already high before the current crisis, is now exploding upwards. Our electricity grid is insecure and vulnerable to cyberattack. The list goes on.

All too often, we assume that mere necessity can drive change forward, yet history has shown that not to be the case. There’s a reason why nations fail and businesses go bankrupt. The truth is that if a change is important, some people won’t like it and they will work to undermine it in underhanded and insidious ways. That’s what we need to overcome.

A Short History Of Change

For most of history, until the industrial revolution, people existed as they had for millennia and could live their entire lives without seeing much change. They farmed or herded for a living, used animals for power and rarely travelled far from home. Even in the 20th century, most people worked in an industry that changed little during their career.

In the 1980s, management consultants began to notice that industries were beginning to evolve more rapidly and firms that didn’t adapt would lose out in the marketplace. One famous case study showed how Burroughs moved aggressively into electronic computing and prospered while its competitor NCR lagged and faded into obscurity.

In 1983, McKinsey consultant Julien Phillips published a paper in the journal, Human Resource Management, that described an “adoption penalty” for firms that didn’t adapt to changes in the marketplace quickly enough. His ideas became McKinsey’s first change management model that it sold to clients.

Yet consider that research shows that in 1975, during the period Phillips studied, 83% of the average US corporation’s assets were tangible, such as plant, machinery and buildings, while by 2015, 84% of corporate assets were intangible, such as licenses, patents and human capital. In other words, change today involves people, their knowledge and their behaviors more than it does strategic assets.

Clearly, that changes the game entirely.

What Change Looks Like Today

Think about how America was transformed after World War II. We created the Interstate Highway System to tie our nation together. We established a new scientific infrastructure that made us a technological superpower. We built airports, shopping malls and department stores. We even sent a man to the moon.

Despite the enormous impact of these accomplishments, none of those things demanded that people dramatically change their behavior. Nobody had to drive on an Interstate highway, work in a lab, travel in space or move to the suburbs. Many chose to do those things, but others did not and paid little or no penalty for their failure to change with the times.

Today the story is vastly different. A crisis like Covid-19 required us to significantly alter our behavior and, not surprisingly, some people didn’t like it and resisted. We could, as individuals, choose to wear a mask, but if others didn’t follow suit the danger remained. We can, as a society, invest billions in a vaccine, but if a significant portion don’t take it, the virus will continue to mutate at a rapid rate, undermining the effectiveness of the entire enterprise.

Organizations face similar challenges. Sure they invest in tangible assets, such as plant and equipment, but any significant change will involve changing people’s beliefs and behaviors and that is a different matter altogether. Today, even technological transformations have a significant human component.

Making Room For Identity And Dignity

In the early 19th century, a movement of textile workers known as the Luddites smashed machines to protest the new, automated mode of work. As skilled workers, they saw their way of life being destroyed in the name of progress because the new technology could make fabrics faster and cheaper with fewer, less skilled workers.

Today, “Luddite” has become a pejorative term to describe people who are unable or unwilling to accept technological change. Many observers point out that the rise of industry created new and different jobs and increased overall prosperity. Yet that largely misses the point. Weavers were skilled artisans who worked for years to hone their craft. What they did wasn’t just a job, it was who they were and what they took pride in.

One of the great misconceptions of our modern age is that people make decisions based on rational calculations of utility and that, by engineering the right incentives, we can control behavior. Yet people are far more than economic entities. They crave dignity and recognition, to be valued, in other words, as ends in themselves rather than merely as means to an end.

That’s why changing behaviors can be such a tricky thing. While some may see being told to wear a mask or socially distance as simply doing what “science says,” for others it is an imposition on their identity and dignity from outside their community. Perhaps not surprisingly, they rebel and demand to have their right to choose be recognized.

Building Change On Common Ground

The biggest misconception about change is that once people understand it, they will embrace it, and so the best way to drive change forward is to explain the need for change in a very convincing and persuasive way. Change, in this view, is essentially a communication exercise and the right combination of words and images is all that is required.

Yet, as should be clear by now, that is simply not true. People will often oppose change because it asks them to alter their identity. The Luddites didn’t just oppose textile machinery on economic grounds, but because it failed to recognize their skills as weavers. People don’t necessarily oppose wearing masks because they are “anti-science,” but because they resent having their behavior mandated from outside their community.

In other words, change is always, at some level, about what people value. That’s why to bring change about you need to identify shared values that reaffirm, rather than undermine, people’s sense of identity. Recognition is often a more powerful incentive than even financial rewards. In the final analysis, lasting change always needs to be built on common ground.

Over the next decade, we will undergo some of the most profound shifts in history, encompassing technology, resources, migration patterns and demography and, if we are to compete, we will need to achieve enormous transformation in business and society. Whether we are able to do that or not depends less on economics or “science” than it does on our ability to trust each other again.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels

We're Disrupting People Instead of Industries Now

In 1997, when Clayton Christensen first published The Innovator’s Dilemma and introduced the term “disruptive innovation,” it was a clarion call. Business leaders were put on notice: It is no longer enough to simply get better at what you already do, you need to watch out for a change in the basis of competition that will open the door for a disruptive competitor.

Today, it’s become fashionable for business pundits to say that we live in a VUCA era, one that is volatile, uncertain, complex and ambiguous, but the evidence says otherwise. Increasingly researchers are finding that businesses are enjoying a period that is less disruptive, less competitive and less dynamic.

The truth is that we don’t really disrupt businesses anymore, we disrupt people and that’s truly becoming a problem. As businesses are increasingly protected from competition, they are becoming less innovative and less productive. Americans, meanwhile, are earning less and paying more. It’s time we stop doubling down on failed ideas and begin to right the ship.

The Productivity Paradox

In the 1920s two emerging technologies, internal combustion and electricity, finally began to hit their stride and kicked off a 50-year boom in productivity growth. During that time things changed dramatically. We shifted from a world where few Americans had indoor plumbing, an automobile or electrical appliances to one in which the average family had all of these things.

Technology enthusiasts like to compare the digital revolution with that earlier era, but that’s hardly the case. If anybody today were magically transported 50 years back to 1970, they would see much they would recognize. Yet if most modern people had to live in 1920, where even something as simple as cooking a meal required hours of backbreaking labor, they would struggle to survive.

The evidence is far more than anecdotal, however. Productivity statistics clearly show that productivity growth started to slow in the early 1970s, just as computer investment began to rise. With the introduction of the Internet, there was a brief bump in productivity between 1996 and 2004, but then it disappeared again. Today, even with the introduction of social media, mobile Internet and artificial intelligence, we appear to be in a second productivity paradox.

Businesses can earn an economic profit in one of two ways. They can unlock new value through innovation or they can seek to reduce competition. In an era of diminished productivity, it shouldn’t be surprising that many firms have chosen the latter. What is truly startling is the ease and extent to which we have let them get away with it.

Rent Seeking And Regulatory Capture

Investment decisions are driven by profit expectations. If, for instance, a firm sees great potential in a new technology, they will invest in research and development. On the other hand, if they see greater potential influencing governments, they will invest in that. So it is worrying that lobbying expenditures have more than doubled since 1998.

The money goes towards two basic purposes. The first, called rent seeking, involves businesses increasing profits by getting the law to work in their favor, as when car dealerships in New Jersey sued to block Tesla’s direct sales model. The second, regulatory capture, seeks to co-opt agencies that are supposed to govern industry.

It seems like they’re getting their money’s worth. Corporate tax rates in the US have steadily decreased and are now among the lowest in the developed world. Occupational licensing, often the result of lobbying by trade associations, has increased fivefold since the 1950s. Antitrust enforcement has become virtually nonexistent, and competition has declined.

The result is that while corporations earn record profits, we pay more and get less. This is especially clear in some highly visible industries, such as airlines, cable and mobile carriers, but the effect is much more widespread than that. Keep in mind that, in many states, legislators earn less than $20,000 per year. It’s easy to see how a little investment can go a long way.

Decreasing Returns To Labor

With businesses facing less competition and a more favorable regulatory environment, which not only lowers costs but raises barriers to new market entrants, it shouldn’t be surprising that the stock market has hit record highs. Ordinarily that would be something to cheer, but evidence suggests that the gains are coming at the expense of the rest of us.

A report from the McKinsey Global Institute finds that labor’s share of income has been declining rapidly since 2000, especially in the United States. This is, of course, due to a number of factors, such as low productivity, automation and globalization. Decreased labor bargaining power due to the increased market power of employers, however, has been shown to play an especially significant role.

At the same time that our wages have been reduced, the prices we pay have increased, especially in education and healthcare. A study from Pew shows that, for most Americans, real wages have hardly budged since 1964. Instead of becoming better off over time, many families are actually doing worse.

The effects of this long-term squeeze have become dire. Increasingly, Americans are dying deaths of despair from things like alcohol abuse, drug overdose, and suicide. Recent research has also shown that the situation has gotten worse during Covid.

We Are Entering A Dangerous Decade

Decades of disruption have left us considerably worse off. Income inequality is at record highs. Anxiety and depression, already at epidemic levels, have worsened during the Covid-19 pandemic. These trends are most acute in the US, but they are essentially global in nature and have contributed to the rise of populist authoritarianism around the world.

Things are likely to get worse over the next decade as we undergo profound shifts in technology, resources, migration and demographics. To put that in perspective, a demographic shift alone was enough to make the 60s a tumultuous era. Clearly, our near future is fraught with danger.

Yet history is not destiny. We have the power to shape our path by making better choices. A good first step would be to finally abandon the cult of disruption that’s served us so poorly and begin to once again invest in stability and resilience, by creating better, safer technology, more competitive and stable markets and a happier, more productive workforce.

Perhaps most of all, we need to internalize the obvious principle that systems and ideologies should serve people, not the other way around. If we increase GDP and the stock market hits record highs, but the population is poorer, less healthy and less happy, then what have we won?

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


A New Age Of Innovation and Our Next Steps

A New Age Of Innovation and Our Next Steps

GUEST POST from Greg Satell

In Mapping Innovation, I wrote that innovation is never a single event, but a process of discovery, engineering and transformation, and that those three things hardly ever happen at the same time or in the same place. Clearly, the Covid-19 pandemic marked an inflection point, one that brought important shifts in each of those phases.

Digital technology showed itself to be transformative, as we descended into quarantine and found an entire world of video conferencing and other technologies that we scarcely knew existed. At the same time it was revealed that the engineering of synthetic biology—and mRNA technology in particular—was more advanced than we had thought.

This is just the beginning. I titled the last chapter of my book, “A New Era of Innovation,” because it had become clear that we had begun to cross a new Rubicon, into an era in which digital technology becomes so ordinary and mundane that it’s hard to remember what life was like without it, while new possibilities alter our existence to an extent we will scarcely believe.

Post-Digital Architectures

For the past 50 years, the computer industry—and information technology in general—has been driven by the principle known as Moore’s Law, which held that the number of transistors we could fit on a chip would double roughly every 18 months. Yet now Moore’s Law is ending, and that means we will have to revisit some very basic assumptions about how technology works.
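
To get a rough sense of the compounding this implied (a back-of-the-envelope illustration, not a figure from the article), if transistor counts double every 18 months, then after t years a comparable chip holds roughly

\[ N(t) = N_0 \cdot 2^{\,t/1.5} \]

transistors, where \(N_0\) is the starting count. A 15-year span therefore works out to \(2^{10} \approx 1000\) times as many transistors, which is the “free” improvement referred to below.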

To be clear, the end of Moore’s Law does not mean the end of advancement. There are a number of ways we can speed up computing. We can, for instance, use technologies such as ASICs and FPGAs to optimize chips for specialized tasks. Still, those approaches come with tradeoffs; Moore’s Law essentially gave us innovation for free.

Another way out of the Moore’s Law conundrum is to shift to completely new architectures, such as quantum, neuromorphic and, possibly, biological computers. Yet here again, the transition will not be seamless or without tradeoffs. Instead of technology based on transistors, we will have multiple architectures based on entirely different logical principles.

So it seems that we will soon be entering a new era of heterogeneous computing, in which we use digital technology to access different technologies suited to different tasks. Each of these technologies will require very different programming languages and algorithmic approaches and, most likely, different teams of specialists to work on them.

What that means is that those who run the IT operations in the future, whether that person is a vaunted CTO or a lowly IT manager, will be unlikely to understand more than a small part of the system. They will have to rely heavily on the expertise of others to an extent that isn’t required today.

Bits Driving Atoms

While the digital revolution does appear to be slowing down, computers have taken on a new role in helping to empower technologies in other fields, such as synthetic biology, materials science and manufacturing 4.0. These, unlike so many digital technologies, are rooted in the physical world and may have the potential to be far more impactful.

Consider the revolutionary mRNA technology, which not only empowered us to develop a Covid vaccine in record time and save the planet from a deadly pandemic, but also makes it possible to design new vaccines in a matter of hours. There is no way we could achieve this without powerful computers driving the process.

There is similar potential in materials discovery. Suffice it to say, every product we use, whether it is a car, a house, a solar panel or whatever, depends on the properties of materials to perform its function. Some need to be strong and light, while others need special electrical properties. Powerful computers and machine learning algorithms can vastly improve our ability to discover better materials (not to mention overcome supply chain disruptions).

Make no mistake, this new era of innovation will be one of atoms, not bits. The challenge we face now is to develop computer scientists who can work effectively with biologists, chemists, factory managers and experts of all kinds to truly create a new future.

Creation And Destruction

The term creative destruction has become so ingrained in our culture that we scarcely stop to think where it came from. It was popularized by economist Joseph Schumpeter to resolve what many saw as an essential “contradiction” of capitalism. Essentially, some thought that if capitalists did their jobs well, surplus value would keep increasing and would be appropriated to accumulate power and rig the system further in capitalists’ favor.

Schumpeter pointed out that this wasn’t necessarily true because of technological innovation. Railroads, for example, completely changed the contours of competition in the American Midwest. To be sure, there had been unfair competition in many cities and towns, but once the railroad came to town, competition flourished (and if it didn’t come, the town died).

For most of history since the beginning of the Industrial Revolution, this has been a happy story. Technological innovation displaced businesses and workers, but resulted in increased productivity which led to more prosperity and entirely new industries. This cycle of creation and destruction has, for the most part, been a virtuous one.

That is, until fairly recently. Digital technology, despite the hype, hasn’t produced the kind of productivity gains that earlier technologies, such as electricity and internal combustion, did, but it has displaced labor at a faster rate. Put simply, the productivity gains from digital technology are too meager to finance enough new industries with better jobs, which has created income inequality rather than greater prosperity.

We Need To Move From Disrupting Markets To Tackling Grand Challenges

There’s no doubt that digital technology has been highly disruptive. In industry after industry, from retail to media to travel and hospitality, nimble digital upstarts have set established industries on their head, completely changing the basis upon which firms compete. Many incumbents haven’t survived. Many others are greatly diminished.

Still, in many ways, the digital revolution has been a huge disappointment. Besides the meager productivity gains, we’ve seen a global rise in authoritarian populism, stagnant wages, reduced productivity growth and less competitive markets, not to mention an anxiety epidemic, increased obesity and, at least in the US, decreased life expectancy.

We can—and must—do better. We can learn from the mistakes we made during the digital revolution and shift our mindset from disrupting markets to tackling grand challenges. This new era of innovation will give us the ability to shape the world around us like never before, down to the molecular level, and to achieve incredible things.

Yet we can’t just leave our destiny to the whims of market and technological forces. We must actually choose the outcomes we prefer and build strategies to achieve them. The possibilities that we will unlock from new computing architectures, synthetic biology, advanced materials science, artificial intelligence and other things will give us that power.

What we do with it is up to us.

— Image credit: Pixabay


A Trigger Strategy for Driving Radical, Transformational Change

A Trigger Strategy for Driving Radical, Transformational Change

GUEST POST from Greg Satell

There’s an old adage that says we should never let a crisis go to waste. The point is that during a crisis there is a visceral sense of urgency and resistance often falls by the wayside. We’ve certainly seen that during the Covid pandemic. Digital technologies such as video conferencing, online grocery and telehealth have gone from fringe to mainstream in record time.

Seasoned leaders learn how to make good use of a crisis. Consider Bill Gates and his Internet Tidal Wave memo, which leveraged what could have been a mortal threat to Microsoft into a springboard to even greater dominance. Or how Steve Jobs used Apple’s near-death experience to reshape the ailing company into a powerhouse.

But what if we could prepare for a trigger before it happens? The truth is that indications of trouble are often clear long before the crisis arrives. Clearly, there were a number of warning signs that a pandemic was possible, if not likely. As every good leader knows, there’s never a shortage of looming threats. If we learn to plan ahead, we can make a crisis work for us.

The Plan Hatched In A Belgrade Cafe

In the fall of 1998, five young activists met in a coffee shop in Belgrade, Serbia. Although still in their twenties, they were already grizzled veterans. In 1992, they took part in student protests against the war in Bosnia. In 1996, they helped organize a series of rallies in response to Slobodan Milošević’s attempt to steal local elections.

To date, their results were decidedly mixed. The student protests were fun, but when the semester ended, everyone went home for the summer and that was the end of that. The 1996 protests were more successful, overturning the fraudulent results, but the opposition coalition, called “Zajedno,” soon devolved into infighting.

So they met in the coffee shop to discuss their options for the upcoming presidential election to be held in 2000. They knew from experience that they could organize rallies effectively and get people to the polls. They also knew that when they got people to the polls and won, Milošević would use his power and position to steal the election.

That would be their trigger.

The next day, six friends joined them and they called their new organization Otpor. Things began slowly, with mostly street theatre and pranks, but within 2 years their ranks had swelled to more than 70,000. When Milošević tried to steal the election they were ready and what is now known as the Bulldozer Revolution erupted.

The Serbian strongman was forced to concede. The next year, Milošević would be arrested and sent to The Hague to face charges of crimes against humanity. He would die in his prison cell in 2006, before his trial concluded.

Opportunity From The Ashes

In 2014, in the wake of the Euromaidan protests that swept the thoroughly corrupt autocrat Viktor Yanukovych from power, Ukraine was in shambles. The country had been looted of roughly $100 billion (about the size of its entire GDP) and invaded by Russia, and things looked bleak. Without Western aid, the proud nation’s very survival was in doubt.

Yet for Vitaliy Shabunin and the Anti-Corruption Action Center, it was the moment they had been waiting for. He had established the organization with his friend Dasha Kaleniuk a few years earlier. Since then they, along with a small staff, had been working with international NGOs to document corruption and develop effective legislation to fight it.

With Ukraine’s history of endemic graft, which had greatly worsened under Yanukovych, progress had been negligible. Yet now, with the IMF and other international institutions demanding reform, Shabunin and Kaleniuk were instantly in demand to advise the government on instituting a comprehensive anti-corruption program, which passed in record time.

Yet they didn’t stop there either. “Our long-term strategy is to create a situation in which it will be impossible not to do anti-corruption reforms,” Shabunin would later tell me. “We are working to ensure that these reforms will be done, either by these politicians or by another, because they will lose their office if they don’t do these reforms.”

Vitaliy, Dasha and the Anti-Corruption Action Center continue to prepare for future triggers.

The Genius Of Xerox PARC

One story that Silicon Valley folks love to tell involves Steve Jobs and Xerox. After the copier giant made an investment in Apple, which was then a fledgling company, it gave Jobs access to its Palo Alto Research Center (PARC). He then used the technology he saw there to create the Macintosh. Jobs built an empire based on Xerox’s oversight.

Yet the story misses the point. By the late ’60s, Xerox CEO Peter McColough knew that the copier business, while still incredibly profitable, was bound to be disrupted eventually. At the same time, it was becoming clear that computer technology was advancing quickly and, someday, would revolutionize how we worked. PARC was created to prepare for that trigger.

The number of groundbreaking technologies created at PARC is astounding. The graphical user interface, networked computing, object-oriented programming, the list goes on. Virtually everything that we came to know as “personal computing” had its roots in the work done at PARC in the 1970s.

Most of all, PARC saved Xerox. The laser printer invented there would bring in billions and, eventually, largely replace the copier business. Some technologies were spun off into new companies, such as Adobe and 3Com, with an equity stake going to Xerox. And, of course, the company even made a tidy profit off the Macintosh, because of the equity stake that gave Jobs access to the technology in the first place.

Transforming An Obstacle Into A Design Constraint

The hardest thing about change is that, typically, most people don’t want it. If they did, it would already be the normal state of affairs. That can make transformation a lonely business. The status quo has inertia on its side and never yields its power gracefully. The path for an aspiring changemaker can be heartbreaking and soul-crushing.

Many would see the near-certainty that Milošević would try to steal the election as an excuse to do nothing. Most people would look at the almost impossibly corrupt Yanukovych regime and see the idea of devoting your life to anti-corruption reforms as quixotic folly. It is extremely rare for a CEO whose firm dominates an industry to ask, “What comes after?”

Yet anything can happen and often does. Circumstances conspire. Events converge. Round-hole businesses meet their square-peg world. We can’t predict exactly when or where or how or what will happen, but we know that everybody and everything gets disrupted eventually. It’s all just a matter of time.

When that happens, resistance to change temporarily abates. So there’s lots to do and no time to wait. We need to empower our allies, as well as listen to our adversaries. We need to build out a network to connect to others who are sympathetic to our cause. Transformational change is always driven by small groups, loosely connected, but united by a common purpose.

Most of all, we need to prepare. A trigger always comes and, when it does, it brings great opportunity with it.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay
