Author Archives: Greg Satell

About Greg Satell

Greg Satell is a popular speaker and consultant. His latest book, Cascades: How to Create a Movement That Drives Transformational Change, is available now. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.

The Era of Moving Fast and Breaking Things is Over


GUEST POST from Greg Satell

On July 16th, 1945, when the world’s first nuclear explosion shook the plains of New Mexico, the scientific director of the Manhattan Project, J. Robert Oppenheimer, quoted the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.” Clearly, he was troubled by what he had unleashed, and for good reason. The world was never truly the same after that.

Today, however, we have lost much of that reverence for the power of technology. Instead of proceeding deliberately and with caution, tech entrepreneurs have prided themselves on their willingness to “move fast and break things” and, almost reflexively, deride anyone who questions the practice as someone who “doesn’t get it.”

It’s hard to see how, by any tangible metric, any of this has made us better off. We set out to disrupt industries, but disrupted people instead. It wasn’t always like this. Throughout our history we have asked hard questions and made good choices about technological progress. As we enter a new era of innovation, we desperately need to recapture some of that wisdom.

How We Put the Nuclear Genie Back in the Bottle

The story of nuclear weapons didn’t start with Oppenheimer, not by a long shot. In fact, if we were going to attribute the Manhattan Project to a single person, it would probably be a Hungarian immigrant physicist named Leo Szilard, who was one of the first to conceive of the possibility of a nuclear chain reaction.

In 1939, upon hearing of the discovery of nuclear fission in Germany, he, along with fellow Hungarian emigre Eugene Wigner, decided that the authorities needed to be warned. Szilard then composed a letter warning of the possibility of a nuclear bomb that was eventually signed by Albert Einstein and sent to President Roosevelt. That’s what led to the American development program.

Yet after the explosions at Hiroshima and Nagasaki, many of the scientists who worked to develop the bomb wanted to educate the public of its dangers. In 1955, the philosopher Bertrand Russell issued a manifesto signed by a number of scientific luminaries. Based on this, a series of conferences at Pugwash, Nova Scotia were convened to discuss different approaches to protect the world from weapons of mass destruction.

These efforts amounted to far more than talk; they helped shape the non-proliferation agenda and led to concrete achievements such as the Partial Test Ban Treaty. In fact, these contributions were so crucial that the organizers of the Pugwash conferences were awarded the Nobel Peace Prize in 1995, and the conferences continue to this day.

Putting Limits On What We Do With the Code of Life

While the nuclear age started with a bang, the genetic age began with a simple article in the scientific journal Nature, written by two relatively unknown scientists named James Watson and Francis Crick, that described the structure of DNA. It was one of those few watershed moments when an entirely new branch of science arose from a single event.

The field progressed quickly and, roughly 20 years later, a brilliant researcher named Paul Berg discovered that you could merge human DNA with that from other living things, creating new genetic material that didn’t exist in nature. Much like Oppenheimer, Berg understood that, due to his work, humanity stood on a precipice and it wasn’t quite clear where the edge was.

He organized a conference at Asilomar State Beach in California to establish guidelines. Importantly, participation wasn’t limited to scientists. A wide swath of stakeholders was invited, including public officials, members of the media and ethics specialists. Building on an earlier open letter, now known as the Berg Letter, which had called for a moratorium on the riskiest experiments until the dangers were better understood, the conference established safety norms that were respected for decades.

Today, we’re undergoing another revolution in genomics and synthetic biology. New technologies, such as CRISPR and mRNA techniques, have opened up incredible possibilities, but also serious dangers. Yet here again, pioneers in the field like Jennifer Doudna are taking the lead in devising sensible guardrails and using the technology responsibly.

The New Economy Meets the New Era of Innovation

When Netscape went public in 1995, it hit like a bombshell. It was the first big Internet stock and, although originally priced at $14 per share, it opened at double that amount and quickly zoomed to $75. By the end of the day, it had settled back at $58.25. Still, a tiny enterprise with no profits was almost instantly worth $2.9 billion.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet today, it’s clear that the “new economy” was a mirage. Despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

The digital revolution has been a real disappointment. In fact, when you look at outcomes, if anything we’re worse off. Rather than a democratized economy, market concentration has markedly increased in most industries. Income inequality in advanced economies has soared. In America wages have stagnated and social mobility has declined for decades. At the same time, social media has been destroying our mental health.

Now we’re entering a new era of innovation, in which we will unleash far more powerful technologies. New computing architectures like quantum and neuromorphic technologies will power things like synthetic biology and materials science to create things that would have seemed like science fiction a generation ago. We simply can no longer afford to be so reckless.

Shifting From Agility Toward Resilience

Moving fast and breaking things only seems like a good idea in a stable world. When you operate in a safe environment, it’s okay to take a little risk and see what happens. Clearly, we no longer live in such a world (if we ever did). Taking on more risk in financial markets led to the Great Recession. Being blasé about data security has nearly destroyed our democracy. Failure to prepare for a pandemic has nearly brought modern society to its knees.

Over the next decade, the dangers will only increase. We will undergo four major shifts in technology, resources, migration and demographics. To put that in perspective, a similar shift in demography was enough to make the 1960s a tumultuous decade. We haven’t seen a confluence of so many disruptive forces since the 1920s, and that didn’t end well.

Unfortunately, it’s far too easy to underinvest in mitigating risks that may never come to fruition. Moving fast and breaking things can seem attractive because the costs are often diffuse. While it has made society as a whole worse off in so many ways, it has created a small cadre of fabulously wealthy plutocrats.

Yet history is not destiny. We have the power to shape our path by making better choices. We can abandon the cult of disruption and begin to invest in resilience. In fact, we have to. By this point there should be no doubt that the dangers are real. The only question is whether we will act now or simply wait for them to materialize and accept the consequences.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


New Skills Needed for a New Era of Innovation


GUEST POST from Greg Satell

The late Clayton Christensen had a theory about “jobs to be done.” In his view, customers don’t buy products as much as they “hire” companies to do specific “jobs” for them. To be competitive, firms need to understand what that job is and how to do it well. In other words, no one wants a quarter-inch drill bit; they want a quarter-inch hole.

The same can be said for an entire society. We need certain jobs to be done and will pay handsomely for ones that we hold in high regard, even as we devalue others. Just as being the best blacksmith in town won’t earn you much of a living today, great coding skills wouldn’t do you much good in a medieval village.

This is especially important to keep in mind today as the digital revolution comes to an end and we enter a new era of innovation in which some tasks will be devalued and others will be increasingly in demand. Much like Christensen said about firms, we as a society need to learn to anticipate which skills will lose value in future years and which will be considered critical.

The Evolution of Economies

The first consumer product was most likely the Acheulean hand axe, invented by some enterprising stone age entrepreneur over 100,000 years ago. Evidence suggests that, for the most part, people made stone axes themselves, but as technology evolved, some began to specialize in different crafts, such as smithing, weaving, cobbling and so on.

Inventions like the steam engine, and then later electricity and the internal combustion engine, brought about the industrial revolution, which largely put craftsmen out of work and reshaped society around cities that could support factories. It also required new skills to organize work, leading to the profession of management and the knowledge economy.

The inventions of the microchip and the internet have led to an information economy in which even a teenager with a smartphone has better access to knowledge than a specialist working in a major institution a generation ago. Much like the industrial era automated physical tasks, the digital era has automated many cognitive tasks.

Now, as the digital era ends, we are entering a new era of innovation in which we will shift to post-digital computing architectures, such as quantum computing and neuromorphic chips, and enormous value will be created through bits powering atoms in fields like synthetic biology and materials science.

Innovation, Jobs and Wages

As economies evolved, some tasks became devalued as others increased in importance. When people could go to a smith for metal tools, they had no need to create stone axes. In much the same way, the industrial revolution put craft guilds out of business and technologies like tractors and combine harvesters drastically reduced the number of people working on farms.

Clearly replacing human labor with technology is disruptive, but it has historically led to dramatic increases in productivity. So labor displacement effects have been outweighed by greater wages and new jobs created by new industries. For the most part, innovation has made all of us better off, even, to a great extent, the workers who were displaced.

Consider the case of Henry Ford. Because technology replaced many tasks on the family farm, he didn’t need to work on it and found a job as an engineer for Thomas Edison, where he earned enough money and had enough leisure time to tinker with engines. That led him to create his own company, pioneer an industry and create good jobs for many others.

Unfortunately, there is increasing evidence that more recent innovations may not be producing comparable amounts of productivity and that’s causing problems. For example, when a company replaces a customer service agent with an automated system, it’s highly doubtful that the productivity gains will be enough to finance entire new industries that will train that call center employee to, say, design websites or run marketing campaigns.

Identifying New Jobs To Be Done

To understand the disconnect between technological innovation and productivity it’s helpful to look at some underlying economic data. In US manufacturing, for instance, productivity has skyrocketed, roughly doubling output in the 30 years between 1987 and 2017, even as employment in the sector decreased by roughly a third.

It is the increased productivity growth in manufacturing that has fueled employment growth in the service sector. However, productivity gains in service jobs have been relatively meager and automation through technological innovation has not resulted in higher wages, but greater income inequality as returns to capital dwarf returns to labor.

Further economic analysis shows that the divide isn’t so much between “white collar” and “blue collar” jobs, but between routine and non-routine tasks. So warehouse workers and retail clerks have suffered, but designers and wedding planners have fared much better. In other words, technological automation is creating major shifts in the “jobs to be done.”

A recent analysis by the McKinsey Global Institute bears this out. It identified 56 “foundational skills” that are crucial to the future of work. These don’t fall into traditional categories such as “engineering” or “sales,” but include things like self-awareness, emotional intelligence and critical thinking.

Collaboration Is The New Competitive Advantage

The industrial revolution drove a shift from animal power to machine power and from physical skills to cognitive skills. What we’re seeing now is a similar shift from cognitive skills to social skills. As automation takes over many routine cognitive tasks, the “job” that humans are increasingly valued for is relating to other humans.

There are some things a machine will never do. An algorithm will never strike out at a Little League game, see its child born or have a bad day at work. We can, of course, train computers to mimic these things by training them on data, but they will never actually have the experience and that limits their ability to fully relate to human emotions.

To see how this is likely to play out, simply go and visit your local Apple Store. It is a highly automated operation, without traditional checkout aisles or cash registers. Still, the first thing that catches your eye is a sea of blue shirts waiting to help you. They are not there to execute transactions, which you can easily do online, but to engage with you, understand what you’re trying to achieve and help you get it done.

We’ve seen similar trends at work even in highly technical fields. A study of 19.9 million scientific papers found that not only has the percentage of papers published by teams steadily increased over the past 50 years, but the size of those teams has also grown and their research is more highly cited. The journal Nature got similar results and also found that the work being done is far more interdisciplinary and done at greater distances.

What’s becoming clear is that collaboration is increasingly becoming a competitive advantage. The ultimate skill is no longer knowledge or proficiency in a particular domain, but the ability to build a shared purpose with others, who possess a diverse set of skills and perspectives, in order to solve complex problems. In other words, the most important jobs are the ones we do in the service of a common objective.

— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash


Parallels Between the 1920s and Today Are Frightening


GUEST POST from Greg Satell

It should be clear by now we are entering a pivotal era. We are currently undergoing four profound shifts: changing patterns of demographics, migration, resources and technology. The stress lines are already beginning to show, with increasing tensions over race and class as well as questions about the influence technology and institutions have over our lives.

The last time we faced anything like this kind of tumult was in the 1960s, which, much like today, saw the emergence of a new generation, the Baby Boomers, with very different values from their predecessors. Their activism achieved significant advances for women and minorities, but also, at times, led to tumult and riots.

Yet the changes we are undergoing today appear to be even more significant than those we experienced then. In fact, you would have to go back to the 1920s to find an era that had as much potential for both prosperity and ruin. Unfortunately, that era led to economic upheaval, genocide and war on a scale never before seen in world history. We need to do better this time around.

Panics, Pandemics and War

A Wall Street crisis that threatened the greater economy and led to sweeping legislation reshaping government influence in the financial sector was a prelude to both the 1920s and the 2020s. Both the Bankers Panic of 1907 and the Great Recession, which began in 2007, resulted in landmark legislation, the Federal Reserve Act and Dodd-Frank, respectively.

Continuing in the same vein of eerie parallel, the 1918 flu pandemic killed between 20 million and 50 million people and raged for more than two years, until 1920, when it was finally brought under control. Much like today, there were social distancing guidelines, significant economic impacts and long-term effects on educational attainment.

Perhaps not surprisingly, there was no small amount of controversy about measures taken to control the pandemic a century ago. People were frustrated with isolation (it goes without saying that there was no Netflix in 1918). Organizations like the Anti-Mask League of San Francisco rose up in defiance.

The years leading up to the 1920s were also war-torn, with World War I ravaging Europe and the colonial order increasingly coming under pressure. Much like the “War on Terrorism” today, the organized violence, combined with the panics and pandemics, made for an overall feeling that society was unravelling, and many began to look for a scapegoat.

Migration, Globalization and Nativism

In 1892, Ellis Island opened its doors and America became a beacon to those around the world looking for a better life. New immigrants poured in and, by 1910, almost 15% of the US population were immigrants. As the 1920s approached, the strains in society were becoming steadily more obvious and more visceral.

The differences among the newcomers aroused suspicion, perhaps best exemplified by the Sacco and Vanzetti trial, in which two apparently innocent immigrants were convicted and executed for murder. Many believed that the new arrivals brought disease, criminality and “un-American” political and religious beliefs, especially with regard to Bolshevism.

Fears began to manifest themselves in growing nativism and there were increasing calls to limit immigration. The Immigration Act of 1917 specifically targeted Asians and established a literacy test for new arrivals. The Immigration Act of 1924 established quotas that favored Northern and Western Europeans over those from Southern and Eastern Europe, as well as Jews. The film The Birth of a Nation led to a resurgence of the Ku Klux Klan.

Scholars see many parallels between the run-up to the 1920s and today. Although nativism these days is primarily directed against Muslims and immigrants from South America, the same accusations of un-American political and religious beliefs, as well as outright criminality, are spurring on a resurgence of hate groups like the Proud Boys. Attorney General Merrick Garland has pledged to make prosecuting white supremacists a top priority.

A New Era of Innovation

As Robert Gordon explained in The Rise and Fall of American Growth, prosperity in the 20th century was largely driven by two technologies, electricity and the internal combustion engine. Neither was linear or obvious. Both were first invented in the 1880s but didn’t really begin to scale until the 1920s.

That’s not uncommon. In fact, it takes decades for a new discovery to make a measurable impact on the world. That’s how long is needed to first identify a useful application for a technology and then for ecosystems to form and secondary technologies to arise. Electricity and internal combustion would ignite a productivity boom that would last 50 years, from roughly 1920 until 1970.

For example, as the economist Paul David explained in a highly cited paper, it wasn’t the light bulb, but the way electricity allowed managers to rearrange work in factories, that first had a significant effect on society. Yet it was in the 1920s that things really began to take off. Refrigerated rail cars transformed diets and labor-saving appliances such as the vacuum cleaner would eventually pave the way for women in the workforce. The first radio stations appeared, revolutionizing entertainment.

Today, although the digital revolution itself has largely been a disappointment, there’s considerable evidence that we may be entering a new era of innovation as the emphasis shifts from bits to atoms. New computing architectures, such as quantum and neuromorphic computing, as well as synthetic biology and materials science, may help to reshape the economy for decades to come.

A Return to Normalcy?

Not surprisingly, by 1920 the American people were exhausted. Technological change, cultural disruption brought about by decades of mass immigration, economic instability and war made people yearn for calmer, gentler times. Warren G. Harding’s presidential campaign touted “a return to normalcy” and people bought in.

Yet while the “Roaring Twenties” are remembered as a golden age, they sowed the seeds for what came later. Although the stock market boomed, a lack of regulation led to the crash of 1929 and the Great Depression. The harsh reparations imposed by the Treaty of Versailles made the rise of Hitler possible.

The 1930s brought almost unimaginable horror. Economic hardship in Europe paved the way for fascism. Failed collectivization in the Soviet Union led to massive famine and, later, Stalin’s great purges. Rising nativism, in the US and around the world, led to diminished trade as well as violence against Jews and other minorities. World War II was almost inevitable.

It would be foolish beyond belief to deny the potential of history repeating itself. Still, the past is not necessarily prologue. The 1930s were not the inevitable result of impersonal historical forces, but of choices consciously made. We could have made different ones and received the bounty of the prosperity that followed World War II without the calamity that preceded it.

What we have to come to terms with is that technology won’t save us. Markets won’t save us. Our future will be the product of the choices we make. We should endeavor to choose wisely.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Four Innovation Ecosystem Building Blocks


GUEST POST from Greg Satell

It’s hard to find anyone who wouldn’t agree that Microsoft’s 2001 antitrust case was a disaster for the company. Not only did it lose the case, but it wasted time, money and—perhaps most importantly—focus on its existing businesses, which could have been far better deployed on new technologies like search and mobile.

Today, Microsoft is a much different organization. Rather than considering open source software a cancer, it now says it loves Linux. Its cloud business is growing like wildfire and it is partnering widely to develop new quantum computers. What was previously a rapacious monopolist is now an enthusiastic collaborator.

That’s no accident. Today, we need to compete in an ecosystem-driven world in which nobody, not even a firm as big and powerful as Microsoft, can go it alone. Power no longer comes from the top of value chains, but emanates from the center of networks. That means that strategy needs to shift from dominating industries to building collaborative ecosystems.

1. Connect to Startups

In its heyday, Microsoft enthusiastically followed Michael Porter’s five forces model. It saw threats coming not only from direct competitors, but also suppliers, customers, substitute products and new market entrants. Startups, in particular, were targeted for either acquisition or destruction if they were seen as posing a potential threat.

Today, however, Microsoft actively supports startups. Take, for example, its quantum development effort, in which it is partnering with more than a dozen entrepreneurial companies. These firms also get free access to Microsoft technologies, such as its Azure cloud platform and go-to-market resources and advice, through its Microsoft for Startups program.

Another approach that many firms take is a corporate VC program, which actively invests in promising new companies. Unlike a typical investor, corporations bring a wealth of market and technical expertise and can help with things like distribution, supply chain management and marketing. Corporations, for their part, get far more insight into new technologies than they otherwise could.

Scott Lenet, President of Touchdown Ventures, which operates venture funds for corporations, told me that, “Startups thrive on new ideas and big firms know how to scale and improve those ideas. We’ve seen some of our investments really blossom based on that kind of partnership.”

2. Form Ties to the Academic World

When Sun Microsystems co-founder Bill Joy said, “no matter who you are, most of the smartest people work for someone else,” he was explicitly referring to Bill Gates’s assertion that Microsoft was an “IQ monopolist.” Joy’s position was that “It’s better to create an ecology that gets all the world’s smartest people toiling in your garden for your goals. If you rely solely on your own employees, you’ll never solve all your customers’ needs.”

Make no mistake. Innovation is never a single event. It is a process of discovery, engineering and transformation and those three things almost never happen in the same place or at the same time. That’s why the most innovative companies work hard to build links to the best minds in the academic world.

Today Microsoft has an extensive academic program that extends grants to graduate students and faculty members that are pursuing research that is of interest to the company. Google takes it even a step further, inviting dozens of the world’s top minds to work alongside its scientists and engineers for a sabbatical year.

Microsoft and Google are, of course, firms with enormous resources. However, just about any business can, for example, support the work of a young graduate student or postdoc at a local university. For even a senior researcher to collaborate with your staff is rarely prohibitively expensive. Researchers care far more about genuine support of their work than the size of your investment.

3. Leverage Domain-Specific Consortia

By the mid-1980s, the American semiconductor industry seemed like it was doomed. To respond to what it saw as a national security threat, the American government created SEMATECH in 1986. It was a consortium of government agencies, research institutions and private firms focused on making the industry more competitive. By the mid-1990s, the US was once again dominating semiconductors.

Any significantly complex technology takes years—and often decades—to develop before it becomes mature enough to engineer into a marketable product. So there is great potential in collaborating, even with competitive firms, in the pre-competitive phase to figure out the basic principles of a nascent technology.

For example, Boeing and Airbus are arch-rivals in aviation, much like DowDuPont and BASF are in chemicals. Yet all of these companies, along with many others, collaborate at places like the Composites Institute (IACMI). They do this not out of any altruism, of course, but self-interest, because it is at places like the Composites Institute that they can collaborate with academic scientists, National Labs and startups working in the space.

As technology becomes more complex, domain specific consortia are becoming essential to any ecosystem strategy. The Composites Institute is just one node in the network of Manufacturing Institutes set up under the Obama Administration to support this type of collaboration. In areas ranging from advanced fabrics and biofabrication to additive manufacturing and wide-gap semiconductors, firms large and small are working with scientists to uncover new principles.

And the Manufacturing Institutes are just the start. The Internet of Things Consortium is helping bring computation to the physical world, while the Partnership on AI focuses on artificial intelligence and the Joint Center for Energy Storage Research is helping to develop advanced battery technology. All are open to the largest multinationals and the smallest startups.

4. Move From Hierarchies to Networks

Back in the 90s, when Microsoft still dominated the tech world, markets were still based on linear value chains dominated by one or two industry giants. Yet as I explain in Cascades, we are quickly moving from a world of hierarchies, to one dominated by networks and ecosystems. That changes how we need to develop and grow.

In a hierarchy-driven world, the optimal strategy was to build walls and moats to protect yourself against would-be invaders, which is why Microsoft fought tooth and nail to protect its operating system monopoly. Today, however, industry lines have blurred and technology moves too fast to be able to build effective barriers against disruption.

That’s why today “Microsoft loves Linux”, why it developed an academic program to collaborate with scientists at universities and why it often partners with startups instead of always trying to crush them. The technology being developed today is simply too complex for anyone to go it alone, which is why the only viable strategy is to actively connect to ecosystems of talent, technology and information.

Power today no longer sits at the top of hierarchies, but emanates from the center of ecosystems and you move to the center by widening and deepening connections. Closing yourself by erecting barriers will not protect you. In fact, it is an almost sure-fire way to hasten your demise.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Why Change Must Be Built on Common Ground


GUEST POST from Greg Satell

When Steve Jobs returned to Apple in 1997, one of the first things he did was develop a marketing campaign to rebrand the ailing enterprise. Leveraging IBM’s long running “Think” campaign, Apple urged its customers to “Think Different.” The TV spots began, “Here’s to the crazy ones, the misfits, the rebels, the troublemakers, the round pegs in the square holes…”

Yet Jobs’ actual product strategy did exactly the opposite. While other technology companies jammed as many features into their products as they could to impress the techies and the digerati, Jobs focused on making his products so ridiculously easy to use that they were accessible to everyone. Apple became the brand people would buy for their mothers.

The truth is that while people like the idea of being different, real change is always built on common ground. Differentiation builds devotion among adherents, but to bring new people in, you need to make an idea accessible and that means focusing on values that you share with outsiders, rather than those that stir the passions of insiders. That’s how you win.

Overcoming the Desire to Be Different

Apple’s ad campaign was effective because we are tribal in nature. Setting your idea apart is a great way to unlock tribal fervor among devotees, but it also sends a strong signal to others that they don’t belong. For example, for decades LGBTQ activists celebrated their difference with “Gay Pride,” which made gay people feel better, but didn’t resonate with others.

It’s not much different in the corporate world. Those who want to promote Agile development love to tout the Agile Manifesto and its customer focused ethos. It’s what they love about the Agile methodology. Yet for those outside the Agile community, it can seem more than a bit weird. They don’t want to join a cult, they just want to get their job done.

So, the first step to driving change forward is to make the shift from differentiating values, which make ardent fans passionate about an idea, to shared values, which invite people in. That doesn’t mean you’re abandoning your core values any more than making products accessible meant that Apple had to skimp on capability. But it does create an entry point.

This is a surprisingly hard shift to make, but you won’t be able to move forward until you do.

Identifying and Leveraging Your Opposition

Make no mistake. Change fails because people want it to fail. Any change that is important, that has the potential for real impact, will inspire fierce resistance. Some people will simply hate the idea and will try to undermine your efforts in ways that are dishonest, deceptive and underhanded. That is the chief design constraint of any significant change effort.

So, you’re going to want to identify your most active opposition because you want to know where the attacks are going to be coming from. However, you don’t want to directly engage with these people because it is unlikely to be an honest conversation. Most likely, it will devolve into something that just bogs you down and drains you emotionally.

However, you can listen. People who hate your idea are, in large part, trying to persuade many of the same people you are. Listening to which arguments they find effective can help you identify shared values, and that’s what holds the key to truly transformational change.

So, while your main focus should be on empowering those who are excited about change, you should pay attention to your most vocal opposition. In fact, with some effort, you can learn to love your haters. They can point out early flaws. Also, as you begin to gain traction, they will often lash out, overreach, undermine themselves and end up sending people your way.

Defining Shared Values

Your most active opposition, the people who hate your idea and want to undermine it, have essentially the same task that you do. They want to move people who are passive or neutral to support their position and will design their communication efforts to achieve that objective. If you listen carefully though, you can make their efforts work for you.

For example, when faced with President Woodrow Wilson’s opposition to voting rights for women, Alice Paul’s band of Silent Sentinels picketed the White House with phrases lifted from President Wilson’s own book. How could he object, without appearing to be a tremendous hypocrite, to signs that read, “LIBERTY IS A FUNDAMENTAL DEMAND OF THE HUMAN SPIRIT”?

In a similar vein, those who opposed LGBTQ rights often did so on the basis of family values and it was, for decades, a very effective strategy. That is, until LGBTQ activists used it against them. After all, shouldn’t those of different sexual orientations be able to live in committed relationships and raise happy and healthy families? If you believe in the importance of families, how could you not support same-sex marriage?

The strategy works just as well in a corporate environment. In our Transformation & Change workshops, we ask executives what those who oppose their idea say about it. From there, we can usually identify the underlying shared value and then leverage it to make our case. Once you identify common ground, it’s much easier to move forward.

Surviving Victory

Steve Jobs, along with his co-founder Steve Wozniak, started Apple to make computers. But if that’s all Apple ever did, it would never have become the world’s most valuable company. What made Jobs the iconic figure he became had nothing to do with any one product, but because he came to represent something more: the fusion of technology and design.

In his biography of Steve Jobs, Walter Isaacson noted that he revolutionized six industries, ranging from music to animated movies, far afield from the computer industry. He was able to do that because he continued to focus on the core values of using technology and design to make products more accessible to ordinary people.

In other words, in every venture he undertook he looked for common ground by asking himself, “How can we make this as easy as possible for those who are not comfortable with technology?” He didn’t merely cater to the differences of his hard-core enthusiasts, but constantly looked to bring everybody else in.

Many companies have had hit products, but very few have had the continued success of Apple. In fact, success often breeds failure because it attracts new networks of competitors. Put another way, many entrepreneurs fail to survive victory because they focus on a particular product rather than the shared values that product was based on.

Jobs was different. He was passionate about his products, but his true calling was tapping into basic human desires. In other words, he understood that truly revolutionary change is always built on common ground.

— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash


We Were Wrong About What Drove the 21st Century


GUEST POST from Greg Satell

Every era contains a prism of multitudes. World War I gave way to the “Roaring 20s” and a 50-year boom in productivity. The Treaty of Versailles sowed the seeds of the Second World War, which in turn gave way to the peace and prosperity of the post-war era. Vietnam and the rise of the Baby Boomers unlocked a cultural revolution that created new freedoms for women and people of color.

Our current era began in the 1980s, with the rise of Ronald Reagan and a new confidence in the power of markets. Genuine achievements of the Chicago School of economics, led by Milton Friedman, along with the weakness of the Soviet system, led to an enthusiasm for market fundamentalism that dominated policy circles.

So it shouldn’t be that surprising that veteran Republican strategist Stuart Stevens wrote a book denouncing that orthodoxy as a lie. He has a point. But politicians can only convince us of things we already want to believe. The truth is that we were fundamentally mistaken in our understanding of how the world works. It’s time that we own up to it.

Mistake #1: The End Of The Cold War Would Strengthen Capitalism

When the Berlin Wall came down in 1989, the West was triumphant. Communism was shown to be a corrupt system bereft of any real legitimacy. A new ideology took hold, often called the Washington Consensus, that preached fiscal discipline, free trade, privatization and deregulation. The world was going to be remade in capitalism’s image.

Yet for anybody who was paying attention, communism had been shown to be bankrupt and illegitimate since the 1930s when Stalin’s failed collectivization effort and industrial plan led him to starve his own people. Economists have estimated that, by the 1970s, Soviet productivity growth had gone negative, meaning more investment actually brought less output. The system’s collapse was just a matter of time.

At the same time, there were early signs that there were serious problems with the Washington Consensus. Many complained that bureaucrats at the World Bank and the IMF were mandating policies for developing nations that citizens in their own countries would not accept. So called “austerity programs” led to human costs that were both significant and real. In a sense, the error of the Soviets was being repeated—ideology was put before people.

Today, instead of a capitalist utopia and an era of peace and prosperity, we got a global rise in authoritarian populism, stagnant wages, reduced productivity growth and weaker competitive markets. In particular in the United States, by almost every metric imaginable, capitalism has been weakened.

Mistake #2: Digital Technology Would Make Everything Better

In 1989, the same year that the Berlin Wall fell, Tim Berners-Lee proposed the World Wide Web and ushered in a new technological era of networked computing that we now know as the “digital revolution.” Much like the ideology of market fundamentalism that took hold around the same time, technology was seen as the determinant of a new, brighter age.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet by 2004, productivity growth had slowed again to its earlier lethargic pace. Today, despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

Digital technology was supposed to empower individuals and reduce the dominance of institutions, but just the opposite has happened. Income inequality in advanced economies markedly increased. In America wages have stagnated and social mobility has declined. At the same time, social media has been destroying our mental health.

When Silicon Valley told us they intended to “change the world,” is this what they meant?

Mistake #3: Medical Breakthroughs Would Automatically Make Us Healthier

Much like the fall of the Berlin Wall and the rise of the Internet, the completion of the Human Genome Project in 2003 promised great things. No longer would we be at the mercy of terrible diseases such as cancer and Alzheimer’s, but would design genetic therapies that would rewire our bodies to fend off disease by themselves.

The advances since then have been breathtaking. The Cancer Genome Atlas, which began in 2005, helped enable doctors to develop therapies targeted at specific mutations, rather than where in the body a tumor happened to be found. Later, CRISPR revolutionized synthetic biology, bringing down costs exponentially.

The rapid development of Covid-19 vaccines has shown how effective these new technologies are. Scientists essentially engineered vaccines that deliver just enough of the virus’s genetic code for our cells to produce a few of its proteins, enough to provoke an immune response but not nearly enough to make us sick. Twenty years ago, this would have been considered science fiction. Today, it’s a reality.

Yet we are not healthier. Worldwide obesity has tripled since 1975 and has become an epidemic in the United States. Anxiety and depression have risen sharply as well. American healthcare costs continue to rise even as life expectancy declines. Despite the incredible advances in our medical capability, we seem to be less healthy and more miserable.

Worse Than A Crime, It Was A Blunder

Whenever I bring up these points among technology people, they vigorously push back. Surely, they say, you can see the positive effects all around you. Can you imagine what the global pandemic would be like without digital technologies? Without videoconferencing? Hasn’t there been a significant global decline in extreme poverty and violence?

Yes. There have absolutely been real achievements. As someone who spent roughly half my adult life in Eastern Bloc countries, I can attest to how horrible the Soviet system was. Digital technology has certainly made our lives more convenient and, as noted above, medical advances have been very real and very significant.

However, technology is a process that involves both revealing and building. Yes, we revealed the power of market forces and the bankruptcy of the Soviet system, but failed to build a more prosperous and healthy society. In much the same way, we revealed the power of the microchip, miracle cures and many other things, but failed to put them to use in such a way that would make us measurably better off.

When faced with a failure this colossal, people often look for a villain. They want to blame the greed of corporations, the arrogance of Silicon Valley entrepreneurs or the incompetence of government bureaucrats. The truth is, as the old saying goes, it was worse than a crime, it was a blunder. We simply believed that market forces and technological advancement would work their magic and all would be well in hand.

By now we should know better. We need to hold ourselves accountable, make better choices and seek out greater truths.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Not Everyone Can Transform Themselves


Here’s What Makes the Difference

GUEST POST from Greg Satell

The conservative columnist John Podhoretz recently took to the New York Post to denounce the plotline of Disney’s new miniseries The Falcon and the Winter Soldier. In particular, he took umbrage at a subplot that invoked the Tuskegee experiments and other historical warts in a manner that he termed “didactic anti-Americanism.”

His point struck a chord with me because, in my many years living overseas, I always found that people in other countries were more than aware of America’s failures such as slavery, Jim Crow, foreign policy misadventures and so on. What they admire is our ability to take a hard look at ourselves and change course.

It also reminded me of something I’ve noticed in my work helping organizations transform themselves. Some are willing to take a hard look at themselves and make tough changes, while others are addicted to happy talk and try to wish problems away. Make no mistake. You can’t tackle the future without looking with clear eyes at how the present came into being.

A Pregnant Postcard

The genesis of shareholder capitalism and our modern outlook on how things are supposed to work can, in some sense, be traced back to Paris in 1900. It was there and then that an obscure graduate student named Louis Bachelier presented his thesis on speculation to a panel of judges including the great Henri Poincaré. It described the fluctuation of market prices as a random walk, a revolutionary, albeit unappreciated, idea at the time.

Unfortunately for Bachelier, his paper went mostly unnoticed and he vanished into obscurity. Then, in 1954, he was rediscovered by a statistician named Jimmie Savage, who sent a postcard to his friend, the eminent economist Paul Samuelson, asking “ever hear of this guy?” Samuelson hadn’t, but was intrigued.

In particular, Bachelier’s assertion that “the mathematical expectation of the speculator is zero,” was intriguing because it implied that market prices were essentially governed by bell curves that are, in many respects, predictable. If it were true, then markets could be tamed through statistical modeling and the economy could be managed much more effectively.

Samuelson, who was pioneering the field of mathematical finance at the time, thought the paper was brilliant and began to actively promote it. Later, Eugene Fama would build Bachelier’s initial work into a full-blown Efficient Market Hypothesis. It would unleash a flurry of new research into financial modeling and more than a few Nobel Prizes.

A Refusal to Reckon

By the 1960s, the revolution in mathematical finance began to gain steam. Much as had happened in physics earlier in the century, a constellation of new discoveries, such as efficient portfolios, the capital asset pricing model (CAPM) and, later, the Black-Scholes model for options pricing, created a “standard model” for thinking about economics and finance.

As things gathered steam, Samuelson’s colleague at MIT, Paul Cootner, compiled the most promising papers in a 500-page tome, The Random Character of Stock Market Prices, which became an instant classic. The book would serve as a basic reference for the new industries of financial engineering and risk management that were just beginning to emerge at the time.

However, early signs of trouble were being ignored. Included in Cootner’s book was a paper by Benoit Mandelbrot warning that something was seriously amiss. He showed, with very clear reasoning and analysis, that actual market data displayed far more volatility than was being predicted. In essence, he was pointing out that Samuelson and his friends were vastly underestimating risk in the financial system.

In a response, Cootner wrote that Mandelbrot forced economists “to face up in a substantive way to those uncomfortable empirical observations that there is little doubt most of us have had to sweep under the carpet until now.” He then added, “but surely before consigning centuries of work to the ash pile, we should like to have some assurance that all of our work is truly useless.”

Think about that for a second. Another term for “empirical observations” is “facts in evidence,” and Cootner was admitting that these were being ignored! The train was leaving the station and everybody had to either get on or get left behind.
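To make Mandelbrot’s point concrete, here is a minimal simulation sketch (my illustration, not anything from the article; the Student-t distribution and its parameters are assumptions chosen purely for demonstration). It compares how often extreme “4-sigma” daily moves show up in a fat-tailed sample of returns versus in a bell-curve model calibrated to the same overall volatility.

```python
# A minimal sketch illustrating Mandelbrot's objection: if real returns are
# fat-tailed, a bell-curve model fitted to the same volatility will badly
# underestimate how often extreme moves occur.
import numpy as np

rng = np.random.default_rng(42)
n_days = 250 * 40  # roughly 40 years of trading days

# "Actual" returns, proxied here by a heavy-tailed Student-t distribution
# (df=3 is an assumption, chosen only to make the tails visibly heavy).
fat_tailed = rng.standard_t(df=3, size=n_days) * 0.01

# A Gaussian model calibrated to the same observed volatility.
sigma = fat_tailed.std()
gaussian = rng.normal(0.0, sigma, size=n_days)

threshold = 4 * sigma  # a "4-sigma" daily move
print("4-sigma days, fat-tailed sample:", int(np.sum(np.abs(fat_tailed) > threshold)))
print("4-sigma days, Gaussian model:  ", int(np.sum(np.abs(gaussian) > threshold)))
```

Run over roughly 10,000 simulated trading days, the bell-curve model produces such moves almost never, while the fat-tailed sample produces them routinely, which is exactly the kind of gap between theory and “uncomfortable empirical observations” that Mandelbrot was pointing to.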

The Road to Shareholder Value

As financial engineering transformed Wall Street from a clubby, quiet industry to one in which dashing swashbucklers in power ties and red suspenders became “barbarians at the gate,” pressure began to build on managers. The new risk management products lowered the perceived cost of money and ushered in a new era of leveraged buyouts.

A new breed of “corporate raiders” could now get control of companies with very little capital and demand that performance—and “performance” meant stock performance—improve. They believed that society’s interest was best determined by market forces and unabashedly pursued investment returns above all else. As Wall Street anti-hero Gordon Gekko put it, the overall sentiment was that “greed is good.”

Managers were put on notice and a flood of new theories from business school professors and management consultants poured in. Harvard’s Michael Porter explained how actively managing value chains could lead to sustainable competitive advantage. New quantitative methods, such as six sigma, promised to transform management into, essentially, an engineering problem.

Today, the results are in and they are abysmal. In 2008 a systemic underestimation of risk—of exactly the type Mandelbrot warned us of—caused a financial meltdown. We are now in the midst of a second productivity paradox in which technological advance does little to improve our well-being. Income inequality, racial strife and mental health problems are at historic levels.

Since 1970, we have undergone three revolutions—financial, managerial and digital—and we are somehow worse off. It’s time to admit that we had the wrong theory of the case and chart a new course. Anything else is living in denial.

A Different Future Demands You Reject the Past

Underlying Mr. Podhoretz’s column is a sense of aggrievement that practically drips from each sentence. It’s hard to see the system in which you have succeeded as anything other than legitimate without tarnishing your own achievements. While he is clearly annoyed by what he sees as “didactic,” he seems unwilling to entertain the possibility that a large portion of the country desperately wants to come to terms with our history.

We often see the same thing with senior executives in our transformation work. Yet to chart a new path we must reject the past. As Thomas Kuhn pointed out in his classic, The Structure of Scientific Revolutions, every model is flawed. Some can be useful for decades or even centuries, but eventually circumstances change and they become untenable. After a period of tumult, they collapse and a new paradigm emerges.

What Podhoretz misses about both The Falcon and The Winter Soldier is that they were able to make common cause around the values that they shared, not the history that divided them, and partner on a shared mission. That’s what separates those who are able to transform themselves from those who are not. You need to take a hard look and achieve a level of honesty and integrity with yourself before you can inspire trust in others.

In order to improve, we must first look with clear eyes at what needs to be corrected. To paraphrase President Kennedy, we don’t do these things because they are easy, but because they are worthwhile.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Outsmarting Those Who Want to Kill Change


GUEST POST from Greg Satell

Look at anyone who has truly changed the world and they encountered significant resistance. In fact, while researching my book Cascades, I found that every major change effort, whether it was a political revolution, a social movement or an organizational transformation, had people who worked to undermine it in ways that were dishonest, underhanded and deceptive.

Unfortunately, we often don’t realize that there is an opposition campaign underway until it’s too late. People rarely voice open hostility to change. Opponents might even profess some excitement at our idea conceptually, but once there is a possibility of real action moving forward, they dig in their heels.

None of this means that change can’t happen. What it does mean is that, if you expect to bring about meaningful change, planning to overcome resistance has to be a primary design constraint and an organizing principle. Once you understand that, you can begin to move forward, identify shared values, design effective tactics and, ultimately, create lasting change.

Start With a Local Majority

Consider a famous set of conformity studies performed by the psychologist Solomon Asch in the 1950s. The design was simple, but ingenious. He merely showed people pairs of cards, asking them to match the length of a single line on one card with one of three on an adjacent card. The correct answer was meant to be obvious.

However, as the experimenter went around the room, one person after another gave the same wrong answer. When it reached the final person in the group (in truth, the only real subject, the rest were confederates), the vast majority of the time that person conformed to the majority opinion, even if it was obviously wrong!

Majorities don’t just rule, they also influence, especially local majorities. The effect is even more powerful when the issue at hand is not as clear-cut as the length of a line on a card. Also, more recent research suggests that the effect applies not only to people we know well, but that we are also influenced even by second and third-degree relationships.

The key point here is that we get to choose who we expose an idea to. If you start with five people in a room, for example, you only need three advocates to start with a majority. That may not seem consequential, but consider that the movement that overthrew Serbian dictator Slobodan Milošević started with five kids in a cafe, and you can see how even the most inauspicious beginnings can lead to revolutionary outcomes.

You can always expand a majority out, but once you’re in the minority you are likely to get immediate pushback and will have to retrench. That’s why the best place to start is with those who are already enthusiastic about your idea. Then you can empower them to be successful and bring in others who can bring in others still.

Listen to Your Opposition, But Don’t Engage Them

People who are passionate about change often see themselves as evangelists. Much like Saint Paul in the bible, they thrive on winning converts and seek out those who most adamantly oppose their idea in an attempt to change their minds. This is almost always a mistake. Directly engaging with staunch opposition is unlikely to achieve anything other than exhausting and frustrating you.

However, while you shouldn’t directly engage your fiercest critics, you obviously can’t act like they don’t exist. On the contrary, you need to pay close attention to them. In fact, by listening to people who hate your idea, you can identify flaws early, which gives you the opportunity to fix them before they can be used against you in any serious way.

One of the most challenging things about managing a change effort is balancing the need to focus on a small circle of dedicated enthusiasts while still keeping your eyes and ears open. Once you become too insular, you will quickly find yourself out of touch. It’s not enough to sing to the choir; you also need to get out of the church and mix with the heathens.

Perhaps the most important reason to listen to your critics is that they will help you identify shared values. After all, they are trying to convince the same people in the middle that you are. Very often you’ll find that, by deconstructing their arguments, you can use their objections to help you make your case.

Shift From Differentiating Values to Shared Values

Many revolutionaries, corporate and otherwise, are frustrated marketers. They want to differentiate themselves in the marketplace of ideas through catchy slogans that “cut through.” It is by emphasizing difference that they seek to gin up enthusiasm among their most loyal supporters.

That was certainly true of LGBTQ activists, who marched through city streets shouting slogans like “We’re here, we’re queer and we’d like to say hello.” They led a different lifestyle and wanted to demand that their dignity be recognized. More recently, Black Lives Matter activists made calls to “defund the police,” which many found to be shocking and anarchistic.

Corporate change agents tend to fall into a similar trap. They rant on about “radical” innovation and “disruption,” ignoring the fact that few like to be radicalized or disrupted. Proponents of agile development methods often tout their manifesto, oblivious to the reality that many outside the agile community find the whole thing a bit weird and unsettling.

While emphasizing difference may excite people who are already on board, it is through shared values that you bring people in. So it shouldn’t be a surprise that the fight for LGBTQ rights began to gain traction when activists started focusing on family values. Innovation doesn’t succeed because it’s “radical,” but when it solves a meaningful problem. The value of Agile methods isn’t a manifesto, but the fact that they can improve performance.

Create and Build On Success

Starting with a small group of enthusiastic apostles may seem insignificant. In fact, look at almost any popular approach to change management and the first thing on the to-do list is “create a sense of urgency around change” or “create an awareness of the need for change.” But if that really worked, the vast majority of organizational transformations wouldn’t fail, and we know that they do.

Once you accept that resistance to change needs to be your primary design constraint, it becomes clear that starting out with a massive communication campaign will only serve to alert your opponents that they had better start undermining you quickly, before you actually succeed in bringing change about.

That’s why we always advise organizations to focus on a small but meaningful keystone change that can demonstrate success. For example, one initiative at Procter & Gamble started out with just three mid-level executives focused on improving one process. That kicked off a movement that grew to over 2,500 employees in 18 months. Every successful large enterprise transformation we looked at had a similar pattern.

That, in truth, is the best way to outsmart the opponents of change. Find a way to make it successful, no matter how small that initial victory may be, then empower others to succeed as well. It’s easy to argue against an idea; you merely need to smother it in its cradle. Yet a concept that’s been proven to work and has inspired people to believe in it is an idea whose time has come.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay



Preparing Your Organization for Transformation in a Post-COVID World

GUEST POST from Greg Satell

The Covid-19 pandemic demanded we transform across multiple planes. Businesses had to abruptly shift to empower remote work. Professionals were suddenly trading commutes and in-person meetings for home schooling and “Zoom fatigue.” Leaders needed to reimagine every system, from storefronts to supply chains to educational institutions.

It was a brutal awakening, but we can now see the light at the end of the tunnel. In fact, a recent McKinsey Global Survey found that 73% of executives believe that conditions will be moderately or substantially better in the next year. Globally, the World Bank predicts 4% growth in 2021, a marked improvement over 2020’s 4.3% drop.

Still, while the crisis may be ending, the need for fundamental change has not. Today leaders must reinvent their organizations on multiple fronts, including technological, environmental, social and skills-based transformations. These pose challenges for any organization and research suggests that traditional approaches are unlikely to succeed. Here’s what will:

Empowering Small Groups

In 1998 five friends met in a cafe in Belgrade and formed a revolutionary movement. Two years later the brutal Serbian dictator, Slobodan Milošević, was overthrown. In 2007, a lean manufacturing initiative at Wyeth Pharmaceuticals began with a single team in one plant. In 18 months it spread to more than 17,000 employees across 25 sites worldwide and resulted in a more than 25% reduction in costs across the company.

More recently, in 2017, three mid-level employees at Procter & Gamble decided to take it upon themselves, with no budget and no significant executive sponsorship, to transform a single process. It took them months, but they were able to streamline it from a matter of weeks to mere hours. Today, their PxG initiative for process improvement has become a movement for reinvention that encompasses thousands of their colleagues worldwide.

Traditionally, managers launching a new initiative have aimed to start with a bang. They work to gain approval for a sizable budget as a sign of institutional commitment, recruit high-profile executives, arrange a big “kick-off” meeting and look to move fast, gain scale and generate some quick wins. All of this is designed to create a sense of urgency and inevitability.

Yet that approach can backfire. Many change leaders who start with a “shock and awe” approach find that, while they have rallied some to their cause, they have also inspired an insurgency that bogs things down. For any significant change, there will always be some who will oppose the idea and they will resist it in ways that are often insidious and not immediately obvious.

The dangers of resistance are especially acute when, as is often the case today, you need to drive transformation on multiple fronts. That’s why it’s best to start with small groups of enthusiasts that you can empower to succeed, rather than try to push an initiative on the masses that you’ll struggle to convince.

Weaving A Network Of Loose Connections

The sociologist Mark Granovetter envisioned collective action as being governed by a distribution of resistance thresholds. For any idea or initiative, some people will be naturally enthusiastic and have minimal or no resistance, some will be somewhat skeptical and others will be dead set against it.
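
To make the idea concrete, here is a minimal sketch in Python of a Granovetter-style threshold cascade. It is purely illustrative; the population, the threshold numbers and the function name are hypothetical rather than drawn from Granovetter’s data or from the cases discussed here. It simply shows the basic dynamic: seed the effort with enough low-resistance enthusiasts and the skeptical middle tips one layer at a time, while a lone advocate stalls and the hardest opponents never join at all.

def simulate_cascade(thresholds, initial_adopters):
    # Each person joins once the share of the population already on board
    # meets or exceeds their personal resistance threshold.
    adopted = set(initial_adopters)
    changed = True
    while changed:
        changed = False
        for person, threshold in enumerate(thresholds):
            if person not in adopted and len(adopted) / len(thresholds) >= threshold:
                adopted.add(person)
                changed = True
    return len(adopted)

# A hypothetical population of 100: five enthusiasts with very low resistance,
# a skeptical middle whose thresholds rise gradually from 5% to 60%, and ten
# opponents whose thresholds can never be met.
population = [0.03] * 5 + [0.05 + i * (0.55 / 84) for i in range(85)] + [1.1] * 10

print(simulate_cascade(population, initial_adopters=range(5)))  # 90: the persuadable middle tips
print(simulate_cascade(population, initial_adopters=[0]))       # 1: a lone advocate stalls

The point of the toy model is simply that early adopters lower the bar for the next group, which is why a small, committed core matters more than the size of the launch.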

It’s not hard to see why focusing initial efforts on small groups with low resistance thresholds can be effective. In the examples above, the Serbian activists, the lean manufacturing pilot team at Wyeth and the three mid-level executives at Procter & Gamble were all highly motivated and willing to put in the hard work to overcome initial challenges and setbacks.

To scale, however, transformation efforts must be able to connect to those who have at least some level of reluctance. One highly effective strategy to scale change is to create “cooptable” resources in the form of workshops, training materials and other assets. For example, to scale a cloud transformation initiative at Experian, change leaders set up an “API Center of Excellence” to make it as easy as possible for product managers to try cloud-based offerings.

Another helpful practice is to update stakeholders about recent events and share best practices. In One Mission, Chris Fussell describes in detail the O&I forum he and General Stanley McChrystal used in Iraq. The Serbian activists held regular “network meetings” that served a similar purpose. More recently, Yammer groups, Zoom calls and other digital media have proven effective in this regard.

What’s most important is that people are allowed to take ownership of a change initiative and be able to define it for themselves, rather than being bribed or coerced with incentive schemes or mandates. You can’t force authentic change. Unless people see genuine value in it, it will never gain any real traction.

Instill Shared Values And Shared Purpose

One of the biggest misconceptions about transformation efforts is that success begets more success. In practice, the opposite is often true. An initial success, especially a visible one, is likely to be met with a groundswell of opposition. We’ve seen this writ large in political revolutions, where initial victories in places like Egypt, the Maldives and Burma were followed by reversals, but it is no less common in a corporate or organizational context.

In fact, we are often called into an engagement 6-12 months after an initiative starts because change leaders are bewildered that their efforts, which seemed so successful at first, have suddenly and mysteriously run aground. In actuality, it was those initial victories that activated latent opposition, because they made what had seemed like an unlikely change a real possibility.

The truth is that lasting change can never be built on any particular technology, program or policy, but rather on shared values and a shared sense of mission. The Serbian activists focused not on any particular ideology, but on patriotism. At Wyeth, the change leaders made sure to champion tangible results rather than any specific technique. The leaders of the PxG initiative at Procter & Gamble highlighted the effect clunky and inefficient processes have on morale.

Irving Wladawsky-Berger, who was one of Lou Gerstner’s key lieutenants in IBM’s historic turnaround in the 90s, made a similar point to me. “Because the transformation was about values first and technology second, we were able to continue to embrace those values as the technology and marketplace continued to evolve,” he said.

Redefining Agility

In Built to Last, management guru Jim Collins suggested that leaders should develop a “big hairy audacious goal” (BHAG) to serve as a unifying vision for their enterprise. He pointed to examples such as Boeing’s development of the 707 commercial jet liner and Jack Welch’s vision that every GE business should be #1 or #2 in its category as inspiring “moonshots.”

Yet the truth is that we no longer have the luxury of focusing transformation in a single direction, but must bring about change along multiple axes simultaneously. Leaders today can’t choose between leveraging cutting-edge technologies and becoming more sustainable, nor can they choose between a highly skilled workforce and one that is diverse and inclusive.

The kind of sustained, multifaceted change we need today cannot be mandated from a mountaintop but must be inspired to take root throughout an enterprise. We need to learn how to empower small, loosely connected groups with a shared sense of mission and purpose. For change to truly take hold, people need to embrace it, and they do that for their own reasons, not for ours.

That’s what will be key to making the transformations ahead successful. The answer doesn’t lie in any specific strategy or initiative, but in how people are able to internalize the need for change and transfer ideas through social bonds. A leader’s role is no longer to plan and direct action, but to inspire and empower belief.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay



Digital Era Replaced by an Age of Molecular Innovation

GUEST POST from Greg Satell

It’s become strangely fashionable for digerati to mourn the death of innovation. “There’s nothing new” has become a common refrain, for which they blame venture capitalists, entrepreneurs and other digerati they consider to be less enlightened than themselves. They yearn for a lost age when things were better and more innovative.

What they fail to recognize is that the digital era is ending. After more than 50 years of exponential growth, the technology has matured and advancement has naturally slowed. While it is true that there are worrying signs that things in Silicon Valley have gone seriously awry and those excesses need to be curtailed, there’s more to the story.

The fact is that we’re on the brink of a new era of innovation and, while digital technology will be an enabling factor, it will no longer be center stage. The future will not be written in the digital language of ones and zeroes, but in that of atoms, molecules, genes and proteins. We do not lack potential or possibility; what we need is more imagination and wonder.

The End Of Moore’s Law

In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper which predicted that computing power would double about every two years. This idea, known as Moore’s Law, has driven the digital revolution for a half century. It’s what has empowered us to shrink computers from huge machines to tiny but powerful devices we carry in our pockets.

Yet there are limits for everything. The simple truth is that atoms are only so small and the speed of light is only so fast. That puts a limit on how many transistors we can cram onto a silicon wafer and how fast electrons can zip around the logic gates we set up for them. At this point, Moore’s Law is effectively over.

That doesn’t mean that advancement will stop altogether. There are other ways to speed up computing. The problem is that they all come with tradeoffs. New architectures such as quantum and neuromorphic computing, for instance, require new programming languages, new logical approaches and algorithmic strategies very different from those we’re used to.

So for the next decade or two we’re likely to see a heterogeneous computing environment emerge, in which we combine different architectures for different tasks. For example, we will be augmenting traditional AI systems with techniques like quantum machine learning. It is not only possible, but fairly likely, that these types of combinations will result in an exponential increase in capability.

A Biological Revolution

Moore’s Law has essentially become shorthand for exponential improvement in any field. Anytime we see a continuous doubling of efficiency, we call it “the Moore’s Law of X.” Yet since the Human Genome Project was completed in 2003, advancement in genetic sequencing has far outpaced what has happened in the digital arena.

Possibly an even bigger development occurred in 2012, when Jennifer Doudna and her colleagues discovered how CRISPR could revolutionize gene editing. Suddenly, work that would have taken genetic engineers weeks could be done in hours, at a fraction of the cost and with much greater accuracy, and the new era of synthetic biology had begun.

The most obvious consequence of this new era is the Covid-19 vaccine, which was designed in a matter of mere days rather than the years such work has traditionally taken. The mRNA technology used to create two of the vaccines also holds promise for cancer treatment, and CRISPR-based approaches have been applied to cure sickle cell disease and other conditions.

Yet as impressive as the medical achievements are, they make up only a fraction of the innovation that synthetic biology is making possible. Scientists are working on programming microorganisms to create new carbon-neutral biofuels and biodegradable plastics. The field may very well revolutionize agriculture and help feed the world.

The truth is that the biological revolution is roughly where computers were in the 1970s or 80s, and we are just beginning to understand its potential. We can expect progress to accelerate for decades to come.

The Infinite World Of Atoms

Anyone who has regularly read the business press over the past 20 years or so would naturally conclude that we live in a digital economy. Certainly, tech firms dominate any list of the world’s most valuable companies. Yet take a closer look and you will find that information and communication as a sector makes up only about 6% of GDP in advanced economies.

The truth is that we still live very much in a world of atoms and we spend most of our money on what we eat, wear, ride and live in. Any real improvement in our well-being depends on our ability to shape atoms to our liking. As noted above, reprogramming genetic material in cells to make things for us is one way we can do that, but not the only one.

In fact, there is a revolution in materials science underway. Much like in genomics, scientists are learning how to use computers to understand materials on a fundamental level and figure out how to design them a lot better. In some cases, researchers are able to discover new materials hundreds of times more efficiently than before.

Unlike digital or biological technologies, this is largely a quiet revolution with very little publicity. Make no mistake, however: our newfound ability to create advanced materials will transform what we can build, from vastly more efficient solar panels to lighter, stronger and more environmentally friendly building materials.

The Next Big Thing Always Starts Out Looking Like Nothing At All

The origins of digital computing can be traced back at least a century, to the rise and fall of logical positivism, Turing’s “machine,” the invention of the transistor, the integrated circuit and the emergence of the first modern PC at Xerox PARC in the early 1970s. Yet there wasn’t a measurable impact from computing until the mid-1990s.

We tend to assume that we’ll notice when something important is afoot, but that’s rarely the case. The truth is that the next big thing always starts out looking like nothing at all. It doesn’t appear fully formed, but usually incubates for years, and often decades, while scientists quietly work in labs and specialists debate at obscure conferences.

So, yes, after 50 years the digital revolution has run out of steam, but that shouldn’t blind us to the incredible opportunities before us. After all, a year ago very few people had heard of mRNA vaccines, but that didn’t make them any less powerful or important. There is no shortage of nascent technologies that can have just as big an impact.

The simple fact is that innovation is not, and never has been, about what kind of apps show up on our smartphone screens. The value of a technology is not measured in how a Silicon Valley CEO can dazzle an audience on stage, but in our capacity to solve meaningful problems. And as long as there are meaningful problems to solve, innovation will live on.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay
