Author Archives: Greg Satell

About Greg Satell

Greg Satell is a popular speaker and consultant. His latest book, Cascades: How to Create a Movement That Drives Transformational Change, is available now. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.

How to Free Ourselves of Conspiracy Theories

GUEST POST from Greg Satell

If you think about it, postal carriers should be a little bit creepy. If someone told you that an agent of the federal government would come to your house every day with access to information about the places you shop, the businesses you transact with and the people you know well enough to trade holiday cards with, it might cause you some alarm.

Yet we don’t find postal carriers creepy. In fact, despite vigorous efforts to malign the Postal Service, we trust it far more than most institutions. The truth is that we don’t conjure up conspiracy theories to explain the everyday and mundane, but some far-off yonder that we cannot clearly designate, yet find threatening nonetheless.

The function conspiracy theories play is to explain things that we don’t understand and that feel out of our control. So it shouldn’t be surprising that the age of Covid has spawned a myriad of crazy, dangerous notions. What we need to come to terms with is that the real problem plaguing society is a basic lack of trust, and that is where the battle for truth must be fought.

The Visceral Abstract

One of the frustrating things about modern life is that we experience so little of it directly. As Leonard Read pointed out in his 1958 essay, I, Pencil, the manufacture of even the simplest modern object is beyond the reach of any single person. Today, people depend on technologies to get through their day, but have only the barest notion of how they function.

The truth is that we live in a world of the visceral abstract, where strange theories govern our everyday lives. People may not care much, or even believe in, Einstein’s theory of special relativity, but if GPS satellites aren’t calibrated to take it into account, the delivery man won’t be able to bring their dinner. In much the same way, the Coronavirus will mutate, and the most infectious variant will dominate, no matter what you think of Darwin’s theory.

As Francis Fukuyama explains in his recent book, Identity, the pace of change and disruption in modern society demands that we make choices about who we are. Faced with so much we don’t understand there is no small amount of appeal to rejecting the unknown in favor of simpler explanations in the form of conspiracy theories.

Populists often say that they want to “take our country back,” but what they really mean is that they want to take our existence back. They want to banish the fabulous yonder for something closer and more tangible. They offer safe harbor and, for people who feel stranded on the rocks, with the sea crashing over them, the attraction can be undeniable.

Conforming To Local Majorities

We all have a certain capacity to believe in an idea or to partake in an action. We may be highly skeptical or wildly enthusiastic, depending on our innate preferences and previous experiences, but history shows that individuals, and indeed entire societies, are vulnerable to suggestion.

We are, for example, highly affected by what those around us think. In fact, a series of famous experiments first performed in the 1950s, and confirmed many times since, showed that we will conform to the opinions of those around us even when they are obviously wrong. More recent research has found that the effect extends to three degrees of social distance.

The effect is then multiplied by our tendency to be tribal, even when the source of division is arbitrary. For example, in a study where young children were randomly assigned to a red or a blue group, they preferred pictures of other kids wearing t-shirts in their own group’s color. In another study, in which adults were randomly assigned to “leopards” and “tigers,” fMRI scans showed hostility to outgroup members regardless of race.

So it isn’t surprising that people will be more willing to believe, say, a conspiracy theory floated by a high school friend than information from a government agency or recognized news source. If the majority of people around you believe something, you’re likely to believe it too, because that’s what’s close and tangible.

During the pandemic, with everybody stuck inside, the effect of local majorities, especially in isolated online communities, is significantly more powerful than usual. These communities may in fact be geographically distant, but in mental and social space they make up a large part of our immediate environment.

The Psychology Of Delusion

Once we are exposed to an idea and influenced by those around us to be sympathetic to it, two cognitive biases begin to kick in. The first, called availability bias, is our tendency to overweight information that is most available to us. For example, reading or hearing about traffic fatalities on the news will do little to affect our driving habits, but when we pass a bad accident on the road, we’ll naturally slow down and become more cautious.

It’s amazing how powerful availability bias can be. Researchers have found that it even affects how investors react to analysts’ reports, how corporations invest in research and how jurors evaluate witness testimony. Other studies find that availability bias affects medical judgments. Even in matters of great import, we tend not to look very far for information.

Again, it’s easy to see how the pandemic combined with the Internet can make us more susceptible. Stuck at home, we spend more time engaging with communities online, where we tend to be surrounded by likeminded people. Their opinion will seem more real to us than those of “experts” from outside our community, whether that community is virtual or not.

This effect is then combined with confirmation bias, our tendency to seek out information that supports our prior beliefs and reject contrary evidence. Those who fall prey to conspiracy theories often report spending a lot of time searching the Internet and watching YouTube videos, which confirm and extend their discussions with “fellow travelers.”

Rebuilding Trust

Once we become aware of where conspiracy theories come from, it becomes easier to understand why we tend to be far more suspicious of, say, public officials or medical experts than our postal carriers. We tend to trust those we see as being part of our communities and are suspicious of those we see as outsiders.

Unfortunately, the stresses on our society will only intensify over the next decade as we undergo major shifts in technology, resources, migration and demography. These changes will inevitably hit some segments of society harder than others and, it’s safe to assume, those left behind will likely feel that society has forsaken them.

We need to learn how to rebuild trust, even with our enemies and the best—perhaps the only way—to do that is by focusing on shared values. We might, for example, disagree on exactly how our criminal justice system should function, but we can all agree that everyone has the right to live in a safe community. We may not agree on the specifics of a “Green New Deal,” but can all see the importance of investing in our rural communities and small towns.

Most of all, we need to rebuild a sense of connection. Fortunately, network science tells us that it takes relatively few connections to drastically reduce social distance. Trust is personal, not political. It can’t be legislated or mandated but arises out of shared experience that contributes to the collective well-being. Like our mail carriers, our institutions must be seen to be competently serving us and having our best interests at heart.

In the final analysis, our problem is not one of information, but that of basic good will. The antidote is not stronger arguments, but more capable public service.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.

Four Paradigm Shifts Defining Our Next Decade

GUEST POST from Greg Satell

The statistician George Box pointed out that “all models are wrong, but some are useful.” He meant that we create models as simplified representations of reality. They are merely tools and should never be mistaken for reality itself. Unfortunately, that’s much easier to say than it is to practice.

All too often, models take on the illusion of reality. We are trained, first at school and then on the job, to use models to make decisions. Most of the time the models are close enough to reality that we don’t really notice the discrepancy. Other times we notice that the model is off, but we dismiss it as an unusual case or an anomaly.

Yet the real world is always changing, so models tend to get more wrong, and hence less useful, over time. Eventually, the once-useful models become misleading and we undergo a paradigm shift. Today, as we experience a period of enormous change, we need to unlearn old models and replace them with new ones. They too will be wrong, but hopefully useful.

1. From Value Chains to Ecosystems

The dominant view of strategy in the 20th century was based on Michael Porter’s ideas about competitive advantage. In essence, he argued that the key to long-term success was to dominate the value chain by maximizing bargaining power among suppliers, customers, new market entrants and substitute goods.

Yet markets today are much faster, more interconnected and more complex than they were when Porter formulated his ideas about competitive advantage. In a fast-moving information economy, firms increasingly depend on ecosystems to compete. That drastically changes the game.

Ecosystems are nonlinear and complex. Power emanates from the center instead of at the top of a value chain. You move to the center by connecting out. In a networked-driven world you need to continually widen and deepen links to other stakeholders within the ecosystem. That’s how you gain access to resources like talent, technology and information.

Consider the mobility revolution that is disrupting the auto industry. In an earlier age, the auto giants would have sought to use their market clout to dominate nascent players in an attempt to preserve their position. Now, however, they are creating partnerships with tech companies, startups and others in order to innovate more effectively in the space.

Even more impressive has been the global effort to fight the Covid crisis, in which unprecedented collaboration between governments, large pharmaceutical companies, innovative startups and academic scientists developed a life-saving vaccine in record time. Similar, albeit fledgling, efforts have been going on for years.

2. From Maximizing Bargaining Power to Building Resilience and Trust

Porter’s ideas dominated thinking in corporate strategy for decades, yet they had a fatal flaw that wasn’t always obvious. Thinking in terms of value chains is viable when technology is relatively static, but when the marketplace is rapidly evolving it can get you locked out of important ecosystems and greatly diminish your ability to compete.

A report from Accenture Strategy analyzing over 7,000 firms found that trust itself is increasingly becoming a competitive advantage. When evaluating competitive agility, the report found that trust “disproportionately impacts revenue and EBITDA.” The truth is that to compete effectively you need to build deep bonds of trust throughout a complex ecosystem of stakeholders.

If you are always looking to maximize your bargaining power, you are likely to cut yourself off from important information and capabilities that you will need to effectively compete. That’s one reason that the Business Roundtable, an influential group of almost 200 CEOs of America’s largest companies, issued a statement that discarded the old notion that the purpose of a business is solely to create shareholder value in favor of a broader stakeholder approach.

It is through forging bonds of trust that a business can build resiliency. If a company is seen as trustworthy, then it can draw on the goodwill of customers, employees, partners and communities to help it overcome a disruptive event. If, on the other hand, it is seen as greedy and predatory, everything becomes much harder. We need to learn how to rebuild trust.

3. From Vertical Agility to Horizontal Agility

For the past 50 years, innovation has largely been driven by our ability to cram more transistors onto a silicon wafer. That’s what’s allowed us to double the power of our technology every 18 months or so and led to the continuous flow of new products and services streaming out of innovative organizations.

Perhaps not surprisingly, over the past few decades agility has become a defining competitive attribute. Because the fundamentals of digital technology have been so well understood, much of the value has shifted away from fundamental technologies to applications and to things like design and user experience. Yet that will change in the years ahead.

Over the past few decades, agility has largely meant moving faster and faster down a predetermined path. Over the next few decades, however, agility will take on a new meaning: the ability to explore multiple domains at once and combine them into something that produces value. We’ll need to learn how to go slower to deliver much larger impacts.

Over the next few decades we will struggle to adapt to a post-digital age and we will need to rethink old notions about agility. To win in this new era of innovation we will have to do far more than just move fast and break things.

4. From Bits to Atoms

In The Rise and Fall of American Growth, economist Robert Gordon argues that the rapid productivity growth the US experienced from 1920 to 1970 is largely a thing of the past. While there may be short spurts of growth, like the one in the late 1990s, we’re not likely to see a sustained period of progress anytime soon.

Among the reasons he gives is that, while earlier innovations such as electricity and the internal combustion engine had broad implications, the impact of digital technology has been amazingly narrow. The evidence bears this out. We see, to paraphrase Robert Solow, digital technology just about everywhere except in the productivity statistics.

Still, there are indications that the future will look very different from the past. Digital technology is beginning to power new areas in the physical world, such as synthetic biology and materials science, that are already having a profound impact on high-potential fields like medical research, renewable energy and manufacturing.

It is all too easy to get caught up in old paradigms. When progress is powered by chip performance and the increased capabilities of computer software, we tend to judge the future by those same standards. What we often miss is that paradigms shift and the challenges—and opportunities—of the future are likely to be vastly different.

In an age of disruption, the only viable strategy is to adapt.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Real Change Requires a Majority

GUEST POST from Greg Satell

“Don’t worry about people stealing your ideas,” said the computing pioneer Howard Aiken. “If your ideas are any good, you’ll have to ram them down people’s throats,” and truer words were scarcely ever spoken. We tend to think that if an idea has merit, everybody will immediately recognize its value, but that’s almost never true.

Ignaz Semmelweis, quite famously, advocated for hand washing at hospitals, but was ostracized, not celebrated, for it and would himself die of an infection contracted under care before his idea caught on. William Coley discovered cancer immunotherapy over a century ago, but was thought by many to be some sort of a quack.

Good ideas fail all the time. Part of the problem is that people who believe passionately in an idea feel compelled to win over the skeptics. That’s almost always a mistake. The truth is that the difference between success and failure often has nothing to do with the inherent value of an idea, but with where you choose to start. And the best place to start is with a majority.

The Fundamental Fallacy of Change Management

Pundits tell us that change is inevitable, so we need to create a sense of urgency about it. They say we must “innovate or die,” because those who don’t “get it” are dinosaurs and, much like their reptilian brethren, they are bound to die an awful, painful death once the asteroid hits (and, the implication is, they will deserve it too).

History, however, shows us exactly the opposite. People like Ignaz Semmelweis and William Coley had truly groundbreaking ideas that could have saved millions of lives had they been adopted earlier. Nevertheless, those in the medical establishment who thwarted their efforts thrived, while the innovators themselves suffered greatly, both professionally and personally.

It’s not just the medical profession either. Take a short tour through history and it becomes clear that unjust and incompetent regimes can have remarkable staying power. The status quo always has inertia on its side and rarely yields its power gracefully. A bad idea can last for decades, or even centuries.

The fundamental fallacy of change management is that it is essentially a communication exercise, that change fails because people don’t understand it well enough and if you explain it to them in sufficiently powerful terms, they will embrace it. The truth is that change fails because others oppose it in ways that are devious, underhanded and deceptive.

That needs to be your primary design constraint.

The Power of Local Majorities

Merely telling someone about change, no matter how artfully, is unlikely to be effective, but that doesn’t mean that people are immune to persuasion. In fact, there are decades of studies that show that people naturally conform to ideas that are widely held by others around them.

Consider the famous series of conformity experiments conducted by Solomon Asch in the 1950s. The design of the study was simple but ingenious. Asch merely showed a group of people pairs of cards like these:

[Image: Asch conformity experiment card pairs]

Each person in the group was asked to match the line on the left with the line of the same length on the right. However, there was a catch: almost everyone in the room was a confederate who gave the wrong answer. When it came to the real subject’s turn to answer, most conformed to the majority opinion, even when it was obviously wrong.

Clearly, most ideas are not nearly that unambiguous, which is why, despite having made breakthrough discoveries, Semmelweis and Coley had so much trouble getting traction for them. The majority of the medical establishment was resistant and Semmelweis and Coley found themselves in the minority. Majorities routinely push back against minorities.

The Threshold Model of Collective Action

One important aspect of Asch’s conformity studies was that the results were far from uniform. A quarter of the subjects never conformed, some always did, and others were somewhere in the middle. We all have different thresholds to adopt an idea or to partake in an action, based on factors like confidence in our ability to make judgments and expected punishments or rewards for getting it right or wrong.

The sociologist Mark Granovetter addressed this issue with his threshold model of collective behavior. As a thought experiment, he asks us to imagine a diverse group of people milling around in a square. Some are natural deviants, always ready to start trouble, most are susceptible to provocation in varying degrees and the remainder is made up of unusually solid citizens, almost never engaging in antisocial behavior.

[Image: Threshold model of collective behavior]

You can see a graphic representation of how the model plays out above. In the example on the left, a miscreant throws a rock and breaks a window. That’s all it takes for his friend next to him to join in, and then others with slightly higher thresholds follow as well. Before you know it, a full-scale riot ensues.

The example on the right is slightly different. After the first few troublemakers start, there is no one around with a low enough threshold to join in. Rather than the contagion spreading, it fizzles out, the three miscreants are isolated and little note is made of the incident. Although the groups are outwardly similar, a slight change in conformity thresholds can make a big difference.
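The two examples above can be sketched in a few lines of code. This is a minimal illustration of the threshold dynamic, not anything from the article itself, and the threshold values are made-up numbers chosen to reproduce the two scenarios:

```python
# A minimal sketch of Granovetter's threshold model: each person has a
# threshold, the number of others they must see participating before
# they will join in themselves.
def cascade(thresholds):
    """Run rounds until no new participants join; return the final count."""
    active = 0
    while True:
        joined = sum(1 for t in thresholds if t <= active)
        if joined == active:  # nobody new joined this round; stop
            return active
        active = joined

# Left-hand example: an unbroken ladder of thresholds, so each joiner
# triggers the next and a full-scale "riot" ensues.
print(cascade([0, 1, 2, 3, 4, 5]))  # 6: everyone participates

# Right-hand example: the same crowd, but with a gap in the middle.
# The first three start, yet no one's threshold is low enough to follow.
print(cascade([0, 1, 2, 5, 5, 5]))  # 3: the cascade fizzles out
```

Note that the two crowds are outwardly similar; only a small shift in a few individual thresholds separates a riot from a fizzle, which is exactly the point of the model.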

It’s a relatively simplistic example, but through another concept Granovetter developed, called the strength of weak ties, we can see how it can lead to large-scale change in the final graphic below, as an idea moves from group to group.

[Image: From thresholds to cascades]

The top cluster is identical to the one in the first example and a local majority forms. However, no cluster is an island because people tend to belong to multiple groups. For example, we form relationships with people in our neighborhood, from work, religious communities and so on. So an idea that saturates one group soon spreads to others.

Notice how the exposure to multiple groups can help overcome higher thresholds of resistance, because of the influence emanating from other groups through weak links. When you start with a majority, even if it is a small, local majority, an idea can gain traction, move from cluster to cluster and almost infinitely scale.

As I explain in my book, Cascades, there is significant evidence that this is how ideas actually do spread in the real world. The crucial point here is that it makes a really big difference where you choose to start. If you start with people who are enthusiastic about your idea, you are much more likely to succeed than if you choose people who are resistant.

So rather than trying to convince everybody at once, you are much better off identifying people who are likeminded and working on a Keystone Change that can form the basis of a larger transformation.

Working to Attract, Rather Than Overpower

When we look at the stories of Semmelweis and Coley through the prism of local majorities and resistance thresholds, we can see the mistake that they made. Having made truly breakthrough discoveries, they naturally assumed that others would see value in them. Instead, they ran headlong into a highly resistant majority and got squashed.

In my work helping leaders drive organizational transformations, I see this happen all the time. People who believe passionately in an idea naturally assume that others will “see the light.” Not surprisingly, they want to move quickly and overpower any resistance. This is especially true if they feel that they have institutional power behind them.

Yet that is almost always a mistake. There is a reason why the vast majority of organizational transformations fail, even though they typically have big budgets and C-Suite support behind them. To drive meaningful, lasting change you can’t rely on overpowering resistance, but must work to attract and empower genuine support.

That means you need to start with a majority. In the beginning, that may mean starting with a small, local majority, say, three people in a room of five. You can always expand a majority out, but once you find yourself in the minority, you will immediately feel pushback. The secret to overcoming resistance to an idea and driving it forward is understanding that you get to choose where to start.

Revolutionary change always starts with the art of choosing wisely.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay, Greg Satell

Stop Fooling Yourself

GUEST POST from Greg Satell

Early in my career I was working on a natural gas trading desk and found myself in Tulsa, Oklahoma, visiting clients. These were genuine roughnecks who had worked their way up from the fields to become physical gas traders. When the NYMEX introduced “paper” contracts and derivatives into the market, however, much would change.

They related to me how, when New York traders first came to town offering long-term deals, they were thrilled. For the first part of the contract, they were raking in money. Unfortunately, during the latter months, they got crushed, losing all their profits and then some. The truth was that the trade was pure arbitrage and they never had a chance.

My clients’ brains were working against them in two ways. The first, availability bias, caused them to value the information most familiar to them and dismiss other data. The second, confirmation bias, made them look for information that would confirm their instincts. This, of course, isn’t at all unusual. It takes real effort to avoid believing the things we think.

Becoming a Square-Peg Business in a Round-Hole World

When I was researching my book, Mapping Innovation, I spoke to every great innovator I could find. Some were world class scientists, others were top executives at major corporations and still others were incredibly successful entrepreneurs. Each one shared with me how they were able to achieve incredible things.

What I found most interesting was that the story was different every time. For every innovator who told me that a particular approach was the secret to their success, I found someone else who was equally successful but did things completely differently. The fact is that there is no one “true path” to innovation; everybody does it in different ways.

Yet few organizations acknowledge that in any kind of serious way. Rather, they have a “way we do things around here,” and there are often significant institutional penalties for anyone who wants to do things differently. Usually these penalties are informal and unspoken, but they are very real and can threaten to derail even the most promising career.

You can see how the same cognitive biases that lost my gas trader friends money are at work here. In a profitable company, the most available information suggests things are being done the “right” way, and everybody who wants to get ahead in the organization is heavily incentivized to embrace evidence that supports that notion and to disregard contrary data.

That’s how organizations get disrupted. They stick to what’s worked for them in the past and fail to notice that the nature of the problems they need to solve has fundamentally changed. They become better and better at things that people care about less and less. Before they realize what happened, they become square-peg businesses in a round-hole world.

Silicon Valley Jumps the Shark

Nobody can deny the incredible success that Silicon Valley has had over the past few decades. Still mostly a backwater in the 1970s and 80s, by the end of 2020 four out of the ten most valuable companies in the world came from the Bay Area (not including Microsoft and Amazon, which are based in Seattle). No other region has ever dominated so thoroughly.

Yet lately Silicon Valley’s model of venture-funded entrepreneurship seems to have jumped the shark. From massive fraud at Theranos and out-of-control founders at WeWork and Uber to, most recently, the incredible blow-up at Quibi, there is increasing evidence that the tech world’s “unicorn culture” is beginning to have a negative impact on the real economy.

One clue as to where things went wrong can be found in Eric Ries’s book, The Startup Way. Ries, whose earlier effort, The Lean Startup, was a runaway bestseller, was invited to implement his methods at General Electric and transform the 124-year-old company into a startup. Much like with the “unicorns,” it didn’t end well.

The fundamental fallacy of Silicon Valley is that a model that was developed for a relatively narrow set of businesses—essentially software and consumer electronics—could be applied to solve any problem. The truth is that, much like the industrial era before it, the digital era will soon end. We need to let go of old ways and set out in new directions.

Unfortunately, because of how brains are wired for availability bias and confirmation bias, that’s a whole lot easier said than done.

Breaking Out of the Container of Your Own Experience

In 1997, when I was still in my twenties, I took a job in Warsaw, Poland to work in the nascent media industry that was developing there. I had experience working in media in New York, so I was excited to share what I’d learned and was confident that my knowledge and expertise would be well received.

It wasn’t. Whenever I began to explain how a media business was supposed to work, people would ask me, “why?” That forced me to think about it and, when I did, I began to realize that many of the principles I had taken for granted were merely conventions. Things didn’t need to work that way and could be done differently.

I also began to realize that, working for a large corporation in the US, I had been trained to work within a system, to play a specific part in a greater whole. When a problem came up that was outside my purview, I went to someone down the hall who played another part. Yet in post-Communist Poland, there was no system and no one down the hall.

So I had to learn a new outlook and a new set of skills and I consider myself lucky to have had that experience. When you are forced to explore the unknown, you end up finding valuable things that you didn’t even know to look for and begin to realize that many perspectives can be brought to bear on similar problems with similar fact patterns.

Learning How to Not Fool Yourself

In one of my favorite essays, originally given as a speech, the great physicist Richard Feynman said, “The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that.” He goes on to say that simply being honest isn’t enough; you also need to “bend over backwards” to provide information so that others may prove you wrong.

So, the first step is to be hyper-vigilant and aware that your brain has a tendency to fool you. It will quickly grasp on the most readily available data and detect patterns that may or may not be there. Then it will seek out other evidence that confirms those initial hunches while disregarding contrary evidence.

Yet checking ourselves in this way isn’t nearly enough, we need to actively seek out and encourage dissent. Some of this can be done with formal processes such as pre-mortems and red teams, but a lot of it is cultural, hiring for diversity and running meetings in such a way that encourages discussion by, for instance, having the most senior leaders speak last.

Perhaps most of all, we need to have a sense of humility. It’s far too easy to be impressed with ourselves and far too difficult to see how we’re being led astray. There is often a negative correlation between our level of certainty and the likelihood of us being wrong. We all need to make an effort to believe less of what we think.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Five Immutable Laws of Change

GUEST POST from Greg Satell

When I first arrived in Poland in 1997, change was all around me. It was like watching a society transform itself through time-lapse photography. Everywhere you looked, the country was shaking off decades of post-communist rust and striving to make good on the promise of 1989’s historic Round Table Agreement.

Yet it wasn’t until the fall of 2004 that I truly understood the power of change. By then, I was living in Kyiv, Ukraine, and the entire country erupted in the protests now known as the Orange Revolution. While Warsaw in the 90s was like rebuilding after a tornado hit, Ukraine was like being in the eye of the storm itself.

That experience led to a 15-year journey of discovery and my book Cascades. What I found was that throughout history many have sought to create change and most have failed, but a few succeeded brilliantly. Starting out with very different challenges, philosophies and personalities, they eventually all arrived at the same principles that allowed them to prevail.

Law #1: The Status Quo Has Inertia On Its Side And Never Yields Gracefully

We tend to overvalue ideas. We think that if we have a good idea, people will immediately see its worth. Yet that’s hardly ever the case. As computer pioneer Howard Aiken put it, “Don’t worry about people stealing your ideas. If your ideas are any good, you’ll have to ram them down people’s throats.”

Consider the case of Ignaz Semmelweis, who first proposed that medical staff in hospitals should wash their hands before treating patients. You would think that would be an obviously good idea. Nevertheless, he was ostracized for it and ended up dying in an insane asylum, ironically from an infection he contracted while under care.

Semmelweis’s plight was tragic, but it is also so common that the tendency of the establishment to reject new ideas is referred to as the Semmelweis effect. In fact, while researching my book Mapping Innovation, I interviewed dozens of successful innovators and found that every single one had to overcome stiff resistance to transform their idea into something useful.

The fact that you will face opposition when protesting an authoritarian regime is obvious, but an organizational environment can be just as cutthroat. Make no mistake. If your idea is important and has real potential for impact, there will be some who will hate it and they will work to undermine it in ways that are dishonest, underhanded and deceptive.

That must be your primary design constraint.

Law #2: Small Groups, Loosely Connected, But United By Shared Purpose Drive Transformational Change

For decades, change consultants have been telling us that if we want to drive transformation, we should “start with a bang” and create a “sense of urgency” through a big communication campaign. The results have been atrocious. In fact, McKinsey has found that nearly three quarters of organizational transformations do not succeed.

It’s not hard to understand why. If there are people who are determined to see your change fail—and every significant change encounters resistance—then a “rally the troops” approach will only serve to alert those who oppose change that they had better get started undermining it or it might actually happen.

Fortunately, science points to another way. The truth is that small groups, loosely connected, but united by a shared purpose drive transformational change. So instead of trying to convince everybody at once, identify those who are already enthusiastic about your idea, who want it to work as much as you do. Those are people you can empower to succeed and can help bring in others, who can bring in others still.

Yet identifying advocates is only part of the battle. You also need to imbue the effort with purpose and give it meaning. Unfortunately, all too often the quest for purpose is treated as a communication exercise. It isn’t. For change to be meaningful, it has to actually solve a problem that people care about.

Law #3: Revolutions Begin With a Cause, Not A Slogan

Every change effort starts with a grievance. There’s something that people don’t like and they want it to be different. In a social or political movement that may be a corrupt leader or a glaring injustice. In an organizational context it’s usually something like falling sales, unhappy customers, low employee morale or technological disruption.

Whatever the case may be, the first step toward bringing change about is understanding that getting mired in grievance won’t get you anywhere. You can’t just complain about things you don’t like, but must come up with an affirmative vision for how you would want things to be.

The best place to start is by asking yourself, “If I had the power to change anything, what would it look like?” Martin Luther King Jr.’s vision for the civil rights movement was a Beloved Community. Bill Gates’s vision for Microsoft was “a computer on every desk and in every home.” A good vision should be aspirational, but not completely out of reach.

One of the things I found in my research is that successful change leaders don’t try to move from grievance to vision in one step. Instead, they bridge the gap by identifying a Keystone Change: one that focuses on a clear and tangible goal, includes multiple stakeholders and paves the way for future change.

For King, the Keystone Change was voting rights. For Gates it was an easy-to-use operating system. For your vision, it will undoubtedly be something different. The salient point here is that every successful transformation I found started out with a Keystone Change, so that’s where you will want to start as well.

Law #4: Design Tactics That Mobilize People to Influence Institutions

Organizational change consultants often recommend that changemakers prepare a stakeholder map. This isn’t necessarily a bad idea, but it is somewhat inadequate because it fails to distinguish between different kinds of stakeholders. Some stakeholders are targets for mobilization and others are targets for influence.

For example, both parents and school boards are important stakeholders in education, but for very different reasons. School boards wield institutional power that can effect change; parents do not. So we mobilize parents to influence school boards, not the other way around. We need to approach constituencies and institutions in very different ways.

One of the things we’ve consistently found in our work helping organizations to drive transformational change is that leaders construe stakeholders far too narrowly. Fortunately, decades of non-violent activism have given us powerful tools for both: the Spectrum of Allies for constituencies and the Pillars of Support for institutions.

A crucial point to remember is that you can’t dictate change by mandate. You can’t overpower but must instead attract people and empower them so that they can take ownership of the cause and make it their own. You need to accept that people will do things for their own reasons, not for yours.

Most of all, remember that every action has to have a clear purpose and be directed at influencing specific institutions. So before taking any action, ask two questions: Who are we mobilizing and to influence what?

Law #5: Every Revolution Inspires Its Own Counter-Revolution

In the aftermath of the Orange Revolution we thought we had won. After all, we had stood up to the injustice of a falsified election and prevailed. Unfortunately, it didn’t turn out that way. Five years later, Viktor Yanukovych, the same man whom we had taken to the streets to keep out of office, rose to power in an election that international observers deemed free and fair. His corrupt and incompetent rule would trigger a second Ukrainian revolution.

We find a similar pattern with many of the executives we work with. They work for months—and sometimes years—to get a project off the ground. Yet just when they think they’re turning the corner, when they’ve won executive sponsorship, signed up key partners and procured enough financing to have a realistic budget, all of a sudden things seem to get bogged down.

That’s no accident. Just because you’ve won a few early battles doesn’t mean opposition to your idea has melted away. On the contrary, faced with the fact that change may actually succeed, those who oppose it will likely redouble their efforts to undermine it. These efforts are often not overt, but they are there and can easily derail an initiative.

That’s why every change effort must learn how to survive victory. The truth is that change is always a journey, never a particular destination, which is why lasting change is always built on common ground. That doesn’t mean that you need to win over your fiercest critics, but it does mean you need to try to empathize with their perspective.

There is a reason why some change leaders succeed while others fail. At some point everybody needs to decide whether they would rather make a point or make a difference and, in the end, those who prevail choose the latter.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


The Era of Moving Fast and Breaking Things is Over

GUEST POST from Greg Satell

On July 16th, 1945, when the world’s first nuclear explosion shook the plains of New Mexico, the leader of the Manhattan Project, J. Robert Oppenheimer, quoted from the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.” Clearly, he was troubled by what he had unleashed, and for good reason. The world was never truly the same after that.

Today, however, we have lost much of that reverence for the power of technology. Instead of proceeding deliberately and with caution, tech entrepreneurs have prided themselves on their willingness to “move fast and break things” and, almost reflexively, casually deride anyone who questions the practice as someone who “doesn’t get it.”

It’s hard to see how, by any tangible metric, any of this has made us better off. We set out to disrupt industries, but disrupted people instead. It wasn’t always like this. Throughout our history we have asked hard questions and made good choices about technological progress. As we enter a new era of innovation, we desperately need to recapture some of that wisdom.

How We Put the Nuclear Genie Back in the Bottle

The story of nuclear weapons didn’t start with Oppenheimer, not by a long shot. In fact, if we were going to attribute the Manhattan Project to a single person, it would probably be a Hungarian immigrant physicist named Leo Szilard, who was one of the first to conceive of the possibility of a nuclear chain reaction.

In 1939, upon hearing of the discovery of nuclear fission in Germany, he, along with fellow Hungarian émigré Eugene Wigner, decided that the authorities needed to be warned. Szilard then composed a letter warning of the possibility of a nuclear bomb that was eventually signed by Albert Einstein and sent to President Roosevelt. That’s what led to the American development program.

Yet after the explosions at Hiroshima and Nagasaki, many of the scientists who worked to develop the bomb wanted to educate the public about its dangers. In 1955, the philosopher Bertrand Russell issued a manifesto signed by a number of scientific luminaries. Based on it, a series of conferences was convened at Pugwash, Nova Scotia to discuss approaches to protecting the world from weapons of mass destruction.

These efforts involved far more than talk: they helped shape the non-proliferation agenda and led to concrete achievements such as the Partial Test Ban Treaty. In fact, these contributions were so crucially important that the organizers of the Pugwash conferences were awarded the Nobel Peace Prize in 1995, and the conferences continue even today.

Putting Limits On What We Do With the Code of Life

While the nuclear age started with a bang, the genetic age began with a simple article in the scientific journal Nature, written by two relatively unknown scientists named James Watson and Francis Crick, that described the structure of DNA. It was one of those few watershed moments when an entirely new branch of science arose from a single event.

The field progressed quickly and, roughly 20 years later, a brilliant researcher named Paul Berg discovered that you could merge human DNA with that from other living things, creating new genetic material that didn’t exist in nature. Much like Oppenheimer, Berg understood that, due to his work, humanity stood on a precipice and it wasn’t quite clear where the edge was.

He organized a conference at Asilomar State Beach in California to establish guidelines. Importantly, participation wasn’t limited to scientists; a wide swath of stakeholders was invited, including public officials, members of the media and ethicists. The result, now known as the Berg Letter, called for a moratorium on the riskiest experiments until the dangers were better understood. These norms were respected for decades.

Today, we’re undergoing another revolution in genomics and synthetic biology. New technologies, such as CRISPR and mRNA techniques, have opened up incredible possibilities, but also serious dangers. Yet here again, pioneers in the field like Jennifer Doudna are taking the lead in devising sensible guardrails and using the technology responsibly.

The New Economy Meets the New Era of Innovation

When Netscape went public in 1995, it hit like a bombshell. It was the first big Internet stock and, although originally priced at $14 per share, it opened at double that amount and quickly zoomed to $75. By the end of the day, it had settled back at $58.25. Still, a tiny enterprise with no profits was almost instantly worth $2.9 billion.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet today, it’s clear that the “new economy” was a mirage. Despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

The digital revolution has been a real disappointment. In fact, when you look at outcomes, if anything we’re worse off. Rather than a democratized economy, market concentration has markedly increased in most industries. Income inequality in advanced economies has soared. In America wages have stagnated and social mobility has declined for decades. At the same time, social media has been destroying our mental health.

Now we’re entering a new era of innovation, in which we will unleash technologies much more powerful. New computing architectures like quantum and neuromorphic technologies will power things like synthetic biology and materials science to create things that would have seemed like science fiction a generation ago. We simply can no longer afford to be so reckless.

Shifting From Agility Toward Resilience

Moving fast and breaking things only seems like a good idea in a stable world. When you operate in a safe environment, it’s okay to take a little risk and see what happens. Clearly, we no longer live in such a world (if we ever did). Taking on more risk in financial markets led to the Great Recession. Being blasé about data security has nearly destroyed our democracy. Failure to prepare for a pandemic has nearly brought modern society to its knees.

Over the next decade, the dangers will only increase. We will undergo four major shifts in technology, resources, migration and demographics. To put that in perspective, a similar shift in demography was enough to make the 60s a tumultuous decade. We haven’t seen a confluence of so many disruptive forces since the 1920s and that didn’t end well.

Unfortunately, it’s far too easy to underinvest in mitigating a danger that may never come to fruition. Moving fast and breaking things can seem attractive because the costs are diffuse while the benefits are concentrated: even as it has made society as a whole worse off in so many ways, it has created a small cadre of fabulously wealthy plutocrats.

Yet history is not destiny. We have the power to shape our path by making better choices. We can abandon the cult of disruption and begin to invest in resilience. In fact, we have to. By this point there should be no doubt that the dangers are real. The only question is whether we will act now or simply wait for it to happen and accept the consequences.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


New Skills Needed for a New Era of Innovation

GUEST POST from Greg Satell

The late Clayton Christensen had a theory about “jobs to be done.” In his view, customers don’t buy products as much as they “hire” companies to do specific “jobs” for them. To be competitive, firms need to understand what that job is and how to do it well. In other words, no one wants a quarter-inch drill bit, they want a quarter-inch hole.

The same can be said for an entire society. We need certain jobs to be done and will pay handsomely for ones that we hold in high regard, even as we devalue others. Just as being the best blacksmith in town won’t earn you much of a living today, great coding skills would have done you little good in a medieval village.

This is especially important to keep in mind today as the digital revolution comes to an end and we enter a new era of innovation in which some tasks will be devalued and others will be increasingly in demand. Much like Christensen said about firms, we as a society need to learn to anticipate which skills will lose value in future years and which will be considered critical.

The Evolution of Economies

The first consumer product was most likely the Acheulean hand axe, invented by some enterprising stone age entrepreneur over 100,000 years ago. Evidence suggests that, for the most part, people made stone axes themselves, but as technology evolved, some began to specialize in different crafts, such as smithing, weaving, cobbling and so on.

Inventions like the steam engine, and then later electricity and the internal combustion engine, brought about the industrial revolution, which largely put craftsmen out of work and reshaped society around cities that could support factories. It also required new skills to organize work, leading to the profession of management and the knowledge economy.

The inventions of the microchip and the internet have led to an information economy in which even a teenager with a smartphone has better access to knowledge than a specialist working in a major institution a generation ago. Much like the industrial era automated physical tasks, the digital era has automated many cognitive tasks.

Now, as the digital era ends, we are entering a new era of innovation in which we will shift to post-digital computing architectures, such as quantum computing and neuromorphic chips, and enormous value will be created through bits powering atoms in fields like synthetic biology and materials science.

Innovation, Jobs and Wages

As economies evolved, some tasks became devalued as others increased in importance. When people could go to a smith for metal tools, they had no need to create stone axes. In much the same way, the industrial revolution put craft guilds out of business and technologies like tractors and combine harvesters drastically reduced the number of people working on farms.

Clearly replacing human labor with technology is disruptive, but it has historically led to dramatic increases in productivity. So labor displacement effects have been outweighed by greater wages and new jobs created by new industries. For the most part, innovation has made all of us better off, even, to a great extent, the workers who were displaced.

Consider the case of Henry Ford. Because technology replaced many tasks on the family farm, he didn’t need to work on it and found a job as an engineer for Thomas Edison, where he earned enough money and had enough leisure time to tinker with engines. That led him to create his own company, pioneer an industry and create good jobs for many others.

Unfortunately, there is increasing evidence that more recent innovations may not be producing comparable amounts of productivity and that’s causing problems. For example, when a company replaces a customer service agent with an automated system, it’s highly doubtful that the productivity gains will be enough to finance entire new industries that will train that call center employee to, say, design websites or run marketing campaigns.

Identifying New Jobs To Be Done

To understand the disconnect between technological innovation and productivity it’s helpful to look at some underlying economic data. In US manufacturing, for instance, productivity has skyrocketed, roughly doubling output in the 30 years between 1987 and 2017, even as employment in the sector decreased by roughly a third.

It is the increased productivity growth in manufacturing that has fueled employment growth in the service sector. However, productivity gains in service jobs have been relatively meager and automation through technological innovation has not resulted in higher wages, but greater income inequality as returns to capital dwarf returns to labor.

Further economic analysis shows that the divide isn’t so much between “white collar” and “blue collar” jobs, but between routine and non-routine tasks. So warehouse workers and retail clerks have suffered, but designers and wedding planners have fared much better. In other words, technological automation is creating major shifts in the “jobs to be done.”

A recent analysis by the McKinsey Global Institute bears this out. It identified 56 “foundational skills” that are crucial to the future of work. These don’t fall into traditional categories such as “engineering” or “sales,” but are rather things like self-awareness, emotional intelligence and critical thinking.

Collaboration Is The New Competitive Advantage

The industrial revolution drove a shift from animal power to machine power and from physical skills to cognitive skills. What we’re seeing now is a similar shift from cognitive skills to social skills: as automation takes over many routine cognitive tasks, the “job” that humans are increasingly valued for is relating to other humans.

There are some things a machine will never do. An algorithm will never strike out at a Little League game, see its child born or have a bad day at work. We can, of course, get computers to mimic these things by training them on data, but they will never actually have the experience, and that limits their ability to fully relate to human emotions.

To see how this is likely to play out, simply go and visit your local Apple Store. It is a highly automated operation, without traditional checkout aisles or cash registers. Still, the first thing that catches your eye is a sea of blue shirts waiting to help you. They are not there to execute transactions, which you can easily do online, but to engage with you, understand what you’re trying to achieve and help you get it done.

We’ve seen similar trends at work even in highly technical fields. A study of 19.9 million scientific papers found that not only has the percentage of papers published by teams steadily increased over the past 50 years, the size of those teams has also grown and their research is more highly cited. The journal Nature got similar results and also found that the work being done is far more interdisciplinary and done at greater distances.

What’s becoming clear is that collaboration is increasingly becoming a competitive advantage. The ultimate skill is no longer knowledge or proficiency in a particular domain, but the ability to build a shared purpose with others who possess a diverse set of skills and perspectives in order to solve complex problems. In other words, the most important jobs are the ones we do in the service of a common objective.

— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash


Parallels Between the 1920s and Today Are Frightening

GUEST POST from Greg Satell

It should be clear by now that we are entering a pivotal era. We are currently undergoing four profound shifts, in demographics, migration, resources and technology. The stress lines are already beginning to show, with increasing tensions over race and class as well as questions about the influence technology and institutions have over our lives.

The last time we faced anything like this kind of tumult was in the 1960s, which, much like today, saw the emergence of a new generation, the Baby Boomers, with very different values than their predecessors. Their activism achieved significant advances for women and minorities, but also, at times, led to tumult and riots.

Yet the changes we are undergoing today appear to be even more significant than those we underwent then. In fact, you would have to go back to the 1920s to find an era with as much potential for both prosperity and ruin. Unfortunately, that era led to economic upheaval, genocide and war on a scale never before seen in world history. We need to do better this time around.

Panics, Pandemics and War

A Wall Street crisis that threatened the greater economy and led to sweeping legislation reshaping government influence in the financial sector was prelude to both the 1920s and the 2020s. Both the Bankers’ Panic of 1907 and the Great Recession, which began in 2007, resulted in landmark legislation: the Federal Reserve Act and Dodd-Frank, respectively.

Continuing in the same vein of eerie parallel, the 1918 flu pandemic killed between 20 million and 50 million people and raged for more than two years, until 1920, when it was finally brought under control. Much like today, there were social distancing guidelines, significant economic impacts and long-term effects on educational attainment.

Perhaps not surprisingly, there was no small amount of controversy about measures taken to control the pandemic a century ago. People were frustrated with isolation (it goes without saying that there was no Netflix in 1918). Organizations like the Anti-Mask League of San Francisco rose up in defiance.

The years leading up to the 1920s were also war-torn, with World War I ravaging Europe and the colonial order increasingly coming under pressure. Much like the “War on Terrorism” today, the organized violence, combined with the panics and pandemics, made for an overall feeling that society was unraveling, and many began to look for a scapegoat.

Migration, Globalization and Nativism

In 1892, Ellis Island opened its doors and America became a beacon to those around the world looking for a better life. New immigrants poured in and, by 1910, almost 15% of the US population were immigrants. As the 1920s approached, the strains in society were becoming steadily more obvious and more visceral.

The differences among the newcomers aroused suspicion, perhaps best exemplified by the Sacco and Vanzetti trial, in which two apparently innocent immigrants were convicted and executed for murder. Many believed that the new arrivals brought disease, criminality and “un-American” political and religious beliefs, especially with regard to Bolshevism.

Fears began to manifest themselves in growing nativism, and there were increasing calls to limit immigration. The Immigration Act of 1917 specifically targeted Asians and established a literacy test for new arrivals. The Immigration Act of 1924 established quotas that favored Northern and Western Europeans over those from Southern and Eastern Europe, as well as Jews. The film The Birth of a Nation led to a resurgence of the Ku Klux Klan.

Scholars see many parallels between the run-up to the 1920s and today. Although nativism these days is primarily directed against Muslims and immigrants from South America, the same accusations of un-American political and religious beliefs, as well as outright criminality, are spurring on a resurgence of hate groups like the Proud Boys. Attorney General Merrick Garland has pledged to make prosecuting white supremacists a top priority.

A New Era of Innovation

As Robert Gordon explained in The Rise and Fall of American Growth, prosperity in the 20th century was largely driven by two technologies, electricity and the internal combustion engine. Neither path was linear or obvious. Both were first invented in the 1880s but didn’t really begin to scale until the 1920s.

That’s not uncommon. In fact, it takes decades for a new discovery to make a measurable impact on the world. That’s how long it takes to first identify a useful application for a technology, and then for ecosystems to form and secondary technologies to arise. Electricity and internal combustion would ignite a productivity boom that lasted 50 years, from roughly 1920 until 1970.

For example, as economist Paul David explained in a highly cited paper, it wasn’t through the light bulb but by allowing managers to rearrange work in factories that electricity first had a significant effect on society. Yet it was in the 1920s that things really began to take off. Refrigerated rail cars transformed diets, and labor-saving appliances such as the vacuum cleaner would eventually pave the way for women in the workforce. The first radio stations appeared, revolutionizing entertainment.

Today, although the digital revolution itself has largely been a disappointment, there’s considerable evidence that we may be entering a new era of innovation as the emphasis shifts from bits to atoms. New computing architectures, such as quantum and neuromorphic computing, as well as synthetic biology and materials science, may help to reshape the economy for decades to come.

A Return to Normalcy?

Not surprisingly, by 1920 the American people were exhausted. Technological change, cultural disruption brought about by decades of mass immigration, economic instability and war made people yearn for calmer, gentler times. Warren G. Harding’s presidential campaign touted “a return to normalcy” and people bought in.

Yet while the “Roaring Twenties” are remembered as a golden age, they sowed the seeds for what came later. Although the stock market boomed, lack of regulation led to the crash of 1929 and the Great Depression. The harsh reparations imposed by the Treaty of Versailles made the rise of Hitler possible.

The 1930s brought almost unimaginable horror. Economic hardship in Europe paved the way for fascism. Failed collectivization in the Soviet Union led to massive famine and, later, Stalin’s great purges. Rising nativism, in the US and around the world, led to diminished trade as well as violence against Jews and other minorities. World War II was almost inevitable.

It would be foolish beyond belief to deny the potential of history repeating itself. Still, the past is not necessarily prologue. The 1930s were not the inevitable result of impersonal historical forces, but of choices consciously made. We could have made different ones and received the bounty of the prosperity that followed World War II without the calamity that preceded it.

What we have to come to terms with is that technology won’t save us. Markets won’t save us. Our future will be the product of the choices we make. We should endeavor to choose wisely.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Four Innovation Ecosystem Building Blocks

GUEST POST from Greg Satell

It’s hard to find anyone who wouldn’t agree that Microsoft’s 2001 antitrust case was a disaster for the company. Not only did it lose the case, but it wasted time, money and—perhaps most importantly—focus on its existing businesses, which could have been far better deployed on new technologies like search and mobile.

Today, Microsoft is a much different organization. Rather than considering open source software a cancer, it now says it loves Linux. Its cloud business is growing like wildfire and it is partnering widely to develop new quantum computers. What was previously a rapacious monopolist is now an enthusiastic collaborator.

That’s no accident. Today, we need to compete in an ecosystem-driven world in which nobody, not even a firm as big and powerful as Microsoft, can go it alone. Power no longer comes from the top of value chains, but emanates from the center of networks. That means that strategy needs to shift from dominating industries to building collaborative ecosystems.

1. Connect to Startups

In its heyday, Microsoft enthusiastically followed Michael Porter’s five forces model. It saw threats coming not only from direct competitors, but also suppliers, customers, substitute products and new market entrants. Startups, in particular, were targeted for either acquisition or destruction if they were seen as posing a potential threat.

Today, however, Microsoft actively supports startups. Take, for example, its quantum development effort, in which it is partnering with more than a dozen entrepreneurial companies. These firms also get free access to Microsoft technologies, such as its Azure cloud platform and go-to-market resources and advice, through its Microsoft for Startups program.

Another approach that many firms take is a corporate VC program that actively invests in promising new companies. Unlike a typical investor, corporations bring a wealth of market and technical expertise and can help with things like distribution, supply chain management and marketing. Corporations, for their part, get far more insight into new technologies than they could as an operating company.

Scott Lenet, President of Touchdown Ventures, which operates venture funds for corporations, told me that, “Startups thrive on new ideas and big firms know how to scale and improve those ideas. We’ve seen some of our investments really blossom based on that kind of partnership.”

2. Form Ties to the Academic World

When Sun Microsystems co-founder Bill Joy said, “no matter who you are, most of the smartest people work for someone else,” he was explicitly referring to Bill Gates’s assertion that Microsoft was an “IQ monopolist.” Joy’s position was that “It’s better to create an ecology that gets all the world’s smartest people toiling in your garden for your goals. If you rely solely on your own employees, you’ll never solve all your customers’ needs.”

Make no mistake. Innovation is never a single event. It is a process of discovery, engineering and transformation and those three things almost never happen in the same place or at the same time. That’s why the most innovative companies work hard to build links to the best minds in the academic world.

Today Microsoft has an extensive academic program that extends grants to graduate students and faculty members who are pursuing research that is of interest to the company. Google takes it a step further, inviting dozens of the world’s top minds to work alongside its scientists and engineers for a sabbatical year.

Microsoft and Google are, of course, firms with enormous resources. However, just about any business can, for example, support the work of a young graduate student or postdoc at a local university. Even having a senior researcher collaborate with your staff is rarely prohibitively expensive. Researchers care far more about genuine support of their work than the size of your investment.

3. Leverage Domain-Specific Consortia

By the mid-1980s, the American semiconductor industry seemed like it was doomed. To respond to what it saw as a national security threat, the American government created SEMATECH in 1986. It was a consortium of government agencies, research institutions and private firms focused on making the industry more competitive. By the mid-1990s, the US was once again dominating semiconductors.

Any significantly complex technology takes years—and often decades—to develop before it becomes mature enough to engineer into a marketable product. So there is great potential in collaborating, even with competitive firms, in the pre-competitive phase to figure out the basic principles of a nascent technology.

For example, Boeing and Airbus are arch-rivals in aviation, much like DowDuPont and BASF are in chemicals. Yet all of these companies, along with many others, collaborate at places like the Composites Institute (IACMI). They do this not out of any altruism, of course, but self-interest, because it is at places like the Composites Institute that they can collaborate with academic scientists, National Labs and startups working in the space.

As technology becomes more complex, domain-specific consortia are becoming essential to any ecosystem strategy. The Composites Institute is just one node in the network of Manufacturing Institutes set up under the Obama Administration to support this type of collaboration. In areas ranging from advanced fabrics and biofabrication to additive manufacturing and wide-gap semiconductors, firms large and small are working with scientists to uncover new principles.

And the Manufacturing Institutes are just the start. The Internet of Things Consortium is helping bring computation to the physical world, while the Partnership on AI focuses on artificial intelligence and the Joint Center for Energy Storage Research is helping to develop advanced battery technology. All are open to the largest multinationals and the smallest startups.

4. Move From Hierarchies to Networks

Back in the 90s, when Microsoft still dominated the tech world, markets were still based on linear value chains dominated by one or two industry giants. Yet as I explain in Cascades, we are quickly moving from a world of hierarchies, to one dominated by networks and ecosystems. That changes how we need to develop and grow.

In a hierarchy-driven world, the optimal strategy was to build walls and moats to protect yourself against would-be invaders, which is why Microsoft fought tooth and nail to protect its operating system monopoly. Today, however, industry lines have blurred and technology moves too fast to be able to build effective barriers against disruption.

That’s why today “Microsoft loves Linux”, why it developed an academic program to collaborate with scientists at universities and why it often partners with startups instead of always trying to crush them. The technology being developed today is simply too complex for anyone to go it alone, which is why the only viable strategy is to actively connect to ecosystems of talent, technology and information.

Power today no longer sits at the top of hierarchies, but emanates from the center of ecosystems and you move to the center by widening and deepening connections. Closing yourself by erecting barriers will not protect you. In fact, it is an almost sure-fire way to hasten your demise.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Why Change Must Be Built on Common Ground

GUEST POST from Greg Satell

When Steve Jobs returned to Apple in 1997, one of the first things he did was develop a marketing campaign to rebrand the ailing enterprise. Leveraging IBM’s long running “Think” campaign, Apple urged its customers to “Think Different.” The TV spots began, “Here’s to the crazy ones, the misfits, the rebels, the troublemakers, the round pegs in the square holes…”

Yet Jobs’ actual product strategy did exactly the opposite. While other technology companies jammed as many features into their products as they could to impress the techies and the digerati, Jobs focused on making his products so ridiculously easy to use that they were accessible to everyone. Apple became the brand people would buy for their mothers.

The truth is that while people like the idea of being different, real change is always built on common ground. Differentiation builds devotion among adherents, but to bring new people in, you need to make an idea accessible and that means focusing on values that you share with outsiders, rather than those that stir the passions of insiders. That’s how you win.

Overcoming the Desire to Be Different

Apple’s ad campaign was effective because we are tribal in nature. Setting your idea apart is a great way to unlock tribal fervor among devotees, but it also sends a strong signal to others that they don’t belong. For example, for decades LGBTQ activists celebrated their difference with “Gay Pride,” which made gay people feel better, but didn’t resonate with others.

It’s not much different in the corporate world. Those who want to promote Agile development love to tout the Agile Manifesto and its customer focused ethos. It’s what they love about the Agile methodology. Yet for those outside the Agile community, it can seem more than a bit weird. They don’t want to join a cult, they just want to get their job done.

So, the first step to driving change forward is to make the shift from differentiating values, which make ardent fans passionate about an idea, to shared values, which invite people in. That doesn’t mean you’re abandoning your core values any more than making products accessible meant that Apple had to skimp on capability. But it does create an entry point.

This is a surprisingly hard shift to make, but you won’t be able to move forward until you do.

Identifying and Leveraging Your Opposition

Make no mistake. Change fails because people want it to fail. Any change that is important, that has the potential for real impact, will inspire fierce resistance. Some people will simply hate the idea and will try to undermine your efforts in ways that are dishonest, deceptive and underhanded. That is the chief design constraint of any significant change effort.

So, you’re going to want to identify your most active opposition because you want to know where the attacks are going to be coming from. However, you don’t want to directly engage with these people because it is unlikely to be an honest conversation. Most likely, it will devolve into something that just bogs you down and drains you emotionally.

However, you can listen. People who hate your idea are, in large part, trying to persuade the same people you are. Listening to which arguments they find effective can help you identify shared values, and that is what holds the key to truly transformational change.

So, while your main focus should be on empowering those who are excited about change, you should pay attention to your most vocal opposition. In fact, with some effort, you can learn to love your haters. They can point out early flaws. Also, as you begin to gain traction they will often lash out and overreach, undermine themselves and end up sending people your way.

Defining Shared Values

Your most active opposition, the people who hate your idea and want to undermine it, have essentially the same task that you do. They want to move people who are passive or neutral to support their position and will design their communication efforts to achieve that objective. If you listen carefully though, you can make their efforts work for you.

For example, when faced with President Woodrow Wilson’s opposition to voting rights for women, Alice Paul’s band of Silent Sentinels picketed the White House with phrases lifted from President Wilson’s own book. How could he object, without appearing to be a tremendous hypocrite, to signs that read, “LIBERTY IS A FUNDAMENTAL DEMAND OF THE HUMAN SPIRIT”?

In a similar vein, those who opposed LGBTQ rights often did so on the basis of family values and it was, for decades, a very effective strategy. That is, until LGBTQ activists used it against them. After all, shouldn’t those of different sexual orientations be able to live in committed relationships and raise happy and healthy families? If you believe in the importance of families, how could you not support same-sex marriage?

The strategy works just as well in a corporate environment. In our Transformation & Change workshops, we ask executives what those who oppose their idea say about it. From there, we can usually identify the underlying shared value and then leverage it to make our case. Once you identify common ground, it’s much easier to move forward.

Surviving Victory

Steve Jobs, along with his co-founder Steve Wozniak, started Apple to make computers. But if that’s all Apple ever did, it would never have become the world’s most valuable company. What made Jobs the iconic figure he became had nothing to do with any one product; it was that he came to represent something more: the fusion of technology and design.

In his biography of Steve Jobs, Walter Isaacson noted that he revolutionized six industries, ranging from music to animated movies, far afield from the computer industry. He was able to do that because he continued to focus on the core values of using technology and design to make products more accessible to ordinary people.

In other words, in every venture he undertook he looked for common ground by asking himself, “How can we make this as easy as possible for those who are not comfortable with technology?” He didn’t merely cater to the differences of his hard-core enthusiasts, but constantly looked to bring everybody else in.

Many companies have had hit products, but very few have had the continued success of Apple. In fact, success often breeds failure because it attracts new networks of competitors. Put another way, many entrepreneurs fail to survive victory because they focus on a particular product rather than the shared values that product was based on.

Jobs was different. He was passionate about his products, but his true calling was tapping into basic human desires. In other words, he understood that truly revolutionary change is always built on common ground.

— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash
