Author Archives: Greg Satell

About Greg Satell

Greg Satell is a popular speaker and consultant. His latest book, Cascades: How to Create a Movement That Drives Transformational Change, is available now. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.

Uber Economy is Killing Innovation, Prosperity and Entrepreneurship

GUEST POST from Greg Satell

Today, it seems that almost everyone wants to be the “Uber” of something, and why not? With very little capital investment, the company has completely disrupted the taxicab industry and attained a market value of over $100 billion. In an earlier era, it would have taken decades to create that kind of impact on a global scale.

Still, we’re not exactly talking about Henry Ford and his Model T here, or even the Boeing 707 or the IBM 360. Like Uber, those innovations quickly grew to dominance, but they also unleashed incredible productivity. Uber, on the other hand, gushed red ink for more than a decade despite $25 billion invested. In 2021 it lost more than $6 billion; the company made progress in 2022 but still lost money, and only in 2023 did it finally turn a profit.

The truth is that we have a major problem and, while Uber didn’t cause it, the company is emblematic of it. Put simply, a market economy runs on innovation. It is only through consistent gains in productivity that we can create real prosperity. The data and evidence strongly suggest that we have failed to do that for the past 50 years. We need to do better.

The Productivity Paradox Writ Large

The 20th century was, for the most part, an era of unprecedented prosperity. The emergence of electricity and internal combustion kicked off a 50-year productivity boom between 1920 and 1970. Yet after that, gains in productivity mysteriously disappeared even as business investment in computing technology increased, causing economist Robert Solow to observe that “You can see the computer age everywhere but in the productivity statistics.”

When the internet emerged in the mid-1990s, things improved and everybody assumed that the mystery of the productivity paradox had been resolved. However, after 2004 productivity growth disappeared once again. Today, despite the hype surrounding things such as Web 2.0, the mobile Internet and, most recently, artificial intelligence, productivity continues to slump.

Take a closer look at Uber and you can begin to see why. Compare the $25 billion invested in the ride-sharing company with the $5 billion (worth about $45 billion today) IBM invested to build its System 360 in the early 1960s. The System 360 was considered revolutionary, changed computing forever and dominated the industry for decades.

Uber, on the other hand, launched with no hardware or software that was particularly new or revolutionary. In fact, the company used fairly ordinary technology to disintermediate relatively low-paid taxi dispatchers. The money invested was largely used to fend off would-be competitors through promoting the service and discounting rides.

Maybe the “productivity paradox” isn’t so mysterious after all.

Two Paths To Profitability

Anybody who’s ever taken an Economics 101 course knows that, under conditions of perfect competition, the forces of supply and demand are supposed to drive markets toward equilibrium. It is at this magical point that prices are high enough to attract supply sufficient to satisfy demand, but not any higher.

Unfortunately for anyone running a business, that equilibrium point is the same point at which economic profit disappears. So to make a profit over the long term, managers need to alter market dynamics, either by limiting competition, often through strategies such as rent seeking and regulatory capture, or by creating new markets through innovation.

As should be clear by now, the digital revolution has been relatively ineffective at creating meaningful innovation. Economists Daron Acemoglu and Pascual Restrepo refer to technologies like Uber, as well as things like automated customer service, as “so-so technologies,” because they displace workers without significantly increasing productivity.

As Joseph Schumpeter pointed out long ago, market economies need innovation to fuel prosperity. Without meaningful innovation, managers are left with only strategies that limit competition, undermine markets and impoverish society, which is largely what seems to have happened over the past few decades.

The Silicon Valley Doomsday Machine

The arrogance of Silicon Valley entrepreneurs seems so outrageous, and so childishly naive, that it is scarcely believable. How could an industry that has produced so little in terms of productivity seem so sure that it has been “changing the world” for the better? And how has it made so much money?

The answer lies in something called increasing returns. As it turns out, under certain conditions, namely high up-front investment, negligible marginal costs, network effects and “winner-take-all markets,” the normal laws of economics can be somewhat suspended. In these conditions, it makes sense to pump as much money as possible into an early Amazon, Google or Facebook.

However, this seemingly happy story has a few important downsides. First, to a large extent these technologies do not create new markets as much as they disrupt or displace old ones, which is one reason why productivity gains are so meager. Second, the conditions apply to a small set of products, namely software and consumer gadgets, which makes the Silicon Valley model a bad fit for many groundbreaking technologies.

Still, if the perception is that you can make a business viable by pumping a lot of cash into it, you can actually crowd out a lot of good businesses with bad, albeit well-funded, ones. In fact, there is increasing evidence that this is exactly what is happening. Rather than an engine of prosperity, Silicon Valley is increasingly looking like a doomsday machine.

Returning To An Innovation Economy

Clearly, we cannot continue “Ubering” ourselves to death. We must return to an economy fueled by innovation, rather than disruption, which produces the kind of prosperity that lifts all boats, rather than outsized profits for a meager few. It is clearly in our power to do that, but we must begin to make better choices.

First, we need to recognize that innovation is something that people do, but instead of investing in human capital, we are actively undermining it. In the US, food insecurity has become an epidemic on college campuses. To make matters worse, the cost of college has created a student debt crisis, essentially condemning our best and brightest to decades of indentured servitude. To add insult to injury, healthcare costs continue to soar. Should we be at all surprised that entrepreneurship is in decline?

Second, we need to rebuild scientific capital. As Vannevar Bush once put it, “There must be a stream of new scientific knowledge to turn the wheels of private and public enterprise.” To take just one example, it is estimated that the $3.8 billion invested in the Human Genome Project generated nearly $800 billion of economic activity as of 2011. Clearly, we need to renew our commitment to basic research.

Finally, we need to rededicate ourselves to free and fair markets. In the United States, by almost every metric imaginable, whether it is industry concentration, occupational licensing, higher prices, lower wages or whatever else you want to look at, capitalism has been weakened by poor regulation and oversight. Not surprisingly, innovation has suffered.

Perhaps most importantly, we need to shift our focus from disrupting markets to creating them, from “The Hacker Way” to tackling grand challenges, and from a reductionist approach to an economy based on dignity and well-being. Make no mistake: The “Uber Economy” is not the solution, it’s the problem.

— Article courtesy of the Digital Tonto blog
— Image credits: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

Surviving Change

GUEST POST from Greg Satell

The Greek philosopher Heraclitus observed that “There is nothing permanent except change” and events over the past few thousand years would seem to prove him right. Yet while change may endure, the rate of change fluctuates over time. Throughout history, forces tend to cascade and converge on particular points.

By all indications, we are in such a period now. We are undergoing four major shifts in technology, resources, migration and demography that will be transformative. Clearly, these shifts will create significant opportunities, but also great peril. The last time we saw this much change afoot was during the 1920s and that didn’t end well.

Yet history is not destiny. We’re entering a new era of innovation in which our ability to solve problems will be unprecedented, and we can shape our path by making wise choices. Still, as we have seen with the Covid pandemic, the toughest challenges we face will have less to do with devising solutions than with changing behaviors and conquering ourselves.

Building A Shared Understanding Of The Problems We Need To Solve

The first step toward solving a problem is acknowledging that there is one. Even before Covid skeptics came into vogue, there was no shortage of pundits who denied climate change. For years, many considered Alan Greenspan to possess sage-like wisdom when he asserted that markets would self-correct. In the end, even he would admit that he was gravely mistaken.

The truth is that we live in a world of the visceral abstract, where strange theories govern much of our existence. People can debate the “big bang,” deny Darwin’s theory of natural selection or even deride these ideas as “lies straight from the pit of hell.” Many agreed when Senator Marco Rubio asserted that these things have nothing to do with our everyday lives.

Still, the reality is that modern existence depends on abstract theories almost every second of the day. Einstein’s theories may seem strange, but if GPS satellites aren’t calibrated to take them into account, we’re going to have a hard time getting where we want to go. In much the same way, the coronavirus doesn’t care what we think about Darwin: if it is allowed to replicate, it will mutate, and new, more deadly variants are likely to arise.

History shows that building a consensus to confront shared challenges is something that is firmly within our capability. The non-proliferation agenda of the 1950s led to concrete achievements such as the Partial Test Ban Treaty. When advances in gene therapy made the potential for danger clear, the Berg Letter called for a moratorium on the riskiest experiments until the dangers were better understood. These norms have been respected for decades.

Discovering Novel Solutions

Identifying and defining our challenges is just a first step. As Bill Gates pointed out, we still don’t know how to solve the climate crisis. Despite all the happy talk about technological advancement, productivity growth remains depressed. We’ve seen a global rise in populist authoritarianism and our inability to solve problems has surely contributed.

Put simply, we do not know how to overcome all of the challenges we face today. We need to innovate. However, innovation is never a single event, but a process of discovery, engineering and transformation. We can’t simply hope to adapt and overcome when a crisis hits, we need to innovate for the long-term.

Consider our response to the Covid crisis. Yes, the pandemic caught us off-guard and we should have been better prepared. But our most effective response wasn’t any of the emergency measures, but a three-decade effort that resulted in the development of mRNA vaccines. Even that was nearly killed in its cradle and surely would have been if it had not been for the dedication and perseverance of a young researcher named Katalin Karikó.

An emerging model taking hold is collaboration between government, academia and private industry. For example, JCESR is helping to create next-generation technologies in energy storage, the Partnership on AI is helping to map the future for cognitive technologies and the Manufacturing USA Institutes bring together diverse stakeholders to drive advancement.

Perhaps most of all, we need to start taking a more biological view of technology. We can no longer expect advancement to progress in an organized, linear way. We need to think less like engineers building a machine and more like gardeners who grow ecosystems to nurture new possibilities that we can’t yet imagine, but are lying beneath the surface.

Driving Adoption And Scaling Change

If there’s anything we’ve learned during the Covid pandemic, it is that developing a viable solution isn’t enough. Early measures, such as masking and social distancing, were met with disdain. The development of effective vaccines in record time was something of a miracle. Still, it was met with derision rather than gratitude in many communities.

This is not a new phenomenon. Good ideas fail all the time. From famous cases like that of Ignaz Semmelweis and William Coley to the great multitudes whose names are lost to history, any time a new idea threatens the status quo there will always be some that will seek to undermine it and they will do it in ways that are dishonest, underhanded and deceptive. If change is ever to prevail, we need to learn to anticipate and overcome resistance.

The good news is that it only takes a minority to embrace change in order for it to prevail. Everett Rogers found that it took only 10%-20% of system members to adopt an innovation for rapid adoption to follow. An analysis of over 300 political revolutions estimated that 3.5% active participation was enough. Other research suggests that the tipping point is 25% in an organization.

What we need is not more catchy slogans, divisive rhetoric or even charismatic leaders, but to empower movements made up of small groups, loosely connected but united by a shared purpose. My friend Srdja Popović provides great guides for social and political revolutionaries in both his book and his organization’s website. I have adapted many of these ideas for corporate and organizational contexts in Cascades.

Perhaps most importantly, as I recently pointed out in Harvard Business Review, transformation is fundamentally distinct from other stages of innovation. Coming up with a new idea or solution takes very different skills—and often different people—than driving adoption and scale.

Building A Bridge Through Shared Identity

Marshall McLuhan, one of the most influential thinkers of the 20th century, described media as “extensions of man” and predicted that electronic media would eventually lead to a global village. Communities, he believed, would no longer be tied to a single, isolated physical space but would connect and interact with others on a world stage.

What often goes untold is that McLuhan did not see the global village as a peaceful place. In fact, he predicted it would lead to a new form of tribalism and result in a “release of human power and aggressive violence” greater than ever in human history, as long-separated and emotionally charged cultural norms would constantly intermingle, clash and explode.

Today, what we most need to grapple with is the dystopia that McLuhan foresaw and described so eloquently and accurately. People do not vehemently reject science, trash Darwin, deny climate change or oppose life-saving vaccines because they have undergone some rational deductive process, but because these things offend their identity and sense of self. That, more than anything else, is why change fails.

Yet as Francis Fukuyama pointed out in his recent book, our identities are not fixed, but develop and change over time. We can seek to create a larger sense of self through building communities rooted in shared values. What’s missing in our public discourse today isn’t more or better information. What we lack is a shared sense of mission and purpose.

That is the challenge before us. It is not enough to devise solutions to the problems we face, although that in itself will require us to apply the best of our energies and skills. We will also have to learn to survive victory by overcoming the inevitable strife that change leaves in its wake.

— Article courtesy of the Digital Tonto blog
— Image credits: Pexels

The Reasons History Converges and Cascades

GUEST POST from Greg Satell

Throughout history there have been certain times and places that have given rise to phenomenal intellectual activity. The Vienna Circle and Cambridge’s Bloomsbury Group in the early 20th century are certainly examples, as is the Golden Age of Russian Literature in the mid-19th century and the post-war existentialist movement in Paris.

In a certain sense, these seem random, but they aren’t really. In each case, we can see undercurrents of politics, economics and other forces that gave rise to tensions people were trying to resolve. Great thinkers would explore, meet and influence each other, creating new directions and possibilities.

Yet it isn’t only intellectual life that converges in this way. History has a way of assembling forces around certain points of time and space, when long-standing trends intersect and give rise to new things. That’s why we study past events and learn about the lives of great personages long gone, so that we can hope to proactively recognize these forces and adapt.

1948: The Birth Of The Post-War Era

1948 was a pivotal year in many ways. Harry Truman was elected in a surprise upset over Thomas Dewey. Gandhi was assassinated in India. In South Africa, the white supremacist Nationalist party took power, making way for a half-century of Apartheid. The communists took power in Czechoslovakia and the Soviets sealed off Berlin. The western allies responded with a massive airlift, the likes of which the world had never seen.

Yet what would probably have more lasting effects than anything else that year didn’t involve great powers, armies or even political parties. In fact, the most consequential events of the year hardly made the newspapers and most people probably weren’t even aware of them. It was in 1948 that two breakthrough innovations at Bell Labs ushered in the digital age.

The first was the transistor, invented by John Bardeen, William Shockley and Walter Brattain. Up until that time, computers used vacuum tubes, which were big, clunky, slow and tended to burn out. Transistors made it possible to make computers exponentially faster and more reliable. They also made way for the integrated circuits we still use today.

The second breakthrough, Claude Shannon’s creation of information theory, was less obvious, but no less important. The basic idea was that information can be broken down into quantifiable entities he called binary digits (or bits for short). It was information theory, along with Shannon’s earlier work that showed how Boolean algebra could be transformed through mechanical means into logic gates, that made the information age possible.
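
A toy illustration (mine, not the article’s) helps make Shannon’s insight concrete: Boolean operations map directly onto switching circuits. A half-adder, the circuit that adds two one-bit numbers, falls out of just XOR and AND.

```python
# Sketch of Shannon's Boolean-algebra-to-logic-gates insight.
# A half-adder adds two single bits: XOR yields the sum bit,
# AND yields the carry bit -- exactly what a relay or transistor
# circuit computes.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits; return (sum_bit, carry_bit)."""
    return a ^ b, a & b

# Enumerate the truth table.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chain enough such gates together and you get full adders and, eventually, the arithmetic units of the machines the transistor made practical.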

When I spoke to Fred Brooks, who led the development of IBM’s legendary System 360 that would dominate computing for a generation, he explained how both innovations proved pivotal to his work. Of course, it was the transistor that made the IBM 360 possible, but he also told me that it was his decision to switch from a 6-bit byte to an 8-bit byte, which enabled the use of lowercase letters, that helped make it transformative.
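
The arithmetic behind that byte decision is easy to sketch (my illustration, not Brooks’s): an n-bit byte encodes 2^n distinct values, so moving from 6 to 8 bits quadrupled the character space.

```python
# Why the 6-bit to 8-bit byte switch mattered: an n-bit byte
# can represent 2**n distinct values.
six_bit_values = 2 ** 6    # 64: enough for digits, uppercase and some symbols
eight_bit_values = 2 ** 8  # 256: room for upper- AND lowercase, plus more

print(six_bit_values, eight_bit_values)  # 64 256

# Lowercase 'a' sits at code point 97, beyond a 6-bit code's 0-63 range
# but comfortably inside 8 bits.
print(ord("a") < six_bit_values)    # False
print(ord("a") < eight_bit_values)  # True
```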

1968: A Historically Tumultuous Year

While 1948 is remembered as a year of great events, 1946 is remembered for very different reasons. With the war over in both the European and Pacific theaters, soldiers coming home started settling down and making love. The inevitable result came in the years that followed: the Baby Boom generation was born.

As the horrors of war receded and a new era of prosperity emerged, the Boomers began to see things very differently than previous generations. They would question authority, challenging old values and ways of doing things. Many began to advocate for gender and racial equality. Unwilling to take the world as it was, they sought to remake it in their own image.

Tensions simmered throughout the 60s, but in 1968 they would combine and explode. The year started with the Prague Spring, when a number of modest reforms in Czechoslovakia, intended to bring about “Socialism with a human face,” were met by a brutal Soviet crackdown. A few months later, Polish authorities got the message and crushed internal protests advocating for similar reforms.

During the American spring of that year, Martin Luther King Jr. and Robert F. Kennedy were both assassinated. The summer brought, if anything, greater tumult. Bloody clashes between police and demonstrators at the Democratic National Convention discredited the party amongst many and paved the way for the election of Richard Nixon. Tommie Smith and John Carlos would raise their fists in a black power salute on the Olympic podium.

Perhaps most of all, 1968 represented a handing of the baton. The 20-somethings of the 1960s would become the 30-somethings of the 1970s. In the 1980s, they voted for Reagan in droves, and would shift how the United States saw and governed itself, as well as its place in the world.

1989: Berlin Wall and World Wide Web

In November 1989, there were two watershed events that would fundamentally change how the world worked. The fall of the Berlin Wall would end the Cold War and open up markets across the world. That very same month, Tim Berners-Lee would create the World Wide Web and usher in a new technological era of networked computing.

Like in 1948 and 1968, the forces leading up to these events had been building for some time. The Polish Solidarity movement, which had been active since 1980, united activists from labor and the intelligentsia. It showed that the Soviets could be successfully defied. As the price of oil dropped throughout the 1980s, the Eastern Bloc became increasingly untenable.

In a similar way, the development of the World Wide Web had been brewing for decades. The US government had been building out ARPANET and computer scientists had been developing hypertext since the 1960s. All of the technology was in place in 1989 and Berners-Lee was able to create what became the World Wide Web in less than a month.

1989 would mark an inflection point in which the world would shift from hierarchies to networks and the global village which Marshall McLuhan had envisioned came into being. Much like he predicted, however, this village was not a friendly place, but would result in a “release of human power and aggressive violence” from which we are still reeling.

The Power Of Cascades

In my book Cascades, I explained how small groups, loosely connected but united by a shared purpose drive transformational change. It happens gradually, almost imperceptibly, at first. Connections accumulate under the surface, barely noticed, as small groups slowly begin to link together and congeal into a network. Eventually things hit a tipping point.

It’s not just people that are networked, though; events are as well. There are always unseen connections between the forces of economics, technology, culture, politics and many other things. Much like social and political movements, the effects are almost impossible to detect at first, but can accelerate in nonlinear ways that defy the predictions of experts.

By all indications, we are in such a period now. We are undergoing four major shifts in technology, resources, migration and demography that will be transformative. Clearly, these shifts will create significant opportunities, but also great peril. The last time we saw this much change afoot was during the 1920s and that didn’t end well.

Yet that doesn’t have to happen. In 1948 we were able to create a new world order that ushered in an era of peace and prosperity unequalled in human history. The events of 1968 and 1989 also helped to bring about enormous progress. The difference between those epochs wasn’t so much due to any underlying forces, but the choices that were made.

Every generation faces great challenges. Some are remembered for their achievements, others for their tragedies. Like earlier generations, we have important choices to make. We should endeavor to choose wisely.

— Article courtesy of the Digital Tonto blog
— Image credits: Pexels

The Power of Cultural Competence in Leadership

GUEST POST from Greg Satell

When I first moved to Kyiv about 20 years ago, I met my friend Pavlo, who is from Belarus. Eventually our talk turned to that country’s leader, Alexander Lukashenko, and an incident in which he turned off the utilities at the US Ambassador’s residence, as well as those of other diplomats. It seemed totally outlandish and crazy to me.

“But he won,” Pavlo countered. I was incredulous, until he explained. “Lukashenko knows he’s a bastard and that the world will never accept him. In that situation all you can win is your freedom and that’s what he won.” It was a mode of thinking so outrageous and foreign to me that I could scarcely believe it.

Yet it opened my eyes and made me a more effective operator. We tend to think of empathy as an act of generosity, but it’s far more than that. Learning how to internalize diverse viewpoints is a skill we should learn not only because it helps make others more comfortable, but because it empowers us to successfully navigate an often complex and difficult world.

Shared Identity And Dominant Culture

We naturally tend to form groups based on identity. For example, in a study in which adults were randomly assigned to “leopards” and “tigers,” fMRI scans revealed hostility toward outgroup members. Similar results were found in a study involving five-year-old children, and even in infants. So to a certain extent, tribalism is unavoidable.

Evolutionary psychologists attribute this tendency to kin selection. Essentially, individuals who favor those most like them are more likely to see their own genes passed on. As Richard Dawkins famously pointed out, what we traditionally consider altruism can also be seen as selfish genes conniving to perpetuate themselves.

So it shouldn’t be surprising that organizational and institutional settings sustain our penchant for tribalism. Managers will tend to hire people who think and act in familiar ways. Those who reflect the preferences of the higher-ups will be more likely to get promoted and, in turn, recruit people like themselves. This all leads to a level of homogeneity that provides comfort and confidence.

We rarely welcome someone who threatens our sense of self. So those outside the dominant culture are encouraged to conform and are often punished when they don’t. They are less often invited to join in routine office socializing and promotions are less likely to come their way. When things go poorly, it’s much easier to blame the odd duck than the trusted insider.

How Success Leads To Failure

In The Structure of Scientific Revolutions, Thomas Kuhn describes how paradigms have a life cycle. They begin as an insight that succeeds in helping to solve a certain class of problems. As their usefulness grows, they rise to dominance and are rarely questioned or scrutinized anymore. People use them almost reflexively.

Yet over time certain anomalies arise in which the paradigm doesn’t quite fit. For example, in the 1970s and 80s, the minicomputer industry in Boston reigned supreme. Yes, there were some small startup firms out in California, but they were no match for giants like DEC, Data General and Apollo Computer. The California firms were gaining traction, but weren’t considered threats.

Unfortunately for the Boston firms, paradigms were shifting in a way that was extremely disadvantageous to them. They had built strong, dominant cultures that could execute plans efficiently, but weren’t sensitive to outside information. The Silicon Valley firms, which were more diverse and interconnected, were much more able to adapt to changing market realities.

The truth is that the next big thing always starts out looking like nothing at all. That’s why we so often miss it. If it was obvious, everyone would know about it and it wouldn’t surprise us. That’s why it’s so often those odd ducks, the anomalies that don’t quite fit with the dominant culture, that end up disrupting industries.

Acknowledging Difference

Organizations crave efficiency. It’s easy to measure, evaluate and compensate for. That’s why managers tend to favor cohesive cultures. If you hire and promote like-minded people, they will tend to achieve consensus quickly, without much deliberation or debate, and move quickly to action.

At the same time, inserting diversity stirs things up and makes people uneasy. For example, in a 2003 study involving 242 members of sororities and fraternities at Northwestern University, groups of students were asked to solve a murder mystery. The more homogenous teams felt more comfortable and confident, but were also much more likely to be wrong.

Real-world data suggests that these results are the rule, rather than the exception. A McKinsey report covering 366 public companies in a variety of countries and industries found that groups that were more ethnically and gender diverse performed significantly better than others. Many other studies have shown similar results.

What’s crucial to grasp is that diverse teams don’t perform better in spite of discomfort, but because of it. When people expect others to think like them, they look to build an easy consensus. However, when they expect a range of opinions and perspectives, they need to entertain more possibilities and scrutinize information more skeptically.

That’s why it’s so important to acknowledge differences. If we see the world as homogeneous, we are more likely to see dissenting opinions as hostile challenges instead of new possibilities to be explored. It is through acknowledging diverse viewpoints that we can best catalyze the kind of creativity and innovation that helps solve complex and important problems.

Building Shared Identity Through Shared Purpose

Living as a foreigner for 15 years forced me to acknowledge viewpoints very different from my own. Many seemed strange to me, some I found morally questionable, but all forced me to grow. Perhaps even more importantly, I made friends like Pavlo, all of whom helped shape me and how I perceive things.

Our identity and sense of self drives a lot of what we see and do, yet we rarely examine these things because we spend most of our time with people who are a lot like us, who live in similar places and experience similar things. That’s why our innate perceptions and beliefs seem normal and those of others strange, because our social networks shape us that way.

The purpose of an education is to help us see beyond our own experience, but that is in no way a passive skill. We need to continually renew it. Diversity only has value if we appreciate difference and are willing to explore and learn from it. It is only then that we can glean new insights and apply them to some useful endeavor.

It is at this nexus of identity and purpose that creativity and innovation reside, because when we learn to collaborate with others who possess knowledge, skills and perspectives that we don’t, new possibilities emerge. Make no mistake, however, breakthroughs are never truly serendipitous or random, but the product of a prepared mind.

And you prepare your mind through building cultural competence.

— Article courtesy of the Digital Tonto blog
— Image credits: Pexels

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

Don’t Listen to These Three Change Consultant Recommendations


GUEST POST from Greg Satell

The practice of change management is a relatively young discipline. It got its start in 1983, when McKinsey consultant Julien Phillips published a paper in the journal Human Resource Management. His ideas became McKinsey’s first change management model, which it sold to clients, and set the stage for much that came afterward.

Phillips’ work kicked off a number of similar approaches, such as Kotter’s 8-step model and the Prosci ADKAR model, and an industry was born. Today, hordes of “change consultants” ply their craft, working to communicate transformational ideas to inspire change. The results, unfortunately, have been rather dismal.

The simple truth is that change rarely fails because people don’t understand it, but because it is actively sabotaged by those who, for whatever reason, oppose it. That’s why any change strategy that depends on persuasion is bound to fail. If you want to bring change about, you need to identify those who believe in it and empower them to succeed.

1. Create A Sense Of Urgency Around Change

One of the basic tenets of change management that dates back to Phillips’ original paper is that you need to create a “sense of urgency” around change. So change leaders work to gain approval for a sizable budget as a sign of institutional commitment, recruit high-profile executives, arrange a big “kick-off” meeting and look to move fast and gain scale.

That may work for a conventional project, but for something that’s truly transformational, it’s a sure path to failure. The problem is that if a change is important and has real potential to impact what people believe and what they do, there will always be those who will hate it and they will work to undermine it in ways that are dishonest, underhanded and deceptive.

Starting off with a “big bang” can often unwittingly aid these efforts. Large scale change of any kind, even if the net effect is overwhelmingly positive, always causes some disruption. So appearing to work to overpower, rather than to attract, others can feed into the atmosphere of fear and loathing that opponents of change want to create. It also gives the opponents of change a head start to kill change before it really even starts.

A much better approach is to work to empower small groups, loosely connected, united by a shared purpose. For example, when Wyeth Pharmaceuticals began its shift to lean manufacturing, it started with a single team at a single plant, but led to a 25% reduction of costs across 25 sites encompassing 17,000 employees.

2. Start With A Quick, Easy Win

Another thing that change consultants regularly recommend is going for a “quick and easy win” in order to build momentum and establish credibility. The problem is that if the “win” isn’t meaningful, it will do little to drive change forward. In fact, touting a meaningless and irrelevant pseudo-accomplishment can make change leaders look out-of-touch and impractical.

A much more effective strategy is to start with a keystone change that represents a concrete and tangible goal, involves multiple stakeholders and paves the way for future change. That’s how you can begin to build real traction. While the impact of that early keystone change might be limited, a small, but meaningful, success can show what’s possible.

Consider PxG, a process improvement initiative at Procter & Gamble. It started out when three young executives set out to improve a single process. It wasn’t quick or easy. In fact, it took months of hard work. Nevertheless, they were able to transform a bottleneck that held up projects for weeks into a streamlined procedure that is completed in mere hours.

In a similar vein, when the global data giant Experian sought to transform itself into a cloud-based enterprise, it started with internal APIs that were much less risky than those that allowed access to outsiders. These weren’t really much simpler or easier than public APIs, but they showed the potential of cloud technology.

The truth is that sometimes you need to go slow in order to go fast. Transformation is not a linear process, but accelerates as it gains momentum. It pays to build your change effort on solid ground, rather than trying to lurch forward. Nothing slows you down more than a setback.

3. Prepare A Stakeholder Map

In any change process, a variety of stakeholders will have concerns. So consultants often suggest mapping the various stakeholders in terms of their level of enthusiasm, engagement, power to influence and other parameters. The idea is that by categorizing and cataloguing, you can better understand the forces at play.

This type of approach makes for impressive looking PowerPoint decks and intellectually appealing reports, but does little to achieve real change. The truth is that what most influences stakeholders are other stakeholders. Slicing and dicing them eighteen different ways isn’t going to do much more than confuse the situation.

However, for decades social and political movements have used tools such as the Spectrum of Allies and the Pillars of Support to change entire societies and they are just as effective in organizational transformations. Essentially, the idea is to divide stakeholders into two categories: constituencies and institutions (or those who wield institutional power).

So to transform education, you might mobilize support from parents, teachers and students to influence school boards, administrators and teachers unions to make changes. In a corporate context, you might want to mobilize groups of employees, customers and other constituencies to influence internal and external institutions such as senior leaders, the media, professional associations, regulators, labor unions etc.

The point is that you are always mobilizing somebody to influence something. Pure mobilization is nothing more than rabble rousing. Working quietly behind closed doors leaves you vulnerable to an uprising among the rank and file. Building support among constituencies can not only influence those with institutional power to act, it builds change on solid ground.

Focusing On The 25% That Matters

There is an inherent flaw in human nature that has endowed us with a burning desire to convince skeptics. So it shouldn’t be surprising that change consultants focus on persuasion. Nothing validates a high fee like some clever wordsmithing designed to persuade those hostile to the ideas of those paying the bill.

Yet anybody who has ever been married or had children knows how difficult it can be to convince even a single person of something they don’t want to be convinced of. To set out to persuade hundreds—or even thousands—that they should adopt an idea that they are inherently hostile to is not only hubris, but incredibly foolish.

It is also unnecessary. Scientific research suggests that the tipping point for change is only a 25% minority. Once a quarter of the people involved become committed to change, the rest will largely go along. So there is no need to convince skeptics. Your time and effort will be much better spent helping those who are enthusiastic about change to make it succeed.

That’s what the change consultants get wrong. You don’t “manage” change. You empower it by enabling those who believe in it to show it can work and then bringing in others who can bring in others still. The truth is that you don’t need a clever slogan to bring change about, you need a network. That’s how you create a movement that drives transformation.

— Article courtesy of the Digital Tonto blog
— Image credits: Pexels


A Shared Language for Radical Change


GUEST POST from Greg Satell

One of the toughest things about change is simply to have your idea understood. The status quo always has inertia on its side and never yields its power gracefully. People need a reason to believe in change, but they never need much convincing to allow things to go along as they always have. Inaction is the easiest thing in the world.

This can be incredibly frustrating. It doesn’t matter if you’re a political revolutionary, a social visionary or an entrepreneur, if you have an idea you think can impact the world, you want people to be as excited about it as you are. So you try to describe it in vivid language that highlights how wonderfully different it really is.

The pitfall that many would-be revolutionaries fall into is they fail to communicate in terms that others are able to accept and internalize. Make no mistake. Nobody needs to understand your idea. If you think your idea is important and want it to spread, then you need to meet people where they are, not where you’d like them to be. That’s how you make change real.

The Importance Of Finding Your Tribe

There’s no question that Pixar is one of the most successful creative enterprises ever. Yet in his memoir, Creativity, Inc., Pixar founder Ed Catmull wrote that “early on, all of our movies suck.” Catmull calls initial ideas “ugly babies,” because they start out, “awkward and unformed, vulnerable and incomplete.” Few can see what those ugly babies can grow into.

That’s why it’s important to start with a majority. You can always expand a majority out, but once you are in the minority you will either immediately feel pushback or, even worse, you will simply be ignored. If you can find a tribe of people who are as passionate about your idea as you are, you can empower them to succeed and bring in others to join you as well.

There is, however, a danger to this approach. Consider a study that examined networks of the cast and crew of Broadway plays. The researchers found that if no one had ever worked together before, results tended to be poor. However, if the networks among the cast and crew became too dense—becoming a close-knit tribe—performance also suffered.

The problem is that tribes tend to be echo chambers that filter outside voices. Consensus becomes doctrine and, eventually, gospel. Dissension is not only discouraged, but often punished. Eventually, a private language emerges that encodes the gospel into linguistic conventions and customs, and the outside world loses relevance within the tribe.

The Pitfalls Of A Private Language

Every field of endeavor must navigate the two competing needs: specialization and relevance. For example, a doctor treating a complex disease must master the private, technical language of her field to confer with colleagues, but must also translate those same concepts to a public, common language to communicate with patients in ways they can understand.

Yet as the philosopher Ludwig Wittgenstein explained, these types of private languages can be problematic. He made the analogy of a beetle in a box. If everybody had something in a box that they called a beetle, but no one could examine each other’s box, there would be no way of knowing whether everybody was actually talking about the same thing or not.

What Wittgenstein pointed out was that in this situation, the term “beetle” would lose relevance and meaning. It would simply refer to something that everybody had in their box, whatever that was. Everybody could just nod their heads not knowing whether they were talking about an insect, a German automobile or a British rock band. The same also happens with professional jargon and lingo.

I see this problem all the time in my work helping organizations to bring change about. People leading, say, a digital transformation are, not surprisingly, enthusiastic about digital technology and speak to other enthusiasts in the private, technical language native to their tribe. Unfortunately, to everyone else, this language holds little meaning or relevance. For all practical purposes, it might as well be a “beetle in a box.”

Creating A Shared Identity Through Shared Values And Shared Purpose

The easiest way to attack change is to position it as fundamentally at odds with the prevailing culture. In an organizational environment, those who oppose change often speak of undermining business models or corporate “DNA.” In much the same way, social and political movements are often portrayed as “foreign” or “radical.”

That’s why successful change efforts create shared identity through shared values and shared purpose. In the struggle for women’s voting rights in America, groups of Silent Sentinels would picket the White House with slogans taken from President Woodrow Wilson’s own books. To win over nationalistic populations in rural areas, the Serbian revolutionary movement Otpor made the patriotic plea, “Resistance, Because I Love Serbia.”

We find the same strategy effective in our work with organizational transformations. Not everybody loves technology, for example, but everybody can see the value of serving customers better, in operating more efficiently and in creating a better workplace. If you can communicate the need for change in terms of shared values and purpose, it’ll be easier for others to accept.

Even more importantly, people need to see that change can work. That’s why we always recommend starting with a keystone change, which represents a clear and tangible objective, involves multiple stakeholders and paves the way for future change. For example, with digital transformations, we advise our clients to automate the most mundane tasks first, even if those aren’t necessarily the highest priority tasks for the project.

Would You Rather Make A Point Or Make A Difference?

One of the most difficult things about leading change is that you need to let people embrace it for their own reasons, which might not necessarily be your own. When you’re passionate about an idea, you want others to see it the same way you do, with all its beautiful complexity and nuance. You want people to share your devotion and fervor.

Many change efforts end up sabotaging themselves for exactly this reason. People who love technology want others to love it too. Those who feel strongly about racial and gender-based diversity want everyone to see injustice and inequality just as they do. Innovators in any area can often be single-minded in their pursuit of change.

The truth is that we all have a need to be recognized and when others don’t share a view that we feel strongly about, it offends our sense of dignity. The danger, of course, is that in our rapture we descend into solipsism and fail to recognize the dignity of others. We proudly speak in a private language amongst our tribe and expect others to try and find a way in.

Yet the world simply doesn’t work that way. If you care about change, you need to hold yourself accountable to be an effective messenger. You have to make the effort to express yourself in terms that your targets of influence are willing to accept. That doesn’t in any way mean you have to compromise. It simply means that you need to advocate effectively.

In the final analysis, you need to decide whether you’d rather make a point, or make a difference.

— Article courtesy of the Digital Tonto blog
— Image credits: Pixabay


We Are Starving Our Innovation Economy


GUEST POST from Greg Satell

The Cold War was fundamentally different from any conflict in history. It was, to be sure, fought less over land, blood and treasure than over ideas. Communist countries believed that their ideology would prevail. They were wrong. The Berlin Wall fell and capitalism, it seemed, was triumphant.

Today, however, capitalism is in real trouble. Besides the threat of a rising China, the system seems to be crumbling from within. Income inequality in developed countries is at 50-year highs. In the US, the bastion of capitalism, markets have weakened by almost every imaginable metric. This wasn’t what we imagined winning would look like.

Yet we can’t blame capitalism. The truth is that its earliest thinkers warned about the potential for excesses that lead to market failure. The fact is that we did this to ourselves. We believed that we could blindly leave our fates to market and technological forces. We were wrong. Prosperity doesn’t happen by itself. We need to invest in an innovation economy.

Capitalism’s (Seemingly) Fatal Contradiction

Anyone who’s taken an “Economics 101” course knows about Adam Smith and his invisible hand. Essentially, the forces of self-interest, by their very nature, work to identify the optimal price that attracts just enough supply of a particular good or service to satisfy demand. This magical equilibrium point creates prosperity through an optimal use of resources.

However, some argued that the story wasn’t necessarily a happy one. After all, equilibrium implies a lack of economic profit and certainly businesses would want to do better than that. They would seek to gain a competitive advantage and, in doing so, create surplus value, which would then be appropriated to accumulate power to rig the system further in their favor.

Indeed, Adam Smith himself was aware of this danger. “People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices,” he wrote. In fact, the preservation of free markets was a major concern that ran throughout his work.

Yet as the economist Joseph Schumpeter pointed out, with innovation the contradiction dissipates. As long as we have creative destruction, market equilibriums are constantly shifting and don’t require capitalists to employ extractive, anti-competitive practices in order to earn excellent profits.

Two Paths To Profit

Anyone who manages a business must pursue at least one of two paths to profit. The first is to innovate. By identifying and solving problems in a competitive marketplace, firms can find new ways to create, deliver and capture value. Everybody wins.

Google’s search engine improved our lives in countless ways. Amazon and Walmart have dramatically improved distribution of goods throughout the economy, making it possible for us to pay less and get more. Pfizer and Moderna invested in an unproven technology that uses mRNA to deliver life-saving molecules and saved us from a deadly pandemic.

Still, the truth is that the business reality is not “innovate or die,” but rather “innovate or find ways to reduce competition.” There are some positive ways to tilt the playing field, such as building a strong brand or specializing in some niche market. However, other strategies are not so innocent. They seek to profit by imposing costs on the rest of us.

The first, called rent seeking, involves businesses increasing profits by getting legislation and regulation passed in their favor, as when car dealerships in New Jersey sued against Tesla’s direct sales model. The second, regulatory capture, seeks to co-opt agencies that are supposed to govern industry, resulting in favorable implementation and enforcement of the legal code.

Why “Pro-Business” Often Means Anti-Market

Corporations lobby federal, state and local governments to advance their interests and there’s nothing wrong with that. Elected officials should be responsive to their constituents’ concerns. That is, after all, how democracy is supposed to work. However, very often business interests try to maintain that they are arguing for the public good rather than their own.

Consider the issue of a minimum wage. Businesses argue that government regulation of wages is an imposition on the free market and that, given the magical forces of the invisible hand, letting the market set the price for wages would produce optimal outcomes. Artificially increasing wages, on the other hand, would unduly raise prices on the public and reduce profits needed to invest in competitiveness.

This line of argument is nothing new, of course. In fact, Adam Smith addressed it in The Wealth of Nations nearly 250 years ago:

Our merchants and master-manufacturers complain much of the bad effects of high wages in raising the price, and thereby lessening the sale of their goods both at home and abroad. They say nothing concerning the bad effects of high profits. They are silent with regard to the pernicious effects of their own gains. They complain only of those of other people.

At the same time corporations have themselves been undermining the free market for wages through the abuse of non-compete agreements. Incredibly, 38% of American workers have signed some form of non-compete agreement. Of course, most of these are illegal and wouldn’t hold up in court, but serve to intimidate employees, especially low-wage workers.

That’s just for starters. Everywhere you look, free markets are under attack. Occupational licensing, often the result of lobbying by trade associations, has increased five-fold since the 1950s. Antitrust regulation has become virtually nonexistent, while competition has been reduced in the vast majority of American industries.

Perhaps not surprisingly, while all this lobbying has been going on, recent decades have seen business investment and innovation decline, and productivity growth falter while new business formation has fallen by 50%. Corporate profits, on the other hand, are at record highs.

Getting Back On Track

At the end of World War II, America made important investments to create the world’s greatest innovation economy. The GI Bill made what is perhaps the biggest investment ever in human capital, sending millions to college and creating a new middle class. Investments in institutions such as the National Science Foundation (NSF) and the National Institutes of Health (NIH) would create scientific capital that would fuel US industry.

Unfortunately, we abandoned that very successful playbook. College tuition in the US has roughly doubled over the past 20 years. Perhaps not surprisingly, we’ve fallen to ninth among OECD countries for post-secondary education. Those who do graduate are often forced into decades of indentured servitude in the form of student loans.

At the same time, government investment in research as a percentage of GDP has been declining for decades, limiting our ability to produce the kinds of breakthrough discoveries that lead to exciting new industries. What passes for innovation these days displaces workers, but does not lead to significant productivity gains. Legislation designed to rectify the situation and increase our competitiveness stalled in the Senate.

So after 250 years, capitalism remains pretty much as Adam Smith first conceived, powerful yet fragile, always at risk of being undermined and corrupted by the same basic animal spirits that it depends on to set prices efficiently. He never wrote, nor is there any indication he ever intended, that markets should be left to their own devices. In fact, he and others warned us that markets need to be actively promoted and protected.

We are free to choose. We need to choose more wisely.

— Article courtesy of the Digital Tonto blog
— Image credits: Microsoft CoPilot


We Must Unlearn These Three Management Myths


GUEST POST from Greg Satell

Mark Twain is reported to have said, “It’s not what you don’t know that kills you, it’s what you know for sure that ain’t true.” Ignorance of facts is easily remedied. We can read books, watch documentaries or simply do a quick Google search. Yet our misapprehensions and biases endure, even in the face of contradicting facts.

The truth is that much of what we believe has less to do with how we weigh evidence than with how we see ourselves. In fact, fMRI studies have shown that evidence which contradicts our firmly held beliefs violates our sense of identity. Instead of adapting our views, we double down and lash out at those who criticize them.

This can be problematic in our personal lives, but in business it can be fatal. There is a reason that even prominent CEOs can pursue failed strategies and sophisticated investors will back hucksters to the hilt. Yet as Adam Grant points out in Think Again, we can make the effort to reexamine and alter our beliefs. Here are three myths that we need to watch out for.

Myth #1: The “Global Village” Will Be A Nice Place

Marshall McLuhan, in Understanding Media, one of the most influential books of the 20th century, described media as “extensions of man” and predicted that electronic media would eventually lead to a global village. Communities would no longer be tied to a single, isolated physical space but connect and interact with others on a world stage.

To many, the rise of the Internet confirmed McLuhan’s prophecy and, with the fall of the Berlin Wall, digital entrepreneurs saw their work elevated to a sacred mission. In Facebook’s IPO filing, Mark Zuckerberg wrote, “Facebook was not originally created to be a company. It was built to accomplish a social mission — to make the world more open and connected.”

Yet, importantly, McLuhan did not see the global village as a peaceful place. In fact, he predicted it would lead to a new form of tribalism and result in a “release of human power and aggressive violence” greater than ever in human history, as long separated—and emotionally charged—cultural norms would now constantly intermingle, clash and explode.

For many, if not most, people on earth, the world is often a dark and dangerous place. When your world is not secure, “open” is less an opportunity to connect than a vulnerability to exploit. Things can look fundamentally different from the vantage point of, say, a tech company in Menlo Park, California than they do from, say, a dacha outside Moscow.

Context matters. Our most lethal failures are less often those of planning, logic or execution than they are that of imagination. Chances are, most of the world does not see things the way we do. We need to avoid strategic solipsism and constantly question our own assumptions.

Myth #2: Winning The “War For Talent” Will Make You More Competitive

In 1997, three McKinsey consultants published a popular book titled The War for Talent, which argued that due to demographic shifts, recruiting the “best and the brightest” was even more important than “capital, strategy, or R&D.” The idea made a lot of sense. What could be more important for a company than its people?

Yet as Malcolm Gladwell explained in an article about Enron, strict adherence to the talent rule contributed to the firm’s downfall. Executives that were perceived to be talented moved up fast. So fast, in fact, that it became impossible to evaluate their performance. People began to worry more about impressing their boss and appearing to be clever than doing their jobs.

The culture became increasingly toxic and management continued to bet on the same failed platitude until the only way to move up in the organization was to undermine others. As we now know, it didn’t end well. Enron went bankrupt in 2001, just four years after The War for Talent highlighted it as a model for others to follow.

The simple truth is that talent isn’t what you win in a battle. It’s what you build by actualizing the potential of those in your organization and throughout your ecosystem, including partners, customers and the communities in which you operate. In the final analysis, Enron didn’t fail because it lost the war for talent, it failed because it was at war with itself.

Myth #3: We Can “Engineer” Management

In 1911, Frederick Winslow Taylor published The Principles of Scientific Management, based on his experience as a manager in a steel factory. It took aim at traditional management methods and suggested a more disciplined approach. Rather than have workers pursue tasks in their own manner, he sought to find “the one best way” and train accordingly.

Before long, Taylor’s ideas became gospel, spawning offshoots such as scientific marketing, financial engineering and the six sigma movement. It was no longer enough to simply work hard, you had to measure, analyze and optimize everything. Over the years these ideas became so central to business thinking that they were rarely questioned.

Yet they should have been. The truth is that this engineering mindset is a zombie idea, a remnant of the logical positivism that was discredited way back in the 1930s and more recent versions haven’t fared any better. To take just one example, a study found that of 58 large companies that announced Six Sigma programs, 91 percent trailed the S&P 500 in stock performance. Yet that didn’t stop the endless parade of false promises.

At the root of the problem is a simple fact: we don’t manage machines, we manage ecosystems, and we need to think more about networks and less about nodes. Our success or failure depends less on individual entities than on the connections between them. We need to think less like engineers and more like gardeners.

Don’t Believe Everything You Think

At any given time, there are any number of clever people saying clever things. When you invoke a legendary icon like Marshall McLuhan and say “Global Village,” the concept acquires the glow of some historical, unalterable destiny. But that’s an illusion, just like the “War for Talent” and the idea of “engineering” your way out of managing a business and making wise choices.

Yet notice the trap. None of these things were put forward as mere opinions or perspectives. The McKinsey consultants who declared the “War for Talent” weren’t just expressing an opinion, but revealing the results of a “yearlong study…involving 77 companies and almost 6,000 managers and executives.” (And presumably, they sold the study right back to every one of those 77 companies).

The truth is that an idea can never be validated backward, only forward. No amount of analysis can shape reality. We need to continually test our ideas, reconsider them and adapt them to ever-changing conditions. The problem with concepts like Six Sigma isn’t necessarily in their design, but that they become elevated to something approaching the sublime.

That’s why we shouldn’t believe everything we think. There are simply too many ways to get things wrong, while getting them right is always a relatively narrow path. Or, as Richard Feynman put it, “The first principle is that you must not fool yourself—and you are the easiest person to fool.”

— Article courtesy of the Digital Tonto blog
— Image credits: Pexels


Change Leaders Must Anticipate and Overcome Resistance


GUEST POST from Greg Satell

When Barry Libenson arrived at Experian as Global CIO in 2015, he devoted his first few months to speaking with customers. Everywhere he went he heard the same thing: they wanted access to real-time data. On the surface, it was a straightforward business transformation, but Libenson knew that it was far more complicated than that.

To switch from batch-processed credit reports to real-time access would require a technology transformation—from an on-premise to a cloud architecture—and in order to develop cloud applications effectively, he would have to initiate a skills-based transformation—from waterfall to agile development.

So what at first appeared to be a straightforward initiative was actually three separate transformations stacked on top of one another. To make things even more difficult, people had good reason to be hostile to each aspect. Still, by being strategic about overcoming resistance from the start, he achieved a full transformation in less than three years.

Understanding Cognitive Biases

One of the key concerns about Libenson’s program at Experian was that the company would lose control over its business model. The firm had prospered selling processed credit reports. Giving customers real-time access to data seemed to undercut a value proposition that had proven itself over decades, almost as if McDonald’s decided to stop selling hamburgers.

These were not casual criticisms. In fact, they reflected instinctual cognitive biases that are deeply rooted in our consciousness. The first, loss aversion, reflects our tendency to avoid losses rather than seek out new gains. The second, called the availability heuristic, reflects our preference for information that is easy to access and internalize, such as the decades of profits generated by credit reports rather than the vague promise of a new cloud-driven business model.

A similar dynamic plays out between the Black Lives Matter movement and police unions. One could argue, with significant evidence, that the smart play for police unions would be to come to some accommodation with protesters' concerns to avoid more draconian policies later on. Yet after meticulously building their power base for decades, they have shown little willingness to make concessions.

Libenson and his team were able to navigate these challenges with two key strategies. First, he started with internal APIs, rather than fully open applications, as a keystone change. That helped bridge the gap between the initial and desired future state. Second, the program was opt-in at first. Those program managers who were excited about creating cloud-based products got significant support. Those who weren't were left alone.

Navigating Asymmetrical Impacts

Another obstacle to overcome was the fact that some people were more affected than others. In the case of Experian’s skills-based transformation from waterfall to agile development, which was essential to making the business and technology transformations possible, the change hit more senior personnel harder than junior ones.

Many of the project managers at the company had been doing their jobs for years—even decades—and took great pride in their work. Now they were being told they needed to do their jobs very differently. For a junior employee with limited experience, that can be exciting. For those more invested in traditional methods, the transition can be more difficult.

Here again, the opt-in strategy helped navigate some thorny issues. Because no one was being forced to switch to agile development, it was hard for anyone to muster much resistance. At the same time, Libenson established an “API Center of Excellence” to empower those who were enthusiastic about creating cloud-based products.

As the movement to the cloud gained steam and began to generate real business results, the ability to build cloud-based projects became a performance issue. Managers that lagged began to feel subtle pressure to get with the program and to achieve what their colleagues had been able to deliver.

Overcoming Switching Costs

Experian facilitates billions of transactions a month. At that scale, you can’t just turn the ship on a dime. Another factor that increased the risk is the very nature of the credit business itself, which makes cybersecurity a major concern. In fact, one of Experian’s direct competitors, Equifax, had one of the biggest data breaches of the decade.

Every change encounters switching costs and that can slow the pace of change. In one particularly glaring example, the main library at Princeton University took 120 years to switch to the Library of Congress classification system because of the time and expense involved. Clearly, that’s an extreme case, but every change effort needs to take inevitable frictions into account.

That’s why Libenson didn’t push for speed initially, but started small, allowing the cloud strategy to slowly prove itself over time. As win piled upon win, the process accelerated and the transformation became more ingrained in the organization. Within just a few years, those who opposed the move to the cloud were in the distinct minority.

As General Stanley McChrystal explained in Team of Teams, he experienced a similar dynamic revamping Special Operations in Iraq. By shifting his force's focus from individual team goals to effective collaboration between teams, he may have slowed down individual units. However, as a collective, his forces increased their efficiency by a factor of seventeen, measured by the number of raids they were able to execute.

In every transformation, there is an inherent efficiency paradox. In order to produce change for the long-term, you almost always lose a little bit of efficiency in the short-term. That’s why it’s important to start small and build momentum as you go.

Leveraging Resistance To Forge A New Vision

Any change, if it is important and potentially impactful, is going to encounter fierce resistance. As Saul Alinsky noted, every revolution inspires its own counter-revolution. That's why three quarters of organizational transformations fail: managers too often see change as a communication exercise, rather than a strategic effort to empower those who are already enthusiastic about change to influence everyone else.

In the case of Experian’s move to the cloud, the objections were not unfounded. Offering customers real-time access to data did have the potential to upend the traditional credit report business model. Switching to a new technology architecture does raise cybersecurity concerns. Many senior project managers really had served the company well for decades with traditional development methods.

As Global CIO, Libenson could have ignored these concerns. He could have held a town hall and launched a major communication effort to convince the skeptics. Yet he did neither of these things. Instead, he treated the resistance not as an obstacle, but as a design constraint. He identified people who were already enthusiastic about the shift and empowered them to make it work. Their success built momentum and paved the way for what became a major transformation.

In fact, Experian’s cloud architecture unlocked enormous value for the firm and its customers. The company’s API hub made good on Libenson’s initial promise of supporting real-time access to data and today processes over 100 million transactions a month. It has also enabled a completely new business, called Ascend, now one of the company’s most successful products.

The truth is that bringing about fundamental, transformational change takes more than clever slogans and happy talk. The status quo always has inertia on its side and never yields its power gracefully. You need to be clear-eyed and hard-nosed. You need to understand that for every significant change, there will be some who seek to undermine it in ways that are dishonest, underhanded and deceptive.

The difference between successful revolutionaries and mere dreamers is that those who succeed anticipate resistance and build a plan to overcome it.

— Article courtesy of the Digital Tonto blog
— Image credits: Pexels


Innovation is Combination

Silicon Valley’s Innovator’s Dilemma – The Atom, the Bit and the Gene

GUEST POST from Greg Satell

Over the past several decades, innovation has become largely synonymous with digital technology. When the topic of innovation comes up, somebody points to a company like Apple, Google or Meta rather than, say, a car company, a hotel or a restaurant. Management gurus wax poetic about the "Silicon Valley way."

Of course, that doesn't mean that other industries haven't been innovative. In fact, there is no shortage of excellent examples of innovation in cars, hotels, restaurants and many other things. Still, the fact remains that for most of recent memory digital technology has moved further and faster than anything else.

This has been largely due to Moore’s Law, our ability to consistently double the number of transistors we’re able to cram onto a silicon wafer. Now, however, Moore’s Law is ending and we’re entering a new era of innovation. Our future will not be written in ones and zeros, but will be determined by our ability to use information to shape the physical world.
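The power of that consistent doubling is easy to underestimate. As a minimal sketch of the arithmetic (the ~2-year doubling period is an assumption, not a figure from the article), exponential doubling compounds dramatically:

```python
def doublings(years, period=2):
    """Growth factor after `years` of doubling every `period` years."""
    return 2 ** (years / period)

# Ten doublings over two decades yields a thousandfold improvement.
print(doublings(20))  # 1024.0
```

That compounding is why even modest-seeming annual gains, sustained for decades, left every other industry behind.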

The Atom

The concept of the atom has been around at least since the time of the ancient Greek philosopher Democritus. Yet it didn’t take on any real significance until the early 20th century. In fact, the paper Albert Einstein used for his dissertation helped to establish the existence of atoms through a statistical analysis of Brownian motion.

Yet it was the other papers from Einstein’s miracle year of 1905 that transformed the atom from an abstract concept to a transformative force, maybe even the most transformative force in the 20th century. His theory of mass-energy equivalence would usher in the atomic age, while his work on black-body radiation would give rise to quantum mechanics and ideas so radical that even he would refuse to accept them.

Ironically, despite Einstein’s reluctance, quantum theory would lead to the development of the transistor and the rise of computers. These, in turn, would usher in the digital economy, which provided an alternative to the physical economy of goods and services based on things made from atoms and molecules.

Still, the vast majority of what we buy is made up of what we live in, ride in, eat and wear. In fact, information and communication technologies only make up about 6% of GDP in advanced countries, which is what makes the recent revolution in materials science so exciting. We're beginning to exponentially improve the efficiency of how we design the materials that make up everything from solar panels to building materials.

The Bit

While the concept of the atom evolved slowly over millennia, the bit is one of the rare instances in which an idea seems to have arisen in the mind of a single person with little or no real precursor. Introduced by Claude Shannon in a paper in 1948—incidentally, the same year the transistor was invented—the bit has shaped how we see and interact with the world ever since.

The basic idea was that information isn't a function of content, but the absence of ambiguity, which can be broken down to a single unit: a choice between two alternatives. Much like a coin toss, which lacks information while in the air but takes on certainty when it lands, information arises when ambiguity disappears.

He called this unit a "binary digit," or "bit," and much like the pound, quart, meter or liter, it has become such a basic unit of measurement that it's hard to imagine our modern world without it. Shannon's work would soon combine with Alan Turing's concept of a universal computer to create the digital computer.
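Shannon's idea can be made concrete with a few lines of arithmetic. As an illustrative sketch (not from the article), his entropy formula H = -Σ p·log₂(p) measures information in bits: a fair coin toss carries exactly one bit, while a predictable outcome carries almost none.

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin resolves one full choice between two alternatives.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A heavily biased coin is nearly predictable, so it carries far less.
print(round(shannon_entropy([0.99, 0.01]), 3))

# A certain outcome resolves no ambiguity at all: zero bits.
print(shannon_entropy([1.0]))  # 0.0
```

The point mirrors the coin-toss analogy above: information is not the message itself but the uncertainty it removes.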

Now the digital revolution is ending and we will soon be entering a heterogeneous computing environment that will include things like quantum, neuromorphic and biological computing. Still, Claude Shannon’s simple idea will remain central to how we understand how information interacts with the world it describes.

The Gene

The concept of the gene was first discovered by an obscure Austrian monk named Gregor Mendel, but in one of those strange peculiarities of history, his work went almost totally unnoticed until the turn of the century. Even then, no one really knew what a gene was or how it functioned. The term was, for the most part, just an abstract concept.

That changed abruptly when James Watson and Francis Crick published their article in the scientific journal Nature. In a single stroke, the pair were able to show that genes were, in fact, made up of a molecule called DNA and that they operated through a surprisingly simple code made up of A, T, C and G.

Things really began to kick into high gear when the Human Genome Project was completed in 2003. Since then the cost to sequence a genome has been falling faster than the rate of Moore’s Law, which has unleashed a flurry of innovation. Jennifer Doudna’s discovery of CRISPR in 2012 revolutionized our ability to edit genes. More recently, mRNA technology has helped develop COVID-19 vaccines in record time.

Today, we have entered a new era of synthetic biology in which we can manipulate the genetic code of A, T, C and G almost as easily as we can the bits in the machines that Turing imagined all those years ago. Researchers are also exploring how we can use genes to create advanced materials and maybe even create better computers.

Innovation Is Combination

The similarity of the atom, the bit and the gene as elemental concepts is hard to miss and they’ve allowed us to understand our universe in a visceral, substantial way. Still, they arose in vastly different domains and have been largely applied to separate and distinct fields. In the future, however, we can expect vastly greater convergence between the three.

We’ve already seen glimpses of this. For example, as a graduate student Charlie Bennett was a teaching assistant for James Watson. Yet in between his sessions instructing undergraduates in Watson’s work on genes, he took an elective course on the theory of computing in which he learned about the work of Shannon and Turing. That led him to go work for IBM and become a pioneer in quantum computing.

In much the same way, scientists are applying powerful computers to develop new materials and design genetic sequences. Some of these new materials will be used to create more powerful computers. In the future, we can expect the concepts of the atom, the bit and the gene to combine and recombine in exciting ways that we can only begin to imagine today.

The truth is that innovation is combination, and it always has been. The past few decades, in which one technology so thoroughly dominated that it was able to function largely in isolation from other fields, was an anomaly. What we are beginning to see now is, in large part, a reversion to the mean, where the most exciting work will be interdisciplinary.

This is Silicon Valley’s innovator’s dilemma. Nerdy young geeks will no longer be able to prosper coding blithely away in blissful isolation. It is no longer sufficient to work in bits alone. Increasingly we need to combine those bits with atoms and genes to create significant value. If you want to get a glimpse of the future, that’s where to look.

— Article courtesy of the Digital Tonto blog
— Image credits: Pixabay
