Author Archives: Greg Satell

About Greg Satell

Greg Satell is a popular speaker and consultant. His latest book, Cascades: How to Create a Movement That Drives Transformational Change, is available now. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.

Importance of Long-Term Innovation

GUEST POST from Greg Satell

Scientists studying data from Mars recently found that the red planet may have oceans’ worth of water embedded in its crust in addition to the ice caps at its poles. The finding is significant because, if we are ever to build a colony there, we will need access to water to sustain life and, eventually, to terraform the planet.

While it’s become fashionable for people to lament short-term thinking and “quarterly capitalism,” it’s worth noting that there are a lot of people working on—and a not insignificant amount of money invested in—colonizing another world. Many dedicate entire careers to a goal they do not expect to be achieved in their lifetime.

The truth is that there is no shortage of organizations that are willing to invest for the long term. In fact, nascent technologies that are unlikely to pay off for years are still able to attract significant investment. The challenge is to come up with a vision that is compelling enough to inspire others, while being practical enough that you can actually make it happen.

The Road to a Miracle Vaccine

When the FDA announced that it was granting an emergency use authorization for Covid-19 vaccines, everybody was amazed at how quickly they were developed. That sense of wonder only increased when it was revealed that they were designed in a mere matter of days. Traditionally, vaccines take years, if not decades, to develop.

Yet appearances can be deceiving. What looked like a 10-month sprint to a miracle cure was actually the culmination of a three-decade effort that started in the 90s with a vision of a young researcher named Katalin Karikó, who believed that a molecule called mRNA could hold the key to reprogramming our cells to produce specific protein molecules.

The problem was that, although theoretically once inside the cytoplasm mRNA could instruct our cell machinery to produce any protein we wanted, our bodies tend to reject it. However, working with her colleague Drew Weissman, Karikó figured out that they could slip it past our natural defenses by slightly modifying the mRNA molecule.

It was that breakthrough that led two startup companies, Moderna and BioNTech, to license the technology and investors to back it. Still, it would take more than a decade and a pandemic before the bet paid off.

The Hard Road of Hard Tech

In the mid-90s, when the Internet started to take off, companies with no profits soon began attracting valuations that seemed insane. Yet the economist W. Brian Arthur explained that under certain conditions—namely high initial investment, low or negligible marginal costs and network effects—firms could defy economic gravity and produce increasing returns.

Arthur’s insight paved the way for the incredible success of Silicon Valley’s brand of venture-funded capitalism. Before long, runaway successes such as Yahoo, Amazon and Google made those who invested in the idea of increasing returns a mountain of money.

Yet the Silicon Valley model only works for a fairly narrow slice of technologies, mostly software and consumer gadgets. For other, so-called “hard technologies,” such as biotech, clean tech, materials science and manufacturing 4.0, the approach isn’t effective. There’s no way to rapidly prototype a cure for cancer or a multimillion-dollar piece of equipment.

Still, over the last decade a new ecosystem has been emerging that specifically targets these technologies. Some, like the LEEP programs at the National Laboratories, are government funded. Others, such as Steve Blank’s I-Corps program, focus on training scientists to become entrepreneurs. There is also a growing number of investors who specialize in hard tech.

Look closely and you can see a subtle shift taking place. Traditionally, venture investors have been willing to take market risk but not technical risk. In other words, they wanted to see a working prototype, but were willing to take a flyer on whether demand would emerge. This new breed of investor is taking on technical risk with technologies, such as new sources of energy, for which there is little market risk if they can be made to work.

The Quantum Computing Ecosystem

At the end of 2019, Amazon announced Braket, a new quantum computing service that would utilize technologies from companies such as D-Wave, IonQ, and Rigetti. They were not alone. IBM had already been building its network of quantum partners for years, which included high-profile customers ranging from Goldman Sachs to ExxonMobil to Boeing.

Here’s the catch. Quantum computers can’t yet be used by anybody for any practical purpose. In fact, there’s nobody on earth who can even tell you definitively how quantum computing should work or exactly what types of problems it can be used to solve. There are a number of different approaches being pursued, but none of them has proved out yet.

Nevertheless, an analysis by Nature found that private funding for quantum computing is surging, not just for hardware but also for enabling technologies like software and services. The US government has created a $1 billion quantum technology plan and has set up five quantum computing centers at the national labs.

So if quantum computing is not yet a proven technology, why is it generating so much interest? The truth is that the smart players understand that the potential of quantum is so massive, and the technology itself so different from anything we’ve ever seen before, that it’s imperative to start early. Fall behind and you may never catch up.

In other words, they’re thinking for the long term.

A Plan Isn’t Enough, You Need To Have A Vision

It’s become fashionable to bemoan the influence of investors and blame them for short-term thinking and “quarterly capitalism,” but that’s just an excuse for failed leadership. If you look at the world’s most valuable companies—the ones investors most highly prize—you’ll find a very different story.

Apple’s Steve Jobs famously disregarded the opinions of investors (and just about everybody else as well). Amazon’s Jeff Bezos, who habitually keeps margins low in order to increase market share, has long been a Wall Street darling. Microsoft invested heavily in a research division aimed at creating technologies that won’t pan out for years or even decades.

The truth is that it’s not enough to have a long-term plan; you have to have a vision to go along with it. Nobody wants to “wait” for profits, but everybody can get excited about a vision that inspires them. Who doesn’t get thrilled by the possibility of a colony on Mars, miracle cures, revolutionary new materials or a new era of computing?

Here’s the thing: Just because you’re not thinking long-term doesn’t mean somebody else isn’t and, quite frankly, if they are able to articulate a vision to go along with that plan, you don’t stand a chance. You won’t survive. So take some time to look around, to dream a little bit and, maybe, to be inspired to do something worthy of a legacy.

Not all who wander are lost.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


Why Change Failure Occurs

GUEST POST from Greg Satell

Never has the need for transformation been so dire or so clear. Still, that’s no guarantee that we will muster the wisdom to make the changes we need to. After all, President Bush warned us about the risks of a global pandemic way back in 2005 and, in the end, we were left wholly vulnerable and exposed.

It’s not like pandemics are the only thing to worry about either. A 2018 climate assessment warns of major economic impacts unless we make some serious shifts. Public debt, already high before the current crisis, is now exploding upwards. Our electricity grid is insecure and vulnerable to cyberattack. The list goes on.

All too often, we assume that mere necessity can drive change forward, yet history has shown that not to be the case. There’s a reason why nations fail and businesses go bankrupt. The truth is that if a change is important, some people won’t like it and they will work to undermine it in underhanded and insidious ways. That’s what we need to overcome.

A Short History Of Change

For most of history, until the industrial revolution, people existed as they had for millennia and could live their entire lives without seeing much change. They farmed or herded for a living, used animals for power and rarely travelled far from home. Even in the 20th century, most people worked in an industry that changed little during their career.

In the 1980s, management consultants began to notice that industries were beginning to evolve more rapidly and that firms which didn’t adapt would lose out in the marketplace. One famous case study showed how Burroughs moved aggressively into electronic computing and prospered, while its competitor NCR lagged and faded into obscurity.

In 1983, McKinsey consultant Julien Phillips published a paper in the journal Human Resource Management that described an “adoption penalty” for firms that didn’t adapt to changes in the marketplace quickly enough. His ideas became McKinsey’s first change management model, which it sold to clients.

Yet consider that research shows that in 1975, during the period Phillips studied, 83% of the average US corporation’s assets were tangible, such as plant, machinery and buildings, while by 2015, 84% of corporate assets were intangible, such as licenses, patents and human capital. In other words, change today is less about strategic assets and more about people, their knowledge and their behaviors.

Clearly, that changes the game entirely.

What Change Looks Like Today

Think about how America was transformed after World War II. We created the Interstate Highway System to tie our nation together. We established a new scientific infrastructure that made us a technological superpower. We built airports, shopping malls and department stores. We even sent a man to the moon.

Despite the enormous impact of these accomplishments, none of those things demanded that people dramatically change their behavior. Nobody had to drive on an Interstate highway, work in a lab, travel in space or move to the suburbs. Many chose to do those things, but others did not and paid little or no penalty for their failure to change with the times.

Today the story is vastly different. A crisis like Covid-19 required us to significantly alter our behavior and, not surprisingly, some people didn’t like it and resisted. We could, as individuals, choose to wear a mask, but if others didn’t follow suit the danger remained. We can, as a society, invest billions in a vaccine, but if a significant portion don’t take it, the virus will continue to mutate at a rapid rate, undermining the effectiveness of the entire enterprise.

Organizations face similar challenges. Sure, they invest in tangible assets, such as plant and equipment, but any significant change will involve changing people’s beliefs and behaviors, and that is a different matter altogether. Today, even technological transformations have a significant human component.

Making Room For Identity And Dignity

In the early 19th century, a movement of textile workers known as the Luddites smashed machines to protest the new, automated mode of work. As skilled workers, they saw their way of life being destroyed in the name of progress, because the new technology could make fabrics faster and cheaper with fewer, less skilled workers.

Today, “Luddite” has become a pejorative term to describe people who are unable or unwilling to accept technological change. Many observers point out that the rise of industry created new and different jobs and increased overall prosperity. Yet that largely misses the point. Weavers were skilled artisans who worked for years to hone their craft. What they did wasn’t just a job, it was who they were and what they took pride in.

One of the great misconceptions of our modern age is that people make decisions based on rational calculations of utility and that, by engineering the right incentives, we can control behavior. Yet people are far more than economic entities. They crave dignity and recognition, to be valued, in other words, as ends in themselves rather than merely as means to an end.

That’s why changing behaviors can be such a tricky thing. While some may see being told to wear a mask or socially distance as simply doing what “science says,” for others it is an imposition on their identity and dignity from outside their community. Perhaps not surprisingly, they rebel and demand to have their right to choose be recognized.

Building Change On Common Ground

The biggest misconception about change is that once people understand it, they will embrace it, and so the best way to drive change forward is to explain the need for change in a very convincing and persuasive way. Change, in this view, is essentially a communication exercise, and the right combination of words and images is all that is required.

Yet as should be clear by now, that is simply not true. People will often oppose change because it asks them to alter their identity. The Luddites didn’t just oppose textile machinery on economic grounds, but because it failed to recognize their skills as weavers. People don’t necessarily oppose wearing masks because they are “anti-science,” but because they resent having their behavior mandated from outside their community.

In other words, change is always, at some level, about what people value. That’s why to bring change about you need to identify shared values that reaffirm, rather than undermine, people’s sense of identity. Recognition is often a more powerful incentive than even financial rewards. In the final analysis, lasting change always needs to be built on common ground.

Over the next decade, we will undergo some of the most profound shifts in history, encompassing technology, resources, migration patterns and demography and, if we are to compete, we will need to achieve enormous transformation in business and society. Whether we are able to do that or not depends less on economics or “science” than it does on our ability to trust each other again.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


We’re Disrupting People Instead of Industries Now

GUEST POST from Greg Satell

In 1997, when Clayton Christensen first published The Innovator’s Dilemma and introduced the term “disruptive innovation,” it was a clarion call. Business leaders were put on notice: It is no longer enough to simply get better at what you already do, you need to watch out for a change in the basis of competition that will open the door for a disruptive competitor.

Today, it’s become fashionable for business pundits to say that we live in a VUCA era, one that is volatile, uncertain, complex and ambiguous, but the evidence says otherwise. Increasingly researchers are finding that businesses are enjoying a period that is less disruptive, less competitive and less dynamic.

The truth is that we don’t really disrupt businesses anymore, we disrupt people and that’s truly becoming a problem. As businesses are increasingly protected from competition, they are becoming less innovative and less productive. Americans, meanwhile, are earning less and paying more. It’s time we stop doubling down on failed ideas and begin to right the ship.

The Productivity Paradox

In the 1920s two emerging technologies, internal combustion and electricity, finally began to hit their stride and kicked off a 50-year boom in productivity growth. During that time things changed dramatically. We shifted from a world where few Americans had indoor plumbing, an automobile or electrical appliances to one in which the average family had all of these things.

Technology enthusiasts like to compare the digital revolution with that earlier era, but that’s hardly the case. If anybody today were magically transported 50 years back to 1970, they would see much they would recognize. Yet if most modern people had to live in 1920, when even something as simple as cooking a meal required hours of backbreaking labor, they would struggle to survive.

The evidence is far more than anecdotal, however. Productivity statistics clearly show that productivity growth started to slow in the early 1970s, just as computer investment began to rise. With the introduction of the Internet, there was a brief bump in productivity between 1996 and 2004, but then it disappeared again. Today, even with the introduction of social media, mobile Internet and artificial intelligence, we appear to be in a second productivity paradox.

Businesses can earn an economic profit in one of two ways. They can unlock new value through innovation or they can seek to reduce competition. In an era of diminished productivity, it shouldn’t be surprising that many firms have chosen the latter. What is truly startling is the ease and extent to which we have let them get away with it.

Rent Seeking And Regulatory Capture

Investment decisions are driven by profit expectations. If, for instance, a firm sees great potential in a new technology, it will invest in research and development. On the other hand, if it sees greater potential in influencing governments, it will invest in that. So it is worrying that lobbying expenditures have more than doubled since 1998.

The money goes towards two basic purposes. The first, called rent seeking, involves businesses increasing profits by getting the law to work in their favor, as when car dealerships in New Jersey sued to block Tesla’s direct sales model. The second, regulatory capture, seeks to co-opt the agencies that are supposed to govern industry.

It seems like they’re getting their money’s worth. Corporate tax rates in the US have steadily decreased and are now among the lowest in the developed world. Occupational licensing, often the result of lobbying by trade associations, has increased fivefold since the 1950s. Antitrust enforcement has become virtually nonexistent, and competition has been reduced.

The result is that while corporations earn record profits, we pay more and get less. This is especially clear in some highly visible industries, such as airlines, cable and mobile carriers, but the effect is much more widespread than that. Keep in mind that, in many states, legislators earn less than $20,000 per year. It’s easy to see how a little investment can go a long way.

Decreasing Returns To Labor

With businesses facing less competition and a more favorable regulatory environment, which not only lowers costs but raises barriers to new market entrants, it shouldn’t be surprising that the stock market has hit record highs. Ordinarily that would be something to cheer, but evidence suggests that the gains are coming at the expense of the rest of us.

A report from the McKinsey Global Institute finds that labor’s share of income has been declining rapidly since 2000, especially in the United States. This is, of course, due to a number of factors, such as low productivity, automation and globalization. Decreased labor bargaining power due to the increased market power of employers, however, has been shown to play an especially significant role.

At the same time that our wages have been reduced, the prices we pay have increased, especially in education and healthcare. A study from Pew shows that, for most Americans, real wages have hardly budged since 1964. Instead of becoming better off over time, many families are actually doing worse.

The effects of this long-term squeeze have become dire. Increasingly, Americans are dying deaths of despair from things like alcohol abuse, drug overdose, and suicide. Recent research has also shown that the situation has gotten worse during Covid.

We Are Entering A Dangerous Decade

Decades of disruption have left us considerably worse off. Income inequality is at record highs. Anxiety and depression, already at epidemic levels, have worsened during the Covid-19 pandemic. These trends are most acute in the US, but are essentially global in nature and have contributed to the rise in populist authoritarianism around the world.

Things are likely to get worse over the next decade as we undergo profound shifts in technology, resources, migration and demographics. To put that in perspective, a demographic shift alone was enough to make the 60s a tumultuous era. Clearly, our near future is fraught with danger.

Yet history is not destiny. We have the power to shape our path by making better choices. A good first step would be to finally abandon the cult of disruption that’s served us so poorly and begin to once again invest in stability and resilience, by creating better, safer technology, more competitive and stable markets and a happier, more productive workforce.

Perhaps most of all, we need to internalize the obvious principle that systems and ideologies should serve people, not the other way around. If we increase GDP and the stock market hits record highs, but the population is poorer, less healthy and less happy, then what have we won?

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


A New Age Of Innovation and Our Next Steps

GUEST POST from Greg Satell

In Mapping Innovation, I wrote that innovation is never a single event, but a process of discovery, engineering and transformation and that those three things hardly ever happen at the same time or in the same place. Clearly, the Covid-19 pandemic marked an inflection point which demarcated several important shifts in those phases.

Digital technology showed itself to be transformative, as we descended into quarantine and found an entire world of video conferencing and other technologies that we scarcely knew existed. At the same time it was revealed that the engineering of synthetic biology—and mRNA technology in particular—was more advanced than we had thought.

This is just the beginning. I titled the last chapter of my book “A New Era of Innovation,” because it had become clear that we had begun to cross a new Rubicon, in which digital technology becomes so ordinary and mundane that it’s hard to remember what life was like without it, while new possibilities alter existence to such an extent that we will scarcely believe it.

Post-Digital Architectures

For the past 50 years, the computer industry—and information technology in general—has been driven by the principle known as Moore’s Law, which predicted that we could double the number of transistors on a chip every 18 months. Yet now Moore’s Law is ending, and that means we will have to revisit some very basic assumptions about how technology works.

To be clear, the end of Moore’s Law does not mean the end of advancement. There are a number of ways we can speed up computing. We can, for instance, use technologies such as ASICs and FPGAs to optimize chips for specialized tasks. Still, those approaches come with tradeoffs; Moore’s Law essentially gave us innovation for free.

Another way out of the Moore’s Law conundrum is to shift to completely new architectures, such as quantum, neuromorphic and, possibly, biological computers. Yet here again, the transition will not be seamless or without tradeoffs. Instead of technology based on transistors, we will have multiple architectures based on entirely different logical principles.

So it seems that we will soon be entering a new era of heterogeneous computing, in which we use digital technology to access different technologies suited to different tasks. Each of these technologies will require very different programming languages and algorithmic approaches and, most likely, different teams of specialists to work on them.

What that means is that those who run the IT operations in the future, whether that person is a vaunted CTO or a lowly IT manager, will be unlikely to understand more than a small part of the system. They will have to rely heavily on the expertise of others to an extent that isn’t required today.

Bits Driving Atoms

While the digital revolution does appear to be slowing down, computers have taken on a new role in helping to empower technologies in other fields, such as synthetic biology, materials science and manufacturing 4.0. These, unlike so many digital technologies, are rooted in the physical world and may have the potential to be far more impactful.

Consider the revolutionary mRNA technology, which not only empowered us to develop a Covid vaccine in record time and save the planet from a deadly pandemic, but also makes it possible to design new vaccines in a matter of hours. There is no way we could achieve this without powerful computers driving the process.

There is similar potential in materials discovery. Suffice it to say, every product we use, whether it is a car, a house, a solar panel or whatever, depends on the properties of materials to perform its function. Some need to be strong and light, while others need special electrical properties. Powerful computers and machine learning algorithms can vastly improve our ability to discover better materials (not to mention overcome supply chain disruptions).

Make no mistake, this new era of innovation will be one of atoms, not bits. The challenge we face now is to develop computer scientists who can work effectively with biologists, chemists, factory managers and experts of all kinds to truly create a new future.

Creation And Destruction

The term creative destruction has become so ingrained in our culture that we scarcely stop to think where it came from. It was coined by economist Joseph Schumpeter to resolve what many saw as an essential “contradiction” of capitalism. Essentially, some thought that if capitalists did their jobs well, there would be an increasing surplus, which would then be appropriated to accumulate power and rig the system further in capitalists’ favor.

Schumpeter pointed out that this wasn’t necessarily true because of technological innovation. Railroads, for example, completely changed the contours of competition in the American Midwest. Surely, there had been unfair competition in many cities and towns, but once the railroad came to town, competition flourished (and if it didn’t come, the town died).

For most of history since the beginning of the Industrial Revolution, this has been a happy story. Technological innovation displaced businesses and workers, but resulted in increased productivity which led to more prosperity and entirely new industries. This cycle of creation and destruction has, for the most part, been a virtuous one.

That is, until fairly recently. Digital technology, despite the hype, hasn’t produced the type of productivity gains that earlier technologies, such as electricity and internal combustion, did, but it has displaced labor at a faster rate. Put simply, the productivity gains from digital technology are too meager to finance enough new industries with better jobs, which has created income inequality rather than greater prosperity.

We Need To Move From Disrupting Markets To Tackling Grand Challenges

There’s no doubt that digital technology has been highly disruptive. In industry after industry, from retail to media to travel and hospitality, nimble digital upstarts have set established industries on their head, completely changing the basis upon which firms compete. Many incumbents haven’t survived. Many others are greatly diminished.

Still, in many ways, the digital revolution has been a huge disappointment. Besides the meager productivity gains, we’ve seen a global rise in authoritarian populism, stagnant wages, reduced productivity growth and weaker competitive markets, not to mention an anxiety epidemic, increased obesity and, at least in the US, decreased life expectancy.

We can—and must—do better. We can learn from the mistakes we made during the digital revolution and shift our mindset from disrupting markets to tackling grand challenges. This new era of innovation will give us the ability to shape the world around us like never before, at a molecular level, and achieve incredible things.

Yet we can’t just leave our destiny to the whims of market and technological forces. We must actually choose the outcomes we prefer and build strategies to achieve them. The possibilities that we will unlock from new computing architectures, synthetic biology, advanced materials science, artificial intelligence and other things will give us that power.

What we do with it is up to us.

— Image credit: Pixabay


A Trigger Strategy for Driving Radical, Transformational Change

GUEST POST from Greg Satell

There’s an old adage that says we should never let a crisis go to waste. The point is that during a crisis there is a visceral sense of urgency and resistance often falls by the wayside. We’ve certainly seen that during the Covid pandemic. Digital technologies such as video conferencing, online grocery and telehealth have gone from fringe to mainstream in record time.

Seasoned leaders learn how to make good use of a crisis. Consider Bill Gates and his Internet Tidal Wave memo, which leveraged what could have been a mortal threat to Microsoft into a springboard to even greater dominance. Or how Steve Jobs used Apple’s near-death experience to reshape the ailing company into a powerhouse.

But what if we could prepare for a trigger before it happens? The truth is that indications of trouble are often clear long before the crisis arrives. Clearly, there were a number of warning signs that a pandemic was possible, if not likely. As every good leader knows, there’s never a shortage of looming threats. If we learn to plan ahead, we can make a crisis work for us.

The Plan Hatched In A Belgrade Cafe

In the fall of 1998, five young activists met in a coffee shop in Belgrade, Serbia. Although still in their twenties, they were already grizzled veterans. In 1992, they took part in student protests against the war in Bosnia. In 1996, they helped organize a series of rallies in response to Slobodan Milošević’s attempt to steal local elections.

Up to that point, their results had been decidedly mixed. The student protests were fun, but when the semester ended, everyone went home for the summer and that was the end of that. The 1996 protests were more successful, overturning the fraudulent results, but the opposition coalition, called “Zajedno,” soon devolved into infighting.

So they met in the coffee shop to discuss their options for the upcoming presidential election to be held in 2000. They knew from experience that they could organize rallies effectively and get people to the polls. They also knew that when they got people to the polls and won, Milošević would use his power and position to steal the election.

That would be their trigger.

The next day, six friends joined them, and they called their new organization Otpor. Things began slowly, with mostly street theater and pranks, but within two years their ranks had swelled to more than 70,000. When Milošević tried to steal the election, they were ready, and what is now known as the Bulldozer Revolution erupted.

The Serbian strongman was forced to concede. The next year, Milošević would be arrested and sent to The Hague for his crimes against humanity. He would die in his prison cell in 2006, while still awaiting trial.

Opportunity From The Ashes

In 2014, in the wake of the Euromaidan protests that swept the thoroughly corrupt autocrat Viktor Yanukovych from power, Ukraine was in shambles. The country had been looted of roughly $100 billion (close to its entire GDP) and invaded by Russia, and things looked bleak. Without western aid, the proud nation’s very survival was in doubt.

Yet for Vitaliy Shabunin and the Anti-Corruption Action Center, it was a moment he had been waiting for. Shabunin had established the organization with his friend Dasha Kaleniuk a few years earlier. Since then, they, along with a small staff, had been working with international NGOs to document corruption and develop effective legislation to fight it.

With Ukraine’s history of endemic graft, which had greatly worsened under Yanukovych, progress had been negligible. Yet now, with the IMF and other international institutions demanding reform, Shabunin and Kaleniuk were instantly in demand to advise the government on instituting a comprehensive anti-corruption program, which passed in record time.

Yet they didn’t stop there either. “Our long-term strategy is to create a situation in which it will be impossible not to do anti-corruption reforms,” Shabunin would later tell me. “We are working to ensure that these reforms will be done, either by these politicians or by another, because they will lose their office if they don’t do these reforms.”

Vitaliy, Dasha and the Anti-Corruption Action Center continue to prepare for future triggers.

The Genius Of Xerox PARC

One story that Silicon Valley folks love to tell involves Steve Jobs and Xerox. After the copier giant made an investment in Apple, which was then a fledgling company, it gave Jobs access to its Palo Alto Research Center (PARC). He then used the technology he saw there to create the Macintosh. Jobs built an empire based on Xerox’s oversight.

Yet the story misses the point. By the late 60s, Xerox CEO Peter McColough knew that the copier business, while still incredibly profitable, was bound to be disrupted eventually. At the same time, it was becoming clear that computer technology was advancing quickly and, someday, would revolutionize how we worked. PARC was created to prepare for that trigger.

The number of groundbreaking technologies created at PARC is astounding: the graphical user interface, networked computing, object-oriented programming, the list goes on. Virtually everything that we came to know as “personal computing” had its roots in the work done at PARC in the 1970s.

Most of all, PARC saved Xerox. The laser printer invented there would bring in billions and, eventually, largely replace the copier business. Some technologies were spun off into new companies, such as Adobe and 3Com, with an equity stake going to Xerox. And, of course, the company even made a tidy profit off the Macintosh, because of the equity stake that gave Jobs access to the technology in the first place.

Transforming An Obstacle Into A Design Constraint

The hardest thing about change is that, typically, most people don’t want it. If they did, it would already have been accepted as the normal state of affairs. That can make transformation a lonely business. The status quo has inertia on its side and never yields its power gracefully. The path for an aspiring changemaker can be heartbreaking and soul crushing.

Many would see the near-certainty that Milošević would try to steal the election as an excuse to do nothing. Most people would look at the almost impossibly corrupt Yanukovych regime and see the idea of devoting your life to anti-corruption reforms as quixotic folly. It is extremely rare for a CEO whose firm dominates an industry to ask, “What comes after?”

Yet anything can happen and often does. Circumstances conspire. Events converge. Round-hole businesses meet their square-peg world. We can’t predict exactly when or where or how or what will happen, but we know that everybody and everything gets disrupted eventually. It’s all just a matter of time.

When that happens resistance to change temporarily abates. So there’s lots to do and no time to wait. We need to empower our allies, as well as listen to our adversaries. We need to build out a network to connect to others who are sympathetic to our cause. Transformational change is always driven by small groups, loosely connected, but united by a common purpose.

Most of all, we need to prepare. A trigger always comes and, when it does, it brings great opportunity with it.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay
