A New Age Of Innovation and Our Next Steps

GUEST POST from Greg Satell

In Mapping Innovation, I wrote that innovation is never a single event, but a process of discovery, engineering and transformation, and that those three things hardly ever happen at the same time or in the same place. Clearly, the Covid-19 pandemic was an inflection point that demarcated several important shifts in those phases.

Digital technology showed itself to be transformative, as we descended into quarantine and found an entire world of video conferencing and other technologies that we scarcely knew existed. At the same time, it was revealed that the engineering of synthetic biology—and mRNA technology in particular—was more advanced than we had thought.

This is just the beginning. I titled the last chapter of my book, “A New Era of Innovation,” because it had become clear that we had begun to cross a new Rubicon, one in which digital technology becomes so ordinary and mundane that it’s hard to remember what life was like without it, while new possibilities alter our existence to an extent we can scarcely imagine.

Post-Digital Architectures

For the past 50 years, the computer industry—and information technology in general—has been driven by the principle known as Moore’s Law, which held that the number of transistors on a chip doubles roughly every two years. Yet now Moore’s Law is ending, and that means we will have to revisit some very basic assumptions about how technology works.

To be clear, the end of Moore’s Law does not mean the end of advancement. There are a number of ways we can speed up computing. We can, for instance, use ASICs and FPGAs to optimize chips for specialized tasks. Still, those approaches come with tradeoffs; Moore’s Law essentially gave us innovation for free.

Another way out of the Moore’s Law conundrum is to shift to completely new architectures, such as quantum, neuromorphic and, possibly, biological computers. Yet here again, the transition will not be seamless or without tradeoffs. Instead of technology based on transistors, we will have multiple architectures based on entirely different logical principles.

So it seems that we will soon enter a new era of heterogeneous computing, in which we use digital technology to access different architectures suited to different tasks. Each of these architectures will require very different programming languages and algorithmic approaches and, most likely, different teams of specialists to work on them.

What that means is that those who run the IT operations in the future, whether that person is a vaunted CTO or a lowly IT manager, will be unlikely to understand more than a small part of the system. They will have to rely heavily on the expertise of others to an extent that isn’t required today.

Bits Driving Atoms

While the digital revolution does appear to be slowing down, computers have taken on a new role in helping to empower technologies in other fields, such as synthetic biology, materials science and Manufacturing 4.0. These, unlike so many digital technologies, are rooted in the physical world and may have the potential to be far more impactful.

Consider the revolutionary mRNA technology, which not only empowered us to develop Covid vaccines in record time, helping to pull the planet out of a deadly pandemic, but also makes it possible to design new vaccine candidates in a matter of hours. There is no way we could achieve this without powerful computers driving the process.

There is similar potential in materials discovery. Suffice it to say, every product we use, whether it is a car, a house, a solar panel or whatever, depends on the properties of materials to perform its function. Some need to be strong and light, while others need special electrical properties. Powerful computers and machine learning algorithms can vastly improve our ability to discover better materials (not to mention overcome supply chain disruptions).

Make no mistake, this new era of innovation will be one of atoms, not bits. The challenge we face now is to develop computer scientists who can work effectively with biologists, chemists, factory managers and experts of all kinds to truly create a new future.

Creation And Destruction

The term creative destruction has become so ingrained in our culture that we scarcely stop to think where it came from. It was popularized by the economist Joseph Schumpeter to resolve what many saw as an essential “contradiction” of capitalism. Essentially, some thought that if capitalists did their jobs well, there would be an ever-increasing surplus value, which would be appropriated to accumulate power and rig the system further in the capitalists’ favor.

Schumpeter pointed out that this wasn’t necessarily true, because of technological innovation. Railroads, for example, completely changed the contours of competition in the American Midwest. To be sure, there had been unfair competition in many cities and towns, but once the railroad came to town, competition flourished (and if it didn’t come, the town died).

For most of history since the beginning of the Industrial Revolution, this has been a happy story. Technological innovation displaced businesses and workers, but resulted in increased productivity which led to more prosperity and entirely new industries. This cycle of creation and destruction has, for the most part, been a virtuous one.

That is, until fairly recently. Digital technology, despite the hype, hasn’t produced the kind of productivity gains that earlier technologies, such as electricity and the internal combustion engine, did, yet it has displaced labor at a faster rate. Put simply, the productivity gains from digital technology are too meager to finance enough new industries with better jobs, which has created income inequality rather than greater prosperity.

We Need To Move From Disrupting Markets To Tackling Grand Challenges

There’s no doubt that digital technology has been highly disruptive. In industry after industry, from retail to media to travel and hospitality, nimble digital upstarts have turned established industries on their heads, completely changing the basis upon which firms compete. Many incumbents haven’t survived. Many others are greatly diminished.

Still, in many ways, the digital revolution has been a huge disappointment. Besides the meager productivity gains, we’ve seen a global rise in authoritarian populism, stagnant wages, reduced productivity growth and weaker competitive markets, not to mention an anxiety epidemic, increased obesity and, at least in the US, decreased life expectancy.

We can—and must—do better. We can learn from the mistakes we made during the digital revolution and shift our mindset from disrupting markets to tackling grand challenges. This new era of innovation will give us the ability to shape the world around us like never before, down to the molecular level, and to achieve incredible things.

Yet we can’t just leave our destiny to the whims of market and technological forces. We must actually choose the outcomes we prefer and build strategies to achieve them. The possibilities we unlock from new computing architectures, synthetic biology, advanced materials science, artificial intelligence and other fields will give us that power.

What we do with it is up to us.

Image credit: Pixabay


