The End of the Digital Revolution

Here’s What You Need to Know

GUEST POST from Greg Satell

The history of digital technology has largely been one of denial followed by disruption. First came the concept of the productivity paradox, which noted the limited economic impact of digital technology. When e-commerce appeared, many doubted that it could ever compete with physical retail. Similar doubts were voiced about digital media.

Today, it’s hard to find anyone who doesn’t believe in the power of digital technology. Whole industries have been disrupted. New applications driven by cloud computing, artificial intelligence and blockchain promise even greater advancement to come. Every business needs to race to adopt them in order to compete for the future.

Ironically, amid all this transformation the digital revolution itself is ending. Over the next decade, new computing architectures will move to the fore and advancements in areas like synthetic biology and materials science will reshape entire fields, such as healthcare, energy and manufacturing. Simply waiting to adapt won’t be enough. The time to prepare is now.

1. Drive Digital Transformation

As I explained in Mapping Innovation, innovation is never a single event, but a process of discovery, engineering and transformation. Clearly, with respect to digital technology, we are deep into the transformation phase. So the first part of any post-digital strategy is to accelerate digital transformation efforts in order to improve your competitive position.

One company that’s done this very well is Walmart. As an old-line incumbent in the physical retail industry, it appeared to be ripe for disruption as Amazon reshaped how customers purchased basic items. Why drive out to a Walmart store for a package of toothpaste when you can just click a few buttons on your phone?

Yet rather than ceding the market to Amazon, Walmart has invested heavily in digital technology and has achieved considerable success. It wasn't any one particular tactic or strategy that made the difference, but rather the acknowledgment that every single process needed to be reinvented for the digital age. For example, the company is using virtual reality to revolutionize how it does in-store training.

Perhaps most of all, leaders need to understand that digital transformation is human transformation. There is no shortage of capable vendors that can implement technology for you. What’s key, however, is to shift your culture, processes and business model to leverage digital capabilities.

2. Explore Post-Digital Technologies

While digital transformation is accelerating, advancement in the underlying technology is slowing down. Moore’s law, the consistent doubling of computer chip performance over the last 50 years, is nearing its theoretical limits. It has already slowed down considerably and will soon stop altogether. Yet there are non-digital technologies under development that will be far more powerful than anything we’ve ever seen before.

Consider Intel, which sees its future in what it calls heterogeneous computing: combining traditional digital chips with non-digital architectures, such as quantum and neuromorphic processors. A couple of years ago it announced its Pohoiki Beach neuromorphic system, which processes information up to 1,000 times faster and 10,000 times more efficiently than traditional chips for certain tasks.

IBM has created a network to develop quantum computing technology, which includes research labs, startups and companies that seek to be early adopters of the technology. Like neuromorphic computing, quantum systems have the potential to be thousands, if not millions, of times more powerful than today’s technology.

The problem with these post-digital architectures is that no one really knows how they are going to work. They operate on very different logic than traditional computers and will require new programming languages and algorithmic strategies. It's important to start exploring these technologies now or you could find yourself years behind the curve.

3. Focus on Atoms, Not Bits

The digital revolution created a virtual world. My generation was the first to grow up with video games, and our parents worried that we were becoming detached from reality. Then computers entered offices and Dan Bricklin created VisiCalc, the first spreadsheet program. Eventually smartphones and social media appeared and we began spending almost as much time in the virtual world as we did in the physical one.

Essentially, what we created was a simulation economy. We could experiment with business models in our computers, find flaws and fix them before they became real. Computer-aided design (CAD) software allowed us to design products in bits before we got down to the hard work of shaping atoms. Because it’s much cheaper to fail in the virtual world than the physical one, this made our economy much more efficient.

Yet the next great transformation will be from bits to atoms. Digital technology is creating revolutions in things like genomics and materials science. Artificial intelligence and cloud computing are reshaping fields like manufacturing and agriculture. Quantum and neuromorphic computing will accelerate these trends.

Much like those new computing architectures, the shift from bits to atoms will create challenges. Applying the simulation economy to the world of atoms will require new skills and we will need people with those skills to move from offices in urban areas to factory floors and fields. They will also need to learn to collaborate effectively with people in those industries.

4. Transformation is Always a Journey, Never a Destination

The 20th century was punctuated by two waves of disruption. The first, driven by electricity and internal combustion, transformed almost every facet of daily life and kicked off a 50-year boom in productivity. The second, driven by the microbe, the atom and the bit, transformed fields such as agriculture, healthcare and management.

Each of these technologies followed the pattern of discovery, engineering and transformation. The discovery phase takes place mostly out of sight, with researchers working quietly in anonymous labs. The engineering phase is riddled with errors, as firms struggle to shape abstract concepts into real products. A nascent technology is easy to ignore, because its impact hasn’t been felt yet.

The truth is that disruption doesn't begin with inventions, but when an ecosystem emerges to support them. That's when the transformation phase begins and takes us by surprise, because transformation never plays out like we think it will. The future will always be, to a certain extent, unpredictable for the simple reason that it hasn't happened yet.

Today, we’re on the brink of a new era of innovation that will be driven by new computing architectures, genomics, materials science and artificial intelligence. That’s why we need to design our organizations for transformation by shifting from vertical hierarchies to horizontal networks.

Most of all, we need to shift our mindsets from seeing transformation as a set of discrete objectives to seeing it as a continuous journey of discovery. Digital technology has been only one phase of that journey. The most exciting things are yet to come.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly

Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

A New Age Of Innovation and Our Next Steps

GUEST POST from Greg Satell

In Mapping Innovation, I wrote that innovation is never a single event, but a process of discovery, engineering and transformation, and that those three things hardly ever happen at the same time or in the same place. Clearly, the Covid-19 pandemic marked an inflection point, one that brought several important shifts in those phases.

Digital technology showed itself to be transformative, as we descended into quarantine and found an entire world of video conferencing and other technologies that we scarcely knew existed. At the same time it was revealed that the engineering of synthetic biology—and mRNA technology in particular—was more advanced than we had thought.

This is just the beginning. I titled the last chapter of my book, "A New Era of Innovation," because it had become clear that we had begun to cross a new Rubicon, in which digital technology becomes so ordinary and mundane that it's hard to remember what life was like without it, while new possibilities alter existence to such an extent that we will scarcely believe it.

Post-Digital Architectures

For the past 50 years, the computer industry, and information technology in general, has been driven by the principle known as Moore's Law, which held that the number of transistors on a chip would double roughly every 18 months. Yet now Moore's Law is ending, and that means we will have to revisit some very basic assumptions about how technology works.
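As a rough illustration (not from the original post), the cumulative effect of that doubling can be computed directly; the 18-month figure is the commonly cited version of the law:

```python
def moores_law_growth(years, doubling_period_years=1.5):
    """Multiplicative growth in transistor count after `years`,
    assuming a doubling every `doubling_period_years` (18 months)."""
    return 2 ** (years / doubling_period_years)

# Over a roughly 50-year run, the compounding is staggering:
print(f"{moores_law_growth(50):.2e}")  # on the order of 10^10
```

Compounding on that scale is why the end of Moore's Law matters so much: no incremental optimization can substitute for decades of automatic exponential improvement.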

To be clear, the end of Moore's Law does not mean the end of advancement. There are a number of ways we can speed up computing. We can, for instance, use technologies such as ASICs and FPGAs to optimize chips for specialized tasks. Still, those approaches come with tradeoffs; Moore's Law essentially gave us innovation for free.

Another way out of the Moore’s Law conundrum is to shift to completely new architectures, such as quantum, neuromorphic and, possibly, biological computers. Yet here again, the transition will not be seamless or without tradeoffs. Instead of technology based on transistors, we will have multiple architectures based on entirely different logical principles.

So it seems that we will soon be entering a new era of heterogeneous computing, in which we use digital technology to access different technologies suited to different tasks. Each of these technologies will require very different programming languages and algorithmic approaches and, most likely, different teams of specialists to work on them.
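To make the idea concrete, here is a deliberately simplified sketch of heterogeneous dispatch: a conventional host program routes each kind of task to whichever backend suits it. The backend names and task categories are invented for illustration, not real APIs:

```python
from typing import Callable, Dict

# Stand-ins for very different execution backends (illustrative only).
def run_on_cpu(task: str) -> str:
    return f"cpu:{task}"

def run_on_quantum(task: str) -> str:
    return f"quantum:{task}"

def run_on_neuromorphic(task: str) -> str:
    return f"neuromorphic:{task}"

# Each workload type maps to the architecture best suited to it.
BACKENDS: Dict[str, Callable[[str], str]] = {
    "linear_algebra": run_on_cpu,             # conventional digital workloads
    "optimization": run_on_quantum,           # sampling / annealing-style problems
    "pattern_matching": run_on_neuromorphic,  # spiking, event-driven workloads
}

def dispatch(task_type: str, task: str) -> str:
    # Fall back to the CPU when no specialized backend applies.
    return BACKENDS.get(task_type, run_on_cpu)(task)

print(dispatch("optimization", "route-planning"))  # quantum:route-planning
```

Even in this toy version, the organizational point is visible: each entry in the table implies a different programming model, and in practice a different team of specialists behind it.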

What that means is that those who run the IT operations in the future, whether that person is a vaunted CTO or a lowly IT manager, will be unlikely to understand more than a small part of the system. They will have to rely heavily on the expertise of others to an extent that isn’t required today.

Bits Driving Atoms

While the digital revolution does appear to be slowing down, computers have taken on a new role in helping to empower technologies in other fields, such as synthetic biology, materials science and manufacturing 4.0. These, unlike so many digital technologies, are rooted in the physical world and may have the potential to be far more impactful.

Consider the revolutionary mRNA technology, which not only empowered us to develop a Covid vaccine in record time and save the planet from a deadly pandemic, but also makes it possible to design new vaccines in a matter of hours. There is no way we could achieve this without powerful computers driving the process.

There is similar potential in materials discovery. Suffice it to say, every product we use, whether it is a car, a house, a solar panel or whatever, depends on the properties of materials to perform its function. Some need to be strong and light, while others need special electrical properties. Powerful computers and machine learning algorithms can vastly improve our ability to discover better materials (not to mention overcome supply chain disruptions).

Make no mistake, this new era of innovation will be one of atoms, not bits. The challenge we face now is to develop computer scientists who can work effectively with biologists, chemists, factory managers and experts of all kinds to truly create a new future.

Creation And Destruction

The term creative destruction has become so ingrained in our culture that we scarcely stop to think where it came from. It was popularized by economist Joseph Schumpeter to resolve what many saw as an essential "contradiction" of capitalism. Essentially, some thought that if capitalists did their jobs well, surplus value would keep increasing, and that surplus would be used to accumulate power and rig the system further in capitalists' favor.

Schumpeter pointed out that this wasn’t necessarily true because of technological innovation. Railroads, for example, completely changed the contours of competition in the American Midwest. Surely, there had been unfair competition in many cities and towns, but once the railroad came to town, competition flourished (and if it didn’t come, the town died).

For most of history since the beginning of the Industrial Revolution, this has been a happy story. Technological innovation displaced businesses and workers, but resulted in increased productivity which led to more prosperity and entirely new industries. This cycle of creation and destruction has, for the most part, been a virtuous one.

That is, until fairly recently. Digital technology, despite the hype, hasn't produced the kind of productivity gains that earlier technologies, such as electricity and internal combustion, did, yet it has displaced labor at a faster rate. Put simply, the productivity gains from digital technology are too meager to finance enough new industries with better jobs, which has fueled income inequality rather than greater prosperity.

We Need To Move From Disrupting Markets To Tackling Grand Challenges

There’s no doubt that digital technology has been highly disruptive. In industry after industry, from retail to media to travel and hospitality, nimble digital upstarts have set established industries on their head, completely changing the basis upon which firms compete. Many incumbents haven’t survived. Many others are greatly diminished.

Still, in many ways, the digital revolution has been a huge disappointment. Besides the meager productivity gains, we've seen a global rise in authoritarian populism, stagnant wages, reduced productivity growth and weaker competitive markets, not to mention an anxiety epidemic, increased obesity and, at least in the US, decreased life expectancy.

We can—and must—do better. We can learn from the mistakes we made during the digital revolution and shift our mindset from disrupting markets to tackling grand challenges. This new era of innovation will give us the ability to shape the world around us like never before, down to the molecular level, and achieve incredible things.

Yet we can’t just leave our destiny to the whims of market and technological forces. We must actually choose the outcomes we prefer and build strategies to achieve them. The possibilities that we will unlock from new computing architectures, synthetic biology, advanced materials science, artificial intelligence and other things will give us that power.

What we do with it is up to us.

Image credit: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly

Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.