GUEST POST from Greg Satell
It’s become strangely fashionable for digerati to mourn the death of innovation. “There’s nothing new” has become a common refrain, and they blame venture capitalists, entrepreneurs and other digerati they consider less enlightened than themselves. They yearn for a lost age when things were better and more innovative.
What they fail to recognize is that the digital era is ending. After more than 50 years of exponential growth, the technology has matured and advancement has naturally slowed. While it is true that there are worrying signs that things in Silicon Valley have gone seriously awry and those excesses need to be curtailed, there’s more to the story.
The fact is that we’re on the brink of a new era of innovation and, while digital technology will be an enabling factor, it will no longer be center stage. The future will not be written in the digital language of ones and zeroes, but in that of atoms, molecules, genes and proteins. We do not lack potential or possibility; what we need is more imagination and wonder.
The End Of Moore’s Law
In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper which predicted that computing power would double about every two years. This idea, known as Moore’s Law, has driven the digital revolution for half a century. It is what has empowered us to shrink computers from huge machines to the tiny but powerful devices we carry in our pockets.
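To get a feel for how that compounding adds up, here is a minimal sketch in Python; the two-year doubling period is the only figure taken from Moore’s prediction, and the 50-year horizon simply mirrors the span discussed here:

```python
# Minimal sketch: how doubling every two years compounds over five decades.
# Only the two-year doubling period comes from Moore's prediction; the
# 50-year horizon is illustrative.
DOUBLING_PERIOD_YEARS = 2
HORIZON_YEARS = 50

doublings = HORIZON_YEARS // DOUBLING_PERIOD_YEARS   # 25 doublings in 50 years
growth_factor = 2 ** doublings                        # 2**25 = 33,554,432

print(f"{doublings} doublings -> roughly a {growth_factor:,}x gain in capability")
```

Twenty-five doublings multiply capability by more than 33 million, which is why a pocket-sized phone can outperform the room-sized machines of the 1960s.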
Yet there are limits to everything. The simple truth is that atoms are only so small and the speed of light is only so fast. That puts a limit on how many transistors we can cram onto a silicon wafer and how fast electrons can zip around the logic gates we set up for them. At this point, Moore’s Law is effectively over.
That doesn’t mean that advancement will stop altogether. There are other ways to speed up computing. The problem is that they all come with tradeoffs. New architectures, such as quantum and neuromorphic computing, for instance, require new programming languages, new logical approaches and very different algorithmic strategies than we’re used to.
So for the next decade or two we’re likely to see a heterogeneous computing environment emerge, in which we combine different architectures for different tasks. For example, we will be augmenting traditional AI systems with techniques like quantum machine learning. It is not only possible, but fairly likely, that these types of combinations will result in an exponential increase in capability.
A Biological Revolution
Moore’s Law has essentially become shorthand for exponential improvement in any field. Anytime we see a continuous doubling of efficiency, we call it “the Moore’s Law of X.” Yet since the Human Genome Project was completed in 2003, advancement in genetic sequencing has far outpaced what has happened in the digital arena.
Possibly an even bigger development occurred in 2012, when Jennifer Doudna and her colleagues discovered how CRISPR could revolutionize gene editing. Suddenly, work by genetic engineers that would have taken weeks could be done in hours, at a fraction of the cost and with much greater accuracy. The era of synthetic biology had begun.
The most obvious consequence of this new era is the Covid-19 vaccine, which was designed in a matter of days rather than the years it has traditionally taken. The mRNA technology used to create two of the vaccines also holds promise for cancer treatment, and CRISPR-based approaches have been applied to cure sickle cell disease and other conditions.
Yet as impressive as the medical achievements are, they make up only a fraction of the innovation that synthetic biology is making possible. Scientists are working on programming microorganisms to create new carbon-neutral biofuels and biodegradable plastics. Synthetic biology may very well revolutionize agriculture and help feed the world.
The truth is that the biological revolution is roughly where computers were in the 1970s and 80s, and we are just beginning to understand its potential. We can expect progress to accelerate for decades to come.
The Infinite World Of Atoms
Anyone who has regularly read the business press over the past 20 years or so would naturally conclude that we live in a digital economy. Certainly, tech firms dominate any list of the world’s most valuable companies. Yet take a closer look and you will find that information and communication as a sector makes up only 6% of GDP in advanced countries.
The truth is that we still live very much in a world of atoms and we spend most of our money on what we eat, wear, ride and live in. Any real improvement in our well-being depends on our ability to shape atoms to our liking. As noted above, reprogramming genetic material in cells to make things for us is one way we can do that, but not the only one.
In fact, there is a revolution in materials science underway. Much as in genomics, scientists are learning how to use computers to understand materials at a fundamental level and figure out how to design far better ones. In some cases, researchers are able to discover new materials hundreds of times more efficiently than before.
Unlike the digital and biological revolutions, this is largely a quiet one that has received very little publicity. Make no mistake, however: our newfound ability to create advanced materials will transform our ability to build everything from vastly more efficient solar panels to lighter, stronger and more environmentally friendly building materials.
The Next Big Thing Always Starts Out Looking Like Nothing At All
The origins of digital computing can be traced back at least a century, to the rise and fall of logical positivism, Turing’s “machine,” the invention of the transistor, the integrated circuit and the emergence of the first modern PC at Xerox PARC in the early 1970s. Yet there wasn’t a measurable impact from computing until the mid-1990s.
We tend to assume that we’ll notice when something important is afoot, but that’s rarely the case. The truth is that the next big thing always starts out looking like nothing at all. It doesn’t arrive fully bloomed; it usually incubates for years, and often decades, as scientists quietly work in labs and specialists debate at obscure conferences.
So, yes, after 50 years the digital revolution has run out of steam, but that shouldn’t blind us to the incredible opportunities before us. After all, a year ago very few people had heard of mRNA vaccines, but that didn’t make them any less powerful or important. There is no shortage of nascent technologies that can have just as big an impact.
The simple fact is that innovation is not, and never has been, about what kind of apps show up on our smartphone screens. The value of a technology is measured not in how well a Silicon Valley CEO can dazzle an audience on stage, but in our capacity to solve meaningful problems, and as long as there are meaningful problems to solve, innovation will live on.
— Article courtesy of the Digital Tonto blog