
GUEST POST from Greg Satell
It’s become strangely fashionable for digerati to mourn the death of innovation. “There’s nothing new,” goes the common refrain, for which they blame venture capitalists, entrepreneurs and other digerati they consider less enlightened than themselves. They yearn for a lost age when things were better and more innovative.
What they fail to recognize is that the digital era is ending. After more than 50 years of exponential growth, the technology has matured and advancement has naturally slowed. While it is true that there are worrying signs that things in Silicon Valley have gone seriously awry and those excesses need to be curtailed, there’s more to the story.
The fact is that we’re on the brink of a new era of innovation and, while digital technology will be an enabling factor, it will no longer be center stage. The future will not be written in the digital language of ones and zeroes, but in that of atoms, molecules, genes and proteins. We do not lack potential or possibility; what we need is more imagination and wonder.
The End Of Moore’s Law
In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper predicting that computing power would double about every two years. This idea, known as Moore’s Law, has driven the digital revolution for half a century. It’s what has empowered us to shrink computers from huge machines to tiny, but powerful, devices we carry in our pockets.
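To get a feel for what that compounding implies, here is a minimal sketch in Python, assuming an idealized two-year doubling period that real hardware only ever approximated: fifty years of doublings works out to roughly a 33-million-fold improvement.

```python
# Rough illustration of Moore's Law compounding.
# Assumes an idealized doubling period of exactly two years;
# actual hardware progress only approximated this schedule.

def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Cumulative improvement factor after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

if __name__ == "__main__":
    for years in (10, 25, 50):
        print(f"After {years} years: ~{moores_law_factor(years):,.0f}x")
    # After 50 years: ~33,554,432x, roughly the leap from room-sized
    # machines to the phones in our pockets.
```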
Yet there are limits to everything. The simple truth is that atoms are only so small and the speed of light is only so fast. That puts a limit on how many transistors we can cram onto a silicon wafer and how fast electrons can zip around the logic gates we set up for them. At this point, Moore’s Law is effectively over.
That doesn’t mean that advancement will stop altogether. There are other ways to speed up computing. The problem is that they all come with tradeoffs. New architectures, such as quantum and neuromorphic computing, require new programming languages, new logical approaches and very different algorithmic strategies from those we’re used to.
So for the next decade or two we’re likely to see a heterogeneous computing environment emerge, in which we combine different architectures for different tasks. For example, we will be augmenting traditional AI systems with techniques like quantum machine learning. It is not only possible, but fairly likely, that these types of combinations will result in an exponential increase in capability.
A Biological Revolution
Moore’s Law has essentially become shorthand for exponential improvement in any field. Anytime we see a continuous doubling of efficiency, we call it “the Moore’s Law of X.” Yet since the Human Genome Project was completed in 2003, advancement in genetic sequencing has far outpaced what has happened in the digital arena.
Possibly an even bigger development came in 2012, when Jennifer Doudna and her colleagues discovered how CRISPR could revolutionize gene editing. Suddenly, genetic engineering work that would have taken weeks could be done in hours, at a fraction of the cost and with much greater accuracy, and the era of synthetic biology had begun.
The most obvious consequence of this new era is the Covid-19 vaccine, which was designed in a matter of days rather than the years such work has traditionally taken. The mRNA technology used to create two of the vaccines also holds promise for cancer treatment, and CRISPR-based approaches have been applied to treat sickle cell disease and other conditions.
Yet as impressive as the medical achievements are, they make up only a fraction of the innovation that synthetic biology is making possible. Scientists are working on programming microorganisms to create new carbon-neutral biofuels and biodegradable plastics. The technology may very well revolutionize agriculture and help feed the world.
The truth is that the biological revolution is roughly where computing was in the 1970s and 80s, and we are just beginning to understand its potential. We can expect progress to accelerate for decades to come.
The Infinite World Of Atoms
Anyone who has regularly read the business press over the past 20 years or so would naturally conclude that we live in a digital economy. Certainly, tech firms dominate any list of the world’s most valuable companies. Yet take a closer look and you will find that information and communication technology makes up only about 6% of GDP in advanced economies.
The truth is that we still live very much in a world of atoms and we spend most of our money on what we eat, wear, ride and live in. Any real improvement in our well-being depends on our ability to shape atoms to our liking. As noted above, reprogramming genetic material in cells to make things for us is one way we can do that, but not the only one.
In fact, there is a revolution in materials science underway. Much as in genomics, scientists are learning how to use computers to understand materials on a fundamental level and to design them far better. In some cases, researchers are able to discover new materials hundreds of times more efficiently than before.
Unlike the digital and biological technologies, this is largely a quiet revolution with very little publicity. Make no mistake, however: our newfound ability to create advanced materials will transform our capacity to build everything from vastly more efficient solar panels to lighter, stronger and more environmentally friendly building materials.
The Next Big Thing Always Starts Out Looking Like Nothing At All
The origins of digital computing can be traced back at least a century, to the rise and fall of logical positivism, Turing’s “machine,” the invention of the transistor, the integrated circuit and the emergence of the first modern PC at Xerox PARC in the early 1970s. Yet there wasn’t a measurable impact from computing until the mid-1990s.
We tend to assume that we’ll notice when something important is afoot, but that’s rarely the case. The truth is that the next big thing always starts out looking like nothing at all. It doesn’t appear fully bloomed; it usually incubates for years, and often decades, with scientists quietly working in labs and specialists debating at obscure conferences.
So, yes, after 50 years the digital revolution has run out of steam, but that shouldn’t blind us to the incredible opportunities before us. After all, a year ago very few people had heard of mRNA vaccines, but that didn’t make them any less powerful or important. There is no shortage of nascent technologies that can have just as big an impact.
The simple fact is that innovation is not, and never has been, about what kind of apps show up on our smartphone screens. The value of a technology is not measured in how a Silicon Valley CEO can dazzle an audience on stage, but in our capacity to solve meaningful problems and, as long as there are meaningful problems to solve, innovation will live on.
— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

The world of innovation is full of the names of famous partnerships — Steve Jobs and Steve Wozniak, Bill Hewlett and Dave Packard, Sergey Brin and Larry Page to name but a few. But Bottger and von Tschirnhaus isn’t a combination which springs easily to the lips or off the tongue. Yet it was this unlikely partnership which managed the impossible — between them they were able to transmute base material into weisses Gold — white gold.
Truth was, the merchants didn’t know much about porcelain and neither did the Chinese who supplied them. Marco Polo’s best guess at its origins? “The dishes are made of a crumbly earth or clay which is dug as though from a mine and stacked in huge mounds and then left for thirty or forty years exposed to wind, rain, and sun … by this time the earth is so refined that dishes made of it are of an azure tint with a very brilliant sheen.” The assumption that this was somehow magical can be seen in an account from 1550 suggesting that “porcelain is … made of a certain juice which coalesces underground …”
Which is where another important piece of the puzzle comes in. Rather than develop production on a large scale to make porcelain a commodity product, Bottger (with Augustus’s backing and the profits from early sales) began to add design into the mix. He commissioned artists to create a range of exquisite artefacts exploiting the potential of the new material, opening up a wealthy market niche to continue to fund his development work. Meissen porcelain figures were used to decorate the drawing rooms of great houses, sculptures took pride of place in entrance halls and even the more mundane business of eating and drinking became a pleasure when done with beautifully crafted plates, cups, pots and jugs. What Bottger did was essentially create a luxury brand.

In today’s terms we’d talk perhaps of a rather weak ‘appropriability regime’ — it was hard to keep the lid on what was going on. Samuel Stöltzel was a senior arcanist at Meissen, one of the few who understood the secrets (the ‘arcana’) of making the hard porcelain for which the company had become famous. But (for a suitably high price) he was persuaded to sell these to a competing venture which, in 1717, started to produce porcelain in Vienna. By 1760 there were over thirty porcelain factories in Europe.
Faced with the challenge of increasing imitation, Augustus’s team set about differentiating themselves in other ways. They built a brand, drawing on the relationships they had already made and the values they and their product stood for: purity, exquisite design and high quality at a premium price. To make sure they got the message across they employed a trademark, the crossed swords of the Meissen brand, which can still be found on their ware today, three hundred years on.
Building a business out of an idea and moving to scale needs a system; inside, there are many pieces of the jigsaw puzzle to be put in place. As well as commissioning designers to imagine the products, the Meissen team also had to continue their hard work on process technology in order to manufacture them. All the different stages, such as moulding, shaping, painting, glazing and firing, needed to move from manual operations to controlled and systematic processes. Beyond that there were scaling challenges around procuring raw materials of the right quality and, downstream, around developing sales and distribution networks.

