Innovation is Combination

Silicon Valley’s Innovator’s Dilemma – The Atom, the Bit and the Gene

GUEST POST from Greg Satell

Over the past several decades, innovation has become largely synonymous with digital technology. When the topic of innovation comes up, somebody points to a company like Apple, Google or Meta rather than, say, a car company, a hotel or a restaurant. Management gurus wax poetic about the “Silicon Valley way.”

Of course, that doesn’t mean that other industries haven’t been innovative. In fact, there is no shortage of excellent examples of innovation in cars, hotels, restaurants and many other things. Still, the fact remains that for most of recent memory digital technology has moved further and faster than anything else.

This has been largely due to Moore’s Law, our ability to consistently double the number of transistors we’re able to cram onto a silicon wafer. Now, however, Moore’s Law is ending and we’re entering a new era of innovation. Our future will not be written in ones and zeros, but will be determined by our ability to use information to shape the physical world.

The Atom

The concept of the atom has been around at least since the time of the ancient Greek philosopher Democritus. Yet it didn’t take on any real significance until the early 20th century. In fact, Albert Einstein’s doctoral dissertation, along with his paper on Brownian motion, helped to establish the existence of atoms through statistical analysis.

Yet it was the other papers from Einstein’s miracle year of 1905 that transformed the atom from an abstract concept to a transformative force, maybe even the most transformative force in the 20th century. His theory of mass-energy equivalence would usher in the atomic age, while his work on black-body radiation would give rise to quantum mechanics and ideas so radical that even he would refuse to accept them.

Ironically, despite Einstein’s reluctance, quantum theory would lead to the development of the transistor and the rise of computers. These, in turn, would usher in the digital economy, which provided an alternative to the physical economy of goods and services based on things made from atoms and molecules.

Still, the vast majority of what we buy is made up of what we live in, ride in, eat and wear. In fact, information and communication technologies only make up about 6% of GDP in advanced countries, which is what makes the recent revolution in materials science so exciting. We’re beginning to exponentially improve how efficiently we design the materials that make up everything from solar panels to building materials.

The Bit

While the concept of the atom evolved slowly over millennia, the bit is one of the rare instances in which an idea seems to have arisen in the mind of a single person with little or no real precursor. Introduced by Claude Shannon in a paper in 1948—incidentally, the same year the transistor was invented—the bit has shaped how we see and interact with the world ever since.

The basic idea was that information isn’t a function of content, but of the absence of ambiguity, which can be broken down to a single unit – a choice between two alternatives. Much like a coin toss, which carries no information while in the air but takes on certainty when it lands, information arises when ambiguity disappears.

He called this unit a “binary digit,” or “bit,” and much like the pound, quart, meter or liter, it has become such a basic unit of measurement that it’s hard to imagine our modern world without it. Shannon’s work would soon combine with Alan Turing’s concept of a universal computer to create the digital computer.
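To make the idea concrete, here is a minimal sketch (not from the original article, using the standard definition of self-information) of how Shannon’s unit is counted: the information in an outcome is the number of binary choices it resolves, so a fair coin toss carries exactly one bit.

```python
import math

def bits_of_information(probability: float) -> float:
    """Bits gained when an outcome with the given probability actually occurs."""
    return -math.log2(probability)

# A fair coin toss resolves one choice between two equal alternatives: 1 bit.
print(bits_of_information(0.5))    # 1.0

# A rarer outcome removes more ambiguity, so it carries more bits.
print(bits_of_information(0.125))  # 3.0 -- equivalent to three coin tosses in a row
```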

Now the digital revolution is ending and we will soon be entering a heterogeneous computing environment that will include things like quantum, neuromorphic and biological computing. Still, Claude Shannon’s simple idea will remain central to how we understand how information interacts with the world it describes.

The Gene

The concept of the gene originated with an obscure Austrian monk named Gregor Mendel, but in one of those strange peculiarities of history, his work went almost totally unnoticed until the turn of the century. Even then, no one really knew what a gene was or how it functioned. The term was, for the most part, just an abstract concept.

That changed abruptly in 1953, when James Watson and Francis Crick published their article in the scientific journal Nature. In a single stroke, the pair were able to show that genes were, in fact, made up of a molecule called DNA and that they operated through a surprisingly simple code made up of A, T, C and G.

Things really began to kick into high gear when the Human Genome Project was completed in 2003. Since then the cost to sequence a genome has been falling faster than the rate of Moore’s Law, which has unleashed a flurry of innovation. Jennifer Doudna and Emmanuelle Charpentier’s 2012 breakthrough with CRISPR revolutionized our ability to edit genes. More recently, mRNA technology has helped develop COVID-19 vaccines in record time.

Today, we have entered a new era of synthetic biology in which we can manipulate the genetic code of A, T, C and G almost as easily as we can the bits in the machines that Turing imagined all those years ago. Researchers are also exploring how we can use genes to create advanced materials and maybe even better computers.

Innovation Is Combination

The similarity of the atom, the bit and the gene as elemental concepts is hard to miss and they’ve allowed us to understand our universe in a visceral, substantial way. Still, they arose in vastly different domains and have been largely applied to separate and distinct fields. In the future, however, we can expect vastly greater convergence between the three.

We’ve already seen glimpses of this. For example, as a graduate student, Charlie Bennett was a teaching assistant for James Watson. Yet in between his sessions instructing undergraduates in Watson’s work on genes, he took an elective course on the theory of computing in which he learned about the work of Shannon and Turing. That led him to go work for IBM and become a pioneer in quantum computing.

In much the same way, scientists are applying powerful computers to develop new materials and design genetic sequences. Some of these new materials will be used to create more powerful computers. In the future, we can expect the concepts of the atom, the bit and the gene to combine and recombine in exciting ways that we can only begin to imagine today.

The truth is that innovation is combination and always has been. The past few decades, in which one technology so thoroughly dominated that it was able to function largely in isolation from other fields, were an anomaly. What we are beginning to see now is, in large part, a reversion to the mean, where the most exciting work will be interdisciplinary.

This is Silicon Valley’s innovator’s dilemma. Nerdy young geeks will no longer be able to prosper coding blithely away in blissful isolation. It is no longer sufficient to work in bits alone. Increasingly we need to combine those bits with atoms and genes to create significant value. If you want to get a glimpse of the future, that’s where to look.

— Article courtesy of the Digital Tonto blog
— Image credits: Pixabay
