
Top 10 Human-Centered Change & Innovation Articles of April 2023

Drum roll please…

At the beginning of each month, we will profile the ten articles from the previous month that generated the most traffic to Human-Centered Change & Innovation. Did your favorite make the cut?

But enough delay, here are April’s ten most popular innovation posts:

  1. Rethinking Customer Journeys — by Geoffrey A. Moore
  2. What Have We Learned About Digital Transformation Thus Far? — by Geoffrey A. Moore
  3. Design Thinking Facilitator Guide — by Douglas Ferguson
  4. Building A Positive Team Culture — by David Burkus
  5. Questions Are More Powerful Than We Think — by Greg Satell
  6. 3 Examples of Why Innovation is a Leadership Problem — by Robyn Bolton
  7. How Has Innovation Changed Since the Pandemic? — by Robyn Bolton
  8. 5 Questions to Answer Before Spending $1 on Innovation — by Robyn Bolton
  9. Customers Care About the Destination Not the Journey — by Shep Hyken
  10. Get Ready for the Age of Acceleration — by Robert B. Tucker

BONUS – Here are five more strong articles published in March that continue to resonate with people:

If you’re not familiar with Human-Centered Change & Innovation, we publish 4-7 new articles every week built around innovation and transformation insights from our roster of contributing authors and ad hoc submissions from community members. Get the articles right in your Facebook, Twitter or LinkedIn feeds too!

Have something to contribute?

Human-Centered Change & Innovation is open to contributions from any and all innovation and transformation professionals out there (practitioners, professors, researchers, consultants, authors, etc.) who have valuable human-centered change and innovation insights to share with everyone for the greater good. If you’d like to contribute, please contact me.

P.S. Here are our Top 40 Innovation Bloggers lists from the last three years:


Our Innovation is All on Tape

Why Old Technologies Are Sometimes Still the Best Ones

GUEST POST from John Bessant

Close your eyes and imagine for a moment a computer room in the early days of the industry. Chances are you’ll picture large wardrobe-sized metal cabinets whirring away with white-coated attendants tending to the machines. And it won’t be long before your gaze lands on the ubiquitous spools of tape being loaded and unloaded.

Which might give us a smug feeling as we look at the storage options for our current generation of computers, probably based on some incredibly fast, high-capacity solid-state flash drive. It’s been quite a journey: the arc stretches back from the recent years of USB sticks and SD cards, through external HDDs and the wonderful world of floppy discs (getting larger and more rigid as we go back in time), to the clunky 1980s when our home computers rode on cassette drives, right back to the prehistoric days when the high priests of minis and mainframes tended their storage flock of tapes.

Ancient history — except that the tape drive hasn’t gone away. In fact it’s alive and well and backing up our most precious memories. Look inside the huge data farms operated by Google, Apple, Amazon, Microsoft Azure or anyone else and you’ll find large computers — and lots of tape. Thousands of kilometres of it, containing everything from your precious family photos to email backups to data from research projects like the Large Hadron Collider.

It turns out that tape is still an incredibly reliable medium, and it has the considerable advantage of being cheap. The alternative would be buying lots of hard drives, something which matters more and more as the volume of data we are storing grows. Think about the internet of things: all those intelligent devices, whether security cameras or mobile phones, manufacturing performance data loggers or hospital diagnostic equipment, are generating data which needs secure long-term storage. We’ve long since moved past the era of measuring storage in kilobytes or megabytes; now we’re into zettabytes, each one the equivalent of roughly 250 billion DVDs. Estimates suggest we produced close to 59ZB of data in 2020, projected to rise to 175ZB by 2025! Fortunately IBM scientist Mark Lantz, an expert in storage, suggests that we can keep scaling tape, doubling capacity every 2.5 years for the next 20 years.
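To put those numbers in perspective, here is a quick back-of-the-envelope check in Python. It is only a sketch: it assumes a single-layer DVD holds about 4.7GB and that the doubling every 2.5 years compounds steadily over 20 years, neither of which is a figure from the article.

```python
# Rough arithmetic behind the storage figures above (assumptions noted inline).

ZETTABYTE = 10**21        # bytes in a zettabyte (decimal definition)
DVD_BYTES = 4.7 * 10**9   # assumed capacity of a single-layer DVD, in bytes

# How many DVDs fit in one zettabyte? (~213 billion with this DVD size)
print(f"1 ZB is roughly {ZETTABYTE / DVD_BYTES / 1e9:.0f} billion DVDs")

# Growth in annual data production, 2020 -> 2025 (59 ZB -> 175 ZB)
print(f"2020 to 2025 growth: {175 / 59:.1f}x")

# If tape capacity doubles every 2.5 years, over 20 years that compounds to:
doublings = 20 / 2.5
print(f"Capacity multiplier over 20 years: {2 ** doublings:.0f}x")  # 256x
```

With a 4.7GB DVD the first figure comes out closer to 213 billion than 250 billion; the headline number simply depends on the DVD capacity you assume.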

Plus tape offers a number of other advantages, not least in terms of security. Most of the time a tape cartridge is not plugged in to a computer and so is pretty immune to visiting viruses and malware.

In fact the market for magnetic tape storage is in robust health; it’s currently worth nearly $5bn and is expected to grow to double that size by 2030. Not bad for a technology coming up on its hundredth anniversary. Making all of this possible is, of course, our old friend innovation. It’s been a classic journey of incremental improvement, doing what we do but better, punctuated with the occasional breakthrough.

It started in 1877 when “Mary Had a Little Lamb” was recorded and played on Thomas Edison’s first experimental talking machine called a phonograph; the sounds were stored on wax cylinders and severely limited in capacity. The first tape recorder was developed in 1886 by Alexander Graham Bell in his labs using paper with beeswax coated on it. This patented approach never really took off because the sound reproduction was inferior to Edison’s wax cylinders.

Others soon explored alternatives; for example Franklin C. Goodale adapted movie film for analogue audio recording, receiving a patent for his invention in 1909. His film used a stylus to record and play back, essentially mimicking Edison’s approach but allowing for much more storage.

But in parallel with the wax-based approach another strand emerged in 1898, with the work of Voldemar Poulsen, a Danish scientist who built on an idea originally suggested ten years earlier by Oberlin Smith. This used the concept of a wire (which could be spooled) on which information was encoded magnetically. Poulsen’s model used cotton thread, steel sawdust and metal wire and was effectively the world’s first tape recorder; he called it a ‘telegraphone’.

Which brings us to another common innovation theme: convergence. If we fast forward (itself a term which originated in the world of tape recording!) to the 1930s we can see these two strands come together; German scientists working for the giant BASF company built on a patent registered to Fritz Pfleumer in 1928. They developed a magnetic tape, metal oxide coated onto a plastic base, which could be used for recording sound on a commercial basis; in 1934 they delivered the first 50,000 metres of it to the electronics corporation AEG.

The big advantage of magnetic recording was that it didn’t rely on a physical analogue being etched into wax or other medium; instead the patterns could be encoded and read as electrical signals. It wasn’t long before tape recording took over as the dominant design — and one of the early entrants was the 3M company in the USA. They had a long history of coating surfaces with particles, having begun life making sandpaper and moved on to create a successful business out of first adhesive masking tape and then the ubiquitous Scotch tape. Coating metal oxide on to tape was an obvious move and they quickly became a key player in the industry.

Innovation is always about the interplay between needs and means and the tape recording business received a fillip from the growing radio industry in the 1940s. Tape offered to simplify and speed up the recording process and an early fan was Bing Crosby. He’d become fed up with the heavy schedule of live broadcasting which kept him away from his beloved golf course and so was drawn to the idea of pre-recording his shows. But the early disc-based technology wasn’t really up to the task, filled with hisses and scratches and poor sound quality. Crosby’s sound engineer had come across the idea of tape recording and worked with 3M to refine the technology.

The very first radio show, anywhere in the world, to be recorded directly on magnetic tape was broadcast on 1 October 1947, featuring Crosby. It not only opened up a profitable line of new business for 3M, it also did its bit for changing the way the world consumed entertainment, be it drama, music hall or news. (It was also a shrewd investment for Crosby, who became one of the emerging industry’s backers.)

Which brings us to another kind of innovation interplay, this time between different approaches being taken in the worlds of consumer entertainment and industrial computing. Ever since Marconi, Tesla and others had worked on radio there had been a growing interest in consumer applications which could exploit the technology. And with the grandchildren of Edison’s phonograph, and the work on television in the 1940s, the home became an increasingly interesting space for electronics entrepreneurs.

But as the domestic market for fixed appliances grew saturated, the search began for mobile solutions. Portability became an important driver for the industry and gave rise to the transistor radio; it wasn’t long before the in-car entertainment market began to take off. An early entrant from the tape playback side was the 8-track cartridge of the mid-1960s, which allowed you to listen to your favorite tracks without lugging a portable gramophone with you. Philips’ development of the compact cassette (and its free licensing of the idea to promote rapid and widespread adoption) led to an explosion in demand (over 100 billion cassette tapes were eventually sold worldwide) and eventually to the Walkman, the first portable personal device for recorded music.

Without which we’d be a little less satisfied. Specifically, we’d never have been introduced to one of the Rolling Stones’ greatest hits; as guitarist Keith Richards explained in his 2010 autobiography:

“I wrote the song ‘Satisfaction’ in my sleep. I didn’t know at all that I had recorded it; the song only exists, thank God, because of the little Philips cassette recorder. I looked at it in the morning — I knew I had put a new tape in the night before — but it was at the very end. Apparently, I had recorded something. I rewound and then ‘Satisfaction’ sounded … and then 40 minutes of snoring!”

Meanwhile back in the emerging computer industry of the 1950s there was a growing demand for storage media for which magnetic tape seemed well suited. Cue the images we imagined in the opening paragraph, acolytes dutifully tending the vast mainframe machines.

Early computers had used punched cards and then paper tape but these soon reached the limit of their usefulness; instead the industry began exploring magnetic audio tape.

IBM’s team under the leadership of Wayne Winger developed digital tape-based storage; of particular importance was finding ways to encode the 1s and 0s of binary patterns onto the tape. They introduced their commercial digital tape drive in 1952, and it could store what was (for its time) an impressive 2MB of data on a reel.

Not everyone was convinced; as Winger recalled, “A white-haired IBM veteran in Poughkeepsie pulled a few of us aside and told us, ‘You young fellows remember, IBM was built on punched cards, and our foundation will always be punched cards.’” Fortunately Tom Watson Jr., son of the company founder, became a champion and the project went ahead.

But while tape dominated in the short term another parallel trajectory was soon established, replacing tapes and reels with disc drives whose big advantage was the ability to randomly access data rather than wait for the tape to arrive at the right place on the playback head. IBM once again led the way with its launch in 1956 of the hard disc drive and began a steady stream of innovation in which storage volumes and density increased while the size decreased. The landscape moved through various generations of external drives until the advent of personal computers where the drives migrated inside the box and became increasingly small (and floppy).

These developments were taken up by the consumer electronics industry with the growing use of discs as an alternative recording and playback medium, spanning various formats but also decreasing in size. Which of course opened the way for more portability, with Sony and Sharp launching MiniDisc players in the early 1990s.

All good news for the personal audio experience but less so for the rapidly expanding information technology industry. While new media storage technology continued to improve, it came at a cost, and with the exponential increase in volumes of data needing to be stored came a renewed interest in alternative (and cheaper) solutions. The road was leading back to good old-fashioned tape.

Its potential lies in the long-term storage and retrieval of so-called ‘cold data’. Most of what is stored in the cloud today is this kind: images, emails, all sorts of backup files. And while these need to be kept around, they don’t have to be accessed instantly. That’s where tape has come back into its own. Today’s tapes have moved on somewhat from IBM’s 1952 version with its 2MB of capacity. They are smaller on the outside but their capacity has grown enormously: a cartridge can now hold 20TB, or around 60TB compressed, a roughly 10 millionfold increase in 70 years. The tapes are not wound by hand onto reels but loaded into cartridges, each of which holds around a kilometre of tape; companies use libraries containing tens of thousands of these cartridges, which can be mounted via automated systems deploying robots. It takes around 90 seconds to locate a cartridge and load the tape, so you could be forgiven for thinking that it’s a bit slow compared to your flash drive, which has an access time measured in milliseconds.
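Those comparisons are easy to sanity-check. The short sketch below assumes decimal units (1TB = 10^12 bytes) and a representative flash access time of about 0.1 milliseconds; both are illustrative assumptions rather than figures from the article.

```python
# Sanity check of the capacity jump and the access-time gap described above.

MB = 10**6
TB = 10**12

reel_1952 = 2 * MB          # IBM's 1952 reel, as quoted
cartridge_today = 20 * TB   # a modern cartridge (native capacity), as quoted

print(f"Capacity increase: {cartridge_today / reel_1952:,.0f}x")  # 10,000,000x

tape_fetch_seconds = 90       # robot locates a cartridge, mounts and loads the tape
flash_access_seconds = 1e-4   # ~0.1 ms, an assumed typical flash access time

print(f"Time to first byte: tape is ~{tape_fetch_seconds / flash_access_seconds:,.0f}x slower")
```

Slow to the first byte, that is; once a tape is streaming, its sequential read speeds are perfectly respectable, which is exactly why it suits cold data.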

There’s a pattern here — established and once important technologies giving way to the new kids on the block with their apparently superior performance. We’ve learned that we shouldn’t necessarily write the old technologies off — at the minimum there is often a niche for them amongst enthusiasts. Think about vinyl, about the anti-mp3 backlash from hi-fi fans or more recently photography using film and plates rather than their digital counterparts.

But it’s more than just nostalgia which drives this persistence of the old. Sometimes — like our magnetic tape — there are performance features which are worth holding on to — trading speed for security and lower storage cost, for example. Sometimes there is a particular performance niche which the new technology cannot enter competitively — for example the persistence of fax machines in healthcare where they offer a secure and reliable way of transmitting sensitive information. At the limit we might argue that neither cash nor physical books are as ‘good’ as their digital rivals but their persistence points to other attributes which people continue to find valuable.

And sometimes it is about the underlying accumulated knowledge which the old technology represents, and which might be redeployed to advantage in a different field. Think of Fujifilm’s resurgence as a cosmetics and pharmaceuticals company on the back of its deep knowledge of emulsions and coatings, technologies it originally mastered in the now largely disappeared world of film photography. Or Kodak’s ability to offer high-speed, high-quality printing on the back of knowledge it originally acquired in that same old industry: how to accurately spray and target millions of droplets onto a surface. And it was 3M’s deep understanding of how to coat materials onto tape, gained originally from selling masking tape to the paint shops of Detroit, which helped it move so effectively into the field of magnetic tape.

Keeping these technologies alive isn’t about putting them on life support; as the IBM example demonstrates, it needs a commitment to incremental innovation, driving and optimising performance. And there’s still room for breakthroughs within those trajectories; in the case of magnetic tape storage it came in 2010 in the form of the Linear Tape File System (LTFS) open standard. This allowed tape drives to emulate the random-access capabilities of their hard disc competitors, using metadata about the location of data stored on the tapes.
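To make the LTFS idea concrete, here is a toy sketch in Python. It is not the real LTFS format, just an illustration of the principle: an index of metadata maps each file to its position on the tape, so the drive can wind straight to it instead of reading sequentially from the start. The file names and block numbers are invented for the example.

```python
# Toy illustration of the LTFS principle: metadata kept in an index makes a
# sequential medium look, from the user's point of view, like a file system.

from dataclasses import dataclass

@dataclass
class TapeExtent:
    start_block: int    # where the file's data begins on the tape
    block_count: int    # how many blocks it occupies

# The "index partition": file path -> location of its data on the data partition
index: dict[str, TapeExtent] = {
    "backups/photos-2023.tar": TapeExtent(start_block=12_000, block_count=850),
    "research/lhc-run-42.dat": TapeExtent(start_block=98_500, block_count=40_000),
}

def locate(path: str) -> TapeExtent:
    """Find where a file lives without scanning the whole tape."""
    return index[path]

print(locate("research/lhc-run-42.dat"))
```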

Whichever way you look at it there’s a need for innovation, whether bringing a breakthrough to an existing field or helping sustain a particular niche for the long haul. And we shouldn’t be too quick to write off ‘old’ technologies as new ones emerge which appear superior. It’s worth remembering that the arrival of the steamship didn’t wipe out the shipyards building sailing ships around the world; it actually spurred them on to a golden era of performance improvement which it took steamships a long time to catch up with.

So, there’s often a lot of life left in old dogs, especially when we can teach them some new innovative tricks.

You can find a podcast version of this here and a video version here

And if you’d like to learn with me take a look at my online course here


Top 10 Human-Centered Change & Innovation Articles of February 2023

Drum roll please…

At the beginning of each month, we will profile the ten articles from the previous month that generated the most traffic to Human-Centered Change & Innovation. Did your favorite make the cut?

But enough delay, here are February’s ten most popular innovation posts:

  1. Latest Innovation Management Research Revealed — by Braden Kelley
  2. Apple Watch Must Die (At least temporarily, because it’s proven bad for innovation) — by Braden Kelley
  3. Unlock Hundreds of Ideas by Doing This One Thing (Inspired by Hollywood) — by Robyn Bolton
  4. Using Limits to Become Limitless — by Rachel Audige
  5. Kickstarting Change and Innovation in Uncertain Times — by Janet Sernack
  6. Five Challenges All Teams Face — by David Burkus
  7. A Guide to Harnessing the Power of Foresight (Unlock Your Company’s Full Potential) — by Teresa Spangler
  8. Creating Great Change, Transformation and Innovation Teams — by Stefan Lindegaard
  9. The Ultimate Guide to the Phase-Gate Process — by Dainora Jociute
  10. Delivering Innovation (How the History of Mail Order Can Help Us Manage Innovation at Scale) — by John Bessant

BONUS – Here are five more strong articles published in January that continue to resonate with people:

If you’re not familiar with Human-Centered Change & Innovation, we publish 4-7 new articles every week built around innovation and transformation insights from our roster of contributing authors and ad hoc submissions from community members. Get the articles right in your Facebook, Twitter or LinkedIn feeds too!

Have something to contribute?

Human-Centered Change & Innovation is open to contributions from any and all innovation and transformation professionals out there (practitioners, professors, researchers, consultants, authors, etc.) who have valuable human-centered change and innovation insights to share with everyone for the greater good. If you’d like to contribute, please contact me.

P.S. Here are our Top 40 Innovation Bloggers lists from the last three years:


Hard Facts Are a Hard Thing

GUEST POST from Greg Satell

In 1977, Ken Olsen, the founder and CEO of Digital Equipment Corporation, reportedly said, “There is no reason for any individual to have a computer in his home.” It was an amazingly foolish thing to say and, ever since, observers have pointed to Olsen’s comment to show how supposed experts can be wildly wrong.

The problem is that Olsen was misquoted. In fact, his company was actually in the business of selling personal computers and he had one in his own home. This happens more often than you would think. Other famous quotes, such as IBM CEO Thomas Watson predicting that there would be a global market for only five computers, are similarly false.

There is great fun in bashing experts, which is why so many inaccurate quotes get repeated so often. If the experts are always getting it wrong, then we are liberated from the constraints of expertise and the burden of evidence. That’s the hard thing about hard facts. They can be so elusive that it’s easy to doubt their existence. Yet they do exist and they matter.

The Search for Absolute Truth

In the early 20th century, science and technology emerged as a rising force in western society. The new wonders of electricity, automobiles and telecommunication were quickly shaping how people lived, worked and thought. Empirical verification, rather than theoretical musing, became the standard by which ideas were measured.

It was against this backdrop that Moritz Schlick formed the Vienna Circle, which became the center of the logical positivist movement and aimed to bring a more scientific approach to human thought. Throughout the 1920s and 1930s, the movement spread and became a symbol of the new technological age.

At the core of logical positivism was Ludwig Wittgenstein’s theory of atomic facts, the idea that the world could be reduced to a set of statements that could be verified as being true or false—no opinions or speculation allowed. Those statements, in turn, would be governed by a set of logical algorithms which would determine the validity of any argument.

It was, to the great thinkers of the day, both a grand vision and an exciting challenge. If all facts could be absolutely verified, then we could confirm ideas with absolute certainty. Unfortunately, the effort would fail so miserably that Wittgenstein himself would eventually disown it. Instead of building a world of verifiable objective reality, we would be plunged into uncertainty.

The Fall of Logic and the Rise of Uncertainty

Ironically, while the logical positivist movement was gaining steam, two seemingly obscure developments threatened to undermine it. The first was a hole at the center of logic called Russell’s Paradox, which suggested that some statements could be both true and false. The second was quantum mechanics, a strange new science in which even physical objects could defy measurement.

Yet the battle for absolute facts would not go down without a fight. David Hilbert, the most revered mathematician of the time, created a program to resolve Russell’s Paradox. Albert Einstein, for his part, argued passionately against the probabilistic quantum universe, declaring that “God does not play dice with the universe.”

Alas, it was all for naught. Kurt Gödel would prove that any consistent logical system rich enough for arithmetic contains true statements it cannot prove. Alan Turing would show that not all numbers are computable. The Einstein-Bohr debates would be resolved in Bohr’s favor, destroying Einstein’s vision of an objective physical reality and leaving us with an uncertain universe.

These developments weren’t all bad. In fact, they were what made modern computing possible. However, they left us with an uncomfortable uncertainty. Facts could no longer be absolutely verifiable, but would stand until they could be falsified. We could, after thorough testing, become highly confident in our facts, but never completely sure.

Science, Truth and Falsifiability

In Richard Feynman’s 1974 commencement speech at Caltech, he recounted going to a new-age resort where people were learning reflexology. A man was sitting in a hot tub rubbing a woman’s big toe and asking the instructor, “Is this the pituitary?” Unable to contain himself, the great physicist blurted out, “You’re a hell of a long way from the pituitary, man.”

His point was that it’s relatively easy to make something appear “scientific” by, for example, having people wear white coats or present charts and tables, but that doesn’t make it real science. True science is testable and falsifiable. You can’t merely state what you believe to be true, but must give others a means to test it and prove you wrong.

This is important because it’s very easy for things to look like the truth, but actually be false. That’s why we need to be careful, especially when we believe something to be true. The burden is even greater when it is something that “everybody knows.” That’s when we need to redouble our efforts, dig in and make sure we verify our facts.

“We’ve learned from experience that the truth will out,” Feynman said. “The first principle is that you must not fool yourself—and you are the easiest person to fool.” Truth doesn’t reveal itself so easily, but it’s out there and we can find it if we are willing to make the effort.

The Lie of a Post-Truth World

Writing a non-fiction book can be a grueling process. You not only need to gather hundreds of pages of facts and mold them into a coherent story that interests the reader, but also to verify that those facts are true. For both of my books, Mapping Innovation and Cascades, I spent countless hours consulting sources and sending out fact checks.

Still, I lived in fear knowing that whatever I put on the page would permanently be there for anyone to discredit. In fact, I would later find two minor inaccuracies in my first book (ironically, both had been checked with primary sources). These were not, to be sure, material errors, but they wounded me. I’m sure, in time, others will be uncovered as well.

Yet I don’t believe that those errors diminish the validity of the greater project. In fact, I think that those imperfections serve to underline the larger truth that the search for knowledge is always a journey, elusive and just out of reach. We can struggle for a lifetime to grasp even a small part of it, but to shake free even a few seemingly insignificant nuggets can be a gift.

Yet all too often people value belief more than facts. That’s why they repeat things that aren’t factual: they believe those things point to some deeper truth that defies the facts in evidence. Yet that is not truth. It is just a way of fooling yourself and, if you’re persuasive, fooling others as well. Still, as Feynman pointed out long ago, “We’ve learned from experience that the truth will out.”

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


We Were Wrong About What Drove the 21st Century

GUEST POST from Greg Satell

Every era contains a prism of multitudes. World War I gave way to the “Roaring 20s” and a 50-year boom in productivity. The Treaty of Versailles sowed the seeds of the Second World War, which in turn gave way to the peace and prosperity of the post-war era. Vietnam and the rise of the Baby Boomers unlocked a cultural revolution that created new freedoms for women and people of color.

Our current era began with the 1980s, the rise of Ronald Reagan and a new confidence in the power of markets. Genuine achievements of the Chicago School of economics led by Milton Friedman, along with the weakness of the Soviet system, led to an enthusiasm for market fundamentalism that dominated policy circles.

So it shouldn’t be that surprising that veteran Republican strategist Stuart Stevens wrote a book denouncing that orthodoxy as a lie. The truth is he has a point. But politicians can only convince us of things we already want to believe. The truth is that we were fundamentally mistaken in our understanding of how the world works. It’s time that we own up to it.

Mistake #1: The End Of The Cold War Would Strengthen Capitalism

When the Berlin Wall came down in 1989, the West was triumphant. Communism was shown to be a corrupt system bereft of any real legitimacy. A new ideology took hold, often called the Washington Consensus, that preached fiscal discipline, free trade, privatization and deregulation. The world was going to be remade in capitalism’s image.

Yet for anybody who was paying attention, communism had been shown to be bankrupt and illegitimate since the 1930s when Stalin’s failed collectivization effort and industrial plan led him to starve his own people. Economists have estimated that, by the 1970s, Soviet productivity growth had gone negative, meaning more investment actually brought less output. The system’s collapse was just a matter of time.

At the same time, there were early signs that there were serious problems with the Washington Consensus. Many complained that bureaucrats at the World Bank and the IMF were mandating policies for developing nations that citizens in their own countries would not accept. So-called “austerity programs” led to human costs that were both significant and real. In a sense, the error of the Soviets was being repeated—ideology was put before people.

Today, instead of a capitalist utopia and an era of peace and prosperity, we got a global rise in authoritarian populism, stagnant wages, reduced productivity growth and weaker competitive markets. In particular in the United States, by almost every metric imaginable, capitalism has been weakened.

Mistake #2: Digital Technology Would Make Everything Better

In 1989, the same year that the Berlin Wall fell, Tim Berners-Lee created the World Wide Web and ushered in a new technological era of networked computing that we now know as the “digital revolution.” Much like the ideology of market fundamentalism that took hold around the same time, technology was seen as the determinant of a new, brighter age.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet by 2004, productivity growth had slowed again to its earlier lethargic pace. Today, despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

Digital technology was supposed to empower individuals and reduce the dominance of institutions, but just the opposite has happened. Income inequality in advanced economies markedly increased. In America wages have stagnated and social mobility has declined. At the same time, social media has been destroying our mental health.

When Silicon Valley told us they intended to “change the world,” is this what they meant?

Mistake #3: Medical Breakthroughs Would Automatically Make Us Healthier

Much like the fall of the Berlin Wall and the rise of the Internet, the completion of the Human Genome Project in 2003 promised great things. No longer would we be at the mercy of terrible diseases such as cancer and Alzheimer’s, but would design genetic therapies that would rewire our bodies to fight off disease by themselves.

The advances since then have been breathtaking. The Cancer Genome Atlas, which began in 2005, helped enable doctors to develop therapies targeted at specific mutations, rather than where in the body a tumor happened to be found. Later, CRISPR revolutionized synthetic biology, bringing down costs exponentially.

The rapid development of Covid-19 vaccines has shown how effective these new technologies are. Scientists essentially engineered new viruses containing just enough of the viral genome to produce a few proteins, enough to provoke an immune response but not nearly enough to make us sick. Twenty years ago, this would have been considered science fiction. Today, it’s a reality.

Yet we are not healthier. Worldwide obesity has tripled since 1975 and has become an epidemic in the United States. Anxiety and depression have as well. American healthcare costs continue to rise even as life expectancy declines. Despite the incredible advance in our medical capability, we seem to be less healthy and more miserable.

Worse Than A Crime, It Was A Blunder

Whenever I bring up these points among technology people, they vigorously push back. Surely, they say, you can see the positive effects all around you. Can you imagine what the global pandemic would be like without digital technologies? Without videoconferencing? Hasn’t there been a significant global decline in extreme poverty and violence?

Yes. There have absolutely been real achievements. As someone who spent roughly half my adult life in Eastern Bloc countries, I can attest to how horrible the Soviet system was. Digital technology has certainly made our lives more convenient and, as noted above, medical advances have been very real and very significant.

However, technology is a process that involves both revealing and building. Yes, we revealed the power of market forces and the bankruptcy of the Soviet system, but failed to build a more prosperous and healthy society. In much the same way, we revealed the power of the microchip, miracle cures and many other things, but failed to put them to use in such a way that would make us measurably better off.

When faced with a failure this colossal, people often look for a villain. They want to blame the greed of corporations, the arrogance of Silicon Valley entrepreneurs or the incompetence of government bureaucrats. The truth is, as the old saying goes, it was worse than a crime, it was a blunder. We simply believed that market forces and technological advancement would work their magic and all would be well in hand.

By now we should know better. We need to hold ourselves accountable, make better choices and seek out greater truths.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay
