
Moneyball and the Beginning, Middle, and End of Innovation

GUEST POST from Robyn Bolton

Recently, pitchers and catchers reported to MLB Spring Training facilities in Florida and Arizona.  For baseball fans, this is the first sign of Spring, an occasion that heralds months of warmth and sunshine, ballparks filled (hopefully) with cheering fans, dinners of beers and brats, and the undying belief that this year will be the year.

Of course, there was still a lot of dark, dreary cold between then and Opening Day.  Perfect weather for watching baseball movies – Bull Durham, Major League, The Natural, Field of Dreams, and, of course, Moneyball.

Moneyball is based on the book of the same name by Michael Lewis and chronicles the 2002 Oakland Athletics season.  The ’02 Oakland A’s, led by General Manager Billy Beane (played by Brad Pitt), forever changed baseball by adopting an approach that valued rigorous statistical analysis over the collective wisdom of baseball insiders (coaches, scouts, front office personnel) when building a team.  This approach, termed “Moneyball,” enabled the A’s to reach the postseason with a team that cost only $44M in salary, compared to the NY Yankees that spent $125M to achieve the same outcome.
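The payroll gap above can be made concrete with a little arithmetic. This sketch uses the salary figures from the text; the win total is an assumption added for illustration (both clubs finished with roughly the same regular-season record in 2002):

```python
# Illustrative cost-per-win comparison using the payroll figures above.
# The shared win total is an assumption for illustration only.
athletics_payroll = 44_000_000
yankees_payroll = 125_000_000
wins = 103  # assumed identical regular-season win total for both clubs

cost_per_win_athletics = athletics_payroll / wins
cost_per_win_yankees = yankees_payroll / wins

print(f"A's:     ${cost_per_win_athletics:,.0f} per win")
print(f"Yankees: ${cost_per_win_yankees:,.0f} per win")
print(f"Yankees paid {yankees_payroll / athletics_payroll:.1f}x more for the same outcome")
```

Under those assumptions, the A's paid roughly a third as much per win, which is the entire argument of Moneyball compressed into one ratio.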

While the whole movie (and book) is a testament to the courage and perseverance required to challenge and change the status quo, time and again I come back to three lines that perfectly sum up the journey of every successful intrapreneur I’ve ever met.

The Beginning

“I know you’ve taken it in the teeth out there, but the first guy through the wall…he always gets bloody…always always gets bloody.  This is threatening not just a way of doing business… but in their minds, it’s threatening the game. Really what it’s threatening is their livelihood, their jobs. It’s threatening the way they do things… and every time that happens, whether it’s the government, a way of doing business, whatever, the people who are holding the reins – they have their hands on the switch – they go batshit crazy.”

John Henry, Owner of the Boston Red Sox

Context

The 2002 season is over, and the A’s have been eliminated in the first round of the playoffs.  John Henry, an owner of the Boston Red Sox, has invited Billy Beane to Boston to offer him the Red Sox GM job.

Lesson

This is what you sign up for when you decide to be an Intrapreneur.  The more you challenge the status quo, the more you question how business is done, the more you ask Why and demand an answer, the closer you get to “tak(ing) it in the teeth.”

This is why courage, perseverance, and an unshakeable belief that things can and should be better are absolutely essential for intrapreneurs.  Your job is to run at the wall over and over until you get through it.

People will follow.  The Red Sox did.  They won the World Series in 2004, breaking an 86-year-old curse.

The Middle

“It’s a process, it’s a process, it’s a process”

Billy Beane

Context

Billy has to convince the ballplayers to forget all the habits that made them great and embrace the philosophy of Moneyball: to stop stealing bases, turning double plays on bunts, and swinging for the fences, and to start taking walks, throwing to first for the easy out, and prioritizing getting on base over hitting home runs.

The players are confused and frustrated.  Suddenly, everything that they once did right is wrong and what was not valued is deeply prized.

Lesson

Innovation is something new that creates value.  Something new doesn’t just require change, it requires people to stop doing things that work and start doing things that seem strange or even wrong.

Change doesn’t happen overnight.  It’s not a switch to be flipped.  It’s a process to be learned.  It takes time, practice, reminders, and patience.

The End

“When you get an answer you’re looking for, hang up.”

Billy Beane

Context

In this scene, Billy has offered one of his players to multiple teams, searching for the best deal.  When the phone rings with a deal he likes, he and the other General Manager (GM) agree to it, and Billy hangs up, even though the other GM was in the middle of a sentence.  When Peter Brand, the Assistant GM played by Jonah Hill, points out that Billy had just hung up on the other GM, Billy responds with this nugget of wisdom.

Lesson

It’s advice intrapreneurs should take very much to heart.  I often see innovation teams walk into management meetings with long presentations, full of data and projections, anxious to share their progress and hoping for continued funding and support.  When the meeting starts, a senior exec will say something like, “We’re excited by the progress we’re hearing about. What will it take to continue?”

That’s the cue to “hang up.”

Instead of starting the presentation from the beginning, start with “what it will take to continue.”  You got the answer you’re looking for – they’re excited about the progress you’ve made – so don’t spend time giving them information they already have or, worse, information that could raise questions and dim their enthusiasm.  Hang up on the conversation you want to have and have the conversation they want to have.

In closing

Moneyball was an innovation that fundamentally changed one of the most tradition-bound businesses in sports.  To be successful, it required someone willing to take it in the teeth, to coach people through a process, and to hang up when they got the answer they wanted.  It wasn’t easy, but real change rarely is.

The same is true in corporations.  They need their own Billy Beanes.

Are you willing to step up to the plate?

Image credits: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

3 Examples of Why Innovation is a Leadership Problem

Through the Looking Glass

GUEST POST from Robyn Bolton

Do you sometimes feel like you’re living in an alternate reality?

If so, you’re not alone.  Most innovators feel that way at some point.

After all, you see things that others don’t.

Question things that seem inevitable and true.

Make connections where others only see differences.

Do things that seem impossible.

It’s easy to believe that you’re the crazy one, the Mad Hatter and permanent resident of Wonderland.

But what if you’re not the crazy one?

What if you’re Alice?

And you’re stepping through the looking glass every time you go to work?

In Lewis Carroll’s book, the other side of the looking glass is a chessboard, and all its inhabitants are chess pieces that move in defined and prescribed ways, follow specific rules, and achieve defined goals.  Sound familiar?

Here are a few other things that may sound familiar, too:

“The rule is, jam tomorrow and jam yesterday – but never jam today.” – The White Queen

In this scene, the White Queen offers to hire Alice as her lady’s maid and pay her “twopence a week and jam every other day.”  When Alice explains that she doesn’t want the job, doesn’t like jam, and certainly doesn’t want jam today, the queen scoffs and explains the rule.

The problem, Alice points out, is that it’s always today, and that means there’s never jam.

Replace “jam” with “innovation,” and this hits a little too close to home for most innovators.

How often do you hear about the “good old days” when the company was more entrepreneurial, willing to experiment and take risks, and encouraged everyone to innovate?

Innovation yesterday.

How often do you hear that the company will invest in innovation, restart its radical innovation efforts, and disrupt itself as soon as the economy rebounds, business improves, and things settle down a bit?  Innovation tomorrow.

But never innovation today.  After all, “it’s [innovation] every other day: today isn’t any other day, you know.”

“When I use a word, it means just what I choose it to mean – neither more nor less.” – Humpty Dumpty

In this scene, poor Alice tries to converse with Humpty Dumpty, but he keeps using the “wrong” words.  Except they’re not the wrong words because they mean exactly what he chooses them to mean.

Even worse, when Alice asks Humpty to define confusing terms, he gets angry, speaks in a “scornful tone,” and smiles “contemptuously” before “wagging his head gravely from side to side.”

We all know what the words we use mean, but we too often think others share our definitions.  We use “innovation” and “growth,” assuming people know what we mean.  But they don’t.  They know what the words mean to them.  And that may or may not be what we mean.

When managers encourage people to share ideas, challenge the status quo, and take risks, things get even trickier.  People listen, share ideas, challenge the status quo, and take risks.  Then they are confused when management doesn’t acknowledge their efforts.  No one realizes that those requests meant one thing to the managers who gave them and a different thing to the people who did them.

“It takes all the running you can do, to keep in the same place.  If you want to go somewhere else, you must run at least twice as fast as that!” – The Red Queen

In this scene, the Red Queen introduces life on the other side of the looking glass and explains Alice’s new role as a pawn.  Of course, the explanation comes after a long sprint that seems to get them nowhere and only confuses Alice more.

When “tomorrow” finally comes, and it’s time for innovation, it often comes with a mandate to “act with urgency” to avoid falling behind.  I’ve seen managers set goals of creating and launching a business with $250M revenue in 3 years and leadership teams scrambling to develop a portfolio of businesses that would generate $16B in 10 years.

Yes, the world is moving faster, so companies need to increase the pace at which they operate and innovate.  But if you’re doing all you can, you can’t do twice as much.  You need help – more people and more funding, not more meetings or oversight.

“Life, what is it but a dream?”

Managers and executives, like the kings and queens, have roles to play.  They live in a defined space, an org chart rather than a chessboard, and they do their best to navigate it following rules set by tradition, culture, and HR.

But you are like Alice.  You see things differently.  You question what’s taken as given.  And, every now and then, you probably want to shake someone until they grow “shorter – and fatter – and softer – and rounder – and…[into] a kitten, after all.”

So how do you get back to reality and bring everyone with you?  You talk to people.  You ask questions and listen to the answers.  You seek to understand their point of view and then share yours.

Some will choose to stay where they are.

Some will choose to follow you back through the looking glass.

They will be the ones who transform a leadership problem into a leadership triumph.

Image credits: Pixabay


The End of the Digital Revolution

Here’s What You Need to Know

GUEST POST from Greg Satell

The history of digital technology has largely been one of denial followed by disruption. First came the concept of the productivity paradox, which noted the limited economic impact of digital technology. When e-commerce appeared, many doubted that it could ever compete with physical retail. Similar doubts were voiced about digital media.

Today, it’s hard to find anyone who doesn’t believe in the power of digital technology. Whole industries have been disrupted. New applications driven by cloud computing, artificial intelligence and blockchain promise even greater advancement to come. Every business needs to race to adopt them in order to compete for the future.

Ironically, amid all this transformation the digital revolution itself is ending. Over the next decade, new computing architectures will move to the fore and advancements in areas like synthetic biology and materials science will reshape entire fields, such as healthcare, energy and manufacturing. Simply waiting to adapt won’t be enough. The time to prepare is now.

1. Drive Digital Transformation

As I explained in Mapping Innovation, innovation is never a single event, but a process of discovery, engineering and transformation. Clearly, with respect to digital technology, we are deep into the transformation phase. So the first part of any post-digital strategy is to accelerate digital transformation efforts in order to improve your competitive position.

One company that’s done this very well is Walmart. As an old-line incumbent in the physical retail industry, it appeared to be ripe for disruption as Amazon reshaped how customers purchased basic items. Why drive out to a Walmart store for a package of toothpaste when you can just click a few buttons on your phone?

Yet rather than ceding the market to Amazon, Walmart has invested heavily in digital technology and has achieved considerable success. It wasn’t any one particular tactic or strategy that made the difference, but rather the acknowledgment that every single process needed to be reinvented for the digital age. For example, the company is using virtual reality to revolutionize how it does in-store training.

Perhaps most of all, leaders need to understand that digital transformation is human transformation. There is no shortage of capable vendors that can implement technology for you. What’s key, however, is to shift your culture, processes and business model to leverage digital capabilities.

2. Explore Post-Digital Technologies

While digital transformation is accelerating, advancement in the underlying technology is slowing down. Moore’s law, the consistent doubling of computer chip performance over the last 50 years, is nearing its theoretical limits. It has already slowed down considerably and will soon stop altogether. Yet there are non-digital technologies under development that will be far more powerful than anything we’ve ever seen before.
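A back-of-the-envelope sketch shows why fifty years of “consistent doubling” is so dramatic, and why its end matters. The two-year doubling cadence below is the commonly cited figure, added here for illustration rather than taken from the article:

```python
# Moore's law as compounding: performance doubling every ~2 years.
def doublings(years: float, cadence: float = 2.0) -> float:
    """Number of doublings that fit into the given span of years."""
    return years / cadence

def growth_factor(years: float, cadence: float = 2.0) -> float:
    """Cumulative improvement factor after the given span of years."""
    return 2 ** doublings(years, cadence)

# Fifty years of doubling every two years is 25 doublings:
print(growth_factor(50))  # 33554432.0, i.e. 2**25, roughly a 33.5-million-fold gain
```

The same arithmetic run in reverse explains the urgency: once the cadence stalls, the compounding that the whole industry has planned around stalls with it.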

Consider Intel, which sees its future in what it calls heterogeneous computing: combining traditional digital chips with non-digital architectures, such as quantum and neuromorphic. A couple of years ago it announced its Pohoiki Beach neuromorphic system, which processes information up to 1,000 times faster and 10,000 times more efficiently than traditional chips for certain tasks.

IBM has created a network to develop quantum computing technology, which includes research labs, startups and companies that seek to be early adopters of the technology. Like neuromorphic computing, quantum systems have the potential to be thousands, if not millions, of times more powerful than today’s technology.

The problem with these post-digital architectures is that no one really knows how they are going to work. They operate on a very different logic than traditional computers and will require new programming languages and algorithmic strategies. It’s important to start exploring these technologies now or you could find yourself years behind the curve.

3. Focus on Atoms, Not Bits

The digital revolution created a virtual world. My generation was the first to grow up with video games and our parents worried that we were becoming detached from reality. Then computers entered offices and Dan Bricklin created VisiCalc, the first spreadsheet program. Eventually smartphones and social media appeared and we began spending almost as much time in the virtual world as we did in the physical one.

Essentially, what we created was a simulation economy. We could experiment with business models in our computers, find flaws and fix them before they became real. Computer-aided design (CAD) software allowed us to design products in bits before we got down to the hard work of shaping atoms. Because it’s much cheaper to fail in the virtual world than the physical one, this made our economy much more efficient.

Yet the next great transformation will be from bits to atoms. Digital technology is creating revolutions in things like genomics and materials science. Artificial intelligence and cloud computing are reshaping fields like manufacturing and agriculture. Quantum and neuromorphic computing will accelerate these trends.

Much like those new computing architectures, the shift from bits to atoms will create challenges. Applying the simulation economy to the world of atoms will require new skills and we will need people with those skills to move from offices in urban areas to factory floors and fields. They will also need to learn to collaborate effectively with people in those industries.

4. Transformation is Always a Journey, Never a Destination

The 20th century was punctuated by two waves of disruption. The first, driven by electricity and internal combustion, transformed almost every facet of daily life and kicked off a 50-year boom in productivity. The second, driven by the microbe, the atom and the bit, transformed fields such as agriculture, healthcare and management.

Each of these technologies followed the pattern of discovery, engineering and transformation. The discovery phase takes place mostly out of sight, with researchers working quietly in anonymous labs. The engineering phase is riddled with errors, as firms struggle to shape abstract concepts into real products. A nascent technology is easy to ignore, because its impact hasn’t been felt yet.

The truth is that disruption doesn’t begin with inventions, but when an ecosystem emerges to support them. That’s when the transformation phase begins and takes us by surprise, because transformation never plays out like we think it will. The future will always, to a certain extent, be unpredictable for the simple reason that it hasn’t happened yet.

Today, we’re on the brink of a new era of innovation that will be driven by new computing architectures, genomics, materials science and artificial intelligence. That’s why we need to design our organizations for transformation by shifting from vertical hierarchies to horizontal networks.

Most of all, we need to shift our mindsets from seeing transformation as a set of discrete objectives to seeing it as a continuous journey of discovery. Digital technology has only been one phase of that journey. The most exciting things are yet to come.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Is China Our New Sputnik Moment?

GUEST POST from Greg Satell

When the Soviets launched Sputnik, the first space satellite, into orbit in 1957, it was a wake-up call for America. Over the next year, President Eisenhower would sign the National Defense Education Act to spur science education, increase funding for research and establish NASA and DARPA to spur innovation.

A few years ago, a report by the Council on Foreign Relations (CFR) argued that we are at a similar point today, but with China. While we have been steadily decreasing federal investment in R&D over the past few decades, our Asian rival has been ramping up and now threatens our leadership in key technologies such as AI, genomics and quantum information technology.

Clearly, we need to increase our commitment to science and innovation and that means increasing financial investment. However, what the report makes clear is that money alone won’t solve the problem. We are, in several important ways, actually undermining our ability to innovate, now and in the future. We need to renew our culture of innovation in America.

Educating And Attracting Talent

The foundation of an innovation economy is education, especially in STEM subjects. Historically, America has been the world’s best educated workforce, but more recently we’ve fallen to fifth among OECD countries for post-secondary education. That’s alarming and something we will certainly need to reverse if we are to compete effectively.

Our educational descent can be attributed to three major causes. First, the rest of the world has become more educated, so the competition has become stiffer. Second is financing: tuition has nearly tripled in the last decade, and student debt has become so onerous that it now takes about 20 years to pay off four years of college. Third, we need to work harder to attract talented people to the United States.

The CFR report recommends developing a “21st century National Defense Education Act” to create scholarships in STEM areas and making it easier for foreign students to get Green Cards when they graduate from our universities. It also points out that we need to work harder to attract foreign talent, especially in high impact areas like AI, genomics and quantum computing.

Unfortunately, we seem to be going the other way. The number of international students at American universities is declining. Policies like the Muslim ban and concerns about gun violence are deterring scientific talent from coming here. The denial rate for those on H-1B visas has increased from 4% in 2016 to 18% in the first quarter of 2019.

Throughout our history, it has been our openness to new people and new ideas that has made America exceptional. It’s a legitimate question whether that’s still true.

Building Technology Ecosystems

In the 1980s, the US semiconductor industry was on the ropes. Due to increased competition from low-cost Japanese manufacturers, American market share in the DRAM market fell from 70% to 20%. The situation not only had a significant economic impact, there were also important national security implications.

The federal government responded with two initiatives, the Semiconductor Research Corporation and SEMATECH, both of which were nonprofit consortiums that involved government, academia and industry. By the 1990s, American semiconductor manufacturers were thriving again.

Today, we have similar challenges with rare earth elements, battery technology and many manufacturing areas. The Obama administration responded by building similar consortiums to those that were established for semiconductors: The Critical Materials Institute for rare earth elements, JCESR for advanced batteries and the 14 separate Manufacturing Institutes.

Yet here again, we seem to be backsliding. The current administration has sought to slash funding for the Manufacturing Extension Partnership that supports small and medium-sized producers. An addendum to the CFR report also points out that the administration has pushed for a 30% cut in funding for the national labs, which support much of the advanced science critical to driving American technology forward.

Supporting International Trade and Alliances

Another historical strength of the US economy has been our open approach to trade. The CFR report points out that our role as a “central node in a global network of research and development,” gave us numerous advantages, such as access to foreign talent at R&D centers overseas, investment into US industry and cooperative responses to global challenges.

However, the report warns that “the Trump administration’s indiscriminate use of tariffs against China, as well as partners and allies, will harm U.S. innovative capabilities.” It also faults the Trump administration for pulling out of the Trans-Pacific Partnership trade agreement, which would have bolstered our relationship with Asian partners and increased our leverage over China.

The tariffs undermine American industry in two ways. First, because many of the tariffs are on intermediate goods which US firms use to make products for export, we’re undermining our own competitive position, especially in manufacturing. Second, because trade partners such as Canada and the EU have retaliated against our tariffs, our position is weakened further.

Clearly, we compete in an ecosystem driven world in which power does not come from the top, but emanates from the center. Traditionally, America has positioned itself at the center of ecosystems by constantly connecting out. Now that process seems to have reversed itself and we are extremely vulnerable to others, such as China, filling the void.

We Need to Stop Killing Innovation in America

The CFR report, whose task force included such luminaries as Admiral William McRaven, former Google CEO Eric Schmidt and economist Laura Tyson, should set alarm bells ringing. Although the report was focused on national security issues, it pertains to general competitiveness just as well and the picture it paints is fairly bleak.

After World War II, America stood almost alone in the world in terms of production capacity. Through smart policy, we were able to transform that initial advantage into long-term technological superiority. Today, however, we have stiff competition in areas ranging from AI to synthetic biology to quantum systems.

At the same time, we seem to be doing everything we can to kill innovation in America. Instead of working to educate and attract the world’s best talent, we’re making it harder for Americans to attain higher education and for top foreign talent to come and work here. Instead of ramping up our science and technology programs, presidential budgets regularly recommend cutting them. Instead of pulling our allies closer, we are pushing them away.

To be clear, America is still at the forefront of science and technology, vying for leadership in every conceivable area. However, as global competition heats up and we need to be redoubling our efforts, we seem to be doing just the opposite. The truth is that our prosperity is not a birthright to which we are entitled, but a legacy that must be lived up to.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Rethinking Customer Journeys

GUEST POST from Geoffrey A. Moore

Customer journeys are a mainstay of modern marketing programs. Unfortunately, for most companies, they are pointed in the wrong direction!

Most customer journey diagrams I see map the customer’s journey through the vendor’s marketing and sales process. That’s not a customer journey. That is a vendor journey. Customers could not care less about it.

What customers do care about is any journey that leads to value realization in their enterprise. That means true customer journey mapping must work backward from the customer’s value goals and objectives, not forward from the vendor’s sales goals and objectives.

But to do that, the customer-facing team in the vendor organization has to have good intelligence about what value realization the customer is seeking. That means that sales teams must diagnose before they prescribe. They must interrogate before they present. They must listen before they demo.

That is not what the typical sales enablement program teaches. Instead, it instructs salespeople on how to give the standard presentation, how to highlight the product’s competitive advantages, how to counter the competition’s claims—anything and everything except the only thing that really matters—how do you get good customer intelligence from whatever level of management you are able to converse with?

The SaaS business model with its emphasis on subscription and consumption creates a natural occasion for reforming these practices. Net Revenue Retention is the name of the game. Adoption, extension, and expansion of product usage are core to the customer’s Health Score. This only happens when value is truly being realized.
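Net Revenue Retention can be made concrete with a small sketch. The formula is the standard one (revenue retained from an existing customer cohort, including expansion, contraction, and churn); the dollar figures below are hypothetical, for illustration only:

```python
# Net Revenue Retention (NRR): how much recurring revenue an existing
# customer cohort generates at the end of a period relative to the start.
def net_revenue_retention(starting_arr: float, expansion: float,
                          contraction: float, churn: float) -> float:
    """NRR = (starting ARR + expansion - contraction - churn) / starting ARR."""
    return (starting_arr + expansion - contraction - churn) / starting_arr

# A hypothetical cohort: starts at $10M ARR, expands by $1.5M,
# contracts by $0.5M, and churns $0.8M over the year.
nrr = net_revenue_retention(10_000_000, 1_500_000, 500_000, 800_000)
print(f"NRR: {nrr:.0%}")  # NRR: 102%
```

An NRR above 100% means the installed base grows even with no new customers, which is exactly why adoption and expansion, i.e., realized value, dominate the SaaS scorecard.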

All this is casting the post-sales customer-facing functions of Customer Success and Customer Support in a new light. These relationships are signaling outposts for current customer status. Vendors still need to connect with the top management, for they are the ones who set the value realization goals and provide the budgets to fund the vendor’s offerings, but for day-to-day reality checks on whether the value is actually getting realized, nothing beats feet on the ground.

So, note to vendors. You can still use your vendor-centric customer journey maps to manage your marketing and sales productivity. Just realize these maps are about you, not the customer. You cannot simply assign the customer a mindset that serves your interests. You have to genuinely engage with them to get to actionable truth.

That’s what I think. What do you think?

Image Credit: Pexels


How Has Innovation Changed Since the Pandemic?

The Answer in Three Charts

GUEST POST from Robyn Bolton

“Everything changed since the pandemic.”

At this point, my husband, a Navy veteran, is very likely to moo (yes, like a cow). It’s a habit he picked up as a submarine officer, something the crew would do whenever someone said something blindingly obvious because “moo” is not just a noise. It’s an acronym – Master Of the Obvious.

But HOW did things change?

From what, to what?

So what?

It can be hard to see the changes when you’re living and working in the midst of them.  This is why I found “Benchmarking Innovation Impact,” a new report from InnoLead and KPMG US, so interesting, insightful, and helpful.

There’s lots of great stuff in the report (and no, this is not a sponsored post though I am a member), so I limited myself to the three charts that answer executives’ most frequently asked innovation questions.

Innovation Leader Research 2023 Chart 1

Question #1: What type of innovation should I pursue?

2023 Answer: Companies are investing more than half of their resources in incremental innovation

So What?:  I may very well be alone in this opinion, but I think this is great news for several reasons:

  1. Some innovation is better than none – Companies shifting their innovation spending to safer, shorter-term bets is infinitely better than shutting down all innovation, which is what usually happens during economic uncertainty.
  2. Play to your strengths – Established companies are, on average, better at incremental and adjacent innovation because they have the experience, expertise, resources, and culture required to do those well, and they have other ways (e.g., corporate venture capital, joint ventures) to pursue transformational innovation.
  3. Adjacent innovation is increasing – This is the sweet spot for corporate innovation (I may also be biased because Swiffer is an adjacent innovation) because it stretches the business into new customers, offerings, and/or business models without breaking the company or executives’ identities.

Innovation Leader Research 2023 Chart 2

Question #2: Is innovation really a leadership problem (or do you just have issues with authority)?

2023 Answer: Yes (and it depends on the situation). “Lack of Executive Support” is the #6 biggest challenge to innovation, up from #8 in 2020.

So What?: This is a good news/bad news chart.

The good news is that fewer companies are experiencing the top 5 challenges to innovation. Of course, leadership is central to fostering/eliminating turf wars, setting culture, acting on signals, allocating budgets, and setting strategy. Hence, leadership has a role in resolving these issues, too.

The bad news is that MORE innovators are experiencing a lack of executive support (24.3% vs. 19.7% in 2020) and “Other” challenges (17.3% vs. 16.4%), including:

  • “Different agendas held by certain leadership as to how to measure innovation and therefore how we go after innovation. Also, the time it takes to ‘sell’ an innovative idea or opportunity into the business; corporate bureaucracy.”
  • “Lack of actual strategy. Often, goals or visions are treated as strategy, which results in frustration with the organization’s ability to advance viable work and creates an unnecessary churn, resulting in confused decision-making.”
  • “Innovations are stalling after piloting due to lack of funding and executive support in order to shift to scaling. Many are just happy with PR innovation.”

Innovation Leader Research 2023 Chart 3

Question #3: How much should I invest in innovation?

2023 Answer: Most companies are maintaining past years’ budgets and team sizes.

So What?:  This is another good news/bad news set of charts.

The good news is that investment is staying steady. Companies that cut back or kill innovation investments due to economic uncertainty often find that they are behind competitors when the economy improves. Even worse, it takes longer than expected to catch up because they are starting from scratch regarding talent, strategy, and a pipeline.

The bad news is that investment is staying steady. If you want different results, you need to take different actions. And I don’t know any company that is thrilled with the results of its innovation efforts. Indeed, companies can do different things with existing budgets and teams, but there needs to be flexibility and a willingness to grow the budget and the team as projects progress closer to launch and scale-up.

Not MOO

Yes, everything has changed since the pandemic, but not as much as we think.

Companies are still investing in incremental, adjacent, and transformational innovation. They’re just investing more in incremental innovation.

Innovation is still a leadership problem, but leadership is less of a problem (congrats!).

Investment is still happening, but it’s holding steady rather than increasing.

And that is nothing to “moo” at.

Image credits: Pixabay, InnoLead

Subscribe to Human-Centered Change & Innovation Weekly. Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

Our Innovation is All on Tape

Why Old Technologies Are Sometimes Still the Best Ones

Our Innovation is All on Tape

GUEST POST from John Bessant

Close your eyes and imagine for a moment a computer room in the early days of the industry. Chances are you’ll picture large wardrobe-sized metal cabinets whirring away with white-coated attendants tending to the machines. And it won’t be long before your gaze lands on the ubiquitous spools of tape being loaded and unloaded.

Which might give us a smug feeling as we look at the storage options for our current generation of computers — probably based on some incredibly fast, high-capacity solid state flash drive. It’s been quite a journey — the arc stretches back from the recent years of USB sticks and SD cards, through external HDDs and the wonderful world of floppy discs (getting larger and more rigid as we go back in time), through the clunky 1980s when our home computers rode on cassette drives, right back to the prehistoric days when the high priests of minis and mainframes tended their storage flock of tapes.

Ancient history — except that the tape drive hasn’t gone away. In fact it’s alive and well and backing up our most precious memories. Look inside the huge data farms operated by Google, Apple, Amazon, Microsoft Azure or anyone else and you’ll find large computers — and lots of tape. Thousands of kilometres of it, containing everything from your precious family photos to email backups to data from research projects like the Large Hadron Collider.

It turns out that tape is still an incredibly reliable medium — and it has the considerable advantage of being cheap. The alternative would be buying lots of hard drives — something which increasingly matters as the volume of data we are storing grows. Think about the internet of things — all those intelligent devices, whether security cameras or mobile phones, manufacturing performance data loggers or hospital diagnostic equipment, are generating data which needs secure long-term storage. We’ve long since moved past the era of measuring storage in kilobytes or megabytes; now we’re into zettabytes, each one the equivalent of around 250 billion DVDs. Estimates suggest we produced close to 59 ZB of data in 2020, projected to rise to 175 ZB by 2025! Fortunately IBM scientist Mark Lantz, an expert in storage, suggests that we can keep scaling tape, doubling capacity every 2.5 years for the next 20 years.
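The scale of those numbers is easier to feel with a quick back-of-the-envelope check. This is a sketch only, assuming the decimal definition of a zettabyte (10^21 bytes) and a 4.7 GB single-layer DVD; the “250 billion” figure in the text presumably rounds up or assumes a smaller disc:

```python
ZB = 10**21          # one zettabyte, in bytes (decimal definition)
DVD = 4.7 * 10**9    # single-layer DVD capacity, in bytes (assumed)

dvds_per_zb = ZB / DVD
print(f"1 ZB is roughly {dvds_per_zb / 1e9:.0f} billion DVDs")  # roughly 213 billion

growth = 175 / 59    # projected 2025 data volume relative to 2020
print(f"2020 to 2025: about {growth:.1f}x more data")           # about 3.0x
```

On these assumptions a zettabyte works out closer to 213 billion DVDs; the exact figure depends on which disc format you count.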

Plus tape offers a number of other advantages, not least in terms of security. Most of the time a tape cartridge is not plugged into a computer and so is largely immune to visiting viruses and malware.

In fact the market for magnetic tape storage is in robust health; it’s currently worth nearly $5bn and is expected to grow to double that size by 2030. Not bad for a technology coming up on its hundredth anniversary. Making all of this possible is, of course, our old friend innovation. It’s been a classic journey of incremental improvement, doing what we do but better, punctuated with the occasional breakthrough.

It started in 1877 when “Mary Had a Little Lamb” was recorded and played on Thomas Edison’s first experimental talking machine, the phonograph; the sounds were stored on foil-wrapped cylinders and were severely limited in capacity. The first tape recorder was developed in 1886 by Alexander Graham Bell in his labs, using paper coated with beeswax. This patented approach never really took off because the sound reproduction was inferior to Edison’s cylinders.

Others soon explored alternatives; for example Franklin C. Goodale adapted movie film for analogue audio recording, receiving a patent for his invention in 1909. His film used a stylus to record and play back, essentially mimicking Edison’s approach but allowing for much more storage.

But in parallel with the wax-based approach another strand emerged in 1898, with the work of Valdemar Poulsen, a Danish scientist who built on an idea originally suggested ten years earlier by Oberlin Smith. This used the concept of a wire (which could be spooled) on which information was encoded magnetically. Poulsen’s model used cotton thread, steel sawdust and metal wire and was effectively the world’s first magnetic recorder; he called it a ‘telegraphone’.

Which brings us to another common innovation theme — convergence. If we fast forward (itself a term which originated in the world of tape recording!) to the 1930s we can see these two strands come together; German scientists working for the giant BASF company built on a patent registered to Fritz Pfleumer in 1928. They developed a magnetic tape using metal oxide coated on plastic tape which could be used in recording sound on a commercial basis; in 1934 they delivered the first 50,000 metres of it to the giant electronics corporation AEG.

The big advantage of magnetic recording was that it didn’t rely on a physical analogue being etched into wax or other medium; instead the patterns could be encoded and read as electrical signals. It wasn’t long before tape recording took over as the dominant design — and one of the early entrants was the 3M company in the USA. They had a long history of coating surfaces with particles, having begun life making sandpaper and moved on to create a successful business out of first adhesive masking tape and then the ubiquitous Scotch tape. Coating metal oxide on to tape was an obvious move and they quickly became a key player in the industry.

Innovation is always about the interplay between needs and means and the tape recording business received a fillip from the growing radio industry in the 1940s. Tape offered to simplify and speed up the recording process and an early fan was Bing Crosby. He’d become fed up with the heavy schedule of live broadcasting which kept him away from his beloved golf course and so was drawn to the idea of pre-recording his shows. But the early disc-based technology wasn’t really up to the task, filled with hisses and scratches and poor sound quality. Crosby’s sound engineer had come across the idea of tape recording and worked with 3M to refine the technology.

The very first radio show, anywhere in the world, to be recorded directly on magnetic tape was broadcast on 1 October 1947 featuring Crosby. It not only opened up a profitable line of new business for 3M, it also did its bit for changing the way the world consumed entertainment, be it drama, music hall or news. (It was also a shrewd investment for Crosby, who became one of the emerging industry’s backers.)

Which brings us to another kind of innovation interplay, this time between different approaches being taken in the worlds of consumer entertainment and industrial computing. Ever since Marconi, Tesla and others had worked on radio there had been a growing interest in consumer applications which could exploit the technology. And with the grandchildren of Edison’s gramophone, joined in the 1940s by the work on television, the home became an increasingly interesting space for electronics entrepreneurs.

But as the domestic market for fixed appliances grew saturated, the search began for mobile solutions. Portability became an important driver for the industry and gave rise to the transistor radio; it wasn’t long before the in-car entertainment market began to take off. An early entrant from the tape playback side was the 8-track cartridge in the mid-1960s, which allowed you to listen to your favorite tracks without lugging a portable gramophone with you. Philips’ development of the compact cassette (and its free licensing of the idea to promote rapid and widespread adoption) led to an explosion in demand (over 100 billion cassette tapes were eventually sold worldwide) and ultimately to the idea of the Walkman as the first portable personal device for playing and recording music.

Without which we’d be a little less satisfied. Specifically, we’d never have been introduced to one of the Rolling Stones’ greatest hits; as guitarist Keith Richards explained in his 2010 autobiography:

“I wrote the song ‘Satisfaction’ in my sleep. I didn’t know at all that I had recorded it; the song only exists, thank God, because of the little Philips cassette recorder. I looked at it in the morning — I knew I had put a new tape in the night before — but it was at the very end. Apparently, I had recorded something. I rewound and then ‘Satisfaction’ sounded … and then 40 minutes of snoring!”

Meanwhile back in the emerging computer industry of the 1950s there was a growing demand for storage media for which magnetic tape seemed well suited. Cue the images we imagined in the opening paragraph, acolytes dutifully tending the vast mainframe machines.

Early computers had used punched cards and then paper tape but these soon reached the limit of their usefulness; instead the industry began exploring magnetic audio tape.

IBM’s team under the leadership of Wayne Winger developed digital tape-based storage; of particular importance was finding ways to encode the 1s and 0s of binary patterns onto the tape. They introduced the first commercial digital tape recorder in 1952, and it could store what was (for its time) an impressive 2 MB of data on a reel.

Not everyone was convinced; as Winger recalled, “A white-haired IBM veteran in Poughkeepsie pulled a few of us aside and told us, ‘You young fellows remember, IBM was built on punched cards, and our foundation will always be punched cards.’” Fortunately Tom Watson Jnr, son of the company founder, became a champion and the project went ahead.

But while tape dominated in the short term another parallel trajectory was soon established, replacing tapes and reels with disc drives whose big advantage was the ability to randomly access data rather than wait for the tape to arrive at the right place on the playback head. IBM once again led the way with its launch in 1956 of the hard disc drive and began a steady stream of innovation in which storage volumes and density increased while the size decreased. The landscape moved through various generations of external drives until the advent of personal computers where the drives migrated inside the box and became increasingly small (and floppy).

These developments were taken up by the consumer electronics industry with the growing use of discs as an alternative recording and playback medium, spanning various formats but also decreasing in size. Which of course opened the way for more portability, with Sony and Sharp launching MiniDisc players in the early 1990s.

All good news for the personal audio experience but less so for the rapidly expanding information technology industry. While new media storage technology continued to improve it came at a cost and with the exponential increase in volumes of data needing to be stored came a renewed interest in alternative (and cheaper) solutions. The road was leading back to good old-fashioned tape.

Its potential was in long-term storage and retrieval of so-called ‘cold data’. Most of what is stored in the cloud today is this kind — images, emails, all sorts of backup files. And while these need to be around, they don’t have to be accessed instantly. And that’s where tape has come back into its own. Today’s tapes have moved on somewhat from IBM’s 1952 version with its 2 MB limit. They are smaller on the outside but their capacity has grown enormously — they can now hold 20 TB, or as much as 60 TB compressed — that’s a 10 millionfold increase in 70 years. The tapes are not wound by hand on to capstans but instead loaded into cartridges, each of which holds around a kilometre of tape; companies use libraries containing tens of thousands of these cartridges, which can be mounted via automated systems deploying robots. This process takes around 90 seconds to locate a cartridge and access and load the tape, so you could be forgiven for thinking that it’s a bit slow compared to your flash drive, which has an access time measured in milliseconds.
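That “10 millionfold” figure checks out, and Lantz’s doubling rate puts it in context. A quick sketch, using the capacities quoted in the text (the exact cartridge capacity varies by generation):

```python
import math

MB = 10**6
TB = 10**12

reel_1952 = 2 * MB         # IBM's 1952 reel: about 2 MB
cartridge_today = 20 * TB  # a modern cartridge: about 20 TB uncompressed

factor = cartridge_today / reel_1952
print(f"growth factor: {factor:,.0f}x")    # 10,000,000x

# A doubling every 2.5 years implies roughly this many doublings so far:
doublings = math.log2(factor)
print(f"about {doublings:.0f} doublings")  # about 23
```

Twenty-three doublings at 2.5 years each is about 58 years, roughly consistent with the 70-year span once you allow for slower progress in the early decades.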

There’s a pattern here — established and once important technologies giving way to the new kids on the block with their apparently superior performance. We’ve learned that we shouldn’t necessarily write the old technologies off — at the minimum there is often a niche for them amongst enthusiasts. Think about vinyl, about the anti-mp3 backlash from hi-fi fans or more recently photography using film and plates rather than their digital counterparts.

But it’s more than just nostalgia which drives this persistence of the old. Sometimes — like our magnetic tape — there are performance features which are worth holding on to — trading speed for security and lower storage cost, for example. Sometimes there is a particular performance niche which the new technology cannot enter competitively — for example the persistence of fax machines in healthcare where they offer a secure and reliable way of transmitting sensitive information. At the limit we might argue that neither cash nor physical books are as ‘good’ as their digital rivals but their persistence points to other attributes which people continue to find valuable.

And sometimes it is about the underlying accumulated knowledge which the old technology represents — and which might be redeployed to advantage in a different field. Think of Fujifilm’s resurgence as a cosmetics and pharmaceuticals company on the back of its deep knowledge of emulsions and coatings. Technologies which it originally mastered in the now largely disappeared world of film photography. Or Kodak’s ability to offer high speed high quality printing on the back of knowledge it originally acquired in the same old industry — that of accurately spraying and targeting millions of droplets on to a surface. And it was 3M’s deep understanding of how to coat materials on to tapes gained originally from selling masking tape to the paint shops of Detroit which helped it move so effectively into the field of magnetic tape.

Keeping these technologies alive isn’t about putting them on life support; as the IBM example demonstrates it needs a commitment to incremental innovation, driving and optimising performance. And there’s still room for breakthroughs within those trajectories; in the case of magnetic tape storage it came in 2010 in the form of the Linear Tape File System (LTFS) open standard. This allowed tape drives to emulate the random access capabilities of their hard disk competitors, using metadata about the location of data stored on the tapes.
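The core LTFS idea (a self-describing index recording where each file sits on the tape, so the drive can seek straight to it rather than scanning sequentially) can be caricatured in a few lines. This is a toy sketch only, not the real LTFS on-tape format; the paths and block numbers are invented:

```python
# Toy model of a tape index: map each file path to its location on tape.
# Real LTFS stores an XML index in a dedicated tape partition; this just
# illustrates why an index makes sequential tape behave more like a
# (slow) random-access drive.

index = {
    "photos/2010/beach.jpg": {"start_block": 1_200, "blocks": 85},
    "backups/mail.tar":      {"start_block": 9_540, "blocks": 4_100},
}

def locate(path: str) -> int:
    """Return the tape block to seek to for a given file."""
    return index[path]["start_block"]

print(locate("backups/mail.tar"))  # 9540
```

With the index loaded into memory, the seek penalty is paid once per file retrieval rather than requiring a full scan of the tape.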

Whichever way you look at it there’s a need for innovation, whether bringing a breakthrough to an existing field or helping sustain a particular niche for the long haul. And we shouldn’t be too quick to write off ‘old’ technologies as new ones emerge which appear superior. It’s worth remembering that the arrival of the steamship didn’t wipe out the shipyards building sailing ships around the world; it actually spurred them on to a golden era of performance improvement which it took steamships a long time to catch up with.

So, there’s often a lot of life left in old dogs, especially when we can teach them some new innovative tricks.

You can find a podcast version of this here and a video version here

And if you’d like to learn with me take a look at my online course here


Humans, Not Technology, Drive Business Success

Humans, Not Technology, Drive Business Success

GUEST POST from Greg Satell

Silicon Valley is often known as a cut-throat, technocratic place where the efficiency of algorithms defines success. Competition is ferocious and the pace of disruption and change can be dizzying. It’s not the type of environment where soft skills are valued particularly highly, or even at all.

So, it’s somewhat ironic that Bill Campbell became a Silicon Valley legend by giving hugs and professing love to those he worked with. As coach to executives ranging from Steve Jobs to the entire Google executive team, Campbell preached and practiced a very personal style of business.

Yet while I was reading Trillion Dollar Coach in which former Google executives explain Campbell’s leadership principles, it became clear why he had such an impact. Even in Silicon Valley, technology will only take you so far. The success of a business ultimately depends on the success of the people in it. To compete over the long haul, that’s where you need to focus.

The Efficiency Paradox

In 1911, Frederick Winslow Taylor published The Principles of Scientific Management, based on his experience as a manager in a steel factory. It took aim at traditional management methods and suggested a more disciplined approach. Rather than have workers pursue tasks in their own manner, he sought to find “the one best way” and train accordingly.

Taylor wrote, “It is only through enforced standardization of methods, enforced adoption of the best implements and working conditions, and enforced cooperation that this faster work can be assured. And the duty of enforcing the adoption of standards and enforcing this cooperation rests with management alone.”

Before long, Taylor’s ideas became gospel, spawning offshoots such as scientific marketing, financial engineering and the Six Sigma movement. It was no longer enough to simply work hard, you had to measure, analyze and optimize everything. Over the years these ideas have become so central to business thinking that they are rarely questioned.

Yet management guru Henry Mintzberg has pointed out how a “by-the-numbers” depersonalized approach can often backfire. “Managing without soul has become an epidemic in society. Many managers these days seem to specialize in killing cultures, at the expense of human engagement.”

The evidence would seem to back him up. One study found that of 58 large companies that have announced Six Sigma programs, 91 percent trailed the S&P 500 in stock performance. That, in essence, is the efficiency paradox. When you manage only what you can measure, you end up ignoring key factors to success.

How Generosity Drives Innovation

While researching my book, Mapping Innovation, I interviewed dozens of top innovators. Some were world class scientists and engineers. Others were high level executives at large corporations. Still others were highly successful entrepreneurs. Overall, it was a pretty intimidating group.

So, I was surprised to find that, with few exceptions, they were some of the kindest and most generous people I have ever met. The behavior was so consistent that I felt that it couldn’t be an accident. So I began to research the matter further and found that when it comes to innovation, generosity really is a competitive advantage.

For example, one study of star engineers at Bell Labs found that the best performers were not the ones with the best academic credentials, but those with the best professional networks. A similar study of the design firm IDEO found that great innovators essentially act as brokers able to access a diverse array of useful sources.

A third study helps explain why knowledge brokering is so important. Analyzing 17.9 million papers, the researchers found that the most highly cited work tended to be largely rooted within a traditional field, but with just a smidgen of insight taken from some unconventional place. Breakthrough creativity occurs at the nexus of conventionality and novelty.

The truth is that the more you share with others, the more they’ll be willing to share with you and that makes it much more likely you’ll come across that random piece of information or insight that will allow you to crack a really tough problem.

People As Profit Centers

For many, the idea that innovation is a human centered activity is intuitively obvious. So it makes sense that the high-tech companies that Bill Campbell was involved in would work hard to create environments to attract the best and the brightest people. However, most businesses have much lower margins and have to keep a close eye on the bottom line.

Yet here too there is significant evidence that a human-focused approach to management can yield better results. In The Good Jobs Strategy MIT’s Zeynep Ton found that investing more in well-trained employees can actually lower costs and drive sales. A dedicated and skilled workforce results in less turnover, better customer service and greater efficiency.

For example, when the recession hit in 2008, Mercadona, Spain’s leading discount retailer, needed to cut costs. But rather than cutting wages or reducing staff, it asked its employees to contribute ideas. The result was that it managed to reduce prices by 10% and increased its market share from 15% in 2008 to 20% in 2012.

Its competitors maintained the traditional mindset. They cut wages and employee hours, which saved them some money, but customers found poorly maintained stores with few people to help them, which damaged their brands long-term. The cost savings Mercadona’s employees identified, on the other hand, in many cases improved service and productivity, and these gains persisted long after the crisis was over.

Management Beyond Metrics

The truth is that it’s easy to talk about putting people first, but much harder to do it in practice. Research suggests that once a group goes much beyond 200 people social relationships break down, so once a business gets beyond that point, it becomes natural to depersonalize management and focus on metrics.

Yet the best managers understand that it’s the people that drive the numbers. As legendary IBM CEO Lou Gerstner once put it, “Culture isn’t just one aspect of the game… It is the game. What does the culture reward and punish – individual achievement or team play, risk taking or consensus building?”

In other words, culture is about values. The innovators I interviewed for my book valued solving problems, so were enthusiastic about sharing their knowledge and expertise with others, who happily reciprocated. Mercadona valued its people, so when it asked them to find ways to save money during the financial crisis, they did so enthusiastically.

That’s why today, three years after his death, Bill Campbell remains a revered figure in Silicon Valley, because he valued people so highly and helped them learn to value each other. Management is not an algorithm. It is, in the final analysis, an intensely human activity and to do it well, you need to put people first.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


Innovation and the Silicon Valley Bank Collapse

Why It’s Bad News and Good News for Corporate Innovation

Innovation and the Silicon Valley Bank Collapse

GUEST POST from Robyn Bolton

Last week, as news of Silicon Valley Bank’s losses and eventual collapse took over the news cycle, attention understandably turned to the devastating impact on the startup ecosystem.

Prospects brightened a bit on Monday with news that the federal government would make all depositors whole. Startups, VCs, and others in the ecosystem would be able to continue operations and make payroll, and SVB’s collapse would be just another cautionary tale.

But the impact of SVB’s collapse isn’t confined to the startup ecosystem or the banking industry.

Its impact (should have) struck fear and excitement into the hearts of every executive tasked with growing their business.

Your Portfolio’s Risk Profile Just Changed

The early 2000s were the heyday of innovation teams and skunkworks, but as these internal efforts struggled to produce significant results, companies started looking beyond their walls for innovation. Thus began the era of Corporate Venture Capital (CVC).

Innovation, companies realized, didn’t need to be incubated. It could be purchased.

Often at a lower price than the cost of an in-house team.

And it felt less risky. After all, other companies were doing it and it was a hot topic in the business press. Plus, making investments felt much more familiar and comfortable than running small-scale experiments and questioning the status quo.

Between 2010 and 2020, the number of corporate investors increased more than 6x to over 4,000, investment ballooned to nearly $170B in 2021 (up 142% from 2020), and 1,317 CVC-backed deals were closed in Q1 of 2020.

But, with SVB’s collapse, the perceived risk of startup investing suddenly changed.

Now startups feel riskier. Venture Capital firms are pulling back, and traditional banks are prohibited from stepping forward to provide the venture debt many startups rely on. While some see this as an opportunity for CVC to step up, that optimism ignores the fact that companies are, by nature and necessity, risk averse and more likely to follow the herd than lead it.

Why This is Bad News

As CVC, Open Innovation, and joint ventures became the preferred path to innovation and growth, internal innovation shifted to events – hackathons, shark tanks, and Silicon Valley field trips.

Employees were given the “freedom” to innovate within a set time and maybe even some training on tools like Design Thinking and Lean Startup. But behind closed doors, executives spoke of these events as employee retention efforts, not serious efforts to grow the business or advance critical strategies.

Employees eventually saw these events for what they were – innovation theater, activities designed to appease them and create feel-good stories for investors. In response, employees either left for places where innovation (or at least the curiosity and questions required) was welcomed, or they stayed, wiser and more cynical about management’s true intentions.

Then came the pandemic and a recession. Companies retreated further into themselves, focused more on core operations, and cut anything that wouldn’t generate financial results in 12 months or less.

Innovation muscles atrophied.

Just at the moment they need to be flexed most.

Why This is Good News

As the risk of investment in external innovation increases, companies will start looking for other ways to innovate and grow. Ways that feel less risky and give them more control.

They’ll rediscover Internal Innovation.

This is the silver lining of the dark SVB cloud – renewed investment in innovation, not as an event or activity to appease employees, but as a strategic tool critical to delivering strategic priorities and accelerating growth.

And, because this is our 2nd time around, we know it’s not about internal innovation teams OR external partners/investments. It’s about internal innovation teams AND external partners/investments.

Both are needed, and both can be successful if they:

  1. Are critical enablers of strategic priorities
  2. Pursue realistic goals (stretch, don’t splatter!)
  3. Receive the people and resources required to deliver against those goals
  4. Are empowered to choose progress over process
  5. Are supported by senior leaders with words AND actions

What To Do Now

When it comes to corporate innovation teams, many companies are starting from nothing. Some companies have files and playbooks they can dust off. A few have 1 or 2 people already working.

Whatever your starting point is, start now.

Just do me one favor. When you start pulling the team together, remember LL Cool J, “Don’t call it a comeback, I been here for years.”

Image credit: Wikimedia Commons


Just Because We Can, Doesn’t Mean That We Should!

Just Because We Can, Doesn’t Mean That We Should!

GUEST POST from Pete Foley

An article on innovation from the BBC caught my eye this week. https://www.bbc.com/news/science-environment-64814781. After extensive research and experimentation, a group in Spain has worked out how to farm octopus. It’s clever innovation, but also comes with some ethical questions. The solution involves forcing highly intelligent, sentient animals together in unnatural environments, and then killing them in a slow, likely highly stressful way. And that triggers something that I believe we need to always keep front and center in innovation: Just Because We Can, Doesn’t Mean That We Should!

Pandora’s Box

It’s a conundrum for many innovations. Change opens Pandora’s Box, and with new possibilities come unknowns, new questions, new risks and sometimes, new moral dilemmas. And because our modern world is so complex, interdependent, and evolves so quickly, we can rarely fully anticipate all of these consequences at conception.

Scenario Planning

In most fields we routinely try to anticipate technical challenges, running all sorts of stress, stability and consumer tests in an effort to uncover potential problems. We often still miss stuff, especially when it’s difficult to place prototypes into realistic situations. Phones still catch fire, Hyundais can be surprisingly easy to steal, and airbags sometimes do more harm than good. But experienced innovators, while not perfect, tend to be pretty good at catching many of the worst technical issues.

Another Innovator’s Dilemma

Octopus farming doesn’t, as far as I know, have technical issues, but it does raise serious ethical questions. And these can sometimes be hard to spot, especially if we are very focused on technical challenges. I doubt that the innovators involved in octopus farming are intrinsically bad people intent on imposing suffering on innocent animals. But innovation requires passion, focus and ownership. Love is Blind, and innovators who’ve invested themselves into a project are inevitably biased, and often struggle to objectively view the downsides of their invention.

And this of course has far broader implications than octopus farming. The moral dilemma of innovation and unintended consequences has been brought into sharp focus by recent advances in AI. In this case the stakes are much higher. Stephen Hawking and many others expressed concern that while AI has the potential to provide incalculable benefits, it also has the potential to end the human race. While I personally don’t see ChatGPT as Armageddon, it is certainly evidence that Pandora’s Box is open, and none of us really knows how it will evolve, for better or worse.

What Are Our Solutions?

So what can we do to avoid doing more harm than good? Do we need an innovator’s equivalent of the Hippocratic Oath? Should we as a community commit to do no harm, and somehow hold ourselves accountable? It’s not a bad idea in theory, but how could we practically do that? Innovation and risk go hand in hand: we often don’t know how an innovation will operate in the real world, and often don’t recognize the killer application of a new technology until later. And if we were to eliminate most risk from innovation, we’d also eliminate most progress. That said, I do believe how we balance progress and risk is something we need to discuss more, especially in light of the extraordinary rate of technological innovation we are experiencing, the potential size of its impact, and the increasing difficulty of predicting outcomes as the pace of change accelerates.

Can We Ever Go Back?

Another issue is that often the choice is not simply ‘do we do it or not’, but instead ‘who does it first’? Frequently it’s not so much our ‘brilliance’ that creates innovation. Instead, it’s simply that all the pieces have just fallen into place and are waiting for someone to see the pattern. From calculus onwards, the history of innovation is replete with examples of parallel discovery, where independent groups draw the same conclusions from emerging data at about the same time.

So parallel to the question of ‘should we do it?’ is ‘can we afford not to?’ Perhaps the most dramatic example of this was the nuclear bomb. For the team working on the Manhattan Project, it must have been ethically agonizing to create something that could cause so much human suffering. But context matters, and the Allies at the time were in a tight race with the Nazis to create the first nuclear bomb, the path to which had already been sketched out by discoveries in physics earlier that century. The potential consequences of losing that race were even more horrific than those of winning it. An ethical dilemma of brutal proportions.

Today, as the pace of change accelerates, we face a raft of rapidly evolving technologies with the potential for enormous good or catastrophic damage, and for which Pandora’s Box is already cracked open. AI is one, of course, but there are many others. On the technical side we have bio-engineering, gene manipulation, ecological manipulation, blockchain and even space innovation. All of these have the potential to do both great good and great harm. And to add to the conundrum, even if we decided to shut down risky avenues of innovation, there is zero guarantee that others would not pursue them. On the contrary, bad players are more likely to pursue ethically dubious avenues of research.

Behavioral Science

And this conundrum is not limited to technical innovations. We are also making huge strides in understanding how people think and make decisions. This is superficially more subtle than AI or bio-manipulation, but as a field I’m close to, it’s also deeply concerning, and carries similar potential to do great good or cause great harm. Public opinion is one of the few tools we have to help curb misuse of technology, especially in democracies. But behavioral science gives us increasingly effective ways to influence and nudge human choices, often without people being aware they are being nudged. In parallel, technology has given us unprecedented capability to leverage that knowledge via the internet and social media.

There has always been a potential moral dilemma associated with manipulating human behavior, especially below the threshold of consciousness. It’s been a concern since the idea of subliminal advertising emerged in the 1950s. But technical innovation has created a far more influential infrastructure than the 1950s movie theater. We now spend a significant portion of our lives online, and techniques such as memes, framing, managed choice architecture and leveraging mere exposure provide the potential to manipulate opinions and emotional engagement more profoundly than ever before. And the stakes have gotten higher, with political advertising, at least in the USA, often eclipsing more traditional consumer goods marketing in sheer volume. It’s one thing to nudge someone between Coke and Pepsi, but quite another to use unconscious manipulation to drive preference in narrowly contested political races with significant socio-political implications.

There is no doubt we can use behavioral science for good, whether it’s helping people eat better, save more for retirement, drive more carefully, or in many other situations where the benefit/paternalism trade-off is pretty clear. But especially in socio-political contexts, where do we draw the line, and who decides where that line is? In our increasingly polarized society, without some oversight, it’s all too easy for well-intentioned and passionate people to go too far, in the worst case flirting with propaganda, and thus potentially enabling damaging or even dangerous policy.

What Can or Should We Do?

We spend a great deal of energy and money trying to find better ways to research and anticipate both the effectiveness and the potential unintended consequences of new technology. But with a few exceptions, we tend to spend less time discussing the moral implications of what we do. As the pace of innovation accelerates, does the innovation community need to adopt some form of ‘do no harm’ Hippocratic Oath? Or do we need to think more about educating, training, and putting processes in place to anticipate the ethical downsides of technology?

Of course, we’ll never anticipate everything. We didn’t have the background knowledge to anticipate that the invention of the internal combustion engine would seriously impact the world’s climate. Instead, we were mostly just relieved that projections of cities buried under horse poop would not come to pass.

But other innovations brought issues we might have seen coming with a bit more scenario planning. Airbags initially increased deaths of children in automobile accidents, while Prohibition in the US increased both crime and alcoholism. Hindsight is of course very clear, but could a little more foresight have anticipated these? Perhaps my favorite example of unintended consequences is the ‘Cobra Effect’. The British in India were worried about the number of venomous cobras, so they introduced a bounty for every dead cobra. Initially successful, this ultimately led to the breeding of cobras for bounty payments. On learning this, the British scrapped the reward, and cobra breeders then set the now-worthless snakes free. The result was more cobras than at the start. It’s amusing now, but it also illustrates the often significant gap between foresight and hindsight.

I certainly don’t have the answers. But as we stack up world-changing technologies in increasingly complex, dynamic and unpredictable contexts, and as financial rewards often favor speed over caution, do we as an innovation community need to start thinking more about societal and moral risk? And if so, how could, or should, we go about it?

I’d love to hear the opinions of the innovation community!

Image credit: Pixabay
