
4 Things Leaders Must Know About Artificial Intelligence and Automation

GUEST POST from Greg Satell

In 2011, MIT economists Erik Brynjolfsson and Andrew McAfee self-published an unassuming e-book titled Race Against The Machine. It quickly became a runaway hit. Before long, the two signed a contract with W. W. Norton & Company to publish a full-length version, The Second Machine Age, which became an immediate bestseller.

The subject of both books was how “digital technologies are rapidly encroaching on skills that used to belong to humans alone.” Although the authors were careful to point out that automation is nothing new, they argued, essentially, that at some point a difference in scale becomes a difference in kind, and they forecast that we were close to hitting a tipping point.

In recent years, their vision has come to be seen as deterministic and apocalyptic, with humans struggling to stay relevant in the face of a future ruled by robot overlords. There’s no evidence that’s true. The future, in fact, will be driven by humans collaborating with other humans to design work for machines to create value for other humans.

1. Automation Doesn’t Replace Jobs, It Replaces Tasks

When a new technology appears, we always seem to assume that its primary value will be to replace human workers and reduce costs, but that’s rarely true. For example, when automatic teller machines first appeared in the early 1970s, most people thought they would lead to fewer branches and tellers, but actually just the opposite happened.

What really happens is that as a task is automated, it becomes commoditized and value shifts somewhere else. That’s why today, as artificial intelligence is ramping up, we increasingly find ourselves in a labor shortage. Most tellingly, the shortage is especially acute in manufacturing, where automation is most pervasive.

That’s why the objective of any viable cognitive strategy is not to cut costs, but to extend capabilities. For example, when simple customer service tasks are automated, that can free up time for human agents to help with thornier issues. In much the same way, when algorithms can do much of the analytical grunt work, human executives can focus on long-term strategy, which computers tend not to do so well.

The winners in the cognitive era will not be those who can reduce costs the fastest, but those who can unlock the most value over the long haul. That will take more than simply implementing projects. It will require serious thinking about what your organization’s mission is and how best to achieve it.

2. Value Never Disappears, It Just Shifts To Another Place

In 1900, 30 million people in the United States were farmers, but by 1990 that number had fallen to under 3 million even as the population more than tripled. So, in a manner of speaking, 90% of American agricultural workers lost their jobs, mostly due to automation. Still, the twentieth century was seen as an era of unprecedented prosperity.

We’re in the midst of a similar transformation today. Just as our ancestors toiled in the fields, many of us today spend much of our time doing rote, routine tasks. Yet, as two economists from MIT explain in a paper, the jobs of the future are not white collar or blue collar, but those focused on non-routine tasks, especially those that involve other humans.

Far too often, however, managers fail to recognize value hidden in the work their employees do. They see a certain job description, such as taking an order in a restaurant or answering a customer’s call, and see how that task can be automated to save money. What they don’t see, however, is the hidden value of human interaction often embedded in many jobs.

When we go to a restaurant, we want somebody to take care of us (which is why we didn’t order takeout). When we have a problem with a product or service, we want to know somebody cares about solving it. So the most viable strategy is not to cut jobs, but to redesign them to leverage automation to empower humans to become more effective.

3. As Machines Learn To Think, Cognitive Skills Are Being Replaced By Social Skills

Twenty or thirty years ago, the world was very different. High-value work generally involved retaining information and manipulating numbers. Perhaps not surprisingly, education and corporate training programs were focused on building those skills, and people would build their careers on performing well on knowledge and quantitative tasks.

Today, however, an average teenager has more access to information and computing power than even a large enterprise had a generation ago. Knowledge retention and quantitative ability have largely been automated and devalued, and high-value work has shifted from cognitive skills to social skills.

To take just one example, the journal Nature has noted that the average scientific paper today has four times as many authors as one did in 1950 and the work they are doing is far more interdisciplinary and done at greater distances than in the past. So even in highly technical areas, the ability to communicate and collaborate effectively is becoming an important skill.

There are some things that a machine will never do. Machines will never strike out at a Little League game, have their hearts broken or see their children born. That makes it difficult, if not impossible, for machines to relate to humans as well as a human can.

4. AI Is A Force Multiplier, Not A Magic Box

The science fiction author Arthur C. Clarke noted that “Any sufficiently advanced technology is indistinguishable from magic” and that’s largely true. So when we see a breakthrough technology for the first time, such as when IBM’s Watson system beat top human players at Jeopardy!, many of us immediately begin imagining all the magical possibilities that could be unleashed.

Unfortunately, that always leads to trouble. Many firms raced to implement AI applications without understanding them and were immediately disappointed that the technology was just that — technology — and not actually magic. Besides wasting resources, these projects were also missed opportunities to implement something truly useful.

As Josh Sutton, CEO of Agorai, a platform that helps companies build AI applications for their business, put it, “What I tell business leaders is that AI is useful for tasks you understand well enough that you could do them if you had enough people and enough time, but not so useful if you couldn’t do it with more people and more time. It’s a force multiplier, not a magic box.”

So perhaps most importantly, what business leaders need to understand about artificial intelligence is that it is not inherently utopian or apocalyptic, but a business tool. Much like any other business tool, its performance is largely dependent on context, and it is a leader’s job to help create that context.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Mystery of Stonehenge Solved

by Braden Kelley

Forget about capturing and reverse engineering alien spacecraft to gain a competitive edge in the innovation race. Sorry, but the universe is billions of years old, and even if some extraterrestrial civilization millions or billions of years older than our own managed to travel here from halfway across the galaxy and crash, it is very likely that we would be incapable of reverse engineering their technology.

Why?

When the United States captures a downed enemy aircraft, we can reverse engineer it because at its core it is still an aircraft, made of materials similar to those we use and built using similar manufacturing processes. This means we already have the capabilities to build something similar; we just need a physical example or blueprints of the aircraft.

But, when you are talking about something made using technology thousands, millions, or billions of years more advanced than our own, it becomes less likely that we would be able to reverse engineer found technology. This is because there would likely be materials involved that we haven’t discovered yet, either entirely new elements on the periodic table or alloys that we don’t yet know how to make. Imagine what would happen if a slightly damaged Apollo-era Saturn V rocket suddenly appeared circa 50 AD next to the Pantheon in Rome. How long would it be before the Romans would be able to fly to the moon?

If a large, and overdue, solar event were to occur and destroy all of our electricity-based technology, how long would it take for us to be able to achieve spaceflight again?

Apocalypse Innovation

There is no doubt that human beings developed a different set of technologies prior to the last great apocalypse and most of this knowledge has been lost through time, warfare, and 400 feet of water or 20 feet of earth. Only tall stone constructions away from prehistoric coastlines or items locked away in dry underground vaults survived. History and technology are incredibly perishable.

Twelve thousand years later, we have accomplished some pretty remarkable things, and ground penetrating radar is giving us new insight into the scope and scale of pre-apocalypse societies hidden undersea and underground.

But, there are a great many mysteries from the ancient world that we are still struggling to reverse engineer. From the pyramids to Stonehenge, people are hypothesizing a number of ways these monuments may have been built and what their true purpose might have been.

Nine years ago, researchers from the University of Amsterdam determined that the blocks of stone moved around the Giza plateau on sledges would have moved more easily if someone went ahead of them wetting the sand.

Eleven years ago, American Wally Wallington of Michigan showed in a YouTube video how he could move stones weighing more than a ton up to 300 feet per hour and then stand them up vertically all by himself.

He didn’t invent some amazing new piece of technology to do this, but instead eschewed modern technology and showed how he could do it using basic principles of physics and gravity. First, let’s look at the video, and then we’ll talk about what the apocalypse innovation exercise is:

The apocalypse innovation exercise is one way of challenging orthodoxies and is quite simple:

  1. Identify a technology or input that is key to your product or service achieving its goal
  2. Concoct a simple reason why this technology no longer functions or this input is no longer available
  3. Have the group begin to ideate alternative inputs that could be used or alternate technologies that could be leveraged or developed to make the product or service achieve its goal again (If you are looking for a new technology, what are the first principles that you could go back to? And what are the other technology paths you could explore instead? – i.e. acoustic levitation instead of electromagnetic levitation)
  4. Pick one from the list of available options
  5. Re-engage the group to backcast what it will take to replace the existing technology or input with this new one (NOTE: backcasting is the practice of working backwards to show how an outcome will be achieved)
  6. Sketch out how the product or service will change as result of using this new technology or input
  7. Brainstorm ways that this change can be positioned as a benefit for customers

Apocalypse innovation can be a valuable exercise for products or services approaching the upper flattening of the traditional ‘S’ curve that pretty much all innovations go through, and it represents one way to reach the steeper part of a new ‘S’ curve.

What other exercises do you like to use to help people challenge orthodoxies?

If you’d like to sign up to learn more about my new FutureHacking™ methodology and set of tools, go here.



Moneyball and the Beginning, Middle, and End of Innovation

GUEST POST from Robyn Bolton

Recently, pitchers and catchers reported to MLB Spring Training facilities in Florida and Arizona.  For baseball fans, this is the first sign of Spring, an occasion that heralds months of warmth and sunshine, ballparks filled (hopefully) with cheering fans, dinners of beers and brats, and the undying belief that this year will be the year.

Of course, there was still a lot of dark, dreary cold between then and Opening Day. Perfect weather for watching baseball movies – Bull Durham, Major League, The Natural, Field of Dreams, and, of course, Moneyball.

Moneyball is based on the book of the same name by Michael Lewis and chronicles the 2002 Oakland Athletics season. The ’02 Oakland A’s, led by General Manager Billy Beane (played by Brad Pitt), forever changed baseball by adopting an approach that valued rigorous statistical analysis over the collective wisdom of baseball insiders (coaches, scouts, front office personnel) when building a team. This approach, termed “Moneyball,” enabled the A’s to reach the postseason with a team that cost only $44M in salary, compared to the New York Yankees, who spent $125M to achieve the same outcome.

While the whole movie (and book) is a testament to the courage and perseverance required to challenge and change the status quo, time and again I come back to three lines that perfectly sum up the journey of every successful intrapreneur I’ve ever met.

The Beginning

“I know you’ve taken it in the teeth out there, but the first guy through the wall…he always gets bloody…always, always gets bloody. This is threatening not just a way of doing business… but in their minds, it’s threatening the game. Really what it’s threatening is their livelihood, their jobs. It’s threatening the way they do things… and every time that happens, whether it’s the government, a way of doing business, whatever, the people who are holding the reins – they have their hands on the switch – they go batshit crazy.”

John Henry, Owner of the Boston Red Sox

Context

The 2002 season is over, and the A’s have been eliminated in the first round of the playoffs. John Henry, an owner of the Boston Red Sox, has invited Billy Beane to Boston to offer him the Red Sox GM job.

Lesson

This is what you sign up for when you decide to be an Intrapreneur.  The more you challenge the status quo, the more you question how business is done, the more you ask Why and demand an answer, the closer you get to “tak(ing) it in the teeth.”

This is why courage, perseverance, and an unshakeable belief that things can and should be better are absolutely essential for intrapreneurs.  Your job is to run at the wall over and over until you get through it.

People will follow.  The Red Sox did.  They won the World Series in 2004, breaking an 84-year-old curse.

The Middle

“It’s a process, it’s a process, it’s a process”

Billy Beane

Context

Billy has to convince the ballplayers to forget all the habits that made them great and embrace the philosophy of Moneyball. To stop stealing bases, turning double plays on bunts, and swinging for the fences, and to start taking walks, throwing to first for the easy out, and prioritizing getting on base over hitting a home run.

The players are confused and frustrated.  Suddenly, everything that they once did right is wrong and what was not valued is deeply prized.

Lesson

Innovation is something new that creates value.  Something new doesn’t just require change, it requires people to stop doing things that work and start doing things that seem strange or even wrong.

Change doesn’t happen overnight.  It’s not a switch to be flipped.  It’s a process to be learned.  It takes time, practice, reminders, and patience.

The End

“When you get an answer you’re looking for, hang up.”

Billy Beane

Context

In this scene, Billy has offered one of his players to multiple teams, searching for the best deal. When the phone rings with a deal he likes, he and the other General Manager (GM) agree to it, and Billy hangs up, even though the other GM is in the middle of a sentence. When Peter Brand, the Assistant GM played by Jonah Hill, points out that Billy just hung up on the other GM, Billy responds with this nugget of wisdom.

Lesson

It’s advice intrapreneurs should take very much to heart. I often see innovation teams walk into management meetings with long presentations, full of data and projections, anxious to share their progress and hoping for continued funding and support. When the meeting starts, a senior exec will say something like, “We’re excited by the progress we’re hearing about and what it will take to continue.”

That’s the cue to “hang up.”

Instead of starting the presentation from the beginning, start with “what it will take to continue.” You got the answer you were looking for – they’re excited about the progress you’ve made – so don’t spend time giving them the info they already have or, worse, info that could raise questions and dim their enthusiasm. Hang up on the conversation you want to have and have the conversation they want to have.

In closing

Moneyball was an innovation that fundamentally changed one of the most tradition-bound businesses in sports. To be successful, it required someone willing to take it in the teeth, to coach people through a process, and to hang up when they got the answer they wanted. It wasn’t easy, but real change rarely is.

The same is true in corporations. They need their own Billy Beanes.

Are you willing to step up to the plate?

Image credits: Pixabay


3 Examples of Why Innovation is a Leadership Problem

Through the Looking Glass

GUEST POST from Robyn Bolton

Do you sometimes feel like you’re living in an alternate reality?

If so, you’re not alone.  Most innovators feel that way at some point.

After all, you see things that others don’t.

Question things that seem inevitable and true.

Make connections where others only see differences.

Do things that seem impossible.

It’s easy to believe that you’re the crazy one, the Mad Hatter and permanent resident of Wonderland.

But what if you’re not the crazy one?

What if you’re Alice?

And you’re stepping through the looking glass every time you go to work?

In Lewis Carroll’s book, the other side of the looking glass is a chessboard, and all its inhabitants are chess pieces that move in defined and prescribed ways, follow specific rules, and achieve defined goals.  Sound familiar?

Here are a few other things that may sound familiar, too

“The rule is, jam tomorrow and jam yesterday – but never jam today.” – The White Queen

In this scene, the White Queen offers to hire Alice as her lady’s maid and pay her “twopence a week and jam every other day.”  When Alice explains that she doesn’t want the job, doesn’t like jam, and certainly doesn’t want jam today, the queen scoffs and explains the rule.

The problem, Alice points out, is that it’s always today, and that means there’s never jam.

Replace “jam” with “innovation,” and this hits a little too close to home for most innovators.

How often do you hear about the “good old days” when the company was more entrepreneurial, willing to experiment and take risks, and encouraged everyone to innovate?

Innovation yesterday.

How often do you hear that the company will invest in innovation, restart its radical innovation efforts, and disrupt itself as soon as the economy rebounds, business improves, and things settle down a bit?  Innovation tomorrow.

But never innovation today.  After all, “it’s [innovation] every other day: today isn’t any other day, you know.”

“When I use a word, it means just what I choose it to mean – neither more nor less.” – Humpty Dumpty

In this scene, poor Alice tries to converse with Humpty Dumpty, but he keeps using the “wrong” words.  Except they’re not the wrong words because they mean exactly what he chooses them to mean.

Even worse, when Alice asks Humpty to define confusing terms, he gets angry, speaks in a “scornful tone,” and smiles “contemptuously” before “wagging his head gravely from side to side.”

We all know what the words we use mean, and we too often assume others share our definitions. We use “innovation” and “growth,” assuming people know what we mean. But they don’t. They know what the words mean to them. And that may or may not be what we mean.

When managers encourage people to share ideas, challenge the status quo, and take risks, things get even trickier.  People listen, share ideas, challenge the status quo, and take risks.  Then they are confused when management doesn’t acknowledge their efforts.  No one realizes that those requests meant one thing to the managers who gave them and a different thing to the people who did them.

“It takes all the running you can do, to keep in the same place.  If you want to go somewhere else, you must run at least twice as fast as that!” – The Red Queen

In this scene, the Red Queen introduces life on the other side of the looking glass and explains Alice’s new role as a pawn.  Of course, the explanation comes after a long sprint that seems to get them nowhere and only confuses Alice more.

When “tomorrow” finally comes, and it’s time for innovation, it often comes with a mandate to “act with urgency” to avoid falling behind.  I’ve seen managers set goals of creating and launching a business with $250M revenue in 3 years and leadership teams scrambling to develop a portfolio of businesses that would generate $16B in 10 years.

Yes, the world is moving faster, so companies need to increase the pace at which they operate and innovate.  But if you’re doing all you can, you can’t do twice as much.  You need help – more people and more funding, not more meetings or oversight.

“Life, what is it but a dream?”

Managers and executives, like the kings and queens, have roles to play.  They live in a defined space, an org chart rather than a chessboard, and they do their best to navigate it following rules set by tradition, culture, and HR.

But you are like Alice.  You see things differently.  You question what’s taken as given.  And, every now and then, you probably want to shake someone until they grow “shorter – and fatter – and softer – and rounder – and…[into] a kitten, after all.”

So how do you get back to reality and bring everyone with you?  You talk to people.  You ask questions and listen to the answers.  You seek to understand their point of view and then share yours.

Some will choose to stay where they are.

Some will choose to follow you back through the looking glass.

They will be the ones who transform a leadership problem into a leadership triumph.

Image credits: Pixabay


The End of the Digital Revolution

Here’s What You Need to Know

GUEST POST from Greg Satell

The history of digital technology has largely been one of denial followed by disruption. First came the concept of the productivity paradox, which noted the limited economic impact of digital technology. When e-commerce appeared, many doubted that it could ever compete with physical retail. Similar doubts were voiced about digital media.

Today, it’s hard to find anyone who doesn’t believe in the power of digital technology. Whole industries have been disrupted. New applications driven by cloud computing, artificial intelligence and blockchain promise even greater advancement to come. Every business needs to race to adopt them in order to compete for the future.

Ironically, amid all this transformation the digital revolution itself is ending. Over the next decade, new computing architectures will move to the fore and advancements in areas like synthetic biology and materials science will reshape entire fields, such as healthcare, energy and manufacturing. Simply waiting to adapt won’t be enough. The time to prepare is now.

1. Drive Digital Transformation

As I explained in Mapping Innovation, innovation is never a single event, but a process of discovery, engineering and transformation. Clearly, with respect to digital technology, we are deep into the transformation phase. So the first part of any post-digital strategy is to accelerate digital transformation efforts in order to improve your competitive position.

One company that’s done this very well is Walmart. As an old-line incumbent in the physical retail industry, it appeared to be ripe for disruption as Amazon reshaped how customers purchased basic items. Why drive out to a Walmart store for a package of toothpaste when you can just click a few buttons on your phone?

Yet rather than ceding the market to Amazon, Walmart has invested heavily in digital technology and has achieved considerable success. It wasn’t any one particular tactic or strategy that made the difference, but rather the acknowledgment that every single process needed to be reinvented for the digital age. For example, the company is using virtual reality to revolutionize how it does in-store training.

Perhaps most of all, leaders need to understand that digital transformation is human transformation. There is no shortage of capable vendors that can implement technology for you. What’s key, however, is to shift your culture, processes and business model to leverage digital capabilities.

2. Explore Post-Digital Technologies

While digital transformation is accelerating, advancement in the underlying technology is slowing down. Moore’s law, the consistent doubling of computer chip performance over the last 50 years, is nearing its theoretical limits. It has already slowed down considerably and will soon stop altogether. Yet there are non-digital technologies under development that will be far more powerful than anything we’ve ever seen before.

Consider Intel, which sees its future in what it calls heterogeneous computing, combining traditional digital chips with non-digital architectures, such as quantum and neuromorphic. A couple of years ago it announced its Pohoiki Beach neuromorphic system, which processes information up to 1,000 times faster and 10,000 times more efficiently than traditional chips for certain tasks.

IBM has created a network to develop quantum computing technology, which includes research labs, startups and companies that seek to be early adopters of the technology. Like neuromorphic computing, quantum systems have the potential to be thousands, if not millions, of times more powerful than today’s technology.

The problem with these post-digital architectures is that no one really knows how they are going to work. They operate on a very different logic than traditional computers and will require new programming languages and algorithmic strategies. It’s important to start exploring these technologies now or you could find yourself years behind the curve.

3. Focus on Atoms, Not Bits

The digital revolution created a virtual world. My generation was the first to grow up with video games, and our parents worried that we were becoming detached from reality. Then computers entered offices, and Dan Bricklin created VisiCalc, the first spreadsheet program. Eventually smartphones and social media appeared, and we began spending almost as much time in the virtual world as we did in the physical one.

Essentially, what we created was a simulation economy. We could experiment with business models in our computers, find flaws and fix them before they became real. Computer-aided design (CAD) software allowed us to design products in bits before we got down to the hard work of shaping atoms. Because it’s much cheaper to fail in the virtual world than the physical one, this made our economy much more efficient.

Yet the next great transformation will be from bits to atoms. Digital technology is creating revolutions in things like genomics and materials science. Artificial intelligence and cloud computing are reshaping fields like manufacturing and agriculture. Quantum and neuromorphic computing will accelerate these trends.

Much like those new computing architectures, the shift from bits to atoms will create challenges. Applying the simulation economy to the world of atoms will require new skills and we will need people with those skills to move from offices in urban areas to factory floors and fields. They will also need to learn to collaborate effectively with people in those industries.

4. Transformation is Always a Journey, Never a Destination

The 20th century was punctuated by two waves of disruption. The first, driven by electricity and internal combustion, transformed almost every facet of daily life and kicked off a 50-year boom in productivity. The second, driven by the microbe, the atom and the bit, transformed fields such as agriculture, healthcare and management.

Each of these technologies followed the pattern of discovery, engineering and transformation. The discovery phase takes place mostly out of sight, with researchers working quietly in anonymous labs. The engineering phase is riddled with errors, as firms struggle to shape abstract concepts into real products. A nascent technology is easy to ignore, because its impact hasn’t been felt yet.

The truth is that disruption doesn’t begin with inventions, but when an ecosystem emerges to support them. That’s when the transformation phase begins and takes us by surprise, because transformation never plays out like we think it will. The future will always be, to a certain extent, unpredictable for the simple reason that it hasn’t happened yet.

Today, we’re on the brink of a new era of innovation that will be driven by new computing architectures, genomics, materials science and artificial intelligence. That’s why we need to design our organizations for transformation by shifting from vertical hierarchies to horizontal networks.

Most of all, we need to shift our mindsets from seeing transformation as a set of discrete objectives to seeing it as a continuous journey of discovery. Digital technology has only been one phase of that journey. The most exciting things are still yet to come.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Is China Our New Sputnik Moment?

GUEST POST from Greg Satell

When the Soviets launched Sputnik, the first space satellite, into orbit in 1957, it was a wake-up call for America. Over the next year, President Eisenhower would sign the National Defense Education Act to spur science education, increase funding for research and establish NASA and DARPA to spur innovation.

A few years ago, a report by the Council on Foreign Relations (CFR) argued that we are at a similar point today, but with China. While we have been steadily decreasing federal investment in R&D over the past few decades, our Asian rival has been ramping up and now threatens our leadership in key technologies such as AI, genomics and quantum information technology.

Clearly, we need to increase our commitment to science and innovation and that means increasing financial investment. However, what the report makes clear is that money alone won’t solve the problem. We are, in several important ways, actually undermining our ability to innovate, now and in the future. We need to renew our culture of innovation in America.

Educating And Attracting Talent

The foundation of an innovation economy is education, especially in STEM subjects. Historically, America has been the world’s best educated workforce, but more recently we’ve fallen to fifth among OECD countries for post-secondary education. That’s alarming and something we will certainly need to reverse if we are to compete effectively.

Our educational descent can be attributed to three major causes. First, the rest of the world has become more educated, so the competition has become stiffer. Second is financing: tuition has nearly tripled in the last decade, and student debt has become so onerous that it now takes about 20 years to pay off four years of college. Third, we need to work harder to attract talented people to the United States.

The CFR report recommends developing a “21st century National Defense Education Act” to create scholarships in STEM areas and making it easier for foreign students to get Green Cards when they graduate from our universities. It also points out that we need to work harder to attract foreign talent, especially in high impact areas like AI, genomics and quantum computing.

Unfortunately, we seem to be going the other way. The number of international students at American universities is declining. Policies like the Muslim ban and concerns about gun violence are deterring scientific talent from coming here. The denial rate for H1-B visa applications has increased from 4% in 2016 to 18% in the first quarter of 2019.

Throughout our history, it has been our openness to new people and new ideas that has made America exceptional. It’s a legitimate question whether that’s still true.

Building Technology Ecosystems

In the 1980s, the US semiconductor industry was on the ropes. Due to increased competition from low-cost Japanese manufacturers, American market share in the DRAM market fell from 70% to 20%. The situation not only had a significant economic impact, there were also important national security implications.

The federal government responded with two initiatives, the Semiconductor Research Corporation and SEMATECH, both of which were nonprofit consortiums that involved government, academia and industry. By the 1990s, American semiconductor manufacturers were thriving again.

Today, we have similar challenges with rare earth elements, battery technology and many manufacturing areas. The Obama administration responded by building similar consortiums to those that were established for semiconductors: The Critical Materials Institute for rare earth elements, JCESR for advanced batteries and the 14 separate Manufacturing Institutes.

Yet here again, we seem to be backsliding. The current administration has sought to slash funding for the Manufacturing Extension Partnership that supports small and medium-sized producers. An addendum to the CFR report also points out that the administration has pushed for a 30% cut in funding for the national labs, which support much of the advanced science critical to driving American technology forward.

Supporting International Trade and Alliances

Another historical strength of the US economy has been our open approach to trade. The CFR report points out that our role as a “central node in a global network of research and development,” gave us numerous advantages, such as access to foreign talent at R&D centers overseas, investment into US industry and cooperative responses to global challenges.

However, the report warns that “the Trump administration’s indiscriminate use of tariffs against China, as well as partners and allies, will harm U.S. innovative capabilities.” It also faults the Trump administration for pulling out of the Trans-Pacific Partnership trade agreement, which would have bolstered our relationship with Asian partners and increased our leverage over China.

The tariffs undermine American industry in two ways. First, because many of the tariffs are on intermediate goods which US firms use to make products for export, we’re undermining our own competitive position, especially in manufacturing. Second, because trade partners such as Canada and the EU have retaliated against our tariffs, our position is weakened further.

Clearly, we compete in an ecosystem driven world in which power does not come from the top, but emanates from the center. Traditionally, America has positioned itself at the center of ecosystems by constantly connecting out. Now that process seems to have reversed itself and we are extremely vulnerable to others, such as China, filling the void.

We Need to Stop Killing Innovation in America

The CFR report, whose task force included such luminaries as Admiral William McRaven, former Google CEO Eric Schmidt and economist Laura Tyson, should set alarm bells ringing. Although the report was focused on national security issues, it pertains to general competitiveness just as well and the picture it paints is fairly bleak.

After World War II, America stood almost alone in the world in terms of production capacity. Through smart policy, we were able to transform that initial advantage into long-term technological superiority. Today, however, we have stiff competition in areas ranging from AI to synthetic biology to quantum systems.

At the same time, we seem to be doing everything we can to kill innovation in America. Instead of working to educate and attract the world’s best talent, we’re making it harder for Americans to attain higher education and for top foreign talent to come and work here. Instead of ramping up our science and technology programs, presidential budgets regularly recommend cutting them. Instead of pulling our allies closer, we are pushing them away.

To be clear, America is still at the forefront of science and technology, vying for leadership in every conceivable area. However, as global competition heats up and we need to be redoubling our efforts, we seem to be doing just the opposite. The truth is that our prosperity is not a birthright to which we are entitled, but a legacy that must be lived up to.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Rethinking Customer Journeys

GUEST POST from Geoffrey A. Moore

Customer journeys are a mainstay of modern marketing programs. Unfortunately, for most companies, they are pointed in the wrong direction!

Most customer journey diagrams I see map the customer’s journey through the vendor’s marketing and sales process. That’s not a customer journey. That is a vendor journey. Customers could not care less about it.

What customers do care about is any journey that leads to value realization in their enterprise. That means true customer journey mapping must work backward from the customer’s value goals and objectives, not forward from the vendor’s sales goals and objectives.

But to do that, the customer-facing team in the vendor organization has to have good intelligence about what value realization the customer is seeking. That means that sales teams must diagnose before they prescribe. They must interrogate before they present. They must listen before they demo.

That is not what the typical sales enablement program teaches. Instead, it instructs salespeople on how to give the standard presentation, how to highlight the product’s competitive advantages, how to counter the competition’s claims—anything and everything except the only thing that really matters—how do you get good customer intelligence from whatever level of management you are able to converse with?

The SaaS business model, with its emphasis on subscription and consumption, creates a natural occasion for reforming these practices. Net Revenue Retention is the name of the game. Adoption, extension, and expansion of product usage are core to the customer’s Health Score. This only happens when value is truly being realized.
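As an aside for readers who want the mechanics: Moore doesn’t define the metric here, but Net Revenue Retention is conventionally computed from a customer cohort’s starting recurring revenue plus expansion, minus contraction and churn. A minimal sketch in Python, with purely illustrative numbers:

```python
def net_revenue_retention(starting_arr: float, expansion: float,
                          contraction: float, churn: float) -> float:
    """Conventional NRR: how much recurring revenue an existing
    customer cohort retains (and grows) over a period."""
    return (starting_arr + expansion - contraction - churn) / starting_arr

# Illustrative numbers only: $10M starting ARR, $1.5M of expansion,
# $0.5M in downgrades, $0.8M churned.
print(f"{net_revenue_retention(10e6, 1.5e6, 0.5e6, 0.8e6):.0%}")  # -> 102%
```

Anything above 100% means the installed base is growing on its own, which is exactly the adoption, extension, and expansion dynamic Moore describes.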

All this is casting the post-sales customer-facing functions of Customer Success and Customer Support in a new light. These relationships are signaling outposts for current customer status. Vendors still need to connect with the top management, for they are the ones who set the value realization goals and provide the budgets to fund the vendor’s offerings, but for day-to-day reality checks on whether the value is actually getting realized, nothing beats feet on the ground.

So, note to vendors. You can still use your vendor-centric customer journey maps to manage your marketing and sales productivity. Just realize these maps are about you, not the customer. You cannot simply assign the customer a mindset that serves your interests. You have to genuinely engage with them to get to actionable truth.

That’s what I think. What do you think?

Image Credit: Pexels


How Has Innovation Changed Since the Pandemic?

The Answer in Three Charts

GUEST POST from Robyn Bolton

“Everything changed since the pandemic.”

At this point, my husband, a Navy veteran, is very likely to moo (yes, like a cow). It’s a habit he picked up as a submarine officer, something the crew would do whenever someone said something blindingly obvious because “moo” is not just a noise. It’s an acronym – Master Of the Obvious.

But HOW did things change?

From what, to what?

So what?

It can be hard to see the changes when you’re living and working in the midst of them. This is why I found “Benchmarking Innovation Impact,” a new report from InnoLead and KPMG US, so interesting, insightful, and helpful.

There’s lots of great stuff in the report (and no, this is not a sponsored post though I am a member), so I limited myself to the three charts that answer executives’ most frequently asked innovation questions.

Innovation Leader Research 2023 Chart 1

Question #1: What type of innovation should I pursue?

2023 Answer: Companies are investing more than half of their resources in incremental innovation

So What?:  I may very well be alone in this opinion, but I think this is great news for several reasons:

  1. Some innovation is better than none – Companies shifting their innovation spending to safer, shorter-term bets is infinitely better than shutting down all innovation, which is what usually happens during economic uncertainty
  2. Play to your strengths – Established companies are, on average, better at incremental and adjacent innovation because they have the experience, expertise, resources, and culture required to do those well, plus other ways (e.g., corporate venture capital, joint ventures) to pursue transformational innovation.
  3. Adjacent innovation is increasing – This is the sweet spot for corporate innovation (I may also be biased because Swiffer is an adjacent innovation) because it stretches the business into new customers, offerings, and/or business models without breaking the company or executives’ identities.

Innovation Leader Research 2023 Chart 2

Question #2: Is innovation really a leadership problem (or do you just have issues with authority)?

2023 Answer: Yes (and it depends on the situation). “Lack of Executive Support” is the #6 biggest challenge to innovation, up from #8 in 2020.

So What?: This is a good news/bad news chart.

The good news is that fewer companies are experiencing the top 5 challenges to innovation. Of course, leadership is central to fostering/eliminating turf wars, setting culture, acting on signals, allocating budgets, and setting strategy. Hence, leadership has a role in resolving these issues, too.

The bad news is that MORE innovators are experiencing a lack of executive support (24.3% vs. 19.7% in 2020) and “Other” challenges (17.3% vs. 16.4%), including:

  • “Different agendas held by certain leadership as to how to measure innovation and therefore how we go after innovation. Also, the time it takes to ‘sell’ an innovative idea or opportunity into the business; corporate bureaucracy.”
  • “Lack of actual strategy. Often, goals or visions are treated as strategy, which results in frustration with the organization’s ability to advance viable work and creates an unnecessary churn, resulting in confused decision-making.”
  • “Innovations are stalling after piloting due to lack of funding and executive support in order to shift to scaling. Many are just happy with PR innovation.”

Innovation Leader Research 2023 Chart 3

Question #3: How much should I invest in innovation?

2023 Answer: Most companies are maintaining past years’ budgets and team sizes.

So What?:  This is another good news/bad news set of charts.

The good news is that investment is staying steady. Companies that cut back or kill innovation investments due to economic uncertainty often find that they are behind competitors when the economy improves. Even worse, it takes longer than expected to catch up because they are starting from scratch regarding talent, strategy, and a pipeline.

The bad news is that investment is staying steady. If you want different results, you need to take different actions. And I don’t know any company that is thrilled with the results of its innovation efforts. Indeed, companies can do different things with existing budgets and teams, but there needs to be flexibility and a willingness to grow the budget and the team as projects progress closer to launch and scale-up.

Not MOO

Yes, everything has changed since the pandemic, but not as much as we think.

Companies are still investing in incremental, adjacent, and transformational innovation. They’re just investing more in incremental innovation.

Innovation is still a leadership problem, but leadership is less of a problem (congrats!)

Investment is still happening, but it’s holding steady rather than increasing.

And that is nothing to “moo” at.

Image credits: Pixabay, InnoLead


Our Innovation is All on Tape

Why Old Technologies Are Sometimes Still the Best Ones

GUEST POST from John Bessant

Close your eyes and imagine for a moment a computer room in the early days of the industry. Chances are you’ll picture large wardrobe-sized metal cabinets whirring away with white-coated attendants tending to the machines. And it won’t be long before your gaze lands on the ubiquitous spools of tape being loaded and unloaded.

Which might give us a smug feeling as we look at the storage options for our current generation of computers — probably based on some incredibly fast access high-capacity solid state flash drive. It’s been quite a journey — the arc stretches a long way back from the recent years of USB sticks and SD cards, external HDDs and then the wonderful world of floppy discs, getting larger and more rigid as we go back in time: the clunky 1980s, when our home computers rode on cassette drives, right back to the prehistoric days when the high priests of minis and mainframes tended their storage flock of tapes.

Ancient history — except that the tape drive hasn’t gone away. In fact it’s alive and well and backing up our most precious memories. Look inside the huge data farms operated by Google, Apple, Amazon, Microsoft Azure or anyone else and you’ll find large computers — and lots of tape. Thousands of kilometres of it, containing everything from your precious family photos to email backups to data from research projects like the Large Hadron Collider.

It turns out that tape is still an incredibly reliable medium — and it has the considerable advantage of being cheap. The alternative would be buying lots of hard drives — something which increasingly matters as the volume of data we are storing grows. Think about the internet of things — all those intelligent devices, whether security cameras or mobile phones, manufacturing performance data loggers or hospital diagnostic equipment, are generating data which needs secure long-term storage. We’ve moved long past the era of measuring storage in kilobytes or megabytes; now we’re into zettabytes, each one the equivalent of around 250 billion DVDs. In 2020 we produced an estimated 59ZB of data, projected to rise to 175ZB by 2025! Fortunately IBM scientist Mark Lantz, an expert in storage, suggests that we can keep scaling tape, doubling capacity every 2.5 years, for the next 20 years.
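Both the DVD conversion and the scaling projection are easy to sanity-check. A quick back-of-the-envelope sketch, assuming the decimal definition of a zettabyte (10^21 bytes) and a 4.7GB single-layer DVD; the rounder 250 billion figure presumably uses rounder inputs:

```python
ZB = 10**21                  # bytes in a zettabyte (decimal definition)
DVD = 4.7 * 10**9            # bytes on a single-layer DVD (assumed)

print(f"{ZB / DVD:,.0f}")    # ~212,765,957,447 DVDs per zettabyte,
                             # the same order as the ~250 billion quoted above

# Lantz's projection: capacity doubling every 2.5 years for 20 years
print(2 ** (20 / 2.5))       # 256.0 -> a further 256x capacity growth
```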

Plus tape offers a number of other advantages, not least in terms of security. Most of the time a tape cartridge is not plugged in to a computer and so is pretty immune to visiting viruses and malware.

In fact the market for magnetic tape storage is in robust health; it’s currently worth nearly $5bn and is expected to grow to double that size by 2030. Not bad for a technology coming up on its hundredth anniversary. Making all of this possible is, of course, our old friend innovation. It’s been a classic journey of incremental improvement, doing what we do but better, punctuated with the occasional breakthrough.

It started in 1877 when “Mary Had a Little Lamb” was recorded and played on Thomas Edison’s first experimental talking machine, called a phonograph; the sounds were stored on tinfoil-wrapped cylinders (later wax) and severely limited in capacity. The first tape recorder was developed in 1886 by Alexander Graham Bell in his labs using paper with beeswax coated on it. This patented approach never really took off because the sound reproduction was inferior to Edison’s wax cylinders.

Others soon explored alternatives; for example Franklin C. Goodale adapted movie film for analogue audio recording, receiving a patent for his invention in 1909. His film used a stylus to record and play back, essentially mimicking Edison’s approach but allowing for much more storage.

But in parallel with the wax-based approach another strand emerged in 1898, with the work of Voldemar Poulsen, a Danish scientist who built on an idea originally suggested ten years earlier by Oberlin Smith. This used the concept of a wire (which could be spooled) on which information was encoded magnetically. Poulsen’s model used cotton thread, steel sawdust and metal wire and was effectively the world’s first tape recorder; he called it a ‘telegraphone’.

Which brings us to another common innovation theme — convergence. If we fast forward (itself a term which originated in the world of tape recording!) to the 1930s we can see these two strands come together; German scientists working for the giant BASF company built on a patent registered to Fritz Pfleumer in 1928. They developed a magnetic tape using metal oxide coated on plastic tape which could be used for recording sound on a commercial basis; in 1934 they delivered the first 50,000 metres of it to the giant electronics corporation AEG.

The big advantage of magnetic recording was that it didn’t rely on a physical analogue being etched into wax or other medium; instead the patterns could be encoded and read as electrical signals. It wasn’t long before tape recording took over as the dominant design — and one of the early entrants was the 3M company in the USA. They had a long history of coating surfaces with particles, having begun life making sandpaper and moved on to create a successful business out of first adhesive masking tape and then the ubiquitous Scotch tape. Coating metal oxide on to tape was an obvious move and they quickly became a key player in the industry.

Innovation is always about the interplay between needs and means and the tape recording business received a fillip from the growing radio industry in the 1940s. Tape offered to simplify and speed up the recording process and an early fan was Bing Crosby. He’d become fed up with the heavy schedule of live broadcasting which kept him away from his beloved golf course and so was drawn to the idea of pre-recording his shows. But the early disc-based technology wasn’t really up to the task, filled with hisses and scratches and poor sound quality. Crosby’s sound engineer had come across the idea of tape recording and worked with 3M to refine the technology.

The very first radio show, anywhere in the world, to be recorded directly on magnetic tape was broadcast on 1 October 1947, featuring Crosby. It not only opened up a profitable line of new business for 3M, it also did its bit for changing the way the world consumed entertainment, be it drama, music hall or news. (It was also a shrewd investment for Crosby, who became one of the emerging industry’s backers.)

Which brings us to another kind of innovation interplay, this time between different approaches being taken in the worlds of consumer entertainment and industrial computing. Ever since Marconi, Tesla and others had worked on radio, there had been a growing interest in consumer applications which could exploit the technology. And with the grandchildren of Edison’s phonograph and, in the 1940s, the work on television, the home became an increasingly interesting space for electronics entrepreneurs.

But as the domestic market for fixed appliances grew saturated, the search began for mobile solutions. Portability became an important driver for the industry and gave rise to the transistor radio; it wasn’t long before the in-car entertainment market began to take off. An early entrant from the tape playback side was the 8-track cartridge in the mid-1960s, which allowed you to listen to your favorite tracks without lugging a portable gramophone with you. Philips’ development of the compact cassette (and its free licensing of the idea to promote rapid and widespread adoption) led to an explosion in demand (over 100 billion cassette tapes were eventually sold worldwide) and eventually to the idea of the Walkman as the first portable personal device for recorded and recording music.

Without which we’d be a little less satisfied. Specifically we’d never been introduced to one of the Rolling Stones’ greatest hits; as guitarist Keith Richards explained in his 2010 autobiography:

“I wrote the song ‘Satisfaction’ in my sleep. I didn’t know at all that I had recorded it; the song only exists, thank God, thanks to the little Philips cassette recorder. I looked at it in the morning — I knew I had put a new tape in the night before — but it was at the very end. Apparently, I had recorded something. I rewound and then ‘Satisfaction’ sounded … and then 40 minutes of snoring!”

Meanwhile back in the emerging computer industry of the 1950s there was a growing demand for storage media for which magnetic tape seemed well suited. Cue the images we imagined in the opening paragraph, acolytes dutifully tending the vast mainframe machines.

Early computers had used punched cards and then paper tape but these soon reached the limit of their usefulness; instead the industry began exploring magnetic audio tape.

IBM’s team under the leadership of Wayne Winger developed digital tape-based storage; of particular importance was finding ways to encode the 1s and 0s of binary patterns onto the tape. They introduced their commercial digital tape recorder in 1952, and it could store what was (for its time) an impressive 2MB of data on a reel.

Not everyone was convinced; as Winger recalled, “A white-haired IBM veteran in Poughkeepsie pulled a few of us aside and told us, ‘You young fellows remember, IBM was built on punched cards, and our foundation will always be punched cards.’” Fortunately Tom Watson Jr., son of the company founder, became a champion and the project went ahead.

But while tape dominated in the short term another parallel trajectory was soon established, replacing tapes and reels with disc drives whose big advantage was the ability to randomly access data rather than wait for the tape to arrive at the right place on the playback head. IBM once again led the way with its launch in 1956 of the hard disc drive and began a steady stream of innovation in which storage volumes and density increased while the size decreased. The landscape moved through various generations of external drives until the advent of personal computers where the drives migrated inside the box and became increasingly small (and floppy).

These developments were taken up by the consumer electronics industry with the growing use of discs as an alternative recording and playback medium, spanning various formats but also decreasing in size. Which of course opened the way for more portability, with Sony and Sharp launching MiniDisc players in the early 1990s.

All good news for the personal audio experience but less so for the rapidly expanding information technology industry. While new media storage technology continued to improve it came at a cost and with the exponential increase in volumes of data needing to be stored came a renewed interest in alternative (and cheaper) solutions. The road was leading back to good old-fashioned tape.

Its potential lay in the long-term storage and retrieval of so-called ‘cold data’. Most of what is stored in the cloud today is of this kind — images, emails, all sorts of backup files. And while these need to be kept around, they don’t have to be accessed instantly. That’s where tape has come back into its own. Today’s tapes have moved on somewhat from IBM’s limited 2 MB version of 1952. They are smaller on the outside but their capacity has grown enormously — a cartridge can now hold 20 TB, or as much as 60 TB compressed — a ten-millionfold increase in 70 years. The tapes are not wound by hand onto capstans but loaded into cartridges, each of which holds around a kilometer of tape; companies use libraries containing tens of thousands of these cartridges, mounted via automated systems deploying robots. The process takes around 90 seconds to locate a cartridge and access and load the tape, so you could be forgiven for thinking it a bit slow compared to your flash drive, with its access times measured in milliseconds.
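That headline figure checks out. Taking 1 TB = 10^12 bytes and 1 MB = 10^6 bytes:

\[
\frac{20\,\text{TB}}{2\,\text{MB}} \;=\; \frac{20 \times 10^{12}\ \text{bytes}}{2 \times 10^{6}\ \text{bytes}} \;=\; 10^{7}
\]

a factor of exactly ten million between the 1952 reel and today's cartridge.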

There’s a pattern here — established and once-important technologies giving way to the new kids on the block with their apparently superior performance. But we’ve learned that we shouldn’t necessarily write the old technologies off — at the minimum there is often a niche for them amongst enthusiasts. Think of vinyl, of the anti-MP3 backlash from hi-fi fans, or, more recently, of photography using film and plates rather than their digital counterparts.

But it’s more than just nostalgia which drives this persistence of the old. Sometimes — like our magnetic tape — there are performance features which are worth holding on to — trading speed for security and lower storage cost, for example. Sometimes there is a particular performance niche which the new technology cannot enter competitively — for example the persistence of fax machines in healthcare where they offer a secure and reliable way of transmitting sensitive information. At the limit we might argue that neither cash nor physical books are as ‘good’ as their digital rivals but their persistence points to other attributes which people continue to find valuable.

And sometimes it is about the underlying accumulated knowledge which the old technology represents — and which might be redeployed to advantage in a different field. Think of Fujifilm’s resurgence as a cosmetics and pharmaceuticals company on the back of its deep knowledge of emulsions and coatings, technologies it originally mastered in the now largely vanished world of film photography. Or Kodak’s ability to offer high-speed, high-quality printing on the back of knowledge acquired in that same old industry — how to accurately spray and target millions of droplets onto a surface. And it was 3M’s deep understanding of how to coat materials onto tapes, gained originally from selling masking tape to the paint shops of Detroit, which helped it move so effectively into the field of magnetic tape.

Keeping these technologies alive isn’t about putting them on life support; as the IBM example demonstrates it needs a commitment to incremental innovation, driving and optimising performance. And there’s still room for breakthroughs within those trajectories; in the case of magnetic tape storage it came in 2010 in the form of the Linear Tape File System (LTFS) open standard. This allowed tape drives to emulate the random access capabilities of their hard disk competitors, using metadata about the location of data stored on the tapes.
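To make the LTFS idea concrete, here is a minimal sketch in Python. It is purely illustrative (the class and field names are invented, and real LTFS records its index as XML on a dedicated index partition of the tape), but it shows how a location index lets software treat a sequential medium like a randomly accessible file system:

```python
# Toy illustration of the LTFS idea: keep an index of where each file
# lives on the tape so a reader can wind straight to it instead of
# scanning the whole reel.
from dataclasses import dataclass

@dataclass
class Extent:
    start_block: int   # first tape block holding the file's data
    block_count: int   # how many consecutive blocks it occupies

class ToyTapeIndex:
    def __init__(self) -> None:
        self._extents: dict[str, Extent] = {}
        self._next_free_block = 0

    def write_file(self, path: str, num_blocks: int) -> Extent:
        """Append a file to the end of the tape and record where it landed."""
        extent = Extent(self._next_free_block, num_blocks)
        self._next_free_block += num_blocks
        self._extents[path] = extent
        return extent

    def locate(self, path: str) -> Extent:
        """One dictionary lookup replaces a sequential scan of the tape."""
        return self._extents[path]

index = ToyTapeIndex()
index.write_file("/backups/photos.tar", num_blocks=4096)
index.write_file("/backups/mail.tar", num_blocks=1024)

# A restore job can now tell the drive to wind directly to block 4096
# rather than reading from the beginning of the tape.
print(index.locate("/backups/mail.tar"))
```

The trade-off is the one described above: finding out where the data lives is now instant, even if the physical wind to that spot still takes tens of seconds.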

Whichever way you look at it there’s a need for innovation, whether bringing a breakthrough to an existing field or helping sustain a particular niche for the long haul. And we shouldn’t be too quick to write off ‘old’ technologies as new ones emerge which appear superior. It’s worth remembering that the arrival of the steamship didn’t wipe out the shipyards building sailing ships around the world; it actually spurred them on to a golden era of performance improvement which it took steamships a long time to catch up with.

So, there’s often a lot of life left in old dogs, especially when we can teach them some new innovative tricks.

You can find a podcast version of this here and a video version here

And if you’d like to learn with me take a look at my online course here


Humans, Not Technology, Drive Business Success

Humans, Not Technology, Drive Business Success

GUEST POST from Greg Satell

Silicon Valley is often seen as a cut-throat, technocratic place where the efficiency of algorithms defines success. Competition is ferocious and the pace of disruption and change can be dizzying. It’s not the type of environment where soft skills are valued particularly highly, if at all.

So, it’s somewhat ironic that Bill Campbell became a Silicon Valley legend by giving hugs and professing love to those he worked with. As coach to executives ranging from Steve Jobs to the entire Google executive team, Campbell preached and practiced a very personal style of business.

Yet while I was reading Trillion Dollar Coach in which former Google executives explain Campbell’s leadership principles, it became clear why he had such an impact. Even in Silicon Valley, technology will only take you so far. The success of a business ultimately depends on the success of the people in it. To compete over the long haul, that’s where you need to focus.

The Efficiency Paradox

In 1911, Frederick Winslow Taylor published The Principles of Scientific Management, based on his experience as a manager in a steel factory. It took aim at traditional management methods and suggested a more disciplined approach. Rather than have workers pursue tasks in their own manner, he sought to find “the one best way” and train accordingly.

Taylor wrote, “It is only through enforced standardization of methods, enforced adoption of the best implements and working conditions, and enforced cooperation that this faster work can be assured. And the duty of enforcing the adoption of standards and enforcing this cooperation rests with management alone.”

Before long, Taylor’s ideas became gospel, spawning offshoots such as scientific marketing, financial engineering and the Six Sigma movement. It was no longer enough to simply work hard, you had to measure, analyze and optimize everything. Over the years these ideas have become so central to business thinking that they are rarely questioned.

Yet management guru Henry Mintzberg has pointed out how a “by-the-numbers” depersonalized approach can often backfire. “Managing without soul has become an epidemic in society. Many managers these days seem to specialize in killing cultures, at the expense of human engagement.”

The evidence would seem to back him up. One study found that, of 58 large companies that had announced Six Sigma programs, 91 percent trailed the S&P 500 in stock performance. That, in essence, is the efficiency paradox: when you manage only what you can measure, you end up ignoring key factors in success.

How Generosity Drives Innovation

While researching my book, Mapping Innovation, I interviewed dozens of top innovators. Some were world class scientists and engineers. Others were high level executives at large corporations. Still others were highly successful entrepreneurs. Overall, it was a pretty intimidating group.

So, I was surprised to find that, with few exceptions, they were some of the kindest and most generous people I have ever met. The behavior was so consistent that I felt that it couldn’t be an accident. So I began to research the matter further and found that when it comes to innovation, generosity really is a competitive advantage.

For example, one study of star engineers at Bell Labs found that the best performers were not the ones with the best academic credentials, but those with the best professional networks. A similar study of the design firm IDEO found that great innovators essentially act as brokers able to access a diverse array of useful sources.

A third study helps explain why knowledge brokering is so important. Analyzing 17.9 million papers, the researchers found that the most highly cited work tended to be largely rooted within a traditional field, but with just a smidgen of insight taken from some unconventional place. Breakthrough creativity occurs at the nexus of conventionality and novelty.

The truth is that the more you share with others, the more they’ll be willing to share with you and that makes it much more likely you’ll come across that random piece of information or insight that will allow you to crack a really tough problem.

People As Profit Centers

For many, the idea that innovation is a human-centered activity is intuitively obvious. So it makes sense that the high-tech companies Bill Campbell was involved in would work hard to create environments that attract the best and brightest people. However, most businesses have much lower margins and have to keep a close eye on the bottom line.

Yet here too there is significant evidence that a human-focused approach to management can yield better results. In The Good Jobs Strategy MIT’s Zeynep Ton found that investing more in well-trained employees can actually lower costs and drive sales. A dedicated and skilled workforce results in less turnover, better customer service and greater efficiency.

For example, when the recession hit in 2008, Mercadona, Spain’s leading discount retailer, needed to cut costs. But rather than cutting wages or reducing staff, it asked its employees to contribute ideas. The result was that it managed to reduce prices by 10% and increase its market share from 15% in 2008 to 20% in 2012.

Its competitors maintained the traditional mindset. They cut wages and employee hours, which saved them some money, but customers found poorly maintained stores with few people to help them, which damaged their brands long-term. The cost savings Mercadona’s employees identified, on the other hand, in many cases improved service and productivity, and those gains persisted long after the crisis was over.

Management Beyond Metrics

The truth is that it’s easy to talk about putting people first, but much harder to do in practice. Research suggests that once a group grows much beyond 200 people, social relationships break down, so once a business passes that point it becomes natural to depersonalize management and focus on metrics.

Yet the best managers understand that it’s the people that drive the numbers. As legendary IBM CEO Lou Gerstner once put it, “Culture isn’t just one aspect of the game… It is the game. What does the culture reward and punish – individual achievement or team play, risk taking or consensus building?”

In other words, culture is about values. The innovators I interviewed for my book valued solving problems, so were enthusiastic about sharing their knowledge and expertise with others, who happily reciprocated. Mercadona valued its people, so when it asked them to find ways to save money during the financial crisis, they did so enthusiastically.

That’s why today, three years after his death, Bill Campbell remains a revered figure in Silicon Valley, because he valued people so highly and helped them learn to value each other. Management is not an algorithm. It is, in the final analysis, an intensely human activity and to do it well, you need to put people first.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels
