Category Archives: Digital Transformation

Four Principles of Successful Digital Transformation

GUEST POST from Greg Satell

When Steve Jobs and Apple launched the Macintosh with great fanfare in 1984, it was to be only one step in a long journey that began with Douglas Engelbart’s Mother of All Demos and the development of the Alto at Xerox PARC more than a decade before. The Macintosh was, in many ways, the culmination of everything that came before.

Yet it was far from the end of the road. In fact, it wouldn’t be until the late 90s, after the rise of the Internet, that computers began to have a measurable effect on economic productivity. Until then, personal computers were mainly expensive devices for automating secretarial work and playing video games.

The truth is that innovation is never a single event, but a process of discovery, engineering and transformation. Yet what few realize is that it is the last part, transformation, that is often the hardest and the longest. In fact, it usually takes about 30 years to go from an initial discovery to a major impact on the world. Here’s what you can do to move things along.

1. Identify A Keystone Change

About a decade before the Macintosh, Xerox invented the Alto, which had many of the features that the Macintosh later became famous for, such as a graphical user interface, a mouse and a bitmapped screen. Yet while the Macintosh became legendary, the Alto never really got off the ground and is now remembered, if at all, as little more than a footnote.

The difference in outcomes had much less to do with technology than it had to do with vision. While Xerox had grand plans to create the “office of the future,” Steve Jobs and Apple merely wanted to create a cool gadget for middle class kids and enthusiasts. Sure, they were only using it to write term papers and play video games, but they were still buying.

In my book, Cascades, I call this a “keystone change,” based on something my friend Talia Milgrom-Elcott told me about ecosystems. Apparently, every ecosystem has one or two keystone species that it needs to thrive. Innovation works the same way: you first need to identify a keystone change before a transformation can begin.

One common mistake is to immediately seek out the largest addressable market for a new product or service. That’s a good idea for an established technology or product category, but when you have something that’s truly new and different, it’s much better to find a hair-on-fire use case: a problem that someone needs solved so badly that they are willing to put up with early glitches and other shortcomings.

2. Indoctrinate Values, Beliefs And Skills

A technology is more than just a collection of transistors and code, or even a set of procedures; it needs specific values and skills to make it successful. For example, to shift your business to the cloud, you need to give up control of your infrastructure, which requires a completely new mindset. That’s why so many digital transformations fail. You can’t create a technology shift without a mind shift as well.

For example, when the Institute for Healthcare Improvement began its quest to save 100,000 lives through evidence-based quality practices, it spent significant time preparing the ground beforehand, so that people understood the ethos of the movement. It also created “change kits” and made sure the new procedures were easy to implement to maximize adoption.

In a similar vein, Facebook requires that all new engineers, regardless of experience or expertise, go through its engineering bootcamp. “Beyond the typical training program, at our Bootcamp new engineers see first-hand, and are able to infer, our unique system of values,” Eddie Ruvinsky, an Engineering Director at the company, told me.

“We don’t do this so much through training manuals and PowerPoint decks,” he continued, “but through allowing them to solve real problems working with real people who are going to be their colleagues. We’re not trying to shovel our existing culture at them, but preparing them to shape our culture for the future.”

Before you can change actions, you must first transform values, beliefs and skills.

3. Break Through Higher Thresholds Of Resistance

Growing up in Iowa in the 1930s, Everett Rogers noticed something strange in his father’s behavior. Although his father loved electrical gadgets, he was hesitant to adopt hybrid seed corn, even though it had higher yields. In fact, his father only made the switch after he saw his neighbor’s hybrid seed crop thrive during a drought in 1936.

This became the basis for Rogers’ now-familiar diffusion of innovations theory, in which an idea first gets popular with a group of early adopters and then only later spreads to other people. Later, Geoffrey Moore explained that most innovations fail because they never cross the chasm from the early adopters to the mainstream.

Both theories have become popular, but are often misunderstood. Early adopters are not a specific personality type, but people with a low threshold of resistance to a particular idea or technology. Remember that Rogers’ father was an early adopter of electrical gadgets, but far more resistant to hybrid seed corn.

As network theory pioneer Duncan Watts explained to me, an idea propagates through “easily influenced people influencing other easily influenced people.” So it’s important to start a transformation with people who are already enthusiastic, work out the inevitable kinks and then move on to slightly more resistant people once you’ve proved success with that earlier group.
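To make the threshold idea concrete, below is a minimal sketch in Python of how adoption spreads when each person only flips after enough of their contacts have. The tiny network and the threshold values are invented for illustration; they are not data from Rogers or Watts, just a toy version of the logic described above.

    # A toy threshold-of-resistance model (illustrative only; the network and
    # thresholds are invented, not data from Rogers or Watts). A person adopts
    # once the share of their contacts who have adopted meets their threshold.

    contacts = {
        "Ann":   ["Bob", "Carla"],
        "Bob":   ["Ann", "Carla", "Dev"],
        "Carla": ["Ann", "Bob", "Dev", "Eli"],
        "Dev":   ["Bob", "Carla", "Eli"],
        "Eli":   ["Carla", "Dev"],
    }
    threshold = {"Ann": 0.1, "Bob": 0.3, "Carla": 0.5, "Dev": 0.6, "Eli": 0.8}

    adopted = {"Ann"}                 # seed with the enthusiast
    changed = True
    while changed:                    # keep going until no one else flips
        changed = False
        for person, friends in contacts.items():
            if person in adopted:
                continue
            share = sum(f in adopted for f in friends) / len(friends)
            if share >= threshold[person]:
                adopted.add(person)
                changed = True

    print(sorted(adopted))            # low thresholds fall first, then the rest

Seed the same network with Eli, the most resistant person, and no one else flips; seed it with Ann, the enthusiast, and the cascade eventually reaches even Eli. That, in miniature, is the argument for proving success with the eager group before moving on to the resistant one.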

4. Focus On The Network, Not The Nodes

Perhaps the biggest mistake that organizations commit when trying to implement a new technology is to try to push everything from above, either through carrots, like financial incentives, or sticks, like disciplinary action for noncompliance. That may give senior management the satisfaction of “taking action,” but can often backfire.

People are much more willing to adopt something new if they feel like it’s their idea. The Institute for Healthcare Improvement, for example, designated selected institutions to act as “nodes” to help spread its movement. These weren’t watchdogs, but peers and early adopters who could help their colleagues adopt the new procedures effectively.

In a similar vein, IBM has already taken significant steps to drive adoption of quantum computing, a technology that won’t be commercially available for years. First, it created the Q Experience, an early version of its technology available through the cloud for anyone to use. It has also set up its Q Network of early adopter companies that are working with IBM to develop practical applications for quantum computing.

To date, tens of thousands have already run hundreds of thousands of experiments on Q Experience and about a dozen companies have joined the Q Network. So while there is still significant discovery and engineering to be done, the transformation is already well underway. It always pays to start early.

The truth is that transformation is always about the network, not the nodes. That’s why you need to identify a keystone change, indoctrinate the values and skills that will help you break through higher thresholds of resistance and continuously connect with a diverse set of stakeholders to drive change forward.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Unsplash

DNA May Be the Next Frontier of Computing and Data Storage

GUEST POST from Greg Satell

Data, as many have noted, has become the new oil, meaning that we no longer regard the information we store as merely a cost of doing business, but a valuable asset and a potential source of competitive advantage. It has become the fuel that powers advanced technologies such as machine learning.

A problem that’s emerging, however, is that our ability to produce data is outstripping our ability to store it. In fact, an article in the journal Nature predicts that, by 2040, data storage will consume 10–100 times the expected supply of microchip-grade silicon if we continue to rely on current technology. Clearly, we need a data storage breakthrough.

One potential solution is DNA, which is a million times more information dense than today’s flash drives. It is also more stable, more secure and uses minimal energy. The problem is that it is currently prohibitively expensive. However, a startup that has emerged out of MIT, called CATALOG, may have found the breakthrough we’re looking for: low-cost DNA storage.

The Makings Of A Scientist-Entrepreneur

Growing up in his native Korea, Hyunjun Park never planned on a career in business, much less the technology business, but expected to become a biologist. He graduated with honors from Seoul National University and then went on to earn a PhD from the University of Wisconsin. Later he joined Tim Lu’s lab at MIT, which specializes in synthetic biology.

In an earlier time, he would have followed an established career path, from PhD to post-doc to assistant professor to tenure. These days, however, there is a growing trend for graduate students to get an entrepreneurial education in parallel with the traditional scientific curriculum. Park, for example, participated in both the Wisconsin Entrepreneurial Bootcamp and Start MIT.

He also met a kindred spirit in Nate Roquet, a PhD candidate who, about to finish his thesis, had started thinking about what to do next. Inspired by a talk given by the Chief Science Officer at IndieBio, a seed fund, the two began to talk in earnest about starting a company together based on their work in synthetic biology.

As they batted around ideas, the subject of DNA storage came up. By this time, the advantages of the technology were well known, but it was not considered practical, costing hundreds of thousands of dollars to store just a few hundred megabytes of data. However, the two did some back-of-the-envelope calculations and became convinced they could do it far more cheaply.

Moving From Idea To Product

The basic concept of DNA storage is simple. Essentially, you just encode the ones and zeros of digital code into the T, G, A and C’s of genetic code. However, stringing those genetic molecules together is tedious and expensive. The idea that Park and Roquet came up with was to use enzymes to alter strands of DNA, rather than building them up piece by piece.
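As a rough illustration of that first step, here is what a naive two-bits-per-base mapping looks like in code. This is the textbook version of the concept only, not CATALOG’s actual method, which, as described below, edits and combines pre-made strands with enzymes rather than writing bases one at a time.

    # Naive bits-to-bases mapping (A=00, C=01, G=10, T=11). Illustrative only;
    # real schemes add addressing and error correction, and CATALOG's enzymatic
    # approach works quite differently.

    BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
    BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

    def encode(data: bytes) -> str:
        bits = "".join(f"{byte:08b}" for byte in data)
        return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

    def decode(strand: str) -> bytes:
        bits = "".join(BASE_TO_BITS[base] for base in strand)
        return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

    strand = encode(b"hi")            # four bases per byte: "CGGACGGC"
    assert decode(strand) == b"hi"

Even this toy version shows why the density is so attractive: every base carries two bits. The hard, expensive part has never been the mapping; it is physically writing the molecules, which is exactly the cost Park and Roquet set out to attack.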

Contrary to popular opinion, most traditional venture capital firms, such as those that populate Sand Hill Road in Silicon Valley, don’t invest in ideas. They invest in products. IndieBio, however, isn’t your typical investor. They give only a small amount of seed capital, but offer other services, such as wet labs, entrepreneurial training and scientific mentorship. Park and Roquet reached out to them and found some interest.

“We invest in problems, not necessarily solutions,” Arvind Gupta, Founder at IndieBio told me. “Here the problem is massive. How do you keep the world’s knowledge safe? We know DNA can last thousands of years and can be replicated very inexpensively. That’s a really big deal and Hyunjun and Nate’s approach was incredibly exciting.”

Once the pair entered IndieBio’s four-month program, they found both promise and disappointment. Their approach could dramatically reduce the cost of storing information in DNA, but not nearly quickly enough to build a commercially viable product. They would need to pivot if they were going to turn their idea into an actual business.

Scaling To Market

One flaw in CATALOG’s approach was that the process was too complex to scale. Yet they found that by starting with just a few different DNA strands and attaching them together, much like a printing press pre-arranges words in a book, they could come up with something that was not only scalable, but commercially viable from a cost perspective.

The second problem was more thorny. Working with enzymes is incredibly labor intensive and, being biologists, Park and Roquet didn’t have the mechanical engineering expertise to make their process feasible. Fortunately, an advisor, Darren Link, connected the pair to Cambridge Consultants, an innovation consultancy that could help them.

“We started looking at the problem and it seemed that, on paper at least, we could make it work,” Richard Hammond, Technology Director and Head of Synthetic Biology at Cambridge Consultants, told me. “Now we’re about halfway through making the first prototype and we believe we can make it work and scale it significantly. We’re increasingly confident that we can solve the core technical challenges.”

In 2018, CATALOG introduced the world to Shannon, its prototype DNA writer. In 2022, CATALOG announced its DNA computation work at the HPC User Forum. But CATALOG isn’t without competition in the space. For example, Western Digital’s LTO-9 from 2022 can store 18 TB per cartridge. CATALOG, for its part, is partnering with Seagate “on several initiatives to advance scalable and automated DNA-based storage and computation platforms, including making DNA-based platforms up to 1000 times smaller.” That should make the process competitive for archival storage, such as medical and legal records, as well as storing film databases at movie studios.

“I think the fact that we’re inventing a completely new medium for data storage is really exciting,” Park told me. “I don’t think that we know yet what the true potential is because the biggest use cases probably don’t exist yet. What I do know is that our demand for data storage will soon outstrip our supply and we are thrilled about the possibility of solving that problem.”

Going Beyond Digital

A generation ago, the task of improving data storage would have been seen as solely a computer science problem. Yet today, the digital era is ending and we’re going to have to look further and wider for solutions to the problems we face. With the vast improvement in genomics, which is far outpacing Moore’s law these days, we can expect biology to increasingly play a role.

“Traditionally, information technology has been strictly the realm of electrical engineers, physicists and coders,” Gupta of IndieBio told me. “What we’re increasingly finding is that biology, which has been honed for millions of years by evolution, can often point the way to solutions that are more robust and potentially much cheaper and more efficient.”

Yet this phenomenon goes far beyond biology. We’re also seeing similar accelerations in other fields, such as materials science and space-related technologies. We’re also seeing a new breed of investors, like IndieBio, that focus specifically on scientist entrepreneurs. “I consider myself a product of the growing ecosystem for scientific entrepreneurs at universities and in the investor community,” Park told me.

Make no mistake. We are entering a new era of innovation and the traditional Silicon Valley approach will not get us where we need to go. Instead, we need to forge greater collaboration between the scientific community, the investor community and government agencies to solve problems that are increasingly complex and interdisciplinary.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay

Video Killed More Than the Radio Star

by Braden Kelley

If you are a child of the eighties, you will remember when MTV went live on cable television on August 1, 1981, broadcasting music videos 24 hours a day and opening with “Video Killed the Radio Star” by The Buggles.

But I was thinking the other day about how video (or, taken more broadly, streaming media, including television, movies, gaming, social media, and the internet) has killed far more things than just radio stars. Many activities have experienced substantial declines as people stay home and engage in these forms of entertainment, often by themselves, where in the past they would leave their homes for more human-to-human interaction.

The ten declines listed below have not only reshaped the American landscape, quite literally, but have also fed declines in the mental health of modern nations. Without further ado, here is the list:

1. Bowling Alleys:

Bowling alleys, once bustling with players and leagues, have faced challenges in recent years. The communal experience of bowling has been replaced by digital alternatives, impacting the industry.

2. Roller Skating Rinks:

Roller skating rinks, which were once popular hangout spots for families and teens, have seen declining attendance. The allure of roller disco and skating parties has waned as people turn to other forms of entertainment.

3. Drive-In Movie Theaters:

Drive-in movie theaters, iconic symbols of mid-20th-century entertainment, have faced challenges in recent decades. While they once provided a unique way to watch films from the comfort of your car, changing lifestyles and technological advancements have impacted their popularity.

4. Arcade Game Centers:

In the ’80s and ’90s, video game arcades were buzzing hubs of entertainment. People flocked to play games like Pac-Man, Street Fighter, and Mortal Kombat. Traditional arcade game centers, filled with pinball machines, classic video games, and ticket redemption games, have struggled to compete with home gaming consoles and online multiplayer experiences. The convenience of playing video games at home has led to a decline in arcade visits. Nostalgia keeps some arcades alive, but they are no longer as prevalent as they once were.

5. Miniature Golf Courses:

Mini-golf courses, with their whimsical obstacles and family-friendly appeal, used to be popular weekend destinations. However, the rise of digital entertainment has impacted their attendance. The allure of playing a round of mini-golf under the sun has faded for many.

6. Indoor Trampoline Parks:

Indoor trampoline parks gained popularity as a fun and active way to spend time with friends and family. However, the pandemic and subsequent lockdowns forced many of these parks to close temporarily. Even before the pandemic, the availability of home trampolines and virtual fitness classes reduced the need for indoor trampoline parks. People can now bounce and exercise at home or virtually, without leaving their living rooms.

7. Live Music Venues:

Live music venues, including small clubs, concert halls, and outdoor amphitheaters, have struggled due to changing entertainment preferences. While some artists and bands continue to perform, the rise of virtual concerts and streaming services has affected attendance. People can now enjoy live music from the comfort of their homes, reducing the need to attend physical venues. The pandemic also disrupted live events, leading to further challenges for the industry.

8. Public Libraries (In-Person Visits):

Public libraries, once bustling with readers and community events, have seen a decline in in-person visits. E-books, audiobooks, and online research resources have made it easier for people to access information without physically visiting a library. While libraries continue to offer valuable services, their role has shifted from primarily physical spaces to digital hubs for learning and exploration – and a place for latchkey kids to go and wait for their parents to get off work.

10. Shopping Malls

Once bustling centers of retail and social activity, shopping malls have faced significant challenges in recent years. Various technological shifts have contributed to their decline, including e-commerce and online shopping, social media and influencer culture, changing demographics and urbanization. Shopping malls are yet another place where parents no longer drop off the younger generation for the day.

And if that’s not enough, here is a bonus one for you:

11. Diners, Malt Shops, Coffee Shops, Dive Bars/Taverns, Neighborhood Pubs (UK) and Drive-In Burger Joints

If you’re a child of the seventies or eighties, you probably tuned in to watch Richie, Potsie, Joanie, Fonzie and Ralph Malph gather every day at Al’s. Unfortunately, many of the more social and casual drinking and dining places are experiencing declines as changes in diet, habit and technology have kicked in. Demographic changes (aging out of nostalgia) and the rise of food delivery apps and takeout culture have helped sign their death warrant.

Conclusion

In the ever-evolving landscape of entertainment, video and streaming media have reshaped our experiences and interactions. As we bid farewell to once-thriving institutions, we recognize both the convenience and the cost of this digital transformation. For example, the echoes of strikes and spares have faded as digital alternatives replace the communal joy of bowling. As we navigate this digital era, let us cherish what remains and adapt to what lies ahead. Video may have transformed our world, but the echoes of lost experiences linger, urging us to seek balance between our screens and our souls. As these once-ubiquitous gathering places disappear, consumer tastes change and social isolation increases, will we as a society seek to reverse course, or evolve some new way of reconnecting as humans in person? And if so, how?

What other places and/or activities would you have added to the list?
(sound off in the comments)

p.s. Be sure and follow both my personal account and the Human-Centered Change and Innovation community on LinkedIn.

Image credit: Pixabay

Preparing the Next Generation for a Post-Digital Age

GUEST POST from Greg Satell

An education is supposed to prepare you for the future. Traditionally, that meant learning certain facts and skills, like when Columbus discovered America or how to do long division. Today, curricula have shifted to focus on a more global and digital world, like cultural history, basic computer skills and writing code.

Yet the challenges that our kids will face will be much different from the ones we faced growing up, and many of the things a typical student learns in school today will no longer be relevant by the time he or she graduates college. In fact, a study at the University of Oxford found that 47% of today’s jobs are at risk of being eliminated over the next 20 years.

In 10 or 20 years, much of what we “know” about the world will no longer be true. The computers of the future will not be digital. Software code itself is disappearing, or at least becoming far less relevant. Many of what are considered good jobs today will be either automated or devalued. We need to rethink how we prepare our kids for the world to come.

Understanding Systems

The subjects we learned in school were mostly static. 2+2 always equaled 4 and Columbus always discovered America in 1492. Interpretations may have differed from place to place and evolved over time, but we were taught that the world was based on certain facts and we were evaluated on the basis of knowing them.

Yet as the complexity theorist Sam Arbesman has pointed out, facts have a half-life and, as the accumulation of knowledge accelerates, those half-lives are shrinking. For example, when we learned computer programming in school, it was usually in BASIC, a now mostly defunct language. Today, Python is the most popular language, but it likely won’t be a decade from now.

Computers themselves will be very different as well, based less on the digital code of ones and zeros and more on quantum laws and the human brain. We will likely store less information on silicon and more in DNA. There’s no way to teach kids how these things will work because nobody, not even experts, is quite sure yet.

So kids today need to learn less about how things are today and more about the systems future technologies will be based on, such as quantum mechanics, genetics and the logic of code. One thing economists have consistently found is that it is routine jobs that are most likely to be automated. The best way to prepare for the future is to develop the ability to learn and adapt.

Applying Empathy And Design Skills

While machines are taking over many high level tasks, such as medical analysis and legal research, there are some things they will never do. For example, a computer will never strike out in a Little League game, have its heart broken or see its child born. So it is very unlikely, if not impossible, that a machine will be able to relate to a human like other humans can.

That absence of empathy makes it hard for machines to design products and processes that will maximize enjoyment and utility for humans. So design skills are likely to be in high demand for decades to come as basic production and analytical processes are increasingly automated.

We’ve already seen this process take place with regard to the Internet. In the early days, it was a very technical field. You had to be a highly skilled engineer to make a website work. Today, however, building a website is something any fairly intelligent high school student can do and much of the value has shifted to front-end tasks, like designing the user experience.

With the rise of artificial intelligence and virtual reality, our experiences with technology will become far more immersive, and that will increase the need for good design. For example, conversational analysts (yes, that’s a real job) are working with designers to create conversational intelligence for voice interfaces and, clearly, virtual reality will be much more design intensive than video ever was.

The Ability To Communicate Complex Ideas

Much of the recent emphasis in education has been around STEM subjects (science, technology, engineering and math) and proficiency in those areas is certainly important for today’s students to understand the world around them. However, many STEM graduates are finding it difficult to find good jobs.

On the other hand, the ability to communicate ideas effectively is becoming a highly prized skill. Consider Amazon, one of the most innovative and technically proficient organizations on the planet. A key factor in its success is its writing culture. The company is so fanatical about the ability to communicate that developing good writing skills is essential to building a successful career there.

Think about Amazon’s business and it becomes clear why. Sure, it employs highly adept engineers, but to create a truly superior product those people need to collaborate closely with designers, marketers, business development executives and others. To coordinate all that activity and keep everybody focused on delivering a specific experience to the customer, communication needs to be clear and coherent.

So while learning technical subjects like math and science is always a good idea, studying things like literature, history and philosophy is just as important.

Collaborating And Working In Teams

Traditionally, school work has been based on individual accomplishment. You were supposed to study at home, come in prepared and take your test without help. If you looked at your friend’s paper, it was called cheating and you got in a lot of trouble for it. We were taught to be accountable for achievements on our own merits.

Yet consider how the nature of work has changed, even in highly technical fields. In 1920, most scientific papers were written by sole authors, but by 1950 that had changed and co-authorship became the norm. Today, the average paper has four times as many authors as it did then and the work being done is far more interdisciplinary and done at greater distances than in the past.

Make no mistake. The high value work today is being done in teams and that will only increase as more jobs become automated. The jobs of the future will not depend as much on knowing facts or crunching numbers, but will involve humans collaborating with other humans to design work for machines. Collaboration will increasingly be a competitive advantage.

That’s why we need to pay attention not just to how our kids work and achieve academically, but how they play, resolve conflicts and make others feel supported and empowered. The truth is that value has shifted from cognitive skills to social skills. As kids will increasingly be able to learn complex subjects through technology, the most important class may well be recess.

Perhaps most of all, we need to be honest with ourselves and make peace with the fact that our kids’ educational experience will not — and should not — mirror our own. The world they will need to face will be far more complex and more difficult to navigate than anything we could imagine back in the days when Fast Times at Ridgemont High was still popular.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay

Innovation Friction Risks and Pitfalls

GUEST POST from Howard Tiersky

There’s a lot to be learned about innovation by looking at good ideas that just didn’t make it. We’d all like to believe that if we have an idea that genuinely improves upon something, and if we execute that idea correctly, the idea will be successful. But there is another factor to consider: innovation friction.

Here’s today’s example:

Back in the early 2000s, I was running part of the eCommerce practice for Ernst & Young. Around 2003 we moved into a shiny new building at 5 Times Square in New York City, right next to where the ball drops on New Year’s Eve. The building was the first place I had ever seen with a keypad-controlled elevator. Instead of pushing an up or down button, you call the elevator from a numerical keypad. You type in the number of the floor you are going to and the keypad responds with a letter (such as D). That letter corresponds to the elevator that you have been assigned to. You go to “your” elevator and, when it arrives, it automatically takes you to your floor.

This innovation delivers several benefits that improve the elevator experience:

1. It makes the elevators more efficient.

People going to lower floors are clustered together, as are people going to higher floors, and people going to the same floor are put on the same elevator. This allows more elevators to run express. Fewer stops. Less waiting for an elevator and a faster trip in the elevator. (See the toy sketch just after this list.)

2. It reduces “clicks.”

In a traditional system, an elevator user has to “call” the elevator and indicate their desire to go up or down. Once in the elevator, the user has to pick a floor. The old system was not a massive amount of effort, but the new system reduces two interactions to one. Presumably an improvement.

(Plus there’s no worrying about the kid in the elevator who decides to push all the buttons — there aren’t any!)
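Here is the toy sketch promised above: an invented grouping rule that clusters riders by requested floor and fills cars in floor order. Real destination-dispatch systems optimize far more carefully than this; the point is simply to show why batching riders by destination means fewer stops per car.

    # Toy destination-dispatch sketch (illustrative only). Riders type a floor
    # at the keypad; riders headed to the same or nearby floors share a car,
    # so each car makes fewer stops than a first-come, first-served elevator.

    from collections import defaultdict
    from string import ascii_uppercase

    def assign_cars(requests, car_capacity=4):
        """Return {rider: car letter}, grouping riders by destination floor."""
        by_floor = defaultdict(list)
        for rider, floor in requests:
            by_floor[floor].append(rider)

        assignments = {}
        car, load = 0, 0
        for floor in sorted(by_floor):        # fill cars in floor order
            for rider in by_floor[floor]:
                if load == car_capacity:      # current car is full, open the next
                    car, load = car + 1, 0
                assignments[rider] = ascii_uppercase[car]
                load += 1
        return assignments

    riders = [("r1", 14), ("r2", 3), ("r3", 14), ("r4", 3), ("r5", 22), ("r6", 14)]
    print(assign_cars(riders))    # car A serves floors 3 and 14, car B serves 14 and 22

Of course, none of that cleverness helped at 5 Times Square. The system worked; it just asked people to relearn how to use an elevator.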

Is there a downside to this innovation?

Well, if you’re already in the elevator, there’s no opportunity to change your mind without getting off on the wrong floor and repeating the whole process. The biggest downside of this innovation is simply that it requires users to learn something new. In fact, when I moved into 5 Times Square, I found that when people came to meet with me for the first time, the first 10 minutes of our meeting was inevitably focused on their need to vent their reactions to our crazy elevators and how they couldn’t figure out how to use them!

Truthfully, the elevators were easy to use. Clear instructions were printed above the keypad, and the system worked very well. The problem was that it required users to relearn a skill they had fully and completely mastered (i.e., using an elevator) and start over at a beginner level — even if it only took 30 seconds to learn how to use the new elevator system.

I’ve watched with interest over the years to see if these types of elevators would take off. It turns out they didn’t. Very recently, I was visiting a client in Houston. The building had actually spent money to remove the keypad system and replace it with the traditional 2-step process. Wow. You know your innovation is not doing well when your customers are willing to invest tens of thousands of dollars to get rid of it and go back to the old way.

After much thought, I believe it’s all because of the friction of asking people to re-learn how to push an elevator button. Some innovations don’t require this. The new Boeing 787s have substantial innovation, but from a passenger standpoint they work in basically the same way as the last round of airplanes. The innovations improve comfort, fuel efficiency, and other factors, but you recline the seat and return your tray table to an upright position in pretty much the same old way.

Other innovations require learning: ATMs, DVRs, electric cars. All of these have been successful, despite their learning requirements, but the need for users to learn new behavior did slow their adoption. Innovation friction slows the adoption of innovations that require substantial behavior change, even more so if they require learning, and especially if they require un-learning an old way of doing something. If the friction is greater than the momentum the benefit creates, the innovation stops dead in its tracks.

An example of this friction is the metric system, which has made only a very small amount of progress in adoption over the last 50 years, despite being clearly superior to the “English system.” It’s just too darn much trouble to change.

One last story about innovation friction from early in my career.

At that time, I was working with a lot of insurance companies creating web-based interfaces to replace traditional “green screen” systems used by insurance agents to quote and initiate new policies for auto and home insurance. It typically took a new hire 4-5 months to learn the old system well enough to complete a policy quote, and well over a year to become truly proficient with it! We proudly designed replacement systems that anyone with basic computer skills could learn in a day or two at most, but found that some users were quite hostile to our efforts.

They already knew how to use the green screen systems, and they were pretty darn fast with them. One Customer Support Agent even quoted Charlton Heston to me, saying I would only be able to take away her green screen if I pried it from her “cold dead hands.” Creepy? Yes. But also telling.

Those old systems are gone now, because of the huge benefit of being able to train people on the new system so quickly. This benefit put the companies that used the new system in a position to more or less force that innovation onto other users.

Many successful innovations have required change and learning — automobiles, indoor toilets, smartphones. With all of these examples, we’ve seen many people willing to learn, for whom the “pain” of change was outweighed by the perceived benefit. But we also see a substantial number of users who resisted for years, saying, “No thanks, I like my outhouse (or horse and buggy or bank teller) just fine.” When conceiving or launching an innovation that requires learning, it’s important to consider the role innovation friction will play in adoption, where you can reduce it, and where you can increase the user’s willingness to accept it as the cost of the greater benefit.

This article originally appeared on the Howard Tiersky blog
Image Credits: Unsplash, Howard Tiersky

Top 10 Human-Centered Change & Innovation Articles of April 2024

Drum roll please…

At the beginning of each month, we will profile the ten articles from the previous month that generated the most traffic to Human-Centered Change & Innovation. Did your favorite make the cut?

But enough delay, here are April’s ten most popular innovation posts:

  1. Ignite Innovation with These 3 Key Ingredients — by Howard Tiersky
  2. What Have We Learned About Digital Transformation? — by Geoffrey A. Moore
  3. The Collective Growth Mindset — by Stefan Lindegaard
  4. Companies Are Not Families — by David Burkus
  5. 24 Customer Experience Mistakes to Stop in 2024 — by Shep Hyken
  6. Transformation is Human Not Digital — by Greg Satell
  7. Embrace the Art of Getting Started — by Mike Shipulski
  8. Trust as a Competitive Advantage — by Greg Satell
  9. 3 Innovation Lessons from The Departed — by Robyn Bolton
  10. Humans Are Not as Different from AI as We Think — by Geoffrey A. Moore

If you’re not familiar with Human-Centered Change & Innovation, we publish 4-7 new articles every week built around innovation and transformation insights from our roster of contributing authors and ad hoc submissions from community members. Get the articles right in your Facebook, Twitter or LinkedIn feeds too!

Have something to contribute?

Human-Centered Change & Innovation is open to contributions from any and all innovation and transformation professionals out there (practitioners, professors, researchers, consultants, authors, etc.) who have valuable human-centered change and innovation insights to share with everyone for the greater good. If you’d like to contribute, please contact me.

London Calling

by Braden Kelley

I will be in London attending a reunion soon and have some availability May 15-17, 2024 if anyone would like to book a keynote, workshop, or advisory session while I’m there.

Are you looking to build a continuous innovation infrastructure in your organization?

Would you like to learn more about the Change Planning Toolkit?

Want to learn how to become your own Futurist using the FutureHacking™ suite of tools?

I’m also open to helping promote a get together if someone has a space in central London to offer up for hosting a Human-Centered Change and Innovation community meetup.

Contact me if you have interest in any or all of these!

p.s. Be sure and follow both my personal account and the Human-Centered Change and Innovation community on LinkedIn.

What Have We Learned About Digital Transformation?

GUEST POST from Geoffrey A. Moore

We are well into our first decade of digital transformation, with both the successes and the scars to show for it, and we can see there is a long way to go. Realistically, there is probably never a finish line, so I think it is time for us to pause and take stock of what we have learned, and how best we can proceed from here. Here are three lessons to take to heart.

Lesson 1: There are three distinct levels of transformation, and operating model transformation is the one that deserves the most attention.

Image: Geoffrey Moore’s three models of transformation

The least disruptive transformation is to the infrastructure model. This should be managed within the Productivity Zone, where, to be fair, the disruption will be considerable, but it should not require much in the way of behavior change from the rest of the enterprise. Moving from data centers to cloud computing is a good example, as are enabling mobile applications and remote work centers. The goal here is to make employees more efficient while lowering the total cost of IT ownership. These transformations are well underway, and there is little confusion about what next steps to take.

By contrast, the most disruptive transformation is to the business model. Here a company may be monetizing information derived from its operating model, as the SABRE system did for American Airlines, or overlaying a digital service on top of its core offering, as the automotive makers are seeking to do with in-car entertainment. The challenge here is that the economics of the new model have little in common with the core model, which creates repercussions both with internal systems and external ecosystem relationships. Few of these transformations to date can be said to be truly successful, and my view is they are more the exception than the rule.

The place where digital transformation is having its biggest impact is on the operating model. Virtually every sector of the economy is re-engineering its customer-facing processes to take advantage of ubiquitous mobile devices interacting with applications hosted in the cloud. These are making material changes to everyday interactions with customers and partners in the Performance Zone, where the priority is to improve effectiveness first, efficiency second. The challenge is to secure rapid, consistent, widespread adoption of the new systems from every employee who touches them. More than any other factor, this is the one that separates the winners from the losers in the digital transformation game.

Lesson 2: Re-engineer operating models from the outside in, not the inside out.

A major challenge that digital transformation at the operating model level must overcome is the inertial resistance of the existing operating model, especially where it is embedded in human behaviors. Simply put, people don’t like change. (Well, actually, they all want other people to change, just not themselves.) When we take the approach of internal improvement, things go way too slowly and eventually lose momentum altogether.

The winning approach is to focus on an external forcing function. For competition cultures, the battle cry should be, this new operating model poses an existential threat to our future. Our competitors are eating our lunch. We need to change, and we need to do it now! For collaboration cultures, the call to action should be, we are letting our customers down because we are too hard to do business with. They love our offers, but if we don’t modernize our operating model, they are going to take their business elsewhere. Besides, with this new digital model, we can make our offers even more effective. Let’s get going!

This is where design thinking comes in. Forget the sticky notes and lose the digital whiteboards. This is not about process. It is about walking a mile in the other person’s shoes, be that an end user, a technical buyer, a project sponsor, or an implementation partner, spending time seeing what hoops they have to go through to implement or use your products or simply to do business with you. No matter how good you were in the pre-digital era, there will be a ton of room for improvement, but it has to be focused on their friction issues, not yours. Work backward from their needs and problems, in other words, not forward from your intentions or desires.

Lesson 3: Digital transformations cannot be pushed. They must be pulled.

This is the hardest lesson to learn. Most executive teams have assumed that if they got the right digital transformation leader, gave them the title of Chief Transformation Officer, funded them properly, and ensured that the project was on time, on spec, and on budget, that would do the trick. It makes total sense. It just doesn’t work.

The problem is one endemic to all business process re-engineering. The people whose behavior needs to change—and change radically—are the ones least comfortable with the program. When some outsider shows up with a new system, they can find any number of things wrong with it and use these objections to slow down deployment, redirect it into more familiar ways, and in general, diminish its impact. Mandating adoption can lead to reluctant engagement or even malicious compliance, and the larger the population of people involved, the more likely this is to occur.

So what does work? Transformations that are driven by the organization that has to transform. These start with the executive in charge who must galvanize the team to take up the challenge, to demand the digital transformation, and to insert it into every phase of its deployment. In other words, the transformation has to be pulled, not pushed.

Now, don’t get me wrong. There is still plenty of work on the push side involved, and that will require a strong leader. But at the end of the day, success will depend more on the leader of the consuming organization than that of the delivery team.

That’s what I think. What do you think?

Image Credit: Pexels, Geoffrey Moore

Top 10 Human-Centered Change & Innovation Articles of March 2024

Drum roll please…

At the beginning of each month, we will profile the ten articles from the previous month that generated the most traffic to Human-Centered Change & Innovation. Did your favorite make the cut?

But enough delay, here are March’s ten most popular innovation posts:

  1. Agile Innovation Management — by Diana Porumboiu
  2. How to Re-engineer the Incubation Zone — by Geoffrey A. Moore
  3. It’s Not Clear What Innovation Success Is — by Robyn Bolton
  4. How Do You Know If Your Idea is Novel? — by Mike Shipulski
  5. How to Tell if You Are Trusted — by Mike Shipulski
  6. Innovation is Rubbish! — by John Bessant
  7. Celebrating the Trailblazing Women Pioneers of Innovation — by Art Inteligencia
  8. Thinking Differently About Leadership and Innovation — by Janet Sernack
  9. The Remarkable Power of Negative Feedback — by Dennis Stauffer
  10. 10 CX and Customer Service Predictions for 2024 (Part 1) — by Shep Hyken

If you’re not familiar with Human-Centered Change & Innovation, we publish 4-7 new articles every week built around innovation and transformation insights from our roster of contributing authors and ad hoc submissions from community members. Get the articles right in your Facebook, Twitter or LinkedIn feeds too!

Have something to contribute?

Human-Centered Change & Innovation is open to contributions from any and all innovation and transformation professionals out there (practitioners, professors, researchers, consultants, authors, etc.) who have valuable human-centered change and innovation insights to share with everyone for the greater good. If you’d like to contribute, please contact me.

Transformation is Human Not Digital

GUEST POST from Greg Satell

A decade ago, many still questioned the relevance of digital technology. While Internet penetration was already significant, e-commerce made up less than 6% of retail sales. Mobile and cloud computing were just getting started and artificial intelligence was still more science fiction than reality.

Yet today, all of those things are not only viable technologies, but increasingly key to effectively competing in the marketplace. Unfortunately, implementing these new technologies can be a thorny process. In fact, research by McKinsey found that fewer than one third of digital transformation efforts succeed.

For the most part, these failures have less to do with technology and more to do with managing the cultural and organizational challenges that a technological shift creates. It’s relatively easy to find a vendor that can implement a system for you, but much harder to prepare your organization to adapt to new technology. Here’s what you need to keep in mind:

Start With Business Objectives

Probably the most common trap that organizations fall into is focusing on technology rather than on specific business objectives. All too often, firms seek to “move to the cloud” or “develop AI capabilities.” That’s a sure sign you’re headed down the wrong path.

“The first question you have to ask is what business outcome you are trying to drive,” Roman Stanek, CEO at GoodData, told me. “Projects start by trying to implement a particular technical approach and not surprisingly, front-line managers and employees don’t find it useful. There’s no real adoption and no ROI.”

So start by asking yourself business related questions, such as “How could we better serve our customers through faster, more flexible technology?” or “How could artificial intelligence transform our business?” Once you understand your business goals, you can work your way back to the technology decisions.

Automate The Most Tedious Tasks First

Technological change often inspires fear. One of the most basic mistakes many firms make is to use new technology to try to replace humans and save costs rather than to augment and empower them to improve performance and deliver added value. This not only kills employee morale and slows adoption, it usually delivers worse results.

A much better approach is to use technology to improve the effectiveness of human employees. For example, one study cited in a White House report during the Obama Administration found that while machines had a 7.5% error rate in reading radiology images and humans had a 3.5% error rate, when humans combined their work with machines the error rate dropped to 0.5%.

The best way to do this is to start with the most boring and tedious tasks first. Those are what humans are worst at. Machines don’t get bored or tired. Humans, on the other hand, thrive on interaction and like to solve problems. So instead of looking to replace workers, look instead to make them more productive.

Perhaps most importantly, this approach can actually improve morale. Factory workers actively collaborate with robots they program themselves to do low-level tasks. In some cases, soldiers build such strong ties with robots that do dangerous jobs that they hold funerals for them when they “die.”

Shift Your Organization And Your Business Model

Another common mistake is to think that you can make a major technological shift and keep the rest of your business intact. For example, shifting to the cloud can save on infrastructure costs, but the benefits won’t last long if you don’t figure out how to redeploy those resources in some productive way.

For example, when I talked to Barry Libenson, Global CIO of the data giant Experian, about his company’s shift to the cloud, he told me that “The organizational changes were pretty enormous. We had to physically reconfigure how people were organized. We also needed different skill sets in different places so that required more changes and so on.”

The shift to the cloud made Experian more agile but, more importantly, it opened up new business opportunities. It allowed the company to create Ascend, a “data on demand” platform that lets customers make credit decisions based on near real-time data and is now Experian’s fastest growing business.

“All of the shifts we made were focused on opening up new markets and serving our customers better,” Libenson says, and that’s what helped make the technological shift so successful. Because it was focused on business results, it was that much easier to get everybody behind it, gain momentum and create a true transformation.

Humans Collaborating With Machines

Consider how different work was 20 years ago, when Windows 95 was still relatively new and only a minority of executives regularly used programs like Word, Excel and PowerPoint. We largely communicated by phone and memos typed up by secretaries. Data analysis was something you did with a pencil, paper and a desk calculator.

Clearly, the nature of work has changed. We spend far less time quietly working away at our desks and far more interacting with others. Much of the value has shifted from cognitive skills to social skills as collaboration increasingly becomes a competitive advantage. In the future, we can only expect these trends to strengthen and accelerate.

To understand what we can expect, look at what’s happened in the banking industry. When automatic teller machines first appeared in the early 1970s, most people thought they would lead to fewer branches and tellers, but actually just the opposite happened. Today, more than twice as many bank tellers are employed as in the 1970s, because they do things that machines can’t do, like solve unusual problems, show empathy and up-sell.

That’s why we need to treat any technological transformation as a human transformation. The high value work of the future will involve humans collaborating with other humans to design work for machines. Get the human part right and the technology will take care of itself.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Dall-E via Bing
