Category Archives: Digital Transformation

Creating Effective Digital Teams


GUEST POST from Howard Tiersky

Creating digital products is a multi-disciplinary process, blending creativity, engineering, strategy, customer support, legal and regulatory requirements, and more. A major challenge faced by large enterprises and global brands undergoing a digital transformation is how to structure their teams. Specifically, they need to answer the following three questions:

  1. What’s the optimal way to organize the necessary roles and responsibilities?
  2. Which part of the organization should own each capability?
  3. How do we get everyone working together?

The optimal structure for digital teams varies across different organizations. At FROM, we use a base framework that identifies fifteen key roles or competencies that are part of creating and operating most digital properties. Those roles are divided into three conceptual teams: the Digital Business Team, the Digital Technology Team, and the Extended Business Team.

The Digital Business Team

  1. Digital Business Vision Owner: The Business Vision Owner defines the key business measures and objectives for the digital property, including its target market segments. This “visioneer” makes final decisions on product direction.
  2. Product Management: Product Management owns the product on a day-to-day basis and liaises with other areas to make sure the digital value proposition is realized. They’re responsible for commissioning and reviewing customer research, developing and maintaining the product roadmap in line with the business vision, and prioritizing the backlog of changes and improvements.
  3. Program Management: Distinct from the Product Manager, the Program Manager is responsible for owning the long-term plan to achieve the product roadmap, including budgets and resource allocations, and for maintaining the release schedule.
  4. User Interface/User Experience: UI/UX is responsible for the overall look and feel of the digital product. They develop and maintain UI standards to be used as the product is developed, are involved in user testing, and QA new releases.
  5. Content Development: Content Development creates non-campaign, non-marketing editorial content for the site, including articles, instructions, and FAQ or help content. Their job is to create content that’s easy to understand and consistent with the brand voice of the product or site.

The Digital Technology Team

  1. Front End Development: Front End Development selects frameworks and defines front-end coding standards for any technologies that will be used. They’re also responsible for writing code that executes in the browser, such as HTML5 and JavaScript, as well as mobile code (e.g., Objective-C). Front End Development drives requirements for back-end development teams to ensure the full user experience can be implemented.
  2. Back End Development: Back End Development manages core enterprise systems, including inventory, financial, and CRM. They’re responsible for exposing, as web services, the capabilities that are needed for front-end development. They’re responsible for developing and enforcing standards to protect the integrity of those enterprise systems, as well as reviewing requests for and implementing new capabilities.
  3. Data: Data develops and maintains enterprise and digital-specific data models, and creates and maintains plans for data management and warehousing. They monitor the health of databases, expose services for data access, and manage data architecture.
  4. Infrastructure: Infrastructure maintains the physical hardware used for applications and data. They maintain disaster and business continuity programs and monitor the scalability and reliability of the physical infrastructure. They also monitor and proactively manage the security of the infrastructure environment.
  5. Quality Assurance: Quality Assurance creates and maintains QA standards for code in production, develops automated and manual test scripts, and executes any integration, browser, or performance testing scenarios. They also monitor site metrics to identify problems proactively. (It should be noted that, though you want dedicated QA professionals on your team, QA is everyone’s responsibility!)
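The hand-off between the front-end and back-end roles above, in which Back End Development exposes enterprise capabilities as web services for Front End Development to consume, can be pictured with a minimal sketch. The function names, SKU format, and stock data below are hypothetical, invented purely for illustration:

```python
import json
import re

def inventory_lookup(sku: str) -> int:
    """Stand-in for a call into a real enterprise inventory system."""
    stock = {"SKU-1001": 42, "SKU-1002": 0}
    return stock.get(sku, 0)

def inventory_service(sku: str) -> str:
    """Expose the enterprise capability as a JSON web-service response,
    enforcing an input standard that protects the underlying system."""
    if not re.fullmatch(r"SKU-\d{4}", sku):
        return json.dumps({"error": "invalid SKU format"})
    return json.dumps({"sku": sku, "in_stock": inventory_lookup(sku)})
```

Front-end code then consumes the JSON contract without ever touching the enterprise system directly, which is exactly the standards-enforcing boundary the Back End Development role is meant to own.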

The Extended Business Team

  1. Marketing: Marketing is responsible for some key digital operations. They develop offers and campaigns to drive traffic, manage email lists and campaign execution, and manage and maintain the CRM system.
  2. Product and Pricing: Product and Pricing responsibility can vary, depending on industry and type of digital property. When appropriate, they develop, license or merchandise anything sold on the site. They set pricing and drive requirements for aligning digital features with any new products based on those products’ parameters.
  3. Operations: Operations is responsible for fulfillment of the value proposition. For commerce sites, for example, this includes picking, packing and shipping orders. For something like a digital video aggregation site, responsibilities include finding, vetting and uploading new video content.
  4. Business Development: Business Development is focused on creating partnerships that increase traffic and sales, or find new streams of revenue.
  5. Customer Support: Customer support is responsible for maintaining knowledge of digital platforms, policies, and known issues and solutions. They assist customers with problems and questions and track customer interactions to report on trends and satisfaction levels.

How these teams and the roles within them fit together varies from company to company. However, it’s good practice to review this model to see, first, if you have these key roles represented in your organization. Then, make sure to create well-defined responsibilities and processes, and finally, look at how they function together, to see if they’re organized in the most effective manner. If your Digital Business, Digital Technology, and Extended Business teams are in sync, all your projects will benefit.

This article originally appeared on the Howard Tiersky blog
Image Credits: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.

Top 10 Human-Centered Change & Innovation Articles of June 2024

Drum roll please…

At the beginning of each month, we will profile the ten articles from the previous month that generated the most traffic to Human-Centered Change & Innovation. Did your favorite make the cut?

But enough delay, here are June’s ten most popular innovation posts:

  1. The Surprising Downside of Collaboration in Problem-Solving — by Robyn Bolton
  2. Designing Organizational Change and Transformation — by Stefan Lindegaard
  3. Four Principles of Successful Digital Transformation — by Greg Satell
  4. Managers Make the Difference – Four Common Mistakes Managers Make — by David Burkus
  5. Learning to Innovate — by Janet Sernack
  6. Think Outside Which Box? — by Howard Tiersky
  7. Innovation the Amazon Way — by Greg Satell
  8. Irrelevant Innovation — by John Bessant
  9. Nike Should Stop Blaming Working from Home for Their Innovation Struggles — by Robyn Bolton
  10. Time is a Flat Circle – Jamie Dimon’s Comments on AI Just Proved It — by Robyn Bolton

BONUS – Here are five more strong articles published in May that continue to resonate with people:

If you’re not familiar with Human-Centered Change & Innovation, we publish 4-7 new articles every week built around innovation and transformation insights from our roster of contributing authors and ad hoc submissions from community members. Get the articles right in your Facebook, Twitter or LinkedIn feeds too!

Have something to contribute?

Human-Centered Change & Innovation is open to contributions from any and all innovation and transformation professionals out there (practitioners, professors, researchers, consultants, authors, etc.) who have valuable human-centered change and innovation insights to share with everyone for the greater good. If you’d like to contribute, please contact me.

P.S. Here are our Top 40 Innovation Bloggers lists from the last four years:


Humans Wanted for the Decade’s Biggest Innovation Challenges


GUEST POST from Greg Satell

Every era is defined by the problems it tackles. At the beginning of the 20th century, harnessing the power of internal combustion and electricity shaped society. In the 1960s there was the space race. Since the turn of this century, we’ve learned how to decode the human genome and make machines intelligent.

None of these were achieved by one person or even one organization. In the case of electricity, Faraday and Maxwell established key principles in the early and mid 1800s. Edison, Westinghouse and Tesla came up with the first applications later in that century. Scores of people made contributions for decades after that.

The challenges we face today will be fundamentally different because they won’t be solved by humans alone, but through complex human-machine interactions. That will require a new division of labor in which the highest level skills won’t be things like the ability to retain information or manipulate numbers, but to connect and collaborate with other humans.

Making New Computing Architectures Useful

Technology over the past century has been driven by a long succession of digital devices. First vacuum tubes, then transistors and finally microchips transformed electrical power into something approaching an intelligent control system for machines. That has been the key to the electronic and digital eras.

Yet today that smooth procession is coming to an end. Microchips are hitting their theoretical limits and will need to be replaced by new computing paradigms such as quantum computing and neuromorphic chips. These new technologies will not be digital, but will work fundamentally differently than what we’re used to.

They will also have fundamentally different capabilities and will be applied in very different ways. Quantum computing, for example, will be able to simulate physical systems, which may revolutionize sciences like chemistry, materials research and biology. Neuromorphic chips may be thousands of times more energy efficient than conventional chips, opening up new possibilities for edge computing and intelligent materials.

There is still a lot of work to be done to make these technologies useful. To be commercially viable, not only do important applications need to be identified, but much like with classical computers, an entire generation of professionals will need to learn how to use them. That, in truth, may be the most significant hurdle.

Ethics For AI And Genomics

Artificial intelligence, once the stuff of science fiction, has become an everyday technology. We speak into our devices as a matter of course and expect to get back coherent answers. In the near future, we will see autonomous cars and other vehicles regularly deliver products and eventually become an integral part of our transportation system.

This opens up a significant number of ethical dilemmas. If given a choice to protect a passenger or a pedestrian, which should be encoded into the software of an autonomous car? Who gets to decide which factors are encoded into systems that make decisions about our education, whether we get hired or if we go to jail? How will these systems be trained? We all worry about who’s educating our kids, but who’s teaching our algorithms?

Powerful genomics techniques like CRISPR open up further ethical dilemmas. What are the guidelines for editing human genes? What are the risks of a mutation inserted in one species jumping to another? Should we revive extinct species, Jurassic Park style? What are the potential consequences?

What’s striking about the moral and ethical issues of both artificial intelligence and genomics is that they have no precedent, save for science fiction. We are in totally uncharted territory. Nevertheless, it is imperative that we develop a consensus about what principles should be applied, in what contexts and for what purpose.

Closing A Perpetual Skills Gap

Education used to be something that you underwent in preparation for your “real life.” Afterwards, you put away the schoolbooks and got down to work, raised a family and never really looked back. Even today, Pew Research reports that nearly one in four adults in the US did not read a single book last year.

Today technology is making many things we learned obsolete. In fact, a study at Oxford estimated that nearly half of the jobs that exist today are at risk of automation over the next 20 years. That doesn’t mean that there won’t be jobs for humans to do. In fact, we are in the midst of an acute labor shortage, especially in manufacturing, where automation is most pervasive.

Yet just as advanced technologies are eliminating the need for some skills, they are also increasingly able to help us learn new ones. A number of companies are using virtual reality to train workers and finding that it can boost learning efficiency by as much as 40%. IBM, together with Rensselaer Polytechnic Institute, has recently unveiled a system that helps you learn a new language such as Mandarin.

Perhaps the most important challenge is a shift in mindset. We need to treat education as a lifelong need that extends long past childhood. If we only retrain workers once their industry has become obsolete and they’ve lost their jobs, then we are needlessly squandering human potential, not to mention courting an abundance of misery.

Shifting Value To Humans

The industrial revolution replaced the physical labor of humans with that of machines. The result was often mind-numbing labor in factories. Yet further automation opened up new opportunities for knowledge workers who could design ways to boost the productivity of both humans and machines.

Today, we’re seeing a similar shift from cognitive to social skills. Go into a highly automated Apple Store, to take just one example, and you don’t see a futuristic robot dystopia, but a small army of smiling attendants on hand to help you. The future of technology always seems to be more human.

In much the same way, when I talk to companies implementing advanced technologies like artificial intelligence or cloud computing, the one thing I constantly hear is that the human element is often the most important. Unless you can shift your employees to higher level tasks, you miss out on many of the most important benefits.

What’s important to consider is that when a task is automated, it is also democratized and value shifts to another place. So, for example, e-commerce devalues the processing of transactions, but increases the value of things like customer service, expertise and resolving problems with orders, which is why we see all those smiling faces when we walk into an Apple Store.

That’s what we often forget about innovation. It’s essentially a very human endeavor and, to count as true progress, humans always need to be at the center.

— Article courtesy of the Digital Tonto blog
— Image credits: Pixabay


We Must Reinvent Our Organizations for A New Era of Innovation


GUEST POST from Greg Satell

In the first half of the 20th century, Alfred Sloan created the modern corporation at General Motors. In many ways, it was based on the military. Senior leadership at headquarters would make plans, while managers at individual units would be allocated resources and made responsible for achieving mission objectives.

The rise of digital technology made this kind of structure untenable. By the time strategic information was gathered centrally, it was often too old to be effective. In much the same way, by the time information flowed up from operating units, it was too late to alter the plan. It had already failed.

So in recent years, agility and iteration have become the mantra. Due to pressures from the market and from shareholders, long-term planning is often eschewed for the needs of the moment. Yet today the digital era is ending and organizations will need to shift once again. We’re going to need to learn to combine long-range planning with empowered execution.

Shifting From Iteration To Exploration

When Steve Jobs came up with the idea for a device that would hold “a thousand songs in my pocket,” it wasn’t technically feasible. There was simply no hard drive available that could fit that much storage into that little space. Nevertheless, within a few years a supplier developed the necessary technology and the iPod was born.

Notice how the bulk of the profits went to Apple, which designed the application, and very little to the supplier that developed the technology that made it possible. That’s because the technology for developing hard drives was very well understood. If it hadn’t been that supplier, another would have developed what Jobs needed within six months or so.

Yet today, we’re on the brink of a new era of innovation. New technologies, such as revolutionary computing architectures, genomics and artificial intelligence are coming to the fore that aren’t nearly as well understood as digital technology. So we will have to spend years learning about them before we can develop applications safely and effectively.

For example, companies ranging from Daimler and Samsung to JP Morgan Chase and Barclays have joined IBM’s Q Network to explore quantum computing, even though it will be years before that technology has a commercial impact. Leading tech companies have formed the Partnership on AI to better understand the consequences of artificial intelligence. Hundreds of companies have joined manufacturing hubs to learn about next generation technology.

It’s becoming more important to prepare than to adapt. By the time you realize the need to adapt, it may already be too late.

Building A Pipeline Of Problems To Be Solved

While the need to explore technologies long before they become commercially viable is increasing, competitive pressures show no signs of abating. Just because digital technology is not advancing the way it once did doesn’t mean that it will disappear. Many aspects of the digital world, such as the speed at which we communicate, will continue.

So it is crucial to build a continuous pipeline of problems to solve. Most will be fairly incremental, either improving on an existing product or developing new ones based on standard technology. Others will be a bit more aspirational, such as applying existing capabilities to a completely new market or adopting exciting new technology to improve service to existing customers.

However, as the value generated from digital technology continues to level off, much like it did for earlier technologies like internal combustion and electricity, there will be an increasing need to pursue grand challenges to solve fundamental problems. That’s how truly new markets are created.

Clearly, this presents some issues with resource allocation. Senior managers will have to combine the need to move fast and keep up with immediate competitive pressures with the long-term thinking it takes to invest in years of exploration with an uncertain payoff. There’s no magic bullet, but it is generally accepted that the 70/20/10 principle for incremental, adjacent and fundamental innovation is a good rule of thumb.
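As a rough illustration, the 70/20/10 rule of thumb described above can be sketched as a simple budget split (the dollar figure below is invented purely for the example):

```python
def split_innovation_budget(total: float) -> dict:
    """Allocate a budget by the 70/20/10 rule of thumb:
    70% incremental, 20% adjacent, 10% fundamental innovation."""
    return {
        "incremental": round(total * 0.70, 2),
        "adjacent": round(total * 0.20, 2),
        "fundamental": round(total * 0.10, 2),
    }

# e.g., a hypothetical $10M annual innovation budget:
allocation = split_innovation_budget(10_000_000)
```

The exact ratios matter less than the discipline: reserving a fixed, protected slice for fundamental exploration keeps long-payoff work from being crowded out by immediate competitive pressures.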

Empowering Connectivity

When Sloan designed the modern corporation, capacity was a key constraint. The core challenge was to design and build products for the mass market. So long-term planning to effectively organize plant, equipment, distribution and other resources was an important, if not decisive, competitive attribute.

Digitization and globalization, however, flipped this model and vertical integration gave way to radical specialization. Because resources were no longer concentrated in large enterprises, but distributed across global networks, integration within global supply chains became increasingly important.

With the rise of cloud technology, this trend became even more decisive in the digital world. Creating proprietary technology that is closed off to the rest of the world has become unacceptable to customers, who expect you to maintain APIs that integrate with open technologies and those of your competitors.

Over the next decade, it will become increasingly important to build similar connection points for innovation. For example, the US military set up the Rapid Equipping Force that was specifically designed to connect new technologies with soldiers in the field who needed them. Many companies are setting up incubators, accelerators and corporate venture funds for the same reason. Others have set up programs to connect to academic research.

What’s clear is that going it alone is no longer an option and we need to set up specific structures that not only connect to new technology, but ensure that it is understood and adopted throughout the enterprise.

The Leadership Challenge

The shift from one era to another doesn’t mean that old challenges are eliminated. Even today, we need to scale businesses to service mass markets and rapidly iterate new applications. The problems we need to take on in this new era of innovation won’t replace the old ones, they will simply add to them.

Still, we can expect value to shift from agility to exploration as fundamental technologies rise to the fore. Organizations that are able to deliver new computing architectures, revolutionary new materials and miracle cures will have a distinct competitive advantage over those who can merely engineer and design new applications.

It is only senior leaders that can empower these shifts and it won’t be easy. Shareholders will continue to demand quarterly profit performance. Customers will continue to demand product performance and service. Yet it is only those that are able to harness the technologies of this new era — which will not contribute to profits or customer satisfaction for years to come — that will survive the next decade.

The one true constant is that success eventually breeds failure. The skills and strategies of one era do not translate to another. To survive, the key organizational attribute will not be speed, agility or even operational excellence, but leadership that understands that when the game is up, you need to learn how to play a new one.

— Article courtesy of the Digital Tonto blog
— Image credits: Pixabay


Creating a Seamless and Unique Customer Experience


GUEST POST from Howard Tiersky

Most companies recognize that creating a seamless and unique customer experience is key to success in the digital world, but that’s not always easy to do. How can you deliver the optimal digital experience to your users?

If you’ve ever been to the Arctic Circle, you may have seen icebergs that are not only acres wide, but that rise hundreds of feet above sea level — truly massive objects. Yet what is perhaps even more amazing is that scientists tell us that almost 90% of a typical iceberg’s mass is underwater, not visible from the surface. If you are in the “iceberg business” — studying them for science or cutting through them for ships to pass — it’s quite important to understand not just the visible component, but the full scale and depth of the iceberg.

Similarly, most companies now recognize that creating a seamless, elegant and differentiated customer experience is key to success in this increasingly digital world. Defining that optimal experience is not necessarily an easy task. In fact, it can seem like a huge undertaking, and at FROM, it’s something that we spend a large portion of our time working with clients to optimize.

But we also see many companies struggling to execute on delivering their customer experience vision. There are many reasons for this, but a starting point for success is realizing that excellent customer experience is more than meets the eye. While the concrete manifestation of the experience is found in the brand’s digital properties, content, and features, this is just the part of the iceberg that sticks up above the water. Beneath the waterline are three additional supporting elements that must also be effectively managed in order to achieve an excellent customer experience and the associated business outcomes.

The FROM User Experience Iceberg

1. Technical Architecture

Outstanding customer experiences are supported by modern technology stacks that permit two essential capabilities:

Access From Any Touchpoint

Great customer experiences offer flexibility of touchpoint: they permit you to interact via web, phone, mobile, kiosk or other devices, with all actions instantly updated and available in a consistent manner. An example of what not to do: I once placed an order online and immediately realized I had made a mistake. I wanted to cancel it, but due to technical constraints, orders couldn’t be canceled on the website, only through the call center. So I called the call center, and they told me they wouldn’t be able to “see” my order (and therefore couldn’t cancel it) for about an hour, when the systems synchronized, and that I should call back then. Not a great or accessible customer experience.

Flexible Frameworks

Flexible frameworks can be modified rapidly to keep pace with frequently deployed changes. The number one secret to how great customer experiences got to be great? It’s not by having a genius team that gets it right the first time; it’s through an iterative process of testing and learning. To do that, you have to be able to efficiently code, test, and iterate or kill new ideas quickly.

Furthermore, the frameworks for presentation, business logic, and transaction processing need to be flexible. If user testing shows that changing the sequence of information collected from users during a checkout process might improve conversion, you need to be able to make a change like that reasonably simply. We often see companies with aging mainframe-based “back office” systems that are holding them back from re-engineering their customer experience because “that’s not how the legacy system works.” No matter how much pain it causes, companies in this situation need roadmaps to upgrade, redesign or replace these inflexible systems to permit the creative evolution of their customer experience.
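The kind of flexibility described above, such as reordering checkout steps without re-engineering the system, is often achieved by driving the flow from configuration rather than hard-coded logic. A minimal sketch, with step names and prompts invented purely for illustration:

```python
# Configuration, not code, determines the sequence of checkout steps.
# Reordering the flow for an A/B test means editing this list only.
CHECKOUT_FLOW = ["email", "shipping_address", "payment", "review"]

def run_checkout(flow: list[str]) -> list[str]:
    """Execute checkout steps in whatever order the configuration specifies."""
    prompts = {
        "email": "Enter email",
        "shipping_address": "Enter shipping address",
        "payment": "Enter payment details",
        "review": "Review order",
    }
    return [prompts[step] for step in flow]

# Testing a variant that collects payment earlier is a one-line config change:
variant = ["email", "payment", "shipping_address", "review"]
```

When the sequence lives in configuration, testing and iterating on the experience becomes a content change rather than a development project, which is precisely what makes rapid test-and-learn cycles feasible.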

2. Business Operations

Serving the digital customer effectively is not just about creating digital touchpoints, but about evolving the total experience with digital at the center. That means you will need to change the way you do business in a variety of spheres. Customers who use online chat to ask questions expect answers far faster than those who email, let alone those who send in snail mail. Digital customers opening an account at your bank don’t want to have to wait to receive a thick packet of forms in the mail that they have to sign in 17 different places. You may want to offer digital customers alternatives in “out of stock” situations (such as a direct ship) or permit them to customize their purchases in ways that weren’t previously possible. Truly optimizing for digital will probably change how you merchandise, your return policies, your customer support, customer communications, and, well, everything. It may require new roles, new processes or a re-organization of the company.

3. Business Model

One of the benefits customers see from digital is a huge improvement in the value equation. Skype has taken our long distance bill from hundreds of dollars to pennies. Spotify has given us access to practically any song ever recorded for a few dollars a month, and Netflix has done the same for movies. In many markets, Uber has halved the cost of a taxi. This is awesome for consumers, but threatening to incumbents whose business models are dependent on the pricing levels of legacy business models. Jeff Zucker, the former CEO of NBC, echoed this concern a decade ago when he bemoaned having to trade “analog dollars for digital pennies.”

Why are some companies able to offer consumers a “better deal”? Because digital can take substantial cost out of the equation, allowing more digitally centric companies to be more cost-competitive or shift to totally different business models (subscription access to huge content libraries instead of one-by-one DVD rental in the case of Netflix; offering the largest ground transportation fleet in the world without ever buying a single vehicle in the case of Uber; likewise eBay and Alibaba, two of the largest online stores, both of which stock no inventory). You can have a great website and app, but if the fundamental value equation of your business is no longer competitive, you are going to struggle.

Don’t Bolt On Digital

Digital started out as a means of communication. We then had the era of eCommerce, where we “bolted on” digital alternatives to access the same inventory and offers available in our non-digital channels. But today, the winners are “digitally-transformed” companies that are offering a digital value proposition and have a technology stack that empowers them to create a great customer experience, and the business processes necessary to support and deliver on it.

It may seem like a lot. And it is. The world is changing fast, and the companies that succeed in the future will be those that make the transition. The ones that don’t will wind up on the list with companies like Kodak, Polaroid, Blockbuster, Sports Authority, Borders, Linens ’n Things and Circuit City. You can use this as a high-level roadmap for what you need to do to keep up with the digital transformation era. If your formula is not working yet, ask yourself which of these three areas you might not be paying enough attention to, or adapting quickly enough.

This article originally appeared on the Howard Tiersky blog
Image Credits: Pexels


Top 10 Human-Centered Change & Innovation Articles of May 2024

Drum roll please…

At the beginning of each month, we will profile the ten articles from the previous month that generated the most traffic to Human-Centered Change & Innovation. Did your favorite make the cut?

But enough delay, here are May’s ten most popular innovation posts:

  1. Five Lessons from the Apple Car’s Demise — by Robyn Bolton
  2. Six Causes of Employee Burnout — by David Burkus
  3. Learning About Innovation – From a Skateboard? — by John Bessant
  4. Fighting for Innovation in the Trenches — by Geoffrey A. Moore
  5. A Case Study on High Performance Teams — by Stefan Lindegaard
  6. Growth Comes From What You Don’t Have — by Mike Shipulski
  7. Innovation Friction Risks and Pitfalls — by Howard Tiersky
  8. Difference Between Customer Experience Perception and Reality — by Shep Hyken
  9. How Tribalism Can Kill Innovation — by Greg Satell
  10. Preparing the Next Generation for a Post-Digital Age — by Greg Satell

BONUS – Here are five more strong articles published in April that continue to resonate with people:

If you’re not familiar with Human-Centered Change & Innovation, we publish 4-7 new articles every week built around innovation and transformation insights from our roster of contributing authors and ad hoc submissions from community members. Get the articles right in your Facebook, Twitter or LinkedIn feeds too!

Have something to contribute?

Human-Centered Change & Innovation is open to contributions from any and all innovation and transformation professionals out there (practitioners, professors, researchers, consultants, authors, etc.) who have valuable human-centered change and innovation insights to share with everyone for the greater good. If you’d like to contribute, please contact me.

P.S. Here are our Top 40 Innovation Bloggers lists from the last four years:

Four Principles of Successful Digital Transformation

GUEST POST from Greg Satell

When Steve Jobs and Apple launched the Macintosh with great fanfare in 1984, it was to be only one step in a long journey that began with Douglas Engelbart’s Mother of All Demos and the development of the Alto at Xerox PARC more than a decade before. The Macintosh was, in many ways, the culmination of everything that came before.

Yet it was far from the end of the road. In fact, it wasn’t until the late 1990s, after the rise of the Internet, that computers began to have a measurable effect on economic productivity. Until then, personal computers were mainly expensive devices for automating secretarial work and letting kids play video games.

The truth is that innovation is never a single event, but a process of discovery, engineering and transformation. Yet what few realize is that it is the last part, transformation, that is often the hardest and the longest. In fact, it usually takes about 30 years to go from an initial discovery to a major impact on the world. Here’s what you can do to move things along.

1. Identify A Keystone Change

About a decade before the Macintosh, Xerox invented the Alto, which had many of the features that the Macintosh later became famous for, such as a graphical user interface, a mouse and a bitmapped screen. Yet while the Macintosh became legendary, the Alto never really got off the ground and is now remembered, if at all, as little more than a footnote.

The difference in outcomes had much less to do with technology than it had to do with vision. While Xerox had grand plans to create the “office of the future,” Steve Jobs and Apple merely wanted to create a cool gadget for middle class kids and enthusiasts. Sure, they were only using it to write term papers and play video games, but they were still buying.

In my book, Cascades, I call this a “keystone change,” based on something my friend Talia Milgrom-Elcott told me about ecosystems. Apparently, every ecosystem has one or two keystone species that it needs to thrive. Innovation works the same way: you first need to identify a keystone change before a transformation can begin.

One common mistake is to immediately seek out the largest addressable market for a new product or service. That’s a good idea for an established technology or product category, but when you have something that’s truly new and different, it’s much better to find a hair-on-fire use case: a problem that someone needs solved so badly that they are willing to put up with early glitches and other shortcomings.

2. Indoctrinate Values, Beliefs And Skills

A technology is more than just a collection of transistors and code, or even a set of procedures; it needs specific values and skills to make it successful. For example, to shift your business to the cloud, you need to give up control of your infrastructure, which requires a completely new mindset. That’s why so many digital transformations fail. You can’t create a technology shift without a mind shift as well.

For example, when the Institute for Healthcare Improvement began its quest to save 100,000 lives through evidence-based quality practices, it spent significant time preparing the ground beforehand, so that people understood the ethos of the movement. It also created “change kits” and made sure the new procedures were easy to implement to maximize adoption.

In a similar vein, Facebook requires that all new engineers, regardless of experience or expertise, go through its engineering bootcamp. “Beyond the typical training program, at our Bootcamp new engineers see first-hand, and are able to infer, our unique system of values,” Eddie Ruvinsky, an Engineering Director at the company, told me.

“We don’t do this so much through training manuals and PowerPoint decks,” he continued, “but through allowing them to solve real problems working with real people who are going to be their colleagues. We’re not trying to shovel our existing culture at them, but preparing them to shape our culture for the future.”

Before you can change actions, you must first transform values, beliefs and skills.

3. Break Through Higher Thresholds Of Resistance

Growing up in Iowa in the 1930s, Everett Rogers noticed something strange in his father’s behavior. Although his father loved electrical gadgets, he was hesitant to adopt hybrid seed corn, even though it had higher yields. In fact, his father only made the switch after he saw his neighbor’s hybrid seed crop thrive during a drought in 1936.

This became the basis for Rogers’ now-familiar diffusion of innovations theory, in which an idea first gets popular with a group of early adopters and then only later spreads to other people. Later, Geoffrey Moore explained that most innovations fail because they never cross the chasm from the early adopters to the mainstream.

Both theories have become popular, but are often misunderstood. Early adopters are not a specific personality type, but people with a low threshold of resistance to a particular idea or technology. Remember that Rogers’s father was an early adopter of electrical gadgets, but was more reticent with seed corn.

As network theory pioneer Duncan Watts explained to me, an idea propagates through “easily influenced people influencing other easily influenced people.” So it’s important to start a transformation with people who are already enthusiastic, work out the inevitable kinks and then move on to people slightly more reticent, once you’ve proved success in that earlier group.
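Watts’s point can be made concrete with a toy threshold model (an illustrative sketch only, not from the article or Watts’s own work): each person adopts once the fraction of their adopted contacts reaches their personal resistance threshold, so a cascade that starts with low-threshold enthusiasts can step up to progressively more reticent people.

```python
# Toy Watts-style threshold cascade on a small network.
# A node adopts once the fraction of its adopted neighbors
# meets its personal resistance threshold. Values are illustrative.

def cascade(neighbors, thresholds, seeds):
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in neighbors.items():
            if node in adopted or not nbrs:
                continue
            frac = sum(n in adopted for n in nbrs) / len(nbrs)
            if frac >= thresholds[node]:
                adopted.add(node)
                changed = True
    return adopted

# A small chain: one enthusiast (low threshold) next to skeptics.
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
thresholds = {0: 0.1, 1: 0.5, 2: 0.5, 3: 0.5}
print(sorted(cascade(neighbors, thresholds, seeds=[0])))  # [0, 1, 2, 3]
```

Raise node 1’s threshold above 0.5 and the cascade stalls at the seed, which is the point: transformation spreads or dies at the most reticent early links.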

4. Focus On The Network, Not The Nodes

Perhaps the biggest mistake that organizations commit when trying to implement a new technology is to try to push everything from above, either through carrots, like financial incentives, or sticks, like disciplinary action for noncompliance. That may give senior management the satisfaction of “taking action,” but can often backfire.

People are much more willing to adopt something new if they feel like it’s their idea. The Institute for Healthcare Improvement, for example, designated selected institutions to act as “nodes” to help spread its movement. These weren’t watchdogs, but peers that were early adopters who could help their colleagues adopt the new procedures effectively.

In a similar vein, IBM has already taken significant steps to drive adoption of quantum computing, a technology that won’t be commercially available for years. First, it created the Q Experience, an early version of its technology available through the cloud for anyone to use. It has also set up its Q Network of early-adopter companies who are working with IBM to develop practical applications for quantum computing.

To date, tens of thousands have already run hundreds of thousands of experiments on Q Experience and about a dozen companies have joined the Q Network. So while there is still significant discovery and engineering to be done, the transformation is already well underway. It always pays to start early.

The truth is that transformation is always about the network, not the nodes. That’s why you need to identify a keystone change, indoctrinate the values and skills that will help you break through higher thresholds of resistance and continuously connect with a diverse set of stakeholders to drive change forward.

— Article courtesy of the Digital Tonto blog and previously appeared on
— Image credits: Unsplash

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

DNA May Be the Next Frontier of Computing and Data Storage

GUEST POST from Greg Satell

Data, as many have noted, has become the new oil, meaning that we no longer regard the information we store as merely a cost of doing business, but a valuable asset and a potential source of competitive advantage. It has become the fuel that powers advanced technologies such as machine learning.

A problem that’s emerging, however, is that our ability to produce data is outstripping our ability to store it. In fact, an article in the journal Nature predicts that by 2040, data storage will consume 10–100 times the expected supply of microchip-grade silicon, using current technology. Clearly, we need a data storage breakthrough.

One potential solution is DNA, which is a million times more information dense than today’s flash drives. It is also more stable, more secure and uses minimal energy. The problem is that it is currently prohibitively expensive. However, a startup that has emerged out of MIT, called CATALOG, may have found the breakthrough we’re looking for: low-cost DNA storage.

The Makings Of A Scientist-Entrepreneur

Growing up in his native Korea, Hyunjun Park never planned on a career in business, much less the technology business, but expected to become a biologist. He graduated with honors from Seoul National University and then went on to earn a PhD from the University of Wisconsin. Later he joined Tim Lu’s lab at MIT, which specializes in synthetic biology.

In an earlier time, he would have followed an established career path, from PhD to post-doc to assistant professor to tenure. These days, however, there is a growing trend for graduate students to get an entrepreneurial education in parallel with the traditional scientific curriculum. Park, for example, participated in both the Wisconsin Entrepreneurial Bootcamp and Start MIT.

He also met a kindred spirit in Nate Roquet, a PhD candidate who, about to finish his thesis, had started thinking about what to do next. Inspired by a talk given by the Chief Science Officer of IndieBio, a seed fund, the two began to talk in earnest about starting a company together based on their work in synthetic biology.

As they batted around ideas, the subject of DNA storage came up. By this time, the advantages of the technology were well known, but it was not considered practical, costing hundreds of thousands of dollars to store just a few hundred megabytes of data. However, the two did some back-of-the-envelope calculations and became convinced they could do it far more cheaply.

Moving From Idea To Product

The basic concept of DNA storage is simple. Essentially, you just encode the ones and zeros of digital code into the T, G, A and C’s of genetic code. However, stringing those genetic molecules together is tedious and expensive. The idea that Park and Roquet came up with was to use enzymes to alter strands of DNA, rather than building them up piece by piece.

Contrary to popular opinion, most traditional venture capital firms, such as those that populate Sand Hill Road in Silicon Valley, don’t invest in ideas. They invest in products. IndieBio, however, isn’t your typical investor. They provide only a small amount of seed capital, but offer other services, such as wet labs, entrepreneurial training and scientific mentorship. Park and Roquet reached out to them and found some interest.

“We invest in problems, not necessarily solutions,” Arvind Gupta, Founder at IndieBio, told me. “Here the problem is massive. How do you keep the world’s knowledge safe? We know DNA can last thousands of years and can be replicated very inexpensively. That’s a really big deal and Hyunjun and Nate’s approach was incredibly exciting.”

Once the pair entered IndieBio’s four-month program, they found both promise and disappointment. Their approach could dramatically reduce the cost of storing information in DNA, but not nearly quickly enough to build a commercially viable product. They would need to pivot if they were going to turn their idea into an actual business.

Scaling To Market

One flaw in CATALOG’s approach was that the process was too complex to scale. Yet they found that by starting with just a few different DNA strands and attaching them together, much like a printing press pre-arranges words in a book, they could come up with something that was not only scalable, but commercially viable from a cost perspective.
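The printing-press analogy can be made concrete with a back-of-the-envelope calculation (the numbers here are hypothetical, not CATALOG’s actual parameters): pre-build a small library of fragments, encode data in which fragments get joined, and a handful of components yields a large number of distinct strands.

```python
# Back-of-the-envelope illustration of the "printing press" idea:
# instead of synthesizing each strand base by base, pre-build a
# small library of fragments and encode data in how they combine.
# Numbers are hypothetical.

from itertools import product

library = ["frag_%d" % i for i in range(8)]  # 8 premade fragments
positions = 4                                # fragments joined per strand

# Each ordered combination of fragments is one addressable "word".
combos = list(product(library, repeat=positions))
print(len(combos))                    # 8**4 = 4096 distinct strands
print(len(combos).bit_length() - 1)   # 12 bits encoded per assembled strand
```

Only eight molecules need to be synthesized from scratch; everything else is cheap assembly, which is what makes the approach scale.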

The second problem was more thorny. Working with enzymes is incredibly labor intensive and, being biologists, Park and Roquet didn’t have the mechanical engineering expertise to make their process feasible. Fortunately, an advisor, Darren Link, connected the pair to Cambridge Consultants, an innovation consultancy that could help them.

“We started looking at the problem and it seemed that, on paper at least, we could make it work,” Richard Hammond, Technology Director and Head of Synthetic Biology at Cambridge Consultants, told me. “Now we’re about halfway through making the first prototype and we believe we can make it work and scale it significantly. We’re increasingly confident that we can solve the core technical challenges.”

In 2018, CATALOG introduced the world to Shannon, its prototype DNA writer, and in 2022 it announced its DNA computation work at the HPC User Forum. But CATALOG isn’t without competition from established storage media: LTO-9 tape, introduced in 2021, can store 18 TB per cartridge. CATALOG, for its part, is partnering with Seagate “on several initiatives to advance scalable and automated DNA-based storage and computation platforms, including making DNA-based platforms up to 1000 times smaller.” That should make the process competitive for archival storage, such as medical and legal records, as well as storing film libraries at movie studios.

“I think the fact that we’re inventing a completely new medium for data storage is really exciting,” Park told me. “I don’t think that we know yet what the true potential is because the biggest use cases probably don’t exist yet. What I do know is that our demand for data storage will soon outstrip our supply and we are thrilled about the possibility of solving that problem.”

Going Beyond Digital

A generation ago, the task of improving data storage would have been seen as solely a computer science problem. Yet today, the digital era is ending and we’re going to have to look further and wider for solutions to the problems we face. With the vast improvement in genomics, which is far outpacing Moore’s law these days, we can expect biology to increasingly play a role.

“Traditionally, information technology has been strictly the realm of electrical engineers, physicists and coders,” Gupta of IndieBio told me. “What we’re increasingly finding is that biology, which has been honed for millions of years by evolution, can often point the way to solutions that are more robust and potentially, much cheaper and more efficient.”

Yet this phenomenon goes far beyond biology. We’re also seeing similar accelerations in other fields, such as materials science and space-related technologies. We’re also seeing a new breed of investors, like IndieBio, that focus specifically on scientist entrepreneurs. “I consider myself a product of the growing ecosystem for scientific entrepreneurs at universities and in the investor community,” Park told me.

Make no mistake. We are entering a new era of innovation and the traditional Silicon Valley approach will not get us where we need to go. Instead, we need to forge greater collaboration between the scientific community, the investor community and government agencies to solve problems that are increasingly complex and interdisciplinary.

— Article courtesy of the Digital Tonto blog and previously appeared on
— Image credits: Pixabay

Video Killed More Than the Radio Star

by Braden Kelley

If you are a child of the eighties, you will remember when MTV went live on cable television with 24-hour-a-day music videos on August 1, 1981, opening with the broadcast of “Video Killed the Radio Star” by the Buggles.

But I was thinking the other day about how video (taken more broadly as streaming media – including television, movies, gaming, social media, and the internet) has killed far more things than just radio stars. Many activities have experienced substantial declines as people stay home and engage in these forms of entertainment – often by themselves – where in the past they would leave their homes for more human-to-human interactions.

The ten declines listed below have not only reshaped the American landscape – literally – but have also fed declines in the mental health of modern nations. Without further ado, here is the list:

1. Bowling Alleys:

Bowling alleys, once bustling with players and leagues, have faced challenges in recent years. The communal experience of bowling has been replaced by digital alternatives, impacting the industry.

2. Roller Skating Rinks:

Roller skating rinks, which were once popular hangout spots for families and teens, have seen declining attendance. The allure of roller disco and skating parties has waned as people turn to other forms of entertainment.

3. Drive-In Movie Theaters:

Drive-in movie theaters, iconic symbols of mid-20th-century entertainment, have faced challenges in recent decades. While they once provided a unique way to watch films from the comfort of your car, changing lifestyles and technological advancements have impacted their popularity.

4. Arcade Game Centers:

In the ’80s and ’90s, video game arcades were buzzing hubs of entertainment. People flocked to play games like Pac-Man, Street Fighter, and Mortal Kombat. Traditional arcade game centers, filled with pinball machines, classic video games, and ticket redemption games, have struggled to compete with home gaming consoles and online multiplayer experiences. The convenience of playing video games at home has led to a decline in arcade visits. Nostalgia keeps some arcades alive, but they are no longer as prevalent as they once were.

5. Miniature Golf Courses:

Mini-golf courses, with their whimsical obstacles and family-friendly appeal, used to be popular weekend destinations. However, the rise of digital entertainment has impacted their attendance. The allure of playing a round of mini-golf under the sun has faded for many.

6. Indoor Trampoline Parks:

Indoor trampoline parks gained popularity as a fun and active way to spend time with friends and family. However, the pandemic and subsequent lockdowns forced many of these parks to close temporarily. Even before the pandemic, the availability of home trampolines and virtual fitness classes reduced the need for indoor trampoline parks. People can now bounce and exercise at home or virtually, without leaving their living rooms.

7. Live Music Venues:

Live music venues, including small clubs, concert halls, and outdoor amphitheaters, have struggled due to changing entertainment preferences. While some artists and bands continue to perform, the rise of virtual concerts and streaming services has affected attendance. People can now enjoy live music from the comfort of their homes, reducing the need to attend physical venues. The pandemic also disrupted live events, leading to further challenges for the industry.

8. Public Libraries (In-Person Visits):

Public libraries, once bustling with readers and community events, have seen a decline in in-person visits. E-books, audiobooks, and online research resources have made it easier for people to access information without physically visiting a library. While libraries continue to offer valuable services, their role has shifted from primarily physical spaces to digital hubs for learning and exploration – and a place for latchkey kids to go and wait for their parents to get off work.

9. Shopping Malls:

Once bustling centers of retail and social activity, shopping malls have faced significant challenges in recent years. Several technological shifts have contributed to their decline, including e-commerce and online shopping, social media and influencer culture, and changing demographics and urbanization. Shopping malls are yet another place where parents no longer drop off the younger generation for the day.

And if that’s not enough, here is a bonus one for you:

10. Diners, Malt Shops, Coffee Shops, Dive Bars/Taverns, Neighborhood Pubs (UK) and Drive-In Burger Joints

If you’re a child of the seventies or eighties, you no doubt tuned in to watch Richie, Potsie, Joanie, Fonzie and Ralph Malph gather every day at Arnold’s. Unfortunately, many of the more social and casual drinking and dining places are experiencing declines as changes in diet, habit and technology have kicked in. Demographic changes (aging out of nostalgia) and the rise of food delivery apps and takeout culture have helped sign their death warrant.


In the ever-evolving landscape of entertainment, video and streaming media have reshaped our experiences and interactions. As we bid farewell to once-thriving institutions, we recognize both the convenience and the cost of this digital transformation: the echoes of strikes and spares have faded as digital alternatives replace the communal joy of bowling. Video may have transformed our world, but the memory of lost experiences lingers, urging us to seek balance between our screens and our souls. As these once-ubiquitous gathering places disappear, consumer tastes change and social isolation increases, will we as a society seek to reverse course, or evolve toward some new way of reconnecting as humans in person? And if so, how?

What other places and/or activities would you have added to the list?
(sound off in the comments)

p.s. Be sure to follow both my personal account and the Human-Centered Change and Innovation community on LinkedIn.

Image credit: Pixabay


Preparing the Next Generation for a Post-Digital Age

GUEST POST from Greg Satell

An education is supposed to prepare you for the future. Traditionally, that meant learning certain facts and skills, like when Columbus discovered America or how to do long division. Today, curricula have shifted to focus on a more global and digital world, like cultural history, basic computer skills and writing code.

Yet the challenges our kids will face will be much different from the ones we faced growing up, and many of the things a typical student learns in school today will no longer be relevant by the time he or she graduates college. In fact, a study at the University of Oxford found that 47% of today’s jobs are at risk of being automated over the next 20 years.

In 10 or 20 years, much of what we “know” about the world will no longer be true. The computers of the future will not be digital. Software code itself is disappearing, or at least becoming far less relevant. Many of what are considered good jobs today will be either automated or devalued. We need to rethink how we prepare our kids for the world to come.

Understanding Systems

The subjects we learned in school were mostly static. 2+2 always equaled 4 and Columbus always discovered America in 1492. Interpretations may have differed from place to place and evolved over time, but we were taught that the world was based on certain facts, and we were evaluated on the basis of knowing them.

Yet as the complexity theorist Sam Arbesman has pointed out, facts have a half-life and, as the accumulation of knowledge accelerates, those half-lives are shrinking. For example, when we learned computer programming in school, it was usually in BASIC, a now mostly defunct language. Today, Python is the most popular language, but it likely won’t be a decade from now.
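The half-life metaphor can be made literal with the standard exponential decay formula (the numbers here are illustrative, not Arbesman’s):

```python
# If a field's knowledge has a half-life of T years, the fraction of
# today's facts still current after t years is (1/2) ** (t / T).
# The half-life value below is hypothetical.

def still_current(t_years: float, half_life: float) -> float:
    """Fraction of facts still current after t_years."""
    return 0.5 ** (t_years / half_life)

# With a hypothetical 10-year half-life, after a 20-year career
# only a quarter of what you learned remains current:
print(round(still_current(20, 10), 2))  # 0.25
```

The point of the sketch is the compounding: shrink the half-life and the surviving fraction collapses, which is why learning to learn beats memorizing any fixed curriculum.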

Computers themselves will be very different as well, based less on the digital code of ones and zeros and more on quantum laws and the human brain. We will likely store less information on silicon and more in DNA. There’s no way to teach kids how these things will work because nobody, not even experts, is quite sure yet.

So kids today need to learn less about how things are today and more about the systems future technologies will be based on, such as quantum mechanics, genetics and the logic of code. One thing economists have consistently found is that it is routine jobs that are most likely to be automated. The best way to prepare for the future is to develop the ability to learn and adapt.

Applying Empathy And Design Skills

While machines are taking over many high level tasks, such as medical analysis and legal research, there are some things they will never do. For example, a computer will never strike out in a Little League game, have its heart broken or see its child born. So it is very unlikely, if not impossible, that a machine will be able to relate to a human like other humans can.

That absence of empathy makes it hard for machines to design products and processes that will maximize enjoyment and utility for humans. So design skills are likely to be in high demand for decades to come as basic production and analytical processes are increasingly automated.

We’ve already seen this process take place with regard to the Internet. In the early days, it was a very technical field. You had to be a highly skilled engineer to make a website work. Today, however, building a website is something any fairly intelligent high school student can do and much of the value has shifted to front-end tasks, like designing the user experience.

With the rise of artificial intelligence and virtual reality, our experiences with technology will become far more immersive, and that will increase the need for good design. For example, conversational analysts (yes, that’s a real job) are working with designers to create conversational intelligence for voice interfaces and, clearly, virtual reality will be much more design intensive than video ever was.

The Ability To Communicate Complex Ideas

Much of the recent emphasis in education has been around STEM subjects (science, technology, engineering and math) and proficiency in those areas is certainly important for today’s students to understand the world around them. However, many STEM graduates are finding it difficult to find good jobs.

On the other hand, the ability to communicate ideas effectively is becoming a highly prized skill. Consider Amazon, one of the most innovative and technically proficient organizations on the planet. A key factor in its success is its writing culture. The company is so fanatical about the ability to communicate that developing good writing skills is essential to building a successful career there.

Think about Amazon’s business and it becomes clear why. Sure, it employs highly adept engineers, but to create a truly superior product those people need to collaborate closely with designers, marketers, business development executives and others. To coordinate all that activity and keep everybody focused on delivering a specific experience to the customer, communication needs to be clear and coherent.

So while learning technical subjects like math and science is always a good idea, studying things like literature, history and philosophy is just as important.

Collaborating And Working In Teams

Traditionally, school work has been based on individual accomplishment. You were supposed to study at home, come in prepared and take your test without help. If you looked at your friend’s paper, it was called cheating and you got in a lot of trouble for it. We were taught to be accountable for achievements on our own merits.

Yet consider how the nature of work has changed, even in highly technical fields. In 1920, most scientific papers were written by sole authors, but by 1950 that had changed and co-authorship became the norm. Today, the average paper has four times as many authors as it did then and the work being done is far more interdisciplinary and done at greater distances than in the past.

Make no mistake. The high value work today is being done in teams and that will only increase as more jobs become automated. The jobs of the future will not depend as much on knowing facts or crunching numbers, but will involve humans collaborating with other humans to design work for machines. Collaboration will increasingly be a competitive advantage.

That’s why we need to pay attention not just to how our kids work and achieve academically, but how they play, resolve conflicts and make others feel supported and empowered. The truth is that value has shifted from cognitive skills to social skills. As kids will increasingly be able to learn complex subjects through technology, the most important class may well be recess.

Perhaps most of all, we need to be honest with ourselves and make peace with the fact that our kids’ educational experience will not — and should not — mirror our own. The world they will face will be far more complex and more difficult to navigate than anything we could imagine back in the days when Fast Times at Ridgemont High was still popular.

— Article courtesy of the Digital Tonto blog and previously appeared on
— Image credits: Pixabay
