At the beginning of each month, we will profile the ten articles from the previous month that generated the most traffic to Human-Centered Change & Innovation. Did your favorite make the cut?
But enough delay, here are June’s ten most popular innovation posts:
If you're not familiar with Human-Centered Change & Innovation, we publish 4-7 new articles every week built around innovation and transformation insights from our roster of contributing authors and ad hoc submissions from community members. Get the articles right in your Facebook, Twitter or LinkedIn feeds too!
Have something to contribute?
Human-Centered Change & Innovation is open to contributions from any and all innovation and transformation professionals out there (practitioners, professors, researchers, consultants, authors, etc.) who have valuable human-centered change and innovation insights to share with everyone for the greater good. If you’d like to contribute, please contact me.
P.S. Here are our Top 40 Innovation Bloggers lists from the last four years:
Every era is defined by the problems it tackles. At the beginning of the 20th century, harnessing the power of internal combustion and electricity shaped society. In the 1960s there was the space race. Since the turn of this century, we’ve learned how to decode the human genome and make machines intelligent.
None of these were achieved by one person or even one organization. In the case of electricity, Faraday and Maxwell established key principles in the early and mid 1800s. Edison, Westinghouse and Tesla came up with the first applications later in that century. Scores of people made contributions for decades after that.
The challenges we face today will be fundamentally different because they won’t be solved by humans alone, but through complex human-machine interactions. That will require a new division of labor in which the highest level skills won’t be things like the ability to retain information or manipulate numbers, but to connect and collaborate with other humans.
Making New Computing Architectures Useful
Technology over the past century has been driven by a long succession of digital devices. First vacuum tubes, then transistors and finally microchips transformed electrical power into something approaching an intelligent control system for machines. That has been the key to the electronic and digital eras.
Yet today that smooth procession is coming to an end. Microchips are hitting their theoretical limits and will need to be replaced by new computing paradigms such as quantum computing and neuromorphic chips. The new technologies will not be digital, but will work fundamentally differently from what we're used to.
They will also have fundamentally different capabilities and will be applied in very different ways. Quantum computing, for example, will be able to simulate physical systems, which may revolutionize sciences like chemistry, materials research and biology. Neuromorphic chips may be thousands of times more energy efficient than conventional chips, opening up new possibilities for edge computing and intelligent materials.
There is still a lot of work to be done to make these technologies useful. To be commercially viable, not only do important applications need to be identified, but much like with classical computers, an entire generation of professionals will need to learn how to use them. That, in truth, may be the most significant hurdle.
Ethics For AI And Genomics
Artificial intelligence, once the stuff of science fiction, has become an everyday technology. We speak into our devices as a matter of course and expect to get back coherent answers. In the near future, we will see autonomous cars and other vehicles regularly deliver products and eventually become an integral part of our transportation system.
This opens up a significant number of ethical dilemmas. If given a choice to protect a passenger or a pedestrian, which should be encoded into the software of an autonomous car? Who gets to decide which factors are encoded into systems that make decisions about our education, whether we get hired or if we go to jail? How will these systems be trained? We all worry about who's educating our kids, but who's teaching our algorithms?
Powerful genomics techniques like CRISPR open up further ethical dilemmas. What are the guidelines for editing human genes? What are the risks of a mutation inserted in one species jumping to another? Should we revive extinct species, Jurassic Park style? What are the potential consequences?
What’s striking about the moral and ethical issues of both artificial intelligence and genomics is that they have no precedent, save for science fiction. We are in totally uncharted territory. Nevertheless, it is imperative that we develop a consensus about what principles should be applied, in what contexts and for what purpose.
Closing A Perpetual Skills Gap
Education used to be something that you underwent in preparation for your “real life.” Afterwards, you put away the schoolbooks and got down to work, raised a family and never really looked back. Even today, Pew Research reports that nearly one in four adults in the US did not read a single book last year.
Today technology is making many things we learned obsolete. In fact, a study at Oxford estimated that nearly half of the jobs that exist today will be automated in the next 20 years. That doesn't mean that there won't be jobs for humans to do; in fact, we are in the midst of an acute labor shortage, especially in manufacturing, where automation is most pervasive.
Yet just as advanced technologies are eliminating the need for skills, they are also increasingly able to help us learn new ones. A number of companies are using virtual reality to train workers and finding that it can boost learning efficiency by as much as 40%. IBM, with the Rensselaer Polytechnic Institute, has recently unveiled a system that helps you learn a new language like Mandarin. This video shows how it works.
Perhaps the most important challenge is a shift in mindset. We need to treat education as a lifelong need that extends long past childhood. If we only retrain workers once their industry has become obsolete and they’ve lost their jobs, then we are needlessly squandering human potential, not to mention courting an abundance of misery.
Shifting Value To Humans
The industrial revolution replaced the physical labor of humans with that of machines. The result was often mind-numbing labor in factories. Yet further automation opened up new opportunities for knowledge workers who could design ways to boost the productivity of both humans and machines.
Today, we’re seeing a similar shift from cognitive to social skills. Go into a highly automated Apple Store, to take just one example, and you don’t see a futuristic robot dystopia, but a small army of smiling attendants on hand to help you. The future of technology always seems to be more human.
In much the same way, when I talk to companies implementing advanced technologies like artificial intelligence or cloud computing, the one thing I constantly hear is that the human element is often the most important. Unless you can shift your employees to higher-level tasks, you miss out on many of the most important benefits.
What’s important to consider is that when a task is automated, it is also democratized and value shifts to another place. So, for example, e-commerce devalues the processing of transactions, but increases the value of things like customer service, expertise and resolving problems with orders, which is why we see all those smiling faces when we walk into an Apple Store.
That's what we often forget about innovation. It's essentially a very human endeavor and, for it to count as true progress, humans always need to be at the center.
— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay
Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.
In the first half of the 20th century, Alfred Sloan created the modern corporation at General Motors. In many ways, it was based on the military. Senior leadership at headquarters would make plans, while managers at individual units would be allocated resources and made responsible for achieving mission objectives.
The rise of digital technology made this kind of structure untenable. By the time strategic information was gathered centrally, it was often too old to be effective. In much the same way, by the time information flowed up from operating units, it was too late to alter the plan. It had already failed.
So in recent years, agility and iteration have become the mantra. Due to pressures from the market and from shareholders, long-term planning is often eschewed for the needs of the moment. Yet today the digital era is ending and organizations will need to shift once again. We're going to need to learn to combine long-range planning with empowered execution.
Shifting From Iteration To Exploration
When Steve Jobs came up with the idea for a device that would hold “a thousand songs in my pocket,” it wasn’t technically feasible. There was simply no hard drive available that could fit that much storage into that little space. Nevertheless, within a few years a supplier developed the necessary technology and the iPod was born.
Notice how the bulk of the profits went to Apple, which designed the application, and very little went to the supplier that developed the technology that made it possible. That's because the technology for developing hard drives was very well understood. If it hadn't been that supplier, another would have developed what Jobs needed in six months or so.
Yet today, we’re on the brink of a new era of innovation. New technologies, such as revolutionary computing architectures, genomics and artificial intelligence are coming to the fore that aren’t nearly as well understood as digital technology. So we will have to spend years learning about them before we can develop applications safely and effectively.
For example, companies ranging from Daimler and Samsung to JP Morgan Chase and Barclays have joined IBM's Q Network to explore quantum computing, even though it will be years before that technology has a commercial impact. Leading tech companies have formed the Partnership on AI to better understand the consequences of artificial intelligence. Hundreds of companies have joined manufacturing hubs to learn about next-generation technology.
It’s becoming more important to prepare than adapt. By the time you realize the need to adapt, it may already be too late.
Building A Pipeline Of Problems To Be Solved
While the need to explore technologies long before they become commercially viable is increasing, competitive pressures show no signs of abating. Just because digital technology is not advancing the way it once did doesn’t mean that it will disappear. Many aspects of the digital world, such as the speed at which we communicate, will continue.
So it is crucial to build a continuous pipeline of problems to solve. Most will be fairly incremental, either improving on an existing product or developing new ones based on standard technology. Others will be a bit more aspirational, such as applying existing capabilities to a completely new market or adopting exciting new technology to improve service to existing customers.
However, as the value generated from digital technology continues to level off, much like it did for earlier technologies like internal combustion and electricity, there will be an increasing need to pursue grand challenges to solve fundamental problems. That’s how truly new markets are created.
Clearly, this presents some issues with resource allocation. Senior managers will have to combine the need to move fast and keep up with immediate competitive pressures with the long-term thinking it takes to invest in years of exploration with an uncertain payoff. There’s no magic bullet, but it is generally accepted that the 70/20/10 principle for incremental, adjacent and fundamental innovation is a good rule of thumb.
Empowering Connectivity
When Sloan designed the modern corporation, capacity was a key constraint. The core challenge was to design and build products for the mass market. So long-term planning to effectively organize plant, equipment, distribution and other resources was an important, if not decisive, competitive attribute.
Digitization and globalization, however, flipped this model and vertical integration gave way to radical specialization. Because resources were no longer concentrated in large enterprises, but distributed across global networks, integration within global supply chains became increasingly important.
With the rise of cloud technology, this trend became even more decisive in the digital world. Creating proprietary technology that is closed off to the rest of the world has become unacceptable to customers, who expect you to maintain APIs that integrate with open technologies and those of your competitors.
Over the next decade, it will become increasingly important to build similar connection points for innovation. For example, the US military set up the Rapid Equipping Force that was specifically designed to connect new technologies with soldiers in the field who needed them. Many companies are setting up incubators, accelerators and corporate venture funds for the same reason. Others have set up programs to connect to academic research.
What’s clear is that going it alone is no longer an option and we need to set up specific structures that not only connect to new technology, but ensure that it is understood and adopted throughout the enterprise.
The Leadership Challenge
The shift from one era to another doesn’t mean that old challenges are eliminated. Even today, we need to scale businesses to service mass markets and rapidly iterate new applications. The problems we need to take on in this new era of innovation won’t replace the old ones, they will simply add to them.
Still, we can expect value to shift from agility to exploration as fundamental technologies rise to the fore. Organizations that are able to deliver new computing architectures, revolutionary new materials and miracle cures will have a distinct competitive advantage over those who can merely engineer and design new applications.
It is only senior leaders that can empower these shifts and it won’t be easy. Shareholders will continue to demand quarterly profit performance. Customers will continue to demand product performance and service. Yet it is only those that are able to harness the technologies of this new era — which will not contribute to profits or customer satisfaction for years to come — that will survive the next decade.
The one true constant is that success eventually breeds failure. The skills and strategies of one era do not translate to another. To survive, the key organizational attribute will not be speed, agility or even operational excellence, but leadership that understands that when the game is up, you need to learn how to play a new one.
— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay
Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.
Most companies recognize that creating a seamless and unique customer experience is key to success in the digital world, but that’s not always easy to do. How can you deliver the optimal digital experience to your users?
If you've ever been to the Arctic Circle, there are icebergs that are not only acres wide, but that rise hundreds of feet above sea level — truly massive objects. Yet what is perhaps even more amazing is that scientists tell us that almost 90% of a typical iceberg's mass is underwater, and not visible from the surface. If you are in the "iceberg business" — studying them for science or cutting through them for ships to pass — it's quite important to understand not just the visible component, but the full scale and depth of the iceberg.
Similarly, most companies now recognize that creating a seamless, elegant and differentiated customer experience is key to success in this increasingly digital world. Defining that optimal experience is not necessarily an easy task. In fact, it can seem like a huge undertaking, and at FROM, it’s something that we spend a large portion of our time working with clients to optimize.
But we also see many companies struggling to execute on delivering their customer experience vision. There are many reasons for this, but a starting point of success is realizing that an excellent customer experience is more than meets the eye. While the concrete manifestation of the experience is found in the brand's digital properties, content, and features, this is just the part of the iceberg that sticks up above the water. Beneath the waterline are three additional supporting elements that must also be effectively managed in order to achieve an excellent customer experience and the associated business outcomes.
1. Technical Architecture
Outstanding customer experiences are supported by modern technology stacks that permit two essential capabilities:
Access From Any Touchpoint
Great customer experiences offer flexibility of touchpoint: they permit you not only to interact via web, phone, mobile, kiosk or other devices, but to have all actions instantly updated and available in a consistent manner. An example of what not to do: I placed an order on HomeDepot.com and immediately realized I made a mistake. I wanted to cancel it, but due to technical constraints, you can't cancel orders on the website, only from the call center. So I called the call center, and they told me they wouldn't be able to "see" my order (and therefore weren't able to cancel it) for about an hour, when the systems synchronize, and that I should call back then. Not a great or accessible customer experience.
Flexible Frameworks
Flexible frameworks can be modified rapidly to keep pace with the changes that are frequently deployed. The number one secret to how great customer experiences got to be great? It's not by having a genius team that gets it right the first time; it's through an iterative process of testing and learning. To do that, you have to be able to efficiently code, test, and iterate or kill new ideas quickly. Furthermore, the frameworks for presentation, business logic, and transaction processing need to be flexible. If user testing shows that changing the sequence of information collected from users during a checkout process might improve conversion, you need to be able to make a change like that reasonably simply. We often see companies with aging mainframe-based "back office" systems that are holding them back from being able to re-engineer their customer experience because "that's not how the legacy system works." However painful it may be, companies in this situation need roadmaps to upgrade, redesign or replace these inflexible systems to permit the creative evolution of their customer experience.
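To make that concrete, here is a minimal sketch (in Python, with purely hypothetical step names and fields) of what a flexible presentation layer can look like: the checkout sequence is data rather than hard-coded screens, so reordering steps after a user test is a configuration change, not a re-engineering project.

```python
# Hypothetical sketch: the checkout flow is configuration, not hard-coded screens.
# Step names and fields below are illustrative, not from any real system.
CHECKOUT_STEPS = [
    {"id": "shipping", "fields": ["name", "address", "zip"]},
    {"id": "payment",  "fields": ["card_number", "expiry", "cvv"]},
    {"id": "review",   "fields": []},
]

def render_checkout(steps):
    """Walk the configured steps in order; reordering them requires no code change."""
    for position, step in enumerate(steps, start=1):
        print(f"Step {position}: {step['id']} -> collect {step['fields']}")

# If user testing suggests asking for payment before shipping, swap the entries:
render_checkout(CHECKOUT_STEPS)
render_checkout([CHECKOUT_STEPS[1], CHECKOUT_STEPS[0], CHECKOUT_STEPS[2]])
```

The point of the sketch is the design choice, not the code itself: when the sequence lives in data, experiments are cheap; when it lives in a legacy transaction system, every test becomes a project.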
2. Business Operations
Serving the digital customer effectively is not just about creating digital touchpoints, but about evolving the total experience with digital at the center. That means you will need to change the way you do business in a variety of spheres. Customers who use online chat to ask questions expect answers far faster than those who email, let alone those who send in snail mail. Digital customers opening an account at your bank don’t want to have to wait to receive a thick packet of forms in the mail that they have to sign in 17 different places. You may want to offer digital customers alternatives in “out of stock” situations (such as a direct ship) or permit them to customize their purchases in ways that weren’t previously possible. Truly optimizing for digital will probably change how you merchandise, your return policies, your customer support, customer communications, and, well, everything. It may require new roles, new processes or a re-organization of the company.
3. Business Model
One of the benefits customers see from digital is a huge improvement in the value equation. Skype has taken our long distance bill from hundreds of dollars to pennies. Spotify has given us access to practically any song ever recorded for a few dollars a month, and Netflix has done the same for movies. In many markets, Uber has halved the cost of a taxi. This is awesome for consumers, but threatening to incumbents whose business models are dependent on the pricing levels of legacy business models. Jeff Zucker, the former CEO of NBC, echoed this concern a decade ago when he bemoaned having to trade “analog dollars for digital pennies.”
Why are some companies able to offer consumers a "better deal"? Because digital can take substantial cost out of the equation, allowing more digitally centric companies to be more cost-competitive or shift to totally different business models (subscription access to huge content libraries instead of one-by-one DVD rental in the case of Netflix; offering the largest ground transportation fleet in the world without ever buying a single vehicle in the case of Uber; likewise eBay and Alibaba, two of the largest online stores, both of which stock no inventory). You can have a great website and app, but if the fundamental value equation of your business is no longer competitive, you are going to struggle.
Don’t Bolt On Digital
Digital started out as a means of communication. We then had the era of eCommerce, where we “bolted on” digital alternatives to access the same inventory and offers available in our non-digital channels. But today, the winners are “digitally-transformed” companies that are offering a digital value proposition and have a technology stack that empowers them to create a great customer experience, and the business processes necessary to support and deliver on it.
It may seem like a lot. And it is. The world is changing fast, and the companies that succeed in the future will be those that make the transition. The ones that don't will wind up on the list with companies like Kodak, Polaroid, Blockbuster, Sports Authority, Borders, Linens 'n Things and Circuit City. You can use this as a high-level roadmap for what you need to do to keep up with the digital transformation era. If your formula is not working yet, ask yourself which of these three areas you might not be paying enough attention to, or adapting quickly enough.
This article originally appeared on the Howard Tiersky blog
Image Credits: Pexels
Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.
At the beginning of each month, we will profile the ten articles from the previous month that generated the most traffic to Human-Centered Change & Innovation. Did your favorite make the cut?
But enough delay, here are May’s ten most popular innovation posts:
If you're not familiar with Human-Centered Change & Innovation, we publish 4-7 new articles every week built around innovation and transformation insights from our roster of contributing authors and ad hoc submissions from community members. Get the articles right in your Facebook, Twitter or LinkedIn feeds too!
Have something to contribute?
Human-Centered Change & Innovation is open to contributions from any and all innovation and transformation professionals out there (practitioners, professors, researchers, consultants, authors, etc.) who have valuable human-centered change and innovation insights to share with everyone for the greater good. If you’d like to contribute, please contact me.
P.S. Here are our Top 40 Innovation Bloggers lists from the last four years:
When Steve Jobs and Apple launched the Macintosh with great fanfare in 1984, it was to be only one step in a long journey that began with Douglas Engelbart’s Mother of All Demos and the development of the Alto at Xerox PARC more than a decade before. The Macintosh was, in many ways, the culmination of everything that came before.
Yet it was far from the end of the road. In fact, it wouldn't be until the late 90s, after the rise of the Internet, that computers began to have a measurable effect on economic productivity. Until then, personal computers were mainly expensive devices for automating secretarial work and for kids to play video games.
The truth is that innovation is never a single event, but a process of discovery, engineering and transformation. Yet what few realize is that it is the last part, transformation, that is often the hardest and the longest. In fact, it usually takes about 30 years to go from an initial discovery to a major impact on the world. Here’s what you can do to move things along.
1. Identify A Keystone Change
About a decade before the Macintosh, Xerox invented the Alto, which had many of the features that the Macintosh later became famous for, such as a graphical user interface, a mouse and a bitmapped screen. Yet while the Macintosh became legendary, the Alto never really got off the ground and is now remembered, if at all, as little more than a footnote.
The difference in outcomes had much less to do with technology than it had to do with vision. While Xerox had grand plans to create the “office of the future,” Steve Jobs and Apple merely wanted to create a cool gadget for middle class kids and enthusiasts. Sure, they were only using it to write term papers and play video games, but they were still buying.
In my book, Cascades, I call this a "keystone change," based on something my friend Talia Milgrom-Elcott told me about ecosystems. Apparently, every ecosystem has one or two keystone species that it needs to thrive. Innovation works the same way: you first need to identify a keystone change before a transformation can begin.
One common mistake is to immediately seek out the largest addressable market for a new product or service. That's a good idea for an established technology or product category, but when you have something that's truly new and different, it's much better to find a "hair on fire" use case, a problem that someone needs solved so badly that they are willing to put up with early glitches and other shortcomings.
2. Indoctrinate Values, Beliefs And Skills
A technology is more than just a collection of transistors and code or even a set of procedures, but needs specific values and skills to make it successful. For example, to shift your business to the cloud, you need to give up control of your infrastructure, which requires a completely new mindset. That’s why so many digital transformations fail. You can’t create a technology shift without a mind shift as well.
For example, when the Institute for Healthcare Improvement began its quest to save 100,000 lives through evidence-based quality practices, it spent significant time preparing the ground beforehand, so that people understood the ethos of the movement. It also created “change kits” and made sure the new procedures were easy to implement to maximize adoption.
In a similar vein, Facebook requires that all new engineers, regardless of experience or expertise, go through its engineering bootcamp. “Beyond the typical training program, at our Bootcamp new engineers see first-hand, and are able to infer, our unique system of values,” Eddie Ruvinsky, an Engineering Director at the company, told me.
"We don't do this so much through training manuals and PowerPoint decks," he continued, "but through allowing them to solve real problems working with real people who are going to be their colleagues. We're not trying to shovel our existing culture at them, but preparing them to shape our culture for the future."
Before you can change actions, you must first transform values, beliefs and skills.
3. Break Through Higher Thresholds Of Resistance
Growing up in Iowa in the 1930s, Everett Rogers noticed something strange in his father's behavior. Although his father loved electrical gadgets, he was hesitant to adopt hybrid seed corn, even though it had higher yields. In fact, his father only made the switch after he saw his neighbor's hybrid seed crop thrive during a drought in 1936.
This became the basis for Rogers’ now-familiar diffusion of innovations theory, in which an idea first gets popular with a group of early adopters and then only later spreads to other people. Later, Geoffrey Moore explained that most innovations fail because they never cross the chasm from the early adopters to the mainstream.
Both theories have become popular, but are often misunderstood. Early adopters are not a specific personality type, but people with a low threshold of resistance to a particular idea or technology. Remember that Rogers’s father was an early adopter of electrical gadgets, but was more reticent with seed corn.
As network theory pioneer Duncan Watts explained to me, an idea propagates through “easily influenced people influencing other easily influenced people.” So it’s important to start a transformation with people who are already enthusiastic, work out the inevitable kinks and then move on to people slightly more reticent, once you’ve proved success in that earlier group.
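To see why starting with the enthusiasts matters, here is a toy sketch of a standard threshold model of adoption. The network and resistance levels are made up purely for illustration; this is the general idea Watts describes, not his actual research code.

```python
# Minimal threshold-of-resistance sketch: a person adopts once the fraction of
# their contacts who have already adopted meets or exceeds their personal threshold.
# The network and thresholds below are invented for illustration.

def simulate(neighbors, thresholds, seeds):
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for person, contacts in neighbors.items():
            if person in adopted or not contacts:
                continue
            share = sum(c in adopted for c in contacts) / len(contacts)
            if share >= thresholds[person]:
                adopted.add(person)
                changed = True
    return adopted

neighbors = {"ann": ["bob", "cal"], "bob": ["ann", "cal", "dee"],
             "cal": ["ann", "bob"], "dee": ["bob", "eve"], "eve": ["dee"]}
thresholds = {"ann": 0.3, "bob": 0.5, "cal": 0.4, "dee": 0.6, "eve": 0.9}

# Seed the change with the most enthusiastic (lowest-threshold) person and watch
# it ripple to the next-most-receptive people, while high-threshold holdouts remain.
print(simulate(neighbors, thresholds, seeds={"ann"}))  # {'ann', 'cal', 'bob'}
```

Notice that the cascade stalls at the highest-threshold people, which is exactly why you prove success with the receptive group first and only then take on the more reticent.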
4. Focus On The Network, Not The Nodes
Perhaps the biggest mistake that organizations commit when trying to implement a new technology is to try to push everything from above, either through carrots, like financial incentives, or sticks, like disciplinary action for noncompliance. That may give senior management the satisfaction of “taking action,” but can often backfire.
People are much more willing to adopt something new if they feel like it's their idea. The Institute for Healthcare Improvement, for example, designated selected institutions to act as "nodes" to help spread its movement. These weren't watchdogs, but peers that were early adopters who could help their colleagues adopt the new procedures effectively.
In a similar vein, IBM has already taken significant steps to drive adoption of quantum computing, a technology that won't be commercially available for years. First it created the Q Experience, an early version of its technology available through the cloud for anyone to use. It has also set up its Q Network of early-adopter companies who are working with IBM to develop practical applications for quantum computing.
To date, tens of thousands have already run hundreds of thousands of experiments on Q Experience and about a dozen companies have joined the Q Network. So while there is still significant discovery and engineering to be done, the transformation is already well underway. It always pays to start early.
The truth is that transformation is always about the network, not the nodes. That’s why you need to identify a keystone change, indoctrinate the values and skills that will help you break through higher thresholds of resistance and continuously connect with a diverse set of stakeholders to drive change forward.
— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Unsplash
Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.
Data, as many have noted, has become the new oil, meaning that we no longer regard the information we store as merely a cost of doing business, but a valuable asset and a potential source of competitive advantage. It has become the fuel that powers advanced technologies such as machine learning.
A problem that's emerging, however, is that our ability to produce data is outstripping our ability to store it. In fact, an article in the journal Nature predicts that by 2040, data storage will consume 10–100 times the expected supply of microchip-grade silicon, using current technology. Clearly, we need a data storage breakthrough.
One potential solution is DNA, which is a million times more information-dense than today's flash drives. It is also more stable, more secure and uses minimal energy. The problem is that it is currently prohibitively expensive. However, a startup that has emerged out of MIT, called CATALOG, may have found the breakthrough we're looking for: low-cost DNA storage.
The Makings Of A Scientist-Entrepreneur
Growing up in his native Korea, Hyunjun Park never planned on a career in business, much less the technology business, but expected to become a biologist. He graduated with honors from Seoul National University and then went on to earn a PhD from the University of Wisconsin. Later he joined Tim Lu’s lab at MIT, which specializes in synthetic biology.
In an earlier time, he would have followed an established career path, from PhD to post-doc to assistant professor to tenure. These days, however, there is a growing trend for graduate students to get an entrepreneurial education in parallel with the traditional scientific curriculum. Park, for example, participated in both the Wisconsin Entrepreneurial Bootcamp and Start MIT.
He also met a kindred spirit in Nate Roquet, a PhD candidate who, about to finish his thesis, had started thinking about what to do next. Inspired by a talk given by the Chief Science Officer of IndieBio, a seed fund, the two began to talk in earnest about starting a company together based on their work in synthetic biology.
As they batted around ideas, the subject of DNA storage came up. By this time, the advantages of the technology were well known, but it was not considered practical, costing hundreds of thousands of dollars to store just a few hundred megabytes of data. However, the two did some back-of-the-envelope calculations and became convinced they could do it far more cheaply.
Moving From Idea To Product
The basic concept of DNA storage is simple. Essentially, you just encode the ones and zeros of digital code into the T, G, A and C’s of genetic code. However, stringing those genetic molecules together is tedious and expensive. The idea that Park and Roquet came up with was to use enzymes to alter strands of DNA, rather than building them up piece by piece.
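As a rough illustration of the encoding half of that idea, here is a naive sketch that maps two bits to each base. It shows where DNA's information density comes from, but it deliberately ignores error correction, synthesis constraints and the enzymatic shortcut that makes CATALOG's approach different.

```python
# Naive illustration only: map each pair of bits to one DNA base (2 bits per base).
# Real DNA storage schemes add error correction, avoid problematic sequences
# (such as long runs of the same base), and CATALOG assembles premade strands with
# enzymes rather than synthesizing bases one at a time.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hi")
print(strand)         # "CGGACGGC" -- four bases per byte
print(decode(strand)) # b'hi'
```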
Contrary to popular opinion, most traditional venture capital firms, such as those that populate Sand Hill Road in Silicon Valley, don't invest in ideas. They invest in products. IndieBio, however, isn't your typical investor. They give only a small amount of seed capital, but offer other services, such as wet labs, entrepreneurial training and scientific mentorship. Park and Roquet reached out to them and found some interest.
"We invest in problems, not necessarily solutions," Arvind Gupta, Founder at IndieBio, told me. "Here the problem is massive. How do you keep the world's knowledge safe? We know DNA can last thousands of years and can be replicated very inexpensively. That's a really big deal and Hyunjun and Nate's approach was incredibly exciting."
Once the pair entered IndieBio’s four-month program, they found both promise and disappointment. Their approach could dramatically reduce the cost of storing information in DNA, but not nearly quickly enough to build a commercially viable product. They would need to pivot if they were going to turn their idea into an actual business.
Scaling To Market
One flaw in CATALOG’s approach was that the process was too complex to scale. Yet they found that by starting with just a few different DNA strands and attaching them together, much like a printing press pre-arranges words in a book, they could come up with something that was not only scalable, but commercially viable from a cost perspective.
The second problem was more thorny. Working with enzymes is incredibly labor intensive and, being biologists, Park and Roquet didn’t have the mechanical engineering expertise to make their process feasible. Fortunately, an advisor, Darren Link, connected the pair to Cambridge Consultants, an innovation consultancy that could help them.
“We started looking at the problem and it seemed that, on paper at least, we could make it work,” Richard Hammond, Technology Director and Head of Synthetic Biology at Cambridge Consultants, told me. “Now we’re about halfway through making the first prototype and we believe we can make it work and scale it significantly. We’re increasingly confident that we can solve the core technical challenges.”
In 2018 CATALOG introduced the world to Shannon, its prototype DNA writer. In 2022 CATALOG announced its DNA computation work at the HPC User Forum. But CATALOG isn't without competition in the space. For example, an LTO-9 tape cartridge can store 18 TB. CATALOG, for its part, is partnering with Seagate "on several initiatives to advance scalable and automated DNA-based storage and computation platforms, including making DNA-based platforms up to 1000 times smaller." That should make the process competitive for archival storage, such as medical and legal records, as well as storing film databases at movie studios.
“I think the fact that we’re inventing a completely new medium for data storage is really exciting,” Park told me. “I don’t think that we know yet what the true potential is because the biggest use cases probably don’t exist yet. What I do know is that our demand for data storage will soon outstrip our supply and we are thrilled about the possibility of solving that problem.”
Going Beyond Digital
A generation ago, the task of improving data storage would have been seen as solely a computer science problem. Yet today, the digital era is ending and we’re going to have to look further and wider for solutions to the problems we face. With the vast improvement in genomics, which is far outpacing Moore’s law these days, we can expect biology to increasingly play a role.
"Traditionally, information technology has been strictly the realm of electrical engineers, physicists and coders," Gupta of IndieBio told me. "What we're increasingly finding is that biology, which has been honed for millions of years by evolution, can often point the way to solutions that are more robust and, potentially, much cheaper and more efficient."
Yet this phenomenon goes far beyond biology. We’re also seeing similar accelerations in other fields, such as materials science and space-related technologies. We’re also seeing a new breed of investors, like IndieBio, that focus specifically on scientist entrepreneurs. “I consider myself a product of the growing ecosystem for scientific entrepreneurs at universities and in the investor community,” Park told me.
Make no mistake. We are entering a new era of innovation and the traditional Silicon Valley approach will not get us where we need to go. Instead, we need to forge greater collaboration between the scientific community, the investor community and government agencies to solve problems that are increasingly complex and interdisciplinary.
— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay
Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.
If you are a child of the eighties, you will remember when MTV went live 24 hours a day with music videos on cable television on August 1, 1981, with the broadcast of "Video Killed the Radio Star" by the Buggles.
But I was thinking the other day about how video (or, taken more broadly, streaming media – including television, movies, gaming, social media, and the internet) has killed far more things than just radio stars. Many activities have experienced substantial declines due to people staying home and engaging in these forms of entertainment – often by themselves – whereas in the past people would leave their homes to engage in more human-to-human interactions.
The ten declines listed below have not only reshaped the American landscape – literally – but have also served to feed declines in the mental health of modern nations at the same time. Without further ado, here is the list:
1. Bowling Alleys:
Bowling alleys, once bustling with players and leagues, have faced challenges in recent years. The communal experience of bowling has been replaced by digital alternatives, impacting the industry.
2. Roller Skating Rinks:
Roller skating rinks, which were once popular hangout spots for families and teens, have seen declining attendance. The allure of roller disco and skating parties has waned as people turn to other forms of entertainment.
3. Drive-In Movie Theaters:
Drive-in movie theaters, iconic symbols of mid-20th-century entertainment, have faced challenges in recent decades. While they once provided a unique way to watch films from the comfort of your car, changing lifestyles and technological advancements have impacted their popularity.
4. Arcade Game Centers:
In the ’80s and ’90s, video game arcades were buzzing hubs of entertainment. People flocked to play games like Pac-Man, Street Fighter, and Mortal Kombat. Traditional arcade game centers, filled with pinball machines, classic video games, and ticket redemption games, have struggled to compete with home gaming consoles and online multiplayer experiences. The convenience of playing video games at home has led to a decline in arcade visits. Nostalgia keeps some arcades alive, but they are no longer as prevalent as they once were.
5. Miniature Golf Courses:
Mini-golf courses, with their whimsical obstacles and family-friendly appeal, used to be popular weekend destinations. However, the rise of digital entertainment has impacted their attendance. The allure of playing a round of mini-golf under the sun has faded for many.
6. Indoor Trampoline Parks:
Indoor trampoline parks gained popularity as a fun and active way to spend time with friends and family. However, the pandemic and subsequent lockdowns forced many of these parks to close temporarily. Even before the pandemic, the availability of home trampolines and virtual fitness classes reduced the need for indoor trampoline parks. People can now bounce and exercise at home or virtually, without leaving their living rooms.
7. Live Music Venues:
Live music venues, including small clubs, concert halls, and outdoor amphitheaters, have struggled due to changing entertainment preferences. While some artists and bands continue to perform, the rise of virtual concerts and streaming services has affected attendance. People can now enjoy live music from the comfort of their homes, reducing the need to attend physical venues. The pandemic also disrupted live events, leading to further challenges for the industry.
8. Public Libraries (In-Person Visits):
Public libraries, once bustling with readers and community events, have seen a decline in in-person visits. E-books, audiobooks, and online research resources have made it easier for people to access information without physically visiting a library. While libraries continue to offer valuable services, their role has shifted from primarily physical spaces to digital hubs for learning and exploration – and a place for latchkey kids to go and wait for their parents to get off work.
10. Shopping Malls:
Once bustling centers of retail and social activity, shopping malls have faced significant challenges in recent years. Various shifts have contributed to their decline, including e-commerce and online shopping, social media and influencer culture, and changing demographics and urbanization. Shopping malls are yet another place where parents no longer drop off the younger generation for the day.
And if that’s not enough, here is a bonus one for you:
If you're a child of the seventies or eighties, you no doubt tuned in to watch Richie, Potsie, Joanie, Fonzie and Ralph Malph gather every day at Al's. Unfortunately, many of the more social and casual drinking and dining places are experiencing declines as changes in diet, habit and technology have kicked in. Demographic changes (aging out of nostalgia) and the rise of food delivery apps and takeout culture have helped to sign their death warrant.
Conclusion
In the ever-evolving landscape of entertainment, video and streaming media have reshaped our experiences and interactions. As we bid farewell to once-thriving institutions, we recognize both the convenience and the cost of this digital transformation. For example, the echoes of strikes and spares have faded as digital alternatives replace the communal joy of bowling. As we navigate this digital era, let us cherish what remains and adapt to what lies ahead. Video may have transformed our world, but the echoes of lost experiences linger, urging us to seek balance in our screens and our souls. As these once ubiquitous gathering places disappear, consumer tastes change and social isolation increases, will we as a society seek to reverse course or evolve to some new way of reconnecting as humans in person? And if so, how?
What other places and/or activities would you have added to the list?
(sound off in the comments)
Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.
An education is supposed to prepare you for the future. Traditionally, that meant learning certain facts and skills, like when Columbus discovered America or how to do long division. Today, curricula have shifted to focus on a more global and digital world, like cultural history, basic computer skills and writing code.
Yet the challenges that our kids will face will be much different from the ones we faced growing up, and many of the things a typical student learns in school today will no longer be relevant by the time he or she graduates college. In fact, a study at the University of Oxford found that 47% of today's jobs are at risk of being eliminated over the next 20 years.
In 10 or 20 years, much of what we “know” about the world will no longer be true. The computers of the future will not be digital. Software code itself is disappearing, or at least becoming far less relevant. Many of what are considered good jobs today will be either automated or devalued. We need to rethink how we prepare our kids for the world to come.
Understanding Systems
The subjects we learned in school were mostly static. 2+2 always equaled 4 and Columbus always discovered America in 1492. Interpretations may have differed from place to place and evolved over time, but we were taught that the world was based on certain facts and we were evaluated on the basis of knowing them.
Yet as the complexity theorist Sam Arbesman has pointed out, facts have a half-life and, as the accumulation of knowledge accelerates, those half-lives are shrinking. For example, when we learned computer programming in school, it was usually in BASIC, a now mostly defunct language. Today, Python is the most popular language, but it will likely not be a decade from now.
Computers themselves will be very different as well, based less on the digital code of ones and zeros and more on quantum laws and the human brain. We will likely store less information on silicon and more in DNA. There’s no way to teach kids how these things will work because nobody, not even experts, is quite sure yet.
So kids today need to learn less about how things are today and more about the systems future technologies will be based on, such as quantum mechanics, genetics and the logic of code. One thing economists have consistently found is that it is routine jobs that are most likely to be automated. The best way to prepare for the future is to develop the ability to learn and adapt.
Applying Empathy And Design Skills
While machines are taking over many high level tasks, such as medical analysis and legal research, there are some things they will never do. For example, a computer will never strike out in a Little League game, have its heart broken or see its child born. So it is very unlikely, if not impossible, that a machine will be able to relate to a human like other humans can.
That absence of empathy makes it hard for machines to design products and processes that will maximize enjoyment and utility for humans. So design skills are likely to be in high demand for decades to come as basic production and analytical processes are increasingly automated.
We’ve already seen this process take place with regard to the Internet. In the early days, it was a very technical field. You had to be a highly skilled engineer to make a website work. Today, however, building a website is something any fairly intelligent high school student can do and much of the value has shifted to front-end tasks, like designing the user experience.
With the rise of artificial intelligence and virtual reality, our experiences with technology will become far more immersive and that will increase the need for good design. For example, conversational analysts (yes, that's a real job) are working with designers to create conversational intelligence for voice interfaces and, clearly, virtual reality will be much more design intensive than video ever was.
The Ability To Communicate Complex Ideas
Much of the recent emphasis in education has been around STEM subjects (science, technology, engineering and math) and proficiency in those areas is certainly important for today’s students to understand the world around them. However, many STEM graduates are finding it difficult to find good jobs.
On the other hand, the ability to communicate ideas effectively is becoming a highly prized skill. Consider Amazon, one of the most innovative and technically proficient organizations on the planet. A key factor in its success is its writing culture. The company is so fanatical about the ability to communicate that developing good writing skills is essential to building a successful career there.
Think about Amazon’s business and it becomes clear why. Sure, it employs highly adept engineers, but to create a truly superior product those people need to collaborate closely with designers, marketers, business development executives and others. To coordinate all that activity and keep everybody focused on delivering a specific experience to the customer, communication needs to be clear and coherent.
So while learning technical subjects like math and science is always a good idea, studying things like literature, history and philosophy is just as important.
Collaborating And Working In Teams
Traditionally, school work has been based on individual accomplishment. You were supposed to study at home, come in prepared and take your test without help. If you looked at your friend’s paper, it was called cheating and you got in a lot of trouble for it. We were taught to be accountable for achievements on our own merits.
Yet consider how the nature of work has changed, even in highly technical fields. In 1920, most scientific papers were written by sole authors, but by 1950 that had changed and co-authorship became the norm. Today, the average paper has four times as many authors as it did then and the work being done is far more interdisciplinary and done at greater distances than in the past.
Make no mistake. The high value work today is being done in teams and that will only increase as more jobs become automated. The jobs of the future will not depend as much on knowing facts or crunching numbers, but will involve humans collaborating with other humans to design work for machines. Collaboration will increasingly be a competitive advantage.
That’s why we need to pay attention not just to how our kids work and achieve academically, but how they play, resolve conflicts and make others feel supported and empowered. The truth is that value has shifted from cognitive skills to social skills. As kids will increasingly be able to learn complex subjects through technology, the most important class may well be recess.
Perhaps most of all, we need to be honest with ourselves and make peace with the fact that our kids' educational experience will not — and should not — mirror our own. The world they will need to face will be far more complex and more difficult to navigate than anything we could imagine back in the days when Fast Times at Ridgemont High was still popular.
— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay
Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.
There's a lot to be learned about innovation by looking at good ideas that just didn't make it. We'd all like to believe that if we have an idea that genuinely improves upon something, and if we execute that idea correctly, the idea will be successful. But there is another factor to consider: innovation friction.
Here’s today’s example:
Back in the early 2000s, I was running part of the eCommerce practice for Ernst & Young. Around 2003 we moved into a shiny new building at 5 Times Square in New York City, right next to where the ball drops on New Year's Eve. The building was the first place I had ever seen with a keypad-controlled elevator. Instead of pushing an up or down button, the elevator is called from a numerical pad. You type in the number of the floor you are going to and receive a response from the keypad with a letter (such as "D"). That letter corresponds to the elevator that you have been assigned to. You go to "your" elevator and, when it arrives, it automatically takes you to your floor.
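What the keypad is doing is generally known as destination dispatch: because the system knows every rider's floor before assigning a car, it can batch people heading to the same or nearby floors onto the same elevator. A toy sketch of that assignment logic (a made-up grouping rule, not the actual algorithm used at 5 Times Square) might look like this:

```python
# Toy "destination dispatch" sketch: assign each rider a car based on the floor
# they key in, grouping identical or nearby floors so cars make fewer stops.
# The grouping rule here (one car per band of floors) is a simplification for illustration.
import string

def assign_car(destination_floor: int, floors_per_band: int = 5) -> str:
    """Return a car letter; riders going to nearby floors share a car."""
    band = (destination_floor - 1) // floors_per_band
    return string.ascii_uppercase[band % 26]

for floor in (3, 4, 12, 14, 27):
    print(f"Floor {floor} -> elevator {assign_car(floor)}")
# Floors 3 and 4 share car A; 12 and 14 share car C; 27 gets car F.
```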
This innovation delivers several benefits that improve the elevator experience:
1. It makes the elevators more efficient.
People going to lower floors are clustered together, as are people going to higher floors, and people going to the same floors are put on the same elevator. This allows more elevators to run express. Fewer stops. Less waiting for an elevator and a faster trip in the elevator.
2. It reduces “clicks.”
In a traditional system, an elevator user has to “call” the elevator and indicate their desire to go up or down. Once in the elevator, the user has to pick a floor. The old system was not a massive amount of effort, but the new system reduces two interactions to one. Presumably an improvement.
(Plus there’s no worrying about the kid in the elevator who decides to push all the buttons — there aren’t any!)
Is there a downside to this innovation?
Well, if you’re already in the elevator, there’s no opportunity to change your mind without getting off on the wrong floor and repeating the whole process. The biggest downside of this innovation is simply that it requires users to learn something new. In fact, when I moved into 5 Times Square, I found that when people came to meet with me for the first time, the first 10 minutes of our meeting was inevitably focused on their need to vent their reactions to our crazy elevators and how they couldn’t figure out how to use them!
Truthfully, the elevators were easy to use. Clear instructions were printed above the keypad, and the system worked very well. The problem was that it required users to relearn a skill they had fully and completely mastered (i.e., using an elevator) and start over at a beginner level — even if it only took 30 seconds to learn how to use the new elevator system.
I’ve watched with interest over the years to see if these types of elevators would take off. It turns out they didn’t. Very recently, I was visiting a client in Houston. The building had actually spent money to remove the keypad system and replace it with the traditional 2-step process. Wow. You know your innovation is not doing well when your customers are willing to invest tens of thousands of dollars to get rid of it and go back to the old way.
After much thought, I believe it's all because of the friction of asking people to re-learn how to push an elevator button. Some innovations don't require this. The new Boeing 787s have substantial innovation, but from a passenger standpoint, they work in basically the same way as the last round of airplanes. The innovations improve comfort, fuel efficiency, and other factors, but you recline the seat and return your tray table to an upright position in pretty much the same old way. Other innovations require learning: ATMs, DVRs, electric cars. All of these innovations have been successful, despite their learning requirements. However, the need for users to learn new behavior did slow their adoption. Innovation friction slows down adoption of innovations that require substantial behavior change, and even more so if they require learning. This is especially true if the innovation requires un-learning an old way of doing something. If the friction is greater than the momentum of the benefit of overcoming it, the innovation stops dead in its tracks.
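That "friction versus momentum" trade-off can be stated as a simple comparison. Here is a deliberately crude sketch, with made-up weights, of how you might reason about whether an innovation's perceived benefit outweighs the cost of relearning and unlearning:

```python
# Crude illustration of the "innovation friction" idea described above: adoption
# stalls when relearning plus unlearning costs outweigh the perceived benefit.
# The weights and threshold here are illustrative assumptions, not a validated model.

def likely_to_adopt(perceived_benefit: float, relearning_cost: float, unlearning_cost: float) -> bool:
    """Return True when the benefit 'momentum' exceeds the total friction."""
    friction = relearning_cost + unlearning_cost
    return perceived_benefit > friction

# The keypad elevator: modest benefit, but real relearning and unlearning costs.
print(likely_to_adopt(perceived_benefit=2.0, relearning_cost=1.5, unlearning_cost=1.0))  # False
# The web-based insurance quoting system: training time drops from months to days.
print(likely_to_adopt(perceived_benefit=8.0, relearning_cost=3.0, unlearning_cost=2.0))  # True
```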
An example of this friction is the metric system, which has made only very limited progress in the United States over the last 50 years, despite being clearly superior to the "English system." It's just too darn much trouble to change.
One last story about innovation friction from early in my career.
At that time, I was working with a lot of insurance companies creating web-based interfaces to replace traditional “green screen” systems used by insurance agents to quote and initiate new policies for auto and home insurance. It typically took a new hire 4-5 months to learn the system well enough to complete a policy quote — and well over a year to become truly proficient with it! We proudly designed replacement systems that anyone with basic computer skills could learn in a day or two at most, but found that some users were quite hostile to our efforts. They already knew how to use the green screen systems, and they were pretty darn fast with them. One Customer Support Agent even quoted Charlton Heston to me, saying I would only be able to take away her green screen if I pried it from her “cold dead hands.” Creepy? Yes. But also telling. Those old systems are gone now, because of the huge benefit of being able to train people on the new system so quickly. This benefit put the companies that used the new system in a position to more or less force that innovation onto other users.
Many successful innovations have required change and learning — automobiles, indoor toilets, smartphones. With all of these examples, we’ve seen many people willing to learn, for whom the “pain” of change was outweighed by the perceived benefit. But we also see a substantial number of users who resisted for years, saying, “No thanks, I like my outhouse (or horse and buggy or bank teller) just fine.” When conceiving or launching an innovation that requires learning, it’s important to consider the role innovation friction will play in adoption, where you can reduce it, and where you can increase the user’s willingness to accept it as the cost of the greater benefit.
This article originally appeared on the Howard Tiersky blog
Image Credits: Unsplash, Howard Tiersky
Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.