Author Archives: Greg Satell

About Greg Satell

Greg Satell is a popular speaker and consultant. His latest book, Cascades: How to Create a Movement That Drives Transformational Change, is available now. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.

We Must Reinvent Our Organizations for A New Era of Innovation


GUEST POST from Greg Satell

In the first half of the 20th century, Alfred Sloan created the modern corporation at General Motors. In many ways, it was based on the military. Senior leadership at headquarters would make plans, while managers at individual units would be allocated resources and made responsible for achieving mission objectives.

The rise of digital technology made this kind of structure untenable. By the time strategic information was gathered centrally, it was often too old to be effective. In much the same way, by the time information flowed up from operating units, it was too late to alter the plan. It had already failed.

So in recent years, agility and iteration have become the mantra. Due to pressures from the market and from shareholders, long-term planning is often eschewed for the needs of the moment. Yet today the digital era is ending and organizations will need to shift once again. We’re going to need to learn to combine long-range planning with empowered execution.

Shifting From Iteration To Exploration

When Steve Jobs came up with the idea for a device that would hold “a thousand songs in my pocket,” it wasn’t technically feasible. There was simply no hard drive available that could fit that much storage into that little space. Nevertheless, within a few years a supplier developed the necessary technology and the iPod was born.

Notice how the bulk of the profits went to Apple, which designed the application, and very little to the supplier that developed the technology that made it possible. That’s because the technology for developing hard drives was very well understood. If it hadn’t been that supplier, another would have developed what Jobs needed in six months or so.

Yet today, we’re on the brink of a new era of innovation. New technologies, such as revolutionary computing architectures, genomics and artificial intelligence are coming to the fore that aren’t nearly as well understood as digital technology. So we will have to spend years learning about them before we can develop applications safely and effectively.

For example, companies ranging from Daimler and Samsung to JP Morgan Chase and Barclays have joined IBM’s Q Network to explore quantum computing, even though it will be years before that technology has a commercial impact. Leading tech companies have formed the Partnership on AI to better understand the consequences of artificial intelligence. Hundreds of companies have joined manufacturing hubs to learn about next generation technology.

It’s becoming more important to prepare than adapt. By the time you realize the need to adapt, it may already be too late.

Building A Pipeline Of Problems To Be Solved

While the need to explore technologies long before they become commercially viable is increasing, competitive pressures show no signs of abating. Just because digital technology is not advancing the way it once did doesn’t mean that it will disappear. Many aspects of the digital world, such as the speed at which we communicate, will continue.

So it is crucial to build a continuous pipeline of problems to solve. Most will be fairly incremental, either improving on an existing product or developing new ones based on standard technology. Others will be a bit more aspirational, such as applying existing capabilities to a completely new market or adopting exciting new technology to improve service to existing customers.

However, as the value generated from digital technology continues to level off, much like it did for earlier technologies like internal combustion and electricity, there will be an increasing need to pursue grand challenges to solve fundamental problems. That’s how truly new markets are created.

Clearly, this presents some issues with resource allocation. Senior managers will have to combine the need to move fast and keep up with immediate competitive pressures with the long-term thinking it takes to invest in years of exploration with an uncertain payoff. There’s no magic bullet, but it is generally accepted that the 70/20/10 principle for incremental, adjacent and fundamental innovation is a good rule of thumb.
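To make the 70/20/10 rule of thumb concrete, here is a minimal sketch of how it translates a total innovation budget into the three horizons. The function name and the dollar figure are hypothetical; only the percentage split reflects the rule described above.

```python
# Hypothetical illustration of the 70/20/10 innovation budget rule.
# The split is the rule of thumb; the dollar amounts are made up.
def split_innovation_budget(total: float) -> dict:
    """Allocate a budget across incremental, adjacent and fundamental innovation."""
    return {
        "incremental": round(total * 0.70, 2),  # improve existing products
        "adjacent":    round(total * 0.20, 2),  # existing capabilities, new markets
        "fundamental": round(total * 0.10, 2),  # grand challenges, uncertain payoff
    }

print(split_innovation_budget(10_000_000))
```

The point of the exercise is not precision but discipline: a fixed split forces leaders to fund long-horizon exploration even when quarterly pressures argue against it.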

Empowering Connectivity

When Sloan designed the modern corporation, capacity was a key constraint. The core challenge was to design and build products for the mass market. So long-term planning to effectively organize plant, equipment, distribution and other resources was an important, if not decisive, competitive attribute.

Digitization and globalization, however, flipped this model and vertical integration gave way to radical specialization. Because resources were no longer concentrated in large enterprises, but distributed across global networks, integration within global supply chains became increasingly important.

With the rise of cloud technology, this trend became even more decisive in the digital world. Creating proprietary technology that is closed off to the rest of the world has become unacceptable to customers, who expect you to maintain APIs that integrate with open technologies and those of your competitors.

Over the next decade, it will become increasingly important to build similar connection points for innovation. For example, the US military set up the Rapid Equipping Force that was specifically designed to connect new technologies with soldiers in the field who needed them. Many companies are setting up incubators, accelerators and corporate venture funds for the same reason. Others have set up programs to connect to academic research.

What’s clear is that going it alone is no longer an option and we need to set up specific structures that not only connect to new technology, but ensure that it is understood and adopted throughout the enterprise.

The Leadership Challenge

The shift from one era to another doesn’t mean that old challenges are eliminated. Even today, we need to scale businesses to service mass markets and rapidly iterate new applications. The problems we need to take on in this new era of innovation won’t replace the old ones; they will simply add to them.

Still, we can expect value to shift from agility to exploration as fundamental technologies rise to the fore. Organizations that are able to deliver new computing architectures, revolutionary new materials and miracle cures will have a distinct competitive advantage over those who can merely engineer and design new applications.

It is only senior leaders that can empower these shifts and it won’t be easy. Shareholders will continue to demand quarterly profit performance. Customers will continue to demand product performance and service. Yet it is only those that are able to harness the technologies of this new era — which will not contribute to profits or customer satisfaction for years to come — that will survive the next decade.

The one true constant is that success eventually breeds failure. The skills and strategies of one era do not translate to another. To survive, the key organizational attribute will not be speed, agility or even operational excellence, but leadership that understands that when the game is up, you need to learn how to play a new one.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly
Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

Department Of Energy Programs Helping to Create an American Manufacturing Future


GUEST POST from Greg Satell

In the recession that followed the dotcom crash in 2000, the United States lost five million manufacturing jobs and, while there has been an uptick in recent years, all indications are that they may never be coming back. Manufacturing, perhaps more than any other sector, relies on deep networks of skills and assets that tend to be highly regional.

The consequences of this loss are deep and pervasive. Losing a significant portion of our manufacturing base has led not only to economic vulnerability, but to political polarization. Clearly, it is important to rebuild our manufacturing base. But to do that, we need to focus on new, more advanced technologies.

That’s the mission of the Advanced Manufacturing Office (AMO) at the Department of Energy. By providing a crucial link between the cutting-edge science done at the National Labs and private industry, it has been able to make considerable progress. As the collaboration between government scientists and private industry widens and deepens over time, US manufacturing may well be revived.

Linking Advanced Research To Private Industry

The origins of the Department of Energy date back to the Manhattan Project during World War II. The immense project was, in many respects, the start of “big science.” Hundreds of top researchers, used to working in small labs, traveled to newly established outposts to collaborate at places like Los Alamos, New Mexico and Oak Ridge, Tennessee.

After the war was over, the facilities continued their work and similar research centers were established to expand the effort. These National Labs became the backbone of the US government’s internal research efforts. In 1977, the National Labs, along with a number of other programs, were combined to form the Department of Energy.

One of the core missions of the AMO is to link the research done at the National Labs to private industry and the Lab Embedded Entrepreneurship Programs (LEEP) have been particularly successful in this regard. Currently, there are four such programs, Cyclotron Road, Chain Reaction Innovations, West Gate and Innovation Crossroads.

I was able to visit Innovation Crossroads at Oak Ridge National Laboratory and meet the entrepreneurs in its current cohort. Each is working to transform a breakthrough discovery into a market changing application, yet due to technical risk, would not be able to attract funding in the private sector. The LEEP program offers a small amount of seed money, access to lab facilities and scientific and entrepreneurial mentorship to help them get off the ground.

That’s just one of the ways that the AMO opens up the resources of the National Labs. It also helps businesses get access to supercomputing resources (5 out of the 10 fastest computers in the world are located in the United States, most of them at the National Labs) and conducts early stage research to benefit private industry.

Leading Public-Private Consortia

Another area in which the AMO supports private industry is through taking a leading role in consortia, such as the Manufacturing Institutes that were set up to give American companies a leg up in advanced areas such as clean energy, composite materials and chemical process intensification.

The idea behind these consortia is to create hubs that provide a critical link with government labs, top scientists at academic universities and private companies looking to solve real-world problems. It both helps firms advance in key areas and allows researchers to focus their work on where they will have the greatest possible impact.

For example, the Critical Materials Institute (CMI) was set up to develop alternatives to materials that are subject to supply disruptions, such as the rare earth elements that are critical to many high tech products and are largely produced in China. A few years ago it developed, along with several National Labs and Eck Industries, an advanced alloy that can replace more costly materials in components of advanced vehicles and aircraft.

“We went from an idea on a whiteboard to a profitable product in less than two years and turned what was a waste product into a valuable asset,” Robert Ivester, Director of the Advanced Manufacturing Office told me.

Technology Assistance Partnerships

In 2011, the International Organization for Standardization released its ISO 50001 guidelines. Like previous guidelines that focused on quality management and environmental impact, ISO 50001 recommends best practices to reduce energy use. These can benefit businesses through lower costs and result in higher margins.

Still, for harried executives facing cutthroat competition and demanding customers, figuring out how to implement new standards can easily get lost in the mix. So a third key role that the AMO plays is to assist companies who wish to implement new standards by providing tools, guides and access to professional expertise.

The AMO offers similar support for a number of critical areas, such as prototype development and also provides energy assessment centers for firms that want to reduce costs. “Helping American companies adopt new technology and standards helps keep American manufacturers on the cutting edge,” Ivester says.

“Spinning In” Rather Than Spinning Out

Traditionally we think of the role of government in business largely in terms of regulation. Legislatures pass laws and watchdog agencies enforce them so that we can have confidence in the food we eat, the products we buy and the medicines that are supposed to cure us. While that is clearly important, we often overlook how government can help drive innovation.

Inventions spun out of government labs include the Internet, GPS and laser scanners, just to name a few. Many of our most important drugs were also originally developed with government funding. Still, traditionally the work has mostly been done in isolation and only later offered to private companies through licensing agreements.

What makes the Advanced Manufacturing Office different than most scientific programs is that it is more focused on “spinning in” private industry rather than spinning out technologies. That enables executives and entrepreneurs with innovative ideas to power them with some of the best minds and advanced equipment in the world.

As Ivester put it to me, “Spinning out technologies is something that the Department of Energy has traditionally done. Increasingly, we want to spin ideas from industry into our labs, so that companies and entrepreneurs can benefit from the resources we have here. It also helps keep our scientists in touch with market needs and helps guide their research.”

Make no mistake, innovation needs collaboration. Combining the ideas from the private sector with the cutting edge science from government labs can help American manufacturing compete for the 21st century.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay


Innovation the Amazon Way


GUEST POST from Greg Satell

In 2014, Stephenie Landry was finishing up her one-year stint as Technical Advisor to Jeff Wilke, who oversees Amazon’s worldwide consumer business. The Technical Advisor role is a mentor program that allows high-potential executives to shadow a senior leader and learn first-hand. Her next assignment would define her career.

At most companies, an up-and-comer like Stephenie might be given a division to run or work on a big acquisition deal. Amazon, however, is a different kind of place. Landry wrote a memo outlining plans for a new service she’d been thinking about, Prime Now, which today offers one-hour delivery to customers in over 50 cities across 9 countries.

It’s no secret that Amazon is one of the world’s most innovative companies. Starting out as a niche service selling books online, it’s now not only a dominant retailer, but has pioneered new categories such as cloud computing and smart speakers. The key to its success is not any one process, but how it integrates a customer obsession deep within its culture and practice.

Starting With The Customer And Working Back

At the heart of how Amazon innovates is its six-page memo, which is required at the start of every new initiative. What makes it effective isn’t so much the structure of the document itself, but how it is used to embed a fanatical focus on the customer from day one. It’s something that is impressed upon Amazon employees early in their careers.

So the first step in developing Prime Now was to write a press release. Landry’s document was not only a description of the service, but how hypothetical customers would react to it. How did the service affect them? What surprised them about it? What concerns did they want addressed? The exercise forced her to internalize how Amazon customers would think and feel about Prime Now from the very start.

Next she wrote a series of FAQs anticipating concerns for both customers and for various stakeholders within the firm, like the CFO, operations people and the leadership of the Prime program. So Landry had to imagine what questions each would have, how any issues would be resolved and then explain things in clear, concise language.

All of this happens before the first meeting is held, a single line of code is written or an early prototype is built, because the company strongly believes that until you internalize the customer’s perspective, nothing else really matters. That’s key to how the company operates.

A Deeply Embedded Writing Culture

It’s no accident that the first step to develop a new product at Amazon is a memo rather than, say, a PowerPoint deck or a kickoff meeting. As Fareed Zakaria once put it, “Thinking and writing are inextricably intertwined. When I begin to write, I realize that my ‘thoughts’ are usually a jumble of half-baked, incoherent impulses strung together with gaping logical holes between them”.

So the company focuses on building writing skills early in an executive’s career. “Writing is a key part of our culture,” Landry told me. “I started writing press releases for smaller features and projects. One of my first was actually about packaging for diamond rings. Over years of practice and coaching, I got better at it.” Being able to write a good memo is also a key factor in advancement at Amazon. If you want to rise, you need to write and write well.

She also stressed to me the importance of brevity. “Keeping things concise and to the point forces you to think things through in a way that you wouldn’t otherwise. You can’t hide behind complexity, you actually have to work through it,” Landry said. Or, as another Amazon leader put it, “Perfection is achieved when there is nothing left to remove.”

Moreover, writing a memo isn’t a solo effort, but a collaborative process. Typically, executives spend a week or more sharing the document with colleagues, getting feedback, honing and tweaking it until every conceivable facet is deeply thought through.

Reinventing The Office Meeting

Another unique facet of Amazon’s culture is how meetings are run. In recent years, a common complaint throughout the corporate world is how the number of meetings has become so oppressive that it’s hard to get any work done. Research from MIT shows that executives spend an average of nearly 23 hours a week in meetings, up from less than 10 hours in 1960.

At Amazon, however, the six-page memo cuts down on the number of meetings that are called. If you have to spend a week writing a memo, you don’t just start sending out invites whenever the fancy strikes you. Similarly, the company’s practice of limiting attendance to roughly the number of people that can share two pizzas also promotes restraint.

Each meeting starts out with a 30-60 minute reading period in which everybody digests the memo. From there, all attendees are asked to share gut reactions — senior leaders typically speak last — and then delve into what might be missing, ask probing questions and drill down into any potential issues that may arise.

Subsequent meetings follow the same pattern to review the financials, hone the concept and review mockups as the team further refines ideas and assumptions. “It’s usually not one big piece of feedback that you get,” Landry stressed. “It is really all about the smaller questions, they help you get to a level of detail that really brings the idea to life.”

All of this may seem terribly cumbersome to fast moving executives accustomed to zinging in and out of meetings all day, but you often need to go slow to move fast. In the case of Prime Now, the service took just 111 days to go from an idea on a piece of paper to a product launch in one zip code in Manhattan and expanded quickly from there.

Co-evolving Culture And Practice

Every company innovates differently. Apple has a fanatical focus on design. IBM’s commitment to deep scientific research has enabled it to stay on the cutting edge and compete long after most of its competitors have fallen by the wayside. Google integrates a number of innovation strategies into a seamless whole.

What works for one company would likely not work for another, a fact that Amazon CEO Jeff Bezos highlighted in a recent letter to shareholders. “We never claim that our approach is the right one – just that it’s ours – and over the last two decades, we’ve collected a large group of like-minded people. Folks who find our approach energizing and meaningful,” he wrote.

The truth is that there is no one “true” path to innovation because innovation, at its core, is about solving problems and every enterprise chooses different problems to solve. While IBM might be happy to have its scientists work for decades on some arcane technology and Google gladly allows its employees to pursue pet projects, those things probably wouldn’t fly at Amazon.

However, the one thing that all great innovators have in common is that culture and practice are deeply intertwined. That’s what makes them so hard to copy. Anybody can write a six-page memo or start meetings with a reading period. It’s not those specific practices, but the commitment to the values they reflect, that has driven Amazon’s incredible success.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Unsplash


Four Principles of Successful Digital Transformation


GUEST POST from Greg Satell

When Steve Jobs and Apple launched the Macintosh with great fanfare in 1984, it was to be only one step in a long journey that began with Douglas Engelbart’s Mother of All Demos and the development of the Alto at Xerox PARC more than a decade before. The Macintosh was, in many ways, the culmination of everything that came before.

Yet it was far from the end of the road. In fact, it wouldn’t be until the late 90s, after the rise of the Internet, that computers began to have a measurable effect on economic productivity. Until then, personal computers were mainly an expensive device to automate secretarial work and for kids to play video games.

The truth is that innovation is never a single event, but a process of discovery, engineering and transformation. Yet what few realize is that it is the last part, transformation, that is often the hardest and the longest. In fact, it usually takes about 30 years to go from an initial discovery to a major impact on the world. Here’s what you can do to move things along.

1. Identify A Keystone Change

About a decade before the Macintosh, Xerox invented the Alto, which had many of the features that the Macintosh later became famous for, such as a graphical user interface, a mouse and a bitmapped screen. Yet while the Macintosh became legendary, the Alto never really got off the ground and is now remembered, if at all, as little more than a footnote.

The difference in outcomes had much less to do with technology than it had to do with vision. While Xerox had grand plans to create the “office of the future,” Steve Jobs and Apple merely wanted to create a cool gadget for middle class kids and enthusiasts. Sure, they were only using it to write term papers and play video games, but they were still buying.

In my book, Cascades, I call this a “keystone change,” based on something my friend Talia Milgrom-Elcott told me about ecosystems. Apparently, every ecosystem has one or two keystone species that it needs to thrive. Innovation works the same way: you first need to identify a keystone change before a transformation can begin.

One common mistake is to immediately seek out the largest addressable market for a new product or service. That’s a good idea for an established technology or product category, but when you have something that’s truly new and different, it’s much better to find a “hair on fire” use case, a problem that someone needs solved so badly that they are willing to put up with early glitches and other shortcomings.

2. Indoctrinate Values, Beliefs And Skills

A technology is more than just a collection of transistors and code or even a set of procedures, but needs specific values and skills to make it successful. For example, to shift your business to the cloud, you need to give up control of your infrastructure, which requires a completely new mindset. That’s why so many digital transformations fail. You can’t create a technology shift without a mind shift as well.

For example, when the Institute for Healthcare Improvement began its quest to save 100,000 lives through evidence-based quality practices, it spent significant time preparing the ground beforehand, so that people understood the ethos of the movement. It also created “change kits” and made sure the new procedures were easy to implement to maximize adoption.

In a similar vein, Facebook requires that all new engineers, regardless of experience or expertise, go through its engineering bootcamp. “Beyond the typical training program, at our Bootcamp new engineers see first-hand, and are able to infer, our unique system of values,” Eddie Ruvinsky, an Engineering Director at the company, told me.

“We don’t do this so much through training manuals and PowerPoint decks,” he continued, “but through allowing them to solve real problems working with real people who are going to be their colleagues. We’re not trying to shovel our existing culture at them, but preparing them to shape our culture for the future.”

Before you can change actions, you must first transform values, beliefs and skills.

3. Break Through Higher Thresholds Of Resistance

Growing up in Iowa in the 1930s, Everett Rogers noticed something strange in his father’s behavior. Although his father loved electrical gadgets, he was hesitant to adopt hybrid seed corn, even though it had higher yields. In fact, his father only made the switch after he saw his neighbor’s hybrid seed corn crop thrive during a drought in 1936.

This became the basis for Rogers’ now-familiar diffusion of innovations theory, in which an idea first gets popular with a group of early adopters and then only later spreads to other people. Later, Geoffrey Moore explained that most innovations fail because they never cross the chasm from the early adopters to the mainstream.

Both theories have become popular, but are often misunderstood. Early adopters are not a specific personality type, but people with a low threshold of resistance to a particular idea or technology. Remember that Rogers’s father was an early adopter of electrical gadgets, but was more reticent with seed corn.

As network theory pioneer Duncan Watts explained to me, an idea propagates through “easily influenced people influencing other easily influenced people.” So it’s important to start a transformation with people who are already enthusiastic, work out the inevitable kinks and then move on to people slightly more reticent, once you’ve proved success in that earlier group.
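Watts’s point can be sketched as a toy threshold model of diffusion. The network, the threshold values and the function below are all illustrative assumptions, not anything from Watts’s actual research: each person adopts once the fraction of their neighbors who have adopted meets their personal threshold of resistance.

```python
# Toy sketch of a threshold model of diffusion on a network.
# Each person adopts once the fraction of adopting neighbors reaches
# their personal resistance threshold. All numbers are illustrative.
def simulate_cascade(thresholds, neighbors, seeds, max_rounds=20):
    adopted = set(seeds)
    for _ in range(max_rounds):
        new = set()
        for person, thresh in enumerate(thresholds):
            if person in adopted:
                continue
            nbrs = neighbors[person]
            frac = sum(n in adopted for n in nbrs) / len(nbrs)
            if frac >= thresh:
                new.add(person)
        if not new:  # cascade has stalled
            break
        adopted |= new
    return adopted

# Five people in a line; person 0 is the enthusiastic early adopter.
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
thresholds = [0.0, 0.5, 0.5, 0.5, 0.5]  # progressively harder audiences
print(sorted(simulate_cascade(thresholds, neighbors, [0])))
```

Notice that the cascade reaches everyone only because each newly convinced person lowers the effective barrier for the next, which is exactly why starting with the enthusiasts and working outward matters: raise person 1’s threshold slightly and the whole cascade stalls at the seed.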

4. Focus On The Network, Not The Nodes

Perhaps the biggest mistake that organizations commit when trying to implement a new technology is to try to push everything from above, either through carrots, like financial incentives, or sticks, like disciplinary action for noncompliance. That may give senior management the satisfaction of “taking action,” but can often backfire.

People are much more willing to adopt something new if they feel like it’s their idea. The Institute for Healthcare Improvement, for example, designated selected institutions to act as “nodes” to help spread its movement. These weren’t watchdogs, but peers that were early adopters who could help their colleagues adopt the new procedures effectively.

In a similar vein, IBM has already taken significant steps to drive adoption of quantum computing, a technology that won’t be commercially available for years. First it created the Q Experience, an early version of its technology available through the cloud for anyone to use. It has also set up its Q Network of early adopter companies who are working with IBM to develop practical applications for quantum computing.

To date, tens of thousands have already run hundreds of thousands of experiments on Q Experience and about a dozen companies have joined the Q Network. So while there is still significant discovery and engineering to be done, the transformation is already well underway. It always pays to start early.

The truth is that transformation is always about the network, not the nodes. That’s why you need to identify a keystone change, indoctrinate the values and skills that will help you break through higher thresholds of resistance and continuously connect with a diverse set of stakeholders to drive change forward.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Unsplash


DNA May Be the Next Frontier of Computing and Data Storage


GUEST POST from Greg Satell

Data, as many have noted, has become the new oil, meaning that we no longer regard the information we store as merely a cost of doing business, but a valuable asset and a potential source of competitive advantage. It has become the fuel that powers advanced technologies such as machine learning.

A problem that’s emerging, however, is that our ability to produce data is outstripping our ability to store it. In fact, an article in the journal Nature predicts that by 2040, data storage will consume 10–100 times the expected supply of microchip-grade silicon, using current technology. Clearly, we need a data storage breakthrough.

One potential solution is DNA, which is a million times more information dense than today’s flash drives. It also is more stable, more secure and uses minimal energy. The problem is that it is currently prohibitively expensive. However, a startup that has emerged out of MIT, called CATALOG, may have found the breakthrough we’re looking for: low-cost DNA storage.

The Makings Of A Scientist-Entrepreneur

Growing up in his native Korea, Hyunjun Park never planned on a career in business, much less the technology business, but expected to become a biologist. He graduated with honors from Seoul National University and then went on to earn a PhD from the University of Wisconsin. Later he joined Tim Lu’s lab at MIT, which specializes in synthetic biology.

In an earlier time, he would have followed an established career path, from PhD to post-doc to assistant professor to tenure. These days, however, there is a growing trend for graduate students to get an entrepreneurial education in parallel with the traditional scientific curriculum. Park, for example, participated in both the Wisconsin Entrepreneurial Bootcamp and Start MIT.

He also met a kindred spirit in Nate Roquet, a PhD candidate who, about to finish his thesis, had started thinking about what to do next. Inspired by a talk given by the Chief Science Officer at a seed fund, IndieBio, the two began to talk in earnest about starting a company together based on their work in synthetic biology.

As they batted around ideas, the subject of DNA storage came up. By this time, the advantages of the technology were well known, but it was not considered practical, costing hundreds of thousands of dollars to store just a few hundred megabytes of data. However, the two did some back-of-the-envelope calculations and became convinced they could do it far more cheaply.

Moving From Idea To Product

The basic concept of DNA storage is simple. Essentially, you just encode the ones and zeros of digital code into the As, Ts, Gs and Cs of genetic code. However, stringing those genetic molecules together is tedious and expensive. The idea that Park and Roquet came up with was to use enzymes to alter strands of DNA, rather than building them up piece by piece.
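
To make the encoding concrete, here is a toy sketch of the naive base-by-base scheme the paragraph describes, packing two bits into each nucleotide. This is the conventional, expensive approach that CATALOG set out to replace, not their enzymatic method:

```python
# Map each pair of bits to one nucleotide, and back again.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA sequence, two bits per nucleotide."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> bytes:
    """Recover the original bytes from a DNA sequence."""
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

print(encode(b"Hi"))  # CAGACGGC
```

The scheme is lossless and dense on paper; the catch, as the article notes, is that chemically synthesizing each base one at a time is what makes it so costly.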

Contrary to popular opinion, most traditional venture capital firms, such as those that populate Sand Hill Road in Silicon Valley, don’t invest in ideas. They invest in products. IndieBio, however, isn’t your typical investor. They provide only a small amount of seed capital, but offer other services, such as wet labs, entrepreneurial training and scientific mentorship. Park and Roquet reached out to them and found some interest.

“We invest in problems, not necessarily solutions,” Arvind Gupta, Founder at IndieBio told me. “Here the problem is massive. How do you keep the world’s knowledge safe? We know DNA can last thousands of years and can be replicated very inexpensively. That’s a really big deal and Hyunjun and Nate’s approach was incredibly exciting.”

Once the pair entered IndieBio’s four-month program, they found both promise and disappointment. Their approach could dramatically reduce the cost of storing information in DNA, but not nearly quickly enough to build a commercially viable product. They would need to pivot if they were going to turn their idea into an actual business.

Scaling To Market

One flaw in CATALOG’s approach was that the process was too complex to scale. Yet they found that by starting with just a few different DNA strands and attaching them together, much like a printing press pre-arranges words in a book, they could come up with something that was not only scalable, but commercially viable from a cost perspective.
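
A quick back-of-the-envelope calculation shows why the "printing press" approach scales. If each of n positions in an assembled strand selects one of k premade components, the strand encodes n × log2(k) bits without synthesizing any new DNA base by base. The specific numbers below are hypothetical, purely for illustration:

```python
import math

def bits_per_strand(library_size: int, positions: int) -> float:
    """Bits encodable when each of `positions` slots in an assembled
    strand selects one of `library_size` premade components."""
    return positions * math.log2(library_size)

# Hypothetical numbers: a library of 100 premade strands, assembled
# 20 components long, encodes ~133 bits per molecule, with no
# base-by-base synthesis at write time.
print(round(bits_per_strand(100, 20), 1))  # 132.9
```

The design choice mirrors movable type: the expensive part (making the components) is done once, and writing becomes cheap rearrangement.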

The second problem was more thorny. Working with enzymes is incredibly labor intensive and, being biologists, Park and Roquet didn’t have the mechanical engineering expertise to make their process feasible. Fortunately, an advisor, Darren Link, connected the pair to Cambridge Consultants, an innovation consultancy that could help them.

“We started looking at the problem and it seemed that, on paper at least, we could make it work,” Richard Hammond, Technology Director and Head of Synthetic Biology at Cambridge Consultants, told me. “Now we’re about halfway through making the first prototype and we believe we can make it work and scale it significantly. We’re increasingly confident that we can solve the core technical challenges.”

In 2018 CATALOG introduced the world to Shannon, its prototype DNA writer, and in 2022 it announced its DNA computation work at the HPC User Forum. But CATALOG isn’t without competition for archival workloads: a conventional LTO-9 tape cartridge, for example, can already store 18 TB. CATALOG, for its part, is partnering with Seagate “on several initiatives to advance scalable and automated DNA-based storage and computation platforms, including making DNA-based platforms up to 1000 times smaller.” That should make the process competitive for archival storage, such as medical and legal records as well as storing film databases at movie studios.

“I think the fact that we’re inventing a completely new medium for data storage is really exciting,” Park told me. “I don’t think that we know yet what the true potential is because the biggest use cases probably don’t exist yet. What I do know is that our demand for data storage will soon outstrip our supply and we are thrilled about the possibility of solving that problem.”

Going Beyond Digital

A generation ago, the task of improving data storage would have been seen as solely a computer science problem. Yet today, the digital era is ending and we’re going to have to look further and wider for solutions to the problems we face. With the vast improvement in genomics, which is far outpacing Moore’s law these days, we can expect biology to increasingly play a role.

“Traditionally, information technology has been strictly the realm of electrical engineers, physicists and coders,” Gupta of IndieBio told me. “What we’re increasingly finding is that biology, which has been honed for millions of years by evolution, can often point the way to solutions that are more robust and, potentially, much cheaper and more efficient.”

Yet this phenomenon goes far beyond biology. We’re also seeing similar accelerations in other fields, such as materials science and space-related technologies. We’re also seeing a new breed of investors, like IndieBio, that focus specifically on scientist entrepreneurs. “I consider myself a product of the growing ecosystem for scientific entrepreneurs at universities and in the investor community,” Park told me.

Make no mistake. We are entering a new era of innovation and the traditional Silicon Valley approach will not get us where we need to go. Instead, we need to forge greater collaboration between the scientific community, the investor community and government agencies to solve problems that are increasingly complex and interdisciplinary.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay

How Tribalism Can Kill Innovation

GUEST POST from Greg Satell

While history tends to single out individuals, the truth is that behind the story of any heroic leader you will find a network of loyal supporters, active collaborators and outside facilitators. Nobody accomplishes anything significant alone.

That’s probably why it’s become fashionable for pundits to encourage us to “find our tribe,” a network of like-minded people who share our ambitions. Don’t listen to them. The truth is that great things are achieved not by taking comfort from your tribe, but by going beyond it and reaching out to those who aren’t of like mind.

The problem with focusing too much on your tribe is that those people tend to think the same way you do. They frequent the same places, watch the same TED talks and read the same blogs. That may be great for giving you some comfort and confidence, but it also acts as an echo chamber that will reinforce flawed assumptions and lead you down a false path.

The Problem With Closed Networks

In 2005, a team of researchers decided to study why some Broadway plays become hits and others flop. They looked at all the usual factors, such as production budget, marketing budget and the track record of the director, but they found that the most important factor was the informal network of relationships among the cast and crew.

If no one had ever worked together before, both financial and creative results tended to be poor. However, if the networks among the cast and crew became too dense—for all intents and purposes, becoming a tribe—performance also suffered. It was the teams that had elements of both, strong ties and new blood, that had the greatest success.

The same effect has been found elsewhere. Studies of star engineers at Bell Labs, the German automotive industry and currency traders have shown that tightly clustered groups, combined with long-range “weak ties” that allow information to flow freely among disparate clusters of activity, consistently outperform closed networks of like-minded people.

Just as we need to invest in building strong, trustful relationships, we also need to go beyond our comfort zone and seek out new connections. It’s far too easy to hide in a tribe.

The Discomfort of Diversity

While studies show that closed networks lead to worse performance, it has long been established that diversity improves performance. Researchers at the University of Michigan found that diverse groups can solve problems better than a more homogeneous team of greater objective ability. Another study that simulated markets showed that ethnic diversity deflated asset bubbles.

While the studies noted above merely simulate diversity in a controlled setting, there is also evidence from the real world that diversity produces better outcomes. A McKinsey report that covered 366 public companies in a variety of countries and industries found that those which were more ethnically and gender diverse performed significantly better than others.

Yet diversity also has a downside. In Political Tribes, Yale Professor Amy Chua notes that we are hardwired to be suspicious of others. For example, in a study where young children were randomly assigned to red or blue groups, they preferred pictures of other kids whose t-shirts matched their own group’s color. A study of adults had similar findings.

So you can see the attraction of tribes. We feel uncomfortable with people who we perceive as different. Surrounding ourselves with people who see things the way we do, on the other hand, makes us feel confident and powerful.

Mixing With The Heathens

Growing up in Iowa in the 1930s, Everett Rogers noticed something strange in his father’s behavior. Although his father loved electrical gadgets, he was hesitant to adopt hybrid seed corn, even though it had higher yields. In fact, his father only made the switch after he saw his neighbor’s hybrid crop thrive during a drought in 1936.

This became the inspiration for Rogers’ now-familiar diffusion of innovations theory, in which an idea first gets popular with a group of early adopters and then only later spreads to other people. Geoffrey Moore later pointed out that most innovations fail because they never cross the chasm from the early adopters to the mainstream.

A study done by researchers at Kellogg and Stanford explains why. They put together groups of college students to solve a murder mystery. The groups made up of students from the same sorority or fraternity felt more confident and successful, even though they performed worse on the task than integrated groups that experienced more conflict, uncertainty and doubt.

That’s the problem with staying in your tribe. Sure, it feels great to have your ideas supported and reinforced by people you like and respect, but they are doing so because they already believe the same things that you do. To actually achieve something worthwhile, however, you have to go beyond preaching to the choir and start mixing with the heathens.

Do You Want To Make A Point Or Do You Want To Make A Difference?

In my book, Cascades, I cover a wide range of movements. Some, like the civil rights movement and the campaign to save 100,000 lives, succeeded brilliantly. Others, like Occupy and the technology companies along Boston’s Route 128, failed miserably. Another thing I found is that many movements that ultimately succeeded stumbled at first because they failed to go beyond their tribe.

Here’s what Srdja Popović, who helped lead the Otpor movement that overthrew the brutal regime of Slobodan Milošević in 2000, told me about the initial student protests in 1992.

These were very ‘Occupy’ type of protests where we occupied the five biggest universities and lived there in our little islands of common sense with intellectuals and rock bands while the rest of the country was more or less supportive of Milošević’s idea. And this is where we began to understand that staying in your little blurb of common sense was not going to save the country.

In a similar vein, Nelson Mandela started out as an angry nationalist, but eventually learned that to get results, he would have to actively collaborate with others that didn’t quite see things the same way he did. In Poland, Solidarity’s first actions were disastrous, because they only involved workers. It was only through a later alliance between workers, intellectuals and the church that the movement ultimately succeeded.

Today, both America and the world have become increasingly tribal and it’s easy to retreat into what Srdja calls “your little blurb of common sense.” You can state your beliefs, make your point and see the heads nod around you. You can live in comfort, knowing that any voices of dissent will be quickly shouted down, as you self-righteously feel they should be.

However, at some point, you will have to decide if you want to make a point or whether you want to make a difference. To achieve anything worthwhile, you have to go beyond your tribe.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Unsplash

Preparing the Next Generation for a Post-Digital Age

GUEST POST from Greg Satell

An education is supposed to prepare you for the future. Traditionally, that meant learning certain facts and skills, like when Columbus discovered America or how to do long division. Today, curricula have shifted to focus on a more global and digital world, like cultural history, basic computer skills and writing code.

Yet the challenges our kids will face will be much different from the ones we faced growing up, and many of the things a typical student learns in school today will no longer be relevant by the time he or she graduates college. In fact, a study at the University of Oxford found that 47% of today’s jobs are at risk of being eliminated over the next 20 years.

In 10 or 20 years, much of what we “know” about the world will no longer be true. The computers of the future will not be digital. Software code itself is disappearing, or at least becoming far less relevant. Many of what are considered good jobs today will be either automated or devalued. We need to rethink how we prepare our kids for the world to come.

Understanding Systems

The subjects we learned in school were mostly static. 2+2 always equaled 4 and Columbus always discovered America in 1492. Interpretations may have differed from place to place and evolved over time, but we were taught that the world was based on certain facts and we were evaluated on the basis of knowing them.

Yet as the complexity theorist Sam Arbesman has pointed out, facts have a half-life and, as the accumulation of knowledge accelerates, those half-lives are shrinking. For example, when we learned computer programming in school, it was usually in BASIC, a now mostly defunct language. Today, Python is the most popular language, but it may well not be a decade from now.
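
Arbesman’s half-life metaphor can be made literal: if a field’s facts decay with half-life h, the fraction still current after t years is 0.5^(t/h). A small illustration, with the 10-year half-life purely hypothetical:

```python
def fraction_current(t_years: float, half_life: float) -> float:
    """Fraction of a field's facts still current after t years,
    assuming exponential decay with the given half-life."""
    return 0.5 ** (t_years / half_life)

# With a hypothetical 10-year half-life, only a quarter of what a
# student learns is still current 20 years later.
print(fraction_current(20, 10))  # 0.25
```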

Computers themselves will be very different as well, based less on the digital code of ones and zeros and more on quantum laws and the human brain. We will likely store less information on silicon and more in DNA. There’s no way to teach kids how these things will work because nobody, not even experts, is quite sure yet.

So kids today need to learn less about how things are today and more about the systems future technologies will be based on, such as quantum mechanics, genetics and the logic of code. One thing economists have consistently found is that it is routine jobs that are most likely to be automated. The best way to prepare for the future is to develop the ability to learn and adapt.

Applying Empathy And Design Skills

While machines are taking over many high level tasks, such as medical analysis and legal research, there are some things they will never do. For example, a computer will never strike out in a Little League game, have its heart broken or see its child born. So it is very unlikely, if not impossible, that a machine will be able to relate to a human like other humans can.

That absence of empathy makes it hard for machines to design products and processes that will maximize enjoyment and utility for humans. So design skills are likely to be in high demand for decades to come as basic production and analytical processes are increasingly automated.

We’ve already seen this process take place with regard to the Internet. In the early days, it was a very technical field. You had to be a highly skilled engineer to make a website work. Today, however, building a website is something any fairly intelligent high school student can do and much of the value has shifted to front-end tasks, like designing the user experience.

With the rise of artificial intelligence and virtual reality, our experiences with technology will become far more immersive and that will increase the need for good design. For example, conversational analysts (yes, that’s a real job) are working with designers to create conversational intelligence for voice interfaces and, clearly, virtual reality will be much more design intensive than video ever was.

The Ability To Communicate Complex Ideas

Much of the recent emphasis in education has been around STEM subjects (science, technology, engineering and math) and proficiency in those areas is certainly important for today’s students to understand the world around them. However, many STEM graduates are finding it difficult to find good jobs.

On the other hand, the ability to communicate ideas effectively is becoming a highly prized skill. Consider Amazon, one of the most innovative and technically proficient organizations on the planet. A key factor in its success is its writing culture. The company is so fanatical about the ability to communicate that developing good writing skills is essential to building a successful career there.

Think about Amazon’s business and it becomes clear why. Sure, it employs highly adept engineers, but to create a truly superior product those people need to collaborate closely with designers, marketers, business development executives and others. To coordinate all that activity and keep everybody focused on delivering a specific experience to the customer, communication needs to be clear and coherent.

So while learning technical subjects like math and science is always a good idea, studying things like literature, history and philosophy is just as important.

Collaborating And Working In Teams

Traditionally, school work has been based on individual accomplishment. You were supposed to study at home, come in prepared and take your test without help. If you looked at your friend’s paper, it was called cheating and you got in a lot of trouble for it. We were taught to be accountable for achievements on our own merits.

Yet consider how the nature of work has changed, even in highly technical fields. In 1920, most scientific papers were written by sole authors, but by 1950 that had changed and co-authorship became the norm. Today, the average paper has four times as many authors as it did then and the work being done is far more interdisciplinary and done at greater distances than in the past.

Make no mistake. The high value work today is being done in teams and that will only increase as more jobs become automated. The jobs of the future will not depend as much on knowing facts or crunching numbers, but will involve humans collaborating with other humans to design work for machines. Collaboration will increasingly be a competitive advantage.

That’s why we need to pay attention not just to how our kids work and achieve academically, but how they play, resolve conflicts and make others feel supported and empowered. The truth is that value has shifted from cognitive skills to social skills. As kids will increasingly be able to learn complex subjects through technology, the most important class may well be recess.

Perhaps most of all, we need to be honest with ourselves and make peace with the fact that our kids’ educational experience will not — and should not — mirror our own. The world they will face will be far more complex and difficult to navigate than anything we could have imagined back when Fast Times at Ridgemont High was still popular.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay

Change the World With a Keystone Change

GUEST POST from Greg Satell

On December 31st, 1929, the Indian National Congress, the foremost nationalist group on the subcontinent, issued a Declaration of Purna Swaraj, or complete independence from British rule. It also announced a campaign of civil disobedience, but no one had any idea what form it should take. That task fell to Mohandas Gandhi.

The Mahatma returned to his ashram to contemplate next steps. After his efforts to organize against the Rowlatt Act a decade earlier ended in disaster, he struggled to find a way forward. As he told a friend at the time, “I am furiously thinking day and night and I do not see a way out of the darkness.”

Finally, he decided he would march for salt, which impressed almost no one. It seemed to be an incredibly inconsequential issue, especially considering what was at stake. Yet what few realized at the time was that he had identified a keystone change that would break the logjam and the British hold on power. Today the Salt March is known as Gandhi’s greatest triumph.

A Tangible And Achievable Goal

One of Gandhi’s biggest challenges was to connect the lofty goals and high-minded rhetoric of the elites who led the Indian National Congress with the concerns of everyday Indians. These destitute masses didn’t much care whether they were ruled by British elites or Indian elites and, to them, abstract concepts like “freedom” and “independence” meant little.

Salt, on the other hand, was something that was tangible for everyone, but especially for the poorest Indians and the British salt laws provided a clear and actionable target. All you had to do to defy them was to boil seawater to produce salt. What at first seemed trivial became a powerful call for mass action.

In my book, Cascades, I found that every successful movement for change, whether it was a corporate turnaround, a social initiative or a political uprising, began with a keystone change like Gandhi’s salt protests. To achieve a grand vision, you always have to start somewhere and the best place to begin is with a clear and achievable goal.

In some cases, as with voting rights in the women’s movement in the 19th century and, more recently, marriage equality for the LGBT movement, identifying a keystone change took decades. In other cases, such as improving worker safety in Paul O’Neil’s turnaround of Alcoa or a campaign to save 100,000 lives in Don Berwick’s quest to improve quality in medical care, the keystone change was part of the initial plan.

Involving Multiple Stakeholders

The concept of Indian independence raised a number of thorny issues, many of which have not been resolved to this day. Tensions between majority Hindus and minority Muslims created suspicions about how power would be structured after British rule. Similarly, coordinating action between caste Hindus and “untouchables” was riddled with difficulty. Christians and Sikhs had their own concerns.

Yet anger about the Salt Laws helped bring all of these disparate groups together. It was clear from the outset that everyone would benefit from a repeal. Also, because participating was easy—again, it was as simple as boiling sea water—little coordination was needed. Most of all, being involved in a collective effort helped to ease tensions somewhat.

Wyeth Pharmaceuticals took a similar approach to its quest to reduce costs by 25% through implementing lean manufacturing methods at its factories. Much like Gandhi, the executives understood that transforming the behaviors of 20,000 employees across 16 large facilities, most of whom were skeptical of the change, was no simple task.

So they started with one process — factory changeovers — and cut the time it took to switch from producing one product to another in half. “That changed assumptions of what was possible,” an advisor who worked on the project told me. “It allowed us to implement metrics, improve collaboration and trained the supervisor to reimagine her perceived role from being a taskmaster that pushed people to work harder to a coach that enables improved performance.”

Breaking Through Higher Thresholds Of Resistance

By now most people are familiar with the diffusion of innovations theory developed by Everett Rogers. A new idea first gains traction among a small group of innovators and early adopters, then later spreads to the mainstream. Some have suggested that early adopters act as “influentials” or “opinion leaders” that spur an idea forward, but that is largely a myth.

What is much closer to the truth is that we all have different thresholds of resistance to a new idea and these thresholds are highly contextual. For example, as a Philadelphia native, I will enthusiastically try out a new cheesesteak place, but have kept the same hairstyle for 30 years. My wife, on the other hand, is much more adventurous with hairstyles than she is with cheesesteaks.

Yet we are all influenced by those around us. So if our friends and neighbors start raving about a cheesesteak, she might give it a try and may even tell people about it. Or, as network theory pioneer Duncan Watts explained to me, an idea propagates through “easily influenced people influencing other easily influenced people.”

That’s how transformative ideas gain momentum and it’s easy to see how a keystone change can help move the process along. By starting out with a tangible goal, such as protesting the salt tax or reducing changeover time at a single factory, you can focus your efforts on people who have lower thresholds of resistance and they, in turn, can help the idea spread to others who are more reticent.
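A minimal sketch of the dynamic Watts describes, in the spirit of Granovetter’s classic threshold model (population-level rather than a true network, for simplicity): each person adopts once the share of adopters meets their personal threshold, so a handful of low-threshold early adopters can trigger a cascade that eventually reaches the reticent.

```python
def cascade(thresholds):
    """Granovetter-style threshold model: each person adopts once the
    current share of adopters meets their personal threshold.
    Returns the final fraction of adopters."""
    n = len(thresholds)
    adopted = [t <= 0 for t in thresholds]  # zero-threshold innovators
    changed = True
    while changed:
        changed = False
        share = sum(adopted) / n
        for i, t in enumerate(thresholds):
            if not adopted[i] and t <= share:
                adopted[i] = True
                changed = True
    return sum(adopted) / n

# An even spread of thresholds cascades all the way to full adoption...
print(cascade([i / 100 for i in range(100)]))  # 1.0
# ...but remove the low-threshold early adopters and nothing spreads.
print(cascade([0.2 + i / 100 for i in range(100)]))  # 0.0
```

Note how removing the most easily influenced people stops the idea cold even though everyone else is unchanged, which is exactly why a keystone change aimed at low-threshold adopters matters.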

Paving The Way For Future Change

Perhaps most importantly, a keystone change paves the way for larger changes later on. Gandhi’s Salt March showed that the British Raj could be defied. Voting rights allowed women and, later, Black Americans to leverage their newfound power at the polls. Reducing changeover time showed how similar results could be achieved in other facets of manufacturing. The 100,000 lives campaign helped spur a quality movement in healthcare.

None of these things happened all at once, but achieving a keystone change showed what was possible, attracted early adopters to the cause and helped give them a basis for convincing others that even more could be achieved. As one of Gandhi’s followers remarked, before the Salt March, the British “were all sahibs and we were obeying. No more after that.”

Another benefit of a keystone change is that it is much less likely to provoke a backlash than a wider, sweeping vision. One of the reasons the Salt March was successful is that the British didn’t actually gain much revenue from the tax on salt, so they were slow to react to it. The 100,000 lives campaign involved only six relatively easy-to-implement procedures, rather than pushing hospitals to pursue wholesale change all at once.

So while it’s important to dream big and have lofty goals, the first step is always a keystone change. That’s how you first build a sense of shared purpose and provide a platform from which a movement for change can spread. Before the Salt March, Gandhi was considered by many to be a Hindu nationalist. It was only after that he truly became an inspiration to all Indian people and many others around the world.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pexels

Why Quiet Geniuses Excel at Breakthroughs

GUEST POST from Greg Satell

When you think of breakthrough innovation, someone like Steve Jobs, Jeff Bezos or Elon Musk often comes to mind. Charismatic and often temperamental, people like these seem to have a knack for creating the next big thing and building great businesses on top of it. They change the world in ways that few can.

Yet what often goes unnoticed is that great entrepreneurs build their empires on the discoveries of others. Steve Jobs didn’t invent the computer or the mobile phone any more than Jeff Bezos discovered e-commerce or Elon Musk dreamed up electric cars. Those things were created by scientists and engineers who came long before.

In researching my book, Mapping Innovation, I got to know many who truly helped create the future and I found them to be different from most people, but not in the way you’d expect. While all were smart and hardworking, the most common trait among them was their quiet generosity, and that can teach us a lot about how innovation really works.

How Jim Allison Figured it All Out

At least in appearance, Jim Allison is a far cry from how you would normally picture a genius. Often disheveled, with a scruffy beard, he mumbles out a slow Texas drawl that belies his amazingly quick mind. Unassuming almost to a fault, when I asked him about his accomplishments he just said, “well, I always did like figuring things out.”

When Jim was finishing up graduate school, scientists had just discovered T-cells and he told me that he was fascinated by how these things could zip around your body and kill things for you, but not actually hurt you. The thing was, nobody had the faintest idea how it all worked. So Jim decided to become an immunologist and devote his life to figuring it all out.

Over the next few decades, he and his colleagues at other labs did indeed do much to figure it out. They found one receptor, called B7, which acts like an ignition switch that initiates the immune response, another, CD28, that acts like a gas pedal and revs things up into high gear, and a third, called CTLA-4, that puts on the brakes so things don’t spin out of control.

Jim played a part in all of this, but his big breakthrough came from the work of another scientist in his lab, which made him suspect that the problem with cancer wasn’t that our immune system can’t fight it, but that it puts the brakes on too soon. He thought that if he could devise a way to pull those brakes off, we could cure cancer in a new and different way.

As it turned out, Jim was right. Today, cancer immunotherapy has become a major field unto itself and, in October 2018, he won the Nobel Prize for his discovery of it. Yet the truth is that it wasn’t one major breakthrough, but a decades-long process of slowly putting the pieces together that made it all possible.

How Gary Starkweather Went From Blowup To Breakthrough

Gary Starkweather is every bit as quiet and unassuming as Jim Allison. Yet when I talked to him a few years ago, I could still hear the anger in his voice as he told me about an incident that happened almost 50 years before. In the late 60s, Gary had an idea to invent a new kind of printer, but his boss at Xerox was thwarting his efforts.

At the time, Gary was one of the few experts in the emerging field of laser optics, so there weren’t many others who could understand his work, much less how it could be applied to the still obscure field of computers. His boss was, in fact, so hostile to Gary’s project that he threatened to fire anyone who worked with him on it.

Furious, the normally mild-mannered Gary went over his boss’s head. He walked into the Senior Vice President’s office and demanded, “Do you want me to do this for you or for someone else?” In the stuffy, hierarchical culture of Xerox, it was outrageous behavior, but as luck would have it, the stunt paid off. News of Gary’s work made it across the country to the fledgling computer lab that Xerox had recently established in California, the Palo Alto Research Center (PARC).

Gary thrived in the freewheeling, collaborative culture at PARC. The researchers there had developed a graphical technology called bitmapping, but had no way to print the images out until he showed up. His development of the laser printer was not only a breakthrough in its own right, but with the decline of Xerox’s copier business, it actually saved the company.

The Wild Ideas Of Charlie Bennett

Charlie Bennett is one of those unusual minds that amazes everyone he meets. He told me that when he was growing up in the quiet Westchester village of Croton-on-Hudson he was a “geek before geeks were cool.” While the other kids were playing sports and trading baseball cards, what really inspired Charlie was Watson and Crick’s discovery of the structure of DNA.

So he went to college and majored in biochemistry and then went on to Harvard to do his graduate work, where he served as James Watson’s teaching assistant. Yet it was an elective course he took on the theory of computation that would change his fate. That’s where he first encountered the concept of a Turing Machine and he was amazed how similar it was to DNA.

So Charlie never became a geneticist, but went to work for IBM as a research scientist. It proved to be just the kind of place where a mind like his could run free, discussing wild ideas like quantum cryptography with colleagues around the globe. It was one of those discussions, with Gilles Brassard, that led to his major breakthrough.

What the two discussed was the wildest idea yet. They proposed to transfer information using entangled photons, something Einstein had derisively called “spooky action at a distance” and insisted couldn’t happen. Yet the two put a team together and, in 1993, published the seminal paper showing that quantum teleportation was possible.

That, in turn, led Charlie just a few months later to write down his four laws of quantum information, which formed the basis for IBM’s quantum computing program. Today, in his eighties, Charlie is semi-retired, but he still goes into the labs at IBM Research to quietly discuss wild ideas with the younger scientists, such as the quantum internet now beginning to emerge.

For Innovation, Generosity Is A Competitive Advantage

My conversations with Jim, Gary, Charlie and many others made an impression on me. They were all giants in their fields (although Jim hadn’t won his Nobel yet) and I was a bit intimidated talking to them. Yet I found them to be some of the kindest, most generous people I ever met. Often, they seemed as interested in me as I was in them.

In fact, the behavior was so consistent that I figured it couldn’t be an accident. So I researched the matter further and found a number of studies that helped explain it. One, at Bell Labs, found that star engineers had a knack for “knowing who knows.” Another, at the design firm IDEO, found that great innovators essentially act as “knowledge brokers.”

A third study helps explain why knowledge brokering is so important. Analyzing 17.9 million papers, the researchers found that the most highly cited work tended to be mostly rooted within a traditional field, with just a smidgen of insight taken from some unconventional place. Breakthrough creativity occurs at the nexus of conventionality and novelty.

So as it turns out, generosity is often a competitive advantage for innovators. By actively sharing their ideas, they build up larger networks of people willing to share with them. That makes it that much more likely that they will come across that random piece of information and insight that will help them crack a really tough problem.

So if you want to find a truly great innovator, don’t look for the ones that make the biggest headlines or are most inspiring on stage. Look for those who spend their time a bit off to the side, sharing ideas, supporting others and quietly pursuing a path that few others are even aware of.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

Value Doesn’t Disappear

It Shifts From One Place to Another

GUEST POST from Greg Satell

A few years ago, I published an article about no-code software platforms, which was very well received. Before long, however, I began to get angry — and sometimes downright nasty — comments from software engineers who were horrified by the notion that you can produce software without actually understanding the code behind it.

Of course, no-code platforms don’t obviate the need for software engineers, but rather automate basic tasks so that amateurs can design applications by themselves. These platforms are necessarily limited, but they can increase productivity dramatically and help line managers customize technology to fit the task at hand.

Similarly, when FORTRAN, the first high-level programming language, was invented, many who wrote machine code objected, much like the software engineers did to my article. Yet FORTRAN didn’t destroy computer programming, but democratized and expanded it. The truth is that value never disappears. It just shifts to another place, and that’s what we need to learn to focus on.

Why Robots Aren’t Taking Our Jobs

Ever since the financial crisis we’ve been hearing about robots taking our jobs. Yet just the opposite seems to be happening. In fact, we increasingly find ourselves in a labor shortage. Most tellingly, the shortage is especially acute in manufacturing, where automation is most pervasive. So what’s going on?

The fact is that automation doesn’t actually replace jobs, it replaces tasks. To understand how this works, think about the last time you walked into a highly automated Apple store, which actually employs more people than a typical retail location of the same size. They aren’t there to ring up your purchase any faster, but to do all the things that a machine can’t do, like answer your questions and solve your problems.

A few years ago I came across an even starker example when I asked Vijay Mehta, Chief Innovation Officer for Consumer Information Services at Experian, about the effect that shifting to the cloud had on his firm’s business. The first-order effect was simple: the company needed far fewer technicians to manage its infrastructure, and those people could easily have been laid off.

Yet they weren’t. Instead Experian shifted a lot of that talent and expertise to focus on creating new services for its customers. One of these, a cloud enabled “data on demand” platform called Ascend has since become one of the $4 billion company’s most profitable products.

Now think of what would have happened if Experian had merely seen cloud technology as an opportunity to cut costs. Sure, it would have fattened its profit margins temporarily, but as its competitors moved to the cloud that advantage would have eroded and, without new products, its business would have declined.

The Outsourcing Dilemma

Another source of disruption in the job market has been outsourcing. While no one seemed to notice when large multinational corporations were outsourcing blue-collar jobs to low-cost countries, now so-called “gig economy” sites like Upwork and Fiverr are doing the same thing for white-collar professionals like graphic designers and web developers.

So you would expect to see a high degree of unemployment for those job categories, right? Actually no. The Bureau of Labor Statistics expects demand for graphic designers to increase 4% by 2026 and web developers to increase 15%. The site Mashable recently named web development as one of 8 skills you need to get hired in today’s economy.

It’s not hard to see why. While it is true that a skilled professional in a low-cost country can do small projects of the same caliber as those in high cost countries, those tasks do not constitute a whole job. For large, important projects, professionals must collaborate closely to solve complex problems. It’s hard to do that through text messages on a website.

So while it’s true that many tasks are being outsourced, the number of jobs has actually increased. Just like with automation, outsourcing doesn’t make value disappear, but shifts it somewhere else.

The Social Impact

None of this is to say that the effects of technology and globalization haven’t been real. While it’s fine to speak analytically about value shifting from here to there, if a task you spent years learning to do well becomes devalued, you take it hard. Economists have also found evidence that disruptions in the job market have contributed to political polarization.

The most obvious thing to do is retrain workers who have been displaced, but it turns out that’s not so simple. In Janesville, a book chronicling a small town’s struggle to recover from the closing of a GM plant, author Amy Goldstein found that the workers who sought retraining actually did worse than those who didn’t.

When someone loses their job, they don’t need training. They need another job, and removing yourself from the job market to take training courses can have serious costs. Work relationships begin to decay, and there is no guarantee that the new skills you learn will be in any more demand than the old ones you already had.

In fact, Peter Cappelli at the Wharton School argues that the entire notion of a skills gap in America is largely a myth. One reason there is such a mismatch between the rhetoric about skills and the data is that the most effective training often comes on the job, from an employer. It is augmenting skills, not replacing them, that creates value.

At the same time, increased complexity in the economy is making collaboration more important, so often the most important skills workers need to learn are soft skills, like writing, listening and being a better team player.

You Can’t Compete With A Robot By Acting Like One

The future is always hard to predict. While it was easy to see that Amazon posed a real problem for large chain bookstores like Barnes & Noble and Borders, it was much less obvious that small independent bookstores would thrive. In much the same way, few saw that ten years after the launch of the Kindle that paper books would surge amid a decline in e-books.

The one overriding trend of the past 50 years or so is that the future is always more human. In his recent book Back to Human, Dan Schawbel finds that the antidote for our overly automated age is deeper personal relationships. Things like trust, empathy and caring can’t be automated or outsourced.

There are some things a machine will never do. It will never strike out in a little league game, have its heart broken or see its child born. That makes it hard — impossible really — for a machine ever to work effectively with humans as a real person would. The work of humans is increasingly to work with other humans to design work for machines.

That is why perhaps the biggest shift in value is from cognitive to social skills. The high-paying jobs today have less to do with the ability to retain facts or manipulate numbers (we now use computers for those things) and more to do with deep collaboration, teamwork and emotional intelligence.

So while even the most technically inept line manager can now easily produce an application that once would have required a highly skilled software engineer, designing the next generation of technology will require engineers and line managers to work more closely together.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay