Humans Wanted for the Decade’s Biggest Innovation Challenges

GUEST POST from Greg Satell

Every era is defined by the problems it tackles. At the beginning of the 20th century, harnessing the power of internal combustion and electricity shaped society. In the 1960s there was the space race. Since the turn of this century, we’ve learned how to decode the human genome and make machines intelligent.

None of these were achieved by one person or even one organization. In the case of electricity, Faraday and Maxwell established key principles in the early and mid 1800s. Edison, Westinghouse and Tesla came up with the first applications later in that century. Scores of people made contributions for decades after that.

The challenges we face today will be fundamentally different because they won’t be solved by humans alone, but through complex human-machine interactions. That will require a new division of labor in which the highest-level skills won’t be the ability to retain information or manipulate numbers, but the ability to connect and collaborate with other humans.

Making New Computing Architectures Useful

Technology over the past century has been driven by a long succession of digital devices. First vacuum tubes, then transistors and finally microchips transformed electrical power into something approaching an intelligent control system for machines. That has been the key to the electronic and digital eras.

Yet today that smooth procession is coming to an end. Microchips are hitting their theoretical limits and will need to be replaced by new computing paradigms such as quantum computing and neuromorphic chips. These new technologies will not be digital, but will work fundamentally differently from what we’re used to.

They will also have fundamentally different capabilities and will be applied in very different ways. Quantum computing, for example, will be able to simulate physical systems, which may revolutionize sciences like chemistry, materials research and biology. Neuromorphic chips may be thousands of times more energy efficient than conventional chips, opening up new possibilities for edge computing and intelligent materials.
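To give a feel for what programming one of these machines looks like, here is a minimal sketch using IBM’s open-source Qiskit library (an assumption for illustration; this article names no particular toolkit). It entangles two qubits into a Bell state, the basic resource that lets quantum hardware represent correlations in physical systems that quickly overwhelm classical memory:

    # A minimal sketch, assuming Qiskit is installed (pip install qiskit).
    from qiskit import QuantumCircuit

    qc = QuantumCircuit(2, 2)          # two qubits, two classical bits
    qc.h(0)                            # put qubit 0 into superposition
    qc.cx(0, 1)                        # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])         # measurements agree: both 0 or both 1
    print(qc.draw())                   # text diagram of the circuit

Nothing here is exotic on its own, but learning to think in superpositions and entangled states rather than in bits and loops is exactly the retraining challenge described below.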

There is still a lot of work to be done to make these technologies useful. To be commercially viable, not only do important applications need to be identified, but much like with classical computers, an entire generation of professionals will need to learn how to use them. That, in truth, may be the most significant hurdle.

Ethics For AI And Genomics

Artificial intelligence, once the stuff of science fiction, has become an everyday technology. We speak into our devices as a matter of course and expect to get back coherent answers. In the near future, we will see autonomous cars and other vehicles regularly deliver products and eventually become an integral part of our transportation system.

This opens up a significant number of ethical dilemmas. Given a choice between protecting a passenger or a pedestrian, which should be encoded into the software of an autonomous car? Who gets to decide which factors are encoded into systems that make decisions about our education, whether we get hired or if we go to jail? How will these systems be trained? We all worry about who’s educating our kids, but who’s teaching our algorithms?
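To make the dilemma concrete, consider how such a choice actually ends up in software. The sketch below is purely hypothetical (neither CollisionPolicy nor plan_evasive_maneuver comes from any real autonomous-driving stack); the point is that someone has to write a line like this, and that line is a moral judgment:

    # A hypothetical sketch, not real autonomous-driving code: an ethical
    # trade-off has to be stated explicitly somewhere in the system.
    from enum import Enum

    class CollisionPolicy(Enum):
        PROTECT_PASSENGER = "protect_passenger"
        PROTECT_PEDESTRIAN = "protect_pedestrian"
        MINIMIZE_TOTAL_HARM = "minimize_total_harm"

    def plan_evasive_maneuver(policy: CollisionPolicy) -> str:
        # Whoever sets `policy` has answered the ethical question,
        # whether or not they thought of it in those terms.
        if policy is CollisionPolicy.PROTECT_PASSENGER:
            return "maneuver weighted toward passenger safety"
        if policy is CollisionPolicy.PROTECT_PEDESTRIAN:
            return "maneuver weighted toward pedestrian safety"
        return "maneuver minimizing expected harm overall"

Who gets to choose the default value of that policy, and under what oversight, is precisely the kind of question we have no precedent for.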

Powerful genomics techniques like CRISPR open up further ethical dilemmas. What are the guidelines for editing human genes? What are the risks of a mutation inserted in one species jumping to another? Should we revive extinct species, Jurassic Park style? What are the potential consequences?

What’s striking about the moral and ethical issues of both artificial intelligence and genomics is that they have no precedent, save for science fiction. We are in totally uncharted territory. Nevertheless, it is imperative that we develop a consensus about what principles should be applied, in what contexts and for what purpose.

Closing A Perpetual Skills Gap

Education used to be something that you underwent in preparation for your “real life.” Afterwards, you put away the schoolbooks and got down to work, raised a family and never really looked back. Even today, Pew Research reports that nearly one in four adults in the US did not read a single book last year.

Today technology is making many things we learned obsolete. In fact, a study at Oxford estimated that nearly half of the jobs that exist today are at risk of being automated within the next 20 years. That doesn’t mean there won’t be jobs for humans to do; in fact, we are in the midst of an acute labor shortage, especially in manufacturing, where automation is most pervasive.

Yet just as advanced technologies are eliminating the need for some skills, they are also increasingly able to help us learn new ones. A number of companies are using virtual reality to train workers and finding that it can boost learning efficiency by as much as 40%. IBM, working with Rensselaer Polytechnic Institute, recently unveiled a system that helps you learn a new language such as Mandarin.

Perhaps the most important challenge is a shift in mindset. We need to treat education as a lifelong need that extends long past childhood. If we only retrain workers once their industry has become obsolete and they’ve lost their jobs, then we are needlessly squandering human potential, not to mention courting an abundance of misery.

Shifting Value To Humans

The industrial revolution replaced the physical labor of humans with that of machines. The result was often mind-numbing work in factories. Yet further automation opened up new opportunities for knowledge workers who could design ways to boost the productivity of both humans and machines.

Today, we’re seeing a similar shift from cognitive to social skills. Go into a highly automated Apple Store, to take just one example, and you don’t see a futuristic robot dystopia, but a small army of smiling attendants on hand to help you. The future of technology always seems to be more human.

In much the same way, when I talk to companies implementing advanced technologies like artificial intelligence or cloud computing, the one thing I constantly hear is that the human element is often the most important. Unless you can shift your employees to higher-level tasks, you miss out on many of the most important benefits.

What’s important to consider is that when a task is automated, it is also democratized and value shifts to another place. So, for example, e-commerce devalues the processing of transactions, but increases the value of things like customer service, expertise and resolving problems with orders, which is why we see all those smiling faces when we walk into an Apple Store.

That’s what we often forget about innovation. It’s essentially a very human endeavor and, for it to count as true progress, humans always need to be at the center.

— Article courtesy of the Digital Tonto blog
— Image credits: Pixabay

