Tag Archives: Science

Have Humans Evolved Beyond Nature and a Need for It?

GUEST POST from Manuel Berdoy, University of Oxford

Our society has evolved so much, can we still say that we are part of Nature? If not, should we worry – and what should we do about it? Poppy, 21, Warwick.

Such is the extent of our dominion on Earth that the answer to questions around whether we are still part of nature – and whether we even need some of it – relies on an understanding of what we want as Homo sapiens. And to know what we want, we need to grasp what we are.

It is a huge question – but huge questions are the best. And as a biologist, here is my humble suggestion for addressing it, along with a personal conclusion. You may reach a different one, but what matters is that we reflect on it.

Perhaps the best place to start is to consider what makes us human in the first place, which is not as obvious as it may seem.


This article is part of Life’s Big Questions

The Conversation’s new series, co-published with BBC Future, seeks to answer our readers’ nagging questions about life, love, death and the universe. We work with professional researchers who have dedicated their lives to uncovering new perspectives on the questions that shape our lives.


Many years ago, a novel written by Vercors called Les Animaux dénaturés (“Denatured Animals”) told the story of a group of primitive hominids, the Tropis, found in an unexplored jungle in New Guinea, who seem to constitute a missing link.

However, the prospect that this fictional group may be used as slave labour by an entrepreneurial businessman named Vancruysen forces society to decide whether the Tropis are simply sophisticated animals or whether they should be given human rights. And herein lies the difficulty.

Human status had hitherto seemed so obvious that the book describes how it is soon discovered that there is no definition of what a human actually is. Certainly, the string of experts consulted – anthropologists, primatologists, psychologists, lawyers and clergymen – could not agree. Perhaps prophetically, it is a layperson who suggested a possible way forward.

She asked whether some of the hominids’ habits could be described as the early signs of a spiritual or religious mind. In short, were there signs that, like us, the Tropis were no longer “at one” with nature, but had separated from it, and were now looking at it from the outside – with some fear?

It is a telling perspective. Our status as altered or “denatured” animals – creatures who have arguably separated from the natural world – is perhaps both the source of our humanity and the cause of many of our troubles. In the words of the book’s author:

All man’s troubles arise from the fact that we do not know what we are and do not agree on what we want to be.

We will probably never know the timing of our gradual separation from nature – although cave paintings perhaps contain some clues. But a key recent event in our relationship with the world around us is as well documented as it was abrupt. It happened on a sunny Monday morning, at 8.15am precisely.

A new age

The atomic bomb that rocked Hiroshima on August 6, 1945 was a wake-up call so loud that it still resonates in our consciousness many decades later.

The day the “sun rose twice” was not only a forceful demonstration of the new era that we had entered, it was a reminder of how paradoxically primitive we remained: differential calculus, advanced electronics and almost godlike insights into the laws of the universe helped build, well … a very big stick. Modern Homo sapiens seemingly had developed the powers of gods, while keeping the psyche of a stereotypical Stone Age killer.

We were no longer fearful of nature, but of what we would do to it, and ourselves. In short, we still did not know where we came from, but began panicking about where we were going.

We now know a lot more about our origins but we remain unsure about what we want to be in the future – or, increasingly, as the climate crisis accelerates, whether we even have one.

Arguably, the greater choices granted by our technological advances make it even more difficult to decide which of the many paths to take. This is the cost of freedom.

I am not arguing against our dominion over nature nor, even as a biologist, do I feel a need to preserve the status quo. Big changes are part of our evolution. After all, oxygen was first a poison which threatened the very existence of early life, yet it is now the fuel vital to our existence.

Similarly, we may have to accept that what we do, even our unprecedented dominion, is a natural consequence of what we have evolved into – through a process no less natural than natural selection itself. If artificial birth control is unnatural, so is reduced infant mortality.

I am also not convinced by the argument against genetic engineering on the basis that it is “unnatural”. By artificially selecting specific strains of wheat or dogs, we had been tinkering more or less blindly with genomes for centuries before the genetic revolution. Even our choice of romantic partner is a form of genetic engineering. Sex is nature’s way of producing new genetic combinations quickly.

Even nature, it seems, can be impatient with itself.

Our natural habitat? Shutterstock

Changing our world

Advances in genomics, however, have opened the door to another key turning point. Perhaps we can avoid blowing up the world, and instead change it – and ourselves – slowly, perhaps beyond recognition.

The development of genetically modified crops in the 1980s quickly moved from early aspirations to improve the taste of food to a more efficient way of destroying undesirable weeds or pests.

In what some saw as the genetic equivalent of the atomic bomb, our early forays into a new technology became once again largely about killing, coupled with worries about contamination. Not that everything was rosy before that. Artificial selection, intensive farming and our exploding population had long been destroying species faster than we could record them.

The increasing “silent springs” of the 1950s and 60s, caused by the destruction of farmland birds – and, consequently, their song – were only the tip of a deeper and more sinister iceberg. There is, in principle, nothing unnatural about extinction, which has been a recurring pattern (sometimes of massive proportions) in the evolution of our planet since long before we came on the scene. But is it really what we want?

The arguments for maintaining biodiversity are usually based on survival, economics or ethics. In addition to preserving obvious key environments essential to our ecosystem and global survival, the economic argument highlights the possibility that a hitherto insignificant lichen, bacteria or reptile might hold the key to the cure of a future disease. We simply cannot afford to destroy what we do not know.

Is it this crocodile’s economic, medical or inherent value which should be important to us? Shutterstock

But attaching an economic value to life makes it subject to the fluctuations of markets. It is reasonable to expect that, in time, most biological solutions will be synthesised, and as the market worth of many lifeforms falls, we will need to scrutinise the significance of the ethical argument. Do we need nature because of its inherent value?

Perhaps the answer may come from peering over the horizon. It is somewhat ironic that, just as the start of the third millennium coincided with the decryption of the human genome, the start of the fourth may turn on whether that genome has become redundant.

Just as genetic modification may one day lead to the end of “Homo sapiens naturalis” (that is, humans untouched by genetic engineering), we may one day wave goodbye to the last specimen of “Homo sapiens genetica” – the last fully genetically based human, living in a world increasingly less burdened by our biological form: minds in a machine.

If the essence of a human, including our memories, desires and values, is somehow reflected in the pattern of the delicate neuronal connections of our brain (and why should it not be?), our minds may also one day be changeable like never before.

And this brings us to the essential question that surely we must ask ourselves now: if, or rather when, we have the power to change anything, what would we not change?

After all, we may be able to transform ourselves into more rational, more efficient and stronger individuals. We may venture out further, have greater dominion over greater areas of space, and inject enough insight to bridge the gap between the issues brought about by our cultural evolution and the abilities of a brain evolved to deal with much simpler problems. We might even decide to move into a bodiless intelligence: in the end, even the pleasures of the body are located in the brain.

And then what? When the secrets of the universe are no longer hidden, what makes it worth being part of it? Where is the fun?

“Gossip and sex, of course!” some might say. And in effect, I would agree (although I might put it differently), as it conveys to me the fundamental need that we have to reach out and connect with others. I believe that the attributes that define our worth in this vast and changing universe are simple: empathy and love. Not power or technology, which occupy so many of our thoughts but which are merely (almost boringly) related to the age of a civilisation.

True gods

Like many a traveller, Homo sapiens may need a goal. But from the strengths that come with attaining it, one realises that one’s worth (whether as an individual or a species) ultimately lies elsewhere. So I believe that the extent of our ability for empathy and love will be the yardstick by which our civilisation is judged. It may well be an important benchmark by which we will judge other civilisations that we may encounter, or indeed be judged by them.

When we can change everything about ourselves, what will we keep? Shutterstock

There is something of true wonder at the basis of it all. That chemicals can arise from the austere confines of an ancient molecular soup and, through the cold laws of evolution, combine into organisms that care for other lifeforms (that is, other bags of chemicals) is the true miracle.

Some ancients believed that God made us in “his image”. Perhaps they were right in a sense, as empathy and love are truly godlike features, at least among the benevolent gods.

Cherish those traits and use them now, Poppy, as they hold the solution to our ethical dilemma. It is those very attributes that should compel us to improve the wellbeing of our fellow humans without lowering the condition of what surrounds us.

Anything less will pervert (our) nature.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image Credits: Pixabay, Shutterstock (via theconversation)


How will humans change in the next 10,000 years?

Future evolution: from looks to brains and personality

GUEST POST from Nicholas R. Longrich, University of Bath

READER QUESTION: If humans don’t die out in a climate apocalypse or asteroid impact in the next 10,000 years, are we likely to evolve further into a more advanced species than what we are at the moment? Harry Bonas, 57, Nigeria

Humanity is the unlikely result of 4 billion years of evolution.

From self-replicating molecules in Archean seas, to eyeless fish in the Cambrian deep, to mammals scurrying from dinosaurs in the dark, and then, finally, improbably, ourselves – evolution shaped us.

Organisms reproduced imperfectly. Mistakes made when copying genes sometimes made them better fit to their environments, so those genes tended to get passed on. More reproduction followed, and more mistakes, the process repeating over billions of generations. Finally, Homo sapiens appeared. But we aren’t the end of that story. Evolution won’t stop with us, and we might even be evolving faster than ever.
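That loop of imperfect copying plus differential reproduction is, at its core, an algorithm. Here is a minimal, purely illustrative sketch in Python – the numeric “trait”, the target and all the parameters are toy assumptions, not biology:

```python
import random

# Toy mutation-selection loop: imperfect copying plus differential
# reproduction, repeated over many generations. All numbers are illustrative.
TARGET = 1.0  # a stand-in for "well adapted to the environment"

def fitness(trait):
    return -abs(trait - TARGET)  # closer to the target = fitter

population = [random.uniform(0.0, 0.2) for _ in range(50)]
for generation in range(200):
    # reproduction with copying mistakes (mutation)
    offspring = [t + random.gauss(0, 0.02) for t in population for _ in range(2)]
    # variants better fit to the environment tend to get passed on (selection)
    population = sorted(offspring, key=fitness, reverse=True)[:50]

print(round(max(population), 2))  # the population drifts toward the target
```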


This article is part of Life’s Big Questions

The Conversation’s new series, co-published with BBC Future, seeks to answer our readers’ nagging questions about life, love, death and the universe. We work with professional researchers who have dedicated their lives to uncovering new perspectives on the questions that shape our lives.


It’s hard to predict the future. The world will probably change in ways we can’t imagine. But we can make educated guesses. Paradoxically, the best way to predict the future is probably looking back at the past, and assuming past trends will continue going forward. This suggests some surprising things about our future.

We will likely live longer and become taller, as well as more lightly built. We’ll probably be less aggressive and more agreeable, but have smaller brains. A bit like a golden retriever, we’ll be friendly and jolly, but maybe not that interesting. At least, that’s one possible future. But to understand why I think that’s likely, we need to look at biology.

The end of natural selection?

Some scientists have argued that civilisation’s rise ended natural selection. It’s true that selective pressures that dominated in the past – predators, famine, plague, warfare – have mostly disappeared.

Starvation and famine were largely ended by high-yield crops, fertilisers and family planning. Violence and war are less common than ever, despite modern militaries with nuclear weapons, or maybe because of them. The lions, wolves and sabertoothed cats that hunted us in the dark are endangered or extinct. Plagues that killed millions – smallpox, Black Death, cholera – were tamed by vaccines, antibiotics, clean water.

But evolution didn’t stop; other things just drive it now. Evolution isn’t so much about survival of the fittest as reproduction of the fittest. Even if nature is less likely to murder us, we still need to find partners and raise children, so sexual selection now plays a bigger role in our evolution.

And if nature doesn’t control our evolution anymore, the unnatural environment we’ve created – culture, technology, cities – produces new selective pressures very unlike those we faced in the ice age. We’re poorly adapted to this modern world; it follows that we’ll have to adapt.

And that process has already started. As our diets changed to include grains and dairy, we evolved genes to help us digest starch and milk. When dense cities created conditions for disease to spread, mutations for disease resistance spread too. And for some reason, our brains have got smaller. Unnatural environments create unnatural selection.

To predict where this goes, we’ll look at our prehistory, studying trends over the past 6 million years of evolution. Some trends will continue, especially those that emerged in the past 10,000 years, after agriculture and civilisation were invented.

We’re also facing new selective pressures, such as reduced mortality. Studying the past doesn’t help here, but we can see how other species responded to similar pressures. Evolution in domestic animals may be especially relevant – arguably we’re becoming a kind of domesticated ape, but curiously, one domesticated by ourselves.

I’ll use this approach to make some predictions, if not always with high confidence. That is, I’ll speculate.

Lifespan

Humans will almost certainly evolve to live longer – much longer. Life cycles evolve in response to mortality rates, how likely predators and other threats are to kill you. When mortality rates are high, animals must reproduce young, or might not reproduce at all. There’s also no advantage to evolving mutations that prevent ageing or cancer – you won’t live long enough to use them.

When mortality rates are low, the opposite is true. It’s better to take your time reaching sexual maturity. It’s also useful to have adaptations that extend lifespan, and fertility, giving you more time to reproduce. That’s why animals with few predators – animals that live on islands or in the deep ocean, or are simply big – evolve longer lifespans. Greenland sharks, Galapagos tortoises and bowhead whales mature late, and can live for centuries.

Even before civilisation, people were unique among apes in having low mortality and long lives. Hunter-gatherers armed with spears and bows could defend against predators; food sharing prevented starvation. So we evolved delayed sexual maturity, and long lifespans – up to 70 years.

Still, child mortality was high – approaching 50% or more by age 15. Average life expectancy was just 35 years. Even after the rise of civilisation, child mortality stayed high until the 19th century, while life expectancy went down – to 30 years – due to plagues and famines.
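Those two figures look contradictory – lifespans of up to 70, yet a life expectancy of 35 – until you remember that life expectancy is an average dragged down by deaths in childhood. A back-of-the-envelope calculation (the ages below are assumed for illustration) shows how both can be true:

```python
# Illustrative only: roughly half died in childhood, the rest lived long lives.
child_mortality = 0.5   # share dying young (the ~50% by age 15 quoted above)
age_child_death = 5     # assumed average age of a childhood death
age_adult_death = 65    # assumed typical age at death for survivors

life_expectancy = (child_mortality * age_child_death
                   + (1 - child_mortality) * age_adult_death)
print(life_expectancy)  # 35.0 - the average, even though many lived far longer
```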

Then, in the past two centuries, better nutrition, medicine and hygiene reduced youth mortality to under 1% in most developed nations. Life expectancy soared to 70 years worldwide, and 80 in developed countries. These increases are due to improved health, not evolution – but they set the stage for evolution to extend our lifespan.

Now, there’s little need to reproduce early. If anything, the years of training needed to become a doctor, CEO or carpenter incentivise putting it off. And since our life expectancy has doubled, adaptations that prolong lifespan and child-bearing years are now advantageous. Given that more and more people live to 100 or even 110 years – the record being 122 years – there’s reason to think our genes could evolve until the average person routinely lives to 100 or beyond.

Size, and strength

Animals often evolve larger size over time; it’s a trend seen in tyrannosaurs, whales, horses and primates – including hominins.

Early hominins like Australopithecus afarensis and Homo habilis were small, four to five feet (120cm-150cm) tall. Later hominins – Homo erectus, Neanderthals, Homo sapiens – grew taller. We’ve continued to gain height in historic times, partly driven by improved nutrition, but genes seem to be evolving too.

Why we got big is unclear. In part, mortality may drive size evolution; growth takes time, so longer lives mean more time to grow. But human females also prefer tall males. So both lower mortality and sexual preferences will likely cause humans to get taller. Today, the tallest people in the world are in Europe, led by the Netherlands. Here, men average 183cm (6ft); women 170cm (5ft 6in). Someday, most people might be that tall, or taller.

As we’ve grown taller, we’ve become more gracile. Over the past 2 million years, our skeletons became more lightly built as we relied less on brute force, and more on tools and weapons. As farming forced us to settle down, our lives became more sedentary, so our bone density decreased. As we spend more time behind desks, keyboards and steering wheels, these trends will likely continue.

Humans have also reduced our muscles compared to other apes, especially in our upper bodies. That will probably continue. Our ancestors had to slaughter antelopes and dig roots; later they tilled and reaped in the fields. Modern jobs increasingly require working with people, words and code – they take brains, not muscle. Even for manual laborers – farmers, fishermen, lumberjacks – machinery such as tractors, hydraulics and chainsaws now shoulders a lot of the work. As physical strength becomes less necessary, our muscles will keep shrinking.

Our jaws and teeth also got smaller. Early, plant-eating hominins had huge molars and mandibles for grinding fibrous vegetables. As we shifted to meat, then started cooking food, jaws and teeth shrank. Modern processed food – chicken nuggets, Big Macs, cookie dough ice cream – needs even less chewing, so jaws will keep shrinking, and we’ll likely lose our wisdom teeth.

Beauty

After people left Africa 100,000 years ago, humanity’s far-flung tribes became isolated by deserts, oceans, mountains, glaciers and sheer distance. In various parts of the world, different selective pressures – different climates, lifestyles and beauty standards – caused our appearance to evolve in different ways. Tribes evolved distinctive skin colour, eyes, hair and facial features.

With civilisation’s rise and new technologies, these populations were linked again. Wars of conquest, empire building, colonisation and trade – including trade of other humans – all shifted populations, which interbred. Today, road, rail and aircraft link us too. Bushmen would walk 40 miles to find a partner; we’ll go 4,000 miles. We’re increasingly one, worldwide population – freely mixing. That will create a world of hybrids – light brown skinned, dark-haired, Afro-Euro-Australo-Americo-Asians, their skin colour and facial features tending toward a global average.

Sexual selection will further accelerate the evolution of our appearance. With most forms of natural selection no longer operating, mate choice will play a larger role. Humans might become more attractive, but more uniform in appearance. Globalised media may also create more uniform standards of beauty, pushing all humans towards a single ideal. Sex differences, however, could be exaggerated if the ideal is masculine-looking men and feminine-looking women.

Intelligence and personality

Last, our brains and minds, our most distinctively human feature, will evolve, perhaps dramatically. Over the past 6 million years, hominin brain size roughly tripled, suggesting selection for big brains driven by tool use, complex societies and language. It might seem inevitable that this trend will continue, but it probably won’t.

Instead, our brains are getting smaller. In Europe, brain size peaked 10,000–20,000 years ago, just before we invented farming. Then, brains got smaller. Modern humans have brains smaller than our ancient predecessors, or even medieval people. It’s unclear why.

It could be that fat and protein were scarce once we shifted to farming, making it more costly to grow and maintain large brains. Brains are also energetically expensive – they burn around 20% of our daily calories. In agricultural societies with frequent famine, a big brain might be a liability.

Maybe hunter-gatherer life was demanding in ways farming isn’t. In civilisation, you don’t need to outwit lions and antelopes, or memorise every fruit tree and watering hole within 1,000 square miles. Making and using bows and spears also requires fine motor control, coordination, the ability to track animals and trajectories — maybe the parts of our brains used for those things got smaller when we stopped hunting.

Or maybe living in a large society of specialists demands less brainpower than living in a tribe of generalists. Stone-age people mastered many skills – hunting, tracking, foraging for plants, making herbal medicines and poisons, crafting tools, waging war, making music and magic. Modern humans perform fewer, more specialised roles as part of vast social networks, exploiting the division of labour. In a civilisation, we specialise in a trade, then rely on others for everything else.

That being said, brain size isn’t everything: elephants and orcas have bigger brains than us, and Einstein’s brain was smaller than average. Neanderthals had brains comparable to ours, but more of the brain was devoted to sight and control of the body, suggesting less capacity for things like language and tool use. So how much the loss of brain mass affects overall intelligence is unclear. Maybe we lost certain abilities, while enhancing others that are more relevant to modern life. It’s possible that we’ve maintained processing power by having fewer, smaller neurons. Still, I worry about what that missing 10% of my grey matter did.

Curiously, domestic animals also evolved smaller brains. Sheep lost 24% of their brain mass after domestication; for cows, it’s 26%; dogs, 30%. This raises an unsettling possibility. Maybe being more willing to passively go with the flow (perhaps even thinking less), like a domesticated animal, has been bred into us, like it was for them.

Our personalities must be evolving too. Hunter-gatherers’ lives required aggression. They hunted large mammals, killed over partners and warred with neighbouring tribes. We get meat from a store, and turn to police and courts to settle disputes. If war hasn’t disappeared, it now accounts for fewer deaths, relative to population, than at any time in history. Aggression, now a maladaptive trait, could be bred out.

Changing social patterns will also change personalities. Humans live in much larger groups than other apes; hunter-gatherers formed tribes of around 1,000, while today people live in vast cities of millions. In the past, our relationships were necessarily few, and often lifelong. Now we inhabit seas of people, moving often for work, and in the process forming thousands of relationships, many fleeting and, increasingly, virtual. This world will push us to become more outgoing, open and tolerant. Yet navigating such vast social networks may also require that we become more willing to adapt ourselves to them – to be more conformist.

Not everyone is psychologically well-adapted to this existence. Our instincts, desires and fears are largely those of stone-age ancestors, who found meaning in hunting and foraging for their families, warring with their neighbours and praying to ancestor-spirits in the dark. Modern society meets our material needs well, but is less able to meet the psychological needs of our primitive caveman brains.

Perhaps because of this, increasing numbers of people suffer from psychological issues such as loneliness, anxiety and depression. Many turn to alcohol and other substances to cope. Selection against vulnerability to these conditions might improve our mental health, and make us happier as a species. But that could come at a price. Many great geniuses had their demons; leaders like Abraham Lincoln and Winston Churchill fought depression, as did scientists such as Isaac Newton and Charles Darwin, and artists like Herman Melville and Emily Dickinson. Some, like Virginia Woolf, Vincent Van Gogh and Kurt Cobain, took their own lives. Others – Billie Holiday, Jimi Hendrix and Jack Kerouac – were destroyed by substance abuse.

A disturbing thought is that troubled minds will be removed from the gene pool – but potentially at the cost of eliminating the sort of spark that created visionary leaders, great writers, artists and musicians. Future humans might be better adjusted – but less fun to party with and less likely to launch a scientific revolution — stable, happy and boring.

New species?

There were once nine human species; now it’s just us. But could new human species evolve? For that to happen, we’d need isolated populations subject to distinct selective pressures. Distance no longer isolates us, but reproductive isolation could theoretically be achieved through selective mating. If people were culturally segregated – marrying based on religion, class, caste or even politics – distinct populations, even species, might evolve.

In The Time Machine, sci-fi novelist H.G. Wells imagined a future where class created distinct species. The upper classes evolved into the beautiful but useless Eloi, and the working classes became the ugly, subterranean Morlocks – who revolted and enslaved the Eloi.

In the past, religion and lifestyle have sometimes produced genetically distinct groups, as seen, for example, in Jewish and Roma populations. Today, politics also divides us – could it divide us genetically? Liberals now move to be near other liberals, and conservatives to be near conservatives; many on the left won’t date Trump supporters, and vice versa.

Could this create two species, with instinctively different views? Probably not. Still, to the extent culture divides us, it could drive evolution in different ways, in different people. If cultures become more diverse, this could maintain and increase human genetic diversity.

Strange New Possibilities

So far, I’ve mostly taken a historical perspective, looking back. But in some ways, the future might be radically unlike the past. Evolution itself has evolved.

One of the more extreme possibilities is directed evolution, where we actively control our species’ evolution. We already breed ourselves when we choose partners with appearances and personalities we like. For thousands of years, hunter-gatherers arranged marriages, seeking good hunters for their daughters. Even where children chose partners, men were generally expected to seek approval of the bride’s parents. Similar traditions survive elsewhere today. In other words, we breed our own children.

And going forward, we’ll do this with far more knowledge of what we’re doing, and more control over the genes of our progeny. We can already screen ourselves and embryos for genetic diseases. We could potentially choose embryos for desirable genes, as we do with crops. Direct editing of the DNA of a human embryo has been proven to be possible — but seems morally abhorrent, effectively turning children into subjects of medical experimentation. And yet, if such technologies were proven safe, I could imagine a future where you’d be a bad parent not to give your children the best genes possible.

Computers also provide an entirely new selective pressure. As more and more matches are made on smartphones, we are delegating decisions about what the next generation looks like to the computer algorithms that recommend our potential matches. Digital code now helps choose what genetic code is passed on to future generations, just as it shapes what you stream or buy online. This might sound like dark science fiction, but it’s already happening. Our genes are being curated by computer, just like our playlists. It’s hard to know where this leads, but I wonder if it’s entirely wise to turn over the future of our species to iPhones, the internet and the companies behind them.

Discussions of human evolution are usually backward looking, as if the greatest triumphs and challenges were in the distant past. But as technology and culture enter a period of accelerating change, our genes will too. Arguably, the most interesting parts of evolution aren’t life’s origins, dinosaurs, or Neanderthals, but what’s happening right now, our present – and our future.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image Credit: Pixabay


We need MD/MBEs not MD/MBAs

GUEST POST from Arlen Meyers, M.D.

The number of MD/MBAs graduating from medical schools continues to expand, with about 5% of the roughly 20,000 US medical students who graduate each year holding dual degrees. While in the past the idea was to gain the knowledge, skills, abilities and competencies to manage health services organizations, many are now doing it on the way to digital health startup land.

Almost all of the 38 osteopathic schools offer dual degree programs as well.

However, MBA programs are dwindling and the ones that are still around are rethinking their value proposition and restructuring their curriculum.

For example, business schools are racing to add concentrations in science, technology, engineering and math to their M.B.A. programs as they try to broaden their appeal to prospective students overseas who want to work in the U.S.

Several schools, including Northwestern’s Kellogg School of Management and North Carolina’s Kenan-Flagler Business School, have unveiled STEM-designated master’s in business degrees in recent months. The University of California Berkeley’s Haas School of Business recently reclassified its entire M.B.A. program as STEM.

But, BMETALS is the new STEM.

In my view, we are training too many MD/MBAs who don’t add value to the system, and many programs should be terminated or restructured.

  1. We don’t know how much value the graduates contribute to the sick care system.
  2. The programs are usually not domain specific. Some think that’s a good thing, encouraging exposure to how other industries have solved generic problems. Others feel sick-care is so unique that the lessons are not applicable.
  3. Medical students are already up to their waists in debt, most of which is taxpayer subsidized. Should additional debt be added to their student loans?
  4. Few of the programs address the needs of physician entrepreneurs.
  5. There are many substitutes for physician entrepreneurs around the world and US schools are no longer the mecca.
  6. Content has become generic and offered for free on the Internet.
  7. Connections are easy to make using social media.
  8. The MBA is losing credibility, given the large number of places that offer them, particularly those below the first-tier schools.
  9. Employers can see through the credentials.
  10. Costs continue to escalate and the programs do not accommodate the specific needs of busy clinician students.
  11. We need a thorough conversation about the policy wisdom of encouraging dual degrees, potentially sidetracking graduates into non-clinical roles when there is a global demand for clinicians.
  12. We need to track outcomes and roles of graduates to determine whether the dual degree adds value to the communities they are designed to serve and whether they are cost-effective in an era of skyrocketing student debt.

In addition, there is a difference between having knowledge, skills, abilities and competencies in the business of medicine, health systems science, health service organization management, leadership and leaderpreneurship, and entrepreneurship/intrapreneurship. There is a confusing array of dual degree programs, leaving students scratching their heads and, in many instances, wasting their time and money.

Also, more medical students are jumping ship to pursue non-clinical careers. While the numbers may be a small portion of the roughly 20,000 first-year US medical students, the trend is evident.

Instead, we should consider re-shuffling the deck and offer a new combined MBE (Masters in Bioinnovation and Entrepreneurship) degree or dual MD/MBE or PhD/MBE program.

According to Prof. Varda Liberman, the new Provost of Reichman University and Head of the MBA in Healthcare Innovation, “Healthcare systems are going through enormous changes worldwide and with the COVID-19, these changes were accelerated. There is an immediate need for a complete redesign that will necessitate innovative multidisciplinary solutions, leveraging technology, science, information systems, and national policy. Our MBA program in Healthcare Innovation, offered by Reichman University, in collaboration with Israel’s largest hospital, the Sheba Medical Center, Tel Hashomer, is designed to prepare the future leaders of the healthcare industry to develop solutions that will enable the needed redesign. The program brings together all the unique advantages of Israeli innovation, to provide our students with the tools and skills necessary to understand the complexity of the healthcare industry today. The program brings together all the key players of the ecosystem – those coming from the healthcare system, engineering, entrepreneurship, AI, law, biomedicine, pharmacology, high tech, investment, management, and public policy”.

Here’s how it would work:

  1. A four-year program combining two years in medical school and two years in an MBE program, patterned on Professional Science Masters Programs.
  2. The medical school curriculum would be separate and distinct from that offered to medical students interested in practicing medicine. Among other topics, we would teach sales.
  3. Clinical rotations would start on day one, intended to instill an entrepreneurial mindset and emphasize being a problem seeker, not just a problem solver, at this stage.
  4. Interdisciplinary education with experiential learning in project teams that include business, science, engineering, law and other health professionals.
  5. Experiential learning and a mandatory internship with a local, national or international company in biopharma, medtech or digital health.
  6. A new tuition and funding structure, possibly run by private equity or medical technology companies that sponsor applicants. The present medical education business model won’t work if it depends on short-term revenue from putting butts in the seats.
  7. Project teams would be offered proof-of-concept funding and I-Corps team support.
  8. Domain experts would work with project teams.
  9. Each student would be assigned an entrepreneur mentor throughout the program.
  10. Social biomedical entrepreneurship and ethics would be core streams throughout training. Those interested in creating non-profits or going into public service might be candidates for tuition deferral or waiver.

Another alternative is to make medical school three years instead of four and offer a one-year track in biomedical and clinical entrepreneurship.

The good news for educators is that you don’t need to start from scratch. Karolinska beat you to the punch.

The purpose of the degree program is to provide students with the knowledge, skills and abilities they need to lead global biomedical innovation. Here’s what the curriculum would include:

  1. Building Biotechnology: Introduction to biomedical entrepreneurship
  2. Regulatory Affairs and Reimbursement
  3. Life Science Intellectual Property
  4. International (Bio) Business
  5. Biotech law and ethics
  6. Internship
  7. International trip
  8. Device and digital health entrepreneurship
  9. Leading high performance teams
  10. Bioentrepreneurial finance
  11. Drug discovery and development
  12. Care delivery entrepreneurship
  13. Social entrepreneurship
  14. Electives in other aspects of entrepreneurship

The David Eccles School of Business at the University of Utah is taking its top 10 ranked program for entrepreneurship to new heights with a master’s degree designed for serious entrepreneurs.

The degree is called the Master of Business Creation (MBC), and it’s the first of its kind.

Applicants must be full-time entrepreneurs who want to create, launch and scale a new business, who want more than the 9-to-5 job, who have the drive to overcome the impossible, who want to build their knowledge while doing, and who are willing to put in the hours to make it happen.

We don’t need more physician administrators. We need more physician innovators and entrepreneurs who can lead us out of our sick care mess and close global health outcome disparities. While I believe the optimal career track involves a reasonable time practicing clinical medicine, students are thinking otherwise. For those who do, we need a new path to creating the future, and medical and business educators need to create educational products that meet their needs.

Image credit: Pixabay


Innovation and the Scientific Method

GUEST POST from Jesse Nieminen

Most large organizations are led and managed very systematically, and they pride themselves on that. Managers and leaders within those organizations are usually smart, educated, and want to make data-driven, evidence-based decisions.

However, when it comes to innovation, that can be a part of the problem, as Clayton Christensen famously pointed out.

Many leaders these days are well aware of the problem, but even if they are, they may still have a hard time leading innovation because the approach is so different from what most of them are used to in their day-to-day. The mindset, mental models and frameworks needed are just fundamentally different.

So, to get it right, you need to pick the right frameworks and mental models and use them to guide both your own thinking and your team’s. Because innovation has become such a hot topic, there’s been an explosion in the number of these. So, how do you know which ones to adopt?

Well, in these situations, it’s often beneficial to take a step back and go to the roots of the phenomenon to figure out what the timeless fundamentals are, and what’s just part of the latest fad.

So, in this article, we’ll look at arguably the oldest innovation framework in the world, the scientific method. We’ll first explore the concept and briefly compare it to more modern frameworks, and then draw some practical takeaways from the exercise.

What is the scientific method and how does it relate to innovation?

Most of us probably remember hearing about the scientific method, and it’s generally seen as the standard for proving a point and for exploring new phenomena. Having said that, given that even to this day, there still isn’t a clear consensus on what the scientific method actually is, it’s probably a good idea to explore the term.

The scientific method is a systematic, iterative, and primarily empirical method of acquiring knowledge.

Some of the key ideas behind the scientific method actually date back to ancient times and several different cultures, perhaps most famously to Ancient Greece. The initial principles evolved gradually throughout the years, but it took until the Enlightenment before the term “scientific method” began to be used, and these principles became popularized.

With that background, we can safely call the scientific method the oldest innovation framework in the world. Throughout history, applying this method is where most of the big technological innovations and breakthroughs we now know and benefit from every day have come from.

But enough about history, what does the process actually look like? Well, as mentioned, that depends on whom you ask, but the key principles everyone agrees on are that it is a systematic, iterative, and primarily empirical method of acquiring knowledge.

Again, there’s no consensus on the exact steps used in the process, and there are also minor variances in terminology, but the four steps practically every version seems to have can be seen from the chart below.

Scientific Method Chart
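To make those four steps concrete, here is a minimal sketch of the loop in code, using a toy question (is this coin biased?). The function names, numbers and stopping rule are illustrative assumptions, not part of any formal definition of the method:

```python
import random

def observe(n=100):
    """Step 1 - observation: gather empirical data (simulated coin flips)."""
    return [random.random() < 0.6 for _ in range(n)]  # a secretly biased coin

def hypothesize(data):
    """Step 2 - hypothesis: form a testable claim from the observations."""
    return sum(data) / len(data)  # estimated probability of heads

def experiment(n=1000):
    """Step 3 - experiment: collect fresh data to test the prediction."""
    return sum(random.random() < 0.6 for _ in range(n)) / n

def analyze(predicted, observed, tolerance=0.05):
    """Step 4 - analysis: does the new evidence support the hypothesis?"""
    return abs(predicted - observed) < tolerance

# The method is iterative: a refuted hypothesis sends you back to step 2.
p = hypothesize(observe())
if analyze(p, experiment()):
    print(f"Hypothesis 'P(heads) = {p:.2f}' survives - for now")
else:
    print("Hypothesis refuted - revise it and run the loop again")
```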

While traditionally the scientific method has been used primarily for basic research, it’s been the inspiration for many recent, popular processes and frameworks around business innovation.

Just look at Lean Startup, Design Thinking, Growth Hacking, Discovery Driven Growth, and the list goes on.

At a high level, most of these are very similar to the scientific method, just applied to a more specific domain and paired with some practical guidelines for applying the method in practice.

With so many similarities, there’s clearly something there that’s worth paying attention to. Let’s next dive deeper to understand why that is the case.

Why are the frameworks so similar?

By definition, innovation is about creating and introducing something new. Sometimes that can mean small, incremental changes, but often we’re talking about something much bigger.

And, in today’s globalized, hyperconnected and rapidly moving world, a lot of volatility, uncertainty, complexity and ambiguity (VUCA) will always be involved, especially when you’re moving into these uncharted waters.

This leads to two fundamental problems:

  • You usually can’t have all the information before making a decision
  • Whatever plans and assumptions you initially make will likely be wrong

What that in turn means is that many of the practices and frameworks leaders have applied for years in managing people and projects, as they’ve risen through the ranks of the business, will not be applicable here. In fact, they can even be counterproductive, as we pointed out in the introduction. Some leaders have a hard time accepting this and adapting to the new reality, and that usually doesn’t end well.

Humility and pragmatism are key for innovation

On the other hand, some leaders who have realized this have decided to go to the other extreme. They’ve heard stories of great visionaries and innovators who had a dream of the future and simply refused to take no for an answer. While there is a lot to like in that approach, the mistake that often happens is that once these leaders embark on that journey, they refuse to adapt their vision to meet reality.

Finding the right balance is always tricky, but what helps is adopting the iterative, exploratory and empirical approach of the scientific method and the other frameworks and processes we mentioned before.

This doesn’t mean that it’s a free-for-all – on the contrary. These processes are in fact systematic and usually quite structured.

The purpose of the scientific method is to create structure and understanding from what seems like an incomprehensible mess.

To put it another way, the purpose of the scientific method is actually to create structure and understanding from what initially seems like an incomprehensible mess – and that is the foundation that most great innovations are built on.

What can we learn from that?

Let’s now reflect on what that means for the day-to-day job of innovators and leaders managing innovation.

For me, it essentially boils down to three main takeaways. We’ll next cover each of them briefly.

Innovation is a learning process, just like the scientific method

As we just covered, most innovation processes abide by the same key principles as the scientific method. They are iterative, empirical, and exploratory. But they are also systematic, evidence-based, and most importantly, focused on learning and solving problems.

With innovation, your first priority is always to be skeptical of your initial plan and question your assumptions. When you do that and look at the data objectively to try to figure out how and why things work the way they do, you’ll unlock a deeper level of understanding, and that level of understanding is what can help you solve problems and create better innovations that make a real difference for your customers and your organization.

To sum up, when you’re trying to build the future, don’t assume you’re right. Instead, ask how you’re wrong, and why. Often the hardest part about learning is to unlearn what you’ve previously learned. This is what’s often referred to as first principles thinking.

“Trying things out” isn’t unscientific or non-evidence-based

We still see leaders in many organizations struggle to admit that they, either as a leader or as an organization, don’t know something.

There’s often resistance to admitting a lack of understanding and to “trying things out”, because those are seen as amateurish, unscientific or non-evidence-based approaches. Rational leaders naturally want to do their homework before choosing a direction or committing significant resources to an initiative.

The scientific method is about learning

However, with innovation, often doing your homework properly means that you understand that you don’t know all the answers and need to figure out a way to find out those answers instead of just trusting your gut or whatever market research you might have been able to scrape together.

“Trying things out” is how more or less every meaningful innovation has ever been created. By definition, there’s always an amount of trial and error involved in that process.

So, if you recognize yourself struggling to embrace the uncertainty, take a hard look in the mirror, be more pragmatic and have the courage to make yourself vulnerable. If you have the right talent in your team, being vulnerable is actually a great way to gel the team together and improve performance.

On the other hand, if you understand all of this but your boss doesn’t, it might be a good idea to politely remind them of how the scientific method works. While it’s not a silver bullet guaranteed to convert everyone into a believer at once, I’ve found this to be a good way to remind leaders how science and progress really get made.

Essentially, you need to convince them that you know what you’re doing and have a rational, evidence-based plan purpose-built to combat the VUCA we already talked about.

It requires a different management style

As you’ve probably come to understand by now, all of that requires a very different style of management than what most managers and leaders are used to.

To make innovation happen in an organization, leaders do need to provide plenty of structure and guidance to help their teams and employees operate effectively. Without that structure and guidance, which good innovation processes naturally help provide, you’re essentially just hoping for the best, which isn’t exactly an ideal strategy.

However, managing innovation is more about setting direction and goals, questioning assumptions, as well as removing obstacles and holding people accountable, than it is about the way most people have learned to manage as they’ve risen in the ranks, which is by breaking a project or goal into pre-defined tasks and then simply delegating those down in the organization.

The traditional approach works well when you have a straightforward problem to solve, or job to accomplish, even if it’s a big and complicated project like building a bridge. These days, the laws of physics related to that are well understood. But if you’re entering a new market or innovating something truly novel, the dynamics probably won’t be as clear.

Building bridges is complicated, not complex

Also, when it comes to capital allocation for innovation, you can certainly try to create a business plan with detailed investment requirements and a thorough project plan along with precise estimates for payback times, but because the odds are that not all of your assumptions will be right, that plan is likely to do more harm than good.

Instead, it’s usually better to allocate capital more dynamically in smaller tranches, even if your goals are big. This can help you stay grounded and focus work on solving the next few problems and making real progress, instead of executing on a grandiose plan built on a shaky or non-existent foundation.
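As a rough sketch of that idea (the budget, tranche count and milestone gate below are hypothetical), tranche-based allocation is simply funding released step by step, contingent on evidence:

```python
def allocate_in_tranches(total_budget, tranches, milestone_met):
    """Release capital one tranche at a time, contingent on evidence.
    milestone_met(i) stands in for a real review gate after tranche i."""
    per_tranche = total_budget / tranches
    released = 0.0
    for i in range(tranches):
        released += per_tranche
        if not milestone_met(i):  # evidence turned negative: stop funding
            break
    return released

# Hypothetical example: the project fails its third milestone review.
spent = allocate_in_tranches(1_000_000, tranches=5, milestone_met=lambda i: i < 2)
print(f"Capital at risk: {spent:,.0f} of 1,000,000")  # 600,000, not the full sum
```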

Conclusion

The scientific method is arguably the oldest innovation framework in the world. While it has naturally evolved, it’s largely stood the test of time.

The scientific method has allowed mankind to significantly accelerate our pace of innovation, and as an innovator, you’d be wise to keep the key principles of the method in mind and introduce processes that institutionalize these within your organization.

Innovation is an iterative process of learning and solving problems, and succeeding at it takes a lot of humility, pragmatism, and even vulnerability. With innovation, you just can’t have all the answers beforehand, nor can you get everything right on the first try.

When you’ve been successful in your career, it’s sometimes easy to forget all of that. So, make sure to remind yourself, and the people you work with, of these principles every now and then.

Fortunately, there’s nothing quite like putting your most critical assumptions to the test, and learning from the experiment, to bring you down to earth and remind you of the realities!

This article was originally published in Viima’s blog.

Image credit: Unsplash, Viima


Healthcare Jugaad Innovation of a 17-Year-Old

Jugaad Innovation is an innovation subspecialty focused on designing inventions that are intentionally frugal and flexible in order to be more accessible to the entire world. As a result, a lot of jugaad innovation occurs in the developing world. Some of these inventions become innovations and spread from the developing world to the developed world.

I came across a story recently highlighting the potential healthcare jugaad innovation of 17-year-old Dasia Taylor of Iowa, who found that beets provide the perfect dye for her invention of sutures that change color when a surgical wound becomes infected (from bright red to dark purple).

According to Smithsonian magazine:

The 17-year-old student at Iowa City West High School in Iowa City, Iowa, began working on the project in October 2019, after her chemistry teacher shared information about state-wide science fairs with the class. As she developed her sutures, she nabbed awards at several regional science fairs, before advancing to the national stage. This January, Taylor was named one of 40 finalists in the Regeneron Science Talent Search, the country’s oldest and most prestigious science and math competition for high school seniors.

There is still commercialization work to do (more testing, clinical trials, etc.), but the approach shows promise and is far cheaper than high-tech sutures that require a smartphone to sense changes in electrical resistance as an indicator of infection.

Congratulations Dasia!

The great thing about this jugaad innovation approach is that not only could it be a practical solution for developing countries, but national health services and insurance companies are always looking for effective but inexpensive solutions as well.

Good luck with the rest of your research, and keep innovating!



Using Intuition to Drive Innovation Success

Americans are in love with data, big data, analytics, artificial intelligence and machine learning.

… and the rest of the world is catching the same disease.

Data is important, don’t get me wrong, but it is only one side of the coin driving innovation and operational success.

On the other side of the coin is intuition.

As smart organizations try to make greater use of human-centered design, empathy and intuition can and must play an increasingly important role.

Bruce Kasanoff states that “Intuition is the Highest Form of Intelligence” in his article on Forbes.

Intuition is incredibly important to human-centered design from the standpoint that an “intuitive” design taps into our shared understanding as humans of how things should operate.

Intuition is the secret sauce of the quantum human computer, and as the pace of change AND complexity both accelerate, we must change our brain function to develop not just our intellectual capabilities but our instinctual capabilities as well.

Nobel Prize winner Daniel Kahneman wrote about these two ways of thinking in his book Thinking, Fast and Slow. Let’s watch a short video on intuition, science and dreams:

Science, Intuition and Dreams – Dean Radin

Dreams can be an incredibly powerful tool for innovation. In fact, the Nine Innovation Roles, which play an important role in the best-selling book Stoking Your Innovation Bonfire, came to me in a dream. Many experts recommend keeping a pen and notebook next to your bed to capture these flashes of brilliance.

Dreams and shared understanding are but two manifestations of intuition, of our interconnectedness with each other and energies greater than ourselves. But how do we leverage our intuition for innovation?

One way is to use your intuition as an input to a tool like The Experiment Canvas™:

The Experiment Canvas

It is available as a free tool here on my web site, from the forthcoming Disruptive Innovation Toolkit™.

You can use it to craft a hypothesis, based on your intuition, that you want to test. It keeps you focused on what you hope to learn during the experiment, and prompts you to consider the setup, operation and wrap-up of your experiment – among other things.
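As a structural sketch only – the field names below are inferred from the description above, not taken from the official canvas – the information it captures might be modeled like this:

```python
from dataclasses import dataclass, field

@dataclass
class ExperimentCanvas:
    """Sketch of the canvas fields described above; the names are inferred
    from the text, not taken from the official Experiment Canvas template."""
    hypothesis: str         # the intuition, restated as a testable claim
    learning_goal: str      # what you hope to learn during the experiment
    setup: str              # how the experiment will be prepared
    operation: str          # how it will be run
    wrapup: str             # how results will be captured and acted on
    success_criteria: list = field(default_factory=list)

# Hypothetical usage:
canvas = ExperimentCanvas(
    hypothesis="New users will finish onboarding unaided if given a checklist",
    learning_goal="Whether a checklist reduces support tickets",
    setup="Show the checklist to 10% of new signups",
    operation="Run for two weeks; log tickets per cohort",
    wrapup="Compare ticket rates and decide whether to roll out",
    success_criteria=["ticket rate drops by at least 20%"],
)
print(canvas.hypothesis)
```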

Too often people ignore their intuition because it doesn’t seem scientific. But turning intuitive insights into hypotheses to test will help you overcome your hesitancy until you train your intuition and learn to trust it as the potential human quantum computer that it could be. The other reason that people ignore their intuition is that, well, they just can’t hear it. For many people, their intellectual mind is so busy that they can’t receive and react to what their intuitive mind is telling them.

Here is an interesting video that highlights these two points and how humans communicate behind the scenes:

Are you drowning out your intuitive mind? Are you failing to consider what it is saying, and to test its assertions?

If so, please stop it, and learn new ways to keep innovating!

SPECIAL BONUS:

If you’d like to watch and learn even more about intuition…

Here is a video on Nikola Tesla and the Power of Intuition:



Time Travel Innovation


Is it really possible to travel back in time? What about traveling into the future? Have we finally figured out how to do that? Well, you’ll have to read on to find out…

But before we explore whether someone has finally figured out how to time travel successfully (and recruit you to join me in investing in their pre-IPO startup), I’d like to introduce one of the most important visualizations from the world of innovation that many of you have probably never seen – Neri Oxman’s Krebs Cycle of Creativity from January 2016.

If you’re not familiar with this incredibly important visual artifact from the work of Neri Oxman at MIT’s Media Lab, you should be, because it does an amazing job of capturing the interplay between Art, Science, Engineering and Design in the creation of innovation. It builds on John Maeda’s Bermuda Quadrilateral from 2006:

John Maeda Bermuda Quadrilateral

And Rich Gold’s Matrix, also from 2006:

Rich Gold Matrix

While Rich Gold’s visualization builds on the logical bones of John Maeda’s Bermuda Quadrilateral and introduces the concepts of speculative design, speculative engineering, and the contrast between moving minds and moving molecules, it lacks the depth of Neri Oxman’s Krebs Cycle of Creativity. The Krebs Cycle of Creativity, in turn, loses Maeda’s expression of the linkages between science & exploration, engineering & invention, design & communication, and art & expression. But even without these assertions of Maeda’s, the Krebs Cycle of Creativity still captures a number of other powerful tensions and assertions that can benefit us in our pursuit of innovation.

Time Marches On

The Krebs Cycle of Creativity can be viewed from a number of different perspectives and utilized in a number of different ways. One way to look at it is as if it were a watch face. In this context, as time moves forward, you’re following the typical path: a technology-led innovation approach.

Using the Krebs Cycle of Creativity Canvas in a clockwise direction will help us explore:

  • What information do we have about what might be possible?
  • What knowledge needs to be obtained?
  • What utility does the invention create?
  • What behavior do we need to modify to encourage adoption?

It begins with the invention of a new piece of technology, created through the use of existing information and a new perception of what might be possible within the constraints of our understanding of the natural world, or even by expanding our understanding and knowledge of the natural world using the scientific method.

Neri Oxman Krebs Cycle of Creativity

You’ll see at 3 o’clock in the image above that it is at this point that most organizations hand off this new knowledge to their engineers, who look at this new understanding of nature through the production lens in order to convert it into new utility.

Engineers in most organizations are adept at finding a useful application for a new scientific discovery, and in many organizations this work is done before designers get a peek and begin to imagine how they can present this utility to users in a way that drives adoption, so that using the product or consuming the service feels as natural and frictionless as possible.

And unfortunately, the artists in any organization (or outside it, via agency relationships) are called in at the eleventh hour to help shape perceptions, to communicate the philosophy behind the solution, and to make the case for it to occupy space in our collective culture.

Pausing at the Innovation Intersection

The way that innovation occurs in many organizations is that Science and Engineering collaborate to investigate and confirm feasibility, then Engineering and Design collaborate to inject viability into the equation, and finally Design and Art (with elements of marketing and advertising) collaborate to create desirability at the end. This may be how it works in many organizations, but that doesn’t mean it is the best way…

Feasibility Viability Desirability for Innovation

Traveling Back in Time

But as we all know, water can run uphill, the moon can eclipse the sun, and yes, time can run in reverse. Viewing the Krebs Cycle of Creativity in a counterclockwise direction and pushing the hands of the watch backwards will have you following a user-led innovation approach instead.

Using the Krebs Cycle of Creativity Canvas in a counterclockwise direction will help us explore:

  • What information do we have about what is needed?
  • What behavior should we observe?
  • What would create utility for customers?
  • What knowledge must we obtain to realize our solution vision?

It begins with the identification of a new insight, uncovered by investigating existing information and forming a new perception of what might be needed within the constraints of our understanding of our customers, or even by expanding our understanding and knowledge of our customers using ethnography, observation, behavioral science and other tools to enter the minds of our customers, employees or partners.

You’ll see at 9 o’clock in the image above that it is at this point that user-driven organizations have their business artists use their perception skills to investigate the culture and philosophy underpinning this new understanding of behavior, and then pass it off to their designers to look at through the production lens in order to convert it into new utility.

Designers in many organizations are adept at finding a useful application for a new behavioral understanding, and in user-driven organizations this work is done before engineers get a peek and begin to imagine how they can build this utility for users in a way that creates new knowledge and differentiates the products or services of their organization from those of the competition.

And in user-driven organizations scientists are called in as needed to help overcome any barriers engineers encounter in realizing the solution that best satisfies the users’ identified needs, while leveraging new scientific perceptions that help shape our understanding of nature and empower new philosophical beliefs about what’s possible.
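Seen this way, the cycle is just an ordered ring of four quadrants that can be walked in either direction. Here is a small illustrative sketch in Python (my own construction, not part of Oxman’s work) pairing each stop on the ring with the guiding questions listed above, clockwise for technology-led innovation and counterclockwise for user-led innovation:

```python
# Illustrative sketch only: the quadrants and questions come from the
# discussion above; the traversal logic is my own construction.

QUADRANTS = ["Science", "Engineering", "Design", "Art"]

QUESTIONS = {
    "clockwise": [  # technology-led, starting at Science
        "What information do we have about what might be possible?",
        "What knowledge needs to be obtained?",
        "What utility does the invention create?",
        "What behavior do we need to modify to encourage adoption?",
    ],
    "counterclockwise": [  # user-led, starting at Art
        "What information do we have about what is needed?",
        "What behavior should we observe?",
        "What would create utility for customers?",
        "What knowledge must we obtain to realize our solution vision?",
    ],
}

def traverse(direction: str):
    """Yield (quadrant, guiding question) pairs for one full cycle."""
    order = QUADRANTS if direction == "clockwise" else list(reversed(QUADRANTS))
    yield from zip(order, QUESTIONS[direction])

for quadrant, question in traverse("counterclockwise"):
    print(f"{quadrant}: {question}")
```

The mapping of questions onto quadrants is deliberately rough; the point is simply that choosing a direction of travel determines which question you ask first, and who in the organization gets to ask it.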

Conclusion

While we haven’t torn any wormholes through the fabric of the space-time continuum with this article, hopefully we have expanded your repertoire with some new tools to facilitate conscious choices around whether you are going to pursue technology-led innovation (clockwise) or user-led innovation (counterclockwise).

Hopefully we have also shown you a better way of visualizing where you are in your innovation journey and where the turning points in your innovation pursuits lie as you seek to take a quantum leap and transform your past into a bright, shiny future.

So now it is time to answer the question you had at the beginning of this article… Is time travel possible?

Well, nearly a decade ago NASA ran an experiment that confirmed elements of Einstein’s theory of relativity, specifically that the fabric of space-time warps around the Earth in response to gravity. Read about it here.

And yes, time travel is theoretically possible, or at least the passage of time is not constant, as described in this NASA article.
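As a back-of-envelope illustration that the passage of time really isn’t constant, here is a short sketch (my own worked example, not drawn from either NASA article) using the standard weak-field approximation for gravitational time dilation. It estimates how much faster a clock at GPS altitude ticks compared with one on the ground, which is why GPS satellite clocks have to be corrected:

```python
# Back-of-envelope sketch (my own example): how much faster a clock at
# GPS altitude ticks than one on Earth's surface, using the weak-field
# gravitational time-dilation approximation
#     delta_f / f  ~=  (G * M / c**2) * (1/r_surface - 1/r_orbit)

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24         # mass of the Earth, kg
c = 2.998e8          # speed of light, m/s
r_surface = 6.371e6  # Earth's mean radius, m
r_orbit = r_surface + 2.02e7  # GPS orbital altitude of ~20,200 km

rate_difference = (G * M / c**2) * (1 / r_surface - 1 / r_orbit)
microseconds_per_day = rate_difference * 86400 * 1e6

print(f"Fractional rate difference: {rate_difference:.2e}")
print(f"Clock gain at GPS altitude: {microseconds_per_day:.1f} microseconds/day")
# Roughly 45.7 microseconds/day from gravity alone; the satellites'
# orbital speed claws back about 7 of those, leaving the ~38
# microseconds/day that the GPS system must correct for.
```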

Neither of those NASA results indicates that it is possible to travel backwards in time (despite what Superman physics says), only that we can affect how time advances. But if anyone wants to invest a million dollars in my time travel startup, I’ll cash your check. Because who knows, maybe your check is what will finally make time travel possible!

Anyone? Anyone? Bueller?


Image credits: Neri Oxman, MIT Media Lab; Rich Gold; John Maeda; Pixabay


Using Gravity to Save and Improve Lives


I came across an IndieGogo project focused on building and trialing a gravity-powered generator that can serve either as a lantern or as a flexible power source, able to power a task light, recharge batteries, or potentially do other things that users might dream up that the designers can’t yet imagine.

Check out their video from IndieGogo:

They have already raised FIVE TIMES the money they set out to raise on IndieGogo.

I found it interesting that, in their promotional video, they say they initially started with the design challenge of creating a solar-charged light for indoor use, but decided to abandon the approach specified at the outset and pursue alternate power sources.

Also interesting from the IndieGogo project page are the following facts:

The World Bank estimates that, as a result of burning kerosene for light, 780 million women and children inhale smoke equivalent to smoking two packs of cigarettes every day. 60% of adult female lung-cancer victims in developing nations are non-smokers. The fumes also cause eye infections and cataracts, but burning kerosene is also more immediately dangerous: 2.5 million people a year, in India alone, suffer severe burns from overturned kerosene lamps. Burning kerosene also comes with a financial burden: kerosene for lighting ALONE can consume 10 to 20% of a household’s income. This burden traps people in a permanent state of subsistence living, buying cupfuls of fuel for their daily needs, as and when they can.

The burning of kerosene for lighting also produces 244 million tonnes of carbon dioxide annually.
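Before answering, it is worth putting the physics on a napkin. Here is a back-of-envelope sketch using assumed, hypothetical numbers (the project’s published specifications may differ): the potential energy released by a slowly falling bag of sand or rocks is tiny by grid standards, but a modern low-power LED needs very little:

```python
# Back-of-envelope sketch with assumed, hypothetical numbers (not the
# project's published specs): can a slowly falling weight run an LED?

m = 12.0           # kg    - assumed mass of the bag of sand or rocks
g = 9.81           # m/s^2 - gravitational acceleration
h = 1.8            # m     - assumed drop height
t = 25 * 60        # s     - assumed descent time (25 minutes)
efficiency = 0.5   # assumed combined gearbox + generator efficiency

energy_j = m * g * h                      # potential energy released (E = mgh)
mechanical_w = energy_j / t               # average mechanical power
electrical_w = mechanical_w * efficiency  # usable electrical power

print(f"Energy released:  {energy_j:.0f} J")
print(f"Electrical power: {electrical_w * 1000:.0f} mW")
# Roughly 210 J and ~70 mW under these assumptions: negligible by grid
# standards, but a modern low-power LED can turn tens of milliwatts
# into useful room light, which is why the idea is plausible at all.
```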

So, what do you think, a meaningful innovation or an interesting but impractical invention?

More information is available on their web site here.



The Code for Successful Innovation

I had the opportunity to attend the Front End of Innovation conference in Boston a couple of years ago, and of the three days of sessions, I have to say that, unlike most people, my favorite session was that of Dr. Clotaire Rapaille. The author of “The Culture Code: An Ingenious Way to Understand Why People Around the World Live and Buy as They Do”, Dr. Rapaille regaled the crowd with his thoughts on ‘codes’ and ‘imprints’.

This particular session was the one that most closely matched how I view the front end of innovation. For me, the front end has nothing to do with ideas or managing ideas; instead, it is all about uncovering the key insights on which to build your ideation.

Now, there are lots of insights you can build your ideation on top of to create potentially innovative ideas. Consumer insight is one of those building blocks, and it is the one on which Clotaire Rapaille has built his empire. Dr. Rapaille’s core premise is that there is a ‘code’ for each product and service that drives its purchase and adoption. That ‘code’ in turn is driven by the ‘imprints’ people make when they first understand what something is, together with the sensations and feelings they associate with it.

For example, kids don’t grow up drinking coffee, but they do grow up smelling coffee from a very young age, most often in the home. So most of us imprint coffee to the home and our mothers, and have stronger feelings about the smell of coffee than about its taste. What does this mean for coffee sellers? Well, instead of focusing on taste to drive sales (the logical response), they are more likely to have success by focusing on the smell and on creating images that make the product feel like home.

Taking the concept of ‘codes’ and ‘imprints’ further, Dr. Rapaille spoke about how he doesn’t trust what people say, and so he instead focuses on what people do. If you look back at the coffee example, our logical brain would tell us to prefer the coffee that tastes the best, but the reptilian brain will prefer the coffee that smells the best because of the strength of the imprinting. And according to Dr. Rapaille, the reptilian brain always wins.

To make his point, Dr. Rapaille talked about why we remember our dreams – because the cortex arrives late for work. Translation? Our logical brain (the cortex) arrives after a decision has already been made by the reptilian brain or the emotional brain, and so the logical brain gets put to work justifying that decision with logical reasons. How else, after all, would you explain the purchase of a Hummer?

Sounds easy, right? Well, it gets more complicated as culture gets involved. In another of Rapaille’s examples, one not shared at the event, the code for a Jeep in the United States is ‘horse’, so the headlights should be round rather than square, because horses have round eyes; in France, the code for Jeep is ‘freedom’, because of the strength of WWII liberation imprints. This means the marketing strategy for Jeep in France is completely different than in the United States.

Because imprints generally happen at a very young age, and given the reach of Dr. Rapaille’s work, you can see very quickly why so many organizations market to children, even for products intended for adults – seemingly as a way to make sure that ‘imprints’ are made so that there is consumer demand to draw on in the future. Or is that a conspiracy theory at work?

At the Front End of Innovation, Dr. Rapaille also spoke about how, when it comes to technology, people want to be amazed, they want the technology to be magical, and, to use his favorite phrase, they want to say “wow!” For wow to happen in technology, according to Dr. Rapaille, we must strive for simplicity – one magical step with no cables.

Meanwhile, in our organizations we must try to identify what our organization’s ‘code’ is and better leverage multi-disciplinary, multi-cultural teams to drive creativity, while also being careful not to change the code of the organization so much that people don’t recognize it or trust in it. And finally, to cite one of Dr. Rapaille’s many generalizations: Americans love to try things (they learn that way), and they love the impossible, so don’t be afraid to ask them to do it.

When I distill all of what he had to say at the event and elsewhere, for me it boils down to one key insight about the limitations of innovation methodologies like:

  • Customer-led innovation
  • Needs-based innovation
  • Jobs-to-be-done

This insight is that asking customers what they want is problematic because of the inconsistencies between imprints and intellect, between the reptilian brain and the logical brain, and between knowing and doing. Taken together, this ties in nicely with something I have believed for a while now…

When it comes to driving adoption, it matters less what you say and more what you can get others to do. As marketers we are far too focused on trying to get people to ‘tell a friend’. We should be more focused on getting people to ‘show a friend’.

So, what is your code for successful innovation?

What do you want others to show?

Please think about it and let me know what you come up with in the comments.

For those of you who want to know more, check out this embedded video from PBS’ “The Persuaders” with Douglas Rushkoff:


Innovation Requires Diagonal Thinking

A back-and-forth dialog on Twitter with Scramray E. Pinkus generated a lovely quote worth sharing:

“Innovating is like thinking diagonally. A perfect combination of both linear and lateral.”

– Scramray E. Pinkus (@Easelton)

The conversation sprang out of a tweet I posted postulating that when people use technology (iPads, smartphones, laptops, etc.) and television as child minders, they are actually promoting linear thinking in their children at the expense of the lateral thinking our society so desperately needs. We need strong lateral thinking to complement the dominant linear thinking out there, so that together they can drive the social innovation the world needs to fix this mess we’ve made.

What do you think?

Technology as child minder, positive or negative effects on the innovative capacity of our children?

One of my proof points is this article from The Washington Post.

Any other proof points out there?
