Tag Archives: Science

The Event That Made Einstein an Icon

GUEST POST from Greg Satell

On April 3rd, 1921, a handful of journalists went to interview a relatively unknown scientist named Albert Einstein. When they arrived to meet his ship, they found a crowd of thousands waiting for him, screaming with adulation. Surprised by his apparent popularity, and charmed by his genial personality, the journalists put the story of Einstein’s arrival on the front pages of major newspapers.

It was all a bit of a mistake. The people in the crowd weren’t there to see Einstein, but Chaim Weizmann, the popular Zionist leader with whom Einstein was traveling. Nevertheless, that’s how Einstein gained his iconic status. In a way, Einstein didn’t become famous because of relativity; relativity became famous because of Einstein.

This, of course, in no way lessens Einstein’s accomplishments, which were considerable. Yet as Albert-László Barabási, another highly accomplished scientist, explains in The Formula, there is a big difference between success and accomplishment. The truth is that success isn’t what you think it is but, with talent, persistence and some luck, anyone can achieve it.

There Is Virtually No Limit To Success, But There Is To Accomplishment

Einstein was, without a doubt, one of the great scientific minds in history. Yet the first half of the 20th century was a golden age for physics, with many great minds. Niels Bohr, Einstein’s sparring partner in the famous Bohr–Einstein debates (which Bohr is widely considered to have won), was at least as prominent. Yet Einstein towers over all of them.

It’s not just physicists, either. Why is it that Einstein has become a household name and not, say, Watson and Crick, who discovered the structure of DNA, an accomplishment at least as important as relativity? Even less well known is Paul Erdős, the most prolific mathematician since Euler in the 18th century, and a man with an outrageous personality to boot.

For that matter, consider Richard Feynman, who is probably the second most famous physicist of the 20th century. He was, by all accounts, a man of great accomplishment and charisma. However, his fame is probably due more to his televised performance during the investigation of the Space Shuttle Challenger disaster than to his theory of quantum electrodynamics.

There are many great golfers, but only one Tiger Woods, just as there are many great basketball players, but only one LeBron James. The truth is that individual human accomplishment is bounded, but success isn’t. Tiger Woods can’t possibly hit every shot perfectly any more than LeBron James can score every point. But chances are, both will outshine all others in the public consciousness, which will drive their fame and fortune.

What’s probably most interesting about Einstein’s fame is that it grew substantially even as he ceased to be a productive scientist, long after he had become, as Robert Oppenheimer put it, “a landmark, not a beacon.”

Success Relies On Networks

Let’s try to deconstruct what happened after Einstein’s arrival in the United States. The day after thousands came to greet Weizmann, and reporters mistakenly assumed they were there for Einstein, he appeared on the front pages of major newspapers like The New York Times and The Washington Post. For many readers, it may have been the first time they had heard of any physicist.

As I noted above, this period was something of a heyday for physics, with the basic principles of quantum mechanics first becoming established, so it was a topic that was increasingly discussed. Few could understand the details, but many remembered the genius with the crazy white hair they saw in the newspaper. When the subject of physics came up, people would discuss Einstein, which spread his name further.

Barabási himself established this principle of preferential attachment in networks, also known as the “rich get richer” phenomenon or the Matthew effect. When a particular node gains more connections than its rivals, it tends to gain future connections at a faster rate. Even a slight change in early performance leads to a major advantage going forward.
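
Preferential attachment is easy to see in a simulation. Below is a minimal sketch in Python (the node count, starting network and random seed are illustrative assumptions of mine, not details from The Formula): each new node links to an existing node with probability proportional to that node’s current number of connections.

```python
import random

def preferential_attachment(n_nodes, seed=None):
    """Grow a network one node at a time; each newcomer attaches to one
    existing node chosen with probability proportional to its degree."""
    rng = random.Random(seed)
    degrees = {0: 1, 1: 1}   # start from a single edge between nodes 0 and 1
    urn = [0, 1]             # each node appears in the urn once per unit of degree
    for new in range(2, n_nodes):
        target = rng.choice(urn)       # "rich get richer" sampling
        degrees[new] = 1
        degrees[target] += 1
        urn.extend([new, target])      # both endpoints of the new edge gain degree
    return degrees

degrees = preferential_attachment(10_000, seed=42)
ranked = sorted(degrees.values(), reverse=True)
print("Top 5 node degrees:", ranked[:5])           # a few hubs dominate
print("Median degree:", ranked[len(ranked) // 2])  # most nodes stay obscure
```

Even though every node plays by exactly the same rules, the early leaders end up with orders of magnitude more connections than the median node, which is the network version of how a single front-page story can compound into lasting fame.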

In his book, Barabási details how this principle applies to things as diverse as petitions on Change.org, projects on Kickstarter and books on Amazon. It also applies to websites on the Internet, computers in a network and proteins in our bodies. Look at any connected system and you’ll see preferential attachment at work.

Small Groups, Loosely Connected

The civil rights movement will always be associated with Martin Luther King Jr., but he was far from a solitary figure. In fact, he was just one of the Big Six of civil rights. Yet few today speak of the others. The only one besides King still relatively famous today is John Lewis, and that’s largely because of his present role as a US congressman.

None of these men was a solitary figure either; each led his own organization, such as the NAACP, the National Urban League and CORE, and these organizations, in turn, had hundreds of local chapters. It was King’s connection to all of these that made him the historic icon we know today, because it was all of those small groups, loosely connected, that made up the movement.

In my book, Cascades, I explain how many movements fail to bring about change by trying to emulate events like the March on Washington without first building small groups, loosely connected, but united by a shared purpose. It is those groups, far more than any charismatic personality or inspirational speech, that make a movement powerful.

It also helps explain something about Einstein’s iconic status. He was on the ship with Weizmann not as a physicist, but as a Zionist activist, and that dual status connected him to two separate networks of loosely connected small groups, which enhanced his prestige. So it is quite possible, if not probable, that we equate Einstein with genius today, and not, say, Bohr, as much because of his political activity as because of his scientific talent.

Randomness Rewards Persistence

None of this should be taken to mean that Einstein could have become a legendary icon if he hadn’t made truly landmark discoveries. It was the combination of his prominence in the scientific community and the happy accident of Weizmann’s adoring crowds being mistaken for his own that made him a historic figure.

Still, we can imagine an alternate universe in which Einstein becomes just as famous. He was, for example, enormously quotable and very politically active. (He was, at one time, offered the presidency of Israel.) So it is entirely possible that some other event, combined with his very real accomplishments, would have catapulted him to fame. There is always an element of luck and randomness in every success.

Yet Einstein’s story tells us some very important things about what makes a great success. It is not, as many tell us, simply a matter of working hard to achieve something because human performance is, as noted above, bounded. You can be better than others, but not that much better. At the same time, it takes more than just luck. It is a combination of both and we can do much to increase our chances of benefiting from them.

Einstein was incredibly persistent, working for ten years on special relativity and another ten for general relativity. He was also a great connector, always working to collaborate with other scientists as well as political figures like Weizmann and even little girls needing help with their math homework. That’s what allowed him to benefit from loosely connected small groups.

Perhaps most importantly, these principles of persistence and connection are ones that any of us can apply. We might not all be Einsteins, but with a little luck, we just might make it someday.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credit: misterinnovation.com

The Eureka Moment Fallacy

GUEST POST from Greg Satell

In 1928, Alexander Fleming arrived at his lab to find that a mysterious mold had contaminated his Petri dishes and was eradicating the bacteria colonies he was trying to grow. Intrigued, he decided to study the mold. That’s how Fleming came to be known as the discoverer of penicillin.

Fleming’s story is one that is told and retold because it reinforces so much about what we love about innovation. A brilliant mind meets a pivotal moment of epiphany and — Eureka! — the world is forever changed. Unfortunately, that’s not really how things work. It wasn’t true in Fleming’s case and it won’t work for you.

The truth is that innovation is never a single event, but a process of discovery, engineering and transformation, which is why penicillin didn’t become commercially available until 1945 (and the drug was actually a different strain of the mold than Fleming had discovered). We need to stop searching for Eureka moments and get busy with the real work of innovating.

Learning To Recognize And Define Problems

Before Fleming, there was Ignaz Semmelweis, and to understand Fleming’s story it helps to understand that of his predecessor. Much like Fleming, Semmelweis was a bright young man of science who had a moment of epiphany. In Semmelweis’s case, he was one of the first to realize that infections could spread from doctor to patient.

That simple insight led him to institute a strict regime of hand washing at Vienna General Hospital. Almost immediately, the incidence of deadly childbed fever dropped precipitously. Yet his ideas were not accepted at the time and Semmelweis didn’t do himself any favors by refusing to format his data properly or to work collaboratively to build support for his ideas. Instead, he angrily railed against the medical establishment he saw as undermining his work.

Semmelweis would die in an insane asylum, ironically from an infection he contracted while under care, never getting to see the germ theory of disease emerge from the work of people like Louis Pasteur and Robert Koch. That work led to the study of bacteriology and sepsis, and eventually to Alexander Fleming growing those cultures that were contaminated by the mysterious mold.

When Fleming walked into his lab on that morning in 1928, he was bringing a wealth of experiences to the problem. During World War I, he had witnessed many soldiers die from sepsis and how applying antiseptic agents to the wound often made the problem worse. Later, he found that nasal secretions inhibited bacterial growth.

So when the chance discovery of penicillin happened, it was far from a single moment, but rather a “happy accident” that he had spent years preparing for.

Combining Domains

Today, we remember Fleming’s discovery of penicillin as a historic breakthrough, but it wasn’t considered to be so at the time. In fact, when it was first published in the British Journal of Experimental Pathology, nobody really noticed. The truth is that what Fleming discovered couldn’t have cured anybody. It was just a mold secretion that killed bacteria in a Petri dish.

Perhaps even more importantly, Fleming was ill-equipped to transform penicillin into something useful. He was a pathologist who largely worked alone. To transform his discovery into an actual cure, he would need chemists and other scientists, as well as experts in fermentation, manufacturing, logistics and many other things. To go from milliliters in the lab to metric tons in the real world is no trivial thing.

So Fleming’s paper lay buried in a scientific journal for ten years before it was rediscovered by a team led by Howard Florey and Ernst Chain at the University of Oxford. Chain, a world-class biochemist, was able to stabilize the penicillin compound and another member of the team, Norman Heatley, developed a fermentation process to produce it in greater quantities.

Because Florey and Chain led a larger team in a bigger lab, they also had the staff and equipment to perform experiments on mice, which showed that penicillin was effective in treating infections. However, when they tried to cure a human, they found that they were not able to produce enough of the drug. They simply didn’t have the capacity.

Driving A Transformation

By the time Florey and Chain had established the potential of penicillin it was already 1941 and England was at war, which made it difficult to find funding to scale up their work. Luckily, Florey had spent time as a research fellow in the United States earlier in his career and was able to secure a grant to travel to America and continue the development of penicillin with US-based labs.

That collaboration produced two more important breakthroughs. First, they were able to identify a more powerful strain of the penicillin mold. Second, they developed a fermentation process utilizing corn steep liquor as a medium. Corn steep liquor was common in the American Midwest, but virtually unheard of back in England.

Still, they needed to figure out a way to scale up production and that was far beyond the abilities of research scientists. However, the Office of Scientific Research and Development (OSRD), a government agency in charge of wartime research, understood the potential of penicillin for the war effort and initiated an aggressive program, involving two dozen pharmaceutical companies, to overcome the challenges.

Working feverishly, they were able to produce enough penicillin to deploy the drug for D-Day in 1944, saving untold thousands of lives. After the war was over, in 1945, penicillin was made commercially available, which touched off a “golden age” of antibiotic research in which new drugs were discovered almost every year between 1950 and 1970.

Innovation Is Never A Single Event

The story of Fleming’s Eureka! moment is romantic and inspiring, but also incredibly misleading. It wasn’t one person and one moment that changed the world, but the work of many over decades that made an impact. As I explain in my book, Cascades, it is small groups, loosely connected, but united by a shared purpose that drive transformational change.

In fact, the development of penicillin involved not one, but a series of epiphanies. First, Fleming discovered penicillin. Then, Florey and Chain rediscovered Fleming’s work. Chain stabilized the compound, Heatley developed the fermentation process, other scientists identified the more powerful strain and corn steep liquor as a fermentation medium. Surely, there were many other breakthroughs involving production, logistics and treatment that are lost to history.

This is not the exception, but the rule. The truth is that the next big thing always starts out looking like nothing at all. For example, Jim Allison, who recently won the Nobel Prize for his development of cancer immunotherapy, had his idea rejected by pharmaceutical companies, much like the medical establishment dismissed Semmelweis back in the 1850s.

Yet Allison kept at it. He continued to pound the pavement, connect and collaborate with others, and that’s why today he is hailed as a pioneer and a hero. That’s why we need to focus less on inventions and more on ecosystems. It’s never a single moment of Eureka! that truly changes the world, but many of them.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels

Are We Abandoning Science?

GUEST POST from Greg Satell

A recent Pew poll found that, while Americans generally hold scientific expertise in high regard, there are deep pockets of mistrust. For example, less than half of Republicans believe that scientists should take an active role in policy debates and significant minorities question the transparency and integrity of scientific findings.

An earlier study done by researchers at Ohio State University found that, when confronted with scientific evidence that conflicted with their pre-existing views, such as the reality of climate change or the safety of vaccines, partisans would not only reject the evidence, but become hostile and question the objectivity of science.

This is a major problem, because if we are only willing to accept evidence that agrees with what we already think we know, we are unlikely to advance our understanding. Perhaps even worse, it opens us up to being influenced by pundits, those with strong opinions but questionable expertise. When we turn our backs on science, we turn our backs on truth.

The Rise Of Science

When René Descartes wrote “I think, therefore I am” in the mid-1600s, he was doing more than coining a clever phrase; he was making an argument for a rational world ruled by pure logic. He believed that you could find the answers to the problems you needed to solve merely by thinking about them clearly.

Yet Descartes and his rational movement soon ran out of steam. Many of the great minds that followed, such as John Locke and David Hume, took a more empirical view and argued that we can only truly understand the world around us through our experiences, however flawed and limited they may be.

It was this emphasis on experiences that led us to the concept of expertise. As the Renaissance and the Enlightenment gave way to the modern world, knowledge became specialized. It was no longer enough to think about things; the creation of knowledge came to be seen as arising from a scientific process of testing hypotheses through experiment.

This was a major shift, because you could no longer simply argue about things like how many angels could fit on the head of a pin; you actually had to put your thoughts to the test. Others could then examine the same evidence and see if they came to the same conclusions as you did. Thinking about things wasn’t enough; you had to show that they worked in the real world.

The Soccer Ball You Can’t See

Science is a funny thing, full of chance discoveries, strange coincidences and unlikely moments of insight. In his book, The God Particle, the Nobel Prize-winning physicist Leon Lederman tells a metaphorical story about an alien race watching a soccer game to illustrate how science is practiced.

These aliens are very much like humans, except that they cannot see black and white patterns. If they went to a soccer game, they would be utterly confused to see a bunch of guys running around a field for no apparent reason. They could come up with theories, formulas and other conjectures, but they would fail to make useful predictions.

Eventually, one might notice a slight bulge in the net of the goal just as the crowd erupted in a cheer and come up with a crazy idea about an invisible ball. Through further observation, they could test the hypothesis and build evidence. Although they could never actually see the ball, they could catalogue its effects and use them to understand events.

His point is that science is not common sense. It deals with things that we do not directly experience, but that nevertheless have concrete effects on the world we live in. Today, we live in a world of the visceral abstract, where oddball theories like relativity and quantum mechanics result in real innovations like GPS, microprocessors and the Internet.

Cargo Cult Science

Because so much of science deals with stuff we can’t directly experience, we need metaphors like Lederman’s story about the aliens to make sense of things. Part of the fun of science is letting your imagination run wild and seeing where things go. Then you can test those ideas to see if they actually reflect reality.

The problem is that pundits and flakes can also let their imagination run wild, without bothering to test whether their ideas are true. Consider the anti-vax movement, which has no scientific basis, but has gone viral and led to a resurgence of diseases that had been nearly eradicated. Nevertheless, dressed up in scientific-sounding words, the idea that vaccines cause disease in children can be very convincing.

The physicist Richard Feynman called this cargo cult science, after a strange phenomenon on some islands in the South Pacific in which tribes try to mimic the use of technology. For example, they build mock airstrips in the hope that airplanes will appear with valuable cargo.

What makes science real is not fancy sounding words or white lab coats, but the fact that you work under certain constraints. You follow the scientific method, observe professional standards and subject your work to peer review. Pundits, on the other hand, do none of these things. Simply having an opinion on a subject will suffice.

The New Mysticism

Clearly, science is what created the modern world. Without science, you cannot have technology, and without technology, you cannot create prosperity. So, in purely economic terms, science is extremely important to our success as a society. We need science in order to progress.

Yet in broader terms, science is the search for truth. In a nutshell, science is the practice of coming up with testable statements and seeing whether they hold up. That’s what separates Darwin’s theory of natural selection and the big bang from nonscientific theories: the former are matters of science, which can be tested through experiment and observation; the latter are matters of faith and belief.

Consider what Marco Rubio said in an interview with GQ about the age of the universe a few years ago:

“I think the age of the universe has zero to do with how our economy is going to grow. I’m not a scientist. I don’t think I’m qualified to answer a question like that. At the end of the day, I think there are multiple theories out there on how the universe was created and I think this is a country where people should have the opportunity to teach them all.”

Yet the big bang is not just a theory, but the result of a set of theories, including general relativity and quantum mechanics, combined with many observations over a period of decades. Students in physics class are supposed to learn about the big bang not to shape their religious beliefs, but because of its importance to those underlying theories.

And those concepts are central to our everyday lives. We use relativity to calibrate GPS satellites, so that we can find restaurants and target missiles. Quantum mechanics gave us lasers and microprocessors, from which we make barcode scanners and iPhones. In fact, the theories underlying the big bang are essential for our modern economy to function.
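
To make the GPS example concrete, here is a rough back-of-envelope sketch in Python (the constants and simplified formulas are my own approximations, not from the original article) estimating how far GPS satellite clocks would drift each day if engineers ignored relativity, and how large the resulting ranging error would be.

```python
# Back-of-envelope estimate of relativistic clock drift for a GPS satellite.
C = 2.998e8            # speed of light, m/s
GM = 3.986e14          # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6      # Earth's radius, m
R_SAT = 2.66e7         # GPS orbital radius (~20,200 km altitude), m
SECONDS_PER_DAY = 86_400

v_sat = (GM / R_SAT) ** 0.5                      # orbital speed, roughly 3.9 km/s

# Special relativity: a moving clock runs slow by about v^2 / (2 c^2)
sr_drift = -(v_sat ** 2) / (2 * C ** 2) * SECONDS_PER_DAY

# General relativity: a clock higher in Earth's gravity well runs fast
gr_drift = (GM / C ** 2) * (1 / R_EARTH - 1 / R_SAT) * SECONDS_PER_DAY

net_drift = sr_drift + gr_drift                  # net gain, tens of microseconds/day
ranging_error_km = net_drift * C / 1000          # distance light travels in that time

print(f"Special relativity: {sr_drift * 1e6:+.1f} microseconds/day")
print(f"General relativity: {gr_drift * 1e6:+.1f} microseconds/day")
print(f"Net clock drift:    {net_drift * 1e6:+.1f} microseconds/day")
print(f"Uncorrected ranging error after one day: ~{ranging_error_km:.0f} km")
```

Run as written, it shows the satellite clocks gaining on the order of tens of microseconds per day, which translates into kilometres of positioning error; that is why the relativistic corrections are designed into the system.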

When we turn our backs on science, what we are left with is essentially a form of mysticism. We can listen to our inner voices to decide what we believe and, when faced with a competing idea, dismiss it as nothing more than someone else’s inner voice. Once we make truth a matter of opinion, we start down a slippery slope.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Is China Our New Sputnik Moment?

GUEST POST from Greg Satell

When the Soviets launched Sputnik, the first space satellite, into orbit in 1957, it was a wake-up call for America. Over the next year, President Eisenhower would sign the National Defense Education Act to spur science education, increase funding for research and establish NASA and DARPA to spur innovation.

A few years ago, a report by the Council on Foreign Relations (CFR) argued that we are at a similar point today, but with China. While we have been steadily decreasing federal investment in R&D over the past few decades, our Asian rival has been ramping up and now threatens our leadership in key technologies such as AI, genomics and quantum information technology.

Clearly, we need to increase our commitment to science and innovation and that means increasing financial investment. However, what the report makes clear is that money alone won’t solve the problem. We are, in several important ways, actually undermining our ability to innovate, now and in the future. We need to renew our culture of innovation in America.

Educating And Attracting Talent

The foundation of an innovation economy is education, especially in STEM subjects. Historically, America has had the world’s best-educated workforce, but more recently we’ve fallen to fifth among OECD countries for post-secondary education. That’s alarming and something we will certainly need to reverse if we are to compete effectively.

Our educational descent can be attributed to three major causes. First, the rest of the world has become more educated, so the competition has become stiffer. Second is financing: tuition has nearly tripled in the last decade, and student debt has become so onerous that it now takes about 20 years to pay off four years of college. Third, we need to work harder to attract talented people to the United States.

The CFR report recommends developing a “21st century National Defense Education Act” to create scholarships in STEM areas and making it easier for foreign students to get Green Cards when they graduate from our universities. It also points out that we need to work harder to attract foreign talent, especially in high impact areas like AI, genomics and quantum computing.

Unfortunately, we seem to be going the other way. The number of international students coming to American universities is declining. Policies like the Muslim ban and concerns about gun violence are deterring scientific talent from coming here. The denial rate for H-1B visas increased from 4% in 2016 to 18% in the first quarter of 2019.

Throughout our history, it has been our openness to new people and new ideas that has made America exceptional. It’s a legitimate question whether that’s still true.

Building Technology Ecosystems

In the 1980s, the US semiconductor industry was on the ropes. Due to increased competition from low-cost Japanese manufacturers, the American share of the DRAM market fell from 70% to 20%. The situation not only had a significant economic impact; it also had important national security implications.

The federal government responded with two initiatives, the Semiconductor Research Corporation and SEMATECH, both of which were nonprofit consortiums that involved government, academia and industry. By the 1990s, American semiconductor manufacturers were thriving again.

Today, we have similar challenges with rare earth elements, battery technology and many manufacturing areas. The Obama administration responded by building similar consortiums to those that were established for semiconductors: The Critical Materials Institute for rare earth elements, JCESR for advanced batteries and the 14 separate Manufacturing Institutes.

Yet here again, we seem to be backsliding. The current administration has sought to slash funding for the Manufacturing Extension Partnership that supports small and medium-sized producers. An addendum to the CFR report also points out that the administration has pushed for a 30% cut in funding for the national labs, which support much of the advanced science critical to driving American technology forward.

Supporting International Trade and Alliances

Another historical strength of the US economy has been our open approach to trade. The CFR report points out that our role as a “central node in a global network of research and development,” gave us numerous advantages, such as access to foreign talent at R&D centers overseas, investment into US industry and cooperative responses to global challenges.

However, the report warns that “the Trump administration’s indiscriminate use of tariffs against China, as well as partners and allies, will harm U.S. innovative capabilities.” It also faults the Trump administration for pulling out of the Trans-Pacific Partnership trade agreement, which would have bolstered our relationship with Asian partners and increased our leverage over China.

The tariffs undermine American industry in two ways. First, because many of the tariffs are on intermediate goods which US firms use to make products for export, we’re undermining our own competitive position, especially in manufacturing. Second, because trade partners such as Canada and the EU have retaliated against our tariffs, our position is weakened further.

Clearly, we compete in an ecosystem driven world in which power does not come from the top, but emanates from the center. Traditionally, America has positioned itself at the center of ecosystems by constantly connecting out. Now that process seems to have reversed itself and we are extremely vulnerable to others, such as China, filling the void.

We Need to Stop Killing Innovation in America

The CFR report, whose task force included such luminaries as Admiral William McRaven, former Google CEO Eric Schmidt and economist Laura Tyson, should set alarm bells ringing. Although the report was focused on national security issues, it pertains to general competitiveness just as well and the picture it paints is fairly bleak.

After World War II, America stood almost alone in the world in terms of production capacity. Through smart policy, we were able to transform that initial advantage into long-term technological superiority. Today, however, we have stiff competition in areas ranging from AI to synthetic biology to quantum systems.

At the same time, we seem to be doing everything we can to kill innovation in America. Instead of working to educate and attract the world’s best talent, we’re making it harder for Americans to attain higher education and for top foreign talent to come and work here. Instead of ramping up our science and technology programs, presidential budgets regularly recommend cutting them. Instead of pulling our allies closer, we are pushing them away.

To be clear, America is still at the forefront of science and technology, vying for leadership in every conceivable area. However, as global competition heats up and we need to be redoubling our efforts, we seem to be doing just the opposite. The truth is that our prosperity is not a birthright to which we are entitled, but a legacy that must be lived up to.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Top 10 Human-Centered Change & Innovation Articles of March 2023

Drum roll please…

At the beginning of each month, we will profile the ten articles from the previous month that generated the most traffic to Human-Centered Change & Innovation. Did your favorite make the cut?

But enough delay, here are March’s ten most popular innovation posts:

  1. Taking Care of Yourself is Not Impossible — by Mike Shipulski
  2. Rise of the Prompt Engineer — by Art Inteligencia
  3. A Guide to Effective Brainstorming — by Diana Porumboiu
  4. What Disruptive Innovation Really Is — by Geoffrey A. Moore
  5. The 6 Building Blocks of Great Teams — by David Burkus
  6. Take Charge of Your Mind to Reclaim Your Potential — by Janet Sernack
  7. Ten Reasons You Must Deliver Amazing Customer Experiences — by Shep Hyken
  8. Deciding You Have Enough Opens Up New Frontiers — by Mike Shipulski
  9. The AI Apocalypse is Here – 3 Reasons You Should Celebrate! — by Robyn Bolton
  10. Artificial Intelligence is Forcing Us to Answer Some Very Human Questions — by Greg Satell

BONUS – Here are five more strong articles published in February that continue to resonate with people:

If you’re not familiar with Human-Centered Change & Innovation, we publish 4-7 new articles every week built around innovation and transformation insights from our roster of contributing authors and ad hoc submissions from community members. Get the articles right in your Facebook, Twitter or LinkedIn feeds too!

Have something to contribute?

Human-Centered Change & Innovation is open to contributions from any and all innovation and transformation professionals out there (practitioners, professors, researchers, consultants, authors, etc.) who have valuable human-centered change and innovation insights to share with everyone for the greater good. If you’d like to contribute, please contact me.

P.S. Here are our Top 40 Innovation Bloggers lists from the last three years:

Is Futurology a Pseudoscience?

GUEST POST from Art Inteligencia

Futurology (aka Future Studies or Futures Research) is a subject of study that attempts to make predictions and forecasts about the future. It is an interdisciplinary field that draws from a variety of sources, including science, economics, philosophy, and technology. In recent years, futurology has become a popular topic of debate, with some arguing that it is a pseudoscience and others defending its validity as a legitimate field of study.

One of the main criticisms of futurology is that it relies on speculation and extrapolation of existing trends, rather than on scientific evidence or principles. Critics argue that this makes futurists’ predictions unreliable and that futurology is more of a speculative activity than a rigorous scientific discipline. They also point out that predictions about the future are often wrong, and that the field has had a reputation for making exaggerated claims that have not been borne out by the facts.

“Futurology always ends up telling you more about your own time than about the future.” – Matt Ridley

On the other hand, proponents of futurology argue that the field has a legitimate place in the scientific community. They point to the fact that many futurists are well-educated, highly trained professionals who use rigorous methods and data analysis to make accurate predictions. These futurists also often draw on a wide range of sources, such as history, economics, and psychology, to make their forecasts.

Ultimately, the debate over whether or not futurology (aka future studies or futures research) is a pseudoscience is likely to continue. Some may see it as a legitimate field of study, while others may view it as little more than guesswork. What is certain, however, is that the field is still evolving and that the ability of futurists to accurately predict the future will be an important factor in determining its ultimate validity.

Do you think futurology is a pseudoscience?
(sound off in the comments)

And to the futurists and futurology professionals out there, what say you?
(add a comment)

Bottom line: Futurology and prescience are not fortune telling. Skilled futurologists and futurists use a scientific approach to create their deliverables, but a methodology and tools like those in FutureHacking™ can empower anyone to engage in futurology themselves.

Image credit: Pixabay

Have Humans Evolved Beyond Nature and a Need for It?

GUEST POST from Manuel Berdoy, University of Oxford

Our society has evolved so much, can we still say that we are part of Nature? If not, should we worry – and what should we do about it? Poppy, 21, Warwick.

Such is the extent of our dominion on Earth, that the answer to questions around whether we are still part of nature – and whether we even need some of it – rely on an understanding of what we want as Homo sapiens. And to know what we want, we need to grasp what we are.

It is a huge question – but the biggest questions are the best ones. And as a biologist, I offer a humble suggestion for how to address it, along with a personal conclusion. You may have a different one, but what matters is that we reflect on it.

Perhaps the best place to start is to consider what makes us human in the first place, which is not as obvious as it may seem.


This article is part of Life’s Big Questions

The Conversation’s new series, co-published with BBC Future, seeks to answer our readers’ nagging questions about life, love, death and the universe. We work with professional researchers who have dedicated their lives to uncovering new perspectives on the questions that shape our lives.


Many years ago, a novel written by Vercors called Les Animaux dénaturés (“Denatured Animals”) told the story of a group of primitive hominids, the Tropis, found in an unexplored jungle in New Guinea, who seem to constitute a missing link.

However, the prospect that this fictional group may be used as slave labour by an entrepreneurial businessman named Vancruysen forces society to decide whether the Tropis are simply sophisticated animals or whether they should be given human rights. And herein lies the difficulty.

Human status had hitherto seemed so obvious that the book describes how it is soon discovered that there is no definition of what a human actually is. Certainly, the string of experts consulted – anthropologists, primatologists, psychologists, lawyers and clergymen – could not agree. Perhaps prophetically, it is a layperson who suggested a possible way forward.

She asked whether some of the hominids’ habits could be described as the early signs of a spiritual or religious mind. In short, were there signs that, like us, the Tropis were no longer “at one” with nature, but had separated from it, and were now looking at it from the outside – with some fear.

It is a telling perspective. Our status as altered or “denatured” animals – creatures who have arguably separated from the natural world – is perhaps both the source of our humanity and the cause of many of our troubles. In the words of the book’s author:

All man’s troubles arise from the fact that we do not know what we are and do not agree on what we want to be.

We will probably never know the timing of our gradual separation from nature – although cave paintings perhaps contain some clues. But a key recent event in our relationship with the world around us is as well documented as it was abrupt. It happened on a sunny Monday morning, at 8.15am precisely.

A new age

The atomic bomb that rocked Hiroshima on August 6, 1945, was a wake-up call so loud that it still resonates in our consciousness many decades later.

The day the “sun rose twice” was not only a forceful demonstration of the new era that we had entered, it was a reminder of how paradoxically primitive we remained: differential calculus, advanced electronics and almost godlike insights into the laws of the universe helped build, well … a very big stick. Modern Homo sapiens seemingly had developed the powers of gods, while keeping the psyche of a stereotypical Stone Age killer.

We were no longer fearful of nature, but of what we would do to it, and ourselves. In short, we still did not know where we came from, but began panicking about where we were going.

We now know a lot more about our origins but we remain unsure about what we want to be in the future – or, increasingly, as the climate crisis accelerates, whether we even have one.

Arguably, the greater choices granted by our technological advances make it even more difficult to decide which of the many paths to take. This is the cost of freedom.

I am not arguing against our dominion over nature nor, even as a biologist, do I feel a need to preserve the status quo. Big changes are part of our evolution. After all, oxygen was first a poison which threatened the very existence of early life, yet it is now the fuel vital to our existence.

Similarly, we may have to accept that what we do, even our unprecedented dominion, is a natural consequence of what we have evolved into, and by a process nothing less natural than natural selection itself. If artificial birth control is unnatural, so is reduced infant mortality.

I am also not convinced by the argument against genetic engineering on the basis that it is “unnatural”. By artificially selecting specific strains of wheat or dogs, we had been tinkering more or less blindly with genomes for centuries before the genetic revolution. Even our choice of romantic partner is a form of genetic engineering. Sex is nature’s way of producing new genetic combinations quickly.

Even nature, it seems, can be impatient with itself.

Our natural habitat? Shutterstock

Changing our world

Advances in genomics, however, have opened the door to another key turning point. Perhaps we can avoid blowing up the world, and instead change it – and ourselves – slowly, perhaps beyond recognition.

The development of genetically modified crops in the 1980s quickly moved from early aspirations to improve the taste of food to a more efficient way of destroying undesirable weeds or pests.

In what some saw as the genetic equivalent of the atomic bomb, our early forays into a new technology became once again largely about killing, coupled with worries about contamination. Not that everything was rosy before that. Artificial selection, intensive farming and our exploding population growth were long destroying species quicker than we could record them.

The increasing “silent springs” of the 1950s and 60s, caused by the destruction of farmland birds – and, consequently, their song – were only the tip of a deeper and more sinister iceberg. There is, in principle, nothing unnatural about extinction, which has been a recurring pattern (sometimes of massive proportions) in the evolution of our planet since long before we came on the scene. But is it really what we want?

The arguments for maintaining biodiversity are usually based on survival, economics or ethics. In addition to preserving obvious key environments essential to our ecosystem and global survival, the economic argument highlights the possibility that a hitherto insignificant lichen, bacteria or reptile might hold the key to the cure of a future disease. We simply cannot afford to destroy what we do not know.

Is it this crocodile’s economic, medical or inherent value which should be important to us? Shutterstock

But attaching an economic value to life makes it subject to the fluctuation of markets. It is reasonable to expect that, in time, most biological solutions will be able to be synthesised, and as the market worth of many lifeforms falls, we need to scrutinise the significance of the ethical argument. Do we need nature because of its inherent value?

Perhaps the answer may come from peering over the horizon. It is somewhat ironic that, just as the start of the third millennium coincided with decrypting the human genome, the start of the fourth may be about whether it has become redundant.

Just as genetic modification may one day lead to the end of “Homo sapiens naturalis” (that is, humans untouched by genetic engineering), we may one day wave goodbye to the last specimen of Homo sapiens genetica. That is the last fully genetically based human living in a world increasingly less burdened by our biological form – minds in a machine.

If the essence of a human, including our memories, desires and values, is somehow reflected in the pattern of the delicate neuronal connections of our brain (and why should it not?) our minds may also one day be changeable like never before.

And this brings us to the essential question that surely we must ask ourselves now: if, or rather when, we have the power to change anything, what would we not change?

After all, we may be able to transform ourselves into more rational, more efficient and stronger individuals. We may venture out further, have greater dominion over greater areas of space, and inject enough insight to bridge the gap between the issues brought about by our cultural evolution and the abilities of a brain evolved to deal with much simpler problems. We might even decide to move into a bodiless intelligence: in the end, even the pleasures of the body are located in the brain.

And then what? When the secrets of the universe are no longer hidden, what makes it worth being part of it? Where is the fun?

“Gossip and sex, of course!” some might say. And in effect, I would agree (although I might put it differently), as it conveys to me the fundamental need that we have to reach out and connect with others. I believe that the attributes that define our worth in this vast and changing universe are simple: empathy and love. Not power or technology, which occupy so many of our thoughts but which are merely (almost boringly) related to the age of a civilisation.

True gods

Like many a traveller, Homo sapiens may need a goal. But from the strengths that come with attaining it, one realises that one’s worth (whether as an individual or a species) ultimately lies elsewhere. So I believe that the extent of our ability for empathy and love will be the yardstick by which our civilisation is judged. It may well be an important benchmark by which we will judge other civilisations that we may encounter, or indeed be judged by them.

When we can change everything about ourselves, what will we keep? Shutterstock

There is something of true wonder at the basis of it all. The fact that chemicals can arise from the austere confines of an ancient molecular soup, and through the cold laws of evolution, combine into organisms that care for other lifeforms (that is, other bags of chemicals) is the true miracle.

Some ancients believed that God made us in “his image”. Perhaps they were right in a sense, as empathy and love are truly godlike features, at least among the benevolent gods.

Cherish those traits and use them now, Poppy, as they hold the solution to our ethical dilemma. It is those very attributes that should compel us to improve the wellbeing of our fellow humans without lowering the condition of what surrounds us.

Anything less will pervert (our) nature.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image Credits: Pixabay, Shutterstock (via theconversation)

How will humans change in the next 10,000 years?

Future evolution: from looks to brains and personality

GUEST POST from Nicholas R. Longrich, University of Bath

READER QUESTION: If humans don’t die out in a climate apocalypse or asteroid impact in the next 10,000 years, are we likely to evolve further into a more advanced species than what we are at the moment? Harry Bonas, 57, Nigeria

Humanity is the unlikely result of 4 billion years of evolution.

From self-replicating molecules in Archean seas, to eyeless fish in the Cambrian deep, to mammals scurrying from dinosaurs in the dark, and then, finally, improbably, ourselves – evolution shaped us.

Organisms reproduced imperfectly. Mistakes made when copying genes sometimes made them better fit to their environments, so those genes tended to get passed on. More reproduction followed, and more mistakes, the process repeating over billions of generations. Finally, Homo sapiens appeared. But we aren’t the end of that story. Evolution won’t stop with us, and we might even be evolving faster than ever.


This article is part of Life’s Big Questions

The Conversation’s new series, co-published with BBC Future, seeks to answer our readers’ nagging questions about life, love, death and the universe. We work with professional researchers who have dedicated their lives to uncovering new perspectives on the questions that shape our lives.


It’s hard to predict the future. The world will probably change in ways we can’t imagine. But we can make educated guesses. Paradoxically, the best way to predict the future is probably looking back at the past, and assuming past trends will continue going forward. This suggests some surprising things about our future.

We will likely live longer and become taller, as well as more lightly built. We’ll probably be less aggressive and more agreeable, but have smaller brains. A bit like a golden retriever, we’ll be friendly and jolly, but maybe not that interesting. At least, that’s one possible future. But to understand why I think that’s likely, we need to look at biology.

The end of natural selection?

Some scientists have argued that civilisation’s rise ended natural selection. It’s true that selective pressures that dominated in the past – predators, famine, plague, warfare – have mostly disappeared.

Starvation and famine were largely ended by high-yield crops, fertilisers and family planning. Violence and war are less common than ever, despite modern militaries with nuclear weapons, or maybe because of them. The lions, wolves and sabertoothed cats that hunted us in the dark are endangered or extinct. Plagues that killed millions – smallpox, Black Death, cholera – were tamed by vaccines, antibiotics, clean water.

But evolution didn’t stop; other things just drive it now. Evolution isn’t so much about survival of the fittest as reproduction of the fittest. Even if nature is less likely to murder us, we still need to find partners and raise children, so sexual selection now plays a bigger role in our evolution.

And if nature doesn’t control our evolution anymore, the unnatural environment we’ve created – culture, technology, cities – produces new selective pressures very unlike those we faced in the ice age. We’re poorly adapted to this modern world; it follows that we’ll have to adapt.

And that process has already started. As our diets changed to include grains and dairy, we evolved genes to help us digest starch and milk. When dense cities created conditions for disease to spread, mutations for disease resistance spread too. And for some reason, our brains have got smaller. Unnatural environments create unnatural selection.

To predict where this goes, we’ll look at our prehistory, studying trends over the past 6 million years of evolution. Some trends will continue, especially those that emerged in the past 10,000 years, after agriculture and civilisation were invented.

We’re also facing new selective pressures, such as reduced mortality. Studying the past doesn’t help here, but we can see how other species responded to similar pressures. Evolution in domestic animals may be especially relevant – arguably we’re becoming a kind of domesticated ape, but curiously, one domesticated by ourselves.

I’ll use this approach to make some predictions, if not always with high confidence. That is, I’ll speculate.

Lifespan

Humans will almost certainly evolve to live longer – much longer. Life cycles evolve in response to mortality rates, how likely predators and other threats are to kill you. When mortality rates are high, animals must reproduce young, or might not reproduce at all. There’s also no advantage to evolving mutations that prevent ageing or cancer – you won’t live long enough to use them.

When mortality rates are low, the opposite is true. It’s better to take your time reaching sexual maturity. It’s also useful to have adaptations that extend lifespan, and fertility, giving you more time to reproduce. That’s why animals with few predators – animals that live on islands or in the deep ocean, or are simply big – evolve longer lifespans. Greenland sharks, Galapagos tortoises and bowhead whales mature late, and can live for centuries.

Even before civilisation, people were unique among apes in having low mortality and long lives. Hunter-gatherers armed with spears and bows could defend against predators; food sharing prevented starvation. So we evolved delayed sexual maturity, and long lifespans – up to 70 years.

Still, child mortality was high – approaching 50% or more by age 15. Average life expectancy was just 35 years. Even after the rise of civilisation, child mortality stayed high until the 19th century, while life expectancy went down – to 30 years – due to plagues and famines.

Then, in the past two centuries, better nutrition, medicine and hygiene reduced youth mortality to under 1% in most developed nations. Life expectancy soared to 70 years worldwide, and 80 in developed countries. These increases are due to improved health, not evolution – but they set the stage for evolution to extend our lifespan.

Now, there’s little need to reproduce early. If anything, the years of training needed to be a doctor, CEO, or carpenter incentivise putting it off. And since our life expectancy has doubled, adaptations to prolong lifespan and child-bearing years are now advantageous. Given that more and more people live to 100 or even 110 years – the record being 122 years – there’s reason to think our genes could evolve until the average person routinely lives 100 years or even more.

Size and strength

Animals often evolve larger size over time; it’s a trend seen in tyrannosaurs, whales, horses and primates – including hominins.

Early hominins like Australopithecus afarensis and Homo habilis were small, four to five feet (120cm-150cm) tall. Later hominins – Homo erectus, Neanderthals, Homo sapiens – grew taller. We’ve continued to gain height in historic times, partly driven by improved nutrition, but genes seem to be evolving too.

Why we got big is unclear. In part, mortality may drive size evolution; growth takes time, so longer lives mean more time to grow. But human females also prefer tall males. So both lower mortality and sexual preferences will likely cause humans to get taller. Today, the tallest people in the world are in Europe, led by the Netherlands. Here, men average 183cm (6ft); women 170cm (5ft 6in). Someday, most people might be that tall, or taller.

As we’ve grown taller, we’ve become more gracile. Over the past 2 million years, our skeletons became more lightly built as we relied less on brute force, and more on tools and weapons. As farming forced us to settle down, our lives became more sedentary, so our bone density decreased. As we spend more time behind desks, keyboards and steering wheels, these trends will likely continue.

Humans have also reduced our muscles compared to other apes, especially in our upper bodies. That will probably continue. Our ancestors had to slaughter antelopes and dig roots; later they tilled and reaped in the fields. Modern jobs increasingly require working with people, words and code – they take brains, not muscle. Even for manual laborers – farmers, fishermen, lumberjacks – machinery such as tractors, hydraulics and chainsaws now shoulders a lot of the work. As physical strength becomes less necessary, our muscles will keep shrinking.

Our jaws and teeth also got smaller. Early, plant-eating hominins had huge molars and mandibles for grinding fibrous vegetables. As we shifted to meat, then started cooking food, jaws and teeth shrank. Modern processed food – chicken nuggets, Big Macs, cookie dough ice cream – needs even less chewing, so jaws will keep shrinking, and we’ll likely lose our wisdom teeth.

Beauty

After people left Africa 100,000 years ago, humanity’s far-flung tribes became isolated by deserts, oceans, mountains, glaciers and sheer distance. In various parts of the world, different selective pressures – different climates, lifestyles and beauty standards – caused our appearance to evolve in different ways. Tribes evolved distinctive skin colour, eyes, hair and facial features.

With civilisation’s rise and new technologies, these populations were linked again. Wars of conquest, empire building, colonisation and trade – including trade of other humans – all shifted populations, which interbred. Today, road, rail and aircraft link us too. Bushmen would walk 40 miles to find a partner; we’ll go 4,000 miles. We’re increasingly one, worldwide population – freely mixing. That will create a world of hybrids – light brown skinned, dark-haired, Afro-Euro-Australo-Americo-Asians, their skin colour and facial features tending toward a global average.

Sexual selection will further accelerate the evolution of our appearance. With most forms of natural selection no longer operating, mate choice will play a larger role. Humans might become more attractive, but more uniform in appearance. Globalised media may also create more uniform standards of beauty, pushing all humans towards a single ideal. Sex differences, however, could be exaggerated if the ideal is masculine-looking men and feminine-looking women.

Intelligence and personality

Last, our brains and minds, our most distinctively human feature, will evolve, perhaps dramatically. Over the past 6 million years, hominin brain size roughly tripled, suggesting selection for big brains driven by tool use, complex societies and language. It might seem inevitable that this trend will continue, but it probably won’t.

Instead, our brains are getting smaller. In Europe, brain size peaked 10,000 to 20,000 years ago, just before we invented farming. Then brains got smaller. Modern humans have smaller brains than our ancient predecessors did, or even medieval people. It’s unclear why.

It could be that fat and protein were scarce once we shifted to farming, making it more costly to grow and maintain large brains. Brains are also energetically expensive – they burn around 20% of our daily calories. In agricultural societies with frequent famine, a big brain might be a liability.

Maybe hunter-gatherer life was demanding in ways farming isn’t. In civilisation, you don’t need to outwit lions and antelopes, or memorise every fruit tree and watering hole within 1,000 square miles. Making and using bows and spears also requires fine motor control, coordination and the ability to track animals and trajectories – maybe the parts of our brains used for those things got smaller when we stopped hunting.

Or maybe living in a large society of specialists demands less brainpower than living in a tribe of generalists. Stone-age people mastered many skills – hunting, tracking, foraging for plants, making herbal medicines and poisons, crafting tools, waging war, making music and magic. Modern humans perform fewer, more specialised roles as part of vast social networks, exploiting the division of labour. In a civilisation, we specialise in a trade, then rely on others for everything else.

That being said, brain size isn’t everything: elephants and orcas have bigger brains than us, and Einstein’s brain was smaller than average. Neanderthals had brains comparable to ours, but more of the brain was devoted to sight and control of the body, suggesting less capacity for things like language and tool use. So how much the loss of brain mass affects overall intelligence is unclear. Maybe we lost certain abilities, while enhancing others that are more relevant to modern life. It’s possible that we’ve maintained processing power by having fewer, smaller neurons. Still, I worry about what that missing 10% of my grey matter did.

Curiously, domestic animals also evolved smaller brains. Sheep lost 24% of their brain mass after domestication; for cows, it’s 26%; dogs, 30%. This raises an unsettling possibility. Maybe being more willing to passively go with the flow (perhaps even thinking less), like a domesticated animal, has been bred into us, like it was for them.

Our personalities must be evolving too. Hunter-gatherers’ lives required aggression: they hunted large mammals, killed over partners and warred with neighbouring tribes. We get meat from a store and turn to police and courts to settle disputes. While war hasn’t disappeared, it now accounts for fewer deaths, relative to population, than at any time in history. Aggression, now a maladaptive trait, could be bred out.

Changing social patterns will also change personalities. Humans live in much larger groups than other apes: hunter-gatherers formed tribes of around 1,000, while today many of us live in vast cities of millions. In the past, our relationships were necessarily few, and often lifelong. Now we inhabit seas of people, moving often for work and forming thousands of relationships along the way, many of them fleeting and, increasingly, virtual. This world will push us to become more outgoing, open and tolerant. Yet navigating such vast social networks may also require that we become more willing to adapt ourselves to them – to be more conformist.

Not everyone is psychologically well-adapted to this existence. Our instincts, desires and fears are largely those of stone-age ancestors, who found meaning in hunting and foraging for their families, warring with their neighbours and praying to ancestor-spirits in the dark. Modern society meets our material needs well, but is less able to meet the psychological needs of our primitive caveman brains.

Perhaps because of this, increasing numbers of people suffer from psychological issues such as loneliness, anxiety and depression. Many turn to alcohol and other substances to cope. Selection against vulnerability to these conditions might improve our mental health, and make us happier as a species. But that could come at a price. Many great geniuses had their demons; leaders like Abraham Lincoln and Winston Churchill fought depression, as did scientists such as Isaac Newton and Charles Darwin, and artists like Herman Melville and Emily Dickinson. Some, like Virginia Woolf, Vincent Van Gogh and Kurt Cobain, took their own lives. Others – Billie Holiday, Jimi Hendrix and Jack Kerouac – were destroyed by substance abuse.

A disturbing thought is that troubled minds will be removed from the gene pool – but potentially at the cost of eliminating the sort of spark that created visionary leaders, great writers, artists and musicians. Future humans might be better adjusted – but less fun to party with and less likely to launch a scientific revolution: stable, happy and boring.

New species?

There were once nine human species; now it’s just us. But could new human species evolve? For that to happen, we’d need isolated populations subject to distinct selective pressures. Distance no longer isolates us, but reproductive isolation could theoretically be achieved by selective mating. If people were culturally segregated – marrying based on religion, class, caste, or even politics – distinct populations, even species, might evolve.

In The Time Machine, sci-fi novelist H.G. Wells saw a future where class created distinct species: the upper classes evolved into the beautiful but useless Eloi, and the working classes became the ugly, subterranean Morlocks – who revolted and enslaved the Eloi.

In the past, religion and lifestyle have sometimes produced genetically distinct groups, as seen, for example, in Jewish and Gypsy populations. Today, politics also divides us – could it divide us genetically? Liberals now move to be near other liberals, and conservatives to be near conservatives; many on the left won’t date Trump supporters, and vice versa.

Could this create two species, with instinctively different views? Probably not. Still, to the extent culture divides us, it could drive evolution in different ways, in different people. If cultures become more diverse, this could maintain and increase human genetic diversity.

Strange New Possibilities

So far, I’ve mostly taken a historical perspective, looking back. But in some ways, the future might be radically unlike the past. Evolution itself has evolved.

One of the more extreme possibilities is directed evolution, where we actively control our species’ evolution. We already breed ourselves when we choose partners with appearances and personalities we like. For thousands of years, hunter-gatherers arranged marriages, seeking good hunters for their daughters. Even where children chose their own partners, men were generally expected to seek the approval of the bride’s parents. Similar traditions survive elsewhere today. In other words, we breed our own children.

And going forward, we’ll do this with far more knowledge of what we’re doing, and more control over the genes of our progeny. We can already screen ourselves and embryos for genetic diseases. We could potentially select embryos for desirable genes, as we do with crops. Direct editing of the DNA of a human embryo has already been shown to be possible, but it seems morally abhorrent, effectively turning children into subjects of medical experimentation. And yet, if such technologies were proven safe, I could imagine a future where you’d be a bad parent not to give your children the best genes possible.

Computers also provide an entirely new selective pressure. As more and more matches are made on smartphones, we are delegating decisions about what the next generation looks like to the computer algorithms that recommend our potential partners. Digital code now helps choose what genetic code is passed on to future generations, just as it shapes what you stream or buy online. This might sound like dark science fiction, but it’s already happening: our genes are being curated by computer, just like our playlists. It’s hard to know where this leads, but I wonder if it’s entirely wise to turn the future of our species over to iPhones, the internet and the companies behind them.

Discussions of human evolution are usually backward looking, as if the greatest triumphs and challenges were in the distant past. But as technology and culture enter a period of accelerating change, our genes will too. Arguably, the most interesting parts of evolution aren’t life’s origins, dinosaurs, or Neanderthals, but what’s happening right now, our present – and our future.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image Credit: Pixabay


We need MD/MBEs not MD/MBAs

GUEST POST from Arlen Meyers, M.D.

The number of MD/MBAs graduating from medical schools continues to expand, with about 5% of the roughly 20,000 US medical school graduates now holding dual degrees. While in the past the idea was to gain the knowledge, skills, abilities and competencies to manage health services organizations, many are now doing it on the way to digital health startup land.

Almost all of the 38 osteopathic schools offer dual degree programs as well.

However, MBA programs are dwindling, and the ones that are still around are rethinking their value proposition and restructuring their curricula.

For example, business schools are racing to add concentrations in science, technology, engineering and math to their M.B.A. programs as they try to broaden their appeal to prospective students overseas who want to work in the U.S.

Several schools, including Northwestern’s Kellogg School of Management and North Carolina’s Kenan-Flagler Business School, have unveiled STEM-designated master’s in business degrees in recent months. The University of California Berkeley’s Haas School of Business recently reclassified its entire M.B.A. program as STEM.

But, BMETALS is the new STEM.

In my view, we are training too many MD/MBAs who don’t add value to the system, and many of these programs should be terminated or restructured.

  1. We don’t know how much value the graduates contribute to the sick care system.
  2. The programs are usually not domain specific. Some think that’s a good thing, encouraging exposure to how other industries have solved generic problems. Others feel sick-care is so unique that the lessons are not applicable.
  3. Medical students are already up to their waists in debt, most of which is taxpayer subsidized. Should additional debt be added to their student loans?
  4. Few of the programs address the needs of physician entrepreneurs.
  5. There are many substitutes for physician entrepreneurs around the world and US schools are no longer the mecca.
  6. Content has become generic and offered for free on the Internet.
  7. Connections are easy to make using social media.
  8. The MBA is losing credibility, given the large number of places that offer them, particularly those below the first-tier schools.
  9. Employers can see through the credentials.
  10. Costs continue to escalate and the programs do not accommodate the specific needs of busy clinician students.
  11. We need a thorough conversation about the policy wisdom of encouraging dual degrees, potentially sidetracking graduates into non-clinical roles when there is a global demand for clinicians.
  12. We need to track outcomes and roles of graduates to determine whether the dual degree adds value to the communities they are designed to serve and whether they are cost-effective in an era of skyrocketing student debt.

In addition, there is a difference between having knowledge, skills, abilities and competencies in the business of medicine, health systems science, health service organization management, leadership and leaderpreneurship, and entrepreneurship/intrapreneurship. There is a confusing array of dual degree programs, leaving students scratching their heads and, in many instances, wasting their time and money.

Also, more medical students are jumping ship to pursue non-clinical careers. While the numbers may be a small portion of the roughly 20,000 first-year US medical students, the trend is evident.

Instead, we should consider re-shuffling the deck and offer a new combined MBE (Masters in Bioinnovation and Entrepreneurship) degree or dual MD/MBE or PhD/MBE program.

According to Prof. Varda Liberman, the new Provost of Reichman University and Head of the MBA in Healthcare Innovation, “Healthcare systems are going through enormous changes worldwide and with the COVID-19, these changes were accelerated. There is an immediate need for a complete redesign that will necessitate innovative multidisciplinary solutions, leveraging technology, science, information systems, and national policy. Our MBA program in Healthcare Innovation, offered by Reichman University, in collaboration with Israel’s largest hospital, the Sheba Medical Center, Tel Hashomer, is designed to prepare the future leaders of the healthcare industry to develop solutions that will enable the needed redesign. The program brings together all the unique advantages of Israeli innovation, to provide our students with the tools and skills necessary to understand the complexity of the healthcare industry today. The program brings together all the key players of the ecosystem – those coming from the healthcare system, engineering, entrepreneurship, AI, law, biomedicine, pharmacology, high tech, investment, management, and public policy”.

Here’s how it would work:

  1. A four-year program combining two years in medical school and two years in an MBE program, patterned after Professional Science Master’s programs.
  2. The medical school curriculum would be separate and distinct from that offered to medical students interested in practicing medicine. Among other topics, we would teach sales.
  3. Clinical rotations should start on day one, intended to instill an entrepreneurial mindset and emphasize being a problem seeker, not a problem solver, at this stage.
  4. Interdisciplinary education with experiential learning in project teams that includes business, science, engineering, law and other health professionals.
  5. Experiential learning and a mandatory internship with a local, national or international company in biopharma, medtech or digital health.
  6. A new tuition and funding structure, possibly run by private equity or medical technology companies who sponsor applicants. The present medical education business model won’t work if it depends on short-term revenue from putting butts in seats.
  7. Project teams would be offered proof-of-concept funding and I-Corps team support.
  8. Domain experts would work with project teams.
  9. Each student would be assigned an entrepreneur mentor throughout the program.
  10. Social biomedical entrepreneurship and ethics would be core streams throughout training. Those interested in creating non-profits or going into public service might be candidates for tuition deferral or waiver.

Another alternative is to make medical school three years instead of four and offer a one-year track in biomedical and clinical entrepreneurship.

The good news for educators is that you don’t need to start from scratch. Karolinska beat you to the punch.

The purpose of the degree program is to provide students with the knowledge, skills and abilities they need to lead global biomedical innovation. Here’s what the curriculum would include:

  1. Building Biotechnology: Introduction to biomedical entrepreneurship
  2. Regulatory Affairs and Reimbursement
  3. Life Science Intellectual Property
  4. International (Bio) Business
  5. Biotech law and ethics
  6. Internship
  7. International trip
  8. Device and digital health entrepreneurship
  9. Leading high performance teams
  10. Bioentrepreneurial finance
  11. Drug discovery and development
  12. Care delivery entrepreneurship
  13. Social entrepreneurship
  14. Electives in other aspects of entrepreneurship

The David Eccles School of Business at the University of Utah is taking its top 10 ranked program for entrepreneurship to new heights with a master’s degree designed for serious entrepreneurs.

The degree is called the Master of Business Creation (MBC), and it’s the first of its kind.

Applicants must be full-time entrepreneurs who want to create, launch and scale a new business, who want more than the 9-to-5 job, who have the drive to overcome the impossible, who want to build their knowledge while doing, and who are willing to put in the hours to make it happen.

We don’t need more physician administrators. We need more physician innovators and entrepreneurs who can lead us out of our sick care mess and close global health outcome disparities. While I believe the optimal career track involves a reasonable time practicing clinical medicine, students are thinking otherwise. Those who do need a new path to creating the future, and medical and business educators need to create educational products that meet their needs.

Image credit: Pixabay


Innovation and the Scientific Method

GUEST POST from Jesse Nieminen

Most large organizations are led and managed very systematically, and they pride themselves on that. Managers and leaders within those organizations are usually smart, educated, and want to make data-driven, evidence-based decisions.

However, when it comes to innovation, that can be part of the problem, as Clayton Christensen famously pointed out.

Many leaders these days are well aware of the problem, but even if they are, they may still have a hard time leading innovation because the approach is so different from what most of them are used to in their day-to-day. The mindset, mental models and frameworks needed are just fundamentally different.

So, to get it right, you need to pick the right frameworks and mental models and use them to guide both your own thinking and your team’s. Because innovation has become such a hot topic, there’s been an explosion in the number of these frameworks. So, how do you know which ones to adopt?

Well, in these situations, it’s often beneficial to take a step back and go to the roots of the phenomenon to figure out what the timeless fundamentals are, and what’s just part of the latest fad.

So, in this article, we’ll look at arguably the oldest innovation framework in the world, the scientific method. We’ll first explore the concept and briefly compare it to more modern frameworks, and then draw some practical takeaways from the exercise.

What is the scientific method and how does it relate to innovation?

Most of us probably remember hearing about the scientific method, and it’s generally seen as the standard for proving a point and for exploring new phenomena. Having said that, given that there still isn’t a clear consensus on what the scientific method actually is, even to this day, it’s probably a good idea to explore the term.

The scientific method is a systematic, iterative, and primarily empirical method of acquiring knowledge.

Some of the key ideas behind the scientific method actually date back to ancient times and several different cultures, perhaps most famously to Ancient Greece. The initial principles evolved gradually throughout the years, but it took until the Enlightenment before the term “scientific method” began to be used, and these principles became popularized.

With that background, we can safely call the scientific method the oldest innovation framework in the world. After all, applying this method is where most of the big technological innovations and breakthroughs we now know and benefit from every day have come from throughout history.

But enough about history, what does the process actually look like? Well, as mentioned, that depends on whom you ask, but the key principles everyone agrees on are that it is a systematic, iterative, and primarily empirical method of acquiring knowledge.

Again, there’s no consensus on the exact steps used in the process, and there are also minor variances in terminology, but the four steps that practically every version seems to share can be seen in the chart below.

Scientific Method Chart
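To make the loop concrete, here is a minimal sketch in Python of that hypothesise–test–analyse–refine cycle. It is purely illustrative: the function names, the toy "pricing experiment" and all the numbers are invented placeholders for this article, not part of the scientific method itself or of any framework mentioned here.

```python
# A minimal, illustrative sketch of the scientific method as an iterative
# learn-and-refine loop. All names and numbers below are hypothetical.

def iterate_method(hypothesis, run_experiment, analyse, refine, max_cycles=10):
    """Hypothesise -> test empirically -> analyse -> refine, until the evidence holds up."""
    for cycle in range(1, max_cycles + 1):
        evidence = run_experiment(hypothesis)      # empirical test of the current guess
        if analyse(hypothesis, evidence):          # does the evidence support the hypothesis?
            return hypothesis, cycle               # tentatively accepted, never "proven"
        hypothesis = refine(hypothesis, evidence)  # learn from the miss and try again
    return hypothesis, max_cycles                  # best current explanation so far


# Toy usage: probing the highest price at which a prototype still sells.
TRUE_CEILING = 42  # unknown to the "researcher"; defined only so the toy experiment can run

best_price, cycles_used = iterate_method(
    hypothesis=100,                                      # initial assumption: customers pay 100
    run_experiment=lambda price: price <= TRUE_CEILING,  # "did customers actually buy?"
    analyse=lambda price, bought: bought,                # does the evidence support the price?
    refine=lambda price, bought: price - 10,             # lower the price and test again
)
print(best_price, cycles_used)  # -> 40 7
```

The point of the sketch is simply that the answer emerges from a series of cheap experiments rather than from a single up-front plan – the same principle the rest of this article applies to innovation.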

While traditionally the scientific method has been used primarily for basic research, it’s been the inspiration for many recent, popular processes and frameworks around business innovation.

Just look at Lean Startup, Design Thinking, Growth Hacking, Discovery Driven Growth, and the list goes on.

At a high level, most of these are very similar to the scientific method, just applied to a more specific domain and paired with practical guidelines for using the method in that context.

With so many similarities, there’s clearly something there that’s worth paying attention to. Let’s next dive deeper to understand why that is the case.

Why are the frameworks so similar?

By definition, innovation is about creating and introducing something new. Sometimes that can mean small, incremental changes, but often we’re talking about something much bigger.

And, in today’s globalized, hyperconnected and rapidly moving world, a lot of volatility, uncertainty, complexity and ambiguity (VUCA) will always be involved, especially when you’re moving into these uncharted waters.

This leads to two fundamental problems:

  • You usually can’t have all the information before making a decision
  • Whatever plans and assumptions you initially make will likely be wrong

What that in turn means is that many of the practices and frameworks leaders have applied for years in managing people and projects, as they’ve risen through the ranks of the business, will not be applicable here. In fact, they can even be counterproductive, as we pointed out in the introduction. Some leaders have a hard time accepting this and adapting to the new reality, and that usually doesn’t end well.

Humility and pragmatism are key for innovation

On the other hand, some leaders who have realized this have decided to go to the other extreme. They’ve heard stories of great visionaries and innovators who had a dream of the future and simply refused to take no for an answer. While there is a lot to like in that approach, the mistake that often happens is that once these leaders embark on that journey, they refuse to adapt their vision to meet reality.

Finding the right balance is always tricky, but what helps is adopting the iterative, exploratory and empirical approach of the scientific method and the other frameworks and processes we mentioned before.

This doesn’t mean that it would be a free-for-all – on the contrary. These processes are in fact systematic and usually quite structured.

The purpose of the scientific method is to create structure and understanding from what seems like an incomprehensible mess.

To put it in another way, the purpose of the scientific method is actually to create structure and understanding from what initially seems like an incomprehensible mess – and that is the foundation that most great innovations are built on.

What can we learn from that?

Let’s now reflect on what that means for the day-to-day job of innovators and leaders managing innovation.

For me, it essentially boils down to three main takeaways. We’ll next cover each of them briefly.

Innovation is a learning process, just like the scientific method

As we just covered, most innovation processes abide by the same key principles as the scientific method. They are iterative, empirical, and exploratory. But they are also systematic, evidence-based, and most importantly, focused on learning and solving problems.

With innovation, your first priority is always to be skeptical of your initial plan and to question your assumptions. When you do that and look at the data objectively to try to figure out how and why things work the way they do, you’ll unlock a deeper level of understanding – and that level of understanding is what can help you solve problems and create better innovations that make a real difference for your customers and your organization.

To sum up, when you’re trying to build the future, don’t assume you’re right. Instead, ask how you’re wrong, and why. Often the hardest part about learning is to unlearn what you’ve previously learned. This is what’s often referred to as first principles thinking.

“Trying things out” isn’t unscientific or non-evidence-based

We still see leaders in many organizations struggle to admit that they, either as a leader or as an organization, don’t know something.

There’s often resistance to admitting a lack of understanding and to “trying things out” because those are seen as amateurish, unscientific or non-evidence-based approaches. Rational leaders naturally want to do their homework before choosing a direction or committing significant resources to an initiative.

The scientific method is about learning

However, with innovation, often doing your homework properly means that you understand that you don’t know all the answers and need to figure out a way to find out those answers instead of just trusting your gut or whatever market research you might have been able to scrape together.

“Trying things out” is how more or less every meaningful innovation has ever been created. By definition, there’s always an amount of trial and error involved in that process.

So, if you recognize yourself struggling to embrace the uncertainty, take a hard look in the mirror, be more pragmatic and have the courage to make yourself vulnerable. If you have the right talent in your team, being vulnerable is actually a great way to gel the team together and improve performance.

On the other hand, if you understand all of this but your boss doesn’t, it might be a good idea to politely remind them of how the scientific method works. While it’s not a silver bullet guaranteed to convert everyone into a believer at once, I’ve found this to be a good way to remind leaders how science and progress really get made.

Essentially, you need to convince them that you know what you’re doing and have a rational, evidence-based plan purpose-built to combat the VUCA we already talked about.

It requires a different management style

As you’ve probably come to understand by now, all of that requires a very different style of management than what most managers and leaders are used to.

To make innovation happen in an organization, leaders do need to provide plenty of structure and guidance to help their teams and employees operate effectively. Without that structure and guidance, which good innovation processes naturally help provide, you’re essentially just hoping for the best which isn’t exactly an ideal strategy.

However, managing innovation is more about setting direction and goals, questioning assumptions, as well as removing obstacles and holding people accountable, than it is about the way most people have learned to manage as they’ve risen in the ranks, which is by breaking a project or goal into pre-defined tasks and then simply delegating those down in the organization.

The traditional approach works well when you have a straightforward problem to solve, or job to accomplish, even if it’s a big and complicated project like building a bridge. These days, the laws of physics related to that are well understood. But if you’re entering a new market or innovating something truly novel, the dynamics probably won’t be as clear.

Building bridges is complicated, not complex

Also, when it comes to capital allocation for innovation, you can certainly try to create a business plan with detailed investment requirements and a thorough project plan, along with precise estimates for payback times. But because the odds are that not all of your assumptions will be right, that plan is likely to do more harm than good.

Instead, it’s usually better to allocate capital more dynamically, in smaller tranches, even if your goals are big. This can help you stay grounded and focus work on solving the next few problems and making real progress, instead of executing on a grandiose plan built on a shaky or non-existent foundation.

Conclusion

The scientific method is arguably the oldest innovation framework in the world. While it has naturally evolved, it’s largely stood the test of time.

The scientific method has allowed mankind to significantly accelerate our pace of innovation, and as an innovator, you’d be wise to keep the key principles of the method in mind and introduce processes that institutionalize these within your organization.

Innovation is an iterative process of learning and solving problems, and succeeding at it takes a lot of humility, pragmatism, and even vulnerability. With innovation, you just can’t have all the answers beforehand, nor can you get everything right on the first try.

When you’ve been successful in your career, it’s sometimes easy to forget all of that. So, make sure to remind yourself, and the people you work with, of these principles every now and then.

Fortunately, there’s nothing quite like putting your most critical assumptions to the test and learning from the experiment to bring you down to earth and remind yourself of the realities!

This article was originally published in Viima’s blog.

Image credit: Unsplash, Viima
