Category Archives: Innovation

Sometimes to Innovate You Must Do the Following


GUEST POST from Mike Shipulski

What it takes to do new work:

Confidence to get it wrong and confidence to do it early and often.

Purposeful misuse of worst practices in a way that makes them the right practices.

Tolerance for not knowing what to do next and tolerance for those uncomfortable with that.

Certainty that they’ll ask for a hard completion date and certainty you won’t hit it.

Knowledge that the context is different and knowledge that everyone still wants to behave like it’s not.

Disdain for best practices.

Discomfort with success because it creates discomfort when it’s time for new work.

Certainty you’ll miss the mark and certainty you’ll laugh about it next week.

Trust in others’ bias to do what worked last time and trust that it’s a recipe for disaster.

Belief that successful business models have half-lives and belief that no one else does.

Trust that others will think nothing will come of the work and trust that they’re likely right.

Image credit: Unsplash


False Choice – Founder versus Manager


GUEST POST from Robyn Bolton

Paul Graham, cofounder of Y Combinator, was so inspired by a speech by Airbnb cofounder and CEO Brian Chesky that he wrote an essay challenging the well-intentioned advice that, to scale a business, founders must shift modes and become managers.

It went viral. 

In the essay, he argued that:

In effect there are two different ways to run a company: founder mode and manager mode. Till now most people even in Silicon Valley have implicitly assumed that scaling a startup meant switching to manager mode. But we can infer the existence of another mode from the dismay of founders who’ve tried it, and the success of their attempts to escape from it.

With curiosity and an open mind, I read on.

I finished with a deep sigh and an eye roll. 

This is why.

Manager Mode: The realm of liars and professional fakers

On the off chance that you thought Graham’s essay would be a balanced and reflective examination of management styles in different corporate contexts, his description of Manager Mode should relieve you of that thought:

The way managers are taught to run companies seems to be like modular design in the sense that you treat subtrees of the org chart as black boxes. You tell your direct reports what to do, and it’s up to them to figure out how. But you don’t get involved in the details of what they do. That would be micromanaging them, which is bad.

Hire good people and give them room to do their jobs. Sounds great when it’s described that way, doesn’t it? Except in practice, judging from the report of founder after founder, what this often turns out to mean is: hire professional fakers and let them drive the company into the ground.

Later, he writes that founders are gaslit into adopting Manager Mode from every angle: "VCs who haven't been founders themselves don't know how founders should run companies, and C-level execs, as a class, include some of the most skillful liars in the world."

Founder Mode: A meritocracy of lifelong learners

For Graham, Founder Mode boils down to two things:

  1. Sweating the details
  2. Engaging with employees throughout the organization beyond just direct reports.  He cites Steve Jobs’ practice of holding “an annual retreat for what he considered the 100 most important people at Apple, and these were not the 100 people highest on the org chart.”

To his credit, Graham acknowledges that getting involved in the details is micromanaging, “which is bad,” and that delegation is required because “founders can’t keep running a 2000 person company the way they ran it when it had 20.” A week later, he acknowledged that female founders “don’t have permission to run their companies in Founder Mode the same way men can.”

Yet he persists in believing that Founder Mode, not Manager Mode, is critical to success:

“Look at what founders have achieved already, and yet they’ve achieved this against a headwind of bad advice. Imagine what they’ll do once we can tell them how to run their companies like Steve Jobs instead of John Sculley.”

Leader Mode: Manager Mode + Founder Mode

The essay is interesting, but I have real issues with two of his key points:

  • Professional managers are disconnected from the people and businesses they manage, and as a result, their practices and behaviors are inconsistent with startup success.
  • Founders should ignore conventional wisdom and micromanage to their heart’s content.

Most “professional managers” I’ve met are deeply connected to the people they manage, committed to the businesses they operate, and act with integrity and authenticity. They are a far cry from the “professional fakers” and “skillful liars” Graham describes.

Most founders I’ve met should not be allowed near the details once they have a team in place. Their meddling, need for control, and soul-crushing FOMO (Fear of Missing Out) lead to chaos, burnout, and failure.

The truth is, it’s contextual.  The leaders I know switch between Founder and Manager mode based on the context.  They work with the passion of founders, trust with the confidence of managers, and are smart and humble enough to accept feedback when they go too far in one direction or the other.

Being both manager and founder isn’t just the essence of being a leader. It’s the essence of being a successful corporate innovator.  You are a founder,  investing in, advocating for, and sweating the details of ambiguous and risky work.  And you are a manager navigating the economic, operational, and political minefields that govern the core business and fund your paycheck and your team.

Image credit: Pexels


Innovation is Combination

Silicon Valley’s Innovator’s Dilemma – The Atom, the Bit and the Gene


GUEST POST from Greg Satell

Over the past several decades, innovation has become largely synonymous with digital technology. When the topic of innovation comes up, somebody points to a company like Apple, Google or Meta rather than, say, a car company, a hotel or a restaurant. Management gurus wax poetic about the "Silicon Valley way."

Of course, that doesn't mean that other industries haven't been innovative. In fact, there is no shortage of excellent examples of innovation in cars, hotels, restaurants and many other things. Still, the fact remains that for most of recent memory digital technology has moved further and faster than anything else.

This has been largely due to Moore’s Law, our ability to consistently double the number of transistors we’re able to cram onto a silicon wafer. Now, however, Moore’s Law is ending and we’re entering a new era of innovation. Our future will not be written in ones and zeros, but will be determined by our ability to use information to shape the physical world.

The Atom

The concept of the atom has been around at least since the time of the ancient Greek philosopher Democritus. Yet it didn't take on any real significance until the early 20th century. In fact, Albert Einstein's doctoral dissertation, along with his statistical analysis of Brownian motion, helped to establish the physical existence of atoms.

Yet it was the other papers from Einstein’s miracle year of 1905 that transformed the atom from an abstract concept to a transformative force, maybe even the most transformative force in the 20th century. His theory of mass-energy equivalence would usher in the atomic age, while his work on black-body radiation would give rise to quantum mechanics and ideas so radical that even he would refuse to accept them.

Ironically, despite Einstein’s reluctance, quantum theory would lead to the development of the transistor and the rise of computers. These, in turn, would usher in the digital economy, which provided an alternative to the physical economy of goods and services based on things made from atoms and molecules.

Still, the vast majority of what we buy is made up of what we live in, ride in, eat and wear. In fact, information and communication technologies only make up about 6% of GDP in advanced countries, which is what makes the recent revolution in materials science so exciting. We're beginning to exponentially improve the efficiency of how we design the materials that make up everything from solar panels to buildings.

The Bit

While the concept of the atom evolved slowly over millennia, the bit is one of the rare instances in which an idea seems to have arisen in the mind of a single person with little or no real precursor. Introduced by Claude Shannon in a paper in 1948—incidentally, the same year the transistor was invented—the bit has shaped how we see and interact with the world ever since.

The basic idea was that information isn't a function of content, but of the absence of ambiguity, which can be broken down to a single unit – a choice between two alternatives. Much like a coin toss, which lacks information while in the air but takes on certainty when it lands, information arises when ambiguity disappears.

He called this unit a "binary digit," or "bit," and much like the pound, quart, meter or liter, it has become such a basic unit of measurement that it's hard to imagine our modern world without it. Shannon's work would soon combine with Alan Turing's concept of a universal computer to create the digital computer.
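As a side note not in the original essay, Shannon's measure can be written in one line. The standard formula for the information (entropy) of a source with outcome probabilities p_i, and its value for a fair coin toss, is:

```latex
% Shannon entropy, measured in bits
H = -\sum_i p_i \log_2 p_i

% A fair coin: two equally likely alternatives
H_{\text{coin}} = -\left( \tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2} \right) = 1 \text{ bit}
```

In other words, resolving one fully ambiguous choice between two equally likely alternatives yields exactly one bit, which is the sense in which the landing coin "creates" information.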

Now the digital revolution is ending and we will soon be entering a heterogeneous computing environment that will include things like quantum, neuromorphic and biological computing. Still, Claude Shannon’s simple idea will remain central to how we understand how information interacts with the world it describes.

The Gene

The concept of the gene was first discovered by an obscure Austrian monk named Gregor Mendel, but in one of those strange peculiarities of history, his work went almost totally unnoticed until the turn of the century. Even then, no one really knew what a gene was or how they functioned. The term was, for the most part, just an abstract concept.

That changed abruptly when James Watson and Francis Crick published their landmark 1953 article in the scientific journal Nature. In a single stroke, the pair were able to show that genes were, in fact, made up of a molecule called DNA and that they operated through a surprisingly simple code made up of A, T, C and G.

Things really began to kick into high gear when the Human Genome Project was completed in 2003. Since then the cost to sequence a genome has been falling faster than the rate of Moore's Law, which has unleashed a flurry of innovation. Jennifer Doudna's 2012 CRISPR breakthrough revolutionized our ability to edit genes. More recently, mRNA technology helped develop COVID-19 vaccines in record time.

Today, we have entered a new era of synthetic biology in which we can manipulate the genetic code of A, T, C and G almost as easily as we can the bits in the machines that Turing imagined all those years ago. Researchers are also exploring how we can use genes to create advanced materials and maybe even better computers.

Innovation Is Combination

The similarity of the atom, the bit and the gene as elemental concepts is hard to miss and they’ve allowed us to understand our universe in a visceral, substantial way. Still, they arose in vastly different domains and have been largely applied to separate and distinct fields. In the future, however, we can expect vastly greater convergence between the three.

We’ve already seen glimpses of this. For example, as a graduate student Charlie Bennett was a teaching assistant for James Watson. Yet in between his sessions instructing undergraduates in Watson’s work on genes, he took an elective course on the theory of computing in which he learned about the work of Shannon and Turing. That led him to go work for IBM and become a pioneer in quantum computing.

In much the same way, scientists are applying powerful computers to develop new materials and design genetic sequences. Some of these new materials will be used to create more powerful computers. In the future, we can expect the concepts of the atom, the bit and the gene to combine and recombine in exciting ways that we can only begin to imagine today.

The truth is that innovation is combination, and it always has been. The past few decades, in which one technology so thoroughly dominated that it was able to function largely in isolation from other fields, were an anomaly. What we are beginning to see now is, in large part, a reversion to the mean, where the most exciting work will be interdisciplinary.

This is Silicon Valley’s innovator’s dilemma. Nerdy young geeks will no longer be able to prosper coding blithely away in blissful isolation. It is no longer sufficient to work in bits alone. Increasingly we need to combine those bits with atoms and genes to create significant value. If you want to get a glimpse of the future, that’s where to look.

— Article courtesy of the Digital Tonto blog
— Image credits: Pixabay







The Runaway Innovation Train


GUEST POST from Pete Foley

In this blog, I return to and expand on a paradox that has concerned me for some time. Are we getting too good at innovation, and is it in danger of getting out of control? That may seem like a strange question for an innovator to ask. But innovation has always been a two-edged sword. It brings huge benefits, but also commensurate risks.

Ostensibly, change is good. Because of technology, today we mostly live more comfortable lives, and enjoy better health, greater longevity, and more leisure and abundance than our ancestors did.

Exponential Innovation Growth: The pace of innovation is accelerating. It may not exactly mirror Moore's Law, and of course, innovation is much harder to quantify than transistors. But the general trend in innovation and change approximates exponential growth. The human Stone Age lasted about 300,000 years before ending in about 3,000 BC with the advent of metalworking. The culture of the Egyptian Pharaohs lasted 30 centuries. It was certainly not without innovations, but by modern standards, things changed very slowly. My mum recently turned 98 years young, and the pace of change she has seen in her lifetime is staggering by comparison to the past: literally from horse-drawn carts delivering milk when she was a child in poor SE London, to today's world of self-driving cars and the exploration of our solar system and beyond. And with AI, quantum computing, fusion, gene manipulation, manned interplanetary spaceflight, and even advanced behavior manipulation all jockeying for position in the current innovation race, it seems highly likely that those living today will see even more dramatic change than my mum has experienced.

The Dark Side of Innovation: While accelerated innovation is probably beneficial overall, it is not without its costs. For starters, while humans are natural innovators, we are also paradoxically change averse. Our brains are configured to manage more of our daily lives around habits and familiar behaviors than new experiences. It simply takes more mental effort to manage new stuff than familiar stuff. As a result we like some change, but not too much, or we become stressed. At least some of the burgeoning mental health crisis we face today is probably attributable to the difficulty we have adapting to so much rapid change and new technology on multiple fronts.

Nefarious Innovation: And of course, new technology can be used for nefarious as well as noble purposes. We can now kill our fellow humans far more efficiently, and far more remotely, than our ancestors ever dreamed of. The internet gives us unprecedented access to both information and connectivity, but is also a source of misinformation and manipulation.

The Abundance Dichotomy: Innovation increases abundance, but it's arguable whether that actually makes us happier. It gives us more, but paradoxically brings greater inequalities in the distribution of the 'wealth' it creates. Behavior science has shown us consistently that humans make far more relative than absolute judgments. Being better off than our ancestors actually doesn't do much for us. Instead we are far more interested in being better off than our peers, neighbors or the people we compare ourselves to on Instagram. And therein lies yet another challenge. Social media means we now compare ourselves to far more people than past generations did, meaning that the standards we judge ourselves against are higher than ever before.

Side Effects and Unintended Consequences: Side effects and unintended consequences are perhaps the most difficult challenge we face with innovation. As the pace of innovation accelerates, so does the build-up of side effects, and problematically, these often lag our initial innovations. All too often, we only become aware of them when they have already become a significant problem. Climate change is of course the poster child for this, as a huge unanticipated consequence of the industrial revolution. The same applies to pollution. But as innovation accelerates, the unintended consequences it brings are also stacking up. The first generations of 'digital natives' are facing unprecedented mental health challenges. Diseases are becoming resistant to antibiotics, while population density is leading to an increased rate of new disease emergence. Agricultural efficiency has created monocultures that are inherently more fragile than the more diverse supply chains of the past. Longevity is putting enormous pressure on healthcare.

The More We Innovate, the Less We Understand: And last, but not least, as innovation accelerates, we understand less about what we are creating. Technology becomes unfathomably complex and requires increasing specialization, which means few if any of us really understand the holistic picture. Today we are largely going full speed ahead with AI, quantum computing, genetic engineering, and more subtle, but equally perilous, experiments in behavioral and social manipulation. But we are doing so with an increasingly limited understanding of the direct, let alone unintended, consequences of these complex changes!

The Runaway Innovation Train: So should we back off and slow down? Is it time to pump the brakes? It's an odd question for an innovator, but it's likely a moot point anyway. The reality is that we probably cannot slow down, even if we want to. Innovation is largely a self-propagating chain reaction. All innovators stand on the shoulders of giants. Every generation builds on past discoveries, and this growing knowledge base inevitably leads to multiple further innovations. The connectivity and information access of the internet alone are driving today's unprecedented innovation, and AI and quantum computing will only accelerate this further. History is compelling on this point. Stone Age innovation was slow not because our ancestors lacked intelligence. To the best of our knowledge, they were neurologically the same as us. But they lacked the cumulative knowledge, and the network to access it, that we now enjoy. Even the smartest of us cannot go from inventing flint-knapping to quantum mechanics in a single generation. But, back to 'standing on the shoulders of giants', we can build on the cumulative knowledge assembled by those who went before us to continuously improve. And as that cumulative knowledge grows, more and more tools and resources become available, multiple insights emerge, and we create what amounts to a chain reaction of innovations. But the trouble with chain reactions is that they can be very hard to control.

Simultaneous Innovation: Perhaps the most compelling support for this inevitability of innovation lies in the pervasiveness of simultaneous innovation. How does human culture exist for 50,000 years or more and then 'suddenly' two people, Darwin and Wallace, come up with the theory of evolution independently and simultaneously? The same question applies to calculus (Newton and Leibniz), or the precarious proliferation of nuclear weapons and other assorted weapons of mass destruction. It's not coincidence, but simply reflects that once all of the pieces of a puzzle are in place, somebody, and more likely multiple people, will inevitably make the connections and see the next step in the innovation chain.

But as innovation expands like a conquering army on multiple fronts, more and more puzzle pieces become available, and more puzzles are solved. Unfortunately, the associated side effects and unanticipated consequences also build up, and my concern is that they can potentially overwhelm us. This is compounded because often, as in the case of climate change, dealing with side effects can be more demanding than the original innovation. And because they can be slow to emerge, they are often deeply rooted before we become aware of them. As we look forward, just taking AI as an example, we can already somewhat anticipate some worrying possibilities. But what about the surprises analogous to climate change that we haven't even thought of yet? I find it a sobering thought that we are attempting to create consciousness when, despite the efforts of numerous Nobel laureates over decades, we still have no idea what consciousness is. It's called the 'hard problem' for good reason.

Stop the World, I Want to Get Off: So why not slow down? There are precedents, in the form of nuclear arms treaties and a variety of ethically based constraints on scientific exploration. But regulations require everybody to agree and comply. Very big, expensive and expansive innovations are relatively easy to police. North Korea and Iran notwithstanding, there are fortunately not too many countries building nuclear capability, at least not yet. But a lot of emerging technology has the potential to require far less physical and financial infrastructure. Cybercrime, gene manipulation, crypto and many others can be carried out with smaller, more distributed resources, which are far more difficult to police. Even AI, which takes considerable resources to initially create, opens numerous doors for misuse that require far fewer resources.

The Atomic Weapons Conundrum: The challenge of getting bad actors to agree on regulation and constraint is painfully illustrated by the atomic bomb. The discovery of fission by Hahn and Strassmann in the late 1930s made the bomb inevitable. This set the stage for a race to turn theory into practice between the Allies and Nazi Germany. The Nazis were bad actors, so realistically our only option was to win the race. We did, but at enormous cost. Once the cat was out of the bag, we faced a terrible choice: create nuclear weapons, and the horror they represent, or choose to legislate against them, and in so doing cede that terrible power to the Nazis? Not an enviable choice.

Cumulative Knowledge: Today we face similar conundrums on multiple fronts. Cumulative knowledge will make it extremely difficult not to advance multiple, potentially perilous technologies. Countries that legislate against them risk either pushing the work underground, or falling behind and deferring to others. The recent open letter from Meta to the EU (https://euneedsai.com/), chastising it for the potential economic impacts of its AI regulations, may have dripped with self-interest. But that didn't make it wrong. Even if the EU slows down AI development, the pieces of the puzzle are already in place. Big corporations and less conservative countries will still pursue the upside, and risk the downside. The cat is very much out of the bag.

Muddling Through:  The good news is that when faced with potentially perilous change in the past, we’ve muddled through.  Hopefully we will do so again.   We’ve avoided a nuclear holocaust, at least for now.  Social media has destabilized our social order, but hasn’t destroyed it, yet.  We’ve been through a pandemic, and come out of it, not unscathed, but still functioning.  We are making progress in dealing with climate change, and have made enormous strides in managing pollution.

Chain Reactions: But the innovation chain reaction, and the impact of cumulative knowledge, mean that the rate of change will, in the absence of catastrophe, inevitably continue to accelerate. And as it does, so will the side effects, nefarious uses, mistakes and unintended consequences that derive from it. Key factors that have helped us in the past are time and resources, but as waves of innovation increase in both frequency and intensity, both are likely to be increasingly squeezed.

What can, or should, we do? I certainly don't have simple answers. We're all pretty good, although by definition far from perfect, at scenario planning and troubleshooting for our individual innovations. But the size and complexity of massive waves of innovation, such as AI, are obviously far more challenging. No individual or group can realistically understand or own all of the implications. But perhaps we as an innovation community should put more collective resources against trying? We'll never anticipate everything, and we'll still get blindsided. And putting resources against 'what if' scenarios is always a hard sell. But maybe we need to go into sales mode.

Can the Problem Become the Solution? Encouragingly, the same emerging technology that creates potential issues could also help us. AI and quantum computing will give us almost infinite capacity for computation and modeling. Could we collectively assign more of that emerging resource against predicting and managing its own risks?

With many emerging technologies, we are now where we were in the 1900s with climate change. We are implementing massive, unpredictable change, and by definition have no idea what its unanticipated consequences will be. I personally think we'll deal with climate change. It's difficult to slow a leviathan that's been building for over a hundred years. But we've taken the important first steps in acknowledging the problem, and are beginning to implement corrective action.

But big issues require big solutions. Long-term, I personally believe the most important thing for humanity is to escape the gravity well. Given the scale of our ability to create global change, interplanetary colonization is not a luxury, but an essential. Climate change is a shot across the bow with respect to how fragile our planet is, and how big our (unintended) influence can be. We will hopefully manage that, and avoid nuclear war or synthetic pandemics for long enough to achieve it. But ultimately, humanity needs the insurance that dispersed planetary colonization will provide.

Image credits: Microsoft Copilot


Why Modifying This One Question Changes Everything


GUEST POST from Robyn Bolton

You know that asking questions is essential.  After all, when you’re innovating, you’re doing something new, which means you’re learning, and the best way to learn is by asking questions.  You also know that asking genuine questions, rather than rhetorical or weaponized ones, is critical to building a culture of curiosity, exploration, and smart risk-taking.  But did you know that making a small change to a single question can radically change everything for your innovation strategy, process, and portfolio?

What is your hypothesis?

Before Lean Startup, there was Discovery-Driven Planning. First proposed by Columbia Business School professor Rita McGrath and Wharton School professor Ian MacMillan in their 1995 HBR article, it outlines a "planning" approach that acknowledges and embraces assumptions (instead of pretending that they're facts) and relentlessly tests them to uncover new data and inform and update the plan.

It’s the scientific method applied to business.

How confident are you?

However, not all assumptions or hypotheses are created equal.  This was the assertion in the 2010 HBR article “Beating the Odds When You Launch a New Venture.”  Using examples from Netflix, Johnson & Johnson, and a host of other large enterprises and scrappy startups, the authors encourage innovators to ask two questions about their assumptions:

  1. How confident am I that this assumption is true?
  2. What is the (negative) impact on the idea if the assumption is false?

By asking these two questions of every assumption, the innovator sorts assumptions into three categories (a rough sketch of this sorting follows the list):

  1. Deal Killers: Assumptions that, if left untested, threaten the idea’s entire existence
  2. Path-dependent risks: Assumptions that impact the strategic underpinnings of the idea and cost significant time and money to resolve
  3. High ROI risks: Assumptions that can be quickly and easily tested but don’t have a significant impact on the idea’s strategy or viability
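To make that sorting concrete, here is a minimal sketch in Python. The two questions and the three category names come from the article; the 1-to-5 scales, the thresholds, and the example assumptions are illustrative assumptions of mine, not the authors' published method.

```python
# Illustrative sketch only: the scales, thresholds, and examples are assumptions,
# not the authors' method.

def categorize_assumption(confidence: int, impact_if_false: int) -> str:
    """Sort an assumption using the two questions above.

    confidence      -- How confident am I that this assumption is true? (1 = low, 5 = high)
    impact_if_false -- What is the negative impact on the idea if it's false? (1 = low, 5 = high)
    """
    if impact_if_false >= 4 and confidence <= 2:
        return "Deal Killer"            # untested, it threatens the idea's existence
    if impact_if_false >= 4:
        return "Path-dependent risk"    # shapes the strategy; costly to resolve
    return "High ROI risk"              # quick and cheap to test, limited downside


# Hypothetical assumptions for a new venture: (confidence, impact_if_false)
assumptions = {
    "Customers will pay a monthly subscription": (2, 5),
    "Retail partners will stock the product": (4, 4),
    "The onboarding email improves activation": (3, 2),
}

for name, (confidence, impact) in assumptions.items():
    print(f"{categorize_assumption(confidence, impact):20s} <- {name}")
```

Swapping the confidence question for the "how much would you bet?" question described later in this piece changes only how the confidence score is elicited; the sorting logic stays the same.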

However, human beings have a long and inglorious history of overconfidence. This well-established bias, in which our confidence in our judgments exceeds their objective (data-based) accuracy, resulted in disasters like Chernobyl, the sinking of the Titanic, the losses of the Space Shuttles Challenger and Columbia, and the Titan submersible implosion.

Let’s not add your innovation to that list.

How much of your money are you willing to bet?

For years, I’ve worked with executives and their teams to adopt Discovery-Driven Planning and focus their earliest efforts on testing Deal Killer assumptions. I was always struck by how confident everyone was and rather dubious when they reported that they had no Deal Killer assumptions.

So, I changed the question.

Instead of asking how confident they were, I asked how much they would bet. Then I made it personal—high confidence meant you were willing to bet your annual income, medium confidence meant dinner for the team at a Michelin-starred restaurant, and low confidence meant a cup of coffee.

Suddenly, people weren’t quite so confident, and there were A LOT of Deal Killers to test.

Make it Personal

It’s easy to become complacent in companies.  You don’t get paid more if you come in under budget, and you don’t get fired if you overspend.  Your budget is a rounding error in the context of all the money available to the company.  And your signing authority is probably a rounding error on the rounding error that is your budget.  So why worry about ten grand here and a hundred grand there?

Because neither you, your team, nor your innovation efforts have the luxury of complacency.

Innovation is always under scrutiny.  People expect you to generate results with a fraction of the resources in record time.  If you don’t, you, your team, and your budget are the first to be cut.

The business of innovation is personal.  Treat it that way. 

How much of your time, money, and reputation are you willing to risk?  What do you need your team to risk in terms of their time, money, and professional aspirations?  How much time, money, and reputation are your stakeholders willing to risk?

The answers change everything.

Image credit: Pixabay


Triggering Radical Transformational Change


GUEST POST from Greg Satell

There’s an old adage that says we should never let a crisis go to waste. The point is that during a crisis there is a visceral sense of urgency and resistance often falls by the wayside. We certainly saw that during the COVID-19 pandemic. Digital technologies such as video conferencing, online grocery and tele-health have gone from fringe to mainstream in record time.

Seasoned leaders learn how to make good use of a crisis. Consider Bill Gates and his ‘Internet Tidal Wave‘ memo, which leveraged what could have been a mortal threat to Microsoft into a springboard to even greater dominance. Or how Steve Jobs used Apple’s near-death experience to reshape the ailing company into a powerhouse.

But what if we could prepare for a trigger before it happens? The truth is that indications of trouble are often clear long before the crisis arrives. Clearly, there were a number of warning signs that a pandemic was possible, if not likely. As every good leader knows, there’s never a shortage of looming threats. If we learn to plan ahead, we can make a crisis work for us.

The Plan Hatched in a Belgrade Cafe

In the fall of 1998, five young activists met in a coffee shop in Belgrade, Serbia. Although still in their twenties, they were already grizzled veterans. In 1992, they took part in student protests against the war in Bosnia. In 1996, they helped organize a series of rallies in response to Slobodan Milošević’s attempt to steal local elections.

To date, their results were decidedly mixed. The student protests were fun, but when the semester ended, everyone went home for the summer and that was the end of that. The 1996 protests were more successful, overturning the fraudulent results, but the opposition coalition, called “Zajedno,” soon devolved into infighting.

So they met in the coffee shop to discuss their options for the upcoming presidential election to be held in 2000. They knew from experience that they could organize rallies effectively and get people to the polls. They also knew that when they got people to the polls and won, Milošević would use his power and position to steal the election.

That would be their trigger.

The next day, six friends joined them and they called their new organization Otpor. Things began slowly, with mostly street theatre and pranks, but within 2 years their ranks had swelled to more than 70,000. When Milošević tried to steal the election they were ready and what is now known as the Bulldozer Revolution erupted.

The Serbian strongman was forced to concede. The next year, Milošević would be arrested and sent to The Hague for his crimes against humanity. He would die in his prison cell in 2006, before his trial concluded.

Opportunity From the Ashes

In 2014, in the wake of the Euromaidan protests that swept the thoroughly corrupt autocrat Viktor Yanukovych from power, Ukraine was in shambles. Looted of roughly $100 billion (about the size of the country's entire GDP) and invaded by Russia, the country faced a bleak outlook. Without western aid, the proud nation's very survival was in doubt.

Yet for Vitaliy Shabunin and the Anti-Corruption Action Center, it was a moment they had been waiting for. Shabunin had established the organization with his friend Dasha Kaleniuk a few years earlier. Since then they, along with a small staff, had been working with international NGOs to document corruption and develop effective legislation to fight it.

With Ukraine’s history of endemic graft, which had greatly worsened under Yanukovych, progress had been negligible. Yet now, with the IMF and other international institutions demanding reform, Shabunin and Kaleniuk were instantly in demand to advise the government on instituting a comprehensive anti-corruption program, which passed in record time.

Yet they didn’t stop there either. “Our long-term strategy is to create a situation in which it will be impossible not to do anti-corruption reforms,” Shabunin would later tell me. “We are working to ensure that these reforms will be done, either by these politicians or by another, because they will lose their office if they don’t do these reforms.”

Vitaliy, Dasha and the Anti-Corruption Action Center continue to prepare for future triggers.

The Genius of Xerox PARC

One story that Silicon Valley folks love to tell involves Steve Jobs and Xerox. After the copier giant made an investment in Apple, which was then a fledgling company, it gave Jobs access to its Palo Alto Research Center (PARC). He then used the technology he saw there to create the Macintosh. Jobs built an empire based on Xerox’s oversight.

Yet the story misses the point. By the late 1960s, Xerox CEO Peter McColough knew that the copier business, while still incredibly profitable, was bound to be disrupted eventually. At the same time, it was becoming clear that computer technology was advancing quickly and, someday, would revolutionize how we worked. PARC was created to prepare for that trigger.

The number of groundbreaking technologies created at PARC is astounding. The graphical user interface, networked computing, object-oriented programming, the list goes on. Virtually everything that we came to know as "personal computing" had its roots in the work done at PARC in the 1970s.

Most of all, PARC saved Xerox. The laser printer invented there would bring in billions and, eventually, largely replace the copier business. Some technologies were spun off into new companies, such as Adobe and 3Com, with an equity stake going to Xerox. And, of course, the company even made a tidy profit off the Macintosh, because of the equity stake that gave Jobs access to the technology in the first place.

Transforming an Obstacle Into a Design Constraint

The hardest thing about change is that, typically, most people don't want it. If they did, it would already have been accepted as the normal state of affairs. That can make transformation a lonely business. The status quo has inertia on its side and never yields its power gracefully. The path for an aspiring changemaker can be heartbreaking and soul crushing.

Many would see the near-certainty that Milošević would try to steal the election as an excuse to do nothing. Most people would look at the almost impossibly corrupt Yanukovych regime and see the idea of devoting your life to anti-corruption reforms as quixotic folly. It is extremely rare for a CEO whose firm dominates an industry to ask, "What comes after?"

Yet anything can happen and often does. Circumstances conspire. Events converge. Round-hole businesses meet their square-peg world. We can’t predict exactly when or where or how or what will happen, but we know that everybody and everything gets disrupted eventually. It’s all just a matter of time.

When that happens resistance to change temporarily abates. So there’s lots to do and no time to waste. We need to empower our allies, as well as listen to our adversaries. We need to build out a network to connect to others who are sympathetic to our cause. Transformational change is always driven by small groups, loosely connected, but united by a common purpose.

Most of all, we need to prepare. A trigger always comes and, when it does, it brings great opportunity with it.

— Article courtesy of the Digital Tonto blog
— Image credits: Unsplash







Innovation or Not – The Microdosing Revolution


GUEST POST from Art Inteligencia

In recent years, the concept of microdosing has moved from the fringes of alternative therapy into the mainstream as a potential tool for enhancing mental performance and wellness. But is microdosing truly an innovation, or is it a passing trend destined for the annals of speculative practices? Through examining its revolutionary potential and analyzing its impact in real-world scenarios, we can better understand the role microdosing plays in our continuous pursuit of human-centered innovation.

Understanding Microdosing

Microdosing typically involves taking sub-perceptual doses of psychedelics, like LSD or psilocybin, approximately one-tenth of a recreational dose, to experience the potential therapeutic benefits without hallucinogenic effects. Advocates claim it can boost creativity, alleviate anxiety, and improve focus, leading to its rising popularity among entrepreneurs, artists, and the tech-savvy.

Case Study 1: Microdosing in Silicon Valley

In the competitive landscape of Silicon Valley, professionals are constantly seeking a competitive edge to enhance productivity and creativity. The tech hub has notably become a breeding ground for experimentation with microdosing. Tech workers claim the practice helps them to sustain high levels of innovation and problem-solving abilities in an environment where mental agility is highly prized.

For instance, a significant number of software developers and startup founders have reported that microdosing has supported cognitive function and stress reduction, leading to improved workplace performance and job satisfaction. Companies have begun embracing wellness practices, subtly endorsing microdosing as part of a broader strategy to cultivate employee well-being and foster an innovative work culture.

Case Study 2: Microdosing in Mental Health Treatment

Beyond corporate environments, microdosing has gained attention as a potentially revolutionary approach in mental health treatment. Psychedelic-assisted therapy research has opened up dialogues about microdosing's efficacy as a treatment for mood disorders and PTSD. Leading institutions are exploring the controlled use of microdoses as an adjunct to traditional therapies.

A pilot study conducted at a renowned university evaluated the impact of psilocybin microdosing on patients with treatment-resistant depression. Preliminary findings suggest a marked improvement in mood stabilization and cognitive flexibility among participants, renewing hope for alternative approaches in mental health treatment. This study has prompted further research and dialogue within the medical community, transforming discussions around treatment paradigms.

Case Study 3: Brez Beverages – Microdosing in the Consumer Market

Brez Beverages, a pioneering player in the beverage industry, has embraced the microdosing revolution by developing a line of drinks infused with adaptogenic and nootropic compounds. Their products aim to provide consumers with the benefits of microdosing in a more accessible and socially acceptable format.


The innovative approach of Brez Beverages lies in their ability to tap into the growing desire for wellness-centric consumer products. By integrating microdosed elements into beverages, they offer a unique alternative for individuals seeking mental clarity and stress reduction without committing to psychedelic substances. Brez Beverages represents a shift in how microdosing concepts can be commercialized and introduced to mainstream consumers.

Market feedback indicates a burgeoning interest among health-conscious customers who are drawn to the idea of enhancing their daily lives with subtle botanical blends, thus carving a new niche in the health and wellness sector. Brez continues to capitalize on the demand for unconventional health solutions, reflecting both the challenge and potential of integrating microdosing into consumer products.

The Verdict: Innovation or Not?

Whether microdosing is labeled as an innovation largely depends on one’s perspective. On one hand, it presents a novel application of existing compounds, showcasing unconventional problem-solving in enhancing human potential—an experimental departure from typical wellness and therapeutic practices. On the other hand, its lack of universal acceptance and scientific consensus makes it a contentious archetype of modern self-experimentation rather than unmistakable innovation.

In conclusion, microdosing embodies the dynamic nature of innovation—provocative yet promising. As we push the boundaries of what’s possible in the human experience, microdosing remains an emblem of the desire to enhance and evolve our capabilities. Whether it stands the test of time will depend on ongoing research, legal structures, and societal acceptance, but it undoubtedly shapes the current discourse on potential pathways for human-centered transformation.

Image credit: DrinkBrez.com, Pexels







Top 10 Human-Centered Change & Innovation Articles of September 2024

Drum roll please…

At the beginning of each month, we will profile the ten articles from the previous month that generated the most traffic to Human-Centered Change & Innovation. Did your favorite make the cut?

But enough delay, here are September’s ten most popular innovation posts:

  1. Three Reasons Nobody Cares About Your Ideas — by Greg Satell
  2. Six Key Habits of Great Leaders — by David Burkus
  3. Are You Leading in the Wrong Zone? — by Geoffrey A. Moore
  4. Projects Don’t Go All Right or All Wrong — by Howard Tiersky
  5. How to Cultivate Respect as a Leader — by David Burkus
  6. What is Your Mindset? Fixed, Growth or Hybrid? — by Stefan Lindegaard
  7. Embracing Failure is a Catalyst for Learning and Innovation — by Stefan Lindegaard
  8. ISO Innovation Standards — by Robyn Bolton
  9. The Hidden Cost of Waiting — by Mike Shipulski
  10. AI Requires Conversational Intelligence — by Greg Satell

BONUS – Here are five more strong articles published in August that continue to resonate with people:

If you're not familiar with Human-Centered Change & Innovation, we publish 4-7 new articles every week built around innovation and transformation insights from our roster of contributing authors and ad hoc submissions from community members. Get the articles right in your Facebook, Twitter or LinkedIn feeds too!

SPECIAL BONUS – THREE DAYS ONLY: From now until 11:59PM ET you can get either the eBook or the hardcover version of the SECOND EDITION of my latest bestselling book Charting Change for 50% OFF using code FLSH50. This deal won’t last long, so grab your copy while supplies last!

Accelerate your change and transformation success

Have something to contribute?

Human-Centered Change & Innovation is open to contributions from any and all innovation and transformation professionals out there (practitioners, professors, researchers, consultants, authors, etc.) who have valuable human-centered change and innovation insights to share with everyone for the greater good. If you’d like to contribute, please contact me.

P.S. Here are our Top 40 Innovation Bloggers lists from the last four years:







Who is More Creative – Women or Men?

753 Studies Have the Answer


GUEST POST from Robyn Bolton

You were born creative. As an infant, you had to figure many things out—how to get fed or changed, get help or attention, and make a onesie covered in spit-up still look adorable.  As you grew older, your creativity grew, too.  You drew pictures, wrote stories, played dress-up, and acted out imaginary stories.

Then you went to school, and it was time to be serious.  Suddenly, creativity had a time and place.  It became an elective or a hobby.  Something you did just enough of to be “well-rounded” but not so much that you would be judged irresponsible or impractical.

When you entered the “real world,” your job determined whether you were creative.  Advertising, design, marketing, innovation?  Creative.  Business, medicine, law, engineering?  Not creative.

As if job-title-as-a-determinant-of-creativity wasn't silly enough, in 2022 a paper was published in the Journal of Applied Psychology declaring that, based on a meta-analysis of 259 studies (n=79,915), there is a "male advantage in creative performance."

Somewhere, Don Draper, Pablo Picasso, and Norman Mailer high-fived.

But, as every good researcher (and innovator) knows, the headline is rarely the truth.  The truth is that it’s contextual and complicated, and everything from how the original studies collected data to how “creativity” was defined matters.

But that’s not what got reported.  It’s also not what people remember when they reference this study (and I have heard more than a few people invoke these findings in the three years since publication).

That is why I was happy to see Fortune report on a new study just published in the Journal of Applied Psychology. This meta-analysis of 753 studies (n=265,762 individuals) finds that men and women are equally creative, and that when "usefulness (of an idea) is explicitly incorporated in creativity assessment," women's creativity is "stronger."

Somewhere, Mary Wells Lawrence, Frida Kahlo, and Virginia Woolf high-fived.

Of course, this finding is also contextual.

What makes someone “creative?”

Both studies defined creativity as “the generation of novel and useful ideas.”

However, while the first study focused on how context drives creativity, the second study looked deeper, focusing on two essential elements of creativity: risk-taking and empathy. The authors argued that risk-taking is critical to generating novel ideas, while empathy is essential to developing useful ideas.

Does gender influence creativity?

It can.  But even when it does, it doesn’t make one gender more or less creative than the other.

Given “contextual moderators” like country-level culture, industry gender composition, and role status, men tend to follow an “agentic pathway” (creativity via risk-taking), so they are more likely to generate novel ideas.

However, given the same contextual moderators, women follow a “communal pathway” (creativity via empathy), so they are more likely to generate useful ideas.

How you can use this to maximize creativity

Innovation and creativity go hand in hand. Both focus on creating something new (novel) and valuable (useful). So, to maximize innovation within your team or organization, maximize creativity in three ways:

  • Explicitly incorporate novelty and usefulness in assessment criteria.  If you focus only on usefulness, you’ll end up with extremely safe and incremental improvements.  If you focus only on novelty, you’ll end up with impractical and useless ideas.
  • Recruit for risk-taking and empathy.  While the manifestation of these two skills tends to fall along gender lines, don’t be sexist and assume that’s always the case.  When seeking people to join your team or your brainstorming session, find people who have demonstrated strong risk-taking or empathy-focused behaviors and invite them in.
  • Always consider the context.  Just as “contextual moderators” impact people’s creative pathways, so too does the environment you create.  If you want people to take risks, be vulnerable, and exhibit empathy, you must establish a psychologically safe environment first.  And that starts with making sure there aren’t any “tokens” (one of a “type”) in the group.

Which brings us back to the beginning.

You ARE creative.

How will you be creative today?

Image credit: Unsplash


We Need to Solve the Productivity Crisis


GUEST POST from Greg Satell

When politicians and pundits talk about the economy, they usually do so in terms of numbers. Unemployment is too high or GDP is too low. Inflation should be at this level or at that. You get the feeling that somebody somewhere is turning knobs and flicking levers in order to get the machine humming at just the right speed.

Yet the economy is really about our well-being. It is, at its core, our capacity to produce goods and services that we want and need, such as the food that sustains us, the homes that shelter us and the medicines that cure us, not to mention all of the little niceties and guilty pleasures that we love to enjoy.

Our capacity to generate these things is determined by our productive capacity. Despite all the hype about digital technology creating a “new economy,” productivity growth for the past 50 years has been tremendously sluggish. If we are going to revive it and improve our lives we need to renew our commitment to scientific capital, human capital and free markets.

Restoring Scientific Capital

In 1945, Vannevar Bush delivered a report, Science, The Endless Frontier, which argued that the US government needed to invest in "scientific capital" through basic research and scientific education. It set in motion a number of programs that laid the groundwork for America's technological dominance during the second half of the century.

Bush’s report led to the development of America’s scientific infrastructure, including agencies such as the National Science Foundation (NSF), National Institutes of Health (NIH) and DARPA. Others, such as the National Labs and science programs at the Department of Agriculture, also contribute significantly to our scientific capital.

The results speak for themselves and returns on public research investment have been shown to surpass those in private industry. To take just one example, it has been estimated that the $3.8 billion invested in the Human Genome Project resulted in nearly $800 billion in economic impact and created over 300,000 jobs in just the first decade.

Unfortunately, we forgot those lessons. Government investment in research as a percentage of GDP has been declining for decades, limiting our ability to produce the kinds of breakthrough discoveries that lead to exciting new industries. What passes for innovation these days displaces workers, but does not lead to significant productivity gains.

So the first step to solving the productivity puzzle would be to renew our commitment to investing in the type of scientific knowledge that, as Bush put it, can "turn the wheels of private and public enterprise." There was a bill before Congress to do exactly that, but unfortunately it got bogged down in the Senate due to infighting.

Investing In Human Capital

Innovation, at its core, is something that people do, which is why education was every bit as important to Bush’s vision as investment was. “If ability, and not the circumstance of family fortune, is made to determine who shall receive higher education in science, then we shall be assured of constantly improving quality at every level of scientific activity,” he wrote.

Programs like the GI Bill delivered on that promise. We made what is perhaps the biggest investment ever in human capital, sending millions to college and creating a new middle class. American universities, considered far behind their European counterparts earlier in the century, especially in the sciences, came to be seen as the best in the world by far.

Today, however, things have gone horribly wrong. A recent study found that about half of all college students struggle with food insecurity, which is probably why only 60% of students at 4-year institutions, and even fewer at community colleges, ever earn a degree. The ones that do graduate are saddled with decades of debt.

So the bright young people we don't starve, we condemn to decades of what is essentially indentured servitude. That's no way to run an entrepreneurial economy. In fact, a study done by the Federal Reserve Bank of Philadelphia found that student debt has a measurable negative impact on new business creation.

Recommitting Ourselves To Free and Competitive Markets

There is no principle more basic to capitalism than that of free markets, which provide the “invisible hand” to efficiently allocate resources. When market signals get corrupted, we get less of what we need and more of what we don’t. Without vigorous competition, firms feel less of a need to invest and innovate, and become less productive.

There is abundant evidence that is exactly what has happened. Since the late 1970s antitrust enforcement has become lax, ushering in a new gilded age. While digital technology was hyped as a democratizing force, over 75% of industries have seen a rise in concentration levels since the late 1990s, which has led to a decline in business dynamism.

The problem isn't just monopoly power dominating consumers, either, but also monopsony, the domination of suppliers by buyers, especially in labor markets. There is increasing evidence of collusion among employers designed to keep wages low, along with an astonishing abuse of non-compete agreements, which have affected more than a third of the workforce.

In a sense, this is nothing new. Adam Smith himself observed in The Wealth of Nations that “Our merchants and master-manufacturers complain much of the bad effects of high wages in raising the price, and thereby lessening the sale of their goods both at home and abroad. They say nothing concerning the bad effects of high profits. They are silent with regard to the pernicious effects of their own gains. They complain only of those of other people.”

Getting Back On Track

In the final analysis, solving the productivity puzzle shouldn’t be that complicated. It seems that everything we need to do we’ve done before. We built a scientific architecture that remains unparalleled even today. We led the world in educating our people. American markets were the most competitive on the planet.

Yet somewhere we lost our way. Beginning in the early 1970s, we started reducing our investment in scientific research and public education. In the early 1980s, the Chicago school of competition law started to gain traction and antitrust enforcement began to wane. Since 2000, competitive markets in the United States have been in serious decline.

None of this was inevitable. We made choices and those choices had consequences. We can make other ones. We can choose to invest in discovering new knowledge, to educate our children without impoverishing them, to demand that our industries compete and to hold our institutions to account. We've done these things before and can do so again.

All that’s left is the will and the understanding that the economy doesn’t exist in the financial press, on the floor of the stock markets or in the boardrooms of large corporations, but in our own welfare as well as in our ability to actualize our potential and realize our dreams. Our economy should be there to serve our needs, not the other way around.

— Article courtesy of the Digital Tonto blog
— Image credits: Unsplash
