Category Archives: Psychology

Three Reasons You Are Not Happy at Work

And What to Do to Become as Happy as You Could Be

GUEST POST from Stefan Lindegaard

Most people spend years in jobs that feel fine. Not great, not terrible – just fine. But fine isn’t the goal. You deserve to do work that energizes you, challenges you, and gives you a sense of purpose.

Yet too many professionals stay stuck. Why? Because they fall into one (or more) of these three traps:

1. You’ve Let Work Happen to You

The problem: If your career feels like a series of random events rather than intentional choices, it’s because you’ve been reacting instead of leading. Maybe you took the first job that paid well, accepted promotions without questioning whether they aligned with what you wanted, or stayed in a role simply because it was comfortable.

The fix: Take ownership of your career. What do you actually want from your work? More impact? More autonomy? A new challenge? Stop waiting for opportunities to fall into your lap and start actively shaping your path. Schedule time this week to reflect, map out your ideal work life, and make a move toward it.

2. You’re Valuing Stability Over Growth

The problem: If your job is predictable but uninspiring, you might have traded growth for comfort. Sure, stability feels safe, but it comes at a cost – boredom, disengagement, and a slow decline in motivation.

The fix: Push yourself out of autopilot. Challenge yourself to take on a stretch project, learn a new skill, or initiate a conversation about expanding your role. Growth is what fuels long-term satisfaction – without it, even the best job will start to feel dull.

3. You’re Waiting for the ‘Perfect’ Job Instead of Making the Most of Where You Are

The problem: Many people think happiness at work comes from finding the right job or employer. But job satisfaction is not just about where you work – it’s about how you work. If you’re constantly waiting for a better company, a better boss, or a better opportunity, you might miss the chance to make your current role more fulfilling.

The fix: Find ways to bring more purpose and energy into your day now. Connect with colleagues who inspire you. Start a project that excites you. Look for small ways to align your work with what matters to you. The next big move will come – but don’t let the wait stop you from enjoying today.

Happiness at Work Isn’t Luck. It’s a Choice!

You don’t need a new job to feel more engaged, fulfilled, or challenged. You need:

  • A clear direction for where you want to go
  • A commitment to continuous growth
  • A proactive approach to shaping your experience

Are you leading your work life or just letting it happen to you? The choice is yours.

Image Credit: Stefan Lindegaard

You Must Accept That People Are Irrational

GUEST POST from Greg Satell

For decades, economists have been obsessed with the idea of “enlightened self-interest,” building elaborate models based on the assumption that people make rational choices. Business and political leaders have used these models to shape competitive strategies, compensation, tax policies and social services among other things.

It’s clear that the real world is far more complex than that. Consider the prisoner’s dilemma, a famous thought experiment in which individuals acting in their self-interest make everyone worse off. In a wide array of real world and experimental contexts, people will cooperate for the greater good rather than pursue pure self-interest.

We are wired to cooperate as well as to compete. Identity and dignity guide our actions even more than the prospect of loss or gain. While business schools have trained generations of managers to assume they can optimize results by designing incentives, the truth is that leaders who can forge a sense of shared identity and purpose have the advantage.

Overcoming The Prisoner’s Dilemma

John von Neumann was a frustrated poker player. Despite having one of the best mathematical minds in history, one that could probably calculate the odds better than anyone on earth, he couldn’t tell whether other players were bluffing. It was this failure at poker that led him to create game theory, which models the strategies of other players.

As the field developed, it was expanded to include cooperative games in which players could choose to collaborate and even form coalitions with each other. That led researchers at RAND to create the prisoner’s dilemma, in which two suspects are interrogated separately and each is offered a reduced sentence to confess.

Here’s how it works: If both prisoners cooperate with each other and neither confesses, they each get one year in prison on a lesser charge. If one confesses, he gets off scot-free, while his partner gets five years. If they both rat each other out, then they get three years each—collectively the worst outcome of all.

Notice how, from a rational viewpoint, the best strategy is to defect. No matter what one prisoner does, the other is better off ratting him out. Yet if both pursue self-interest, both are made worse off. It’s a frustrating problem. Game theorists call it a Nash equilibrium—one in which nobody can improve their position by a unilateral move. In theory, you’re basically stuck.
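
To make the logic concrete, here is a minimal Python sketch (my illustration, not part of the original article) that enumerates the payoffs above and confirms that confessing is the dominant strategy:

    # Payoffs are years in prison (lower is better), using the numbers above:
    # both silent -> 1 year each; one confesses -> 0 for him, 5 for his partner;
    # both confess -> 3 years each.
    YEARS = {
        ("silent", "silent"):   (1, 1),
        ("silent", "confess"):  (5, 0),
        ("confess", "silent"):  (0, 5),
        ("confess", "confess"): (3, 3),
    }

    def best_response(partner_move):
        # Return the move that minimizes my sentence, given my partner's move.
        return min(("silent", "confess"),
                   key=lambda my_move: YEARS[(my_move, partner_move)][0])

    for partner in ("silent", "confess"):
        print(f"If my partner plays {partner!r}, my best move is {best_response(partner)!r}")

    # Both lines print 'confess': confessing is dominant, so (confess, confess)
    # is the Nash equilibrium, even though it is collectively the worst outcome.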

Yet in a wide variety of real-world contexts, ranging from the survival strategies of guppies to military alliances, cooperation is credibly maintained. In fact, there are a number of strategies that have proved successful in overcoming the prisoner’s dilemma. One, called tit-for-tat, relies on credible punishments for defections. Even more effective, however, is building a culture of shared purpose and trust.
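
Tit-for-tat is easy to see in miniature. In the sketch below, the payoff values of 3, 1, 5 and 0 are the standard textbook gains for the repeated game, not figures from the article; it shows cooperation sustained between two tit-for-tat players, while pure defection wins only a single round before being punished:

    # Repeated prisoner's dilemma with gains (higher is better): mutual
    # cooperation 3 each, mutual defection 1 each, a lone defector gets 5
    # and the betrayed cooperator gets 0.
    PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
              ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

    def tit_for_tat(opponent_history):
        # Cooperate first, then copy whatever the opponent did last round.
        return "C" if not opponent_history else opponent_history[-1]

    def always_defect(opponent_history):
        return "D"

    def play(strategy_a, strategy_b, rounds=100):
        seen_by_a, seen_by_b = [], []   # each side's record of the other's moves
        score_a = score_b = 0
        for _ in range(rounds):
            a, b = strategy_a(seen_by_a), strategy_b(seen_by_b)
            pa, pb = PAYOFF[(a, b)]
            score_a, score_b = score_a + pa, score_b + pb
            seen_by_a.append(b)
            seen_by_b.append(a)
        return score_a, score_b

    print(play(tit_for_tat, tit_for_tat))    # (300, 300): cooperation holds
    print(play(tit_for_tat, always_defect))  # (99, 104): one cheap win, then stalemate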

Kin Selection And Identity

Evolutionary psychology is a field very similar to game theory. It employs mathematical models to explain what types of behaviors provide the best evolutionary outcomes. At first, this may seem like the utilitarian approach that economists have long-employed, but when you combine genetics with natural selection, you get some surprising answers.

Consider the concept of kin selection. From a purely selfish point of view, there is no reason for a mother to sacrifice herself for her child. However, from an evolutionary point of view, it makes perfect sense for parents to put their kids first. Groups that favor children are more likely to grow and outperform groups that don’t.

This is what Richard Dawkins meant when he called genes selfish. If we look at things from our genes’ point of view, it makes perfect sense for them to want us to sacrifice ourselves for children, who are more likely to be able to propagate our genes than we are. The effect would logically also apply to others, such as cousins, that likely carry our genes.
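
Evolutionary biologists formalize this logic as Hamilton’s rule: an altruistic act is favored by selection when r × b > c, where r is the genetic relatedness between actor and recipient, b the benefit to the recipient, and c the cost to the actor. A quick sketch; the relatedness coefficients are standard, but the benefit and cost figures are illustrative assumptions of mine:

    # Hamilton's rule: altruism is favored when r * b > c.
    # Relatedness: a child or full sibling shares ~1/2 of your genes,
    # a first cousin ~1/8. Benefit and cost are made-up illustrative units.
    RELATEDNESS = {"child": 0.5, "sibling": 0.5, "first cousin": 0.125}

    def favored(r, benefit, cost):
        return r * benefit > cost

    for kin, r in RELATEDNESS.items():
        print(f"{kin}: {favored(r, benefit=10, cost=3)}")

    # child: 0.5 * 10 = 5 > 3 -> True
    # first cousin: 0.125 * 10 = 1.25 > 3 -> False
    # The same sacrifice can "pay" genetically for a child but not for a cousin.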

Researchers have also applied the concept of kin selection to other forms of identity that don’t involve genes, but ideas (also known as memes) in examples such as patriotism. When it comes to people or ideas we see as an important part of our identity, we tend to take a much more expansive view of our interests than traditional economic models would predict.

Cultures of Dignity

It’s not just identity that figures into our decisions, but dignity as well. Consider the ultimatum game. One player is given a dollar and needs to propose how to split it with another player. If the offer is accepted, both players get the agreed upon shares. If it is not accepted, neither player gets anything.

If people acted purely rationally, offers as low as a penny would be routinely accepted. After all, a penny is better than nothing. Yet decades of experiments across different cultures show that most people do not accept a penny. In fact, offers of less than 30 cents are routinely rejected as unfair because they offend people’s dignity and sense of self.
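
To see just how badly the “rational” penny offer performs once dignity enters the picture, here is a toy model; the roughly 30-cent cutoff echoes the findings above, while giving every responder the same hard threshold is a deliberate simplification of mine:

    # Ultimatum game sketch: responders reject any offer below a fairness
    # threshold. A uniform threshold for all responders is a simplifying
    # assumption for illustration.
    def proposer_payoff(offer_cents, threshold_cents=30, pot_cents=100):
        # The proposer keeps the remainder, but only if the offer is accepted.
        accepted = offer_cents >= threshold_cents
        return pot_cents - offer_cents if accepted else 0

    for offer in (1, 10, 30, 50):
        print(f"offer {offer:>2}c -> proposer keeps {proposer_payoff(offer)}c")

    # offer  1c -> 0c; offer 10c -> 0c; offer 30c -> 70c; offer 50c -> 50c.
    # The penny offer, far from being optimal, earns the proposer nothing.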

Results from the ultimatum game are not uniform; they vary across cultures, and more recent research suggests why. In a study in which a similar public goods game was played, researchers found that cooperative—as well as punitive—behavior is contagious, spreading through three degrees of separation, even between people who haven’t had any direct contact.

Whether we know it or not, we are constantly building ecosystems of norms that reward and punish behavior according to expectations. If we see the culture we are operating in as trusting and generous, we are much more likely to act collaboratively. However, if we see our environment as cutthroat and greedy, we’ll tend to model that behavior in the same way.

Forging Shared Identity And Shared Purpose

In an earlier age, organizations were far more hierarchical. Power rested at the top. Information flowed up, orders went down, work got done and people got paid. Incentives seemed to work. You could pay more and get more. Yet in today’s marketplace, that’s no longer tenable because the work we need done is increasingly non-routine.

That means we need people to do more than merely carry out tasks; they need to put all of their passion and creativity into their work to perform at a high level. They need to collaborate effectively in teams and take pride in the impact their efforts produce. To achieve that at an organizational level, leaders need to shift their mindsets.

As David Burkus explained in his TED Talk, humans are prosocial. They are vastly more likely to perform when they understand and identify with who their work benefits than when they are given financial incentives or fed some grandiose vision. Evolutionary psychologists have long established that altruism is deeply embedded in our sense of tribe.

The simple truth is that we can no longer coerce people to do what we want with Rube Goldberg-like structures of carrots and sticks; we must inspire them to want what we want. Humans are not purely rational beings who respond to stimuli like vending machines, spitting out desired behaviors when the right buttons are pushed. We are motivated by identity and dignity more than anything else.

Leadership is not an algorithm, but a practice of creating meaning through relationships of trust in the context of a shared purpose.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Learning Business and Life Lessons from Monkeys

GUEST POST from Greg Satell

Franz Kafka was especially skeptical about parables. “Many complain that the words of the wise are always merely parables and of no use in daily life,” he wrote. “When the sage says: ‘Go over,’ he does not mean that we should cross to some actual place… he means some fabulous yonder…that he cannot designate more precisely, and therefore cannot help us here in the very least.”

Business pundits, on the other hand, tend to favor parables, probably because telling simple stories allows for the opportunity to seem both folksy and wise at the same time. When Warren Buffett says “Only when the tide goes out do you discover who’s been swimming naked,” it doesn’t sound so much like an admonishment.

Over the years I’ve noticed that some of the best business parables involve monkeys. I’m not sure why that is, but I think it has something to do with taking intelligence out of the equation. We’re often prone to imagining ourselves as the clever hero of our own story and we neglect simple truths. That may be why monkey parables have so much to teach us.

1. Build The #MonkeyFirst

When I work with executives, they often have a breakthrough idea they are excited about. They begin to tell me what a great opportunity it is and how they are perfectly positioned to capitalize on it. However, when I begin to dig a little deeper it appears that there is some major barrier to making it happen. When I try to ask about it, they just shut down.

One reason that this happens is that there is a fundamental tension between innovation and operations. Operational executives tend to focus on identifying clear benchmarks to track progress. That’s fine for a typical project, but when you are trying to do something truly new and different, you have to directly confront the unknown.

At Google X, the tech giant’s “moonshot factory,” the mantra is #MonkeyFirst. The idea is that if you want to get a monkey to recite Shakespeare on a pedestal, you start by training the monkey, not building the pedestal, because training the monkey is the hard part. Anyone can build a pedestal.

The problem is that most people start with the pedestal, because it’s what they know and by building it, they can show early progress against a timeline. Unfortunately, building a pedestal gets you nowhere. Unless you can actually train the monkey, working on the pedestal is wasted effort.

The moral: Make sure you address the crux of the problem and don’t waste time with peripheral issues.

2. Don’t Get Taken In By Coin Flipping Monkeys

We live in a world that worships accomplishment. Sports stars who have never worked in an office are paid large fees to speak to corporate audiences. Billionaires who have never walked a beat speak out on how to fight crime (even as they invest in gun manufacturers). Others like to espouse views on education, although they have never taught a class.

Many say that you can’t argue with success, but consider this thought experiment: Put a million monkeys in a coin-flipping contest in which, each round, the losers hand their pots to the winners and drop out. After twenty rounds of doubling, the last monkey standing will have turned a one-dollar stake into more than a million dollars. The vast majority of the other monkeys leave with merely pocket change.
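
The contest is easy to simulate. The sketch below (my illustration) stakes 2**20 monkeys, just over a million, a dollar apiece; because every pot in a given round is identical, the fair coin only decides which monkey advances, so we need only track the pots:

    # A million-monkey coin-flipping tournament: each round the monkeys pair
    # off, the loser hands over his pot and drops out. Every pot in a round
    # is the same size, so the flip decides *who* advances, not how much.
    pots = [1] * (2 ** 20)           # 1,048,576 monkeys ante $1 each

    rounds = 0
    while len(pots) > 1:
        pots = [a + b for a, b in zip(pots[0::2], pots[1::2])]
        rounds += 1

    print(rounds, f"${pots[0]:,}")   # 20 rounds; the champion holds $1,048,576
    # Identical "strategies", wildly different outcomes: the champion is pure luck.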

How much would you pay the winning monkeys to speak at your corporate event? Would you invite them to advise your company? Sit on your board? Would you be interested in their views about how to raise your children, invest your savings or make career choices? Would you try to replicate their coin-flipping success? (Maybe it’s all in the wrist).

The truth is that chance and luck play a much bigger part in success than we like to admit. Einstein, for example, became the most famous scientist of the 20th century not just because of his discoveries but also due to an unlikely coincidence. True accomplishment is difficult to evaluate, so we look for signals of success to guide our judgments.

The moral: Next time you judge someone, either by their success or lack thereof, ask yourself whether you are judging actual accomplishment or telltale signs of successful coin flipping. It’s harder to tell the difference than you’d think.

3. The Infinite Monkey Theorem

There is an old thought experiment called the Infinite Monkey Theorem, which is eerily disturbing. The basic idea is that if there were an infinite number of monkeys pecking away on an infinite number of keyboards they would, in time, produce the complete works of Shakespeare, Tolstoy and every other literary masterpiece.
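
The scale involved is easy to estimate. For an alphabet of k symbols, a given n-character string comes up in a single random attempt with probability (1/k)**n, so the expected number of attempts is k**n. A back-of-the-envelope sketch:

    # Expected number of random attempts to type a given string: with k
    # equally likely symbols, the chance per attempt is (1/k)**n, so on
    # average you need k**n attempts.
    def expected_attempts(text, alphabet_size=26):
        return alphabet_size ** len(text)

    print(f"{expected_attempts('hamlet'):,}")   # 308,915,776 (~3 * 10**8)
    print(f"{expected_attempts('to be or not to be', alphabet_size=27):.3e}")
    # 18 characters drawn from 26 letters plus a space: ~5.8 * 10**25 attempts.
    # Whole masterpieces are why the theorem needs infinitely many monkeys.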

It’s a perplexing thought because we humans pride ourselves on our ability to recognize and evaluate patterns. The idea that something we value so highly could be randomly generated is extremely unsettling. Yet there is an entire branch of mathematics, called Ramsey Theory, devoted to the study of how order emerges from random sets of data.

While the infinite monkey theorem is, of course, theoretical, technology is forcing us to confront the very real dilemmas it presents. For example, music scholar and composer David Cope has been able to create algorithms that produce original works of music so good that even experts can’t tell they are computer generated. So what is the value of human input?

The moral: Much like the coin flipping contest, the infinite monkey theorem makes us confront what we value and why. What is the difference between things human produced and identical works that are computer generated? Are Tolstoy’s words what give his stories meaning? Or is it the intent of the author and the fact that a human was trying to say something important?

Imagining Monkeys All Around Us

G. H. Hardy, widely considered a genius, wrote that “For any serious purpose, intelligence is a very minor gift.” What he meant was that even in purely intellectual pursuits, such as his field of number theory, there are things that are far more important. It was, undoubtedly, intellectual humility that led Hardy to Ramanujan, perhaps his greatest discovery of all.

Imagining ourselves to be heroes of our own story can rob us of the humility we need to succeed and prosper. Mistaking ourselves for geniuses can often get us into trouble. People who think they’re playing it smart tend to make silly mistakes, both because they expect to see things that others don’t and because they fail to look for and recognize trouble signs.

Parables about monkeys can be useful because nobody expects them to be geniuses, which demands that we ask ourselves hard questions. Are we doing the important work, or the easiest tasks to show progress on? If monkeys flipping coins can simulate professional success, what do we really celebrate? If monkeys tapping randomly on typewriters can create masterworks, what is the value of human agency?

The truth is that humans are prone to be foolish. We are unable, outside a few limited areas of expertise, to make basic distinctions in matters of importance. So we look for signals of prosperity, intelligence, shared purpose and other things we value to make judgments about what information we should trust. Imagining monkeys around us helps us to be more careful.

Sometimes the biggest obstacle between where we are now and the fabulous yonder we seek is just the few feet in front of us.

— Article courtesy of the Digital Tonto blog
— Image credit: Flickr

Identity is Crucial to Change

GUEST POST from Greg Satell

In an age of disruption, the only viable strategy is to adapt. Today, we are undergoing major shifts in technology, resources, migration and demography that will demand that we make changes in how we think and what we do. The last time we saw this much change afoot was during the 1920s and that didn’t end well. The stakes are high.

In a recent speech, the EU’s High Representative for Foreign Affairs and Security Policy Josep Borrell highlighted the need for Europe to change and adapt to shifts in the geopolitical climate. He also pointed out that change involves far more than interests and incentives, carrots and sticks, but even more importantly, identity.

“Remember this sentence,” he said. “’It is the identity, stupid.’ It is no longer the economy, it is the identity.” What he meant was that human beings build attachments to things they identify with and, when those are threatened, they are apt to behave in a visceral, reactive and violent way. That’s why change and identity are always inextricably intertwined.

“We can’t define the change we want to pursue until we define who we want to be.” — Greg Satell

The Making Of A Dominant Model

Traditional models come to us with such great authority that we seldom realize that they too were once revolutionary. We are so often told that Einstein is revered for showing that Newton’s mechanics were flawed that it is easy to forget Newton himself was a radical insurgent, who rewrote the laws of nature and ushered in a new era.

Still, once a model becomes established, few question it. We go to school, train for a career and hone our craft. We make great efforts to learn basic principles and gain credentials when we show that we have grasped them. As we strive to become masters of our craft we find that as our proficiency increases, so does our success and status.

The models we use become more than mere tools to get things done, but intrinsic to our identity. Back in the nineteenth century, the miasma theory, the notion that bad air caused disease, was predominant in medicine. Doctors not only relied on it to do their job, they took great pride in their mastery of it. They would discuss its nuances and implications with colleagues, signaling their membership in a tribe as they did.

In the 1840s, when a young doctor named Ignaz Semmelweis showed that doctors could prevent infections by washing their hands, many in the medical establishment were scandalized. First, the suggestion that they, as men of prominence, could spread something as dirty as disease was insulting. Even more damaging, however, was the suggestion that their professional identity was, at least in part, based on a mistake.

Things didn’t turn out well for Semmelweis. He railed against the establishment, but to no avail. He would eventually die in an insane asylum, ironically of an infection he contracted under care, and the questions he raised about the prevailing miasma paradigm went unanswered.

A Gathering Storm Of Accumulating Evidence

We all know that for every rule, there are exceptions and anomalies that can’t be explained. As the statistician George Box put it, “all models are wrong, but some are useful.” The miasma theory, while it seems absurd today, was useful in its own way. Long before we had technology to study bacteria, smells could alert us to their presence in unsanitary conditions.

But Semmelweis’s hand-washing regime threatened doctors’ view of themselves and their role. Doctors were men of prominence, who saw disease emanating from the smells of the lower classes. This was more than a theory. It was an attachment to a particular view of the world and their place in it, which is one reason why Semmelweis experienced such backlash.

Yet he raised important questions and, at least in some circles, doubts about the miasma theory continued to grow. In 1854, about a decade after Semmelweis instituted hand washing, a cholera epidemic broke out in London and a miasma theory skeptic named John Snow was able to trace the source of the infection to a single water pump.

Yet once again, the establishment could not accept evidence that contradicted its prevailing theory. William Farr, a prominent medical statistician, questioned Snow’s findings. Besides, Snow couldn’t explain how the water pump was making people sick, only that it seemed to be the source of some pathogen. Farr, not Snow, won the day.

Later it would turn out that a septic pit had been dug too close to the pump and the water had been contaminated with fecal matter. But for the moment, while doubts began to grow about the miasma theory, it remained the dominant model and countless people would die every year because of it.

Breaking Through To A New Paradigm

In the early 1860s, as the Civil War was raging in the US, Louis Pasteur was researching wine-making in France. While studying the fermentation process, he discovered that microorganisms spoiled beverages such as beer and milk. He proposed that they be heated to temperatures between 60 and 100 degrees Celsius to avoid spoiling, a process that came to be called pasteurization.

Pasteur guessed that similar microorganisms made people sick which, in turn, led to the work of Robert Koch and Joseph Lister. Together they would establish the germ theory of disease. This work then led not only to better sanitary practices, but eventually to the work of Alexander Fleming, Howard Florey and Ernst Chain and the development of antibiotics.

To break free of the miasma theory, doctors needed to change the way they saw themselves. The miasma theory had been around since Hippocrates. To forge a new path, they could no longer be the guardians of ancient wisdom, but evidence-based scientists, and that would require that everything about the field be transformed.

None of this occurred in a vacuum. In the late 19th century, a number of long-held truths, from Euclid’s Geometry to Aristotle’s logic, were being discarded, which would pave the way for strange new theories, such as Einstein’s relativity and Turing’s machine. To abandon these old ideas, which were considered gospel for thousands of years, was no doubt difficult. Yet it was what we needed to do to create the modern world.

Moving From Disruption to Resilience

Today, we stand on the precipice of a new paradigm. We’ve suffered through a global financial crisis, a pandemic and the most deadly conflict in Europe since World War II. The shifts in technology, resources, migration and demography are already underway. The strains and dangers of these shifts are already evident, yet the benefits are still to come.

To successfully navigate the decade ahead, we must make decisions not just about what we want, but who we want to be. Nowhere is this playing out more than in Ukraine right now, where the war being waged is almost solely about identity. Russians want to deny Ukrainian identity and to defy what they see as the US-led world order. Europeans need to take sides. So do the Chinese. Everyone needs to decide who they are and where they stand.

This is not only true in international affairs, but in every facet of society. Different eras make different demands. The generation that came of age after World War II needed to rebuild and they did so magnificently. Yet as things grew, inefficiencies mounted and the Boomer Generation became optimizers. The generations that came after worshiped disruption and renewal. These are, of course, gross generalizations, but the basic narrative holds true.

What should be clear is that where we go from here will depend on who we want to be. My hope is that we become protectors who seek to make the shift from disruption to resilience. We can no longer simply worship market and technological forces and leave our fates up to them as if they were gods. We need to make choices and the ones we make will be greatly influenced by how we see ourselves and our role.

As Josep Borrell so eloquently put it: “It is the identity, stupid. It is no longer the economy, it is the identity.”

— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash

Do You Have Gumption?

GUEST POST from Mike Shipulski

Doing new work takes gumption. But there are two problems with gumption. One, you’ve got to create it from within. Two, generating it takes a lot of energy, and to sustain that energy you’ve got to be physically fit and mentally grounded. Here are some words that may help.

Move from self-judging to self-loving. It makes a difference.

It’s never enough until you decide it’s enough. And when you do, you can be more beholden to yourself.

You already have what you’re looking for. Look inside.

Taking care of yourself isn’t selfish, it’s self-ful.

When in doubt, go outside.

You can’t believe in yourself without your consent.

Your well-being is your responsibility. And it’s time to be responsible.

When you move your body, your mind smiles.

With selfish, you take care of yourself at another’s expense. With self-ful, you take care of yourself because you’re full of self-love.

When in doubt, feel the doubt and do it anyway.

If you’re not taking care of yourself, understand what you’re putting in the way and then don’t do that anymore.

You can’t help others if you don’t take care of yourself.

If you struggle with taking care of yourself, pretend you’re someone else and do it for them.

Image credit: Unsplash

Five Triggers of Burnout at Work

GUEST POST from David Burkus

Demands at work have been piling on in recent years, including the demand that employees continue to do more with less. And those demands bring with them a lot of potential for burnout. Burnout at work is a serious problem for most organizations. Burnout can lead to decreased productivity, increased absenteeism, and even physical and mental health issues.

It’s incumbent on every leader to be aware of and attempt to avoid burnout on their teams. But burnout isn’t always caused by asking too much of employees. Being over capacity can be one trigger of a burned-out team. But there are other triggers leaders need to be aware of.

In this article, we will explore the five triggers of burnout at work and discuss how leaders can mitigate them to create a more engaged and productive team.

Trigger 1: Lack of Margin

The first trigger of burnout at work is a lack of margin. As said above, often burnout happens because people are just at capacity. In many organizations, the reward for good work is more work. This can lead to employees constantly feeling overloaded with assignments and overwhelmed. To mitigate this, leaders can redistribute tasks more equitably and avoid rewarding good work with additional responsibilities. And they can identify priorities more clearly so teammates know what tasks matter most and which can afford to wait until later. In addition, regular individual check-ins and team-wide huddles can also help identify areas where margin can be borrowed from other team members, ensuring that everyone has a manageable workload.

Trigger 2: Lack of Control

The second trigger of burnout at work is a lack of control. Employees who feel they lack autonomy over their work are dramatically more likely to burn out than employees who can control certain elements of their job. In addition, employees who feel left out of the decision-making process and lack the necessary resources to do their job can quickly become burnt out. Leaders can address this trigger by providing employees with more autonomy in when, where, or how they work. This could involve flexible work hours, remote work options, or giving employees a say in the decision-making process. By empowering employees and giving them a sense of control over their work, leaders can help prevent burnout and increase job satisfaction.

Trigger 3: Lack of Clarity

The third trigger of burnout at work is a lack of clarity. Leaving employees without clear expectations, or without a firm belief that increased effort will improve performance, leaves them open to burnout. Vague job descriptions and frequent changes in roles and tasks can leave employees feeling uncertain and overwhelmed. This trigger can sneak up on employees and their leaders because the demands of a job change over time, gradually moving people away from the role they were initially hired for. Without frequently updated expectations and clear feedback, the job becomes ambiguous. Leaders can help avoid this through regular check-ins, clear project definitions, and resources to help employees achieve their tasks. Clear communication and setting realistic goals can go a long way in reducing burnout caused by a lack of clarity.

Trigger 4: Lack of Civility

The fourth trigger of burnout at work is a lack of civility. Working in a toxic team or organization can be extremely detrimental to one’s mental well-being and job satisfaction. Whether it’s a single individual or a bad boss, negative experiences in the workplace can quickly lead to burnout. This can happen even in overall positive company cultures, because one toxic boss or dysfunctional team can have an outsized effect on the team and its potential for burnout. Leaders can address this trigger by modeling respectful behavior and reinforcing expectations of respect and cohesion. Creating a positive and inclusive work culture where everyone feels valued and supported can help prevent burnout and foster a more harmonious work environment.

Trigger 5: Lack of Social Support

The fifth trigger of burnout at work is a lack of social support. Humans are social creatures—and work often meets a small or large part of our social needs. Feeling isolated and lonely at work can significantly contribute to burnout. Without social connections and friendships, employees may struggle to find motivation and support in their roles. Leaders can create opportunities for social support and friendships within the team by organizing team-building activities, encouraging collaboration, and fostering a sense of community. You can’t force people to be friends, but you can create the environment where friendships develop. And having friends at work can not only drive productivity but also decrease stress and enhance overall job satisfaction.

By addressing these triggers of burnout, leaders can create a work environment that promotes employee well-being, engagement, and productivity. Redistributing tasks, providing autonomy, ensuring clarity, promoting civility, and fostering social support are all essential steps in preventing burnout and creating a more positive and fulfilling work experience. And a positive work experience helps everyone do their best work ever.

Image credit: Pexels

Originally published at https://davidburkus.com on June 26, 2023.

How to Not Get Depleted

GUEST POST from Mike Shipulski

On every operating plan there are more projects than there are people to do them, and at every meeting there are more new deliverables than people to take them on. At every turn, our demand for increased profits pushes our people for more. To me, this is the reason every day feels fuller than the last.

This year do you have more things to accomplish or fewer? Do you have more meetings or fewer? Do you get more emails or fewer?

We add work to people’s day as if their capacity to do work is infinite. And we add metrics to measure them to make sure they get the work done. And that’s a recipe for depletion. At some point, even the best, most productive people reach their physical and emotional limits. And at some point, as the volume of work increases, we all become depleted. It’s not that we’re moving slowly, being wasteful or giving it less than our all. When the work exceeds our capacity to do it, we run out of gas.

Here are some thoughts that may help you over the next year.

The amount of work you will get done this year is the same as you got done last year. But don’t get sidetracked here. This has nothing to do with the amount of work you were asked to do last year. Because you didn’t complete everything you were asked to do last year, the same thing will happen this year unless the amount of work on this year’s plan is equal to the amount of work you actually accomplished last year. Every year, scrub a little work off your yearly commitments until the work content finally equals your capacity to get it done.

Once the work content of your yearly plan is in line, the mantra becomes – finish one before you start one. If you had three projects last year and you finished one, you can add one project this year. If you didn’t finish any projects last year you can’t start one this year, at least until you finish one this year. It’s a simple mantra, but a powerful one. It will help you stop starting and start finishing.

There’s a variant of the finish-before-you-start approach that doesn’t have to wait for the completion of a long project. Instead of finishing a project, unimportant projects are stopped before they’re finished. This is loosely known as – stop doing before you start doing. Stopping is even more powerful than finishing because low-value work is stopped and the freed-up resources are immediately applied to higher-value work. It takes judgment and courage to stop a dull project, but it’s well worth the discomfort.

If you want to get ahead of the game, create a stop-doing list. For each item on the list estimate how much time you will free up and sum the freed-up time for the lot. Be ruthless. Stop all but the most important work. And when your boss says you can’t stop something because it’s too important, propose that you stop for a week and see what happens. And when no one notices you stopped, propose to stop for a month and see what happens. Rinse and repeat.

When the amount of work you have to get done fits with your capacity to do it, your physical and mental health will improve. You’ll regain that spring in your step and you’ll be happier. And the quality of your work will improve. But more importantly, your family life and personal relationships will improve. You’ll be able to let go of work and be fully present with your friends and family.

Regardless of the company’s growth objectives, one person can only do the work of one person. And it’s better for everyone (and the company) if we respect this natural constraint.

Image credit: Unsplash

Metaphysics Philosophy

GUEST POST from Geoffrey A. Moore

Philosophy is arguably the most universal of all subjects. And yet, it is one of the least pursued in the liberal arts curriculum. The reason for this, I will claim, is that the entire field was kidnapped by some misguided academics around a century ago, and since then no one has paid the ransom to free it. That’s not OK, and with that in mind, here is a series of four blogs that, taken together, constitute an Emancipation Proclamation.

There are four branches of philosophy, and in order of importance they are

  1. metaphysics,
  2. ethics,
  3. epistemology, and
  4. logic.

This post will address the first of these four, with subsequent posts addressing the remaining three.

Metaphysics is best understood in terms of Merriam-Webster’s definition: “the philosophical study of the ultimate causes and underlying nature of things.” In everyday language, it answers the most fundamental kinds of philosophical questions:

  • What’s happening?
  • What is going on?
  • Where and how do we fit in?
  • In other words, what kind of a hand have we been dealt?

Metaphysics, however, is not normally conceived in everyday terms. Here is what the Oxford English Dictionary (OED) has to say about it in its lead definition:

That branch of speculative inquiry which treats of the first principles of things, including such concepts as being, substance, essence, time, space, cause, identity, etc.; theoretical philosophy as the ultimate science of Being and Knowing.

The problem is that concepts like substance and essence are not only intimidatingly abstract, they have no meaning in modern cosmology. That is, they are artifacts of an earlier era when things like the atomic nature of matter and the electromagnetic nature of form were simply not understood. Today, they are just verbiage.

But wait, things get worse. Here is the OED in its third sense of the word:

[Used by some followers of positivist, linguistic, or logical philosophy] Concepts of an abstract or speculative nature which are not verifiable by logical or linguistic methods.

The Oxford Companion to the Mind sheds further light on this:

The pejorative sense of ‘obscure’ and ‘over-speculative’ is recent, especially following attempts by A.J. Ayer and others to show that metaphysics is strictly nonsense.

Now, it’s not hard to understand what Ayer and others were trying to get at, but do we really want to say that the philosophical study of the ultimate causes and underlying nature of things is strictly nonsense? Instead, let’s just say that there is a bunch of unsubstantiated nonsense that calls itself metaphysics but that isn’t really metaphysics at all. We can park that stuff with magic crystals and angels on the head of a pin and get back to what real metaphysics needs to address—what exactly is the universe, what is life, what is consciousness, and how do they all work together?

The best platform for so doing, in my view, is the work done in recent decades on complexity and emergence, and that is what organizes the first two-thirds of The Infinite Staircase. Metaphysics, it turns out, needs to be understood in terms of strata, and then within those strata, levels or stair steps. The three strata that make the most sense of things are as follows:

  1. Material reality as described by the sciences of physics, chemistry, and biology, or what I called the metaphysics of entropy. This explains all emergence up to the entrance of consciousness.
  2. Psychological and social reality, as explained by the social sciences, or what I called the metaphysics of Darwinism, which builds the transition from a world of mindless matter up to one of matter-less mind, covering the intermediating emergence of desire, consciousness, values, and culture.
  3. Symbolic reality, as explained by the humanities, or what I called the metaphysics of memes, which begins with the introduction of language that in turn enables the emergence of humanity’s two most powerful problem-solving tools, narrative and analytics, culminating in the emergence of theory, ideally a theory of everything, which is, after all, what metaphysics promised to be in the first place.

The key point here is that every step in this metaphysical journey is grounded in verifiable scholarship ranging over multiple centuries and involving every department in a liberal arts faculty—except, ironically, the philosophy department which is holed up somewhere on campus, held hostage by forces to be discussed in later blogs.

That’s what I think. What do you think?

Image Credit: Unsplash

The Crisis Innovation Trap

Why Proactive Innovation Wins

by Braden Kelley and Art Inteligencia

In the narrative of business, we often romanticize the idea of “crisis innovation.” The sudden, high-stakes moment when a company, backed against a wall, unleashes a burst of creativity to survive. The pandemic, for instance, forced countless businesses to pivot their models overnight. While this showcases incredible human resilience, it also reveals a dangerous and costly trap: the belief that innovation is something you turn on only when there’s an emergency. As a human-centered change and innovation thought leader, I’ve seen firsthand that relying on crisis as a catalyst is a recipe for short-term fixes and long-term decline. True, sustainable innovation is not a reaction; it’s a proactive, continuous discipline.

The problem with waiting for a crisis is that by the time it hits, you’re operating from a position of weakness. You’re making decisions under immense pressure, with limited resources, and with a narrow focus on survival. This reactive approach rarely leads to truly transformative breakthroughs. Instead, it produces incremental changes and tactical adaptations—often at a steep price in terms of burnout, strategic coherence, and missed opportunities. The most successful organizations don’t innovate to escape a crisis; they innovate continuously to prevent one from ever happening.

The Cost of Crisis-Driven Innovation

Relying on crisis as your innovation driver comes with significant hidden costs:

  • Reactive vs. Strategic: Crisis innovation is inherently reactive. You’re fixing a symptom, not addressing the root cause. This prevents you from engaging in the deep, strategic thinking necessary for true market disruption.
  • Loss of Foresight: When you’re in a crisis, all attention is on the immediate threat. This short-term focus blinds you to emerging trends, shifting customer needs, and new market opportunities that could have been identified and acted upon proactively.
  • Burnout and Exhaustion: Innovation requires creative energy. Forcing your teams into a constant state of emergency to innovate leads to rapid burnout, high turnover, and a culture of fear, not creativity.
  • Suboptimal Outcomes: The solutions developed in a crisis are often rushed, inadequately tested, and sub-optimized. They are designed to solve an immediate problem, not to create a lasting competitive advantage.

“Crisis innovation is a sprint for survival. Proactive innovation is a marathon for market leadership. You can’t win a marathon by only practicing sprints when the gun goes off.”

Building a Culture of Proactive, Human-Centered Innovation

The alternative to the crisis innovation trap is to embed innovation into your organization’s DNA. This means creating a culture where curiosity, experimentation, and a deep understanding of human needs are constant, not sporadic. It’s about empowering your people to solve problems and create value every single day.

  1. Embrace Psychological Safety: Create an environment where employees feel safe to share half-formed ideas, question assumptions, and even fail. This is the single most important ingredient for continuous innovation.
  2. Allocate Dedicated Resources: Don’t expect innovation to happen in people’s spare time. Set aside dedicated time, budget, and talent for exploratory projects and initiatives that don’t have an immediate ROI.
  3. Focus on Human-Centered Design: Continuously engage with your customers and employees to understand their frustrations and aspirations. True innovation comes from solving real human problems, not just from internal brainstorming.
  4. Reward Curiosity, Not Just Results: Celebrate learning, even from failures. Recognize teams for their efforts in exploring new ideas and for the insights they gain, not just for the products they successfully launch.

Case Study 1: Blockbuster vs. Netflix – The Foresight Gap

The Challenge:

In the late 1990s, Blockbuster was the undisputed king of home video rentals. It had a massive physical footprint, brand recognition, and a highly profitable business model based on late fees. The crisis of digital disruption and streaming was not a sudden event; it was a slow-moving signal on the horizon.

The Reactive Approach (Blockbuster):

Blockbuster’s management was aware of the shift to digital, but they largely viewed it as a distant threat. They were so profitable from their existing model that they had no incentive to proactively innovate. When Netflix began gaining traction with its subscription-based, DVD-by-mail service, Blockbuster’s response was a reactive, half-hearted attempt to mimic it. They launched an online service but failed to integrate it with their core business, and their culture remained focused on the physical store model. They only truly panicked and began a desperate, large-scale innovation effort when it was already too late and the market had irreversibly shifted to streaming.

The Result:

Blockbuster’s crisis-driven innovation was a spectacular failure. By the time they were forced to act, they lacked the necessary strategic coherence, internal alignment, and cultural agility to compete. They didn’t innovate to get ahead; they innovated to survive, and they failed. They went from market leader to bankruptcy, a powerful lesson in the dangers of waiting for a crisis to force your hand.


Case Study 2: Lego’s Near-Death and Subsequent Reinvention

The Challenge:

In the early 2000s, Lego was on the brink of bankruptcy. The brand, once a global icon, had become a sprawling, unfocused company that was losing relevance with children increasingly drawn to video games and digital entertainment. The company’s crisis was not a sudden external shock, but a slow, painful internal decline caused by a lack of proactive innovation and a departure from its core values. They had innovated, but in a scattered, unfocused way that diluted the brand.

The Proactive Turnaround (Lego):

Lego’s new leadership realized that a reactive, last-ditch effort wouldn’t save them. They saw the crisis as a wake-up call to fundamentally reinvent how they innovate. Their strategy was not just to survive but to thrive by returning to a proactive, human-centered approach. They went back to their core product, the simple plastic brick, and focused on deeply understanding what their customers—both children and adult fans—wanted. They launched several initiatives:

  • Re-focus on the Core: They trimmed down their product lines and doubled down on what made Lego special—creativity and building.
  • Embracing the Community: They proactively engaged with their most passionate fans, the “AFOLs” (Adult Fans of Lego), and co-created new products like the highly successful Lego Architecture and Ideas series. This wasn’t a reaction to a trend; it was a strategic partnership.
  • Thoughtful Digital Integration: Instead of panicking and launching a thousand digital products, they carefully integrated their physical and digital worlds with games like Lego Star Wars and movies like The Lego Movie. These weren’t rushed reactions; they were part of a long-term, strategic vision.

The Result:

Lego’s transformation from a company on the brink to a global powerhouse is a powerful example of the superiority of proactive innovation. By not just reacting to their crisis but using it as a catalyst to build a continuous, human-centered innovation engine, they not only survived but flourished. They turned a painful crisis into a foundation for a new era of growth, proving that the best time to innovate is always, not just when you have no other choice.


The Eight I’s of Infinite Innovation

Braden Kelley’s Eight I’s of Infinite Innovation provides a comprehensive framework for organizations seeking to embed continuous innovation into their DNA. The model starts with Ideation, the spark of new concepts, which must be followed by Inspiration—connecting those ideas to a compelling, human-centered vision. This vision is refined through Investigation, a process of deeply understanding customer needs and market dynamics, leading to the Iteration of prototypes and solutions based on real-world feedback. The framework then moves from development to delivery with Implementation, the critical step of bringing a viable product to market. This is not the end, however; it’s a feedback loop that requires Invention of new business models, a constant process of Improvement based on outcomes, and finally, the cultivation of an Innovation culture where the cycle can repeat infinitely. Each ‘I’ builds upon the last, creating a holistic and sustainable engine for growth.

Conclusion: The Time to Innovate is Now

The notion of “crisis innovation” is seductive because it offers a heroic narrative. But behind every such story is a cautionary tale of a company that let a problem fester for far too long. The most enduring, profitable, and relevant organizations don’t wait for a burning platform to jump; they are constantly building new platforms. They have embedded a culture of continuous, proactive innovation driven by a deep understanding of human needs. They innovate when times are good so they are prepared when times are tough.

The time to innovate is not when your stock price plummets or your competitor launches a new product. The time to innovate is now, and always. By making innovation a fundamental part of your business, you ensure your organization’s longevity and its ability to not just survive the future, but to shape it.

Image credit: Pixabay

Content Authenticity Statement: The topic area and the key elements to focus on were decisions made by Braden Kelley, with help from Google Gemini to shape the article and create the illustrative case studies.

Unlearning is More Important Than Learning

GUEST POST from Greg Satell

When I first went overseas to Poland in 1997, I thought I knew how the media business worked. I had some experience selling national radio time in New York and thought I could teach the Poles who, after 50 years of communism, hadn’t had much opportunity to learn how a modern ad market functioned. I was soon disappointed.

Whenever I would explain a simple principle, they would ask me, “why?” I was at a loss for an answer, because these principles were considered so obvious that nobody ever questioned them. When I thought about it though, many of the things I had learned as immutable laws were merely conventions that had built up over time.

As I traveled to more countries I found that even basic market functions, such as TV buying, varied enormously from place to place. I would come to realize that there wasn’t one “right” way to run a business but innumerable ways things could work. It was then that I began to understand the power of unlearning. It is, in fact, a key skill for the next era of innovation.

The One “True” Way To Innovate?

Innovation has become like a religion in business today, with “innovate or die” as its mantra. Much like televangelists preaching the prosperity gospel to gullible rubes, there’s no shortage of innovation gurus that claim to have discovered the secret to breakthrough innovation and are willing to share it with you, for an exorbitant fee, of course.

What I learned researching my book Mapping Innovation, however, is that there is no one “true” path to innovation. In fact, if you look at companies like IBM, Google and Amazon, although they are all world-class innovators, each goes about it very differently. IBM focuses on grand challenges that can take decades to solve, Google integrates a portfolio of innovation strategies and Amazon has embedded a customer obsession deep within its culture and practice.

What I found most interesting was that most people defined innovation in terms of how they’d been successful in the past, or in the case of self-described gurus, what they’d seen and heard to be successful. By pointing to case studies, they could “prove” that their way was indeed the “right” way. In effect, they believed that what they experienced was all there is.

Yet as I’ve explained in Harvard Business Review, innovation is really about finding novel solutions to important problems and there are as many ways to innovate as there are different types of problems to solve. Many organizations expect the next problem they need to solve to be like the last one. Inevitably, they end up spinning their wheels.

The Survival Of The Fittest?

The survival of the fittest is a thoroughly misunderstood concept. Although it arose out of Darwin’s work, it did not originate from him. It was coined by Herbert Spencer to connect Darwin’s work to his own ideas. Darwin’s theory was so novel and powerful at the time, it was difficult to articulate it clearly, and the phrase caught on.

All too often, people assume that Darwin’s theories predicted some sort of teleological end state in which one optimized form will dominate. If that were true, then the optimal strategy for every organism, as well as every business model and every organization, would be to strive to achieve that optimal state and dominate the competition.

Yet that’s not what Darwin meant at all. In fact, his theory rested on three pillars: limited resources, changing environments and super-fecundity, the tendency of organisms to produce more offspring than can survive. “Fittest” refers to a temporary state, not a permanent advantage. What is “fit” for one environment may be detrimental in another.

Eastern Europe was, for me, similar to the Galápagos Islands where Darwin first formed his famous theory. Seeing different business environments, in close proximity, give rise to so many different business models opened my eyes to new possibilities. Once I unlearned what I thought I knew, I was able to learn more than I could have imagined.

Turning The Page On Welchism

At the beginning of this century, Fortune magazine proclaimed Jack Welch to be the optimal manager of the last one. American industry had grown sclerotic and bureaucratic. It was in great need of some trimming down and Welch was truly an optimized fit for the environment.

Nicknamed “Neutron Jack” for his penchant for getting rid of the people while leaving the buildings standing, he ruthlessly cut through GE’s red tape. Profits soared, Welch became something of a prophet and “Welchism” a religion. Corporate boards heavily recruited GE executives as CEOs to replicate Welchism at their companies.

Yet as David Gelles explains in his book about Welch’s tenure at GE, The Man Who Broke Capitalism, not all was as it seemed. Yes, Welch made GE more efficient and profitable, but he also increased risk through “financializing” the industrial company, undermined engineering and innovation by moving manufacturing facilities overseas and cooked the books to make profits seem much smoother than they were.

GE would eventually implode, but the damage went much further and deeper than one company. Because Welchism was seen as the “one best way” to run a business, many other firms replicated its methods. The results have been alarming. In fact, a 2020 report by the Federal Reserve found that business dynamism in America has steadily declined since Jack Welch took the helm at GE in 1981.

Clearly we have some unlearning to do.

Moving Boldly Into An Uncertain Future

I’ve thought for some time that the 2020s would look a lot like the 1920s. That was the last time that we had such a convergence of technological, demographic and political forces at one time (and a pandemic as well!). Yet historical analogies can often be misleading. History is long and, if you look enough, you can find an analogy for almost anything.

It is certainly true that history seems to converge and cascade on particular moments and we seem to be at one of these moments now. We will need to unlearn much of what we thought we knew about shareholder value and other things as well. Yet correcting the mistakes of the past is never enough. We need to create anew.

The recently passed CHIPS Act is a good model for how to do this. Much of the $280 billion bill goes to tried-and-true programs in which we have under-invested in recent years, such as science programs at the NSF and the DOE, as well as programs that support manufacturing and, of course, subsidies to support semiconductors. We know these things work.

Yet other programs are experiments. Some, such as a new Technology Directorate at the NSF are controversial. Others, such as $10 billion that will be spent on regional technology hubs and $1 billion that will go to a RECOMPETE pilot program to empower distressed communities, are new and innovative. We can almost guarantee that there will be hiccups and outright failures along the way.

It is tautologically true that the well-trod path will take us nowhere new. We need to unlearn the past if we are to learn how to build a new future.

— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash
