
Why an AI Soft Landing Might Look Like Victorian England

LAST UPDATED: April 18, 2026 at 3:29 PM


by Braden Kelley and Art Inteligencia


The Mirage of the Post-Scarcity Utopia

For decades, the prevailing narrative surrounding artificial intelligence has been one of a post-scarcity “Star Trek” future. The logic was simple: as machines took over the labor, the dividends of automation would be harvested by the state and redistributed via Universal Basic Income (UBI), freeing humanity to pursue art, philosophy, and leisure.

The AI Promise vs. The Fiscal Reality

However, this utopian vision ignores the gravity of The Great American Contraction. As we approach 2026 and beyond, the friction between exponential technological growth and a $37 trillion+ national debt (with a $2 trillion annual budget deficit) creates a structural barrier to redistribution. When the tax base of human labor erodes, the math for a livable UBI simply fails to compute.

The Victorian Hypothesis

If UBI is rendered a mathematical and political impossibility by fiscal reality and by corporate and human greed, we must look toward an alternative “soft landing.” This hypothesis suggests a vertical restructuring of society. As AI drives the cost of production and the demand for goods into a deflationary spiral, the purchasing power of the remaining “employed elite” will skyrocket.

The result isn’t a horizontal distribution of wealth, but a return to a Neo-Victorian social hierarchy. In this reality, the new digital gentry will use their outsized wealth to employ a massive “servant class” to maintain stately homes and personal lives, creating a world where status is defined by the human labor one can afford to command.

Neo-Victorian Hypothesis Infographic

The Great American Contraction: Why UBI is a Non-Starter

The conversation around the transition to an AI-driven economy often treats Universal Basic Income as an inevitability — a safety net that will naturally catch those displaced by the silicon wave. However, this assumes a level of fiscal elasticity that no longer exists. We are entering The Great American Contraction, a period where the traditional levers of government spending are restricted by the sheer weight of historical obligation and systemic greed.

The Debt Ceiling of Compassion

With a national debt exceeding $37 trillion, a $2 trillion budget deficit, and rising interest rates, the federal government’s “room to maneuver” has effectively vanished. A livable UBI requires a massive, consistent tax base. As AI begins to hollow out the middle class, the very tax revenue needed to fund such a program disappears. To fund UBI under these conditions would require a level of sovereign borrowing that the global markets simply will not support, leading to a reality where the government cannot afford to be the savior of the displaced.

The Greed Variable

Even if the math were more favorable, the human element remains a constant. Corporate interests, focused on margin preservation and shareholder value, are unlikely to support the aggressive taxation required to fund a social floor. In the race to the bottom of production costs, the primary goal of the “winners” in the AI revolution will be wealth concentration, not social equity. The political willpower to force a massive transfer of wealth from AI-profiting corporations to the idle masses is a historical outlier that we should not count on repeating.

The Velocity of Displacement

Finally, the speed of the AI transition is its most disruptive feature. Legislative bodies move in years, while AI cycles move in weeks. By the time a political consensus for UBI could be formed, the economic floor will have already fallen out. This lag time creates a vacuum that will be filled not by government checks, but by a desperate search for subsistence, setting the stage for the return of the domestic labor economy.

The Deflationary Paradox: Collapse of Demand and Cost

In a traditional economy, unemployment deepens a recession, which policymakers typically counter with stimulus, producing either stagflation or a managed recovery. However, the AI-driven “soft landing” introduces a unique mechanical failure: the Deflationary Paradox. As AI and advanced robotics permeate every sector, the labor cost of producing goods and services begins to approach zero, but the pool of consumers capable of buying those goods simultaneously evaporates.

The Production Floor Drops

We are witnessing the end of the labor theory of value. When an AI can design, a robot can manufacture, and an automated fleet can deliver a product without a single human touchpoint, the marginal cost of production hits the floor. In a desperate bid to capture the dwindling “active” capital in the market, companies will engage in a race to the bottom, causing the prices of physical and digital goods to deflate at a rate unseen in modern history.

The Demand Vacuum

While cheap goods sound like a boon, they are a symptom of a deeper rot: the Demand Vacuum. As the middle class is hollowed out, the velocity of money slows to a crawl. The economy shifts from a mass-consumption model to a precision-consumption model. Most businesses will fail not because they can’t produce, but because there are no longer enough customers with a paycheck to buy, even at rock-bottom prices.

The Purchasing Power of the “Remaining”

This is where the Victorian shift begins. For the small percentage of Americans who retain their income — the innovators, the orchestrators, and the entrepreneurs — this deflationary environment is a golden age. Their dollars, fixed in value while the cost of everything else drops, suddenly possess exponential purchasing power. When a gallon of milk or a digital service costs mere pennies in relative terms, the “wealthy” find themselves with a massive surplus of capital that cannot be spent on “things” alone. This surplus will naturally be redirected toward the one thing that remains scarce and high-status: the dedicated service of another human being.
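The arithmetic behind this “exponential purchasing power” can be made concrete with a toy calculation. The deflation rate, salary, and horizon below are invented for illustration, not forecasts from the article; the point is only that a flat nominal income compounds into rising real income under sustained deflation.

```python
# Illustrative only: how fixed nominal income gains real purchasing power
# when the price level deflates. All figures are assumptions.

def real_purchasing_power(nominal_income: float,
                          annual_deflation_rate: float,
                          years: int) -> float:
    """Nominal income divided by the price level after `years` of deflation."""
    price_level = (1.0 - annual_deflation_rate) ** years
    return nominal_income / price_level

# A $200k salary held flat while prices fall 15% per year for 5 years
# more than doubles in real terms:
income = 200_000
multiplier = real_purchasing_power(income, 0.15, 5) / income
print(f"Real purchasing power multiplier after 5 years: {multiplier:.2f}x")
```

Under these assumed numbers the multiplier is roughly 2.25x, which is the surplus the article argues gets redirected toward scarce human service.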

The New “Stately Home” Economy

As the Deflationary Paradox takes hold, we will see a fundamental shift in the definition of luxury. In the pre-AI era, luxury was defined by the acquisition of high-tech gadgets or rare goods. In the Neo-Victorian era, where machines produce goods for nearly nothing, “luxury” will pivot back toward the human-centered experience. Status will no longer be measured by what you own, but by whose time you command.

From Software to Service

For the “In-Group” — those entrepreneurs and specialized leaders still generating significant revenue — capital will lose its utility in the digital marketplace. When software is free and manufactured goods are commoditized, wealth seeks the only remaining friction: human presence. We will see a massive migration of capital away from Silicon Valley “platforms” and toward the local domestic economy. The wealthy will stop buying more “things” and start buying “lives” — the total dedicated attention of house managers, chefs, valets, and tutors.

The Modern Manor

This economic shift will be physically manifested in the return of the Stately Home. These won’t just be houses; they will be complex ecosystems of employment. Large estates will once again become the primary employer for local communities. As traditional corporate offices vanish, the residence becomes the center of both social and economic power. These modern manors will require extensive human staffs to cook, clean, maintain grounds, and provide security — services that, while technically possible via robotics, will be performed by humans as a deliberate signal of the owner’s immense wealth and status.

The Return of the Domestic Professional

Perhaps the most jarring aspect of this transition will be the class of worker entering domestic service. We are not talking about a traditional blue-collar service shift, but the “Victorianization” of the former middle class. Displaced white-collar professionals — accountants, teachers, and middle managers — will find that their highest-paying opportunity is no longer in a cubicle, but in managing the complex domestic affairs, private education, and logistics of the new digital aristocracy. It is a “soft landing” in name only; while they may live in proximity to grandeur, their survival is entirely tethered to the whims of their employer.

Socio-Economic Stratification: The Two-Tiered Reality

The inevitable result of the “Victorian Soft Landing” is the formalization of a rigid, two-tiered social structure. Unlike the 20th century, which was defined by a fluid and expanding middle class, the post-contraction era will be characterized by extreme polarization. The economic “missing middle” creates a vacuum that forces every citizen into one of two distinct realities: the Digital Gentry or the Dependent Class.

The Corporate and Government Gentry

A small percentage of Americans — likely less than 10% — will remain tethered to the engines of primary wealth creation. This “In-Group” consists of high-level AI orchestrators, strategic entrepreneurs, and essential government officials who maintain the infrastructure of the state. Because their income is derived from high-margin automated systems while their cost of living has plummeted due to deflation, they possess a level of functional wealth that rivals the landed gentry of the 19th century. To this group, the “Great Contraction” is not a crisis, but a refinement of their dominance.

The Dependent Class

For those outside the digital fortress, the reality is stark. Without a national UBI to provide a floor, the majority of the population becomes the “Dependent Class.” Their economic utility is no longer found in the marketplace of ideas or manufacturing, but in the marketplace of personal service. In this neo-Victorian landscape, you either work for the companies that own the AI, work for the government that protects it, or you work directly for the individuals who do.

The Choice: Service or Scarcity

This stratification reintroduces a primal power dynamic into the American workforce. When the cost of basic survival (food and shelter) is low due to deflation, but the opportunity for independent income is zero, the wealthy gain total leverage. The “soft landing” is, in truth, a forced labor transition. Those who are not “useful” to the gentry — either as specialized labor or domestic support — face the grim reality of the Victorian workhouse era: they must find a patron to serve, or they will starve in a world of plenty.

Experience Design in the Neo-Victorian Era


From the perspective of experience design and futurology, the shift toward a Victorian-style social structure will fundamentally alter the aesthetic of status. In a world where AI can generate perfect, flawless goods and digital experiences at zero marginal cost, “perfection” becomes a commodity. Status, therefore, will be redesigned around human friction and intentional inefficiency.

The Aesthetic of Inequality

We will see a move away from the sleek, minimalist “Apple-esque” design of the early 21st century toward a more ornate, human-heavy luxury. Experience design for the elite will emphasize things that AI cannot authentically replicate: the slight imperfection of a hand-cooked meal, the presence of a uniformed gatekeeper, and the physical maintenance of vast, non-automated gardens. Architecture will pivot back to “human-centric” layouts — designing spaces not for efficiency, but to accommodate the movement and housing of a live-in staff.

Designing for Disconnect

The most challenging aspect of this new era will be the Experience of the Invisible. Designers will be tasked with creating systems that allow the Digital Gentry to interact with their environment without acknowledging the vast economic disparity surrounding them. This involves “Social UX” — designing layers of intermediation where the “Dependent Class” provides the comfort, but the “Gentry” only interacts with the result. It is a return to the “back-stairs” architecture of the 19th century, modernized for a digital age.

The UX of Survival

For the majority, the “User Experience” of daily life will be one of Hyper-Personal Patronage. Navigation of the economy will no longer be about interfaces or platforms, but about the “UX of Relationships.” Survival will depend on the ability to design one’s persona to be indispensable to a wealthy patron. In this reality, human-centered design takes on a darker, more literal meaning: the human becomes the product, the service, and the infrastructure all at once.

Conclusion: Preparing for the Retro-Future

The “Soft Landing” we are currently engineering is not the one we were promised. As the Great American Contraction forces a collision between astronomical debt and the deflationary power of AI, the middle-class dream of a subsidized leisure class is evaporating. In its place, we are seeing the blueprints of a Retro-Future — a world that looks forward technologically but moves backward socially.

A Call for Human-Centered Transition

If we continue to view innovation solely through the lens of efficiency and margin preservation, the Victorian outcome is not just possible — it is inevitable. We must realize that without a radical redesign of how we value human contribution beyond mere “market productivity,” we are simply building a more efficient feudalism. True Experience Design must now focus on the social fabric, or we risk creating a world where the only “innovation” left is finding new ways for the many to serve the few.

Final Thought: The Soft Landing Paradox

We must be careful what we wish for when we ask for a “seamless” transition. A landing that is “soft” for the Digital Gentry is one where the friction of poverty and the noise of the displaced have been successfully silenced by the return of the servant class. History doesn’t repeat, but it does rhyme — and right now, the future sounds remarkably like 1837. The question is no longer if AI will change our world, but whether we have the courage to design a future that doesn’t require us to retreat into our past.

Frequently Asked Questions

Why would prices deflate if the economy is struggling?

In this scenario, AI and robotics drive the marginal cost of production toward zero. Simultaneously, massive job displacement creates a “demand vacuum.” To capture what little liquid currency remains, companies must drop prices drastically, leading to a reality where goods are incredibly cheap but income is even scarcer.

How does this differ from the 20th-century middle class?

The 20th century was defined by a “horizontal” distribution where many people owned moderate assets. The Neo-Victorian model is “vertical.” The middle class disappears, replaced by a tiny, hyper-wealthy elite (Digital Gentry) and a large class of people who provide them with personalized human services (the Servant Class).

Isn’t UBI a more logical solution to AI displacement?

While logical in theory, the “Great American Contraction” hypothesis suggests that high national debt and corporate prioritization of margins make a livable UBI politically and fiscally impossible. Without a state-funded floor, the market defaults to the oldest form of social safety: personal patronage and domestic service.

EDITOR’S NOTE: This is a visualization of but one possible future. I will be publishing other possible futures as they crystallize in my mind (or as you suggest them for me to explore).

Image credits: Google Gemini

Content Authenticity Statement: The topic area, key elements to focus on, etc. were decisions made by Braden Kelley, with a little help from Google Gemini to clean up the article, add images and create infographics.


The Consumption Collapse – When the Feedback Loop Bites Back

Why the Great American Contraction is leading to a crisis of demand and a re-imagining of the American Social Contract.

LAST UPDATED: April 17, 2026 at 3:58 PM


GUEST POST from Art Inteligencia


The Ghost in the Shopping Mall

In our previous exploration, “The Great American Contraction,” we identified a fundamental shift in the American story. For the first time in our history, the foundational assumption of “more” — more people, more labor, and more expansion — has been inverted. We discussed how the exponential rise of AI and robotics is dismantling the traditional value chain of human labor, moving us from a nation of “doers” to a necessary, albeit smaller, elite class of “architects.”

However, as we move closer to the two-year horizon of the next United States Presidential election, a more insidious shadow is beginning to fall across the landscape. It is no longer just a crisis of employment; it has evolved into a crisis of consumption. This is the “Feedback Loop of Irrelevance.”

The logic is as cold as the algorithms driving it: As increasing numbers of knowledge workers and service providers are displaced by autonomous agents, their disposable income evaporates. When people lose their financial footing, they spend less. When they spend less, the revenue of the very companies that automated them begins to shrink. To protect their margins in a declining market, these companies are forced to cut back even further — often doubling down on automation to reduce costs — which in turn removes more consumers from the marketplace.

We are witnessing the birth of a deflationary death spiral where corporate efficiency threatens to cannibalize the very markets it was designed to serve. Over the next 24 months, this cycle will redefine the American psyche and set the stage for an election year unlike any we have ever seen.

It is time to look beyond the immediate shock of job loss and examine the structural integrity of our economic operating system. If the “Old Equation” of labor-for-income is a sinking ship, we must decide what happens to the passengers before we reach the horizon of 2028.

The Vicious Cycle of Automated Austerity

The transition from a growth-based economy to a Great Contraction is not a linear event; it is a recursive loop. As AI adoption accelerates, we are witnessing a phenomenon I call “Automated Austerity.” This is the process where short-term corporate gains from labor reduction lead directly to long-term market erosion. The cycle progresses through four distinct, overlapping phases:

Phase 1: The First Wave Displacement

We are currently seeing the replacement of both low-skilled physical labor and high-skilled knowledge work by autonomous systems. This isn’t just about factory floors; it’s about the “Architect” roles we once thought were safe. As companies replace $150k-a-year analysts with $15-a-month compute tokens, the immediate impact is a massive surge in corporate profit margins.

Phase 2: The Wallet Effect

The friction begins here. Displaced workers initially rely on savings or severance, but as those dry up, the “gig economy” safety net is nowhere to be found — because AI is already performing the freelance writing, coding, and administrative tasks that used to provide a bridge. Disposable income doesn’t just dip; for a significant percentage of the population, it vanishes. This causes a sharp contraction in discretionary spending.

Phase 3: The Revenue Mirage

This is the trap. Companies that automated to save money suddenly find their top-line revenue shrinking because their customers (the former workers) can no longer afford their products. The efficiency gains are real, but the market size is artificial. We are entering a period where companies may be 100% efficient at producing goods that 0% of the displaced population can buy.

Phase 4: The Secondary Contraction

Faced with shrinking revenues, boards of directors demand even deeper cost-cutting to protect investor dividends. This leads to a second, more desperate wave of layoffs, further reducing the tax base and consumer spending power. This feedback loop creates a Deflationary Death Spiral that traditional monetary policy is ill-equipped to handle.

“When you automate the consumer out of a job, you eventually automate the business out of a customer.” — Braden Kelley

Over the next two years, this cycle will move from the periphery of Silicon Valley to the heart of every American household, forcing a radical re-evaluation of how we distribute the abundance that AI creates.
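The four-phase loop above can be sketched as a toy simulation. Everything in it — the automation rate, the layoff sensitivity, the assumption that revenue simply tracks the remaining wage bill — is an illustrative assumption, not an empirical model; the sketch only shows how the recursion compounds once displacement feeds back into demand.

```python
# A stylized sketch of the "Automated Austerity" loop: displacement shrinks
# the wage bill, shrinking the wage bill shrinks revenue, and shrinking
# revenue triggers a secondary round of layoffs. Parameters are invented.

def simulate_contraction(workers: float, wage: float,
                         automation_rate: float,
                         layoff_sensitivity: float,
                         periods: int) -> list[float]:
    """Return the revenue path over `periods` rounds of the loop."""
    revenues = []
    revenue = workers * wage                        # revenue ~ wage bill spent back
    for _ in range(periods):
        workers *= (1.0 - automation_rate)          # Phase 1: first-wave displacement
        new_revenue = workers * wage                # Phase 2/3: the demand vacuum
        shortfall = max(0.0, revenue - new_revenue) / revenue
        workers *= (1.0 - layoff_sensitivity * shortfall)  # Phase 4: secondary cuts
        revenue = workers * wage
        revenues.append(revenue)
    return revenues

path = simulate_contraction(workers=1000, wage=50,
                            automation_rate=0.05,
                            layoff_sensitivity=0.5, periods=8)
# Revenue declines every period as the loop feeds on itself.
assert all(a > b for a, b in zip(path, path[1:]))
```

The design point is the `shortfall` term: without it the decline is a one-time efficiency gain, but with it each round of cuts seeds the next, which is what makes the spiral recursive rather than linear.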

Vicious Cycle of Automated Austerity

The Two-Year Horizon: 2026–2028

As we navigate the next twenty-four months, the gap between traditional economic indicators and the lived reality of American citizens will become a canyon. We are entering a period of Economic Bifurcation, where the distance between those who own the “compute” and those who formerly provided the “labor” creates a new social stratification.

The Rise of the ‘Hollow’ Recovery

Expect to hear the term “efficiency-led growth” frequently in the coming months. Wall Street may remain buoyant as AI-integrated corporations report record-breaking margins per employee. However, this is a hollow success. While the stock market reflects corporate optimization, our Alternative Economic Health Measures — like the Genuine Progress Indicator (GPI) — will likely show a steep decline. We are becoming a nation that is technically “wealthier” while the average citizen’s ability to participate in that wealth is structurally dismantled.

The Shift from ‘Doer’ to ‘Architect’ Burnout

The “Great American Contraction” is not just about those losing roles; it is about the immense pressure on those who remain. The survivors — the Architect Class — are tasked with managing sprawling AI ecosystems. This creates a new kind of cognitive load. By 2027, I predict we will see a peak in “Technological Burnout,” where the speed of AI-driven change outpaces the human capacity to design for it. This is where Human-Centered Innovation becomes a survival skill rather than a corporate luxury.

The Mindset of Survivalist Innovation

As the feedback loop of shrinking revenue intensifies, we will see American citizens taking radical actions to decouple from a failing labor market. This includes:

  • Hyper-Localization: A resurgence in local bartering and community-based resource sharing as a hedge against the volatility of the automated economy.
  • The ‘Off-Grid’ Digital Economy: Individuals utilizing open-source AI models to create value outside of the traditional corporate gatekeepers, leading to a “shadow economy” of peer-to-peer services.
  • Consumption Sabotage: A psychological shift where citizens, feeling irrelevant to the economy, consciously reduce their consumption to the bare essentials, further accelerating the contraction.

This period will be defined by a search for meaning in a post-labor world. The American citizen of 2027 is no longer asking “How do I get ahead?” but rather “How do I remain relevant in a world that no longer requires my effort to function?”

The Survivalist Innovation Framework

Beyond GDP: New Vitals for a Contracting Economy

As the “Old Equation” fails, the metrics we use to measure national success are becoming dangerously obsolete. In a world where AI can drive productivity while simultaneously hollowing out the consumer class, GDP is no longer a compass; it is a rearview mirror. To navigate the next two years, we must shift our focus to alternative economic health measures that prioritize human vitality over transactional velocity.

1. The Genuine Progress Indicator (GPI)

Unlike GDP, which counts the “cost of cleaning up a disaster” as a positive, the GPI factors in income inequality and the social costs of underemployment. As we move toward 2028, we must demand a GPI-centered view of the economy. If AI-driven efficiency creates wealth but destroys the social capital of our communities, the GPI will show we are regressing, providing a much-needed reality check to “hollow” stock market gains.

2. The U-7 ‘Utility’ Rate

Standard unemployment figures (U-3) are increasingly irrelevant. We need a U-7 ‘Utility’ Rate to track those who are “technologically displaced” — individuals whose roles have been absorbed by algorithms or whose wages have been suppressed to the point of working poverty. This metric will highlight the Architect Gap: the growing number of people who have the capacity for high-value human contribution but lack access to the compute resources required to compete.

3. The Social Progress Index (SPI)

The goal of an automated economy should be to improve the human condition. The SPI measures outcomes that actually matter: Access to advanced education, personal freedom, and environmental quality. By 2027, the SPI will be the most honest indicator of whether the Great Contraction is a managed transition to a better life or a chaotic collapse of the middle class.

4. Value of Organizational Learning Technologies (VOLT)

We must begin measuring the “Agility Score” of our nation. VOLT measures how effectively we are using AI to solve complex problems rather than just replacing workers. A high VOLT score paired with a low SPI suggests we are building a “learning machine” that has forgotten its purpose: to serve the humans who created it.

“A high-GDP nation with a crashing Social Progress Index (SPI) is merely a failed state in a gold tuxedo.”

The political battleground of the next two years will be defined by a new set of metrics similar to these (but likely different). The 2028 election will not just be a choice between candidates, but a choice between maintaining the illusion of growth or designing a system of sovereignty for the American citizen.
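As a rough illustration of how a GPI-style metric can diverge from a GDP-style one, here is a minimal sketch. The structure (inequality-discounted consumption, minus social costs, plus nonmarket benefits) follows the GPI’s general idea, but the function, weights, and inputs below are invented for illustration and are not the official GPI methodology.

```python
# Toy sketch of the GPI's core idea: start from consumption, discount it
# for inequality, subtract social costs GDP ignores, add nonmarket benefits.
# Inputs and weights are illustrative assumptions only.

def toy_gpi(personal_consumption: float,
            income_gini: float,          # 0 = perfect equality, 1 = max inequality
            social_costs: float,         # e.g. costs of underemployment
            nonmarket_benefits: float = 0.0) -> float:
    inequality_adjusted = personal_consumption * (1.0 - income_gini)
    return inequality_adjusted + nonmarket_benefits - social_costs

# The same "hollow recovery" seen two ways: a GDP-style lens counts the
# full 100 units of consumption; the GPI-style lens discounts for a high
# Gini and subtracts displacement costs.
gdp_like = 100.0
gpi_like = toy_gpi(personal_consumption=100.0, income_gini=0.45,
                   social_costs=20.0, nonmarket_benefits=5.0)
# gpi_like is roughly 100 * 0.55 + 5 - 20 = 40: well under half of gdp_like.
```

Under these assumed numbers, headline output holds steady while the welfare-adjusted figure collapses, which is precisely the “hollow recovery” gap the section describes.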


The Sovereign Tech-Stack & The Localized Pivot

As the “Feedback Loop of Irrelevance” continues to shrink traditional income, we are witnessing a radical grassroots response: The Localized Pivot. When the macro-economy fails to provide value to the individual, the individual stops providing value to the macro-economy and turns inward to their community.

The Rise of the ‘Personal AI’ Infrastructure

By 2027, the barrier to entry for sophisticated production will vanish. We will see a surge in “Sovereign Tech-Stacks” — individuals and small collectives using localized, open-source AI models to run micro-manufactories, automated vertical farms, and peer-to-peer service networks. This is Innovation as a Survival Tactic. These citizens are essentially “unplugging” from the hollowed-out corporate ecosystem and creating a shadow economy that traditional GDP cannot track.

From Global Chains to Hyper-Local Resilience

The contraction of consumer spending will lead to the death of the “long supply chain” for many goods. In its place, we will see the rise of Regional Circular Economies. AI will be used not to maximize global profit, but to optimize local resource sharing. Imagine community AI agents that manage local energy grids or coordinate the bartering of skills — human-centered design at its most fundamental level.

The ‘Architect’ of the Commons

In this phase, the “Architect” role I’ve discussed previously becomes a civic one. These are the individuals who design the systems that keep their communities thriving while the national revenue shrinks. They are the ones building the Human-Centered Guardrails that ensure technology serves the neighborhood, not the shareholder. This shift represents a move from Global Consumerism to Local Sovereignty.

“When the national economic engine stops fueling the household, the household must build its own engine, or it dies.” — Braden Kelley

This localized movement will be the wild card of 2028. It creates a class of “Un-Architected” citizens who are no longer dependent on the federal government or major corporations, creating a profound tension for any political candidate trying to promise a return to the ‘Old Equation’.

The Road to 2028: The Politics of Human Relevance

As we approach the next Presidential election, the political discourse will undergo a seismic shift. The traditional “Left vs. Right” battle lines over tax rates and social issues will be superseded by a more existential debate: The Individual vs. The Algorithm. The 2028 election will likely be the first in history centered entirely on the consequences of a post-labor economy.

The ‘Humanity First’ Tax and Sovereign Solvency

The most contentious issue will be how to fund a shrinking state as the labor-based tax system collapses. We will see the rise of the “Compute Tax” — a proposal to tax AI tokens and robotic output rather than human hours. This isn’t just about revenue; it’s about sovereign solvency. When companies reinvest profits into compute rather than wages, the “Economic OS” crashes. Expect candidates to run on a platform of Universal Basic Everything (UBE) — providing the results of automation (healthcare, housing, and energy) directly to the people as the tax base from labor vanishes.

The Compute Tax

The Death of Traditional Immigration Debates

As I noted in our initial look at the Contraction, the old argument about immigrants “taking jobs” or “filling gaps” is dead. In 2028, the focus will shift to “Strategic Talent Acquisition.” The debate will center on how to attract the world’s few remaining irreplaceable “Architect” minds while managing a domestic population that is increasingly surplus to the needs of capital. This will create a strange political alliance between protectionists and humanists, both seeking to shield human value from digital devaluation.

Mindset and Likely Actions of the Citizenry

By the time voters head to the polls, the American mindset will have shifted from aspiration to preservation. We are likely to see:

  • The Rise of ‘Neo-Luddite’ Activism: Not a rejection of technology, but a demand for “Human-Centered Guardrails” that prevent AI from cannibalizing the last remaining sectors of human connection.
  • The Search for Non-Monetary Meaning: A surge in candidates who focus on “Quality of Life” metrics rather than fiscal growth, appealing to a class of people who no longer derive their identity from their “job.”
  • Algorithmic Populism: Politicians using AI to personalize fear and hope at scale, creating a feedback loop where the technology used to displace the worker is also used to win their vote.

The central question of the 2028 election will be simple but devastating: “What is a country for, if not to support the thriving of its people — even when those people are no longer ‘productive’ in a traditional sense?” The winner will be the one who can design a new social contract for a smaller, more resilient, and truly innovative nation.

Conclusion: Designing a Thrivable Contraction

The Great American Contraction is no longer a theoretical “what-if” for futurists to debate; it is an active restructuring of our reality. As the feedback loop of automated austerity begins to bite, we are discovering that a country built on the relentless pursuit of “more” is fundamentally ill-equipped to handle the arrival of “enough.”

The next two years will be a period of intense friction as our legacy systems — our tax codes, our education models, and our social safety nets — grind against the frictionless efficiency of the AI era. We will see traditional economic metrics fail to capture the quiet struggle of the consumer, and we will watch as the 2028 election turns into a referendum on the value of a human being in a post-labor world.

But contraction does not have to mean collapse. If we shift our focus from transactional velocity to human vitality, we have the opportunity to design a new version of the American Dream. This new dream isn’t about the quantity of jobs we can protect from the machines, but the quality of the lives we can build with the abundance those machines create. It is about moving from a nation of “doers” who are exhausted by the grind to a nation of “architects” who are inspired by the possible.

“The goal of innovation was never to replace the human; it was to release the human. We are finally being forced to decide what we want to be released to do.” — Braden Kelley

The road to 2028 will be defined by whether we choose to cling to the wreckage of the growth-based model or whether we have the courage to embrace a smaller, smarter, and more human-centered future. The contraction is inevitable, but the outcome is ours to design.

STAY TUNED: On Tuesday my friend Braden Kelley (with a little help from me) is publishing an article featuring one hypothesis for what an AI SOFT LANDING might look like.

Image credits: Google Gemini

Subscribe to Human-Centered Change & Innovation Weekly. Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.

When Survival Crowds Out Creativity: How Affordability Crises Undermine Innovation

An exploration of how rising costs of living reduce cognitive surplus, suppress innovation, and limit organizational and societal progress.

LAST UPDATED: January 19, 2026 at 4:43 PM


GUEST POST from Art Inteligencia

I am frequently asked about the ingredients of a successful innovation ecosystem. We talk about venture capital, high-speed internet, patent laws, and university partnerships. But we rarely talk about the most fundamental requirement of all: human physiological and psychological security.

Innovation is not a purely intellectual exercise; it is an emotional and biological one. It requires a specific state of mind — one that is open, curious, and willing to embrace the possibility of failure. However, when a society faces systemic affordability challenges — skyrocketing rents, food insecurity, and the crushing weight of debt — we are effectively taxing the cognitive bandwidth of our greatest resource: people.

“Innovation is not a luxury of the elite, but a byproduct of a society that provides its citizens enough stability to dream. When we price people out of their basic needs, we price ourselves out of our future.” — Braden Kelley


The Cognitive Tax of Scarcity

To understand why affordability kills innovation, we must look at how the human brain functions under stress. Human-centered innovation is rooted in the idea that people solve problems when they have the mental “slack” to do so. When an individual is constantly calculating how to cover a 30% increase in rent or skipping meals to pay for childcare, they are operating in survival mode.

In survival mode, the brain’s prefrontal cortex — the center for higher-order thinking, long-term planning, and creative synthesis — takes a backseat to the amygdala. We become more reactive, more short-term focused, and significantly more risk-averse. You cannot disrupt an industry when you are terrified of an eviction notice.

This “scarcity mindset” creates a hidden drain on productivity and creativity. It is a form of Innovation Debt that we are accruing as a society, where the interest is paid in ideas that were never born because the potential innovators were too exhausted to think of them.

In organizations, this manifests as:

  • Employees avoiding bold ideas for fear of failure
  • Reduced participation in innovation programs
  • Higher burnout and turnover among creative talent
  • A preference for incrementalism over experimentation

“Innovation requires slack — slack in time, money, attention, and emotional safety. When survival becomes the primary occupation, imagination is the first casualty.” — Braden Kelley


Case Study 1: The Silicon Valley “Talent Flight”

The Situation

For decades, Silicon Valley was the undisputed epicenter of global innovation. However, by the early 2020s, the median home price in the region exceeded $1.5 million. While established tech giants could afford to pay engineers high salaries, the support ecosystem — the teachers, the artists, the junior researchers, and the “garage tinkerers” — could not.

The Innovation Impact

Innovation thrives on cross-pollination. When only the wealthy can afford to live in a hub, the diversity of thought collapses. We began to see a “homogenization of innovation,” where new startups focused almost exclusively on problems faced by high-income individuals (e.g., luxury delivery apps) rather than solving systemic human challenges. The high cost of living created a barrier to entry that effectively barred the next generation of “scrappy” innovators who didn’t have a safety net or venture backing.

The Result

Data showed a significant migration of talent to “secondary” hubs like Austin, Denver, and Lisbon. While this decentralization has benefits, the initial friction and lost momentum in the primary hub represented a massive opportunity cost for breakthrough research that requires physical proximity and intense collaboration.


The Death of the “Garage Startup”

The “garage startup” is a cherished myth in innovation circles, but it relies on a very real economic reality: the availability of low-cost, low-risk space. Hewlett-Packard, Apple, and Google all started in spaces that were relatively cheap to rent or own.

In today’s urban environments, that “low-risk space” has vanished. When every square foot of a city is optimized for maximum real estate yield, there is no room for the inefficient, messy work of early-stage experimentation. We are replacing “maker spaces” with luxury condos, and in doing so, we are dismantling the physical infrastructure of the Fail Fast philosophy. If the cost of your “lab” (your garage or basement) is $3,000 a month, you cannot afford to fail. And if you cannot afford to fail, you will never truly innovate.


Case Study 2: Food Insecurity in the Academic Pipeline

The Situation

A 2023 study of graduate students in North America revealed that nearly 30% experienced some form of food insecurity. These are the individuals tasked with the most rigorous scientific and social research — the literal “R” in R&D.

The Innovation Impact

Graduate students are the primary engine of university-led innovation. When these researchers spend their nights worrying about calorie counts instead of quantum counts, the quality of research suffers. The persistence required to push through a failed experiment is diminished when physical health is compromised.

The Result

Universities noted a decline in “high-risk, high-reward” thesis topics. Students began gravitating toward “safe” research areas with guaranteed funding or clear paths to corporate employment to pay off student loans and eat. The “Failure Budget” for these young innovators was effectively zero, leading to a stifling of the very exploratory research that historically leads to major scientific breakthroughs.


Case Study 3: A Manufacturing Firm’s Productivity Paradox

A mid-sized manufacturing company invested heavily in digital transformation and innovation training, yet saw minimal improvement in idea generation or experimentation. Leadership initially blamed culture and skills.

A deeper assessment revealed a different root cause: nearly 40 percent of the workforce was experiencing food or housing insecurity. Employees were working second jobs, skipping medical care, and managing chronic stress.

The company shifted strategy. It introduced wage stabilization, subsidized meals, and emergency financial support. Within twelve months, participation in continuous improvement programs doubled, and frontline innovation proposals increased by over 60 percent.

Innovation did not fail due to lack of tools. It failed due to lack of breathing room.


Why Affordability Shapes Risk Appetite

Innovation requires people to take risks that may not pay off immediately. But when the margin for error is razor-thin, risk becomes reckless rather than courageous.

Employees who fear eviction or medical debt are far less likely to:

  • Challenge entrenched assumptions
  • Experiment with unproven ideas
  • Advocate for long-term investments
  • Speak candidly about systemic flaws

Affordability challenges quietly turn organizations into compliance machines rather than learning systems.


Conclusion: A Call for Human-Centered Policy

If we want to maintain a competitive edge in a rapidly changing world, we must view affordability as an innovation policy. Rent control, affordable housing, student debt relief, and food security are not just “social issues”; they are the foundational layers of a healthy innovation funnel.

We need to create “slack” in our systems. We need to ensure that the next great thinker is not working three gig-economy jobs just to keep the lights on. As leaders, we must advocate for a world where people are free to use their entire brain for the work of change, rather than wasting half of it on the math of survival.

True innovation starts with a simple human truth: A mind preoccupied with where to sleep cannot dream of how to fly.


Frequently Asked Questions

Q: How do high housing costs impact an organization’s innovation potential?

A: High housing costs force talent to relocate or spend a disproportionate amount of cognitive energy on survival. This reduces “cognitive bandwidth,” making employees more risk-averse and less likely to engage in the creative problem-solving or “intrapreneurship” required for organizational growth.

Q: What is the “Cognitive Tax” of affordability challenges?

A: The cognitive tax is the mental drain caused by financial stress. When individuals are worried about basic needs like food and rent, their prefrontal cortex — the area responsible for complex decision-making and creativity — is overwhelmed by the stress of survival, effectively lowering their functional IQ and creative output.

Q: Can innovation survive in an environment of economic scarcity?

A: While scarcity can occasionally breed “frugal innovation,” systemic affordability challenges generally stifle breakthrough innovation. Breakthroughs require “slack” — time, resources, and mental space — to experiment and fail. Without basic economic security, individuals cannot afford the risk of failure.

Disclaimer: This article speculates on the potential future direction of society based on current factors. It is hard to predict whether commercial, political and charitable organizations will respond in ways sufficient to alter the course of history or not.

Image credits: ChatGPT


We Are Starving Our Innovation Economy


GUEST POST from Greg Satell

The Cold War was fundamentally different from any conflict in history. It was fought less over land, blood and treasure than over ideas. Communist countries believed that their ideology would prevail. They were wrong. The Berlin Wall fell and capitalism, it seemed, was triumphant.

Today, however, capitalism is in real trouble. Besides the threat of a rising China, the system seems to be crumbling from within. Income inequality in developed countries is at 50-year highs. In the US, the bastion of capitalism, markets have weakened by almost every imaginable metric. This wasn’t what we imagined winning would look like.

Yet we can’t blame capitalism. The truth is that its earliest thinkers warned about the potential for excesses that lead to market failure. The fact is that we did this to ourselves. We believed that we could blindly leave our fates to market and technological forces. We were wrong. Prosperity doesn’t happen by itself. We need to invest in an innovation economy.

Capitalism’s (Seemingly) Fatal Contradiction

Anyone who’s taken an “Economics 101” course knows about Adam Smith and his invisible hand. Essentially, the forces of self-interest, by their very nature, work to identify the optimal price that attracts just enough supply of a particular good or service to satisfy demand. This magical equilibrium point creates prosperity through an optimal use of resources.

However, some argued that the story wasn’t necessarily a happy one. After all, equilibrium implies a lack of economic profit and certainly businesses would want to do better than that. They would seek to gain a competitive advantage and, in doing so, create surplus value, which would then be appropriated to accumulate power to rig the system further in their favor.

Indeed, Adam Smith himself was aware of this danger. “People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices,” he wrote. In fact, the preservation of free markets was a major concern that ran throughout his work.

Yet as the economist Joseph Schumpeter pointed out, with innovation the contradiction dissipates. As long as we have creative destruction, market equilibriums are constantly shifting and don’t require capitalists to employ extractive, anti-competitive practices in order to earn excellent profits.

Two Paths To Profit

Anyone who manages a business must pursue at least one of two paths to profit. The first is to innovate. By identifying and solving problems in a competitive marketplace, firms can find new ways to create, deliver and capture value. Everybody wins.

Google’s search engine improved our lives in countless ways. Amazon and Walmart have dramatically improved distribution of goods throughout the economy, making it possible for us to pay less and get more. Pfizer and Moderna invested in an unproven technology that uses mRNA to deliver life-saving molecules and saved us from a deadly pandemic.

Still, the truth is that the business reality is not “innovate or die,” but rather “innovate or find ways to reduce competition.” There are some positive ways to tilt the playing field, such as building a strong brand or specializing in a niche market. However, other strategies are not so innocent. They seek to profit by imposing costs on the rest of us.

The first, called rent seeking, involves businesses increasing profits by getting legislation passed in their favor, as when car dealerships in New Jersey sued to block Tesla’s direct sales model. The second, regulatory capture, seeks to co-opt the agencies that are supposed to govern an industry, resulting in favorable implementation and enforcement of the legal code.

Why “Pro-Business” Often Means Anti-Market

Corporations lobby federal, state and local governments to advance their interests and there’s nothing wrong with that. Elected officials should be responsive to their constituents’ concerns. That is, after all, how democracy is supposed to work. However, very often business interests try to maintain that they are arguing for the public good rather than their own.

Consider the issue of a minimum wage. Businesses argue that government regulation of wages is an imposition on the free market and that, given the magical forces of the invisible hand, letting the market set the price for wages would produce optimal outcomes. Artificially increasing wages, on the other hand, would unduly raise prices on the public and reduce profits needed to invest in competitiveness.

This line of argument is nothing new, of course. In fact, Adam Smith addressed it in The Wealth of Nations nearly 250 years ago:

Our merchants and master-manufacturers complain much of the bad effects of high wages in raising the price, and thereby lessening the sale of their goods both at home and abroad. They say nothing concerning the bad effects of high profits. They are silent with regard to the pernicious effects of their own gains. They complain only of those of other people.

At the same time, corporations have themselves been undermining the free market for wages through the abuse of non-compete agreements. Incredibly, 38% of American workers have signed some form of non-compete agreement. Most of these are illegal and wouldn’t hold up in court, but they serve to intimidate employees, especially low-wage workers.

That’s just for starters. Everywhere you look, free markets are under attack. Occupational licensing, often the result of lobbying by trade associations, has increased five-fold since the 1950s. Antitrust regulation has become virtually nonexistent, while competition has been reduced in the vast majority of American industries.

Perhaps not surprisingly, while all this lobbying has been going on, recent decades have seen business investment and innovation decline and productivity growth falter, while new business formation has fallen by 50%. Corporate profits, on the other hand, are at record highs.

Getting Back On Track

At the end of World War II, America made important investments to create the world’s greatest innovation economy. The GI Bill made what is perhaps the biggest investment ever in human capital, sending millions to college and creating a new middle class. Investments in institutions such as the National Science Foundation (NSF) and the National Institutes of Health (NIH) would create scientific capital that would fuel US industry.

Unfortunately, we abandoned that very successful playbook. College tuition in the US has roughly doubled over the past 20 years, and perhaps not surprisingly, we’ve fallen to ninth among OECD countries for post-secondary education. Those who do graduate are often forced into what is essentially decades of indentured servitude in the form of student loans.

At the same time, government investment in research as a percentage of GDP has been declining for decades, limiting our ability to produce the kinds of breakthrough discoveries that lead to exciting new industries. What passes for innovation these days displaces workers, but does not lead to significant productivity gains. Legislation designed to rectify the situation and increase our competitiveness stalled in the Senate.

So after 250 years, capitalism remains pretty much as Adam Smith first conceived it: powerful yet fragile, always at risk of being undermined and corrupted by the same animal spirits it depends on to set prices efficiently. He never wrote, nor is there any indication he ever intended, that markets should be left to their own devices. In fact, he and others warned us that markets need to be actively promoted and protected.

We are free to choose. We need to choose more wisely.

— Article courtesy of the Digital Tonto blog
— Image credits: Microsoft CoPilot


We Need to Solve the Productivity Crisis


GUEST POST from Greg Satell

When politicians and pundits talk about the economy, they usually do so in terms of numbers. Unemployment is too high or GDP is too low. Inflation should be at this level or at that. You get the feeling that somebody somewhere is turning knobs and flicking levers in order to get the machine humming at just the right speed.

Yet the economy is really about our well being. It is, at its core, our capacity to produce goods and services that we want and need, such as the food that sustains us, the homes that shelter us and the medicines that cure us, not to mention all of the little niceties and guilty pleasures that we love to enjoy.

Our capacity to generate these things is determined by our productive capacity. Despite all the hype about digital technology creating a “new economy,” productivity growth over the past 50 years has been persistently sluggish. If we are going to revive it and improve our lives, we need to renew our commitment to scientific capital, human capital and free markets.

Restoring Scientific Capital

In 1945, Vannevar Bush delivered a report, Science, The Endless Frontier, which argued that the US government needed to invest in “scientific capital” through basic research and scientific education. It set in motion a number of programs that laid the groundwork for America’s technological dominance during the second half of the century.

Bush’s report led to the development of America’s scientific infrastructure, including agencies such as the National Science Foundation (NSF), National Institutes of Health (NIH) and DARPA. Others, such as the National Labs and science programs at the Department of Agriculture, also contribute significantly to our scientific capital.

The results speak for themselves and returns on public research investment have been shown to surpass those in private industry. To take just one example, it has been estimated that the $3.8 billion invested in the Human Genome Project resulted in nearly $800 billion in economic impact and created over 300,000 jobs in just the first decade.
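A quick back-of-the-envelope check shows just how striking the figures cited above are (the dollar amounts are the estimates quoted in this article, not an independent calculation):

```python
# Rough return multiple on the Human Genome Project, using the
# figures cited above: $3.8 billion invested, ~$800 billion in
# economic impact over the first decade.
investment = 3.8e9   # public dollars invested
impact = 800e9       # estimated economic impact

multiple = impact / investment
print(f"Each public research dollar returned roughly ${multiple:.0f}")
```

Even if the impact estimate were off by half, the return would still dwarf typical private-sector investment returns.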

Unfortunately, we forgot those lessons. Government investment in research as a percentage of GDP has been declining for decades, limiting our ability to produce the kinds of breakthrough discoveries that lead to exciting new industries. What passes for innovation these days displaces workers, but does not lead to significant productivity gains.

So the first step to solving the productivity puzzle would be to renew our commitment to investing in the type of scientific knowledge that, as Bush put it, can “turn the wheels of private and public enterprise.” There was a bill before Congress to do exactly that, but unfortunately it got bogged down in the Senate due to infighting.

Investing In Human Capital

Innovation, at its core, is something that people do, which is why education was every bit as important to Bush’s vision as investment was. “If ability, and not the circumstance of family fortune, is made to determine who shall receive higher education in science, then we shall be assured of constantly improving quality at every level of scientific activity,” he wrote.

Programs like the GI Bill delivered on that promise. We made what is perhaps the biggest investment ever in human capital, sending millions to college and creating a new middle class. American universities, considered far behind their European counterparts earlier in the century, especially in the sciences, came to be seen as the best in the world by far.

Today, however, things have gone horribly wrong. A recent study found that about half of all college students struggle with food insecurity, which may help explain why only 60% of students at 4-year institutions, and even fewer at community colleges, ever earn a degree. The ones who do graduate are saddled with decades of debt.

So the bright young people we don’t starve, we condemn to decades of what is essentially indentured servitude. That’s no way to run an entrepreneurial economy. In fact, a study by the Federal Reserve Bank of Philadelphia found that student debt has a measurable negative impact on new business creation.

Recommitting Ourselves To Free and Competitive Markets

There is no principle more basic to capitalism than that of free markets, which provide the “invisible hand” to efficiently allocate resources. When market signals get corrupted, we get less of what we need and more of what we don’t. Without vigorous competition, firms feel less of a need to invest and innovate, and become less productive.

There is abundant evidence that this is exactly what has happened. Since the late 1970s antitrust enforcement has become lax, ushering in a new gilded age. While digital technology was hyped as a democratizing force, over 75% of industries have seen a rise in concentration since the late 1990s, which has led to a decline in business dynamism.

The problem isn’t just monopoly power over consumers, but also monopsony, the domination of suppliers by buyers, especially in labor markets. There is increasing evidence of collusion among employers designed to keep wages low, along with astonishing abuse of non-compete agreements, which have affected more than a third of the workforce.

In a sense, this is nothing new. Adam Smith himself observed in The Wealth of Nations that “Our merchants and master-manufacturers complain much of the bad effects of high wages in raising the price, and thereby lessening the sale of their goods both at home and abroad. They say nothing concerning the bad effects of high profits. They are silent with regard to the pernicious effects of their own gains. They complain only of those of other people.”

Getting Back On Track

In the final analysis, solving the productivity puzzle shouldn’t be that complicated. It seems that everything we need to do we’ve done before. We built a scientific architecture that remains unparalleled even today. We led the world in educating our people. American markets were the most competitive on the planet.

Yet somewhere we lost our way. Beginning in the early 1970s, we started reducing our investment in scientific research and public education. In the early 1980s, the Chicago school of competition law started to gain traction and antitrust enforcement began to wane. Since 2000, competitive markets in the United States have been in serious decline.

None of this was inevitable. We made choices and those choices had consequences. We can make other ones. We can choose to invest in discovering new knowledge, to educate our children without impoverishing them, to demand that our industries compete, and to hold our institutions to account. We’ve done these things before and can do so again.

All that’s left is the will and the understanding that the economy doesn’t exist in the financial press, on the floor of the stock markets or in the boardrooms of large corporations, but in our own welfare as well as in our ability to actualize our potential and realize our dreams. Our economy should be there to serve our needs, not the other way around.

— Article courtesy of the Digital Tonto blog
— Image credits: Unsplash


The Digital Revolution Has Been A Giant Disappointment


GUEST POST from Greg Satell

One of the most often repeated episodes in the history of technology is when Steve Jobs was recruiting John Sculley from his lofty position as CEO of Pepsi to come to Apple. “Do you want to sell sugar water for the rest of your life,” Jobs asked, “or do you want to come with me and change the world?”

It’s a strange conceit of digital denizens that their businesses are something nobler than other industries. While it is true that technology can do some wonderful things, if the aim of Silicon Valley entrepreneurs was truly to change the world, why wouldn’t they apply their formidable talents to something like curing cancer or feeding the hungry?

The reality, as economist Robert Gordon explains in The Rise and Fall of American Growth, is that the measurable impact has been relatively meager. According to the IMF, except for a relatively short burst in growth between 1996 and 2004, productivity has been depressed since the 1970s. We need to rethink how technology impacts our world.

The Old Productivity Paradox

In the 1970s and 80s, business investment in computer technology was increasing by more than 20% per year. Strangely, though, productivity growth decreased during the same period. Economists found this turn of events so puzzling that they called it the productivity paradox to underline their confusion.

The productivity paradox dumbfounded economists because it violated a basic principle of how a free market economy is supposed to work. If profit seeking businesses continue to make substantial investments, you expect to see a return. Yet with IT investment in the 70s and 80s, firms continued to increase their investment with negligible measurable benefit.

A paper by researchers at the University of Sheffield sheds some light on what happened. First, productivity measures were largely developed for an industrial economy, not an information economy. Second, the value of those investments, while substantial, was a small portion of total capital investment. Third, businesses weren’t necessarily investing to improve productivity, but to survive in a more demanding marketplace.

Yet by the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance. The mystery of the productivity paradox, it seemed, had been solved. We just needed to wait for the technology to hit critical mass.

The New Productivity Paradox

By 2004, the law of increasing returns was there for everyone to see. Google already dominated search, Amazon ruled e-commerce, Apple would go on to dominate mobile computing and Facebook would rule social media. Yet as the dominance of the tech giants grew, productivity would once again fall to depressed levels.

Today, more than a decade later, we’re in the midst of a second productivity paradox, just as mysterious as the first one. New technologies like mobile computing and artificial intelligence are there for everyone to see, but they have done little, if anything, to boost productivity.

At the same time, the power of digital technology is diminishing. Moore’s law, the decades-old paradigm of continuous doubling in computer processing power, is slowing down and will soon end completely. Without advancement in the underlying technology, it is hard to see how digital technology will ever power another productivity boom.

Considering the optimistic predictions of digital entrepreneurs like Steve Jobs, this is incredibly disappointing. Compare the meager eight years of elevated productivity that digital technology produced with the 50-year boom in productivity created in the wake of electricity and internal combustion and it’s clear that digital technology simply doesn’t measure up.

The Baumol Effect, The Clothesline Paradox and Other Headwinds

Much like the first productivity paradox, it’s hard to determine exactly why the technological advancement over the last 15 years has amounted to so little. Most likely, it is not one factor in particular, but the confluence of a number of them. Increasing productivity growth in an advanced economy is no simple thing.

One possibility for the lack of progress is the Baumol effect, the principle that some sectors of the economy are resistant to productivity growth. For example, despite the incredible efficiency that Jeff Bezos has produced at Amazon, his barber still only cuts one head of hair at a time. In a similar way, sectors like healthcare and education, which require a large amount of labor inputs that resist automation, will act as a drag on productivity growth.
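The mechanism behind the Baumol effect is easy to see with a little arithmetic: aggregate productivity growth is roughly an output-weighted average of sector growth rates, so as spending shifts toward slow-growth sectors, the aggregate number falls even if no individual sector slows down. The sector shares and growth rates below are hypothetical illustrations, not measured data:

```python
# Stylized illustration of the Baumol effect: economy-wide productivity
# growth as an output-weighted average of sector growth rates.
def aggregate_growth(sectors):
    """Weighted-average productivity growth across (share, rate) sectors."""
    return sum(share * rate for share, rate in sectors)

# Manufacturing-heavy economy: 70% in a fast sector (3%/yr), 30% slow (0.5%/yr)
era1 = aggregate_growth([(0.7, 0.03), (0.3, 0.005)])

# Services-heavy economy: the same sector rates, but shares reversed
era2 = aggregate_growth([(0.3, 0.03), (0.7, 0.005)])

print(f"{era1:.2%} vs {era2:.2%}")  # growth falls even though no sector slowed
```

In other words, the barber cutting one head of hair at a time doesn’t just stay slow; as services claim a larger share of output, he drags the average down with him.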

Another factor is the Clothesline paradox, which gets its name from the fact that when you dry your clothes in a machine, it figures into GDP data, but when you hang them on a clothesline, no measurable output is produced. In much the same way, when you use a smartphone to take pictures or to give you directions, there is considerable benefit that doesn’t result in any financial transactions. In fact, because you use less gas and don’t develop film, GDP decreases somewhat.

Additionally, the economist Robert Gordon, mentioned above, notes six headwinds to economic growth, including aging populations, limits to increasing education, income inequality, outsourcing, environmental costs due to climate change and rising household and government debt. It’s hard to see how digital technology will make a dent in any of these problems.

Technology is Never Enough to Change the World

Perhaps the biggest reason that the digital revolution has been such a big disappointment is because we expected the technology to largely do the work for us. While there is no doubt that computers are powerful tools, we still need to put them to good use and we have clearly missed opportunities in that regard.

Think about what life was like in 1900, when the typical American family didn’t have access to running water, electricity or gas powered machines such as tractors or automobiles. Even something simple like cooking a meal took hours of backbreaking labor. Yet investments in infrastructure and education combined with technology to produce prosperity.

Today, however, there is no comparable effort to invest in education and healthcare for those who cannot afford it, to limit the effects of climate change, to reduce debt or to do anything of significance to mitigate the headwinds we face. We are awash in nifty gadgets, but in many ways we are no better off than we were 30 years ago.

None of this was inevitable; rather, it is the result of choices that we have made. We can, if we really want to, make different choices in the days and years ahead. What I hope we have learned from our digital disappointments is that technology itself is never enough. We are truly the masters of our fate, for better or worse.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly. Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

Humans, Not Technology, Drive Business Success

GUEST POST from Greg Satell

Silicon Valley is often known as a cut-throat, technocratic place where the efficiency of algorithms defines success. Competition is ferocious and the pace of disruption and change can be dizzying. It’s not the type of environment where soft skills are valued particularly highly, if at all.

So, it’s somewhat ironic that Bill Campbell became a Silicon Valley legend by giving hugs and professing love to those he worked with. As coach to executives ranging from Steve Jobs to the entire Google executive team, Campbell preached and practiced a very personal style of business.

Yet while I was reading Trillion Dollar Coach in which former Google executives explain Campbell’s leadership principles, it became clear why he had such an impact. Even in Silicon Valley, technology will only take you so far. The success of a business ultimately depends on the success of the people in it. To compete over the long haul, that’s where you need to focus.

The Efficiency Paradox

In 1911, Frederick Winslow Taylor published The Principles of Scientific Management, based on his experience as a manager in a steel factory. It took aim at traditional management methods and suggested a more disciplined approach. Rather than have workers pursue tasks in their own manner, he sought to find “the one best way” and train accordingly.

Taylor wrote, “It is only through enforced standardization of methods, enforced adoption of the best implements and working conditions, and enforced cooperation that this faster work can be assured. And the duty of enforcing the adoption of standards and enforcing this cooperation rests with management alone.”

Before long, Taylor’s ideas became gospel, spawning offshoots such as scientific marketing, financial engineering and the Six Sigma movement. It was no longer enough to simply work hard; you had to measure, analyze and optimize everything. Over the years these ideas have become so central to business thinking that they are rarely questioned.

Yet management guru Henry Mintzberg has pointed out how a “by-the-numbers” depersonalized approach can often backfire. “Managing without soul has become an epidemic in society. Many managers these days seem to specialize in killing cultures, at the expense of human engagement.”

The evidence would seem to back him up. One study found that of 58 large companies that announced Six Sigma programs, 91 percent trailed the S&P 500 in stock performance. That, in essence, is the efficiency paradox. When you manage only what you can measure, you end up ignoring key factors in success.

How Generosity Drives Innovation

While researching my book, Mapping Innovation, I interviewed dozens of top innovators. Some were world class scientists and engineers. Others were high level executives at large corporations. Still others were highly successful entrepreneurs. Overall, it was a pretty intimidating group.

So, I was surprised to find that, with few exceptions, they were some of the kindest and most generous people I have ever met. The behavior was so consistent that I felt that it couldn’t be an accident. So I began to research the matter further and found that when it comes to innovation, generosity really is a competitive advantage.

For example, one study of star engineers at Bell Labs found that the best performers were not the ones with the best academic credentials, but those with the best professional networks. A similar study of the design firm IDEO found that great innovators essentially act as brokers able to access a diverse array of useful sources.

A third study helps explain why knowledge brokering is so important. Analyzing 17.9 million papers, the researchers found that the most highly cited work tended to be largely rooted within a traditional field, but with just a smidgen of insight taken from some unconventional place. Breakthrough creativity occurs at the nexus of conventionality and novelty.

The truth is that the more you share with others, the more they’ll be willing to share with you and that makes it much more likely you’ll come across that random piece of information or insight that will allow you to crack a really tough problem.

People As Profit Centers

For many, the idea that innovation is a human centered activity is intuitively obvious. So it makes sense that the high-tech companies that Bill Campbell was involved in would work hard to create environments to attract the best and the brightest people. However, most businesses have much lower margins and have to keep a close eye on the bottom line.

Yet here too there is significant evidence that a human-focused approach to management can yield better results. In The Good Jobs Strategy MIT’s Zeynep Ton found that investing more in well-trained employees can actually lower costs and drive sales. A dedicated and skilled workforce results in less turnover, better customer service and greater efficiency.

For example, when the recession hit in 2008, Mercadona, Spain’s leading discount retailer, needed to cut costs. But rather than cutting wages or reducing staff, it asked its employees to contribute ideas. The result was that it managed to reduce prices by 10% and increased its market share from 15% in 2008 to 20% in 2012.

Its competitors maintained the traditional mindset. They cut wages and employee hours, which saved them some money, but customers found poorly maintained stores with few people to help them, which damaged their brands long-term. The cost savings Mercadona’s employees identified, on the other hand, in many cases improved service and productivity, and these gains persisted long after the crisis was over.

Management Beyond Metrics

The truth is that it’s easy to talk about putting people first, but much harder to do it in practice. Research suggests that once a group goes much beyond 200 people social relationships break down, so once a business gets beyond that point, it becomes natural to depersonalize management and focus on metrics.

Yet the best managers understand that it’s the people that drive the numbers. As legendary IBM CEO Lou Gerstner once put it, “Culture isn’t just one aspect of the game… It is the game. What does the culture reward and punish – individual achievement or team play, risk taking or consensus building?”

In other words, culture is about values. The innovators I interviewed for my book valued solving problems, so were enthusiastic about sharing their knowledge and expertise with others, who happily reciprocated. Mercadona valued its people, so when it asked them to find ways to save money during the financial crisis, they did so enthusiastically.

That’s why today, three years after his death, Bill Campbell remains a revered figure in Silicon Valley, because he valued people so highly and helped them learn to value each other. Management is not an algorithm. It is, in the final analysis, an intensely human activity and to do it well, you need to put people first.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


Why Humans Fail to Plan for the Future

GUEST POST from Greg Satell

I was recently reading Michio Kaku’s wonderful book, The Future of Humanity, about colonizing space and was amazed at how detailed some of the plans are. Plans for a Mars colony, for example, are already fairly advanced. In other cases, scientists are actively thinking about technologies that won’t be viable for a century or more.

Yet while we seem to be so good at planning for life in outer space, we are much less capable of thinking responsibly about the future here on earth, especially in the United States. Our federal government deficit recently rose to 4.6% of GDP, which is obviously unsustainable in an economy that’s growing at a meager 2.3%.
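The arithmetic behind "unsustainable" is worth making explicit. Under admittedly stylized assumptions, a constant 4.6% annual deficit (interest included), constant 2.3% growth, and a hypothetical starting debt of 100% of GDP, the debt-to-GDP ratio rises year after year:

```python
# Sketch of debt-to-GDP dynamics under constant, stylized assumptions:
# a fixed 4.6% annual deficit, fixed 2.3% growth, and a hypothetical
# starting debt of 100% of GDP.

def debt_ratio_path(b0, deficit, growth, years):
    """Debt/GDP ratio year by year when the deficit is a fixed share of GDP."""
    path = [b0]
    b = b0
    for _ in range(years):
        # add this year's deficit, then divide by next year's (larger) GDP
        b = (b + deficit) / (1 + growth)
        path.append(b)
    return path

path = debt_ratio_path(b0=1.00, deficit=0.046, growth=0.023, years=50)
# The ratio rises every year, drifting toward deficit/growth = 2.0
# (200% of GDP): roughly 143% of GDP after 25 years, 168% after 50.
```

This is a sketch, not a forecast: it ignores interest-rate feedback, recessions and policy changes. But it shows that with the deficit running at double the growth rate, the debt burden roughly doubles relative to the economy.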

That’s just one data point, but everywhere you look we seem to be unable to plan for the future. Consumer debt in the US recently hit levels exceeding those before the crash in 2008. Our infrastructure is falling apart. Air quality is getting worse. The list goes on. We need to start thinking more seriously about the future, but don’t seem to be able to. Why is that?

It’s Biology, Stupid

The simplest and most obvious explanation for why we fail to plan for the future is basic human biology. We have pleasure centers in our brains that release a neurotransmitter called dopamine, which gives us a feeling of well-being. So, it shouldn’t be surprising that we seek to maximize our dopamine fix in the present and neglect the future.

Yuval Noah Harari made this argument in his book Homo Deus, in which he argued that “organisms are algorithms.” Much like a vending machine is programmed to respond to buttons, Harari argues, humans and other animals are programmed by genetics and evolution to respond to “sensations, emotions and thoughts.” When those particular buttons are pushed, we respond much like a vending machine does.

He gives various data points for this point of view. For example, he describes psychological experiments in which, by monitoring brainwaves, researchers are able to predict actions, such as whether a person will flip a switch, even before he or she is aware of it. He also points out that certain chemicals, such as Ritalin and Prozac, can modify behavior.

Yet this somehow doesn’t feel persuasive. Adults in even primitive societies are expected to overcome basic urges. Citizens of Ancient Rome were taxed to pay for roads that led to distant lands and took decades to build. Medieval communities built churches that stood for centuries. Why would we somehow lose our ability to think long-term in just the past generation or so?

The Profit Motive

Another explanation of why we neglect the future is the profit motive. Pressed by demanding shareholders to deliver quarterly profits, corporate executives focus on showing short-term profits instead of investing for the future. The result is increased returns to fund managers, but a hollowing out of corporate competitiveness.

A recent article in Harvard Business Review would appear to bear this out. When a team of researchers looked into the health of the innovation ecosystem in the US, they found that corporate America has largely checked out. They also observed that storied corporate research labs, such as Bell Labs and Xerox PARC have diminished over time.

Yet take a closer look and the argument doesn’t hold up. In fact, the data from the National Science Foundation shows that corporate research has increased from roughly 40% of total investment in the 1950s and 60s to more than 60% today. At the same time, while some firms have closed research facilities, others, such as Microsoft, IBM and Google have either opened new ones or greatly expanded previous efforts. Overall R&D spending has risen over time.

Take a look at how Google innovates and you’ll be able to see the source of some of the dissonance. Fifty years ago, the only real option for corporate investment in research was a corporate lab. Today, however, there are many other avenues, including partnerships with academic researchers, internal venture capital operations, incubators, accelerators and more.

The Free Rider Problem

A third reason we may fail to invest in the future is the free rider problem. In this view, the problem is not that we don’t plan for the future, but that we don’t want to spend money on others who are undeserving. For example, why should we pay higher taxes to educate kids from outside our communities? Or to fund infrastructure projects that are wasteful and corrupt?

This type of welfare queen argument can be quite powerful. Although actual welfare fraud has been shown to be incredibly rare, there are many who believe that the public sector is inherently wasteful and money would be more productively invested elsewhere. This belief doesn’t only apply to low-income people, but also to “elites” such as scientists.

Essentially, this is a form of kin selection. We are more willing to invest in the future of people who we see as similar to ourselves, because that is a form of self-survival. However, when we find ourselves asked to invest in the future of those we see as different from ourselves, whether that difference is one of race, social class or even profession, we balk.

Yet here again, a closer look and the facts don’t quite fit with the narrative. Charitable giving, for example, has risen almost every year since 1977. So, it’s strange that we’re increasingly generous in giving to those who are in need, but stingy when it comes to things like infrastructure and education.

A New Age of Superstition

What’s especially strange about our inability to plan for the future is that it’s relatively new. In fact, after World War II, we invested heavily in the future. We created new avenues for scientific investment at agencies like the National Science Foundation and the National Institutes of Health, rebuilt Europe with the Marshall Plan and educated an entire generation with the GI Bill.

It wasn’t until the 1980s that our willingness to plan for and invest in the future began to wane, mostly due to two ideas that warped decision making. The first, called the Laffer Curve, argued that by lowering taxes we can increase revenue and that tax cuts, essentially, pay for themselves. The second, shareholder value, argued that whatever was best for shareholders is also best for society.

Both ideas have been partially or thoroughly debunked. Over the past 40 years, lower tax rates have consistently led to lower revenues and higher deficits. The Business Roundtable, an influential group of almost 200 CEOs of America’s largest companies, recently denounced the concept of shareholder value. Yet strangely, many still use both to support anti-future decisions.

We seem to be living in a new era of superstition, where mere belief is enough to inspire action. So projects which easily capture the imagination, such as colonizing Mars, are able to garner fairly widespread support, while investing in basic things like infrastructure, debt reduction or the environment is neglected.

The problem, in other words, seems to be mostly in the realm of a collective narrative. We are more than capable of enduring privation today to benefit tomorrow, just as businesses routinely take less profits today to invest in tomorrow. We are even capable of giving altruistically to others in need. All we need is a story to believe in.

There is, however, the possibility that it is not the future we really have a problem with, but each other, and that our lack of a common story arises from a lack of shared values, which leads to major differences in how we view the same facts. In any case, the future suffers.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


We Were Wrong About What Drove the 21st Century

GUEST POST from Greg Satell

Every era contains a prism of multitudes. World War I gave way to the “Roaring 20s” and a 50-year boom in productivity. The Treaty of Versailles sowed the seeds of the second World War, which gave way to the peace and prosperity of the post-war era. Vietnam and the rise of the Baby Boomers unlocked a cultural revolution that created new freedoms for women and people of color.

Our current era began with the 80s, the rise of Ronald Reagan and a new confidence in the power of markets. Genuine achievements of the Chicago School of economics led by Milton Friedman, along with the weakness of the Soviet system, led to an enthusiasm for market fundamentalism that dominated policy circles.

So it shouldn’t be that surprising that veteran Republican strategist Stuart Stevens wrote a book denouncing that orthodoxy as a lie. He has a point. But politicians can only convince us of things we already want to believe. The truth is that we were fundamentally mistaken in our understanding of how the world works. It’s time that we own up to it.

Mistake #1: The End Of The Cold War Would Strengthen Capitalism

When the Berlin Wall came down in 1989, the West was triumphant. Communism was shown to be a corrupt system bereft of any real legitimacy. A new ideology took hold, often called the Washington Consensus, that preached fiscal discipline, free trade, privatization and deregulation. The world was going to be remade in capitalism’s image.

Yet for anybody who was paying attention, communism had been shown to be bankrupt and illegitimate since the 1930s when Stalin’s failed collectivization effort and industrial plan led him to starve his own people. Economists have estimated that, by the 1970s, Soviet productivity growth had gone negative, meaning more investment actually brought less output. The system’s collapse was just a matter of time.

At the same time, there were early signs that there were serious problems with the Washington Consensus. Many complained that bureaucrats at the World Bank and the IMF were mandating policies for developing nations that citizens in their own countries would not accept. So called “austerity programs” led to human costs that were both significant and real. In a sense, the error of the Soviets was being repeated—ideology was put before people.

Today, instead of a capitalist utopia and an era of peace and prosperity, we got a global rise in authoritarian populism, stagnant wages, reduced productivity growth and weaker competitive markets. In particular in the United States, by almost every metric imaginable, capitalism has been weakened.

Mistake #2: Digital Technology Would Make Everything Better

In 1989, the same year that the Berlin Wall fell, Tim Berners-Lee proposed the World Wide Web and ushered in a new technological era of networked computing that we now know as the “digital revolution.” Much like the ideology of market fundamentalism that took hold around the same time, technology was seen as the determinant of a new, brighter age.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet by 2004, productivity growth had slowed again to its earlier lethargic pace. Today, despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

Digital technology was supposed to empower individuals and reduce the dominance of institutions, but just the opposite has happened. Income inequality in advanced economies markedly increased. In America wages have stagnated and social mobility has declined. At the same time, social media has been destroying our mental health.

When Silicon Valley told us they intended to “change the world,” is this what they meant?

Mistake #3: Medical Breakthroughs Would Automatically Make Us Healthier

Much like the fall of the Berlin Wall and the rise of the Internet, the completion of the Human Genome Project in 2003 promised great things. No longer would we be at the mercy of terrible diseases such as cancer and Alzheimer’s; instead we would design genetic therapies that would rewire our bodies to fight off disease by themselves.

The advances since then have been breathtaking. The Cancer Genome Atlas, which began in 2005, helped enable doctors to develop therapies targeted at specific mutations, rather than where in the body a tumor happened to be found. Later, CRISPR revolutionized synthetic biology, bringing down costs exponentially.

The rapid development of Covid-19 vaccines has shown how effective these new technologies are. Scientists have essentially engineered vaccines that deliver just enough of the viral genome to produce a few proteins, enough to provoke an immune response but not nearly enough to make us sick. Twenty years ago, this would have been considered science fiction. Today, it’s a reality.

Yet we are not healthier. Worldwide obesity has tripled since 1975 and has become an epidemic in the United States. Rates of anxiety and depression have risen as well. American healthcare costs continue to rise even as life expectancy declines. Despite the incredible advances in our medical capabilities, we seem to be less healthy and more miserable.

Worse Than A Crime, It Was A Blunder

Whenever I bring up these points among technology people, they vigorously push back. Surely, they say, you can see the positive effects all around you. Can you imagine what the global pandemic would be like without digital technologies? Without videoconferencing? Hasn’t there been a significant global decline in extreme poverty and violence?

Yes. There have absolutely been real achievements. As someone who spent roughly half my adult life in Eastern Bloc countries, I can attest to how horrible the Soviet system was. Digital technology has certainly made our lives more convenient and, as noted above, medical advances have been very real and very significant.

However, technology is a process that involves both revealing and building. Yes, we revealed the power of market forces and the bankruptcy of the Soviet system, but failed to build a more prosperous and healthy society. In much the same way, we revealed the power of the microchip, miracle cures and many other things, but failed to put them to use in such a way that would make us measurably better off.

When faced with a failure this colossal, people often look for a villain. They want to blame the greed of corporations, the arrogance of Silicon Valley entrepreneurs or the incompetence of government bureaucrats. The truth is, as the old saying goes, it was worse than a crime, it was a blunder. We simply believed that market forces and technological advancement would work their magic and all would be well in hand.

By now we should know better. We need to hold ourselves accountable, make better choices and seek out greater truths.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Confessions of a Business Artist

I am an artist.

There, I’ve said it. This statement may confuse some people who know me, and come as a shock to others.

Braden, what do you mean you’re an artist? You’ve got an MBA from London Business School, you’ve led change programs for global organizations, helped companies build their innovation capabilities and cultures, are an expert in digital transformation, and you can’t even draw a straight line without a ruler. What makes you think you’re an artist?

Well, okay, that may all be true, but there are lots of different kinds of artists. I may not be a painter, a sculptor, a musician, an illustrator, or even a singer, but I am an artist, a business artist.

What is a business artist you ask?

A business artist sees through complexity to what matters most. A business artist loves working with PowerPoint and telling stories, often through keynote speeches and training facilitation, or through writing. A business artist loves to share, often doing so for the greater good, sometimes to their own financial detriment, in an effort to accelerate knowledge and learning and to create new capabilities in others. A business artist is a builder, often creating new businesses, new web sites, and new thinking. A business artist is comfortable stepping into a number of different business contexts and bringing a different energy and a different approach to creating solutions to complex requirements. Part of the reason a business artist can do this is that a business artist values their intuitive skills just as much as their intellectual skills, and may also consciously invest in getting in touch with higher levels of intuitive capability, enabling them to excel in roles that involve a great deal of what might be termed ‘organizational psychology’.

A business artist often appears to be a jack of all trades, sometimes bordering on what was portrayed in the television show The Pretender, and can be an incredibly powerful addition to any team tackling a big challenge. Yet a business artist’s ability to contribute to the success of an organization is often discounted by the traditional recruiting processes of most human resource organizations, whose emphasis on skill matching and experience skews hiring in favor of someone with a lot of experience at being mediocre at a certain skillset over someone with limited experience but greater capability. A business artist also often appears to be ahead of the curve, frequently to their own detriment, arriving too early to the party by grasping where organizations need to go before the rest of the organization is willing to accept the new reality. This is a real problem for business artists.

Now is the time for a change. Given humanity’s increasing access to knowledge, and the shorter time now required to acquire the knowledge and skills needed to perform a task, people who are comfortable with complexity and ambiguity, and capable of learning quickly, are incredibly valuable to organizations as continual change becomes the new normal. Because the accelerating pace of innovation and change is turning experience from a long-lived asset into a liability, we need business artists now more than ever.

So how do we create more business artists?

Unfortunately our public schools are far too focused on indoctrination rather than education, on repetition over discovery. Our educational system specializes in creating trivia masters and kids that hate school, instead of building a new generation of creative problem solvers who love to learn and explore new approaches, rather than defending status conferred by mastery of current truths (which may be tomorrow’s fallacies). We are far too obsessed with STEM (Science, Technology, Engineering and Math) when we should be focused on STEAM (Science, Technology, Engineering, Art and Music). Music is creative math after all. My daughter’s school has a limited music program and NO ART. How is this possible?

To create more business artists we need to shift our focus towards art, creative problem solving and demonstrated learning, and away from memorization, metrics, and repetition. Can we do this?

Can we create an environment where the status quo is no longer seen as a source of power through current mastery, and where improvements to the status quo become the new source of power instead?

Organizations that want to survive will do so. Countries that want to stay at the top of the economic pyramid will do so. So what kind of country do you want to live in? What kind of company do you want to be part of?

Do you have the courage to join me as a business artist or to help create a new generation of them?

Image credit: blogs.nd.edu

This article originally appeared on Linkedin

