The End of AI Data Centers

Why Decentralized Compute is the Only Resilient Future

LAST UPDATED: May 11, 2026 at 11:24 AM

by Braden Kelley and Art Inteligencia


I. Introduction: The Fragility of the AI “Crown Jewels”

The race to dominate artificial intelligence has triggered a global construction boom unlike anything the technology industry has ever seen. Governments and corporations are pouring hundreds of billions of dollars into massive AI data centers packed with advanced GPUs, specialized networking hardware, and enough electrical infrastructure to power small cities. These facilities are rapidly becoming the economic and strategic “crown jewels” of the twenty-first century.

But in the rush to scale AI capability, we may be building exactly the wrong architecture for the world that is emerging around us.

The current model of AI infrastructure is overwhelmingly centralized. Instead of distributing compute across millions of smaller nodes, we are concentrating unprecedented amounts of economic, military, and technological capability into a relatively small number of gigantic facilities. Each hyperscale AI campus represents not only a massive financial investment, but also a critical dependency for national competitiveness, intelligence operations, logistics, cybersecurity, and military decision-making.

In effect, the AI industry has unintentionally created the ultimate single point of failure.

As AI becomes increasingly essential to economic productivity and national defense, these centralized facilities naturally evolve from commercial assets into strategic targets. Their importance guarantees that adversaries will study them, map them, probe them, and eventually develop methods to disrupt or destroy them. The more valuable these AI fortresses become, the more irresistible they become as targets during geopolitical conflict.

This reality formed the basis of a previous argument that the AI data centers of 2030 may ultimately require sovereign-level protection — potentially functioning more like hardened military installations than traditional commercial real estate. Once AI infrastructure becomes critical to national security, protecting it may no longer be optional.

But militarizing data centers only treats the symptom, not the disease.

Building bigger walls around centralized AI infrastructure may delay catastrophe, but it does not eliminate the underlying strategic vulnerability. A fortress is still a fortress. It still has a location. It still has supply lines. It still has power dependencies. And most importantly, it still presents adversaries with a concentrated target whose destruction could create disproportionate economic and military disruption.

Modern warfare is increasingly demonstrating that concentration itself is becoming obsolete.

The emerging lesson from contemporary conflict is that large, static, centralized assets are becoming dangerously vulnerable in an era of cheap autonomous systems, distributed attacks, cyber-physical warfare, and AI-enabled targeting. Resilience no longer comes from concentrating strength behind thicker walls. Resilience comes from distribution, redundancy, mobility, and the elimination of obvious centers of gravity.

The future of AI infrastructure may therefore require a fundamental architectural shift — away from the “Fortress” model and toward something far more decentralized and resilient.

Instead of concentrating compute into a handful of hyperscale compounds, the smarter long-term strategy may be to distribute AI capability across millions of interconnected nodes embedded throughout society itself. Homes, businesses, vehicles, factories, and local energy systems could collectively form a resilient national AI fabric that is vastly harder to disrupt because it has no singular brain to destroy.

In other words, the ultimate defense against the vulnerabilities of centralized AI infrastructure may not be better fortifications at all.

It may be the elimination of the fortress entirely.

II. Lessons from the Front: Operation Spiderweb and the Death of “Large & Static”

For decades, military doctrine revolved around concentration of force. Nations projected power by building larger air bases, larger aircraft carriers, larger command centers, and larger logistical hubs. Strategic advantage often came from assembling overwhelming capability in centralized locations that could be defended through scale, distance, and hardened infrastructure.

But modern warfare is beginning to expose a dangerous flaw in that logic.

Ukraine’s Operation Spiderweb offered a glimpse into the future of asymmetric conflict — and a warning for anyone investing heavily in centralized AI infrastructure. In the operation, relatively inexpensive drones launched from concealed shipping containers reportedly destroyed or severely damaged billions of dollars of Russian military hardware. The attack demonstrated how low-cost autonomous systems can bypass traditional defensive assumptions and threaten even heavily protected strategic assets.

The significance of the operation was not merely tactical. It was architectural.

A modern military aircraft may cost tens or even hundreds of millions of dollars to build, maintain, and defend. Yet those investments can now be threatened by autonomous systems costing a tiny fraction of the target’s value. This is the new asymmetry of modern conflict: increasingly cheap offensive capabilities versus increasingly expensive centralized assets.

The implications extend far beyond the battlefield.

Hyperscale AI data centers are emerging as the civilian equivalent of concentrated military infrastructure. A single AI campus may contain billions of dollars worth of GPUs, networking equipment, transformers, cooling systems, and backup power infrastructure concentrated within a relatively small geographic footprint. These facilities consume enormous amounts of electricity, require extensive water access, and depend on stable transportation and communication links.

In strategic terms, they are ideal targets.

Even if protected by advanced cybersecurity systems, physical security barriers, and military-grade defenses, the economics of attack versus defense are increasingly unfavorable. A nation may spend tens of billions hardening an AI fortress, while adversaries invest comparatively little developing autonomous drones, cyber-physical sabotage systems, electromagnetic disruption tools, or attacks against supporting infrastructure such as substations and fiber routes.

The uncomfortable reality is that static concentration itself is becoming the vulnerability.

This same lesson is already reshaping military thinking. Around the world, defense planners are reconsidering centralized command structures, massive forward operating bases, and tightly clustered logistics hubs. The future military is likely to become more distributed, more mobile, and more redundant — relying on decentralized command systems, autonomous coordination, modular logistics, and dispersed operational assets that can continue functioning even when individual nodes are destroyed.

AI infrastructure must evolve the same way.

If artificial intelligence becomes the backbone of economic productivity, national security, industrial automation, cybersecurity, healthcare, transportation, and military operations, then centralized AI compute becomes too strategically important to remain concentrated in a handful of giant facilities. The more essential AI becomes, the more dangerous centralization becomes.

The lesson of Operation Spiderweb is not simply that drones are dangerous.

The deeper lesson is that resilient systems survive by distributing critical capability across wide networks rather than concentrating it into singular targets. A decentralized system may lose individual nodes without catastrophic failure. A centralized system risks collapse if its core infrastructure is compromised.

In the emerging era of autonomous conflict, resilience increasingly belongs to the distributed.

III. The Social & Political Bottleneck: The Rise of the “NIMBY” Data Center

Even if centralized AI mega-campuses could somehow be fully protected from military and cyber threats, they still face another growing obstacle that may ultimately prove just as limiting: public opposition.

Across the United States and around the world, communities are increasingly resisting the construction of massive data centers in their neighborhoods. What was once viewed as relatively harmless digital infrastructure is now being recognized as an enormous industrial footprint with significant demands on land, water, electricity, and local infrastructure.

Residents are beginning to ask uncomfortable questions.

Why should local communities absorb rising utility costs, water consumption concerns, constant construction traffic, backup generator noise, and visual blight so that a handful of technology companies can consolidate AI power? Why should neighborhoods sacrifice scarce electrical capacity for facilities that may create relatively few permanent local jobs compared to their physical scale and resource consumption?

As AI adoption accelerates, these tensions are likely to intensify rather than diminish.

The scale of future AI infrastructure requirements is staggering. Advanced AI models require immense amounts of compute power, and every new generation of models appears to demand exponentially more energy and hardware than the last. Entire regions are already experiencing concerns about grid strain, water availability, permitting delays, and environmental impact as hyperscale facilities compete for resources with local populations.

This creates a growing sovereignty conflict between national strategic priorities and local community interests.

From the perspective of national governments, AI infrastructure increasingly resembles critical infrastructure on par with ports, railroads, telecommunications networks, or energy systems. Nations that fail to secure sufficient AI compute capacity may find themselves economically disadvantaged, technologically dependent, or strategically vulnerable.

But from the perspective of local residents, a giant AI campus often appears as an unwanted industrial intrusion that consumes disproportionate resources while providing limited direct community benefit.

The collision between these perspectives could become one of the defining infrastructure battles of the next decade.

Governments may attempt to override local opposition through federal permitting reforms, strategic infrastructure designations, or national security arguments. Technology companies may offer tax incentives, local investments, or infrastructure improvements to secure approval. Yet none of these approaches fundamentally solve the underlying tension created by concentrating massive amounts of AI compute into highly visible facilities.

The more AI infrastructure grows in scale, the harder it becomes to hide its impact.

This is why decentralization may represent not only a strategic advantage, but also a political one. It is partly because of expected increases in opposition to terrestrial AI data centers that Elon Musk and others are advocating for space-based AI data centers. But even on Earth, we can address both the fragility and vulnerability of centralized compute and the growing political and social opposition to it.

Instead of forcing communities to accept gigantic industrial AI campuses, future infrastructure could become embedded into the fabric of everyday life itself. Rather than concentrating compute into enormous fortified compounds, AI processing power could be distributed across homes, apartment buildings, offices, vehicles, factories, and local energy systems.

In this model, AI infrastructure becomes largely invisible.

The electrical grid itself offers an instructive analogy. Most people rarely think about the countless distributed components that collectively generate and manage electrical power. The system works precisely because it is distributed, redundant, and woven into the broader physical environment rather than concentrated into a few singular facilities.

Decentralized AI compute could evolve in much the same way.

Instead of building isolated industrial parks dedicated exclusively to AI, society could gradually transform millions of existing structures into intelligent compute nodes. Homes equipped with solar panels, battery storage, smart electrical systems, and AI acceleration hardware could collectively form a national compute fabric that scales organically alongside everyday infrastructure upgrades.

The strategic benefit is resilience.

The political benefit is acceptance.

Infrastructure people barely notice is often infrastructure they are far more willing to live with.

Distributed AI infrastructure - PulteGroup, Nvidia, and Span

IV. The New Architecture: Residential AI Nodes (The Nvidia-Pulte-Span Model)

The transition from centralized AI fortresses to distributed AI infrastructure may sound futuristic, but early versions of this architecture are already beginning to emerge.

One of the clearest signals came from the 2026 partnership between PulteGroup, Nvidia, and Span — an alliance that hinted at a radically different vision for the future of AI compute. Instead of treating homes solely as passive consumers of electricity and internet services, the partnership pointed toward a future where residential properties themselves become intelligent infrastructure nodes participating in a larger distributed compute network.

At the center of this shift is the growing convergence of three technologies that historically operated independently: AI acceleration hardware, residential energy systems, and intelligent electrical management.

Nvidia provides the AI compute layer through increasingly compact and energy-efficient GPU systems optimized for local inference and edge processing. Span contributes the intelligent electrical infrastructure capable of dynamically managing household energy loads, battery systems, solar generation, and grid interaction. PulteGroup represents the large-scale residential deployment mechanism capable of embedding these systems into new homes at scale.

Together, these technologies begin to transform the modern home into something entirely new: a residential AI node.

This concept fundamentally changes the role homes play within both the energy grid and the digital economy. Traditionally, homes consume electricity, bandwidth, and cloud services while contributing relatively little back into the broader infrastructure ecosystem. But with intelligent power management, local battery storage, rooftop solar generation, and dedicated AI hardware, homes can evolve into active participants in a distributed national compute fabric.

In practical terms, this means millions of homes could collectively provide enormous amounts of distributed AI inference capacity without requiring the construction of massive standalone data centers.

The timing of this shift is important because AI workloads themselves are evolving.

Training frontier AI models will likely continue requiring large-scale centralized infrastructure for the foreseeable future. But inference — the process of actually running AI models to serve applications, automate tasks, power agents, process data, and support real-time decision-making — is increasingly capable of operating on smaller, distributed hardware systems.

That distinction changes everything.

Instead of routing every AI request through hyperscale facilities, future AI ecosystems may distribute inference workloads dynamically across millions of geographically dispersed residential nodes. AI processing could occur closer to the end user, reducing latency, improving resilience, lowering bandwidth costs, and minimizing pressure on centralized infrastructure.
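As a purely illustrative sketch of this kind of dynamic routing (all node names and latency figures below are invented for the example), a dispatcher could simply send each inference request to the lowest-latency healthy node and fall back automatically when nodes disappear:

```python
# Hypothetical illustration: each node advertises a round-trip latency (ms)
# and a health flag. None of these names or numbers describe a real system.
NODES = {
    "home-node-a": {"latency_ms": 8, "healthy": True},
    "home-node-b": {"latency_ms": 12, "healthy": True},
    "regional-dc": {"latency_ms": 45, "healthy": True},
}

def route_request(nodes):
    """Return the lowest-latency healthy node, or None if all are down."""
    healthy = [name for name, node in nodes.items() if node["healthy"]]
    if not healthy:
        return None
    return min(healthy, key=lambda name: nodes[name]["latency_ms"])

print(route_request(NODES))  # "home-node-a": the nearest residential node wins
```

If both residential nodes drop out, the same call quietly returns "regional-dc": losing individual nodes costs latency, not availability.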

The energy implications are equally significant.

One of the biggest criticisms of hyperscale AI infrastructure is its extraordinary power consumption. Massive data centers require huge dedicated energy resources that often strain local grids and trigger political resistance. Distributed residential AI nodes offer a different model by leveraging energy systems that are already being deployed into homes for broader electrification efforts.

Homes equipped with solar panels and battery packs effectively become micro-energy systems capable of storing and managing local power generation. Smart electrical panels can determine when energy demand is low, when renewable generation is abundant, or when excess electricity would otherwise go unused. During those periods, AI inference workloads could be activated opportunistically across distributed residential infrastructure.

In effect, AI compute becomes partially synchronized with the natural rhythms of the electrical grid.

Instead of building ever-larger centralized facilities that demand constant peak power availability, distributed AI infrastructure could absorb excess off-peak generation, stabilize demand curves, and make more efficient use of existing electrical capacity.
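A minimal sketch of this opportunistic behavior, assuming a home controller that can read solar output, household load, and battery state of charge (every threshold below is an invented illustration, not a figure from any real product):

```python
# Hypothetical home-node policy: wake the local AI accelerator only when
# there is spare local energy. All thresholds are illustrative assumptions.
def should_run_inference(solar_watts, household_load_watts, battery_soc,
                         min_soc=0.60, surplus_threshold_watts=250):
    """Run deferrable AI work only on a solar surplus beyond household
    demand, or on a well-charged battery that can absorb the extra load."""
    surplus = solar_watts - household_load_watts
    return surplus >= surplus_threshold_watts or battery_soc >= min_soc

# Midday solar surplus of 1800 W: inference runs on local energy.
print(should_run_inference(3000, 1200, 0.40))  # True
# Evening, no sun, depleted battery: the node stays idle.
print(should_run_inference(0, 900, 0.30))      # False
```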

The homeowner incentives could also be compelling.

Just as homeowners today can sell excess solar generation back to the grid, future residential AI systems could potentially generate compute revenue by contributing idle processing power to distributed inference networks. Reduced utility costs, subsidized hardware, lower internet expenses, and participation payments could transform homes from passive infrastructure liabilities into productive digital assets.

This creates a powerful alignment between national strategic interests and individual economic incentives.

Governments gain a far more resilient and geographically distributed AI infrastructure. Technology companies gain scalable edge compute capacity without constructing as many hyperscale facilities. Electrical grids gain flexible demand management capabilities. And homeowners gain direct economic participation in the AI economy itself.

Most importantly, the resulting system becomes dramatically harder to disrupt.

A centralized AI fortress presents adversaries with a concentrated target. A distributed residential AI fabric diffuses compute capability across millions of ordinary structures woven throughout society. What once existed inside a handful of highly visible compounds instead becomes embedded everywhere and nowhere at the same time.

In the emerging era of strategic AI competition, that distinction may prove decisive.

V. Strategic Advantages of the Distributed AI Grid

If centralized AI infrastructure represents a high-value target with concentrated risk, then decentralized AI infrastructure represents the opposite: a system designed around dispersion, redundancy, and continual adaptability. The advantages of this shift are not incremental — they are structural.

The most immediate benefit is what might be called kinetic resilience. In a centralized model, a single facility may represent a critical node whose disruption could degrade national AI capability in a meaningful way. In a distributed model, however, compute is spread across thousands or millions of independent nodes. No single strike, outage, or localized failure can meaningfully degrade the system as a whole. The network simply reroutes, reallocates, and continues operating.

This changes the strategic calculus entirely. Instead of defending a small number of high-value assets at extraordinary cost, resilience is achieved through ubiquity. The system becomes less like a fortress and more like a living ecosystem — continuously adapting to localized disruptions without systemic collapse.

A second advantage is power efficiency and grid stability. Hyperscale data centers often require dedicated energy infrastructure, new transmission lines, and significant upgrades to local grids. They tend to behave like industrial-scale energy sinks, demanding predictable and sustained power delivery at massive scale.

A distributed AI grid behaves differently. By embedding compute capability into residential and commercial environments already connected to the electrical system, AI workloads can be dynamically aligned with existing energy flows rather than forcing entirely new ones.

In practical terms, this enables several efficiencies:

  • Utilization of residential solar generation that would otherwise be unused or exported inefficiently
  • Charging and discharging of home battery systems in coordination with AI workload demand
  • Shifting inference tasks to off-peak hours when grid demand is lower and electricity is cheaper
  • Reducing the need for large new transmission infrastructure dedicated solely to AI growth

Instead of AI competing with other sectors for scarce centralized power capacity, it becomes a flexible participant in a broader distributed energy ecosystem.
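As a toy illustration of the off-peak shifting in the list above, a scheduler could defer non-urgent inference batches into a cheap overnight window. The tariff window, job names, and wrap-around behavior are all invented for the example:

```python
# Assumed time-of-use tariff: off-peak from 10pm to 6am. Purely illustrative.
OFF_PEAK_HOURS = set(range(0, 6)) | set(range(22, 24))

def schedule_batches(jobs, start_hour):
    """Assign each deferrable job to the next off-peak hour, wrapping past
    midnight. Latency-sensitive requests would bypass this queue entirely."""
    schedule = []
    hour = start_hour
    for job in jobs:
        while hour % 24 not in OFF_PEAK_HOURS:
            hour += 1
        schedule.append((job, hour % 24))
        hour += 1
    return schedule

# Jobs submitted at 8pm all land in the overnight window.
print(schedule_batches(["embed-corpus", "nightly-retrain", "batch-ocr"], 20))
# [('embed-corpus', 22), ('nightly-retrain', 23), ('batch-ocr', 0)]
```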

A third advantage is latency reduction and proximity to the user. As AI becomes more embedded in daily life — powering assistants, autonomous systems, real-time translation, predictive services, and physical automation — the distance between compute and user begins to matter more.

Distributed inference at the edge of the network enables faster response times, reduced dependency on long-haul network routing, and greater robustness during partial connectivity disruptions. In many cases, AI systems embedded in homes, vehicles, and local infrastructure can respond instantaneously without requiring round trips to distant centralized servers.

Taken together, these advantages suggest that decentralization is not simply a defensive posture against geopolitical risk — it is also an optimization of efficiency, responsiveness, and system-wide adaptability.

Perhaps most importantly, the distributed model reduces systemic fragility at exactly the moment AI systems are becoming more deeply integrated into critical societal functions. The more intelligence we embed into infrastructure, the more dangerous it becomes to concentrate that intelligence into a small number of failure-prone locations.

In this sense, decentralization is not a retreat from progress. It is an evolution toward resilience.

VI. Conclusion: From Fortresses to Fabrics

The trajectory of AI infrastructure is often described as a race toward scale: larger models, larger clusters, larger data centers, and larger investments concentrated into fewer and fewer locations. On the surface, this appears to be the natural endpoint of technological progress — efficiency achieved through consolidation.

But that framing assumes a world where concentration remains an advantage. Increasingly, the opposite may be true.

As AI becomes more deeply embedded in national economies, critical infrastructure, and defense systems, the risks associated with centralization grow in parallel with its capabilities. What once looked like an optimization problem begins to resemble a resilience problem. And resilience, in complex systems, rarely comes from concentration.

The “AI Fortress” model — massive, highly capable, strategically critical data centers protected by layers of physical and digital security — may represent an important transitional phase. It enables rapid scaling of capability at a moment when demand is exploding and architectures are still stabilizing. But it is unlikely to represent the final stable equilibrium.

Over time, the logic of vulnerability, energy distribution, political friction, and technological enablement all converge on a different structure: one that is distributed by default, not by exception.

In that future, AI compute is no longer something that exists “somewhere.” It is something that exists everywhere — embedded into homes, vehicles, factories, grids, and local systems, continuously interacting with the physical world rather than being isolated from it.

This is the shift from fortresses to fabrics.

A fortress is defined by its boundaries: inside is protected, outside is excluded, and value is concentrated at the center. A fabric, by contrast, derives its strength from interconnection. It is resilient not because it is hardened in one place, but because it is woven across many places. Damage to one thread does not collapse the structure; it is absorbed, rerouted, and contained.

A distributed AI fabric would behave in the same way. Compute capacity would be ubiquitous but not centralized, powerful but not singularly fragile, intelligent but not dependent on any single point of control or failure.

In this model, the question is no longer how to protect the brain of the system by enclosing it within ever more secure walls. Instead, the question becomes how to ensure there is no single brain to target in the first place.

That shift has profound strategic implications.

It reframes AI infrastructure from something that must be defended at a few critical locations into something that must be designed as a resilient, adaptive system distributed across society itself. It also aligns national security objectives with individual participation, energy efficiency with compute demand, and technological advancement with infrastructural sustainability.

In an era shaped by asymmetric threats, autonomous systems, and rapidly evolving geopolitical risk, the most robust systems will not be those that concentrate power most effectively, but those that distribute it most intelligently.

The future of AI infrastructure may therefore not be a monument.

It may be a mesh.

And in that shift from fortresses to fabrics lies the real foundation of long-term resilience in the age of artificial intelligence.

FAQ: Decentralized AI Compute and Infrastructure Resilience

Why are centralized AI data centers considered vulnerable?
Centralized AI data centers concentrate massive compute, energy, and strategic value into a small number of physical locations. This creates single points of failure that can be targeted by physical attacks, cyber operations, or infrastructure disruptions, potentially causing disproportionate economic and national security impact.

What is meant by a “distributed AI fabric”?
A distributed AI fabric refers to an architecture where AI compute is spread across millions of interconnected nodes such as homes, businesses, and edge devices. Instead of relying on a few large data centers, intelligence is embedded throughout the network, improving resilience, reducing latency, and eliminating critical single points of failure.

How could residential AI nodes support the power grid and economy?
Residential AI nodes can leverage solar power, home battery systems, and off-peak electricity to run AI inference workloads locally. This helps balance grid demand, utilize excess renewable energy, reduce strain on centralized infrastructure, and potentially allow homeowners to participate economically in distributed compute networks.

EDITOR’S NOTE: You should read this article to learn more about Why the AI Data Centers of 2030 Will Be Sovereign Fortresses.

Content Authenticity Statement: The topic area, key elements to focus on, etc. were decisions made by Braden Kelley, with a little help from ChatGPT and Google Gemini to clean up the article, add images and create infographics.

Image credits: Google Gemini, SPAN (via mortgagepoint.com)

Subscribe to Human-Centered Change & Innovation Weekly
Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.

Sometimes Too Much is Too Much

GUEST POST from Mike Shipulski

When you’re out of gas, you’re out of gas. And there are no two ways about it, the last year has emptied our tanks. And when your tank is empty, it’s empty. When there’s nothing left, there’s nothing left. But what if you’re asked for more?

What is the mechanism to communicate that the workload is too much? How do you tell your boss that you can’t produce as you did before the pandemic because, well, you’re emotionally exhausted? How do you tell company leadership that this is not the time to layer on more corporate initiatives and elevate the importance of accountability? And if you do deliver those messages, will there be ramifications to your career? No ramifications you say? Then why do most feel overwhelmed yet say nothing?

How might we conserve our emotional energy to focus on what’s important? And what if the company thinks business continuity is most important and you think your family’s continuity is most important? What’s a caring parent to do? How about a loving spouse? How about an exhausted employee who wants desperately to contribute to the cause? And what if you’re all three? And what about your mental health?

If you can help someone, help them. If you don’t have the energy for that, tell them you know they are suffering and sit with them. They don’t expect you to fix it, they just want you to sit with them.

If you’re part of a team, check in with your teammates. Again, no need to try and fix them, just listen to them. Really listen. Listen so you can repeat what you heard in your own words. There’s power in being heard.

If you’re in a position to tell company leadership that people are living on the edge, tell them. If you’re not in that position, find someone who might be and ask them to pass it along. Tell them it’s important. Tell them it’s dire.

And when you go home to your family, tell them you’re exhausted and tell them you love them. And you’re doing your best. And tell them you know they’re doing their best, too. And tell them you love them.

Image credit: Unsplash

Innovation or Not – Amazon Alexa Pay for Gas

You can now use the Alexa app on your phone or Alexa-enabled device in your car for an easy way to pay for gas at Exxon and Mobil stations nationwide.

Here’s how it works in a nutshell:

  1. Drive your vehicle up to the pump at your Exxon or Mobil station.
  2. Use the Alexa-enabled device in your car or Alexa app on your phone and say “Alexa, pay for gas.”
  3. Follow Alexa’s prompts to activate the pump.
  4. Fuel up and drive away. Payment is handled automatically.

I’m not sure whether they’re using Near Field Communication (NFC) or cellular data to communicate, but what’s happening is essentially this: in the same way a card swipe or tap-to-pay reader on the pump receives payment method information and validates payment, the pumps at select Exxon Mobil stations can now receive Amazon Pay default payment information, validate it, and unlock the pump.
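Since the real integration details aren’t public, the pump-side flow can only be sketched hypothetically: receive a payment credential, validate it with the payment network, and unlock the pump only on success. Every name here is invented:

```python
# Hypothetical sketch of the pump-side logic; not the actual
# Exxon Mobil / Amazon Pay implementation.
def authorize_and_unlock(pump, payment_token, validate):
    """Mirror a card swipe: validate the credential, then unlock."""
    if validate(payment_token):
        pump["unlocked"] = True
    return pump["unlocked"]

pump = {"id": 4, "unlocked": False}
# Stand-in validator that accepts any non-empty token.
print(authorize_and_unlock(pump, "amazon-pay-token", lambda tok: bool(tok)))  # True
```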

It’s a nice convenience and a clever way of trying to increase the adoption of Amazon Pay, but is it an innovation?

What do you think?


Powering Monday Night Football with Feet?

Shell Kinetic Soccer Field in Brazil

Electricity.

It’s not exactly cheap, and in rapidly modernizing countries (or even U.S. municipalities with budget woes), the idea of illuminating a neighborhood soccer field so kids and adults can play at night, especially in a poorer neighborhood, might seem like an impossibility.

But a couple of weeks ago, Pelé (the Brazilian soccer player) and Shell (the global oil, ahem, energy company) showed off a soccer revolution: a field in the heart of Morro da Mineira, a Rio de Janeiro favela, capable of capturing the kinetic energy created by the movement of players around the field and combining it with nearby solar power to provide a source of renewable electricity for lighting the field.

The field uses two hundred high-tech underground tiles to capture the energy created by players running across it, combines that with energy from adjacent solar panels, and stores it all in nearby batteries. The resulting floodlights provide the players with a lit field, and everyone else in the favela with a safe and secure community area at night.
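For a sense of scale, here is a back-of-envelope sketch of whether such a field could plausibly light itself. Every number below except the tile count is an invented assumption, not a published specification of the Morro da Mineira installation:

```python
# Back-of-envelope arithmetic; all figures except TILES are assumptions.
TILES = 200                    # tile count from the article
WATTS_PER_ACTIVE_TILE = 5      # assumed average output while stepped on
ACTIVE_FRACTION = 0.1          # assumed share of tiles in use at any moment
SOLAR_WATTS = 1500             # assumed contribution of the adjacent panels

kinetic_watts = TILES * WATTS_PER_ACTIVE_TILE * ACTIVE_FRACTION  # about 100 W
total_watts = kinetic_watts + SOLAR_WATTS                        # about 1600 W

FLOODLIGHT_WATTS = 8 * 150     # assumed: eight 150 W LED floodlights
print(total_watts >= FLOODLIGHT_WATTS)  # True, under these assumptions
```

Under these assumptions the tiles alone cover less than a tenth of the lighting load; the solar panels and batteries do most of the work, which matches the article’s description of a combined system.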

Until it was redeveloped by Shell, the soccer field was largely unusable and many young people were forced to play in the streets. The Morro da Mineira project shows how creative ideas delivered through committed partnerships can shape neighborhoods and transform communities.

The effort is a component of the Shell #makethefuture program, which endeavors to inspire entrepreneurs and young people to see science and engineering as potential career choices, and to use their minds to develop energy solutions for our planet’s future. The kinetic technology used at the soccer field was developed with the support of a UK Shell LiveWIRE grant, a program designed to be a catalyst for young students and entrepreneurs seeking to grow promising ideas into viable and sustainable businesses.

Could we someday see a World Cup match lit by the players or maybe even a Monday Night Football game?

Only time, and a continued commitment to advancements in renewable energy generation and storage, will tell.

For other interesting kinetic energy inventions (and potential innovations), continue reading here (link broken).

Image Source: Treehugger


Using Gravity to Save and Improve Lives

I came across an IndieGogo project focused on building and trialing a gravity-powered power station that can serve either as a lantern or as a flexible power source, able to power a task light, recharge batteries, or do other things that users might dream up and the designers can’t yet imagine.

Check out their video from IndieGogo:

They have already raised FIVE TIMES the money they set out to raise on IndieGogo.

I found it interesting that, in their promotional video, they explain they initially started with a design challenge of charging an indoor light using a solar panel, but decided to abandon the approach specified at the outset and pursue alternate power sources.

Also interesting from the IndieGogo project page are the following facts:

The World Bank estimates that, as a result of burning kerosene indoors, 780 million women and children inhale smoke equivalent to smoking two packets of cigarettes every day. Sixty percent of adult female lung-cancer victims in developing nations are non-smokers. The fumes also cause eye infections and cataracts, and burning kerosene is also more immediately dangerous: 2.5 million people a year, in India alone, suffer severe burns from overturned kerosene lamps. Burning kerosene comes with a financial burden too: kerosene for lighting alone can consume 10 to 20% of a household’s income. This burden traps people in a permanent state of subsistence living, buying cupfuls of fuel for their daily needs as and when they can.

The burning of kerosene for lighting also produces 244 million tonnes of carbon dioxide annually.

So, what do you think, a meaningful innovation or an interesting but impractical invention?

More information available on their web site here.

