In an age of disruption, the only viable strategy is to adapt. Today, we are undergoing major shifts in technology, resources, migration and demography that will demand that we make changes in how we think and what we do. The last time we saw this much change afoot was during the 1920s and that didn’t end well. The stakes are high.
In a recent speech, the EU’s High Representative for Foreign Affairs and Security Policy, Josep Borrell, highlighted the need for Europe to change and adapt to shifts in the geopolitical climate. He also pointed out that change involves far more than interests and incentives, carrots and sticks; even more important is identity.
“Remember this sentence,” he said. “’It is the identity, stupid.’ It is no longer the economy, it is the identity.” What he meant was that human beings build attachments to things they identify with and, when those are threatened, they are apt to behave in a visceral, reactive and violent way. That’s why change and identity are always inextricably intertwined.
“We can’t define the change we want to pursue until we define who we want to be.” — Greg Satell
The Making Of A Dominant Model
Traditional models come to us with such great authority that we seldom realize that they too once were revolutionary. We are so often told how Einstein is revered for showing that Newton’s mechanics were flawed it is easy to forget that Newton himself was a radical insurgent, who rewrote the laws of nature and ushered in a new era.
Still, once a model becomes established, few question it. We go to school, train for a career and hone our craft. We make great efforts to learn basic principles and gain credentials when we show that we have grasped them. As we strive to become masters of our craft we find that as our proficiency increases, so does our success and status.
The models we use become more than mere tools to get things done, but intrinsic to our identity. Back in the nineteenth century, the miasma theory, the notion that bad air caused disease, was predominant in medicine. Doctors not only relied on it to do their job, they took great pride in their mastery of it. They would discuss its nuances and implications with colleagues, signaling their membership in a tribe as they did.
In the 1840s, when a young doctor named Ignaz Semmelweis showed that doctors could prevent infections by washing their hands, many in the medical establishment were scandalized. First, the suggestion that they, as men of prominence, could spread something as dirty as disease was insulting. Even more damaging, however, was the suggestion that their professional identity was, at least in part, based on a mistake.
Things didn’t turn out well for Semmelweis. He railed against the establishment, but to no avail. He would eventually die in an insane asylum, ironically of an infection he contracted under care, and the questions he raised about the prevailing miasma paradigm went unanswered.
A Gathering Storm Of Accumulating Evidence
We all know that for every rule, there are exceptions and anomalies that can’t be explained. As the statistician George Box put it, “all models are wrong, but some are useful.” The miasma theory, while it seems absurd today, was useful in its own way. Long before we had technology to study bacteria, smells could alert us to their presence in unsanitary conditions.
But Semmelweis’s hand-washing regime threatened doctors’ view of themselves and their role. Doctors were men of prominence, who saw disease emanating from the smells of the lower classes. This was more than a theory. It was an attachment to a particular view of the world and their place in it, which is one reason why Semmelweis experienced such backlash.
Yet he raised important questions and, at least in some circles, doubts about the miasma theory continued to grow. In 1854, about a decade after Semmelweis instituted hand washing, a cholera epidemic broke out in London and a miasma theory skeptic named John Snow was able to trace the source of the infection to a single water pump.
Yet once again, the establishment could not accept evidence that contradicted its prevailing theory. William Farr, a prominent medical statistician, questioned Snow’s findings. Besides, Snow couldn’t explain how the water pump was making people sick, only that it seemed to be the source of some pathogen. Farr, not Snow, won the day.
Later it would turn out that a septic pit had been dug too close to the pump and the water had been contaminated with fecal matter. But for the moment, while doubts began to grow about the miasma theory, it remained the dominant model and countless people would die every year because of it.
Breaking Through To A New Paradigm
In the early 1860s, as the Civil War was raging in the US, Louis Pasteur was researching wine-making in France. While studying the fermentation process, he discovered that microorganisms spoiled beverages such as beer and milk. He proposed that they be heated to temperatures between 60 and 100 degrees Celsius to avoid spoiling, a process that came to be called pasteurization.
Pasteur guessed that similar microorganisms made people sick, which, in turn, led to the work of Robert Koch and Joseph Lister. Together they would establish the germ theory of disease. This work led not only to better sanitary practices, but eventually to the work of Alexander Fleming, Howard Florey and Ernst Chain and the development of antibiotics.
To break free of the miasma theory, doctors needed to change the way they saw themselves. The miasma theory had been around since Hippocrates. To forge a new path, doctors could no longer be guardians of ancient wisdom; they had to become evidence-based scientists, and that would require that everything about the field be transformed.
None of this occurred in a vacuum. In the late 19th century, a number of long-held truths, from Euclid’s Geometry to Aristotle’s logic, were being discarded, which would pave the way for strange new theories, such as Einstein’s relativity and Turing’s machine. To abandon these old ideas, which were considered gospel for thousands of years, was no doubt difficult. Yet it was what we needed to do to create the modern world.
Moving From Disruption to Resilience
Today, we stand on the precipice of a new paradigm. We’ve suffered through a global financial crisis, a pandemic and the most deadly conflict in Europe since World War II. The shifts in technology, resources, migration and demography are already underway. The strains and dangers of these shifts are already evident, yet the benefits are still to come.
To successfully navigate the decade ahead, we must make decisions not just about what we want, but who we want to be. Nowhere is this playing out more than in Ukraine right now, where the war being waged is almost solely about identity. Russians want to deny Ukrainian identity and to defy what they see as the US-led world order. Europeans need to take sides. So do the Chinese. Everyone needs to decide who they are and where they stand.
This is not only true in international affairs, but in every facet of society. Different eras make different demands. The generation that came of age after World War II needed to rebuild and they did so magnificently. Yet as things grew, inefficiencies mounted and the Boomer Generation became optimizers. The generations that came after worshiped disruption and renewal. These are, of course, gross generalizations, but the basic narrative holds true.
What should be clear is that where we go from here will depend on who we want to be. My hope is that we become protectors who seek to make the shift from disruption to resilience. We can no longer simply worship market and technological forces and leave our fates up to them as if they were gods. We need to make choices and the ones we make will be greatly influenced by how we see ourselves and our role.
As Josep Borrell so eloquently put it: “It is the identity, stupid. It is no longer the economy, it is the identity.”
— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash
When life doesn’t so much give you lemons as hurl them at you from a great height with the intent of inflicting significant damage, it’s sometimes a moment for big change. It can force you to jump the tracks, find ways around the problem, re-frame the world.
Innovation history is full of examples. Take the case of Django Reinhardt, a successful musician in the 1920s whose career was nearly brought to a sudden end in 1928 when his caravan caught fire, leaving him with life-threatening burns over half of his body and damaging two fingers of his left hand, with all that implied for his ever being able to play guitar again. His response was to evolve a completely new style using his remaining fingers, creating the distinctive approach which made his name as one of the founders of ‘gypsy jazz’.
The pianist Keith Jarrett faced a similar challenge to his ability to play, though fortunately a less physically direct one. Contracted to give a late-night concert in Cologne’s opera house, he arrived to find that a mix-up meant the piano on which he was to play was an out-of-tune and malfunctioning rehearsal instrument. After some frantic and unfinished attempts by a tuner to bring it into line, Jarrett embarked on a journey of improvisation, adapting to the limitations of the instrument to create what has become a legendary (and thankfully recorded) performance.
It’s not a new phenomenon. One of history’s great innovators was, arguably, pushed to reframe his world and think differently about it as the result of what he later termed ‘a happy accident’. Not quite the description most of us might use for falling ill with smallpox and losing the use of a leg which eventually had to be amputated. But that’s where the huge innovation legacy of Josiah Wedgwood began.
Born in 1730 in Burslem, Staffordshire, Wedgwood had clay — or at least traces of it — in his blood. He and his relatives had been turning pots for four generations and he’d learned the trade the hard way, doing whatever needed doing: putting his shoulder to the big wheel with the horses that drove the smaller potter’s wheel on which shapes were formed, stacking for firing and then unloading from the kiln, fetching, carrying, packing and shipping.
But his path to becoming a master potter was sadly cut short by an attack of smallpox when he was eleven years old, which weakened his knee to the point where he could not work the potter’s kick wheel on which the trade depended. Formally apprenticed to his older brother, he found that his bad leg left him unable to perform such laborious tasks as throwing pottery clay. Instead he began to spend much of his enforced sitting time reading and experimenting, trying out new ideas and recipes and painstakingly recording the results in his notebooks. It forced him to look at the whole process from a different angle, and reframing it threw up powerful — and valuable — insights.
Because pottery was about to become big business. From a pretty early date we’ve made use of clay to make functional utensils like plates and cups; relics found at Xianren Cave in China are close to 20,000 years old. But ceramics have also been a long-standing part of history as a visual pleasure, formed and decorated in exquisite ways using complex materials and techniques. The trouble was that only the very wealthy could afford the workmanship and materials needed to create the fine porcelain that was so prized in the early 18th century.
The future would belong to the innovators. Of whom his brother was not one….
In 1749, his apprenticeship ended. The family fortunes had improved somewhat thanks to his ideas, but his brother was not convinced of the value of innovation and refused to take Josiah in as a partner. So he left the family business and in 1752 formed a partnership with John Harrison and Thomas Alders. It was a short-lived association, as Wedgwood and Harrison clashed over manufacturing ideas. From too little to too much innovation under one roof….
His next move, in 1754, was more productive, a meeting of minds: his new partner was Thomas Whieldon, a successful potter who also “loved to experiment”. Whieldon was interested in Josiah’s approach, not least because there was an urgent need to improve the quality of his lead-glazed creamware while keeping costs competitive. As Wedgwood noted, “…these considerations induced me to try for some more solid improvement, as well in the Body, as the Glazes, the Colours, & the Forms, of the articles of our manufacture….”
He started keeping his ‘Experiment Book’ at this time, containing details of his work and carefully listing measurements and ingredients, using a coded system that only he could understand. Over the next years it swelled to contain the details of thousands of experiments, many of them failures but crucially providing a roadmap for future innovation directions.
In 1759, Josiah parted company from Whieldon and set up on his own, leasing the Ivy House pottery in Burslem, Staffordshire, from one of his uncles. Business grew and he opened a second works in the town. But in 1762 returning trouble with his leg forced him to spend several weeks in bed — as it turned out, another candidate for ‘happy accident’ status. It was while he was bed-bound in Liverpool that a friend introduced him to Thomas Bentley, who eventually became his business partner.
Innovation is rarely a solo act; in most cases it is the convergence of different skills, experience and insight which helps build something new. Think Hewlett and Packard, Jobs and Wozniak, Gates and Allen. It was certainly true of Wedgwood and Bentley; Bentley brought a deep understanding of the trading side of the pottery business, together with a classical education and a rich network of contacts. He understood the ways in which ceramic fashions were changing and how Wedgwood’s technical skills could play in such a market.
Over the next years they not only made a wide range of tableware but also speciality wares for retailers, dairies, sanitary suppliers (including tiles for indoor bathrooms and sewers all over England), and the home. Beyond the functional, Bentley also saw a growing demand for artefacts inspired by classical Greek, Roman and Etruscan styles, and the company began making cameos, vases, jugs, and plaques decorated with such themes. Just as the Meissen company in Germany had begun to work with artists and designers, so Wedgwood and Bentley began to draw on Bentley’s contacts to supply the artwork and give a distinctive style to their products.
Josiah Wedgwood didn’t like porcelain. Or rather he did, from an aesthetic point of view. Its white purity, translucent thin strength, the wonderful shapes it could be fashioned into, all of these triggered his potter’s admiration as the high point of the craft.
His objection to porcelain was entirely economic. This was a time when the big prize was not selling expensive ceramics to wealthy aristocrats but somehow giving the same experience of fine pottery plates, cups, saucers, pots and jugs to the growing middle class. The Industrial Revolution was changing the economic as well as the social structure of Britain; there was now a potential mass market and an appetite for new goods — all manufacturers like Wedgwood had to do was create good quality products to satisfy it. He understood this and worked hard to bring to simpler earthenware and stoneware the distinctive features, fine designs and tactile quality of porcelain.
Such earthenware offered a cheaper alternative to porcelain, which by this time was being made by a number of manufactories at Bow in London, Plymouth and Bristol, which had mastered the art of porcelain production at huge financial cost to themselves and their customers.
His favourite motto was ‘Everything yields to experiment’. And so he continued the laboratory work, aiming to move pottery from a “rude uncultivated craft” into a field of applied science. It was through this process that he made his first major achievement, the “invention of a green glaze”, recorded after six months of unsuccessful trial and error as experiment number seven. It proved popular and helped establish a reputation for innovation, which he followed up with the development of a better form of ‘creamware’. Creamware — a cream-coloured earthenware — had become popular as an economical alternative to porcelain, but it had significant limitations. Through his experimental approach Wedgwood transformed it into a high-quality ceramic that was remarkably versatile: it could be thrown on a wheel, turned on a lathe, or cast.
He began receiving orders from the highest-ranking people and in 1765 received an invitation from St James’s Palace, London, for a ‘complete set of tea things… with a gold ground & raised flowers upon it in green….’ The invitation came from Queen Charlotte, the wife of King George III, and it took the form of a competitive tender; fortunately Wedgwood’s hard work in the laboratory paid off. He won the competition and the contract; more importantly, he’d been canny enough to include other samples of his wares in his delivery and they attracted further interest. The Queen was so pleased that in 1766 she gave him what must rank as one of the first ‘celebrity endorsements’, issuing a royal warrant with the wording: ‘To this manufacturer the Queen is pleased to give her name and patronage, commanding it to be called Queensware, and honouring the inventor by appointing him Her Majesty’s Potter’.
Wedgwood was nothing if not quick on the uptake and soon the title ‘Potter to Her Majesty’, was being added to invoices and orders while ‘Queensware’ featured prominently in newspaper advertisements for his products. As with the Meissen porcelain business in Germany the importance of a brand identity became apparent. It was not customary for Staffordshire potters to put their name or mark on their wares but Josiah began stamping the base of his products as a mark of authenticity and quality.
His marketing wasn’t confined to influencers and advertising; he pioneered many innovative approaches to reaching and serving his growing market including offering free delivery from his factory to London and free replacement of items broken in transit. He opened shops and showrooms in fashionable cities like Bath as well as in the capital. And he pioneered a ‘two-tiered’ approach, selling first to the aristocracy at premium prices and then using the cachet which their adoption gave to promote sales at lower prices to aspirational middle class buyers.
In 1764, he had received his first order from abroad and built on that success. By 1769 he declared his aim was to become “Vase Maker General to the Universe”. He might not have exported off planet but did a pretty good job in terrestrial terms — by 1784, he was exporting nearly 80% of his total produce. By 1790, he had sold his wares in every city in Europe.
Image: AI generated via Google Imagen
Perhaps the project which best underlines his grasp of the competitive edge offered by a combination of technical competence, great design and sophisticated marketing is the famous ‘Frog Service’ commission which came from Empress Catherine the Great of Russia in 1773. This called for a huge dinner and dessert service (944 pieces) for use at the Chesme Palace near St. Petersburg. The palace was located on marshy ground and had once been called ‘La Grenouillerie’ because of its large frog population. Catherine wanted the frog motif to appear on every item of the service, and for it to contain 1,222 topographically correct hand-painted views of British landscapes! Significantly, she also wanted it made not of traditional porcelain but in Wedgwood’s Queensware.
Wedgwood undertook this huge commission and delivered, even though it took over 30 artists and two years to complete. It was never a commercial success; the cost of final delivery was £2,612 against a commission price of £2,290 (£503,280 and £439,680 in today’s terms). But it more than made up for the shortfall in reputation and marketing terms; the service was first displayed in London before delivery and attracted huge crowds, powerfully demonstrating that Wedgwood’s earthenware and stoneware could rival the best porcelain in the world.
When the service in its 22 crates finally arrived in St Petersburg in the autumn of 1774 it was displayed in the palace as a spectacle for visitors. The majority of it has survived and is now in the Hermitage Museum, St Petersburg.
But Wedgwood wasn’t only working on the marketing side; from his early days as an apprentice he’d looked for ways to improve production operations, focusing not just on single problem areas but looking at the manufacturing system as a whole.
He was an early adopter of steam power, which significantly reduced transportation costs since it meant that mills for grinding and preparing materials could now be located on the manufacturing site itself. It also mechanized the processes of throwing and turning pots, previously driven by foot or hand wheels. But it was less in his adoption of new machinery than in his approach to production organization that he had the biggest impact. He was fascinated by Adam Smith’s ideas around the division of labour, focusing on specialisation rather than having a single person carry out all the tasks in a series of operations. Mixing clay, throwing, firing and decorating were all separated into distinct operations and staffed by people trained in those areas, supported by specialised equipment. The result was a massive improvement in productivity, and the approach enabled the volume production needed to meet the demands of a growing mass market.
Significantly, Wedgwood also recognised the need to manage the major changes this would bring to working lives — not least the elimination of the old apprentice and journeyman system. In 1769, he opened a new factory complex, named ‘Etruria’ in a nod to the ancient Etruscans whose civilization had inspired many of his best-selling designs. It was a planned community designed to house his workshops, showrooms, and his workers and their families — a far cry from the ‘cottage industry’ in which he had grown up. It gave him practical advantages, such as the co-location of key activities reducing time and transportation costs, but it also represented an early attempt to create a different kind of working environment. He passed on some of the benefits of the factory’s (significantly) higher productivity by paying higher wages, and he experimented with ways of improving working conditions, providing clothing, washing facilities, and even an early form of air conditioning.
Etruria was strategically located next to the newly-constructed Trent and Mersey canal, a venture which he had campaigned hard for and which was to help significantly in managing the wider logistics and distribution challenges of the growing business. These weren’t small; in the mid-1700s the pottery industry was sourcing its clays and other materials from the south west, in Devon and Dorset, which meant shipping them to ports like Liverpool or Chester and then transporting them slowly along an antiquated road system down to Staffordshire. Wedgwood’s efforts to promote better roads, and particularly the cutting of a 94-mile canal linking Liverpool with the Potteries, paid off; despite a long planning battle with Parliament the canal was opened in 1777, and with it came the chance to reduce inbound logistics costs and open up better distribution to his increasingly global market.
Growing a business often carries with it the risk that cash flow gets out of balance but this wasn’t the case with Wedgwood. His early family history gave him an abiding sense of the need to control costs, once complaining that his sales were at an all-time high, yet profits were minimal. He studied cost structures and came to value economies of scale, trying to avoid producing one-off vases ‘at least till we are got into a more methodicall way of making the same sorts over again’.
And he brought a scientific approach to his work, carefully recording the results of his experiments to build a clearer understanding of how to move manufacturing from a haphazard trial-and-error process to one which allowed for reproducible control. In 1783 he was awarded a patent for a pyrometer designed to measure the extreme temperatures within a kiln, helping tame the chaotic and unpredictable firing process. For this he was elected a Fellow of the Royal Society, joining scientific friends and colleagues like Joseph Priestley and Matthew Boulton. Although the pyrometer was named as the key achievement, the award really testified to over thirty years of systematic research and development.
Wedgwood wasn’t one to rest on his laurels; in 1774 he built on his success with creamware with another major innovation — Jasperware. This new material, laboriously developed to offer a fresh approach to pottery making, was unglazed and had a distinctive matte, or “biscuit,” finish. He experimented with many different colours, including green, lilac, yellow, black, and white (the Victoria and Albert Museum in London has several display trays showing the different samples). But the distinctive light blue which caught the imagination, and which survives to this day in thousands of artists’ palettes and children’s colouring books, was Wedgwood blue.
Jasperware’s development built on Bentley’s observation of the growing interest in ancient cultural artefacts from Greek, Roman and Etruscan civilizations. Wealthier people were beginning to undertake the ‘Grand Tour’ of Europe and bringing back souvenirs such as Roman cameos; Jasperware provided the perfect medium for making such products available in England. Pieces would be ornamented with scenes and reliefs not simply painted on but applied as a separate layer of clay before firing; it gave Wedgwood products a distinctive trademark to further bolster their brand.
In keeping with Wedgwood’s philosophy the Jasperware product and process continuously evolved. For example early specimens used cobalt to colour the entire body by mixing it in with the clay; this was extremely expensive and so later development used a dipping process in which a thin layer of coloured slip — watery clay — was applied just before firing.
This continuous improvement of the Jasperware concept led to perhaps Wedgwood’s last and, in his own estimation, ‘great work’ — the five-year journey towards creating a replica of the ancient Roman Portland Vase. The original was a masterpiece of cameo glass from the 1st century BC, considered one of the greatest works of antiquity. No-one knew how it had been made; the secret of its creation had been lost for over 1,700 years. It was less a commercial venture (though once again it had powerful reputational benefits) than the ultimate challenge to his technical skills.
To achieve this, he conducted thousands of experiments over nearly five years to perfect the blue-black colour and the delicate, low-relief figures in his signature Jasperware. He relied on his pyrometer to control the firings and worked with renowned sculptor John Flaxman to create the intricate white reliefs. The project had many challenges, including blistering and cracking. But his persistence paid off. The first successful copies of the vase were released in 1790 and proved to be so accurate that when the original was accidentally shattered at the British Museum, Wedgwood’s Jasperware copy was used to help piece it back together.
Image: Wikimedia Commons
Wedgwood’s health had never been great; he’d finally had his leg amputated in 1768, and by 1770 his sight was beginning to fail. When Bentley died in 1780, he stepped back from the marketing side and focused his remaining attention on the factory and his laboratory. But in 1794 he fell ill again, and he died in 1795, aged 64.
Was it worth it? He’d started by inheriting £20 from his father, and when he died he left one of the finest industrial concerns in England with a personal worth of £500,000 (around £50 million today). When he began his business the big names in Staffordshire pottery were those of manufacturers like Josiah Spode and Thomas Minton; it didn’t take long before the name of Wedgwood and Bentley was up there with them, their company arguably the best-known pottery in the western world.
In doing so he helped create an industry which continues to produce beautiful artefacts for widespread use around the world. And one which has grown in value; the ceramic tableware market was worth $12.4bn in 2024 and is forecast to reach $22bn in the next ten years.
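As a back-of-envelope check on those two figures, the growth the forecast implies works out to roughly 5.9% per year compounded. A minimal sketch (the endpoints are the article’s; the rate is derived from them):

```python
# Implied compound annual growth rate (CAGR) from the article's figures:
# a $12.4bn market in 2024 forecast to reach $22bn ten years later.
start_value = 12.4   # $bn, 2024
end_value = 22.0     # $bn, forecast ten years on
years = 10

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # roughly 5.9%
```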
He left the business to his sons and they continued through several generations to maintain the reputation for quality and innovation. The company remained independent until 1987, when it merged with Waterford Crystal, then with Royal Doulton. In July 2015, it was acquired by the Finnish consumer goods company Fiskars, which has retained the brand and still produces ‘prestige’ wares such as hand-painted and limited edition objects. Jasperware is still made by a small team of skilled workers at the Barlaston factory, while the rest of the company’s output is produced in Indonesia.
So in a sense Josiah is still vase making to the universe….
The Convergence of Biology, Technology, and Human-Centered Innovation
GUEST POST from Art Inteligencia
For centuries, the principles of manufacturing have been rooted in a linear, resource-intensive model: extract, produce, use, and dispose. In this paradigm, our most creative biological processes, like fermentation, have been limited by their own inherent constraints—slow yields, inconsistent outputs, and reliance on non-renewable inputs like sugars. But as a human-centered change and innovation thought leader, I see a new convergence emerging, one that promises to rewrite the rules of industry. It’s a profound synthesis of biology and technology, a marriage of microbes and micro-currents. I’m talking about electrofermentation, and it’s not just a scientific breakthrough; it’s a paradigm shift that enables us to produce the goods of the future in a way that is smarter, cleaner, and fundamentally more sustainable. This is about using electricity to guide and accelerate nature’s most powerful processes, turning waste into value and inefficiency into a new engine for growth.
The Case for a ‘Smarter’ Fermentation
Traditional fermentation, from brewing beer to creating biofuels, is an impressive but imperfect process. It is a biological balancing act, often limited by thermodynamic and redox imbalances that reduce yield and produce unwanted byproducts. Think of it as a chef trying to cook a complex dish without being able to precisely control the heat or the ingredients. This lack of fine-tuned control leads to waste and inefficiency, a costly reality in a world where every resource counts.
Electrofermentation revolutionizes this by introducing electrodes directly into the microbial bioreactor. This allows scientists to apply an electric current that acts as an electron source or sink, providing a powerful, precise control mechanism. This subtle electrical “nudge” steers the microbial metabolism, overcoming the natural limitations of traditional fermentation. The result is a process that is not only more efficient but also more versatile. It enables us to use unconventional feedstocks, such as industrial waste gases or CO₂, and convert them into valuable products with unprecedented speed and yield. It’s the difference between guessing and knowing, between a linear process and a circular one.
The Startups and Companies Leading the Charge
This revolution is already underway, driven by a new generation of companies and startups that are harnessing the power of electrofermentation to solve some of the world’s most pressing problems. At the forefront is LanzaTech, a company that has pioneered a process to recycle carbon emissions. They are essentially retrofitting breweries onto industrial sites like steel mills, using their proprietary microbes to ferment waste carbon gases into ethanol and other valuable chemicals. In the food sector, companies like Arkeon are redefining what we eat. They are building a new food system from the ground up by using microbes to convert CO₂ and hydrogen into sustainable proteins. And in the materials science space, innovators are exploring how this technology can create everything from biodegradable plastics to advanced biopolymers, all from non-traditional and renewable sources. These are not just scientific curiosities; they are real-world ventures creating scalable, impactful solutions that are actively building a circular economy.
Case Study 1: LanzaTech – Turning Pollution into Products
The Challenge:
Industrial emissions from steel mills and other heavy industries are a major contributor to climate change. These waste gases—rich in carbon monoxide (CO) and carbon dioxide (CO₂)—are a significant liability, but they also represent a vast, untapped resource. The challenge was to find a commercially viable way to capture these emissions and transform them into something valuable, rather than simply releasing them into the atmosphere.
The Electrofermentation Solution:
LanzaTech developed a gas fermentation process that uses a special strain of bacteria (Clostridium autoethanogenum) that feeds on carbon-rich industrial gases. This is a form of electrofermentation where the microbes use the electrons from the gas to power their metabolism. The process diverts carbon from being a pollutant and, through a biological synthesis, converts it into useful products. It’s like a biological recycling plant that fits onto a smokestack. The bacteria consume the waste gas, and in return, they produce fuels and chemicals like ethanol, which can then be used to make sustainable aviation fuel, packaging, and household goods. The key to its success is the precision of the fermentation process, which maximizes the conversion of waste carbon to valuable products.
The Human-Centered Result:
LanzaTech’s innovation is a powerful example of a human-centered approach to a global problem. It’s a technology that not only addresses a critical environmental challenge but also creates new economic opportunities and supply chains. By turning industrial emissions from a “bad” into a “good,” it redefines our relationship with waste. It’s a move away from a linear, extractive economy and toward a circular, regenerative one, proving that sustainability can be a catalyst for both innovation and profit. It has commercial plants in operation, showing that this is not just a theoretical solution but a scalable reality.
Case Study 2: Arkeon – The Future of Food from Air
The Challenge:
The global food system is under immense pressure. Rising populations, climate change, and resource-intensive agricultural practices are straining our ability to feed everyone sustainably. The production of protein, in particular, has a significant environmental footprint, requiring vast amounts of land and water and generating substantial greenhouse gas emissions. The challenge is to find a new, highly efficient, and sustainable source of protein that is not dependent on traditional agriculture.
The Electrofermentation Solution:
Arkeon is using a form of electrofermentation to create a protein-rich biomass from air. Their process involves using specialized microbes called archaea, which thrive in extreme environments and can be “fed” on CO₂ and hydrogen gas. By using an electrical current to power this process, Arkeon can precisely control the microbial activity to produce amino acids, the building blocks of protein, with incredible efficiency. This innovative process decouples food production from agricultural land, water, and sunlight, making it a highly resilient and sustainable source of nutrition. It’s a closed-loop system where waste (CO₂) is the primary input, and a high-value, functional protein powder is the output.
The Human-Centered Result:
Arkeon’s work is a powerful human-centered innovation because it tackles one of the most fundamental human needs: food security. By developing a method to create protein from waste gases, the company is not only providing a sustainable alternative but also building a more resilient food system. This technology could one day enable localized, decentralized food production, reducing reliance on complex supply chains and making communities more self-sufficient. It is a bold, forward-looking solution that envisions a future where the air we breathe can be a source of sustainable, high-quality nutrition for everyone.
Conclusion: The Dawn of a New Industrial Revolution
Electrofermentation is far more than a technical trick. It represents a paradigm shift from a linear, extractive model to a circular, regenerative one. By converging biology and technology, we are unlocking the ability to produce what we need, not from the earth’s finite resources, but from the waste and byproducts of our own civilization. It is a testament to the power of human-centered innovation, where the goal is not just to build a better widget but to create a better world. For leaders, the question is not if this will impact your industry, but how you will embrace it. The future belongs to those who see waste not as a liability, but as a feedstock, and who are ready to venture beyond the traditional. This is the dawn of a new industrial revolution, and it’s powered by a jolt of electricity and a microbe’s silent work, promising a more sustainable and abundant future for us all.
This video provides a concise overview of LanzaTech’s carbon recycling process, which is a key example of electrofermentation in action.
Disclaimer: This article speculates on the potential future applications of cutting-edge scientific research. While based on current scientific understanding, the practical realization of these concepts may vary in timeline and feasibility and are subject to ongoing research and development.
Image credit: Pixabay
Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.
Layoffs, Store Closures & What It Means for Customer Service
LAST UPDATED: September 25, 2025 at 10:58PM
Exclusive Interview with Mario Matulich
In a world where corporate decisions often prioritize efficiency, the human element can be the first to suffer. The recent layoffs and restructuring at Starbucks, a brand synonymous with a unique, human-centered “third place” experience, have sent a tremor through the industry. In a wide-ranging interview, we will unpack the strategic and operational implications of these changes. Together, we will explore the difficult balance between trimming corporate fat and maintaining a brand built on emotional connection, diving into how these decisions could affect everything from in-store morale to the long-term loyalty of its customers. Central to the conversation is the following strategic question:
How can a company that has undergone significant corporate restructuring and layoffs maintain and restore a premium, human-centered customer experience?
Today we will explore this question, along with its various aspects, with our special guest Mario Matulich, a practice lead at the Customer Management Practice with broad commercial experience across a variety of industry verticals in the customer management sector. He is well-versed in market research, product development, sales, marketing, and operations, in addition to cross-functional management and leadership development.
Without further ado, here is the Q&A I had with Mario on a range of topics regarding the recent Starbucks store closures and layoffs and their implications:
The Strategic Context of the Layoffs
Q: Starbucks’ leadership framed the recent restructuring as a necessary step for efficiency and a return to their core mission. From your perspective in customer management, how do these internal changes directly affect the external customer experience in the short and long term? A: In the short term, layoffs, especially in corporate roles, can create gaps in innovation, brand narrative, and strategic support for store-level teams. Employees on the front lines may feel increased pressure, which can impact morale and the human connection customers expect. In the long term, if these gaps aren’t addressed, the result can be a more transactional experience that erodes both loyalty and trust.
Q: In many companies, layoffs are a last resort. Do you believe this restructuring reflects a failure of previous strategies, or is it a forward-thinking move to adapt to a changing market? What specific market trends do you think are driving these decisions? A: I don’t view this restructuring as purely a failure of previous strategies, but rather as an attempt to adapt to a changing market. That said, Starbucks’ bigger challenge is restoring its customer experience. Trends such as rising demand for personalized, convenient, and high-value experiences, along with increased competition in the premium coffee market, make it clear that customers are evaluating Starbucks not just on price, but on the overall experience delivered.
Q: The layoffs primarily targeted corporate roles in marketing, technology, and creative. How does the loss of talent in these specific areas impact the company’s ability to innovate and maintain its brand narrative? A: These areas are critical for innovation, storytelling, and digital experiences that connect customers to the brand. Losing talent here makes it more challenging to maintain a consistent, differentiated experience and risks further disengagement from customers.
Impact on the Human-Centered Experience
Q: Starbucks has long prided itself on the “third place” concept. How does restructuring and potential employee demoralization affect the in-store experience and the emotional connection customers have with the brand? A: The “third place” experience relies on motivated and supported employees. Restructuring can disrupt this, as uncertainty and low morale may trickle down to in-store interactions. Customers may perceive a decline in warmth, attentiveness, and consistency, which can undermine the emotional connection.
Q: With fewer people in corporate roles, who now owns the responsibility for a seamless customer journey? Does this push more responsibility onto store-level partners, and if so, are they equipped to handle it? A: While partners remain at the front line, the burden shouldn’t fall solely on them. Leadership must provide tools, guidance, and support to ensure a seamless experience, even as corporate teams shrink.
Q: Customer management is about building long-term loyalty. Do you believe this restructuring risks eroding the trust and loyalty of both employees and customers, and what would your practice recommend to mitigate that risk? A: Yes, there’s definitely a risk. The key is to go back to the basics and make the experience personal, easy, and fast. Nail those, and customers’ trust and loyalty will come back, and the layoffs won’t linger in their minds.
Measuring and Recovering from the Impact
Q: How would you advise Starbucks to measure the real-time impact of these changes on customer satisfaction? Beyond traditional metrics like NPS, what holistic experience measures should they be tracking? A: Starbucks should look beyond NPS to measure speed of service, personalization, emotional connection, and overall experience consistency. These metrics provide a more comprehensive view of the customer journey and help identify gaps that layoffs may create.
Q: Layoffs can create a perception of instability. What is the most effective way for a company to communicate its recovery plan and rebuild confidence with its customer base after such a significant change? A: Clear communication focused on restoring the core pillars of customer experience (personalization, ease, and speed) is key. Customers respond when they see tangible improvements in the experience they receive every day.
Q: In your experience, what is the typical timeline for a company to recover from the brand and cultural damage that can follow widespread layoffs? What are the critical milestones they should be focused on achieving? A: Recovery timelines vary, but visible improvements in customer experience can begin within months if executed strategically. Critical milestones include reestablishing operational consistency, restoring employee morale, and relaunching key brand initiatives that reinforce the premium experience promise.
Future-Proofing for Long-Term Growth
Q: Looking ahead, how can Starbucks utilize this moment of disruption to adopt a more resilient and human-centered organizational model? What key lesson should other companies learn from their experience to avoid similar pitfalls? A: Starbucks has a chance here to get back to what really made it successful: combining innovative, tech-forward solutions with a human touch, every time. The bigger lesson for any company is clear. Growth and cost-cutting shouldn’t come at the expense of the customer experience. People are willing to pay a premium, but only if the experience feels worth it.
Q: What message does it send that the popular Starbucks Roastery location in Capitol Hill in Seattle is being closed as part of this layoff and restructuring initiative? Why do you think they chose to do it? A: Closing the Roastery signals a prioritization of efficiency over experiential destinations. While it may make financial sense in the short term, it also serves as a cautionary reminder that iconic, high-touch experiences are critical to maintaining brand differentiation and customer loyalty.
Conclusion
Thank you for the great conversation Mario!
Ultimately, the Starbucks case study is a powerful lesson for every organization. As Matulich’s insights make clear, the pursuit of efficiency and growth cannot come at the expense of the human experience that defines your brand. The true measure of a company’s resilience is not in its stock price, but in the trust it has built with its employees and customers. A single-minded focus on traditional metrics is insufficient; a holistic approach that values emotional connection and employee morale is the only path to sustainable growth. The greatest challenge for Starbucks now is to move beyond reacting to a difficult market and begin proactively shaping its future—not just through cost-cutting, but by recommitting to the core narrative that made it a cultural institution in the first place. The future of any business is not found in a spreadsheet; it’s built on a foundation of human connection, one interaction at a time.
Image credits: Pexels, Mario Matulich
Ten years ago, only the most technologically advanced companies used AI — although it barely resembled what companies use today when communicating with customers — and it was very, very expensive. But not anymore. Today, any company can implement an AI strategy using ChatGPT-type technologies, often creating experiences that give customers what they want. But not always, which is why the information below is important.
The 2025 Findings
My annual customer service and customer experience (CX) research study surveys more than 1,000 U.S. consumers weighted to the population’s demographics of age, gender, ethnicity and geography. It included an entire group of questions focused on how customers react to and accept (or don’t accept) AI options to ask questions, resolve problems and communicate with a company or brand. Consider the following findings:
AI Success: Half of U.S. customers (50%) said they have successfully resolved a customer service issue using AI or ChatGPT-type technologies without needing human assistance. In 2024, only three out of 10 customers (32%) did so. That’s great news, but it’s important to point out that age makes a difference. Six out of 10 Gen-Z customers (61%) successfully used AI support versus just 32% of Boomers.
AI Is Far From Perfect: Half of U.S. customers (51%) said they received incorrect information from an AI self-service bot. Even with incredible improvement in AI’s capabilities, it still serves up wrong information. That destroys trust, not only in the company but also in the technology as a whole. A few bad answers and customers will be reluctant, at least in the near term, to choose self-service over the traditional mode of communication, the phone.
Still, Customers Believe: Four out of 10 customers (42%) believe AI and ChatGPT can handle complex customer service inquiries as effectively as humans. Even with the mistakes, customers believe AI solutions work. However, 86% of customers think companies using AI should always provide an option to speak or text with a real person.
The Phone Still Rules: It’s still too early to throw away phone support. My prediction is that it will be years, if ever, before human-to-human interactions completely disappear, a view borne out when we asked, “When you have a problem or issue with a company, which solution do you prefer to use: phone or digital self-service?” The answer is that 68% of customers will still choose the phone over digital self-service. That number is highly influenced by the 82% of Baby Boomers who choose to call a company over any other type of digital support.
The Future Looks Strong For AI Customer Support: Six out of 10 customers (63%) expect AI-fueled technologies to become the primary mode of customer support. We asked the same question in 2021, and only 21% of customers felt this way.
The Strategy Behind Using AI For CX
Age Matters: As you can see from some of the above findings, there is a big generational gap between younger and older customers. Gen-Z customers are more comfortable, have had more success, and want more digital/AI interactions compared to older customers. Know your customer demographics and provide the appropriate support and communication options based on their age. Recognize you may need to provide different support options if your customer base is “everyone.”
Trust Is a Factor: Seven out of 10 customers (70%) have concerns about privacy and security when interacting with AI. Once again, age makes a difference. Trust and confidence with AI consistently decrease with age.
The Future of AI
As AI continues to evolve, especially in the customer service and experience world, companies and brands must find a balance between technology and the human touch. While customers are becoming more comfortable and finding success with AI, we can’t become so enamored with it that we abandon what many of our customers expect. The future of AI isn’t a choice between technology and humans. It’s about creating a blended experience that plays to the technology’s strengths and still gives customers the choice.
Furthermore, if every business had a 100% digital experience, what would be a competitive differentiator? Unless you are the only company that sells a specific product, everything becomes a commodity. Again, I emphasize that there must be a balance. I’ll close with something I’ve written before, but it bears repeating:
The greatest technology in the world can’t replace the ultimate relationship-building tool between a customer and a business: the human touch.
This article was originally published on Forbes.com.
Image Credits: Google Gemini
Building a learning organization goes beyond adopting new methods or tools. At its core, it’s about fostering a culture where continuous growth, adaptability, and shared learning are prioritized at every level.
Creating this culture requires a top-down commitment led by leadership and management teams who embody a growth mindset, promote psychological safety, and actively engage in building a learning-focused environment.
Without this dedication, organizations miss a crucial opportunity to develop the capabilities essential for innovation and future-readiness.
Why is this important? Well, in today’s unpredictable and rapidly evolving landscape, a learning organization isn’t just a “nice-to-have” – it’s an imperative. While a company may excel in current operations, failing to invest in learning and adaptability poses significant risks to long-term success. Can any organization truly afford to ignore the need to shape its future?
Three Key Pillars
The foundation of a strong learning organization rests on three pillars:
a growth mindset,
psychological safety, and
an unwavering commitment to fostering a culture of learning.
Leaders must first embody these values to inspire the entire organization to follow. It starts with self-reflection: How can leaders upgrade their mindset, skills, and tools to champion this change? How can they be supported in making it happen?
Only when leaders truly commit to this journey can we build a resilient organization where people and teams possess the adaptability, skills, and mindset needed to innovate, grow, and thrive.
Image Credit: Stefan Lindegaard
Population, Scarcity, and the New Era of Human Value
LAST UPDATED: December 3, 2025 at 6:17 PM
GUEST POST from Art Inteligencia
We stand at a unique crossroads in human history. For centuries, the American story has been a tale of growth and expansion. We built an empire on a relentless increase in population and labor, a constant flow of people and ideas fueling ever-greater economic output. But what happens when that foundational assumption is not just inverted, but rendered obsolete? What happens when a country built on the idea of more hands and more minds needing more work suddenly finds itself with a shrinking demand for both, thanks to the exponential rise of artificial intelligence and robotics?
The Old Equation: A Sinking Ship
The traditional narrative of immigration as an economic engine is now a relic of a bygone era. For decades, we debated whether immigrants filled low-skilled labor gaps or competed for high-skilled jobs. That entire argument is now moot. Robotics and autonomous systems are already replacing a vast swath of low-skilled labor, from agriculture to logistics, with greater speed and efficiency than any human ever could. This is not a future possibility; it’s a current reality accelerating at an exponential pace. The need for a large population to perform physical tasks is over.
But the disruption is far more profound. While we were arguing about factory floors and farm fields, Artificial Intelligence (AI) has quietly become a peer-level, and in many cases, superior, knowledge worker. AI can now draft legal briefs, write code, analyze complex data sets, and even generate creative content with a level of precision and speed no human can match. The very “high-skilled” jobs we once championed as the future — the jobs we sought to fill with the world’s brightest minds — are now on the chopping block. The traditional value chain of human labor, from manual to cognitive, is being dismantled from both ends simultaneously.
But workers are not the only thing being disrupted. Governments will be disrupted as well. Why? Because companies will be incentivized to decrease profitability by investing in compute to remain competitive. This means the tax base will shrink at the same time that humans will need increased financial assistance from the government. Taxes are only paid by businesses when there is profit (unless you switch to a revenue basis), and workers only pay taxes when they’re employed. A decreasing tax base and rising welfare costs are obviously unsustainable, and another proof point for why smart countries have already started reducing their population to decrease the chances of default and social unrest.
“The question is no longer ‘What can humans do?’ but ‘What can only a human do?'”
The New Paradigm: Radical Scarcity
This creates a terrifying and necessary paradox. The scarcity we must now manage is not one of labor or even of minds, but of human relevance. The old model of a growing population fueling a growing economy is not just inefficient; it is a direct path to social and economic collapse. A population designed for a labor-based economy is fundamentally misaligned with a future where labor is a non-human commodity. The only logical conclusion is a Great Contraction — a deliberate and necessary reduction of our population to a size that can be sustained by a radically transformed economy.
This reality demands a ruthless re-evaluation of our immigration policy. We can no longer afford to see immigrants as a source of labor, knowledge, or even general innovation. The only value that matters now is singular, irreplaceable talent. We must shift our focus from mass immigration to an ultra-selective, curated approach. The goal is no longer to bring in more people, but to attract and retain the handful of individuals whose unique genius and creativity are so rare that AI can’t replicate them. These are the truly exceptional minds who will pioneer new frontiers, not just execute existing tasks.
The future of innovation lies not in the crowd, but in the individual who can forge a new path where none existed before. We must build a system that only allows for the kind of talent that is a true outlier — the Einstein, the Tesla, the Brin, but with the understanding that even a hundred of them will not be enough to employ millions. We are not looking for a workforce; we are looking for a new type of human capital that can justify its existence in a world of automated plenty. This is a cold and pragmatic reality, but it is the only path forward.
Human-Centered Value in a Post-Labor World
My core philosophy has always been about human-centered innovation. In this new world, that means understanding that the purpose of innovation is not just about efficiency or profit. It’s about preserving and cultivating the rare human qualities that still hold value. The purpose of immigration, therefore, must shift. It is not about filling jobs, but about adding the spark of genius that can redefine what is possible for a smaller, more focused society. We must recognize that the most valuable immigrants are not those who can fill our knowledge economy, but those who can help us build a new economy based on a new, more profound understanding of what it means to be human.
The political and social challenges of this transition are immense. But the choice is clear. We can either cling to a growth-based model and face the inevitable social and economic fallout, or we can embrace this new reality. We can choose to see this moment not as a failure, but as an opportunity to become a smaller, more resilient, and more truly innovative nation. The future isn’t about fewer robots and more people. It’s about robots designing, building and repairing other robots. And, it’s about fewer people, but with more brilliant, diverse, and human ideas.
This may sound like a dystopia to some people, but to others it will sound like the future is finally arriving. If you’re still not quite sure what this future might look like and why fewer humans will be needed in America, here are a couple of videos from the present that will give you a glimpse of why this may be the future of America:
INFOGRAPHIC ADDED DECEMBER 3, 2025:
Image credits: Google Gemini
Last night, I lied to a room full of MBA students. I showed them the Design Squiggle, and explained that innovation starts with (what feels like) chaos and ends with certainty.
The chaos part? Absolutely true.
The certainty part? A complete lie.
Nothing is Ever Certain (including death and taxes)
Last week I wrote about the difference between risk and uncertainty. Uncertainty occurs when we cannot predict what will happen when acting or not acting. It can also be broken down into Unknown uncertainty (resolved with more data) and Unknowable uncertainty (which persists despite more data).
But no matter how we slice, dice, and define uncertainty, it never goes away.
It may be higher or lower at different times, and, more importantly, it changes focus.
Four Dimensions of Uncertainty
Something new that creates value (i.e. an innovation) is multi-faceted and dynamic. Treating uncertainty as a single “thing” therefore clouds our understanding and our ability to find and address root causes.
That’s why we need to look at different dimensions of uncertainty.
WHAT: Content uncertainty relates to the outcome or goal of the innovation process. To minimize it, we must address what we want to make, what we want the results to be, and what our goals are for the endeavor.
WHO: Participation uncertainty relates to the people, partners, and relationships active at various points in the process. It requires constant re-assessment of expertise and capabilities required and the people who need to be involved.
HOW: Procedure uncertainty focuses on the process, methods, and tools required to make progress. Again, it requires constant re-assessment of how we progress towards our goals.
WHERE: Time-space uncertainty focuses on the fact that the work may need to occur in different locations and on different timelines, requiring us to figure out when to start and where to work.
It’s tempting to think each of these are resolved in an orderly fashion, by clear decisions made at the start of a project, but when has a decision made on Day 1 ever held to launch day?
Uncertainty in Pharmaceutical Development
Let’s take the case of NatureComp, a mid-sized pharmaceutical company, and the uncertainties they navigated while working to replicate, develop, and commercialize a natural substance to target and treat heart disease.
What molecule should the biochemists research?
How should the molecule be produced?
Who has the expertise and capability to synthetically produce the selected molecule, given that NatureComp doesn’t have the required experience internally?
Where can the molecule be produced in a way that meets the synthesis criteria and is cost-effective at low volume?
What disease specifically should the molecule target so that initial clinical trials can be developed and run?
Who will finance the initial trials and, hopefully, become a commercialization partner?
Where would the final commercial entity reside (e.g. stay within NatureComp, move to a partner, or become a stand-alone startup), and where would the molecule be produced?
And those are just the highlights.
It’s all a bit squiggly
The knotty, scribbly mess at the start of the Design Squiggle is true. The line at the end is a lie because uncertainty never goes away. Instead, we learn and adapt until it feels manageable.
Next week, you’ll learn how.
Image credit: The Process of Design Squiggle by Damien Newman, thedesignsquiggle.com
“Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually slaves of some defunct economist,” John Maynard Keynes, himself a long dead economist, once wrote. We are, much more than we’d like to admit, creatures of our own age, taking our cues from our environment.
That’s why we need to be on the lookout for our own biases. The truth, as we see it, is often more of a personalized manifestation of the zeitgeist than it is the product of any real insight or reflection. As Richard Feynman put it, “The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that.”
We can’t believe everything we think. We often seize upon the most easily available information, rather than the most reliable sources. We then seek out information that confirms those beliefs and reject evidence that contradicts existing paradigms. That’s what leads to bad decisions. If what we see determines how we act, we need to look carefully.
The Rise And Fall Of Social Darwinism
In the 1860s, in response to Darwin’s ideas, Herbert Spencer and others began promoting the theory of Social Darwinism. The basic idea was that “survival of the fittest” meant that society should reflect a Hobbesian state of nature, in which most can expect a life that is “nasty, brutish and short,” while an exalted few enjoy the benefits of their superiority.
This was, of course, a gross misunderstanding of Darwin’s work. First, Darwin never used the term “survival of the fittest,” which was actually coined by Spencer himself. Second, Darwin never meant to suggest that there are certain innate qualities that make one individual better than others, but that as the environment changes, certain traits tend to be propagated which, over time, can lead to a new species.
Still, if you see the world as a contest for individual survival, you will act accordingly. You will favor a laissez-faire approach to society, punishing the poor and unfortunate and rewarding the rich and powerful. In some cases, such as Nazi Germany and in the late Ottoman empire, Social Darwinism was used as a justification for genocide.
While some strains of Social Darwinism still exist, for the most part it has been discredited, partly because of excesses such as racism, eugenics and social inequality, but also because more rigorous approaches, such as evolutionary psychology, show that altruism and collaboration can themselves be adaptive traits.
The Making Of The Modern Organization
When Alfred Sloan created the modern corporation at General Motors in the early 20th century, what he really did was create a new type of organization. It had centralized management and far-flung divisions, and it was exponentially more efficient at moving men and materiel around than anything that had come before.
He called it “federal decentralization.” Management would create operating principles, set goals and develop overall strategy, while day-to-day decisions were performed by people lower down in the structure. While there was some autonomy, it was more like an orchestra than a jazz band, with the CEO as conductor.
Here again, what people saw determined how they acted. Many believed that a basic set of management principles, if conceived and applied correctly, could be adapted to any kind of business, which culminated in the “Nifty Fifty” conglomerates of the ’60s and ’70s. It was, in some sense, an idea akin to Social Darwinism, implying that there are certain innate traits that make an organization more competitive.
Yet business environments change and, while larger organizations may be able to drive efficiencies, they often find it hard to adapt to changing conditions. When the economy hit hard times in the 1970s, the “Nifty Fifty” stocks vastly underperformed the market. By the time the ’80s rolled around, conglomerates had fallen out of fashion.
Industries and Value Chains
In 1985, a relatively unknown professor at Harvard Business School named Michael Porter published a book called Competitive Advantage, which explained that by optimizing every facet of the value chain, a firm could consistently outperform its competitors. The book was an immediate success and made Porter a management superstar.
Key to Porter’s view was that firms compete in industries that are shaped by five forces: competitors, customers, suppliers, substitutes, and new market entrants. So he advised leaders to build and leverage bargaining power in each of those directions to create a sustainable competitive advantage for the long term.
If you see your business environment as being neatly organized in specific industries, everybody is a potential rival. Even your allies need to be viewed with suspicion. So, for example, when a new open source operating system called Linux appeared, Microsoft CEO Steve Ballmer considered it to be a threat and immediately attacked, calling it a cancer.
Yet even as Ballmer went on the attack, the business environment was changing. As the internet made the world more connected, technology companies found that leveraging that connectivity through open source communities was a winning strategy. Microsoft’s current CEO, Satya Nadella, says that the company loves Linux. Ultimately, it recognized that it couldn’t continue to shut itself out and compete effectively.
Looking To The Future
Take a moment to think about what the world must have looked like to J.P. Morgan a century ago, in 1922. The disruptive technologies of the day, electricity and internal combustion, were already almost 40 years old, but had little measurable economic impact. Life largely went on as it always had and the legendary financier lorded over his domain of corporate barons.
That would quickly change over the next decade, as those technologies gained traction, formed ecosystems and drove a 50-year boom. The great “trusts” that he built would get broken up, and by 1930 virtually all of them would be dropped as components of the Dow Jones Industrial Average. Every facet of life would be completely transformed.
We’re at a similar point today, on the brink of enormous transformation. The recent string of calamities, including a financial meltdown, a pandemic and the deadliest war in Europe in 80 years, demands that we take a new path. Powerful shifts in technology, demographics, resources and migration suggest that even more disruption may be in our future.
The course we take from here will be determined by how we see the world we live in. Do we see our fellow citizens as a burden or an asset? Are new technologies a blessing or a threat? Is the world full of opportunities to be embraced or dangers we need to protect ourselves from? These are questions we need to think seriously about.
How we answer them will determine what comes next.
— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash