Category Archives: Innovation

Innovation Theater – How to Fake It ‘Till You Make It


GUEST POST from Arlen Meyers

The overwhelming majority of doctors, engineers and scientists don’t have an entrepreneurial mindset. What’s more, when they have an idea, they don’t know what to do with it, since they do not learn those competencies in their formal training. They just don’t know how to innovate their way out of our sick care mess.

But that hasn’t stopped lots of them from trying, including non-sick-care entrepreneurs. They just improvise.

Now that Elizabeth Holmes has been convicted, many are commenting on the pros and cons of the “fake it ’till you make it” ethos of entrepreneurs and Silicon Valley. But is this really a black and white issue? Is it true that “you have no business being something you are not, or doing something without proving your worth”? I venture to say many of us, including me, have done something that was not a good fit, and we have all tried things when we simply did not know what we did not know.

Here’s how to fake it when you don’t know what you are doing or you forgot your lines:

  1. Avoid these wannapreneur rookie mistakes.
  2. If you are a female, find a male wingman so someone will invest in your product.
  3. Surround yourself with people who are way above your pay grade at lots of Meetups.
  4. Practice Therantology.
  5. When you inevitably fail, make a big deal out of it and about how much you learned from your mistakes, and include them on your LinkedIn profile. Rinse. Repeat.
  6. Wear a fleece vest with your company logo.
  7. Plead ignorance about how hard it is to get anybody in sick care to change and about the long sales cycles.
  8. Be sure you have lots of hood ornaments (doctors with fancy titles) on your advisory board, prominently posted on your website.
  9. Hire a virtual assistant who answers all of your calls and says that she/he will not be able to connect you immediately because you are in an investor meeting.
  10. Get your co-working space guy to allow you to use more space than you are actually paying for when people come for meetings. Bribe interns with pizza to come and look busy.
  11. Forget being your authentic self. “You are generally better off coming across as likable, which will generally require some effort, restraint, and attention to what others expect and want to see. Seeming authentic in the process is the cherry on top of the cake, but it requires a fair amount of faking.”
  12. Try being a good rebel even if you are a bad one.

During these times, we are supposed to wear a mask. Most of us wear a mask all the time to hide our insecurities or avoid being outed as an imposter or physician wannapreneur, so none of this should be new to you.

 In a follow-up to their February 2021 article challenging the commonly understood definition of imposter syndrome, authors Ruchika Tulshyan and Jodi-Ann Burey offer actionable steps managers can take to end imposter syndrome in their organizations. Doing so will require work at both the interpersonal and organizational levels, and success will depend in part on gathering data and implementing real mechanisms for accountability. The authors call on managers to stop calling natural, human tendencies of self-doubt, hesitation, and lack of confidence “imposter syndrome.” Those who want women to lend their full talents and expertise must question the culture at work — not their confidence at work.

These things come with practice. But, since you are part of innovation theater, practice your lines, be sure you are wearing the right costume and that the stage is set properly. Break a leg.

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.

Innovative Companies Using Emerging Technologies


GUEST POST from Chateau G Pato

In the fast-paced world of business, companies must constantly innovate to stay ahead. Today, leveraging emerging technologies is essential for gaining a competitive advantage. Here, I explore how three pioneering companies are using emerging technologies to transform their industries and what lessons can be learned from their experiences.

Case Study 1: Tesla – Revolutionizing the Auto Industry with Autonomous Driving

Company Overview

Tesla, founded in 2003, has become synonymous with electric vehicles and innovations in the auto industry. Under the leadership of Elon Musk, Tesla has pushed the boundaries of what’s possible with cars, focusing on sustainability and advanced technology.

Technology Innovation

One of Tesla’s most groundbreaking endeavors is the development of autonomous driving technology. With the introduction of its Autopilot and Full Self-Driving (FSD) systems, Tesla not only enhances vehicle safety but also opens doors to a future where cars could drive themselves without human intervention.

Impact

This technology is setting a new standard in automotive innovation. Tesla’s over-the-air software updates ensure that its cars get smarter over time, maintaining an edge over traditional automakers. The advent of autonomous driving could revolutionize transport logistics, reduce traffic congestion, and enhance overall road safety.

Case Study 2: Amazon – Transforming Customer Experience with AI and Robotics

Company Overview

Amazon started as an online bookstore in 1994 and has since evolved into a global e-commerce and cloud computing giant. Its founder, Jeff Bezos, has always placed a high value on innovation and customer-centric service.

Technology Innovation

Amazon has been at the forefront of AI and robotics to improve its logistics and customer experience. The use of Kiva robots in its warehouses and AI-driven recommendations on its website are just a few examples of how Amazon hones its competitive edge.

Impact

These technologies have tremendously improved the efficiency and speed of Amazon’s logistics network, allowing the company to deliver goods faster and more reliably. Moreover, AI-powered personal recommendations have increased conversion rates and enhanced the shopping experience, driving customer loyalty.

Case Study 3: IBM – Harnessing Quantum Computing for Unprecedented Problem-Solving

Company Overview

IBM is a legendary technology company that has played a pivotal role in the computing industry for over a century. Always at the forefront of tech innovation, IBM now focuses on quantum computing, AI, and cloud solutions.

Technology Innovation

IBM’s commitment to quantum computing is a game-changer. Quantum computers hold the potential to solve complex problems that are currently impossible for classical computers. IBM develops quantum systems and software and provides quantum computing as a service (QCaaS) through the IBM Quantum Experience.

Impact

This technology could revolutionize fields such as cryptography, drug discovery, and financial modeling. By providing access to quantum computing capabilities, IBM empowers researchers and businesses to explore new possibilities, thus driving innovation across various industries.

Conclusion

These case studies illustrate how companies can harness emerging technologies to redefine industry standards, improve efficiencies, and spearhead innovation. By looking at Tesla, Amazon, and IBM, we see the power of visionary thinking and technological adoption in driving business success. As we move forward, it’s essential for businesses to stay ahead by continuously exploring how emerging technologies can be integrated into their operations and strategies to lead their fields.

Bottom line: Futurology is not fortune telling. Futurists use a scientific approach to create their deliverables, but a methodology and tools like those in FutureHacking™ can empower anyone to engage in futurology themselves.

Image credit: Pixabay


Not Invented Here

Sometimes Letting Go is the Hardest Part of Innovation


GUEST POST from John Bessant

(You can find a podcast version of this story here)

The Welsh valleys are amongst the most beautiful in the world. Lush green hills steeply falling into gorges with silver water glistening below. It’s a place of almost perfect peace, the only movement the gentle trudge of sheep grazing safely, shuffling across the jagged landscape the way they’ve done for thousands of years. And amongst the most scenic and peaceful of these valleys are those situated between Dolgellau in the north, and Machynlleth in the south.

Except when there’s traffic in the ‘Mach Loop’ — which is what the region is known as in military circles. It’s the place where young men and women from a variety of international air forces hone their skills at high-speed low-level flying, often as low as 75 meters from the ground. At any moment your peaceful walk may be rudely interrupted by the roar of afterburners, your view across the green hillsides suddenly broken by the nose of an F-16 or Typhoon poking its way up from one of the gorges below.

Your reaction may be mixed: annoyance at the interruption, or admiration for the flying skills of those pilots giving you a personal air display. But it’s certainly impossible to ignore. And it does raise an interesting question — despite the impressive skills being demonstrated, do we actually need pilots flying the planes? Is there an alternative technology which allows low-level, high-precision flying but which can be carried out by an operator sitting far away in a remote location? After all, we’ve become pretty good at controlling devices at a distance; we can even land them on distant planets or steer a course through the furthest reaches of our universe.

UAVs — unmanned aerial vehicles — are undoubtedly changing the face of aviation. But are they also a disruptive innovation, particularly in the military world where the heroic tradition of those magnificent men (and women) in their flying machines is still so strong?

A brief history of drones

The idea of using unmanned flying vehicles isn’t new; back in 1849 Austrian soldiers attacked the city of Venice with unmanned balloons filled with explosives. During the early years following the Wright brothers’ successful flight, researchers began looking at the possibilities of unmanned aircraft. The first prototype took off in 1916 in the form of the Ruston Proctor Aerial Target; as its name suggests, it was a pilotless machine designed to help train British aircrew in dogfighting. Importantly, it drew on early versions of radio control and was one of the many brainchildren of Nikola Tesla, but its early performance was unremarkable and the British military chose to scrap the project, believing that unmanned aerial vehicles had limited military potential.

A year later, an American alternative was created: the Hewitt-Sperry Automatic Airplane. Successful trials led to the development of a production version, the Kettering Bug, in 1918. Although its performance was impressive, it arrived too late to be used in the war and further development was shelved.

By the time of the Second World War the enabling technologies around control and navigation had improved enormously; whilst still crude the German V1 and V2 rockets and flying bombs provided a powerful demonstration of what could be achieved at scale. Emphasis was placed on remote delivery of explosives — using UAVs as flying bombs or aerial torpedoes — but the possibilities of using them in other applications such as reconnaissance were beginning to be explored.

The Vietnam War saw this aspect come to the fore; the difficulties of operating in remote jungle and mountain zones made reconnaissance flying hazardous, and the risks to aircrew who were shot down led to extensive use of UAVs. The Ryan Firebee drone flew over 5,000 surveillance missions, controlled by a ground operator using a remote camera. Its versatility meant that it could be used for surveillance, for delivery of supplies, and as a weapon; UAVs began to be viewed as an alternative to manned aircraft. But despite their success and promise, it was not until the 1990s that they began to occupy an increasingly significant role.

Early Drone - Wikimedia Commons

The technology found more support in Israel and during the 1973 Yom Kippur war UAVs were used in a variety of ways, as part of an integrated approach alongside piloted aircraft. A great deal of learning in this context meant that for a while Israel became the key source of UAV technology with the US acquiring and deploying this knowledge to improve its own capabilities, leading to the new generation deployed in the Gulf War. UAVs emerged as a critical tool for gathering intelligence at the tactical level. These systems were employed for battlefield damage assessment, targeting, and surveillance missions, particularly in high-threat airspace.

Fast forward to today. There’s been an incredible acceleration in the key enabling technologies, which has helped UAVs establish themselves as serious contenders for many aerial roles. For example, GPS has moved from its early days in 1981, when a unit weighed 50kg and cost over $100k, to a current cost of less than $5 for a chip-based unit weighing less than a gram. The inertial measurement unit (IMU), which measures a drone’s velocity, orientation and acceleration, has followed a similar trajectory; in the 1960s an IMU weighed several kilograms and cost several million dollars, but today the chipset which puts these features on your phone costs around $1. Kodak’s 1976 digital camera could only manage a 0.1 megapixel image from a unit weighing 2kg and costing over $10,000. Today’s digital cameras are roughly a hundred million times better (1000x the resolution, 1000x smaller and 100x cheaper). And (perhaps most important) the communications capabilities now offered by Wi-Fi and Bluetooth enable accurate and long-range communication and control.

With an improvement trajectory like this you might be forgiven for assuming that UAVs would have largely replaced manned flying in most applications. It’s a cheap technology, versatile and (in military terms) expendable — losing a drone doesn’t carry with it the tragic costs of losing a trained pilot. Yet the reality is that the Mach Loop continues to reverberate with the sound of fast jets and their pilots practicing high-speed low-level maneuvers.

Not invented here?

Continuing to rely on manned aircraft is also a costly option — when a British F-35 Lightning crashed after take-off from an aircraft carrier in 2021, it represented over £100m sinking beneath the waves. So why is the adoption of UAV technology still problematic within major established air forces? It almost looks like another case of ‘not invented here’ (NIH) — that strange innovation phenomenon in which otherwise smart organizations reject or bury promising new ideas.

At first sight it fits into a pattern which has been around a long time. Take the case of continuous-aim gunfire at sea. It sounds rather dry and technical, but what it boils down to is that 19th century naval warfare was not a very accurate game. Trying to shoot at something a long way away whilst perched on a ship which is rocking and rolling unexpectedly isn’t easy; most ships firing at other ships missed their targets. A study commissioned by the US Bureau of Ordnance in the late 1890s found an accuracy rate of less than 3%; in one test in 1899, five ships of the British North Atlantic Squadron fired for five minutes each at an old hulk at a range of 1,600 yards, and after 25 minutes only two hits had been registered on the target.

Clearly there was scope for innovation, and it took place in 1898 on the decks of a British navy cruiser, HMS Scylla, under the command of Percy Scott. He’d noticed that one of his gun crews was managing a much better performance and began studying and exploring what they were doing, with a view to developing it into a system. By the time he took command of a squadron in the South China Sea two years later, he had refined his methods, equipped his flagship HMS Terrible with new equipment, and trained his gun crews.

Image: Painting by Christoffer Wilhelm Eckersberg, public domain

The improvements were significant and importantly influenced a young US lieutenant on secondment to the squadron. William Sims learned about the new system and applied it on his own ship with remarkable results; convinced of the power of this innovation he decided it was his mission to carry the news to Washington and change naval practice. What followed is a fascinating story for what it reveals about NIH and the many ways in which it can be deployed.

In his fascinating account, Elting Morison highlights three strategies used by the US Navy to defend against the new idea. The first was simply to bury it; Sims’ reports to the Bureau of Ordnance and the Bureau of Navigation were simply filed away and forgotten. The second was to try to rebut the information; the response from the Bureau of Ordnance was along the lines of claiming that US equipment was as good as the British, so any differences in firing performance must be due to the men involved. More important was their argument that continuous-aim firing was impossible; when that failed, they conducted experiments to try to show there was no significant benefit from the approach. By running them on dry land they were able to cast doubt on the relative advantage of the new approach.

And their third strategy was to try to sideline Sims, painting him as an eccentric troublemaker, stressing his youth and lack of experience, highlighting the fact that he’d spent too long with the British navy, and in other ways undermining his credibility. Needless to say, this only strengthened Sims’ resolve, and he duly went over the heads of the senior staff and appealed to President Roosevelt himself. He finally ‘won’; he remained as unpopular as ever, but the new approach was grudgingly adopted and quickly became the dominant design for future naval gunnery.

Image: UK HMSO Public Domain

On dry land, and a decade later, a similar outsider — Major J.F.C. Fuller — was working with the British Army. He’d seen the possibilities of using tanks as a fast mobile strike capability, and his ideas were briefly borne out in the latter part of the First World War, when tanks were used to good effect at Cambrai and Amiens. But despite being given responsibility for introducing the new technology, he met with resistance (not helped by his abrasive nature); there were many who saw tanks as an unwelcome diversion. It didn’t help that their organizational location in the command structure was in the Cavalry Corps — the very group most threatened by the change to tanks. Their post-war strategy was to continue to rely on the equine model, ‘more hay, more horses’, rather than invest in tanks or learn about tank warfare. Elsewhere, though, his ideas found fertile soil, and he was credited by Adolf Hitler himself as an architect of ‘blitzkrieg’ — the fast mobile warfare which helped overrun France and much of Europe within a few weeks at the start of the Second World War.

Drones as disruptive innovation?

Of course it’s complicated, but could the case of drone adoption be history repeating itself? One explanation for why NIH happens in this fashion can be found in what we’ve learned about disruptive innovation. When it was published twenty-five years ago, Clayton Christensen’s classic book exploring the phenomenon, ‘The Innovator’s Dilemma’, had the intriguing subtitle ‘When New Technologies Cause Great Firms to Fail’. His core argument was that the organizations affected by disruptive innovations were not stupid but rather selectively blind, a consequence of their very market success and of the organizational arrangements which had grown up over a long period of time to support that success.

For him the challenge wasn’t the old one of balancing radical and incremental change, with the losing firms being too cautious. Rather, it was about trajectories: whether a new technology was sustaining — reinforcing the existing trajectory — or disruptive, offering a new one. As we’ve come to realize, the core issue is about business models; disruption occurs when someone frames the new trajectory as something which can create value under different conditions.

The search for such a new business model doesn’t take place in the mainstream as a direct challenge; instead it emerges in different markets which are unserved or underserved but where the new features offer potential value (often good enough performance at much lower cost). These fringe markets provide the laboratory in which learning and refinement of the new technology and development of the business model can take place.

The problem arises when the new business model built on a new trajectory begins to appeal to the old mainstream market. At this point it’s a challenge to existing incumbents to let go of their old business model and reconfigure a new one. Jumping the tracks to a new trajectory is risky anyway but when you carry the baggage of years, perhaps decades or even centuries of the old model it becomes very hard. That’s when NIH rears its head and it can snap and bite at the new idea with surprising defensive vigor.

There’s almost a cycle to it, like the one developed by Elisabeth Kübler-Ross to explain the grieving process. First there’s denial — ignore it and it will go away, it’s not relevant to us, it won’t work in our industry, it’s not our core business, etc. Then there’s a period of rationalization, accepting the new idea but dismissing it as not relevant to the core business, followed by experimentation designed not so much to learn as to demonstrate why and how the new model offers little advantage. Variations on this theme include locating the experiments in the very part of the organization which has the most to lose (think about giving tank development to the Cavalry Corps).

Only when the evidence becomes impossible to ignore (often as a clear shift in the market towards the new trajectory and a significant competitive threat) comes the moment of acceptance. But even then commitment is often slow and lukewarm and the opportunity to get on the bus may have been missed.

Meanwhile in another part of the galaxy…

It’s not easy for the innovators trying to introduce the change. They struggle to break into the mainstream because they have no presence in that market and they are up against established interests and networks. Their best strategy is to continue to work with their fringe markets who do see the value in their model and to hope that eventually a cross-over to the mainstream takes place. Which is what has happened in the world of drone technology.

Demanding users in fringe application markets have provided the laboratory for fast learning. Early markets were in aerial photography where the cost of hiring planes and pilots could be cut significantly but where challenges around stability and development of lightweight equipment forced rapid innovation. Or mapping and surveying where difficult and sometimes inaccessible territory could be explored remotely. Once drones were able to carry specialized lightweight tools they could be used for remote repair and maintenance on oil platforms and other difficult or dangerous locations. Their capabilities in transportation opened up new possibilities in logistics, especially in challenging areas like delivering humanitarian aid. Significantly the demands of these fringe markets drove innovation around stability, payload, propulsion and other technologies, reinforcing and widening the appeal.

Estimates suggest the 2021 drone services market is worth $9 billion with predictions of growth rates as high as 45% per year. Application sectors outside mainstream aviation include infrastructure, agriculture, transport, security, media and entertainment, insurance, telecommunication and mining.

Holding the horses?

These days UAVs can do a lot for a low price. Just like low-cost flying, mini-mill steelmaking, and earthmoving equipment, they represent a technology which has already changed the game in many sectors. They qualify as a disruptive innovation, but they also trigger some interesting NIH behavior amongst the established incumbents. ‘We’ve always done it this way’ is particularly powerful when ‘this way’ has been around a long time and is associated with a history of past success.

Elting Morison has another story which underlines this challenge. Once again it concerns gunnery, this time the firing performance of mobile artillery crews in the British army during World War 2. A time and motion study was carried out using photographs of the procedure; the researcher was increasingly puzzled by the fact that at a certain point just before the gun was fired two men would peel away and stand several meters distant. It wasn’t until he discussed his findings with a retired colonel from the First World War that the mystery was solved. He was able to explain that the move was perfectly clear — the men were holding the horses. Despite the fact that the 1942 artillery was transported by truck the procedures for horse-drawn guns still remained in place.

Something worth reflecting on when you are walking in those Welsh hills…

Image: Pixabay

Image credits: as captioned, Wikimedia Commons, Pixabay

If you’re interested in more innovation stories please check out my website here


Tools and Software for Tracking Innovation


GUEST POST from Art Inteligencia

In today’s fast-paced world, organizations must be agile and adaptive to remain competitive. Central to this adaptability is the ability to track and manage innovation effectively. Various tools and software platforms have been developed to help organizations manage the complexity of innovation processes, from ideation to implementation. This article will explore some of these tools, illustrating how they can be applied to real-world scenarios through case studies.

1. Understanding Innovation Tracking

Innovation tracking involves monitoring the development and implementation of new ideas within an organization. This process can include capturing inspiration, managing projects, and measuring impact. With a robust tracking system, teams can ensure alignment with strategic goals and demonstrate progress to stakeholders.
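To make the idea of "ideation to implementation" tracking concrete, here is a minimal sketch of what an innovation record and pipeline summary might look like in code. None of this reflects any particular platform's data model; the stage names, fields and sample ideas are purely illustrative.

```python
from dataclasses import dataclass, field
from datetime import date
from collections import Counter

# Illustrative pipeline stages; real platforms define their own.
STAGES = ["captured", "evaluated", "approved", "implemented"]

@dataclass
class Idea:
    title: str
    stage: str = "captured"
    submitted: date = field(default_factory=date.today)

    def advance(self) -> None:
        """Move the idea to the next stage of the pipeline."""
        i = STAGES.index(self.stage)
        if i < len(STAGES) - 1:
            self.stage = STAGES[i + 1]

def pipeline_summary(ideas: list[Idea]) -> dict[str, int]:
    """Count how many ideas sit in each stage."""
    counts = Counter(idea.stage for idea in ideas)
    return {s: counts.get(s, 0) for s in STAGES}

# Hypothetical ideas, for illustration only.
ideas = [Idea("Reusable packaging"), Idea("AI triage bot"), Idea("New onboarding flow")]
ideas[1].advance()  # captured -> evaluated
ideas[1].advance()  # evaluated -> approved
print(pipeline_summary(ideas))
# {'captured': 2, 'evaluated': 0, 'approved': 1, 'implemented': 0}
```

A summary like this is the raw material for the alignment and stakeholder reporting described above: it shows at a glance where ideas are accumulating and where they stall.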

2. Essential Tools for Innovation Tracking

Several tools have emerged as leaders in innovation tracking due to their comprehensive features and user-friendly interfaces. Some of these include:

  • Idea Management Software: Platforms like Spigit, Brightidea, and IdeaScale help collect, evaluate, and prioritize innovative ideas from employees and stakeholders.
  • Project Management Tools: Tools such as Trello, Asana, and Monday.com support teams in managing tasks and workflows associated with innovation projects.
  • Data Analytics Platforms: Using platforms like Tableau and Power BI can help teams analyze and visualize innovation performance data.
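The kind of metric a dashboard in Tableau or Power BI would surface can be sketched directly from raw records. Below is a toy example, with made-up quarterly figures, of one common innovation measure: the share of submitted ideas that reach implementation.

```python
# Hypothetical quarterly counts (all numbers invented for illustration).
submitted = {"Q1": 120, "Q2": 150, "Q3": 90}
implemented = {"Q1": 6, "Q2": 12, "Q3": 9}

def conversion_rate(sub: dict[str, int], impl: dict[str, int]) -> dict[str, float]:
    """Share of submitted ideas that reached implementation, per quarter."""
    return {q: round(impl[q] / sub[q], 3) for q in sub}

print(conversion_rate(submitted, implemented))
# {'Q1': 0.05, 'Q2': 0.08, 'Q3': 0.1}
```

Tracked over time, a rising conversion rate suggests the evaluation funnel is working; a falling one may mean ideas are being collected but not resourced.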

3. Case Studies

Case Study 1: Johnson & Johnson’s Use of Brightidea

Johnson & Johnson (J&J), a global healthcare leader, faced the challenge of managing innovation across its vast network of employees. To streamline this process, J&J adopted Brightidea, an idea management platform that enables employees to submit, discuss, and evaluate new ideas.

“The introduction of Brightidea has transformed the way we approach innovation. By allowing employees at all levels to contribute, we’ve seen a dramatic increase in both the quality and quantity of ideas brought forward,” – Director of Innovation at Johnson & Johnson.

Brightidea facilitated the capturing of ideas from over 60,000 employees. By prioritizing ideas that align with strategic goals, Johnson & Johnson can efficiently allocate resources and develop new products that meet market needs. The platform’s intuitive interface and comprehensive analytics tools provide insights, enabling J&J to track the progress and impact of each innovation initiative.

Case Study 2: Trello and Power BI at XYZ Corporation

XYZ Corporation, a mid-sized tech company, struggled with fragmented innovation processes causing misalignment and delayed project timelines. By integrating Trello for project management and Power BI for analytics, XYZ significantly enhanced its innovation tracking capabilities.

“Utilizing Trello and Power BI has brought unprecedented visibility and efficiency to our innovation efforts, aligning teams and accelerating time-to-market,” – Innovation Program Manager at XYZ Corporation.

The Kanban-style interface of Trello allowed teams to manage tasks more effectively, improving collaboration and reducing project bottlenecks. Meanwhile, Power BI enabled the aggregation of project data for detailed analysis and reporting. As a result, XYZ Corporation could track performance metrics in real time, make insightful data-driven decisions, and optimize innovation strategies for greater success.

Conclusion

In conclusion, tracking innovation is an essential component for organizations seeking to maintain competitive advantage. By leveraging the right tools, businesses can cultivate a robust culture of innovation, ensuring ideas are nurtured from conception to implementation. Whether it’s through idea management platforms, project management software, or analysis tools, the right technology can empower organizations to remain agile and innovative in a dynamic market.


Image credit: Pixabay


The Era of Moving Fast and Breaking Things is Over


GUEST POST from Greg Satell

On July 16th, 1945, when the world’s first nuclear explosion shook the plains of New Mexico, the leader of the Manhattan Project, J. Robert Oppenheimer, quoted from the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.” Clearly, he was troubled by what he had unleashed, and for good reason. The world was never truly the same after that.

Today, however, we have lost much of that reverence for the power of technology. Instead of proceeding deliberately and with caution, tech entrepreneurs have prided themselves on their willingness to “move fast and break things” and, almost reflexively, casually deride anyone who questions the practice as those who “don’t get it.”

It’s hard to see how, by any tangible metric, any of this has made us better off. We set out to disrupt industries, but disrupted people instead. It wasn’t always like this. Throughout our history we have asked hard questions and made good choices about technological progress. As we enter a new era of innovation, we desperately need to recapture some of that wisdom.

How We Put the Nuclear Genie Back in the Bottle

The story of nuclear weapons didn’t start with Oppenheimer, not by a long shot. In fact, if we were going to attribute the Manhattan Project to a single person, it would probably be a Hungarian immigrant physicist named Leo Szilard, who was one of the first to conceive of the possibility of a nuclear chain reaction.

In 1939, upon hearing of the discovery of nuclear fission in Germany he, along with fellow Hungarian emigre Eugene Wigner, decided that the authorities needed to be warned. Szilard then composed a letter warning of the possibility of a nuclear bomb that was eventually signed by Albert Einstein and sent to President Roosevelt. That’s what led to the American development program.

Yet after the explosions at Hiroshima and Nagasaki, many of the scientists who had worked to develop the bomb wanted to educate the public about its dangers. In 1955, the philosopher Bertrand Russell issued a manifesto signed by a number of scientific luminaries. Based on this, a series of conferences was convened at Pugwash, Nova Scotia, to discuss different approaches to protecting the world from weapons of mass destruction.

These efforts involved far more than talk: they helped shape the non-proliferation agenda and led to concrete achievements such as the Partial Test Ban Treaty. In fact, these contributions were so crucially important that the organizers of the Pugwash conferences were awarded the Nobel Peace Prize in 1995, and the conferences continue to this day.

Putting Limits On What We Do With the Code of Life

While the nuclear age started with a bang, the genetic age began with a simple article in the scientific journal Nature, written by two relatively unknown scientists named James Watson and Francis Crick, that described the structure of DNA. It was one of those few watershed moments when an entirely new branch of science arose from a single event.

The field progressed quickly and, roughly 20 years later, a brilliant researcher named Paul Berg discovered that you could merge human DNA with that from other living things, creating new genetic material that didn’t exist in nature. Much like Oppenheimer, Berg understood that, due to his work, humanity stood on a precipice and it wasn’t quite clear where the edge was.

He organized a conference at Asilomar State Beach in California to establish guidelines. Importantly, participation wasn’t limited to scientists. A wide swath of stakeholders was invited, including public officials, members of the media and ethicists. The result, now known as the Berg Letter, called for a moratorium on the riskiest experiments until the dangers were better understood. These norms were respected for decades.

Today, we’re undergoing another revolution in genomics and synthetic biology. New technologies, such as CRISPR and mRNA techniques, have opened up incredible possibilities, but also serious dangers. Yet here again, pioneers in the field like Jennifer Doudna are taking the lead in devising sensible guardrails and using the technology responsibly.

The New Economy Meets the New Era of Innovation

When Netscape went public in 1995, it hit like a bombshell. It was the first big Internet stock and, although originally priced at $14 per share, it opened at double that amount and quickly zoomed to $75. By the end of the day, it had settled back at $58.25. Still, a tiny enterprise with no profits was almost instantly worth $2.9 billion.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet today, it’s clear that the “new economy” was a mirage. Despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

The digital revolution has been a real disappointment. In fact, when you look at outcomes, if anything we’re worse off. Rather than a democratized economy, market concentration has markedly increased in most industries. Income inequality in advanced economies has soared. In America, wages have stagnated and social mobility has declined for decades. At the same time, social media has been destroying our mental health.

Now we’re entering a new era of innovation, in which we will unleash far more powerful technologies. New computing architectures like quantum and neuromorphic technologies will power things like synthetic biology and materials science to create things that would have seemed like science fiction a generation ago. We simply can no longer afford to be so reckless.

Shifting From Agility Toward Resilience

Moving fast and breaking things only seems like a good idea in a stable world. When you operate in a safe environment, it’s okay to take a little risk and see what happens. Clearly, we no longer live in such a world (if we ever did). Taking on more risk in financial markets led to the Great Recession. Being blasé about data security has nearly destroyed our democracy. Failure to prepare for a pandemic nearly brought modern society to its knees.

Over the next decade, the dangers will only increase. We will undergo four major shifts in technology, resources, migration and demographics. To put that in perspective, a similar shift in demography was enough to make the 1960s a tumultuous decade. We haven’t seen a confluence of so many disruptive forces since the 1920s, and that didn’t end well.

Unfortunately, it’s far too easy to underinvest in mitigating a danger that may never come to fruition. Moving fast and breaking things can seem attractive because the costs are often diffuse. While it has made society as a whole worse off in so many ways, it has created a small cadre of fabulously wealthy plutocrats.

Yet history is not destiny. We have the power to shape our path by making better choices. We can abandon the cult of disruption and begin to invest in resilience. In fact, we have to. By this point there should be no doubt that the dangers are real. The only question is whether we will act now or simply wait for it to happen and accept the consequences.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Measuring the Impact of Social and Environmental Innovation

GUEST POST from Chateau G Pato

As we advance into an era of conscientious capitalism, the role of social and environmental innovation has become more critical than ever. Organizations are increasingly measured not just on their financial performance, but on their ability to generate positive social and environmental outcomes. However, to truly recognize the value of these innovations, we must develop robust methods for measuring their impact.

In this article, we’ll explore key strategies for evaluating the impact of social and environmental innovation, supported by two illustrative case studies.

Importance of Measuring Impact

Measuring impact is vital for several reasons. It provides accountability, guiding companies to deliver on their promises. It also helps in securing funding and support from stakeholders and enhances decision-making by providing insights into what works and what doesn’t. Moreover, clear metrics can foster increased transparency and trust between an organization and its stakeholders.

Approaches to Measuring Impact

While there is no one-size-fits-all approach, several methodologies can be used to measure impact:

  • Social Return on Investment (SROI): This method quantifies the social, environmental, and economic value created by an organization relative to the resources invested.
  • Triple Bottom Line (TBL): Focuses on people, planet, and profit, evaluating social and environmental performance alongside financial outcomes.
  • Key Performance Indicators (KPIs): Specific metrics tailored to a project’s goals, offering a direct line to assessing impact.
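
As a rough illustration of how an SROI ratio is computed, the sketch below uses entirely hypothetical figures; real SROI analyses assign monetized proxy values to each outcome and discount them over time.

```python
# Minimal SROI sketch with hypothetical figures.
# SROI = total monetized value of social, environmental and economic
#        outcomes, divided by the value of the resources invested.

def sroi_ratio(monetized_outcomes, total_investment):
    """Return the SROI ratio: value created per unit of value invested."""
    return sum(monetized_outcomes.values()) / total_investment

# Hypothetical monetized outcomes for a community program (in dollars)
outcomes = {
    "increased_household_income": 450_000,
    "reduced_healthcare_costs": 120_000,
    "improved_school_attendance": 80_000,
}

ratio = sroi_ratio(outcomes, total_investment=200_000)
print(f"SROI: {ratio:.2f}:1")  # every $1 invested yields $3.25 of social value
```

An SROI of 3.25:1 would be reported as “$3.25 of social value created for every dollar invested”; the hard part in practice is not the arithmetic but defensibly monetizing each outcome.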

Case Study 1: Interface, Inc.

Background

Interface, Inc., one of the largest global manufacturers of modular carpet, embarked on a transformative mission to become a fully sustainable enterprise. Their initiative, Mission Zero, aimed to eliminate any negative impact the company had on the environment by 2020.

Measuring Impact

Interface used a comprehensive approach to measure its environmental innovations — they tracked metrics such as carbon emissions, water usage, and waste reduction. They also calculated their progress toward Mission Zero goals, establishing clear KPIs and regularly publishing sustainability reports.
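
Tracking progress against a fixed baseline year, as Interface did, reduces to a simple calculation. The figures below are hypothetical stand-ins, not Interface’s actual data.

```python
# Percent reduction of a tracked sustainability metric relative to a
# fixed baseline year, the basic arithmetic behind "X% below baseline" KPIs.

def reduction_from_baseline(baseline_value, current_value):
    """Return the reduction as a percentage of the baseline value."""
    return (baseline_value - current_value) / baseline_value * 100

# Hypothetical emissions figures (metric tons of CO2e)
baseline_1996 = 100_000
current_year = 4_000

pct = reduction_from_baseline(baseline_1996, current_year)
print(f"Reduction vs. 1996 baseline: {pct:.0f}%")  # prints 96%
```

Publishing both the baseline and the formula, as sustainability reports typically do, is what makes a headline figure like “96% below baseline” auditable.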

Results

By the end of 2020, Interface had managed to reduce its greenhouse gas emissions by 96% and its waste to landfills by 91% from a 1996 baseline, all while increasing recycled and bio-based content across its products.

Case Study 2: The MicroLoan Foundation

Background

The MicroLoan Foundation provides small loans, business training, and mentorship to women in sub-Saharan Africa, aiming to lift communities out of poverty through female entrepreneurship.

Measuring Impact

This organization uses a Social Return on Investment (SROI) framework to evaluate the socioeconomic impact of its programs. They assess metrics such as income increase, business success rate, and improvements in quality of life. Moreover, they track the ripple effect within communities, measuring how these microloans improve education and healthcare access.

Results

Women supported by the MicroLoan Foundation reported a 96% success rate in their businesses, with significant improvements in household income and education access for their children, demonstrating a substantial SROI.

Moving Forward

As businesses aim to achieve sustainable and inclusive growth, the ability to precisely measure social and environmental impact becomes a vital asset. By leveraging diverse measurement strategies, companies can ensure they are not only contributing positively to society and the environment but are also reaping the rewards of their efforts through enhanced reputation and stakeholder trust.

Ultimately, the evolving landscape of business underscores that financial gain and social good do not have to be mutually exclusive but can coexist to create a more inclusive and sustainable future.

As leaders in change and innovation, let us commit to not just measuring outcomes, but driving meaningful impact that transforms lives and safeguards our planet.

Bottom line: Futurology is not fortune telling. Futurists use a scientific approach to create their deliverables, but a methodology and tools like those in FutureHacking™ can empower anyone to engage in futurology themselves.

Image credit: Unsplash

Subscribe to Human-Centered Change & Innovation WeeklySign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.

Embracing Failure – Lessons Learned from Setbacks

GUEST POST from Art Inteligencia

In the world of innovation, failure is not just inevitable, it’s essential. Embracing failure can lead to groundbreaking discoveries, foster resilience, and cultivate a culture that thrives on learning. While the stigma of failure persists, forward-thinking organizations understand that embracing setbacks is a cornerstone of progress. Here, we explore two compelling case studies that illustrate how failure can be transformed into a stepping stone for future success.

Case Study 1: The Rise of Airbnb

When Brian Chesky and Joe Gebbia first conceived the idea of renting out air mattresses on their apartment floor, their concept wasn’t an overnight sensation. The fledgling platform struggled, with its initial website launch garnering disappointing engagement. The duo faced numerous rejections from investors, many of whom doubted the viability of the idea. However, rather than viewing these setbacks as failures, the team saw them as opportunities to refine their model and focus on user experience.

Lessons Learned:

  • Pivoting is powerful: Chesky and Gebbia used feedback from failures to adapt their business model, eventually redefining the travel and lodging industry.
  • Persistence is key: Despite numerous rejections, they persisted, displaying resilience that would eventually lead to Airbnb’s global success.

Case Study 2: The WD-40 Story

WD-40, now a staple in households worldwide, originated from a series of failures. The product’s creation was the result of 39 unsuccessful attempts to develop a formula to prevent corrosion. Instead of seeing these failed attempts as a loss, the creators viewed each one as a learning opportunity. The breakthrough came with the 40th formula, hence the name “WD-40,” which stands for “Water Displacement, 40th formula.”

Lessons Learned:

  • Learning from repetition: Every failed attempt provided valuable data, ultimately leading to a successful product.
  • Failure can fortify determination: The 40 attempts it took underscore how determination can lead to ultimate success.

Conclusion

Both of these stories demonstrate that failure is not the opposite of success; it is part of its journey. Organizations willing to embrace failure cultivate a learning culture, fostering innovation and improvement. Embracing failure also sets the stage for transformational change as each setback provides the chance to learn, innovate, and ultimately succeed.


Image credit: Unsplash


Driving Innovation Through Empathy, Leadership and Understanding

GUEST POST from Art Inteligencia

In the rapidly evolving world of business, innovation stands as a critical driver for success. While processes, structures, and technologies play substantial roles, the human element—particularly empathy—holds profound potential. Empathy allows leaders to deeply understand and genuinely connect with their teams and customers, fostering an environment where innovation thrives. This article explores the intricate relationship between empathy and leadership, anchored by compelling case studies that illustrate transformative outcomes when empathy is prioritized.

Case Study 1: The LEGO Group

LEGO, the beloved toy company, experienced significant challenges in the early 2000s. The company was nearing bankruptcy due to a failure to adapt to the changing interests of its core audience—children. The leadership team at LEGO realized a need to step back and adopt a fresh perspective grounded in empathy.

The turnaround strategy, famously termed “LEGO’s Business Transformation,” required the leadership to immerse themselves in the world of their customers—children. By spending time observing and interacting with children during play sessions, LEGO’s leaders understood the emotional and creative needs of their audience. This led to innovations like the immensely popular LEGO Friends series, which was designed based on detailed feedback from young girls who were previously underserved by traditionally boy-oriented LEGO products.

The result was not only an incredible resurgence in profitability but also an innovation culture that prioritizes deep customer connection and iterative feedback—a testament to the power of empathy-driven leadership.

Case Study 2: Microsoft’s Cultural Transformation

When Satya Nadella became the CEO of Microsoft in 2014, the company was seen as a bureaucratic giant struggling to compete with more nimble tech innovators. Nadella’s leadership focused heavily on empathy, both internally across Microsoft’s vast workforce and externally toward customers.

Internally, Nadella encouraged a cultural shift from a “know-it-all” to a “learn-it-all” philosophy. He challenged teams to use empathy to transform customer engagement strategies and product development processes. A concrete example is the development of features for people with disabilities, inspired by Nadella’s personal experiences as a father of a child with special needs.

This empathy-first approach led to breakthrough innovations such as Seeing AI, an app that narrates the world for the visually impaired, exemplifying how deep understanding and leadership empathy could drive product innovation while simultaneously enhancing Microsoft’s brand value and market relevance.

Conclusion

Empathy enables leaders to connect deeply with their teams and customers, providing a compass that guides innovative practices. The stories of LEGO and Microsoft underscore the profound impact that empathy can have when it shapes leadership strategies. As businesses grapple with complex challenges, those that integrate empathy into the very fabric of their leadership are not only poised to innovate but to do so in a manner that genuinely resonates with human needs.

In embracing empathy, leaders unlock the key to sustainable innovation, transforming their organizations into environments where understanding, creativity, and impact coexist harmoniously.


Image credit: Pexels


New Skills Needed for a New Era of Innovation

GUEST POST from Greg Satell

The late Clayton Christensen had a theory about “jobs to be done.” In his view, customers don’t buy products as much as they “hire” companies to do specific “jobs” for them. To be competitive, firms need to understand what that job is and how to do it well. In other words, no one wants a quarter-inch drill bit, they want a quarter-inch hole.

The same can be said for an entire society. We need certain jobs to be done and will pay handsomely for ones that we hold in high regard, even as we devalue others. Just as being the best blacksmith in town won’t earn you much of a living today, great coding skills wouldn’t do you much good in a medieval village.

This is especially important to keep in mind today as the digital revolution comes to an end and we enter a new era of innovation in which some tasks will be devalued and others will be increasingly in demand. Much like Christensen said about firms, we as a society need to learn to anticipate which skills will lose value in future years and which will be considered critical.

The Evolution of Economies

The first consumer product was most likely the Acheulean hand axe, invented by some enterprising stone age entrepreneur over 100,000 years ago. Evidence suggests that, for the most part, people made stone axes themselves, but as technology evolved, some began to specialize in different crafts, such as smithing, weaving, cobbling and so on.

Inventions like the steam engine, and then later electricity and the internal combustion engine, brought about the industrial revolution, which largely put craftsmen out of work and reshaped society around cities that could support factories. It also required new skills to organize work, leading to the profession of management and the knowledge economy.

The inventions of the microchip and the internet have led to an information economy in which even a teenager with a smartphone has better access to knowledge than a specialist working in a major institution a generation ago. Much like the industrial era automated physical tasks, the digital era has automated many cognitive tasks.

Now, as the digital era ends, we are entering a new era of innovation in which we will shift to post-digital computing architectures, such as quantum computing and neuromorphic chips, and enormous value will be created through bits powering atoms in fields like synthetic biology and materials science.

Innovation, Jobs and Wages

As economies evolved, some tasks became devalued as others increased in importance. When people could go to a smith for metal tools, they had no need to create stone axes. In much the same way, the industrial revolution put craft guilds out of business and technologies like tractors and combine harvesters drastically reduced the number of people working on farms.

Clearly replacing human labor with technology is disruptive, but it has historically led to dramatic increases in productivity. So labor displacement effects have been outweighed by greater wages and new jobs created by new industries. For the most part, innovation has made all of us better off, even, to a great extent, the workers who were displaced.

Consider the case of Henry Ford. Because technology replaced many tasks on the family farm, he didn’t need to work on it and found a job as an engineer for Thomas Edison, where he earned enough money and had enough leisure time to tinker with engines. That led him to create his own company, pioneer an industry and create good jobs for many others.

Unfortunately, there is increasing evidence that more recent innovations may not be producing comparable amounts of productivity and that’s causing problems. For example, when a company replaces a customer service agent with an automated system, it’s highly doubtful that the productivity gains will be enough to finance entire new industries that will train that call center employee to, say, design websites or run marketing campaigns.

Identifying New Jobs To Be Done

To understand the disconnect between technological innovation and productivity, it’s helpful to look at some underlying economic data. In US manufacturing, for instance, productivity has skyrocketed, roughly doubling output in the 30 years between 1987 and 2017, even as employment in the sector decreased by about a third.
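
A doubling of output over 30 years implies a surprisingly modest annual growth rate; the quick calculation below makes that concrete.

```python
# Annualized growth rate implied by output doubling over 30 years:
# solve (1 + r)^30 = 2 for r.

years = 30
annual_rate = 2 ** (1 / years) - 1
print(f"Implied annual productivity growth: {annual_rate:.2%}")  # about 2.3% per year
```

In other words, “output doubled in a generation” and “productivity grew a little over 2% a year” describe the same data, which is why headline doubling figures can coexist with complaints of sluggish productivity growth.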

It is the increased productivity growth in manufacturing that has fueled employment growth in the service sector. However, productivity gains in service jobs have been relatively meager and automation through technological innovation has not resulted in higher wages, but greater income inequality as returns to capital dwarf returns to labor.

Further economic analysis shows that the divide isn’t so much between “white collar” and “blue collar” jobs, but between routine and non-routine tasks. So warehouse workers and retail clerks have suffered, but designers and wedding planners have fared much better. In other words, technological automation is creating major shifts in the “jobs to be done.”

A recent analysis by the McKinsey Global Institute bears this out. It identified 56 “foundational skills” that are crucial to the future of work, but they aren’t traditional categories such as “engineering” or “sales”; rather, they are things like self-awareness, emotional intelligence and critical thinking.

Collaboration Is The New Competitive Advantage

The industrial revolution drove a shift from animal power to machine power and from physical skills to cognitive skills. What we’re seeing now is a similar shift from cognitive skills to social skills. As automation takes over many routine cognitive tasks, the “job” that humans are increasingly valued for is relating to other humans.

There are some things a machine will never do. An algorithm will never strike out at a Little League game, see its child born or have a bad day at work. We can, of course, train computers to mimic these things by training them on data, but they will never actually have the experience and that limits their ability to fully relate to human emotions.

To see how this is likely to play out, simply go and visit your local Apple Store. It is a highly automated operation, without traditional checkout aisles or cash registers. Still, the first thing that catches your eye is a sea of blue shirts waiting to help you. They are not there to execute transactions, which you can easily do online, but to engage with you, understand what you’re trying to achieve and help you get it done.

We’ve seen similar trends at work even in highly technical fields. A study of 19.9 million scientific papers found that not only has the percentage of papers published by teams steadily increased over the past 50 years, the size of those teams has also grown and their research is more highly cited. The journal Nature got similar results and also found that the work being done is far more interdisciplinary and done at greater distances.

What’s becoming clear is that collaboration is increasingly becoming a competitive advantage. The ultimate skill is no longer knowledge or proficiency in a particular domain, but the ability to build a shared purpose with others, who possess a diverse set of skills and perspectives, in order to solve complex problems. In other words, the most important jobs are the ones we do in the service of a common objective.

— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash


Aligning Innovation Metrics with Business Objectives

GUEST POST from Art Inteligencia

In today’s rapidly evolving business landscape, fostering a culture of innovation is crucial for organizations aiming to maintain a competitive edge. However, one prevalent challenge that leaders face is how to effectively measure innovation. More importantly, how can organizations ensure that the metrics they use to evaluate innovation align with their overarching business objectives? It’s essential to choose the right indicators that not only provide insight into the innovation process but also reflect the value added to the organization. This article explores the importance of aligning innovation metrics with business objectives and presents case studies illustrating successful implementations.

The Importance of Alignment

While innovation is celebrated as the driver of progress, it must be strategically aligned with the organization’s objectives to create meaningful impact. This alignment ensures that resources dedicated to innovation contribute to the achievement of business goals. Misaligned metrics might encourage behaviors that do not necessarily drive desired business outcomes, such as focusing on quantity over quality, or pursuing innovation for its own sake without regard to strategic fit. Thus, aligning innovation metrics with business objectives is critical for ensuring innovation efforts contribute to a sustainable competitive advantage.

Framework for Aligning Innovation Metrics

A well-structured framework for aligning innovation metrics with business objectives involves the following steps:

  1. Understand Business Goals: Begin with establishing a clear understanding of the business’s strategic objectives.
  2. Identify Relevant Innovation Metrics: Select innovation indicators that reflect progress towards those objectives. These might include metrics related to R&D efficiency, time to market, new product introduction rate, or customer satisfaction.
  3. Connect Metrics to Business Outcomes: Ensure that each innovation metric can be directly linked to a specific business goal, such as revenue growth, market share expansion, or operational efficiency improvement.
  4. Continuously Review and Adjust: Innovation is dynamic; thus, regularly review and refine metrics to ensure they remain aligned with evolving business objectives.
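
The four steps above can be sketched as a simple mapping from business goals to innovation metrics, with a review step that flags any metric no longer tied to an active goal. All goal and metric names here are hypothetical illustrations.

```python
# Sketch: keep each innovation metric explicitly linked to a business goal,
# and flag any metric whose goal is no longer part of the strategy.

# Active strategic objectives (hypothetical)
goals = {"revenue_growth", "market_share_expansion", "operational_efficiency"}

# Each tracked innovation metric is linked to exactly one business goal.
metric_to_goal = {
    "revenue_from_new_products": "revenue_growth",
    "time_to_market_days": "operational_efficiency",
    "patents_filed": "prestige",  # linked goal is no longer in the strategy
}

def orphaned_metrics(metrics, active_goals):
    """Return metrics whose linked goal is not among the active business goals."""
    return sorted(m for m, g in metrics.items() if g not in active_goals)

print("Metrics to review:", orphaned_metrics(metric_to_goal, goals))
# Metrics to review: ['patents_filed']
```

Running a check like this at each strategy review operationalizes step 4: a metric with no surviving goal is either relinked to a current objective or retired, which is exactly the trap the patent-count example below illustrates.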

Case Study 1: Tech Innovators Inc.

Tech Innovators Inc., a leading technology company, faced challenges in aligning their innovation metrics with business objectives. Initially, the company focused on the number of patents filed as its primary innovation metric. However, leadership realized that while patent filings were increasing, they were not translating into market success or revenue growth.

To address this, the company realigned its innovation metrics by linking them to specific business goals. They introduced metrics such as “Revenue from new products” and “Market penetration rate of products filed under patents.” By shifting their focus, Tech Innovators Inc. successfully transformed their innovation efforts, resulting in a 20% increase in revenues from new products within two years, and a significant improvement in market share.

Case Study 2: Green Future Energy

Green Future Energy is a renewable energy company committed to sustainability. Initially, their innovation efforts were evaluated using metrics such as “Number of green technologies developed.” However, this did not align with the company’s core objective of reducing carbon emissions.

By aligning innovation metrics to business objectives, Green Future Energy adopted measures such as “Reduction in carbon footprint per dollar of revenue” and “Energy efficiency improvement in new technologies.” This realignment allowed the company to focus on impactful innovations. Consequently, they achieved a 30% reduction in carbon emissions over three years, securing their position as a leader in sustainable energy solutions.

Conclusion

Aligning innovation metrics with business objectives is not merely about measurement but about meaningful measurement that drives value creation. By ensuring that metrics reflect strategic priorities, organizations can foster an environment where innovation translates into market success, revenue growth, and operational excellence. The case studies of Tech Innovators Inc. and Green Future Energy illustrate that with the right framework and mindset, aligning metrics with objectives can transform innovation from a nebulous concept into a strategic asset.


Image credit: Pexels
