Monthly Archives: February 2022

Not Invented Here

Sometimes Letting Go is the Hardest Part of Innovation

GUEST POST from John Bessant

(You can find a podcast version of this story here)

The Welsh valleys are amongst the most beautiful in the world. Lush green hills steeply falling into gorges with silver water glistening below. It’s a place of almost perfect peace, the only movement the gentle trudge of sheep grazing safely, shuffling across the jagged landscape the way they’ve done for thousands of years. And amongst the most scenic and peaceful of these valleys are those situated between Dolgellau in the north, and Machynlleth in the south.

Except when there’s traffic in the ‘Mach Loop’ — which is what the region is known as in military circles. It’s the place where young men and women from a variety of international air forces hone their skills at high-speed low-level flying, often as low as 75 meters from the ground. At any moment your peaceful walk may be rudely interrupted by the roar of afterburners, your view across the green hillsides suddenly broken by the nose of an F-16 or Typhoon poking its way up from one of the gorges below.

Your reaction may be mixed: annoyance at the interruption, or admiration for the flying skills of the pilots giving you a personal air display. But it’s certainly impossible to ignore. And it raises an interesting question: despite the impressive skills being demonstrated, do we actually need pilots flying the planes? Is there an alternative technology which allows low-level, high-precision flying to be carried out by an operator sitting far away in a remote location? After all, we’ve become pretty good at controlling devices at a distance; we can even land them on distant planets or steer a course through the furthest reaches of our solar system.

UAVs — unmanned aerial vehicles — are undoubtedly changing the face of aviation. But are they also a disruptive innovation, particularly in the military world where the heroic tradition of those magnificent men (and women) in their flying machines is still so strong?

A brief history of drones

The idea of using unmanned flying vehicles isn’t new; back in 1849 Austrian soldiers attacked the city of Venice with unmanned balloons filled with explosives. In the early years following the Wright brothers’ successful flight, researchers began looking at the possibilities of unmanned aircraft. The first prototype took off in 1916 in the form of the Ruston Proctor Aerial Target; as its name suggests, it was a pilotless machine designed to help train British aircrew in dogfighting. Importantly, it drew on early versions of radio control, a field Nikola Tesla had helped pioneer. But its early performance was unremarkable and the British military chose to scrap the project, believing that unmanned aerial vehicles had limited military potential.

A year later an American alternative was created, the Hewitt-Sperry Automatic Airplane, and successful trials led to the development of a production version, the Kettering Bug, in 1918. Although its performance was impressive, it arrived too late to be used in the war and further development was shelved.

By the time of the Second World War the enabling technologies around control and navigation had improved enormously; whilst still crude the German V1 and V2 rockets and flying bombs provided a powerful demonstration of what could be achieved at scale. Emphasis was placed on remote delivery of explosives — using UAVs as flying bombs or aerial torpedoes — but the possibilities of using them in other applications such as reconnaissance were beginning to be explored.

The Vietnam war saw this aspect come to the fore; the difficulties of operating in remote jungle and mountain zones made reconnaissance flying hazardous and the risks to aircrew who were shot down led to extensive use of UAVs. The Ryan Firebee drone flew over 5000 surveillance missions, controlled by a ground operator using a remote camera. Its versatility meant that it could be used for surveillance, delivery of supplies and as a weapon; UAVs began to be viewed as an alternative to manned aircraft. But despite their success and promise it was not until the 1990s that they began to occupy an increasingly significant role.

Early Drone - Wikimedia Commons

The technology found more support in Israel and during the 1973 Yom Kippur war UAVs were used in a variety of ways, as part of an integrated approach alongside piloted aircraft. A great deal of learning in this context meant that for a while Israel became the key source of UAV technology with the US acquiring and deploying this knowledge to improve its own capabilities, leading to the new generation deployed in the Gulf War. UAVs emerged as a critical tool for gathering intelligence at the tactical level. These systems were employed for battlefield damage assessment, targeting, and surveillance missions, particularly in high-threat airspace.

Fast forward to today. There has been an incredible acceleration in the key enabling technologies, which has helped UAVs establish themselves as serious contenders for many aerial roles. For example, GPS has moved from its early days in 1981, when a unit weighed 50kg and cost over $100k, to a current cost of less than $5 for a chip-based unit weighing less than a gram. The inertial measurement unit (IMU), which measures a drone’s velocity, orientation and acceleration, has followed a similar trajectory; in the 1960s an IMU weighed several kilograms and cost several million dollars, but today the chipset which puts these features on your phone costs around $1. Kodak’s 1976 digital camera could only manage a 0.1 megapixel image from a unit weighing 2kg and costing over $10,000. Today’s digital cameras are roughly a hundred million times better overall (1000x the resolution, 1000x smaller and 100x cheaper). And, perhaps most important, the communications capabilities now offered by Wi-Fi and Bluetooth enable accurate, long-range communication and control.
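Multiplying out the figures quoted above gives a quick back-of-the-envelope sense of these improvement ratios:

```python
# Back-of-the-envelope improvement ratios from the figures quoted above.

# GPS: a 50 kg, $100k unit in 1981 vs. a ~1 g, $5 chip today.
gps_weight_gain = 50_000 / 1      # grams then vs. grams now
gps_cost_gain = 100_000 / 5       # dollars then vs. dollars now

# Camera: 1000x the resolution, 1000x smaller, 100x cheaper.
camera_gain = 1000 * 1000 * 100

print(f"GPS weight reduction: {gps_weight_gain:,.0f}x")
print(f"GPS cost reduction:   {gps_cost_gain:,.0f}x")
print(f"Camera combined gain: {camera_gain:,}x")
```

The camera factors alone multiply to a hundred-million-fold combined gain, which is the kind of trajectory that turns a curiosity into a contender.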

With an improvement trajectory like this you might be forgiven for assuming that UAVs would have largely replaced manned flying in most applications. It’s a cheap technology, versatile and (in military terms) expendable — losing a drone doesn’t carry with it the tragic costs of losing a trained pilot. Yet the reality is that the Mach Loop continues to reverberate with the sound of fast jets and their pilots practicing high-speed low-level maneuvers.

Not invented here?

Continuing to rely on manned aircraft is also a costly option — when a British F-35 Lightning crashed after take-off from an aircraft carrier in 2021 it represented over £100m sinking beneath the waves. So why is the adoption of UAV technology still problematic within major established air forces? It almost looks like another case of ‘not invented here’ — that strange innovation phenomenon in which otherwise smart organizations reject or bury promising new ideas.

At first sight it fits into a pattern which has been around a long time. Take the case of continuous aim gunfire at sea. Sounds rather dry and technical but what it boils down to is that 19th century naval warfare was not a very accurate game. Trying to shoot at something a long way away whilst perched on a ship which is rocking and rolling unexpectedly isn’t easy; most ships firing at other ships missed their targets. A study commissioned by the US Bureau of Ordnance in the late 1890s found an accuracy rate of less than 3%; in one test in 1899 five ships of the British North Atlantic Squadron fired for five minutes each at an old hulk at a range of 1600 yards; after 25 minutes only 2 hits had been registered on the target.

Clearly there was scope for innovation, and it came in 1898 on the decks of the Royal Navy cruiser HMS Scylla, under the command of Percy Scott. He’d noticed that one of his gun crews was achieving much better performance, and he began studying what they were doing with a view to developing it into a system. By the time he took command of a squadron in the South China Sea two years later he had refined his methods, fitted his flagship HMS Terrible with new equipment and trained his gun crews.

Image: Painting by Christoffer Wilhelm Eckersberg, public domain

The improvements were significant and importantly influenced a young US lieutenant on secondment to the squadron. William Sims learned about the new system and applied it on his own ship with remarkable results; convinced of the power of this innovation he decided it was his mission to carry the news to Washington and change naval practice. What followed is a fascinating story for what it reveals about NIH and the many ways in which it can be deployed.

In his fascinating account, Elting Morison highlights three strategies used by the US military to defend against the new idea. The first was simply to bury it; Sims’ reports to the Bureau of Ordnance and the Bureau of Navigation were filed away and forgotten. The second was to rebut the information; the Bureau of Ordnance claimed that US equipment was as good as the British, so any differences in firing performance must be due to the men involved. More important was their argument that continuous-aim firing was impossible; when that failed, they conducted experiments designed to show there was no significant benefit from the approach. By running them on dry land they were able to cast doubt on the relative advantage of the new method.

And their third strategy was to try and sideline Sims, painting him as an eccentric troublemaker, stressing his youth and lack of experience, highlighting the fact that he’d spent too long with the British navy and in other ways undermining his credibility. Needless to say this only strengthened Sims’ resolve and he duly went over the heads of the senior staff and appealed to President Roosevelt himself. He finally ‘won’; he remained as unpopular as ever but the new approach was grudgingly adopted and quickly became the dominant design for future naval gunnery.

Image: UK HMSO Public Domain

On dry land, a decade later, a similar outsider, Major J.F.C. Fuller, was working with the British Army. He’d seen the possibilities of using tanks as a fast mobile strike capability, and his ideas were briefly borne out in the latter part of the First World War when tanks were used to good effect at Cambrai and Amiens. But despite being given responsibility for introducing the new technology he met with resistance (not helped by his abrasive nature); many saw tanks as an unwelcome diversion. It didn’t help that their location in the command structure was in the Cavalry Corps, the very group most threatened by the change to tanks. Their post-war strategy was to continue to rely on the equine model (‘more hay, more horses’) rather than invest in tanks or learn about tank warfare. Elsewhere, though, his ideas found fertile soil; he was credited by Adolf Hitler himself as the architect of ‘blitzkrieg’, the fast mobile warfare which helped overrun France and much of Europe within a few weeks at the start of the Second World War.

Drones as disruptive innovation?

Of course it’s complicated, but could the case of drone adoption be history repeating itself? One explanation for why NIH happens in this fashion can be found in what we’ve learned about disruptive innovation. When it was published twenty-five years ago, Clayton Christensen’s classic book on the phenomenon, ‘The Innovator’s Dilemma’, had the intriguing subtitle ‘When New Technologies Cause Great Firms to Fail’. His core argument was that the organizations affected by disruptive innovations were not stupid but selectively blind, a consequence of their very market success and of the organizational arrangements which had grown up over a long period to support that success.

For him the challenge wasn’t the old one of balancing radical and incremental change with the losing firms being too cautious. Rather it was about trajectories; whether a new technology was sustaining — reinforcing the existing trajectory — or disrupting, offering a new trajectory. As we’ve come to realize the core issue is about business models; disruption occurs when someone frames the new trajectory as something which can create value under different conditions.

The search for such a new business model doesn’t take place in the mainstream as a direct challenge; instead it emerges in different markets which are unserved or underserved but where the new features offer potential value (often good enough performance at much lower cost). These fringe markets provide the laboratory in which learning and refinement of the new technology and development of the business model can take place.

The problem arises when the new business model built on a new trajectory begins to appeal to the old mainstream market. At this point it’s a challenge to existing incumbents to let go of their old business model and reconfigure a new one. Jumping the tracks to a new trajectory is risky anyway but when you carry the baggage of years, perhaps decades or even centuries of the old model it becomes very hard. That’s when NIH rears its head and it can snap and bite at the new idea with surprising defensive vigor.

There’s almost a cycle to it, like the one developed by Elisabeth Kübler-Ross to explain the grieving process. First there’s denial: ignore it and it will go away, it’s not relevant to us, it won’t work in our industry, it’s not our core business. Then comes a period of rationalization, accepting the new idea but dismissing it as irrelevant to the core business, followed by experimentation designed not so much to learn as to demonstrate why and how the new model offers little advantage. Variations on this theme include locating the experiments in the very part of the organization which has the most to lose (think of giving tank development to the Cavalry Corps).

Only when the evidence becomes impossible to ignore (often as a clear shift in the market towards the new trajectory and a significant competitive threat) comes the moment of acceptance. But even then commitment is often slow and lukewarm and the opportunity to get on the bus may have been missed.

Meanwhile in another part of the galaxy…

It’s not easy for the innovators trying to introduce the change. They struggle to break into the mainstream because they have no presence in that market and they are up against established interests and networks. Their best strategy is to continue to work with their fringe markets who do see the value in their model and to hope that eventually a cross-over to the mainstream takes place. Which is what has happened in the world of drone technology.

Demanding users in fringe application markets have provided the laboratory for fast learning. Early markets were in aerial photography where the cost of hiring planes and pilots could be cut significantly but where challenges around stability and development of lightweight equipment forced rapid innovation. Or mapping and surveying where difficult and sometimes inaccessible territory could be explored remotely. Once drones were able to carry specialized lightweight tools they could be used for remote repair and maintenance on oil platforms and other difficult or dangerous locations. Their capabilities in transportation opened up new possibilities in logistics, especially in challenging areas like delivering humanitarian aid. Significantly the demands of these fringe markets drove innovation around stability, payload, propulsion and other technologies, reinforcing and widening the appeal.

Estimates suggest the 2021 drone services market is worth $9 billion with predictions of growth rates as high as 45% per year. Application sectors outside mainstream aviation include infrastructure, agriculture, transport, security, media and entertainment, insurance, telecommunication and mining.
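As a purely illustrative projection (assuming, implausibly, that a 45% rate held constant; real growth will vary), compounding from the $9 billion base looks like this:

```python
# Hypothetical projection of the drone-services market at 45% annual
# growth from its estimated $9B size in 2021. Illustrative only.
base_2021 = 9.0   # USD billions
rate = 0.45       # 45% per year

for years in (1, 3, 5):
    projected = base_2021 * (1 + rate) ** years
    print(f"{2021 + years}: ~${projected:.0f}B")
```

Even five years of compounding at that rate takes the market past $50 billion, which is why incumbents on a flat trajectory can be blindsided by fringe technologies on a steep one.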

Holding the horses?

These days UAVs can do a lot for a low price. Like low-cost flying, mini-mill steelmaking and earthmoving equipment, they represent a technology which has already changed the game in many sectors. They qualify as a disruptive innovation, but they also trigger some interesting NIH behavior amongst established incumbents. ‘We’ve always done it this way’ is particularly powerful when ‘this way’ has been around a long time and is associated with a history of past success.

Elting Morison has another story which underlines this challenge. Once again it concerns gunnery, this time the firing performance of mobile artillery crews in the British army during World War 2. A time and motion study was carried out using photographs of the procedure; the researcher was increasingly puzzled by the fact that at a certain point just before the gun was fired two men would peel away and stand several meters distant. It wasn’t until he discussed his findings with a retired colonel from the First World War that the mystery was solved. He was able to explain that the move was perfectly clear — the men were holding the horses. Despite the fact that the 1942 artillery was transported by truck the procedures for horse-drawn guns still remained in place.

Something worth reflecting on when you are walking in those Welsh hills…

Image: Pixabay

Image credits: as captioned, Wikimedia Commons, Pixabay

If you’re interested in more innovation stories please check out my website here

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.

Tools and Software for Tracking Innovation

GUEST POST from Art Inteligencia

In today’s fast-paced world, organizations must be agile and adaptive to remain competitive. Central to this adaptability is the ability to track and manage innovation effectively. Various tools and software platforms have been developed to help organizations manage the complexity of innovation processes, from ideation to implementation. This article will explore some of these tools, illustrating how they can be applied to real-world scenarios through case studies.

1. Understanding Innovation Tracking

Innovation tracking involves monitoring the development and implementation of new ideas within an organization. This process can include capturing inspiration, managing projects, and measuring impact. With a robust tracking system, teams can ensure alignment with strategic goals and demonstrate progress to stakeholders.

2. Essential Tools for Innovation Tracking

Several tools have emerged as leaders in innovation tracking due to their comprehensive features and user-friendly interfaces. Some of these include:

  • Idea Management Software: Platforms like Spigit, Brightidea, and IdeaScale help collect, evaluate, and prioritize innovative ideas from employees and stakeholders.
  • Project Management Tools: Tools such as Trello, Asana, and Monday.com support teams in managing tasks and workflows associated with innovation projects.
  • Data Analytics Platforms: Platforms like Tableau and Power BI help teams analyze and visualize innovation performance data.
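To make the tracking idea concrete, here is a minimal, tool-agnostic sketch of the kind of funnel metrics such platforms report. The records, stage names and `funnel_report` helper are hypothetical illustrations, not the actual data model of Spigit, Brightidea or IdeaScale:

```python
from collections import Counter

# Hypothetical idea records, as they might be exported from an
# idea-management platform; the stage names are illustrative.
ideas = [
    {"id": 1, "stage": "submitted"},
    {"id": 2, "stage": "under_review"},
    {"id": 3, "stage": "piloted"},
    {"id": 4, "stage": "submitted"},
    {"id": 5, "stage": "implemented"},
    {"id": 6, "stage": "under_review"},
]

def funnel_report(records):
    """Count ideas per stage and compute the implementation rate."""
    counts = Counter(r["stage"] for r in records)
    implemented = counts.get("implemented", 0)
    rate = implemented / len(records) if records else 0.0
    return counts, rate

counts, rate = funnel_report(ideas)
print(dict(counts))
print(f"Implementation rate: {rate:.0%}")
```

A real platform would pull these records from its API and add time-in-stage and per-business-unit breakdowns, but the underlying funnel arithmetic is the same.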

3. Case Studies

Case Study 1: Johnson & Johnson’s Use of Brightidea

Johnson & Johnson (J&J), a global healthcare leader, faced the challenge of managing innovation across its vast network of employees. To streamline this process, J&J adopted Brightidea, an idea management platform that enables employees to submit, discuss, and evaluate new ideas.

“The introduction of Brightidea has transformed the way we approach innovation. By allowing employees at all levels to contribute, we’ve seen a dramatic increase in both the quality and quantity of ideas brought forward,” – Director of Innovation at Johnson & Johnson.

Brightidea facilitated the capturing of ideas from over 60,000 employees. By prioritizing ideas that align with strategic goals, Johnson & Johnson can efficiently allocate resources and develop new products that meet market needs. The platform’s intuitive interface and comprehensive analytics tools provide insights, enabling J&J to track the progress and impact of each innovation initiative.

Case Study 2: Trello and Power BI at XYZ Corporation

XYZ Corporation, a mid-sized tech company, struggled with fragmented innovation processes causing misalignment and delayed project timelines. By integrating Trello for project management and Power BI for analytics, XYZ significantly enhanced its innovation tracking capabilities.

“Utilizing Trello and Power BI has brought unprecedented visibility and efficiency to our innovation efforts, aligning teams and accelerating time-to-market,” – Innovation Program Manager at XYZ Corporation.

The Kanban-style interface of Trello allowed teams to manage tasks more effectively, improving collaboration and reducing project bottlenecks. Meanwhile, Power BI enabled the aggregation of project data for detailed analysis and reporting. As a result, XYZ Corporation could track performance metrics in real time, make data-driven decisions and optimize its innovation strategies.

Conclusion

Tracking innovation is an essential component for organizations seeking to maintain competitive advantage. By leveraging the right tools, businesses can cultivate a robust culture of innovation, ensuring ideas are nurtured from conception to implementation. Whether through idea management platforms, project management software or analytics tools, the right technology can empower organizations to remain agile and innovative in a dynamic market.

Bottom line: Futurology is not fortune telling. Futurists use a scientific approach to create their deliverables, but a methodology and tools like those in FutureHacking™ can empower anyone to engage in futurology themselves.

Image credit: Pixabay

The Era of Moving Fast and Breaking Things is Over

GUEST POST from Greg Satell

On July 16th, 1945, when the world’s first nuclear explosion shook the plains of New Mexico, the leader of the Manhattan Project, J. Robert Oppenheimer quoted from the Bhagavad Gita, “Now I am become Death, the destroyer of worlds.” Clearly, he was troubled by what he had unleashed and for good reason. The world was never truly the same after that.

Today, however, we have lost much of that reverence for the power of technology. Instead of proceeding deliberately and with caution, tech entrepreneurs have prided themselves on their willingness to “move fast and break things” and, almost reflexively, casually deride anyone who questions the practice as those who “don’t get it.”

It’s hard to see how, by any tangible metric, any of this has made us better off. We set out to disrupt industries, but disrupted people instead. It wasn’t always like this. Throughout our history we have asked hard questions and made good choices about technological progress. As we enter a new era of innovation, we desperately need to recapture some of that wisdom.

How We Put the Nuclear Genie Back in the Bottle

The story of nuclear weapons didn’t start with Oppenheimer, not by a long shot. In fact, if we were going to attribute the Manhattan Project to a single person, it would probably be a Hungarian immigrant physicist named Leo Szilard, who was one of the first to conceive of the possibility of a nuclear chain reaction.

In 1939, upon hearing of the discovery of nuclear fission in Germany he, along with fellow Hungarian emigre Eugene Wigner, decided that the authorities needed to be warned. Szilard then composed a letter warning of the possibility of a nuclear bomb that was eventually signed by Albert Einstein and sent to President Roosevelt. That’s what led to the American development program.

Yet after the explosions at Hiroshima and Nagasaki, many of the scientists who worked to develop the bomb wanted to educate the public of its dangers. In 1955, the philosopher Bertrand Russell issued a manifesto signed by a number of scientific luminaries. Based on this, a series of conferences at Pugwash, Nova Scotia were convened to discuss different approaches to protect the world from weapons of mass destruction.

These efforts involved far more than talk; they helped to shape the non-proliferation agenda and led to concrete achievements such as the Partial Test Ban Treaty. In fact, these contributions were so important that the organizers of the Pugwash conferences were awarded the Nobel Peace Prize in 1995, and the conferences continue even today.

Putting Limits On What We Do With the Code of Life

While the nuclear age started with a bang, the genetic age began with a simple article in the scientific journal Nature, written by two relatively unknown scientists named James Watson and Francis Crick, that described the structure of DNA. It was one of those few watershed moments when an entirely new branch of science arose from a single event.

The field progressed quickly and, roughly 20 years later, a brilliant researcher named Paul Berg discovered that you could merge human DNA with that from other living things, creating new genetic material that didn’t exist in nature. Much like Oppenheimer, Berg understood that, due to his work, humanity stood on a precipice and it wasn’t quite clear where the edge was.

He organized a conference at Asilomar State Beach in California to establish guidelines. Importantly, participation wasn’t limited to scientists. A wide swath of stakeholders were invited, including public officials, members of the media and ethical specialists. The result, now known as the Berg Letter, called for a moratorium on the riskiest experiments until the dangers were better understood. These norms were respected for decades.

Today, we’re undergoing another revolution in genomics and synthetic biology. New technologies, such as CRISPR and mRNA techniques, have opened up incredible possibilities, but also serious dangers. Yet here again, pioneers in the field like Jennifer Doudna are taking the lead in devising sensible guardrails and using the technology responsibly.

The New Economy Meets the New Era of Innovation

When Netscape went public in 1995, it hit like a bombshell. It was the first big Internet stock and, although originally priced at $14 per share, it opened at double that amount and quickly zoomed to $75. By the end of the day, it had settled back at $58.25. Still, a tiny enterprise with no profits was almost instantly worth $2.9 billion.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet today, it’s clear that the “new economy” was a mirage. Despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

The digital revolution has been a real disappointment. In fact, when you look at outcomes, if anything we’re worse off. Rather than a democratized economy, market concentration has markedly increased in most industries. Income inequality in advanced economies has soared. In America wages have stagnated and social mobility has declined for decades. At the same time, social media has been destroying our mental health.

Now we’re entering a new era of innovation, in which we will unleash technologies much more powerful. New computing architectures like quantum and neuromorphic technologies will power things like synthetic biology and materials science to create things that would have seemed like science fiction a generation ago. We simply can no longer afford to be so reckless.

Shifting From Agility Toward Resilience

Moving fast and breaking things only seems like a good idea in a stable world. When you operate in a safe environment, it’s okay to take a little risk and see what happens. Clearly, we no longer live in such a world (if we ever did). Taking on more risk in financial markets led to the Great Recession. Being blasé about data security has nearly destroyed our democracy. Failure to prepare for a pandemic nearly brought modern society to its knees.

Over the next decade, the dangers will only increase. We will undergo four major shifts in technology, resources, migration and demographics. To put that in perspective, a similar shift in demography was enough to make the 60s a tumultuous decade. We haven’t seen a confluence of so many disruptive forces since the 1920s and that didn’t end well.

Unfortunately it’s far too easy to underinvest in mitigating a risk that may never come to fruition. Moving fast and breaking things can seem attractive because the costs are often diffuse. Although it has impoverished society as a whole and made us worse off in many ways, it has created a small cadre of fabulously wealthy plutocrats.

Yet history is not destiny. We have the power to shape our path by making better choices. We can abandon the cult of disruption and begin to invest in resilience. In fact, we have to. By this point there should be no doubt that the dangers are real. The only question is whether we will act now or simply wait for it to happen and accept the consequences.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Case Studies of Companies Leading in Inclusive Design

GUEST POST from Chateau G Pato

In today’s rapidly evolving marketplace, inclusive design has become a cornerstone for innovation and effective product development. Companies that prioritize inclusivity not only enhance user experience but also expand their market reach and foster customer loyalty. Let’s examine two leading companies at the forefront of inclusive design in their industries.

Case Study 1: Microsoft – Empowering Everyone

Background

Microsoft has been a trailblazer in the realm of inclusive design, recognizing that the true potential of technology lies in its ability to serve the needs of all users, regardless of their abilities or circumstances.

Inclusive Design Initiatives

The company has implemented several initiatives aimed at making computing accessible to everyone. One of their landmark products is the Xbox Adaptive Controller, designed for gamers with limited mobility. The controller features large programmable buttons and connectors for external devices, offering a customizable experience for individuals with diverse physical needs.

Impact

Microsoft’s commitment to inclusivity extends beyond product development. They actively engage with the community to understand accessibility challenges and work with disabled individuals to co-create solutions. This initiative has not only opened up gaming to a broader audience but has also set a new standard for inclusive product design in the tech industry.

Case Study 2: OXO – Universal Design in Everyday Tools

Background

OXO, a manufacturer of kitchen and household tools, has long championed the principles of universal design, creating products that cater to a wide spectrum of users with varying needs.

Inclusive Design Initiatives

The company’s journey into inclusive design began with the design of the iconic OXO Good Grips line in the 1990s. These tools featured comfortable grips and easy-to-use mechanisms, specifically addressing the needs of individuals with arthritis but providing benefits to all users. This ethos of inclusivity is evident in OXO’s continued dedication to research and user feedback in crafting its products.

Impact

OXO’s approach to inclusive design has transformed everyday objects into accessible tools, helping many people with dexterity challenges enjoy cooking and daily tasks. The success of OXO’s products demonstrates that inclusivity can be a key differentiator in crowded markets, appealing to both niche and mass-market segments.

Conclusion

The commitment to inclusive design by companies like Microsoft and OXO illustrates the potential for innovation when diversity and accessibility are prioritized. By creating products that serve a broader range of users, businesses can not only drive social impact but also achieve significant business success. As more companies follow suit, inclusive design will undoubtedly continue to transform industries and enhance consumer experiences around the globe.

Bottom line: Futurology is not fortune telling. Futurists use a scientific approach to create their deliverables, but a methodology and tools like those in FutureHacking™ can empower anyone to engage in futurology themselves.

Image credit: Unsplash

Subscribe to Human-Centered Change & Innovation Weekly. Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.

Finding the Right Physician Advisor for a Healthcare Startup

GUEST POST from Arlen Meyers

It has never been easier to create a sickcare startup, particularly in digital health. Part of that process requires that founders find the right players to be on the team. In many instances, that will involve finding physician advisors or consultants.

But, how do you find the right physician advisors?

Here are some tips:

  1. Clearly define the optimal candidate by writing a job description that includes the knowledge, skills, attitudes and competencies you want. Are you looking for someone with an entrepreneurial mindset or just domain expertise? One expert suggests that advisors should be able to communicate a deep understanding of their domain and grasp the context of both their own organizations and those they work with. Emotional competence is also essential to developing strong interpersonal skills and succeeding in any workplace. Advisors should strive to be effective teachers and build a large network of human connections. Finally, an ethical compass will matter as algorithm-driven machines begin to make morally weighted decisions.
  2. Look for past experience and results.
  3. Decide how much and what kind of compensation you are prepared to offer, in cash, equity or both.
  4. Make it clear how long you want to engage your advisor. Is it for one hour, one year or more? Or maybe it’s best to try before you buy and hire for a renewable three-month term.
  5. Clearly define your expectations, deliverables and timelines, and how you will measure results. What roles, holes and goals do you want your advisor to fill?
  6. Solicit candidates through networks, social media channels, word-of-mouth referrals or a call to action on your website or other marketing collateral.
  7. Screen candidates against the criteria you have defined.
  8. Decide whether you want someone to fill a business or a clinical advisory position. Finding a clinical business advisor is difficult, since few doctors or other healthcare professionals have both a clinical and an entrepreneurial mindset along with the knowledge, skills, abilities and competencies to help you reach your next critical success endpoints.
  9. Agree on whether you are hiring for periodic strategic input or more tactical, hands-on execution.
  10. Interview candidates to see whether they meet your requirements and are the right fit.
  11. Negotiate an agreement.
  12. Execute an advisory services agreement that defines the terms and conditions of the relationship.

Finding the right physician advisors is an important part of recruiting your startup team. Don’t hire someone simply because of the initials after their name.

Image credits: Pixabay

Integrating Agile Practices into Non-Software Projects

GUEST POST from Art Inteligencia

Agile practices are often celebrated in the software development realm, promising flexibility, responsiveness, and enhanced collaboration. But the principles of Agile extend well beyond software. At its core, Agile strives to deliver value and facilitate continuous improvement, making it a valuable methodology for a variety of disciplines. In this article, we will explore how Agile practices can be integrated into non-software projects, supported by two compelling case studies.

Case Study 1: Agile in Marketing Campaign Management

Background: A global retail company, RetailCorp, faced challenges with their traditional marketing campaign management process, which was rigid, slow to adapt to market trends, and resulted in delayed campaign launches.

Agile Implementation: RetailCorp adopted Scrum, one of the most popular Agile frameworks, for their marketing team. They formed a cross-functional team including designers, content creators, data analysts, and campaign managers to collaborate and focus on delivering incremental value. Daily stand-ups, sprint planning, and retrospectives were introduced to the non-software team.

Outcomes:

  • Increased Flexibility: The marketing team could swiftly pivot strategies in response to competitors’ actions or new market data.
  • Enhanced Collaboration: The cross-functional team dynamic fostered innovation and creative problem-solving.
  • Reduced Time to Market: Campaigns were launched 30% faster compared to the previous process.

Case Study 2: Agile in Product Design and Development

Background: DesignStudio, a company specializing in developing consumer electronics, sought a way to accelerate their product design and development timeline without compromising quality.

Agile Implementation: DesignStudio embraced Kanban, aiming for a leaner workflow. They visualized the design and development process using Kanban boards, which provided transparency and facilitated the spotting and resolution of bottlenecks.

Outcomes:

  • Improved Workflow Efficiency: By limiting work in progress, DesignStudio minimized context-switching and improved focus.
  • Enhanced Quality: Continuous feedback loops ensured that design flaws were identified and corrected earlier in the process.
  • Faster Development Lifecycle: Products were designed and ready for market 25% faster.
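The WIP-limit mechanism behind these gains can be sketched in a few lines of code. This is a minimal illustration only; the column names and limits are hypothetical, not DesignStudio’s actual board:

```python
class KanbanBoard:
    """A toy Kanban board that enforces work-in-progress (WIP) limits per column."""

    def __init__(self, wip_limits: dict[str, int]):
        self.wip_limits = wip_limits
        self.columns = {name: [] for name in wip_limits}

    def add(self, column: str, card: str) -> bool:
        """Pull a card into a column only if its WIP limit allows it."""
        if len(self.columns[column]) >= self.wip_limits[column]:
            return False  # limit reached: finish existing work before starting more
        self.columns[column].append(card)
        return True

board = KanbanBoard({"design": 2, "build": 3, "review": 1})
assert board.add("design", "sketch enclosure")
assert board.add("design", "select display")
assert not board.add("design", "draft packaging")  # WIP limit blocks a third card
```

Refusing the third card is the whole point: the blocked pull makes the bottleneck visible on the board, prompting the team to finish in-progress work rather than context-switch.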

Keys to Successful Agile Integration in Non-Software Projects

Here are several strategies for successfully integrating Agile practices into non-software projects:

  • Adapt and Tailor: Customize Agile practices to fit the unique requirements and constraints of your non-software projects.
  • Focus on Training: Provide comprehensive Agile training to ensure teams understand the principles and can swiftly adapt.
  • Emphasize Collaborative Culture: Foster an environment where open communication and collaboration are prioritized, breaking down traditional silos.
  • Measure and Iterate: Regularly assess the effectiveness of Agile practices in achieving project goals and iterate for continuous improvement.

By harnessing Agile practices, non-software projects can achieve higher levels of efficiency, flexibility, and quality. The principles underpinning Agile aren’t limited to software; they are about fostering a culture of adaptability, continuous learning, and value-driven outcomes. As organizations continue to evolve in competitive landscapes, Agile methodologies offer a powerful tool for achieving sustainable success.

Image credit: Unsplash

Measuring the Impact of Social and Environmental Innovation

GUEST POST from Chateau G Pato

As we advance into an era of conscientious capitalism, the role of social and environmental innovation has become more critical than ever. Organizations are increasingly measured not just on their financial performance, but on their ability to generate positive social and environmental outcomes. However, to truly recognize the value of these innovations, we must develop robust methods for measuring their impact.

In this article, we’ll explore key strategies for evaluating the impact of social and environmental innovation, supported by two illustrative case studies.

Importance of Measuring Impact

Measuring impact is vital for several reasons. It provides accountability, guiding companies to deliver on their promises. It also helps in securing funding and support from stakeholders and enhances decision-making by providing insights into what works and what doesn’t. Moreover, clear metrics can foster increased transparency and trust between an organization and its stakeholders.

Approaches to Measuring Impact

While there is no one-size-fits-all approach, several methodologies can be used to measure impact:

  • Social Return on Investment (SROI): This method quantifies the social, environmental, and economic value created by an organization relative to the resources invested.
  • Triple Bottom Line (TBL): Focuses on people, planet, and profit, evaluating social and environmental performance alongside financial outcomes.
  • Key Performance Indicators (KPIs): Specific metrics tailored to a project’s goals, offering a direct line to assessing impact.
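The SROI arithmetic in the first bullet is simply the ratio of monetized value created to resources invested. A minimal sketch, using entirely hypothetical figures:

```python
def sroi_ratio(total_value_created: float, total_investment: float) -> float:
    """SROI = (monetized social + environmental + economic value) / resources invested."""
    if total_investment <= 0:
        raise ValueError("Investment must be positive")
    return total_value_created / total_investment

# Hypothetical program: $250,000 invested, $1,000,000 of monetized outcomes.
ratio = sroi_ratio(1_000_000, 250_000)
print(f"Every $1 invested returns ${ratio:.2f} of social value")  # $4.00
```

The hard part in practice is not the division but the numerator: monetizing outcomes like improved health or education requires explicit, defensible proxy values.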

Case Study 1: Interface, Inc.

Background

Interface, Inc., one of the largest global manufacturers of modular carpet, embarked on a transformative mission to become a fully sustainable enterprise. Their initiative, Mission Zero, aimed to eliminate any negative impact the company had on the environment by 2020.

Measuring Impact

Interface used a comprehensive approach to measure its environmental innovations — they tracked metrics such as carbon emissions, water usage, and waste reduction. They also calculated their progress toward Mission Zero goals, establishing clear KPIs and regularly publishing sustainability reports.

Results

By the end of 2020, Interface had reduced its greenhouse gas emissions by 96% and its waste to landfill by 91% against a 1996 baseline, all while increasing the recycled and bio-based content of its products.

Case Study 2: The MicroLoan Foundation

Background

The MicroLoan Foundation provides small loans, business training, and mentorship to women in sub-Saharan Africa, aiming to lift communities out of poverty through female entrepreneurship.

Measuring Impact

This organization uses a Social Return on Investment (SROI) framework to evaluate the socioeconomic impact of its programs. They assess metrics such as income increase, business success rate, and improvements in quality of life. Moreover, they track the ripple effect within communities, measuring how these microloans improve education and healthcare access.

Results

Women supported by the MicroLoan Foundation reported a 96% success rate in their businesses, with significant improvements in household income and education access for their children, demonstrating a substantial SROI.

Moving Forward

As businesses aim to achieve sustainable and inclusive growth, the ability to precisely measure social and environmental impact becomes a vital asset. By leveraging diverse measurement strategies, companies can ensure they are not only contributing positively to society and the environment but are also reaping the rewards of their efforts through enhanced reputation and stakeholder trust.

Ultimately, the evolving landscape of business underscores that financial gain and social good do not have to be mutually exclusive but can coexist to create a more inclusive and sustainable future.

As leaders in change and innovation, let us commit to not just measuring outcomes, but driving meaningful impact that transforms lives and safeguards our planet.

Image credit: Unsplash

Embracing Failure – Lessons Learned from Setbacks

GUEST POST from Art Inteligencia

In the world of innovation, failure is not just inevitable, it’s essential. Embracing failure can lead to groundbreaking discoveries, foster resilience, and cultivate a culture that thrives on learning. While the stigma of failure persists, forward-thinking organizations understand that embracing setbacks is a cornerstone of progress. Here, we explore two compelling case studies that illustrate how failure can be transformed into a stepping stone for future success.

Case Study 1: The Rise of Airbnb

When Brian Chesky and Joe Gebbia first conceived the idea of renting out air mattresses on their apartment floor, their concept wasn’t an overnight sensation. The fledgling platform struggled, with its initial website launch garnering disappointing engagement. The duo faced numerous rejections from investors, many of whom doubted the viability of the idea. However, rather than viewing these setbacks as failures, the team saw them as opportunities to refine their model and focus on user experience.

Lessons Learned:

  • Pivoting is powerful: Chesky and Gebbia used feedback from failures to adapt their business model, eventually redefining the travel and lodging industry.
  • Persistence is key: Despite numerous rejections, they persisted, displaying resilience that would eventually lead to Airbnb’s global success.

Case Study 2: The WD-40 Story

WD-40, now a staple in households worldwide, originated from a series of failures. The product’s creation was the result of 39 unsuccessful attempts to develop a formula to prevent corrosion. Instead of seeing these failed attempts as a loss, the creators viewed each one as a learning opportunity. The breakthrough came with the 40th formula, hence the name “WD-40,” which stands for “Water Displacement, 40th formula.”

Lessons Learned:

  • Learning from repetition: Every failed attempt provided valuable data, ultimately leading to a successful product.
  • Failure can fortify determination: Forty attempts at the formula underscore how determination can lead to ultimate success.

Conclusion

Both of these stories demonstrate that failure is not the opposite of success; it is part of its journey. Organizations willing to embrace failure cultivate a learning culture, fostering innovation and improvement. Embracing failure also sets the stage for transformational change as each setback provides the chance to learn, innovate, and ultimately succeed.

Image credit: Unsplash

Creating Seamless Omnichannel Experiences

GUEST POST from Chateau G Pato

The modern consumer demands a unified and personalized experience across all channels of interaction. Whether they’re shopping online, on a mobile app, or in-store, customers expect consistency, efficiency, and a connected narrative from brands. Achieving this seamless omnichannel experience requires not just technological integration but a fundamental shift in how businesses think about customer journeys.

Understanding Omnichannel Experience

A true omnichannel experience is much more than simply being present on multiple channels. It requires the integration of every communication and sales channel to reflect a unified and personalized journey for the customer. This involves harmonizing data, creating consistent brand messaging, and ensuring that customers can switch between channels effortlessly, with the assurance that the company recognizes them at every touchpoint.

Key Elements of a Seamless Omnichannel Experience

  • Unified Data: Implement solutions that can centralize customer data from all channels, allowing for a personalized approach in real-time.
  • Consistent Branding: Ensure your brand message, style, and tone are consistent across every channel.
  • Integrated Technology: Use platforms that allow for seamless transitions and communication between channels.
  • Customer-Centric Approach: Design experiences from the customer’s perspective for ease of use and satisfaction.
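As a rough illustration of the “unified data” element above, the sketch below merges per-channel events into a single profile keyed by customer ID. The field names and event shape are assumptions for illustration, not any particular platform’s schema:

```python
from collections import defaultdict

def unify(events):
    """Merge channel events into one per-customer profile."""
    profiles = defaultdict(lambda: {"channels": set(), "orders": 0})
    for e in events:
        p = profiles[e["customer_id"]]
        p["channels"].add(e["channel"])       # remember every touchpoint
        p["orders"] += e.get("orders", 0)     # aggregate activity across channels
    return dict(profiles)

events = [
    {"customer_id": "c1", "channel": "app", "orders": 2},
    {"customer_id": "c1", "channel": "store", "orders": 1},
    {"customer_id": "c2", "channel": "web", "orders": 1},
]
profiles = unify(events)
assert profiles["c1"]["orders"] == 3
assert profiles["c1"]["channels"] == {"app", "store"}
```

The single profile is what lets a brand recognize the same customer in the app and at the register, rather than treating each channel’s records as a separate person.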

Case Study 1: Starbucks

Starbucks is a pioneer in delivering seamless omnichannel experiences. Through their mobile app, they have successfully integrated numerous channels to enrich customer interaction. Customers can order ahead on their app, earn and redeem loyalty points, review past orders, and pay for purchases—all within a unified ecosystem. This integration has not only enhanced customer satisfaction but also increased sales, as it supports customers in deciding when and how to make purchases.

Additionally, Starbucks ensures that their promotions, brand messages, and updates are consistent across all channels, from their app to in-store displays and advertisements. This consistency reinforces their brand identity and helps maintain a cohesive customer experience.

Case Study 2: Disney

Disney offers another exemplary omnichannel experience, notably through their parks and resorts. The company has designed its My Disney Experience app to act as a comprehensive planning and guide tool for visitors. Before their visit, customers can book tickets, make dining reservations, and plan their itinerary. On the day of the visit, the app transforms into a navigator, with features like wait-time updates, interactive maps, and mobile ordering.

The seamless experience extends to physical locations with the MagicBand technology, which serves as an entry ticket, room key, and payment method. By providing a blend of digital and in-store interactions that are flawlessly connected, Disney ensures that their customers can focus on experiences, not logistics.

Conclusion

The journey towards creating seamless omnichannel experiences involves embracing both technological integration and a commitment to customer-centric innovation. By studying leaders like Starbucks and Disney, organizations can glean valuable insights into designing a strategy that fulfills today’s customer expectations. Future-ready omnichannel experiences are essential for maintaining competitive edge and fostering long-lasting customer relationships.

Image credit: Unsplash

Driving Innovation Through Empathy, Leadership and Understanding

GUEST POST from Art Inteligencia

In the rapidly evolving world of business, innovation stands as a critical driver for success. While processes, structures, and technologies play substantial roles, the human element—particularly empathy—holds profound potential. Empathy allows leaders to deeply understand and genuinely connect with their teams and customers, fostering an environment where innovation thrives. This article explores the intricate relationship between empathy and leadership, anchored by compelling case studies that illustrate transformative outcomes when empathy is prioritized.

Case Study 1: The LEGO Group

LEGO, the beloved toy company, experienced significant challenges in the early 2000s. The company was nearing bankruptcy due to a failure to adapt to the changing interests of its core audience—children. The leadership team at LEGO realized a need to step back and adopt a fresh perspective grounded in empathy.

The turnaround strategy, famously termed “LEGO’s Business Transformation,” required the leadership to immerse themselves in the world of their customers—children. By spending time observing and interacting with children during play sessions, LEGO’s leaders understood the emotional and creative needs of their audience. This led to innovations like the immensely popular LEGO Friends series, which was designed based on detailed feedback from young girls who were previously underserved by traditionally boy-oriented LEGO products.

The result was not only an incredible resurgence in profitability but also an innovation culture that prioritizes deep customer connection and iterative feedback—a testament to the power of empathy-driven leadership.

Case Study 2: Microsoft’s Cultural Transformation

When Satya Nadella became the CEO of Microsoft in 2014, the company was seen as a bureaucratic giant struggling to compete with more nimble tech innovators. Nadella’s leadership focused heavily on empathy, both internally across Microsoft’s vast workforce and externally toward customers.

Internally, Nadella encouraged a cultural shift from a “know-it-all” to a “learn-it-all” philosophy. He challenged teams to use empathy to transform customer engagement strategies and product development processes. A concrete example is the development of features for people with disabilities, inspired by Nadella’s personal experiences as a father of a child with special needs.

This empathy-first approach led to breakthrough innovations such as Seeing AI, an app that narrates the world for the visually impaired, exemplifying how deep understanding and leadership empathy could drive product innovation while simultaneously enhancing Microsoft’s brand value and market relevance.

Conclusion

Empathy enables leaders to connect deeply with their teams and customers, providing a compass that guides innovative practices. The stories of LEGO and Microsoft underscore the profound impact that empathy can have when it shapes leadership strategies. As businesses grapple with complex challenges, those that integrate empathy into the very fabric of their leadership are not only poised to innovate but to do so in a manner that genuinely resonates with human needs.

In embracing empathy, leaders unlock the key to sustainable innovation, transforming their organizations into environments where understanding, creativity, and impact coexist harmoniously.

Image credit: Pexels