From Concept to Creation: A Guide to Ideation

GUEST POST from Chateau G Pato

In the ever-evolving landscape of innovation, transitioning from a broad concept to a tangible creation can often be the most challenging yet rewarding journey an organization can undertake. Ideation, the creative process of generating, developing, and communicating new concepts, is a cornerstone of this journey. As a thought leader in human-centered change and innovation, it’s my pleasure to guide you through the critical stages of ideation using two compelling case studies as illustrations.

The Ideation Process

The ideation process involves several key stages: inspiration, creative generation, refinement, prototyping, and execution. Each phase is crucial, requiring both structured methodologies and a flexible mindset. Successful ideation fosters a culture of creativity and openness, leveraging diverse perspectives to develop solutions that resonate with real human needs.

Case Study 1: Airbnb – Revolutionizing Travel Accommodation

Inspiration:

The founders of Airbnb, Brian Chesky and Joe Gebbia, were struggling to pay rent in San Francisco in 2007. They saw an opportunity during a local design conference, when hotels were fully booked. This sparked the idea of renting out air mattresses in their apartment to attendees looking for affordable accommodation.

Creative Generation:

The idea expanded beyond their immediate need. Chesky and Gebbia, alongside Nathan Blecharczyk, envisioned a platform where homeowners could list and rent spaces globally. This was revolutionary, challenging the traditional hotel industry.

Refinement and Prototyping:

Initial website versions were simple, but enough to validate the concept through real users. Continuous feedback helped refine the platform to better match user needs, laying the foundation for what Airbnb is today.

Execution:

Airbnb launched officially in 2008 and has since grown exponentially, leveraging lessons learned from user feedback and scaling the model to accommodate millions of users worldwide.

Case Study 2: The Dyson Vacuum Cleaner – Engineering Innovation

Inspiration:

James Dyson, frustrated with the inefficiency of traditional vacuum cleaners, set out to find a solution. Noticing the industrial cyclone separators used in sawmills, he was inspired to apply similar technology to home vacuuming.

Creative Generation:

The concept of a bagless vacuum cleaner took shape. Dyson’s vision was to create a powerful vacuum that maintained its suction, unlike traditional models, which lost power as their bags filled.

Refinement and Prototyping:

Dyson created more than 5,000 prototypes across five years, iterating designs based on performance and user input. This relentless refinement was driven by his commitment to solving a real problem.

Execution:

The Dyson DC01 launched in 1993 and revolutionized the market. Its success built upon Dyson’s perseverance through ideation stages, ultimately establishing a new standard in home cleaning technology.

Key Takeaways

Both Airbnb and Dyson exemplify the power of effective ideation. Here are a few key takeaways from their journeys:

  • User-Centric Mindset: Focus on understanding and solving real user problems.
  • Iterative Prototyping: Test, learn, and refine ideas continuously.
  • Persistence and Flexibility: Stay committed to your vision, but be flexible enough to adapt based on feedback and new insights.

Conclusion

The journey from concept to creation demands a balance of creativity, strategy, and resilience. By fostering a culture that embraces these qualities, organizations can transform great ideas into groundbreaking innovations. Remember, successful ideation is not just about having a bright idea — it’s about nurturing that idea through each phase of its evolution, just as seen in the transformative journeys of Airbnb and Dyson.

Extra Extra: Because innovation is all about change, Braden Kelley’s human-centered change methodology and tools are the best way to plan and execute the changes necessary to support your innovation and transformation efforts, all while literally getting everyone on the same page for change. Find out more about the methodology and tools, including the book Charting Change, by following the link. Be sure to download the TEN FREE TOOLS while you’re here.

Image credit: Unsplash


A New Age Of Innovation and Our Next Steps

GUEST POST from Greg Satell

In Mapping Innovation, I wrote that innovation is never a single event, but a process of discovery, engineering, and transformation, and that those three things hardly ever happen at the same time or in the same place. Clearly, the Covid-19 pandemic marked an inflection point that demarcated several important shifts in those phases.

Digital technology showed itself to be transformative, as we descended into quarantine and found an entire world of video conferencing and other technologies that we scarcely knew existed. At the same time it was revealed that the engineering of synthetic biology—and mRNA technology in particular—was more advanced than we had thought.

This is just the beginning. I titled the last chapter of my book, “A New Era of Innovation,” because it had become clear that we had begun to cross a new Rubicon, one in which digital technology becomes so ordinary and mundane that it’s hard to remember what life was like without it, while new possibilities alter existence to such an extent that we will scarcely believe it.

Post-Digital Architectures

For the past 50 years, the computer industry, and information technology in general, has been driven by the principle known as Moore’s Law, which held that the number of transistors on a chip would double roughly every 18 months. Yet Moore’s Law is now ending, and that means we will have to revisit some very basic assumptions about how technology works.
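
To make the pace of that 18-month doubling concrete, here is a minimal, purely illustrative sketch in Python; the starting transistor count (roughly that of an early-1970s microprocessor) and the time horizon are assumptions chosen for illustration, not figures from the article:

    # Illustrative only: the compounding implied by one doubling every 18 months,
    # the cadence commonly associated with Moore's Law.
    def transistors_after(years, start=2_300, doubling_period_years=1.5):
        """Projected transistor count after `years`, assuming one doubling per period."""
        return start * 2 ** (years / doubling_period_years)

    # Starting from ~2,300 transistors and projecting forward:
    for y in (0, 10, 25, 50):
        print(f"after {y:>2} years: ~{transistors_after(y):,.0f} transistors")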

To be clear, the end of Moore’s Law does not mean the end of advancement. There are a number of ways we can speed up computing. We can, for instance, use technologies such as ASICs and FPGAs to optimize chips for specialized tasks. Still, those approaches come with tradeoffs; Moore’s Law essentially gave us innovation for free.

Another way out of the Moore’s Law conundrum is to shift to completely new architectures, such as quantum, neuromorphic and, possibly, biological computers. Yet here again, the transition will not be seamless or without tradeoffs. Instead of technology based on transistors, we will have multiple architectures based on entirely different logical principles.

So it seems that we will soon be entering a new era of heterogeneous computing, in which we use digital technology to access different technologies suited to different tasks. Each of these technologies will require very different programming languages and algorithmic approaches and, most likely, different teams of specialists to work on them.

What that means is that those who run the IT operations in the future, whether that person is a vaunted CTO or a lowly IT manager, will be unlikely to understand more than a small part of the system. They will have to rely heavily on the expertise of others to an extent that isn’t required today.

Bits Driving Atoms

While the digital revolution does appear to be slowing down, computers have taken on a new role in helping to empower technologies in other fields, such as synthetic biology, materials science and manufacturing 4.0. These, unlike so many digital technologies, are rooted in the physical world and may have the potential to be far more impactful.

Consider the revolutionary mRNA technology, which not only empowered us to develop a Covid vaccine in record time and save the planet from a deadly pandemic, but also makes it possible to design new vaccines in a matter of hours. There is no way we could achieve this without powerful computers driving the process.

There is similar potential in materials discovery. Suffice it to say, every product we use, whether it is a car, a house, a solar panel or whatever, depends on the properties of materials to perform its function. Some need to be strong and light, while others need special electrical properties. Powerful computers and machine learning algorithms can vastly improve our ability to discover better materials (not to mention overcome supply chain disruptions).

Make no mistake, this new era of innovation will be one of atoms, not bits. The challenge we face now is to develop computer scientists who can work effectively with biologists, chemists, factory managers and experts of all kinds to truly create a new future.

Creation And Destruction

The term creative destruction has become so ingrained in our culture that we scarcely stop to think where it came from. It was popularized by economist Joseph Schumpeter to overcome what many saw as an essential “contradiction” of capitalism. Essentially, some thought that if capitalists did their jobs well, there would be increasing surplus value, which would then be appropriated to accumulate power and rig the system further in capitalists’ favor.

Schumpeter pointed out that this wasn’t necessarily true because of technological innovation. Railroads, for example, completely changed the contours of competition in the American Midwest. To be sure, unfair competition had existed in many cities and towns, but once the railroad came to town, competition flourished (and if it didn’t come, the town died).

For most of history since the beginning of the Industrial Revolution, this has been a happy story. Technological innovation displaced businesses and workers, but resulted in increased productivity which led to more prosperity and entirely new industries. This cycle of creation and destruction has, for the most part, been a virtuous one.

That is, until fairly recently. Digital technology, despite the hype, hasn’t produced the kind of productivity gains that earlier technologies, such as electricity and the internal combustion engine, did, and it has actually displaced labor at a faster rate. Put simply, the productivity gains from digital technology are too meager to finance enough new industries with better jobs, which has created income inequality rather than greater prosperity.

We Need To Move From Disrupting Markets To Tackling Grand Challenges

There’s no doubt that digital technology has been highly disruptive. In industry after industry, from retail to media to travel and hospitality, nimble digital upstarts have turned established industries on their heads, completely changing the basis upon which firms compete. Many incumbents haven’t survived. Many others are greatly diminished.

Still, in many ways, the digital revolution has been a huge disappointment. Besides the meager productivity gains, we’ve seen a global rise in authoritarian populism, stagnant wages, reduced productivity growth and weaker competitive markets, not to mention an anxiety epidemic, increased obesity and, at least in the US, decreased life expectancy.

We can, and must, do better. We can learn from the mistakes we made during the digital revolution and shift our mindset from disrupting markets to tackling grand challenges. This new era of innovation will give us the ability to shape the world around us like never before, down to the molecular level, and to achieve incredible things.

Yet we can’t just leave our destiny to the whims of market and technological forces. We must actually choose the outcomes we prefer and build strategies to achieve them. The possibilities that we will unlock from new computing architectures, synthetic biology, advanced materials science, artificial intelligence and other things will give us that power.

What we do with it is up to us.

Image credit: Pixabay
