In 2014, when Silicon Valley was still largely seen as a force for good, George Packer wrote in The New Yorker that tech entrepreneurs tended to see politics through the lens of an engineering mindset. Their first instinct was to treat every problem as if it could be reduced to discrete variables and solved like an equation.
Despite its romantic illusions, the digital zeitgeist merely echoed more than a century of failed attempts to generalize engineering approaches, such as scientific management, financial engineering, Six Sigma and shareholder value. All showed initial promise and then disappointed, in some cases catastrophically.
Proponents of the engineering mindset tend to blame its failures on poor execution. Surely, logic would suggest, as long as a set of principles is internally consistent, it should be externally relevant. Yet the problem is that reality is not simple and clear-cut, but complex and nonlinear, which is why we need to be ready to adapt to the unexpected and the nonsensical.
The Rise of the Engineering Mindset
In the 1920s, a group of intellectuals in Berlin and Vienna, much like many of the Silicon Valley digerati today, became enamored with the engineering mindset. By this time electricity and internal combustion had begun to reshape the world and Einstein’s theory of relativity, confirmed in 1919, had reshaped our conception of the universe. It seemed that there was nothing that scientific precision couldn’t achieve.
Yet human affairs were just as messy as always. Just a decade before, Europe had blundered its way into the most horrible war in history. Social scientists still seemed no more advanced than voodoo doctors, and philosophers were still making essentially the same arguments the ancient Greeks had made two thousand years before.
It seemed obvious to them that human endeavors could be built on a more logical basis, and they saw a savior in Ludwig Wittgenstein and his Tractatus, which described a world made up of “atomic facts” that could be combined to create “states of affairs.” He concluded, famously, that “Whereof one cannot speak, thereof one must remain silent,” meaning that whatever could not be proved logically must be disregarded.
The intellectuals branded their movement logical positivism and based it on the principle of verificationism. Only verifiable propositions would be taken as meaningful. All other statements would be treated as silly talk and gobbledygook. Essentially, if it didn’t fit in an algorithm, it didn’t exist.
A Foundational Crisis
Unfortunately, and again much like the Silicon Valley denizens of today, the exuberant confidence of the logical positivists belied serious trouble beneath the surface. In fact, while the intellectuals in Berlin and Vienna were trying to put the social sciences on a more logical footing, logic itself was undergoing a foundational crisis.
At the root of the crisis was a strange paradox, which can be illustrated by the sentence, “The barber shaves all those men, and only those men, who do not shave themselves.” Notice the problem? If the barber shaves only the men who don’t shave themselves, then who shaves the barber? If he shaves himself, he violates the statement, and if he does not shave himself, he also violates it.
It seems a bit silly, but the Barber’s Paradox is actually a simplified version of Russell’s Paradox, which involves sets that are members of themselves and had baffled mathematicians and logicians for decades. Clearly, for a logical system to be valid and verifiable, statements need to be provably true or false; 2 + 2, for example, needs to always equal 4. Yet the paradox exposed a hole that no one seemed able to close.
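The bind the paradox creates can be made concrete in a few lines of code. The sketch below is a hypothetical illustration (the predicate name is invented for the example, not drawn from Russell): it treats “the barber shaves himself” as a boolean and checks, by brute force, whether any truth value satisfies the rule.

```python
# The Barber's Paradox as a brute-force consistency check: is there any truth
# value for "the barber shaves himself" that satisfies the rule?

def rule_holds(barber_shaves_himself: bool) -> bool:
    # The rule "the barber shaves x if and only if x does not shave himself",
    # applied to the barber: shaves(barber, barber) <-> not shaves(barber, barber)
    return barber_shaves_himself == (not barber_shaves_himself)

satisfying = [value for value in (True, False) if rule_holds(value)]
print(satisfying)  # -> [] : no consistent assignment exists
```

Neither truth value works, which is the whole point: the rule is not merely hard to satisfy, it is impossible to satisfy, and that is exactly the kind of hole a “verifiable” logical system cannot tolerate.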
Eventually, the situation came to a head when David Hilbert, one of the era’s most prominent mathematicians and a champion of the formalist program, proposed a program that rested on three pillars. First, mathematics needed to be shown to be complete, in that it worked for all statements. Second, mathematics needed to be shown to be consistent, with no contradictions or paradoxes allowed. Finally, all statements needed to be computable, meaning they yielded a clear answer.
The hope was that the foundational crisis would be resolved, the hole at the center of logic could be closed and the logical positivists could move along with their project.
The System Crashes
Hilbert and his colleagues received an answer faster than most had expected. In 1931, just 11 years after Hilbert proposed his foundational problems, 25-year-old Kurt Gödel published his incompleteness theorems. It wasn’t the answer anyone was expecting. Gödel showed that any logical system rich enough to express arithmetic could be either complete or consistent, but not both.
Put more simply, Gödel proved that every logical system will always crash; it’s only a matter of time. Logic would remain broken forever, and the positivists’ hopes were dashed. Obviously, you can’t engineer a society on top of a logical system that is itself hopelessly flawed. For better or for worse, the world would remain a messy place.
Yet the implications of the downfall of logic turned out to be far different, and far stranger, than anyone had expected. In 1937, building on Gödel’s proof, Alan Turing published his own paper on Hilbert’s computability problem. Much like the Austrian, he found that not all problems are computable, but with a silver lining: as part of his proof, he included a description of a simple machine that could compute every computable number.
Ironically, Turing’s machine would usher in a new era of digital computing. These machines, built with the knowledge that they will all eventually crash, have proven to be incredibly useful, as long as we accept them for what they are — flawed machines. As it turns out, to solve big, important problems, we often need to discard our illusions first.
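Turing’s “simple machine” is concrete enough to sketch: a read/write head stepping over a tape, driven by nothing but a finite rule table. The simulator below is a minimal illustration, not Turing’s original construction; its particular rule table (an assumption chosen for the example) increments a binary number written on the tape.

```python
# A minimal table-driven machine in the spirit of Turing's construction.
# This rule table (an illustrative assumption) adds 1 to a binary number,
# with the head starting at the rightmost bit.

def run(tape, max_steps=1000):
    tape = list(tape)
    head = len(tape) - 1
    state = "carry"
    rules = {
        # (state, symbol) -> (write, head_move, next_state)
        ("carry", "1"): ("0", -1, "carry"),  # 1 plus carry gives 0, keep carrying
        ("carry", "0"): ("1", 0, "done"),    # 0 plus carry gives 1, finished
        ("carry", None): ("1", 0, "done"),   # fell off the left edge: new leading digit
    }
    for _ in range(max_steps):
        if state == "done":
            return "".join(tape)
        symbol = tape[head] if head >= 0 else None
        if head < 0:                         # grow the tape on demand
            tape.insert(0, "")
            head = 0
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += move
    raise RuntimeError("machine did not halt within max_steps")

print(run("1011"))  # -> 1100
print(run("111"))   # -> 1000
```

The point of the sketch is structural: the machine is nothing but a tape and a lookup table, yet Turing showed that composing such tables suffices to compute anything that is computable at all.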
We Need to Think Less Like Engineers and More Like Gardeners
The 20th century ushered in a new era of science. We conquered infectious diseases, explored space and unlocked the genetic code. So, it was not at all unreasonable to want to build on that success by applying an engineering mindset to other fields of human endeavor. However, at this point, it should be clear that the approach is far past the point of saving.
It would be nice if general well-being could be reduced to a single metric like GDP, or the success of an enterprise fully encapsulated in a stock price. Yet today we live, as Danny Hillis has put it, in an age of entanglement, where even a limited set of variables can lead to the emergence of a new and unexpected order.
We need to take a more biological view, in which we think less like engineers and more like gardeners who grow and nurture ecosystems. The logical positivists had no idea what they were growing, but somehow what emerged from the soil they tilled turned out to be far more wondrous — not to mention exponentially more useful — than what they had originally intended.
As I wrote at the beginning of this crazy year, the time has come to rediscover our humanity. We are, in so many ways, at a crossroads. Technology will not save us. Markets will not save us. We simply need to make better choices.
In today’s fast-paced, competitive landscape, the ability to innovate is more critical than ever. Creativity is the engine of innovation, and leadership plays a pivotal role in fostering a culture where creativity can flourish. But what exactly can leaders do to cultivate creativity within their organizations?
Creating a Culture of Experimentation
Leaders must create an environment where experimentation is encouraged and failure is not stigmatized. This involves not only providing the resources and freedom needed for experimentation but also showing support when experiments don’t go as planned.
A culture of experimentation promotes risk-taking, which is essential for creativity. Employees must feel confident that their innovative ideas will be heard and respected, and that they will not be penalized for thinking outside the box.
Case Study: Google’s “20% Time”
Google has long been hailed as a leader in fostering a creative workplace culture. One of their groundbreaking policies was the “20% Time” initiative, where employees were allowed to spend 20% of their work time on projects that interested them, even if they were unrelated to their normal duties.
This policy led to the creation of successful products such as Gmail and AdSense. By empowering employees to explore their creative ideas without traditional constraints, Google harnessed the collective inventive potential of its workforce.
Empowering Diverse Voices
Diversity in thought and experience is a powerful driver of creativity. Leaders should actively cultivate a diverse and inclusive environment, encouraging participation and input from people of different backgrounds, disciplines, and perspectives.
By valuing diverse voices, organizations can enhance their problem-solving capabilities and drive more innovative solutions. Leaders must demonstrate a commitment to diversity not just in policy, but in practice.
Case Study: IBM’s Diversity Initiatives
IBM has long been at the forefront of diversity and inclusion, recognizing its importance to creativity and innovation. Their commitment to diversity is exemplified by their focused recruiting efforts and the establishment of programs that support women and minority groups.
IBM’s embrace of diversity has proven beneficial in creating innovation clusters within the company and has enabled the development of solutions that cater to a wider global audience.
Encouraging Continuous Learning
Leaders should promote a culture of continuous learning, where employees have the opportunity to develop their skills and knowledge. Providing access to learning resources and opportunities encourages employees to stay curious and capable, laying the groundwork for creativity and innovation.
Investing in employee development signals to the workforce that leadership values their growth, and it helps retain creative talent within the company.
The Leadership Mindset
Ultimately, the role of leadership in cultivating creativity goes beyond implementing policies and initiatives. It requires a mindset that values open communication, embraces uncertainty, and relentlessly supports the creative endeavors of its people.
Leaders must inspire trust and lead by example, consistently demonstrating a commitment to innovation. They need to be mentors and coaches, guiding individuals and teams toward creative breakthroughs.
In conclusion, by cultivating a culture of experimentation, empowering diverse voices, encouraging continuous learning, and embodying a supportive leadership mindset, leaders can unlock the creative potential of their organizations. Creativity is not just a function of individual brilliance; it’s the result of a thriving ecosystem nurtured by effective leadership.
Extra Extra: Futurology is not fortune telling. Futurists use a scientific approach to create their deliverables, but a methodology and tools like those in FutureHacking™ can empower anyone to engage in futurology themselves.
Image credit: Unsplash
Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.
Judging by the headlines on their LinkedIn profiles and their presence on social media, more and more MD/DOs are innovators, coaches, entrepreneurs and non-clinical consultants. Many are starting or working with biomedical and clinical startups, including a group of medical school graduates who skip residency to start their own companies. But:
They are not trained to do so
Entrepreneurship in the US has been in a downward spiral for the past 40 years.
In today’s fast-paced business environment, understanding the fundamental relationship between employee engagement and organizational productivity is paramount. As human-centered change and innovation thought leaders, we recognize that tackling productivity challenges isn’t about squeezing more output from workers but rather about unlocking their intrinsic motivations. This article explores the linkage between engagement and productivity, supplemented by two enlightening case studies.
The Engagement-Productivity Nexus
Employee engagement refers to the emotional commitment employees have towards their organization and its goals. Engaged employees tend to expend discretionary effort, driving innovation and propelling productivity. Conversely, disengaged employees may only fulfill the minimum requirements, stymie innovation, and harbor dissatisfaction.
The nexus between engagement and productivity is complex but demonstrably significant. Engaged employees are more likely to be aligned with company objectives, leading to enhanced collaborative efforts, reduced turnover, and increased profitability.
Case Study 1: Tech Innovators Inc.
Company Overview
Tech Innovators Inc., a global leader in software development, faced a major challenge two years ago when productivity metrics began declining across several departments. Employee engagement surveys indicated waning interest and rising burnout levels.
Intervention Strategy
The company’s leadership rolled out an initiative called “Engage for Change,” aimed at enhancing employee engagement through inclusive leadership practices. Key actions included:
Implementing a “Flexible Work Hours” policy to promote work-life balance.
Facilitating monthly “Innovation Days,” allowing teams to focus on passion projects outside their usual scope.
Establishing a transparent feedback channel with bi-weekly one-on-one sessions.
Results
Within six months, employee engagement scores rose by 35%, with productivity metrics following suit with a 20% increase. Employees reported feeling more valued and empowered, fostering a culture of innovation and dedication.
“The shift was palpable; when people feel heard and valued, they perform at their best,” noted the HR Director, Lisa Chen.
Case Study 2: GreenFuture Industries
Company Overview
GreenFuture Industries, a company committed to sustainable solutions, struggled with high turnover rates and lackluster performance. Internal assessments pointed to a lack of meaningful connection between employees’ roles and the company’s mission.
Intervention Strategy
To rejuvenate their workforce, GreenFuture introduced the “Mission Engagement Program.” Steps included:
Embedding sustainability goals in personal KPIs for all employees.
Hosting quarterly “Vision and Values” workshops to reiterate the organization’s objectives and how every role contributes.
Launching a mentorship program linking new hires with seasoned sustainability advocates within the company.
Results
The initiative resulted in a 40% decline in turnover and a 25% increase in productivity. Employees developed a renewed sense of purpose, aligning personal values with corporate goals.
“Our work started to feel like a personal mission, not just a job,” shared Senior Ecologist, Marcus Lee.
Conclusion
The evidence from these case studies underscores a compelling truth: engagement is the catalyst for productivity. Organizations that foster environments where employees feel valued, connected, and empowered are the ones that thrive. By understanding and deliberately enhancing the engagement-productivity link, companies can drive meaningful organizational change and innovate more effectively.
Leaders who prioritize engagement reap benefits far beyond productivity. They cultivate resilient cultures that adapt to change, promote creative problem-solving, and build lasting success. As we navigate the complexities of modern business, let us remain steadfast in our commitment to human-centric strategies that bridge the gap between engagement and productivity.
Extra Extra: Because innovation is all about change, Braden Kelley’s human-centered change methodology and tools are the best way to plan and execute the changes necessary to support your innovation and transformation efforts — all while getting everyone on the same page for change. Find out more about the methodology and tools, including the book Charting Change, by following the link. Be sure to download the TEN FREE TOOLS while you’re here.
Image credit: Pexels
In today’s rapidly evolving business environment, disruption is the new normal. Companies that manage to thrive amidst continuous change aren’t necessarily those with the most resources but those that are agile, innovative, and prepared. As we navigate industry disruptions, understanding how to adapt and innovate becomes crucial.
The Essence of Disruption
Disruption can arise from various avenues—technological breakthroughs, regulatory shifts, market dynamics, or global events. The key to navigating these disruptions lies not only in responding to them effectively but also in anticipating them and embedding adaptability into the organizational fabric.
Case Study 1: Netflix – From DVDs to Streaming
Netflix’s journey is perhaps the quintessential case study of strategic adaptability and innovation. Originally a DVD rental service, Netflix faced significant challenges as technology favored streaming over physical discs. The impending obsolescence of its original business model didn’t deter Netflix; instead, it served as a catalyst for transformation.
By investing heavily in streaming technology and content production, Netflix successfully pivoted to a digital-first model. This shift not only retained its customer base but expanded it exponentially across the globe, making it a leader in content streaming. The company’s commitment to innovation didn’t stop at distribution; Netflix then disrupted the industry again by producing original content, winning numerous accolades, and setting new standards in the entertainment sector.
Lessons Learned
Anticipate shifts in consumer behavior to stay ahead.
Invest in technology to support scalable change.
Don’t just adapt; innovate to define new industry standards.
Case Study 2: LEGO – Reinventing Through Innovation
LEGO’s story reflects a different, yet equally powerful narrative of navigating industry disruption. In the early 2000s, LEGO faced a significant crisis—falling sales, high debts, and the growing allure of digital games threatened its core business model based on physical play.
LEGO’s response to this disruption was multi-faceted. They realigned their product strategies focusing on core themes that resonated with their customer base like City, Star Wars, and Technic. More importantly, LEGO embraced digitalization, launching video games, movies, and interactive experiences that extended its brand universe beyond physical bricks.
The introduction of the LEGO Ideas platform also marked a pivotal innovation, allowing fans to design new sets with the potential for actual production. This not only sparked greater brand engagement but harnessed the creativity of its community, reinforcing customer loyalty and market relevance.
Lessons Learned
Engage with your customer community for insights and innovation.
Diversify offerings to stay relevant across changing consumer preferences.
Leverage your brand’s strengths while exploring new growth avenues.
Strategies for Confidence in Disruption
Based on the insights from the case studies above, the following strategies can help organizations confidently navigate disruptions:
Build an Agile Culture
Cultivate a culture that embraces change. This means encouraging experimentation, tolerating failures, and iterating quickly. When employees are empowered to innovate and adapt, the organization becomes inherently more resilient.
Continuous Learning and Development
Equip your workforce with the skills needed to address future challenges. Investing in employee development fosters a dynamic environment ready to tackle new technologies and methodologies.
Customer-Centric Innovation
Your customers are your greatest source of feedback and inspiration. Design your products and services around their evolving needs to stay relevant. Use data analytics to glean insights and mold your strategies.
Conclusion
Navigating industry disruptions requires confidence, foresight, and an innovative spirit. Organizations that understand and implement these principles can not only survive disruptive forces but thrive in them. By embedding adaptability into your DNA, like Netflix and LEGO, you can pivot strategically and emerge stronger in any competitive landscape.
Image credit: Pexels
In today’s world, the intersection of innovation and sustainability is no longer optional; it is a necessity. Businesses are increasingly expected to adopt eco-friendly practices not just for compliance, but as a core component of their operations. The concept of eco-innovation, which refers to the development of products and processes that contribute to sustainable development, plays a pivotal role in redefining how businesses operate while minimizing their environmental impact. This article delves into the essence of eco-innovation and examines two insightful case studies of companies that have carved out a niche in sustainable business practices.
Understanding Eco-Innovation
Eco-innovation is the amalgamation of new approaches, ideas, products, and services that lead to both improved economic performance and reduced environmental footprint. It involves redesigning traditional business operations, adopting circular economy principles, and leveraging technology to create sustainable solutions. The key to successful eco-innovation lies in embedding sustainability into the very DNA of business strategies, rather than treating it as an add-on.
The Benefits of Eco-Innovation
Economic Growth: Eco-innovative companies can tap into new markets and create job opportunities by developing green products and services.
Resource Efficiency: By optimizing the use of natural resources, businesses can reduce waste and lower operational costs.
Competitive Advantage: Companies that lead in sustainability often enjoy enhanced brand reputation and customer loyalty.
Risk Management: Eco-innovation helps in mitigating the risks associated with regulatory changes and resource scarcity.
Case Study 1: Patagonia—Taking the Lead with Responsible Retail
Patagonia, the outdoor apparel company, exemplifies how eco-innovation can be seamlessly integrated into business operations. With a strong commitment to environmental stewardship, Patagonia leads by example in the retail industry, demonstrating that profitability and sustainability can coexist.
Sustainable Practices
Worn Wear Program: Patagonia encourages customers to buy used apparel through its Worn Wear program, which promotes recycling and reduces clothing waste. This initiative not only reduces the need for new resources but also strengthens customer relationships by fostering a community focused on sustainability.
Material Innovations: The company invests heavily in researching and developing sustainable materials, such as organic cotton and recycled polyester. Patagonia was one of the first to adopt Yulex pure—a sustainable alternative to neoprene—for wetsuits.
Supply Chain Transparency: Patagonia maintains a high level of transparency in its supply chain, ensuring fair labor practices and environmental standards. It shares comprehensive details about the factories, materials, and environmental impacts involved in its products.
Impact
Patagonia’s initiatives have significantly reduced its carbon footprint while also inspiring the wider industry to follow suit. It consistently invests 1% of its sales in environmental causes, showcasing a deep commitment to social responsibility. This has resulted in a loyal customer base that values the company’s dedication to making a positive impact on the planet.
Case Study 2: IKEA—Building a Circular Business Model
IKEA’s journey toward sustainability involves rethinking the traditional linear business model in favor of a circular approach. As one of the world’s leading furniture retailers, IKEA has set ambitious goals to embrace eco-innovation and influence consumer behavior globally.
Circular Economy Initiatives
Circular Product Design: IKEA designs products with the end in mind, emphasizing durability, reparability, and recyclability. The company’s goal is for all products to be made from renewable or recycled materials by 2030.
Take-Back Programs: Through initiatives like the furniture take-back and resell program, IKEA encourages customers to return used furniture. This program aims to extend product life cycles and reduce waste.
Sustainable Supply Chain: IKEA has partnered with suppliers to implement sustainable forestry practices and improve raw material sourcing. By adopting responsible sourcing standards, the company ensures that its wood and cotton are sourced sustainably.
Impact
IKEA’s dedication to sustainability has led to significant waste reduction and resource efficiency. The circular strategies have not only decreased the environmental impact but also opened up new revenue streams. By 2025, IKEA aims to become a fully climate-positive company, setting a benchmark for the retail industry.
The Road Ahead
As we witness the rise of eco-innovation, it is crucial for businesses to embrace change and leverage innovation for sustainable development. The transformation requires an organization-wide commitment to rethink business operations and prioritize the planet alongside profits.
Steps to Foster Eco-Innovation:
Culture of Innovation: Cultivate an organizational culture that encourages experimentation, sustainability-focused thinking, and cross-functional collaboration.
Collaboration with Stakeholders: Partner with suppliers, customers, and communities to co-create sustainable solutions and drive system-wide changes.
Investment in R&D: Allocate resources to research and development of sustainable technologies and materials.
Commitment to Education: Educate employees, customers, and other stakeholders about the importance of sustainable practices to drive widespread adoption.
In conclusion, eco-innovation is not just about doing less harm; it’s about doing more good. Companies like Patagonia and IKEA demonstrate that sustainable business practices can lead to significant positive impacts for both the environment and the bottom line. As leaders and change-makers, it is our responsibility to champion eco-innovation and pave the way for a sustainable future.
Image credit: Unsplash
In today’s fast-paced, technology-driven world, understanding your users is crucial. Successful innovation requires insights into users’ needs, behaviors, and challenges. Effective user research uncovers these insights and informs design and business decisions. Here, I’ll share some essential techniques for conducting impactful user research, illustrated with real-world case studies.
Why User Research Matters
Before diving into techniques, let’s understand why user research is essential. It helps in:
Identifying user needs: Understand what users want and need from your products or services.
Enhancing user experience: Create intuitive and enjoyable experiences by aligning with user expectations.
Reducing risk: Avoid costly design flops by validating concepts before launch.
Key User Research Techniques
1. Interviews
Interviews are one of the most direct ways to gather rich, qualitative data. Conducting one-on-one discussions allows for in-depth exploration of user perspectives.
Case Study: HealthTech Startup
A healthtech startup utilized interviews to understand how patients manage chronic conditions. By conducting interviews with patients, caregivers, and healthcare providers, they discovered barriers in medication adherence. Insights gained informed the design of a reminder and support feature within their app, leading to increased user engagement and improved health outcomes.
2. Surveys and Questionnaires
Surveys provide quantitative data that can represent broader user trends. When well-designed, they offer valuable insights into user preferences and satisfaction levels.
3. Observational Studies
Observational studies involve watching users interact with products in natural settings. This technique uncovers real-world usage patterns and potential areas for improvement.
Case Study: Retail Experience
A major retailer used observational studies to analyze customer behavior in their stores. By observing shoppers, they identified pain points in store navigation and checkout processes. This led to strategic store layout changes and self-checkout technology implementations, enhancing convenience and boosting customer satisfaction.
4. Usability Testing
Usability testing evaluates how easily users can navigate a product. By having users perform tasks while observing their interactions, designers can identify and fix usability issues.
5. Focus Groups
Focus groups bring diverse users together to discuss their experiences. Facilitators can explore different perspectives in a dynamic group setting, uncovering collective insights.
Best Practices for Conducting User Research
Clearly define objectives: Know what you aim to learn to select appropriate research methods.
Recruit the right participants: Ensure your sample accurately represents your target audience.
Maintain ethical standards: Prioritize participant privacy and obtain informed consent.
Iterate and refine: Use findings to refine hypotheses and improve research processes.
Conclusion
Effective user research is pivotal in crafting solutions that resonate with users and drive business success. By applying these techniques thoughtfully, businesses and innovators can create products that truly meet user needs, leading to greater user satisfaction and loyalty.
Image credit: Unsplash
In the dynamic world of Agile project management, Scrum and Kanban are two popular methodologies. Both frameworks help teams work more efficiently, but which one is right for your team? Let’s dive into the characteristics of each and examine real-world case studies to help you make an informed decision.
Understanding Scrum
Scrum is a structured framework that promotes teamwork, accountability, and iterative progress toward a well-defined goal. It consists of time-boxed iterations called sprints, typically lasting two to four weeks. Key roles in Scrum include the Scrum Master, Product Owner, and Development Team. Scrum ceremonies such as sprint planning, daily stand-ups, sprint reviews, and retrospectives are integral to the process.
Understanding Kanban
Kanban, on the other hand, is a visual method for managing workflows with an emphasis on continuous delivery. Unlike Scrum, it doesn’t prescribe fixed roles or timeframes. Work items are visualized on a Kanban board, which helps teams manage the flow and limit work in progress (WIP) to enhance productivity and quality.
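The work-in-progress limit is mechanical enough to sketch in code. The class below is a minimal, hypothetical illustration (the column names and limits are assumptions for the example, not part of any formal Kanban definition): a column simply refuses new work once its limit is reached, which is the constraint that nudges a team to finish items before starting new ones.

```python
# A minimal sketch of WIP limits on a Kanban board. Column names and limits
# here are illustrative assumptions, not part of any formal definition.

class KanbanBoard:
    def __init__(self, wip_limits):
        self.wip_limits = wip_limits                  # None means unlimited
        self.columns = {name: [] for name in wip_limits}

    def add(self, column, item):
        limit = self.wip_limits[column]
        if limit is not None and len(self.columns[column]) >= limit:
            raise ValueError(f"WIP limit of {limit} reached for '{column}'")
        self.columns[column].append(item)

    def move(self, item, src, dst):
        self.add(dst, item)   # raises first if dst is full, so src stays intact
        self.columns[src].remove(item)

board = KanbanBoard({"todo": None, "doing": 2, "done": None})
for task in ("design", "build", "test"):
    board.add("todo", task)
board.move("design", "todo", "doing")
board.move("build", "todo", "doing")
# board.move("test", "todo", "doing") would now raise: "doing" is at its limit
```

In practice the limit is what surfaces bottlenecks: when a column is perpetually full, the flow problem is visible on the board rather than hidden in a backlog.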
Case Study 1: Tech Innovators, Inc.
Tech Innovators, Inc., a software development firm, initially adopted Scrum to tackle complex software projects. The structure allowed them to deliver high-quality software consistently. With well-defined sprint goals and regular feedback loops, the team improved their collaboration and accountability. However, as the team matured and gained confidence, they realized that some aspects of Scrum were constraining.
They transitioned to Kanban for its flexibility in handling unexpected work and continuous delivery. With Kanban, they could prioritize tasks dynamically and respond better to customer needs. This shift enabled Tech Innovators to reduce their lead time by 30% and significantly improve customer satisfaction.
Case Study 2: Creative Market Agency
Creative Market Agency, specializing in digital marketing campaigns, had complex, non-linear projects with frequent changes in scope. Initially, they used Kanban to manage their ever-changing project requirements. The visual nature of Kanban suited their needs as it provided transparency and adaptability.
However, as projects grew larger and involved more stakeholders, the lack of structure became a bottleneck. They switched to Scrum to impose a necessary order and discipline. The cadence of sprints, coupled with defined roles, helped the agency streamline their processes, improve predictability, and enhance stakeholder communication.
Key Considerations
Deciding between Scrum and Kanban depends on your team’s specific needs and project dynamics:
Structure vs. Flexibility: Scrum provides structure with fixed roles and sprints, while Kanban offers more flexibility.
Workload and Prioritization: If managing workload and prioritizing tasks dynamically is crucial, Kanban might be more suitable.
Project Complexity: For complex projects needing alignment and stakeholder engagement, Scrum’s structured approach is beneficial.
Team Maturity: Mature teams comfortable with autonomy might thrive in a Kanban environment, whereas less experienced teams may benefit from Scrum’s guidance.
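For illustration only, the four considerations above can be condensed into a toy decision aid. The inputs and the equal weighting are simplifications invented for this sketch, not a substitute for judgment:

```python
# A toy decision aid condensing the considerations above.
# Equal weighting of the four factors is an illustrative simplification.

def suggest_framework(needs_structure, dynamic_priorities,
                      complex_stakeholders, team_is_mature):
    scrum_score = (int(needs_structure) + int(complex_stakeholders)
                   + int(not team_is_mature))
    kanban_score = (int(dynamic_priorities) + int(team_is_mature)
                    + int(not needs_structure))
    return "Scrum" if scrum_score > kanban_score else "Kanban"

# A newer team on a complex, multi-stakeholder project:
suggest_framework(True, False, True, False)   # -> "Scrum"
# A mature team juggling constantly shifting priorities:
suggest_framework(False, True, False, True)   # -> "Kanban"
```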
Conclusion
Both Scrum and Kanban have their strengths. Your choice should align with your team’s goals, project requirements, and maturity. Whether you need Scrum’s structured sprints or Kanban’s continuous flow, what matters most is tailoring the framework to your unique context for optimal team performance and innovation.
Extra Extra: Futurology is not fortune telling. Futurists use a scientific approach to create their deliverables, but a methodology and tools like those in FutureHacking™ can empower anyone to engage in futurology themselves.
Image credit: Pexels
READER QUESTION: If humans don’t die out in a climate apocalypse or asteroid impact in the next 10,000 years, are we likely to evolve further into a more advanced species than what we are at the moment? Harry Bonas, 57, Nigeria
Humanity is the unlikely result of 4 billion years of evolution.
From self-replicating molecules in Archean seas, to eyeless fish in the Cambrian deep, to mammals scurrying from dinosaurs in the dark, and then, finally, improbably, ourselves – evolution shaped us.
Organisms reproduced imperfectly. Mistakes made when copying genes sometimes made them better fit to their environments, so those genes tended to get passed on. More reproduction followed, and more mistakes, the process repeating over billions of generations. Finally, Homo sapiens appeared. But we aren’t the end of that story. Evolution won’t stop with us, and we might even be evolving faster than ever.
The Conversation’s new series, co-published with BBC Future, seeks to answer our readers’ nagging questions about life, love, death and the universe. We work with professional researchers who have dedicated their lives to uncovering new perspectives on the questions that shape our lives.
It’s hard to predict the future. The world will probably change in ways we can’t imagine. But we can make educated guesses. Paradoxically, the best way to predict the future is probably looking back at the past, and assuming past trends will continue going forward. This suggests some surprising things about our future.
We will likely live longer and become taller, as well as more lightly built. We’ll probably be less aggressive and more agreeable, but have smaller brains. A bit like a golden retriever, we’ll be friendly and jolly, but maybe not that interesting. At least, that’s one possible future. But to understand why I think that’s likely, we need to look at biology.
The end of natural selection?
Some scientists have argued that civilisation’s rise ended natural selection. It’s true that selective pressures that dominated in the past – predators, famine, plague, warfare – have mostly disappeared.
Starvation and famine were largely ended by high-yield crops, fertilisers and family planning. Violence and war are less common than ever, despite modern militaries with nuclear weapons, or maybe because of them. The lions, wolves and sabre-toothed cats that hunted us in the dark are endangered or extinct. Plagues that killed millions – smallpox, Black Death, cholera – were tamed by vaccines, antibiotics, clean water.
But evolution didn’t stop; other things just drive it now. Evolution isn’t so much about survival of the fittest as reproduction of the fittest. Even if nature is less likely to murder us, we still need to find partners and raise children, so sexual selection now plays a bigger role in our evolution.
And if nature doesn’t control our evolution anymore, the unnatural environment we’ve created – culture, technology, cities – produces new selective pressures very unlike those we faced in the ice age. We’re poorly adapted to this modern world; it follows that we’ll have to adapt.
And that process has already started. As our diets changed to include grains and dairy, we evolved genes to help us digest starch and milk. When dense cities created conditions for disease to spread, mutations for disease resistance spread too. And for some reason, our brains have got smaller. Unnatural environments create unnatural selection.
To predict where this goes, we’ll look at our prehistory, studying trends over the past 6 million years of evolution. Some trends will continue, especially those that emerged in the past 10,000 years, after agriculture and civilisation were invented.
We’re also facing new selective pressures, such as reduced mortality. Studying the past doesn’t help here, but we can see how other species responded to similar pressures. Evolution in domestic animals may be especially relevant – arguably we’re becoming a kind of domesticated ape, but curiously, one domesticated by ourselves.
I’ll use this approach to make some predictions, if not always with high confidence. That is, I’ll speculate.
Lifespan
Humans will almost certainly evolve to live longer – much longer. Life cycles evolve in response to mortality rates, how likely predators and other threats are to kill you. When mortality rates are high, animals must reproduce young, or might not reproduce at all. There’s also no advantage to evolving mutations that prevent ageing or cancer – you won’t live long enough to use them.
When mortality rates are low, the opposite is true. It’s better to take your time reaching sexual maturity. It’s also useful to have adaptations that extend lifespan, and fertility, giving you more time to reproduce. That’s why animals with few predators – animals that live on islands or in the deep ocean, or are simply big – evolve longer lifespans. Greenland sharks, Galapagos tortoises and bowhead whales mature late, and can live for centuries.
Even before civilisation, people were unique among apes in having low mortality and long lives. Hunter-gatherers armed with spears and bows could defend against predators; food sharing prevented starvation. So we evolved delayed sexual maturity, and long lifespans – up to 70 years.
Still, child mortality was high – approaching 50% by age 15. Average life expectancy was just 35 years. Even after the rise of civilisation, child mortality stayed high until the 19th century, while life expectancy went down – to 30 years – due to plagues and famines.
Then, in the past two centuries, better nutrition, medicine and hygiene reduced youth mortality to under 1% in most developed nations. Life expectancy soared to 70 years worldwide, and 80 in developed countries. These increases are due to improved health, not evolution – but they set the stage for evolution to extend our lifespan.
Now, there’s little need to reproduce early. If anything, the years of training needed to be a doctor, CEO, or carpenter incentivise putting it off. And since our life expectancy has doubled, adaptations to prolong lifespan and child-bearing years are now advantageous. Given that more and more people live to 100 or even 110 years – the record being 122 years – there’s reason to think our genes could evolve until the average person routinely lives 100 years or even more.
Size, and strength
Animals often evolve larger size over time; it’s a trend seen in tyrannosaurs, whales, horses and primates – including hominins.
Why we got big is unclear. In part, mortality may drive size evolution; growth takes time, so longer lives mean more time to grow. But human females also prefer tall males. So both lower mortality and sexual preferences will likely cause humans to get taller. Today, the tallest people in the world are in Europe, led by the Netherlands. Here, men average 183cm (6ft); women 170cm (5ft 6in). Someday, most people might be that tall, or taller.
As we’ve grown taller, we’ve become more gracile. Over the past 2 million years, our skeletons became more lightly built as we relied less on brute force, and more on tools and weapons. As farming forced us to settle down, our lives became more sedentary, so our bone density decreased. As we spend more time behind desks, keyboards and steering wheels, these trends will likely continue.
Humans have also reduced our muscles compared to other apes, especially in our upper bodies. That will probably continue. Our ancestors had to slaughter antelopes and dig roots; later they tilled and reaped in the fields. Modern jobs increasingly require working with people, words and code – they take brains, not muscle. Even for manual laborers – farmers, fisherman, lumberjacks – machinery such as tractors, hydraulics and chainsaws now shoulder a lot of the work. As physical strength becomes less necessary, our muscles will keep shrinking.
Our jaws and teeth also got smaller. Early, plant-eating hominins had huge molars and mandibles for grinding fibrous vegetables. As we shifted to meat, then started cooking food, jaws and teeth shrank. Modern processed food – chicken nuggets, Big Macs, cookie dough ice cream – needs even less chewing, so jaws will keep shrinking, and we’ll likely lose our wisdom teeth.
Beauty
After people left Africa 100,000 years ago, humanity’s far-flung tribes became isolated by deserts, oceans, mountains, glaciers and sheer distance. In various parts of the world, different selective pressures – different climates, lifestyles and beauty standards – caused our appearance to evolve in different ways. Tribes evolved distinctive skin colour, eyes, hair and facial features.
With civilisation’s rise and new technologies, these populations were linked again. Wars of conquest, empire building, colonisation and trade – including trade of other humans – all shifted populations, which interbred. Today, road, rail and aircraft link us too. Bushmen would walk 40 miles to find a partner; we’ll go 4,000 miles. We’re increasingly one, worldwide population – freely mixing. That will create a world of hybrids – light-brown-skinned, dark-haired Afro-Euro-Australo-Americo-Asians, their skin colour and facial features tending toward a global average.
Sexual selection will further accelerate the evolution of our appearance. With most forms of natural selection no longer operating, mate choice will play a larger role. Humans might become more attractive, but more uniform in appearance. Globalised media may also create more uniform standards of beauty, pushing all humans towards a single ideal. Sex differences, however, could be exaggerated if the ideal is masculine-looking men and feminine-looking women.
Intelligence and personality
Last, our brains and minds, our most distinctively human feature, will evolve, perhaps dramatically. Over the past 6 million years, hominin brain size roughly tripled, suggesting selection for big brains driven by tool use, complex societies and language. It might seem inevitable that this trend will continue, but it probably won’t.
It could be that fat and protein were scarce once we shifted to farming, making it more costly to grow and maintain large brains. Brains are also energetically expensive – they burn around 20% of our daily calories. In agricultural societies with frequent famine, a big brain might be a liability.
Maybe hunter-gatherer life was demanding in ways farming isn’t. In civilisation, you don’t need to outwit lions and antelopes, or memorise every fruit tree and watering hole within 1,000 square miles. Making and using bows and spears also requires fine motor control, coordination, the ability to track animals and trajectories — maybe the parts of our brains used for those things got smaller when we stopped hunting.
Or maybe living in a large society of specialists demands less brainpower than living in a tribe of generalists. Stone-age people mastered many skills – hunting, tracking, foraging for plants, making herbal medicines and poisons, crafting tools, waging war, making music and magic. Modern humans perform fewer, more specialised roles as part of vast social networks, exploiting division of labour. In a civilisation, we specialise in a trade, then rely on others for everything else.
That being said, brain size isn’t everything: elephants and orcas have bigger brains than us, and Einstein’s brain was smaller than average. Neanderthals had brains comparable to ours, but more of the brain was devoted to sight and control of the body, suggesting less capacity for things like language and tool use. So how much the loss of brain mass affects overall intelligence is unclear. Maybe we lost certain abilities, while enhancing others that are more relevant to modern life. It’s possible that we’ve maintained processing power by having fewer, smaller neurons. Still, I worry about what that missing 10% of my grey matter did.
Curiously, domestic animals also evolved smaller brains. Sheep lost 24% of their brain mass after domestication; for cows, it’s 26%; dogs, 30%. This raises an unsettling possibility. Maybe a willingness to passively go with the flow (perhaps even to think less), like a domesticated animal, has been bred into us, as it was into them.
Our personalities must be evolving too. Hunter-gatherers’ lives required aggression. They hunted large mammals, killed over partners and warred with neighbouring tribes. We get meat from a store, and turn to police and courts to settle disputes. If war hasn’t disappeared, it now accounts for fewer deaths, relative to population, than at any time in history. Aggression, now a maladaptive trait, could be bred out.
Changing social patterns will also change personalities. Humans live in much larger groups than other apes; hunter-gatherers formed tribes of around 1,000. In today’s world, people live in vast cities of millions. In the past, our relationships were necessarily few, and often lifelong. Now we inhabit seas of people, moving often for work, and in the process forming thousands of relationships, many fleeting and, increasingly, virtual. This world will push us to become more outgoing, open and tolerant. Yet navigating such vast social networks may also require that we become more willing to adapt ourselves to them – to be more conformist.
Not everyone is psychologically well-adapted to this existence. Our instincts, desires and fears are largely those of stone-age ancestors, who found meaning in hunting and foraging for their families, warring with their neighbours and praying to ancestor-spirits in the dark. Modern society meets our material needs well, but is less able to meet the psychological needs of our primitive caveman brains.
Perhaps because of this, increasing numbers of people suffer from psychological issues such as loneliness, anxiety and depression. Many turn to alcohol and other substances to cope. Selection against vulnerability to these conditions might improve our mental health, and make us happier as a species. But that could come at a price. Many great geniuses had their demons; leaders like Abraham Lincoln and Winston Churchill fought with depression, as did scientists such as Isaac Newton and Charles Darwin, and artists like Herman Melville and Emily Dickinson. Some, like Virginia Woolf, Vincent Van Gogh and Kurt Cobain, took their own lives. Others – Billie Holiday, Jimi Hendrix and Jack Kerouac – were destroyed by substance abuse.
A disturbing thought is that troubled minds will be removed from the gene pool – but potentially at the cost of eliminating the sort of spark that created visionary leaders, great writers, artists and musicians. Future humans might be better adjusted – but less fun to party with and less likely to launch a scientific revolution — stable, happy and boring.
New species?
There were once nine human species; now it’s just us. But could new human species evolve? For that to happen, we’d need isolated populations subject to distinct selective pressures. Distance no longer isolates us, but reproductive isolation could theoretically be achieved by selective mating. If people were culturally segregated – marrying based on religion, class, caste, or even politics – distinct populations, even species, might evolve.
In The Time Machine, sci-fi novelist H.G. Wells saw a future where class created distinct species. The upper classes evolved into the beautiful but useless Eloi, and the working classes became the ugly, subterranean Morlocks – who revolted and enslaved the Eloi.
In the past, religion and lifestyle have sometimes produced genetically distinct groups, as seen, for example, in Jewish and Gypsy populations. Today, politics also divides us – could it divide us genetically? Liberals now move to be near other liberals, and conservatives to be near conservatives; many on the left won’t date Trump supporters and vice versa.
Could this create two species, with instinctively different views? Probably not. Still, to the extent culture divides us, it could drive evolution in different ways, in different people. If cultures become more diverse, this could maintain and increase human genetic diversity.
Strange New Possibilities
So far, I’ve mostly taken a historical perspective, looking back. But in some ways, the future might be radically unlike the past. Evolution itself has evolved.
One of the more extreme possibilities is directed evolution, where we actively control our species’ evolution. We already breed ourselves when we choose partners with appearances and personalities we like. For thousands of years, hunter-gatherers arranged marriages, seeking good hunters for their daughters. Even where children chose partners, men were generally expected to seek approval of the bride’s parents. Similar traditions survive elsewhere today. In other words, we breed our own children.
And going forward, we’ll do this with far more knowledge of what we’re doing, and more control over the genes of our progeny. We can already screen ourselves and embryos for genetic diseases. We could potentially choose embryos for desirable genes, as we do with crops. Direct editing of the DNA of a human embryo has been proven to be possible — but seems morally abhorrent, effectively turning children into subjects of medical experimentation. And yet, if such technologies were proven safe, I could imagine a future where you’d be a bad parent not to give your children the best genes possible.
Computers also provide an entirely new selective pressure. As more and more matches are made on smartphones, we are delegating decisions about what the next generation looks like to the computer algorithms that recommend our potential matches. Digital code now helps choose what genetic code is passed on to future generations, just as it shapes what you stream or buy online. This might sound like dark science fiction, but it’s already happening. Our genes are being curated by computer, just like our playlists. It’s hard to know where this leads, but I wonder if it’s entirely wise to turn over the future of our species to iPhones, the internet and the companies behind them.
Discussions of human evolution are usually backward looking, as if the greatest triumphs and challenges were in the distant past. But as technology and culture enter a period of accelerating change, our genes will too. Arguably, the most interesting parts of evolution aren’t life’s origins, dinosaurs, or Neanderthals, but what’s happening right now, our present – and our future.
3 Ways to Leverage Human-Centered Design at Your Organization
GUEST POST from Patricia Salamone
In a world where business challenges are increasingly complex, identifying your objective and framing your problem correctly is an integral way to demonstrate leadership and ensure teams don’t inadvertently solve the wrong problem. This is where a Human-Centered Design (HCD) mindset comes in, providing a structured way to define problems and keep teams focused on the right objective.
First, consider the challenge and objectives.
Not all business challenges need to be completely reimagined. Before jumping back to the drawing board, ask yourself, is there an obvious answer? Is there a clear approach to finding a solution? Can the team define what isn’t right? If you can’t say yes to these questions, then your business can benefit from the application of HCD principles. While teams understand they need to align and reframe challenges, having the proper tools in place is where many teams can fall short.
Move past traditional methods and be inspired to see challenges by taking a step back to reframe the problem:
Align the team. Often, internal teams will have differing viewpoints on a business problem. Rather than seeing this as a barrier, cross-functional alignment can open the door for creativity and new ideas.
Keep the focus on the issue. It’s often tempting to jump from “we have a problem” to “here’s what we should do.” Instead, keep digging deeper. For every apparent problem definition, ask “why does that matter?” multiple times, enabling yourself to get to the root cause and ensure you’re focusing on the “problem” rather than a “symptom of the problem.”
Use different words to reframe. Next time your team states a problem, challenge everyone to restate it using different words. Each iteration can reveal new facets of the problem, bringing clarity to the challenge at hand.
Zoom out. Rather than using a microscope to see details that aren’t immediately visible, approach the problem from a broader, more abstract perspective. Look at the customer’s “job to be done,” rather than what they may say their challenge is. This enables a more holistic and pragmatic view.
By making problem-reframing a habit, you are opening your organization up to greater flexibility and new pathways for innovation. This method also has the added benefit of clarifying gaps in knowledge and revealing where additional customer insight is needed.
Make empathy a daily habit.
A core principle of HCD is that empathy must permeate every aspect of the work, not just traditional research initiatives. Simply seeking customer feedback to develop strategies often leads to insular thinking. While a research project-driven mindset is very much the norm, empathy in an HCD context goes much further: it must inform everything the team does.
Similar to reframing challenges, it is imperative to listen and learn from customer stories and perspectives. Here are some ways to establish daily habits and build stronger relationships with your customers.
Advocate for the customer’s voice in team meetings. Always begin by asking questions like, “how would our customers feel about this?”
Socialize existing wisdom within an HCD team on a weekly basis. This could look like emails containing important insights, or bringing a small group of clients together for “speed dating” with stakeholders to gain a human understanding of your customers’ experiences, wishes, and pain points.
Obtain real-time feedback. Online research communities can enable on-demand responses to explore fuzzy, front-end ideas, rapidly iterate on new product concepts, or gather deep insights into how your customers use a product post-launch.
Apply an agile mindset.
One of the hallmarks of HCD is agility. But being agile isn’t just about being “fast,” it’s about delivering value as efficiently as possible. In practice, an agile mindset means thinking differently about how your work gets done and the ways in which a team can break through functional silos.
Not sure where to begin? Here are some tactics to get you started:
Break up the work of the team into two-week sprints. Define what can be done in those two weeks and create measurable goals to work toward them (even if those outcomes are only intermediate steps toward a bigger goal).
Commit to short and frequent stand-ups with your team to share commitments and highlight possible hurdles to accomplishing the goals of the current sprint.
Portion out deliverables. Rather than focusing on your next big presentation as your deliverable, think about how you can break your work down and deliver portions of that content to your stakeholders sooner in a more informal way.
While the above suggestions are purely jumping-off points, they are practical ways to move from understanding HCD as a concept to using it to rethink your own work and to catalyze higher-performing teams.
At the end of the day, embracing the principles of HCD is a long-term journey. These proven steps will help you lead and inspire teams to begin developing new habits that quickly demonstrate the strong potential HCD has in creating a new way to see innovation through the eyes of your customers.
Image credit: Pixabay