
The Digital Revolution Has Been A Giant Disappointment

GUEST POST from Greg Satell

One of the most often repeated episodes in the history of technology is when Steve Jobs was recruiting John Sculley from his lofty position as CEO at Pepsi to come to Apple. “Do you want to sell sugar water for the rest of your life,” Jobs asked, “or do you want to come with me and change the world?”

It’s a strange conceit of digital denizens that their businesses are something nobler than other industries. While it is true that technology can do some wonderful things, if the aim of Silicon Valley entrepreneurs was truly to change the world, why wouldn’t they apply their formidable talents to something like curing cancer or feeding the hungry?

The reality, as economist Robert Gordon explains in The Rise and Fall of American Growth, is that the measurable impact has been relatively meager. According to the IMF, except for a relatively short burst in growth between 1996 and 2004, productivity has been depressed since the 1970s. We need to rethink how technology impacts our world.

The Old Productivity Paradox

In the 1970s and 80s, business investment in computer technology was increasing by more than 20% per year. Strangely though, productivity growth had decreased during the same period. Economists found this turn of events so strange that they called it the productivity paradox to underline their confusion.

The productivity paradox dumbfounded economists because it violated a basic principle of how a free market economy is supposed to work. If profit-seeking businesses continue to make substantial investments, you expect to see a return. Yet with IT investment in the 70s and 80s, firms continued to increase their investment with negligible measurable benefit.

A paper by researchers at the University of Sheffield sheds some light on what happened. First, productivity measures were largely developed for an industrial economy, not an information economy. Second, the value of those investments, while substantial, was a small portion of total capital investment. Third, businesses weren’t necessarily investing to improve productivity, but to survive in a more demanding marketplace.

Yet by the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance. The mystery of the productivity paradox, it seemed, had been solved. We just needed to wait for the technology to hit critical mass.

The New Productivity Paradox

By 2004, the law of increasing returns was there for everyone to see. Google already dominated search, Amazon ruled e-commerce, Apple would go on to dominate mobile computing and Facebook would rule social media. Yet as the dominance of the tech giants grew, productivity would once again fall to depressed levels.

Yet today, more than a decade later, we’re in the midst of a second productivity paradox, just as mysterious as the first one. New technologies like mobile computing and artificial intelligence are there for everyone to see, but they have done little, if anything, to boost productivity.

At the same time, the power of digital technology is diminishing. Moore’s law, the decades-old paradigm of continuously doubling computer processing power, is slowing down and will soon end completely. Without advancement in the underlying technology, it is hard to see how digital technology will ever power another productivity boom.
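To see why the end of that doubling matters so much, it helps to make the compounding explicit. The sketch below assumes an idealized, constant two-year doubling period purely for illustration (the real cadence varied over the decades), and the function name is my own:

```python
# Minimal sketch of Moore's-law compounding, assuming (for illustration)
# a constant two-year doubling period. The historical cadence varied.
def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Multiplicative gain in transistor count (or processing power)
    after `years` of steady doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(moores_law_factor(10))  # 32.0 -> a decade of doubling gives ~32x
print(moores_law_factor(50))  # 33554432.0 -> fifty years gives ~33.5-million-fold
```

Exponential gains like these are what earlier productivity optimism was built on; once the doubling stops, each new generation of chips delivers only incremental improvement.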

Considering the optimistic predictions of digital entrepreneurs like Steve Jobs, this is incredibly disappointing. Compare the meager eight years of elevated productivity that digital technology produced with the 50-year boom in productivity created in the wake of electricity and internal combustion and it’s clear that digital technology simply doesn’t measure up.

The Baumol Effect, The Clothesline Paradox and Other Headwinds

Much like the first productivity paradox, it’s hard to determine exactly why the technological advancement over the last 15 years has amounted to so little. Most likely, it is not one factor in particular, but the confluence of a number of them. Increasing productivity growth in an advanced economy is no simple thing.

One possibility for the lack of progress is the Baumol effect, the principle that some sectors of the economy are resistant to productivity growth. For example, despite the incredible efficiency that Jeff Bezos has produced at Amazon, his barber still only cuts one head of hair at a time. In a similar way, sectors like healthcare and education, which require a large amount of labor inputs that resist automation, will act as a drag on productivity growth.

Another factor is the Clothesline paradox, which gets its name from the fact that when you dry your clothes in a machine, it figures into GDP data, but when you hang them on a clothesline, no measurable output is produced. In much the same way, when you use a smartphone to take pictures or to give you directions, there is considerable benefit that doesn’t result in any financial transactions. In fact, because you use less gas and don’t develop film, GDP decreases somewhat.

Additionally, the economist Robert Gordon, mentioned above, notes six headwinds to economic growth, including aging populations, limits to increasing education, income inequality, outsourcing, environmental costs due to climate change and rising household and government debt. It’s hard to see how digital technology will make a dent in any of these problems.

Technology is Never Enough to Change the World

Perhaps the biggest reason that the digital revolution has been such a big disappointment is because we expected the technology to largely do the work for us. While there is no doubt that computers are powerful tools, we still need to put them to good use and we have clearly missed opportunities in that regard.

Think about what life was like in 1900, when the typical American family didn’t have access to running water, electricity or gas-powered machines such as tractors or automobiles. Even something as simple as cooking a meal took hours of backbreaking labor. Yet investments in infrastructure and education combined with technology to produce prosperity.

Today, however, there is no comparable effort to invest in education and healthcare for those who cannot afford it, to limit the effects of climate change, to reduce debt or to do anything of significance to mitigate the headwinds we face. We are awash in nifty gadgets, but in many ways we are no better off than we were 30 years ago.

None of this was inevitable; rather, it is the result of choices that we have made. We can, if we really want to, make different choices in the days and years ahead. What I hope we have learned from our digital disappointments is that technology itself is never enough. We are truly the masters of our fate, for better or worse.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

Humans, Not Technology, Drive Business Success

GUEST POST from Greg Satell

Silicon Valley is often known as a cut-throat, technocratic place where the efficiency of algorithms often defines success. Competition is ferocious and the pace of disruption and change can be dizzying. It’s not the type of environment where soft skills are valued particularly highly, or even at all.

So, it’s somewhat ironic that Bill Campbell became a Silicon Valley legend by giving hugs and professing love to those he worked with. As coach to executives ranging from Steve Jobs to the entire Google executive team, Campbell preached and practiced a very personal style of business.

Yet while I was reading Trillion Dollar Coach in which former Google executives explain Campbell’s leadership principles, it became clear why he had such an impact. Even in Silicon Valley, technology will only take you so far. The success of a business ultimately depends on the success of the people in it. To compete over the long haul, that’s where you need to focus.

The Efficiency Paradox

In 1911, Frederick Winslow Taylor published The Principles of Scientific Management, based on his experience as a manager in a steel factory. It took aim at traditional management methods and suggested a more disciplined approach. Rather than have workers pursue tasks in their own manner, he sought to find “the one best way” and train accordingly.

Taylor wrote, “It is only through enforced standardization of methods, enforced adoption of the best implements and working conditions, and enforced cooperation that this faster work can be assured. And the duty of enforcing the adoption of standards and enforcing this cooperation rests with management alone.”

Before long, Taylor’s ideas became gospel, spawning offshoots such as scientific marketing, financial engineering and the Six Sigma movement. It was no longer enough to simply work hard, you had to measure, analyze and optimize everything. Over the years these ideas have become so central to business thinking that they are rarely questioned.

Yet management guru Henry Mintzberg has pointed out how a “by-the-numbers” depersonalized approach can often backfire. “Managing without soul has become an epidemic in society. Many managers these days seem to specialize in killing cultures, at the expense of human engagement.”

The evidence would seem to back him up. One study found that of 58 large companies that have announced Six Sigma programs, 91 percent trailed the S&P 500 in stock performance. That, in essence, is the efficiency paradox. When you manage only what you can measure, you end up ignoring key factors to success.

How Generosity Drives Innovation

While researching my book, Mapping Innovation, I interviewed dozens of top innovators. Some were world class scientists and engineers. Others were high level executives at large corporations. Still others were highly successful entrepreneurs. Overall, it was a pretty intimidating group.

So, I was surprised to find that, with few exceptions, they were some of the kindest and most generous people I have ever met. The behavior was so consistent that I felt that it couldn’t be an accident. So I began to research the matter further and found that when it comes to innovation, generosity really is a competitive advantage.

For example, one study of star engineers at Bell Labs found that the best performers were not the ones with the best academic credentials, but those with the best professional networks. A similar study of the design firm IDEO found that great innovators essentially act as brokers able to access a diverse array of useful sources.

A third study helps explain why knowledge brokering is so important. Analyzing 17.9 million papers, the researchers found that the most highly cited work tended to be largely rooted within a traditional field, but with just a smidgen of insight taken from some unconventional place. Breakthrough creativity occurs at the nexus of conventionality and novelty.

The truth is that the more you share with others, the more they’ll be willing to share with you and that makes it much more likely you’ll come across that random piece of information or insight that will allow you to crack a really tough problem.

People As Profit Centers

For many, the idea that innovation is a human-centered activity is intuitively obvious. So it makes sense that the high-tech companies that Bill Campbell was involved in would work hard to create environments to attract the best and the brightest people. However, most businesses have much lower margins and have to keep a close eye on the bottom line.

Yet here too there is significant evidence that a human-focused approach to management can yield better results. In The Good Jobs Strategy MIT’s Zeynep Ton found that investing more in well-trained employees can actually lower costs and drive sales. A dedicated and skilled workforce results in less turnover, better customer service and greater efficiency.

For example, when the recession hit in 2008, Mercadona, Spain’s leading discount retailer, needed to cut costs. But rather than cutting wages or reducing staff, it asked its employees to contribute ideas. The result was that it managed to reduce prices by 10% and increased its market share from 15% in 2008 to 20% in 2012.

Its competitors maintained the traditional mindset. They cut wages and employee hours, which saved them some money, but customers found poorly maintained stores with few people to help them, which damaged their brands long-term. The cost savings Mercadona’s employees identified, on the other hand, in many cases improved service and productivity, and these gains persisted long after the crisis was over.

Management Beyond Metrics

The truth is that it’s easy to talk about putting people first, but much harder to do it in practice. Research suggests that once a group goes much beyond 200 people social relationships break down, so once a business gets beyond that point, it becomes natural to depersonalize management and focus on metrics.

Yet the best managers understand that it’s the people that drive the numbers. As legendary IBM CEO Lou Gerstner once put it, “Culture isn’t just one aspect of the game… It is the game. What does the culture reward and punish – individual achievement or team play, risk taking or consensus building?”

In other words, culture is about values. The innovators I interviewed for my book valued solving problems, so were enthusiastic about sharing their knowledge and expertise with others, who happily reciprocated. Mercadona valued its people, so when it asked them to find ways to save money during the financial crisis, they did so enthusiastically.

That’s why today, three years after his death, Bill Campbell remains a revered figure in Silicon Valley: he valued people so highly and helped them learn to value each other. Management is not an algorithm. It is, in the final analysis, an intensely human activity, and to do it well, you need to put people first.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels


Why Humans Fail to Plan for the Future

GUEST POST from Greg Satell

I was recently reading Michio Kaku’s wonderful book, The Future of Humanity, about colonizing space and was amazed at how detailed some of the plans are. Plans for a Mars colony, for example, are already fairly advanced. In other cases, scientists are actively thinking about technologies that won’t be viable for a century or more.

Yet while we seem to be so good at planning for life in outer space, we are much less capable of thinking responsibly about the future here on earth, especially in the United States. Our federal government deficit recently rose to 4.6% of GDP, which is obviously unsustainable in an economy that’s growing at a meager 2.3%.
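To make the sustainability arithmetic behind that comparison concrete, here is a minimal sketch of standard debt-dynamics math. It assumes, purely for illustration, that the deficit stays a fixed share of GDP and the economy grows at a constant nominal rate; the function names are my own:

```python
# A minimal sketch of debt-to-GDP dynamics, assuming (for illustration)
# a constant deficit share d of GDP and constant nominal growth g.
# Under those assumptions the debt/GDP ratio converges to d / g.
def steady_state_debt_ratio(deficit_share: float, growth_rate: float) -> float:
    """Long-run debt/GDP ratio implied by a constant deficit share."""
    return deficit_share / growth_rate

def simulate_debt_ratio(deficit_share: float, growth_rate: float,
                        years: int, initial_debt: float = 0.0) -> float:
    """Iterate debt and GDP forward to show convergence toward d / g."""
    gdp, debt = 1.0, initial_debt
    for _ in range(years):
        debt += gdp * deficit_share  # borrow this year's deficit
        gdp *= 1 + growth_rate       # economy grows at the nominal rate
    return debt / gdp

# The article's figures: a 4.6%-of-GDP deficit against 2.3% growth
# implies debt compounding toward 200% of GDP.
print(steady_state_debt_ratio(0.046, 0.023))  # 2.0
```

Real deficits and growth rates vary year to year, of course; the sketch only makes explicit how a deficit that outpaces growth compounds into an ever-heavier debt load.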

That’s just one data point, but everywhere you look we seem to be unable to plan for the future. Consumer debt in the US recently hit levels exceeding those before the crash in 2008. Our infrastructure is falling apart. Air quality is getting worse. The list goes on. We need to start thinking more seriously about the future, but we don’t seem to be able to. Why is that?

It’s Biology, Stupid

The simplest and most obvious explanation for why we fail to plan for the future is basic human biology. We have pleasure centers in our brains that release a hormone called dopamine, which gives us a feeling of well-being. So, it shouldn’t be surprising that we seek to maximize our dopamine fix in the present and neglect the future.

Yuval Noah Harari makes this argument in his book Homo Deus, where he asserts that “organisms are algorithms.” Much like a vending machine is programmed to respond to buttons, Harari argues, humans and other animals are programmed by genetics and evolution to respond to “sensations, emotions and thoughts.” When those particular buttons are pushed, we respond much like a vending machine does.

He gives various data points for this point of view. For example, he describes psychological experiments in which, by monitoring brainwaves, researchers are able to predict actions, such as whether a person will flip a switch, even before he or she is aware of it. He also points out that certain chemicals, such as Ritalin and Prozac, can modify behavior.

Yet this somehow doesn’t feel persuasive. Adults in even primitive societies are expected to overcome basic urges. Citizens of Ancient Rome were taxed to pay for roads that led to distant lands and took decades to build. Medieval communities built churches that stood for centuries. Why would we somehow lose our ability to think long-term in just the past generation or so?

The Profit Motive

Another explanation of why we neglect the future is the profit motive. Pressed by demanding shareholders to deliver quarterly profits, corporate executives focus on showing short-term profits instead of investing for the future. The result is increased returns to fund managers, but a hollowing out of corporate competitiveness.

A recent article in Harvard Business Review would appear to bear this out. When a team of researchers looked into the health of the innovation ecosystem in the US, they found that corporate America has largely checked out. They also observed that storied corporate research labs, such as Bell Labs and Xerox PARC, have diminished over time.

Yet take a closer look and the argument doesn’t hold up. In fact, the data from the National Science Foundation shows that corporate research has increased from roughly 40% of total investment in the 1950s and 60s to more than 60% today. At the same time, while some firms have closed research facilities, others, such as Microsoft, IBM and Google, have either opened new ones or greatly expanded previous efforts. Overall R&D spending has risen over time.

Take a look at how Google innovates and you’ll see the source of some of the dissonance. Fifty years ago, the only real option for corporate investment in research was a corporate lab. Today, however, there are many other avenues, including partnerships with academic researchers, internal venture capital operations, incubators, accelerators and more.

The Free Rider Problem

A third reason we may fail to invest in the future is the free rider problem. In this view, the problem is not that we don’t plan for the future, but that we don’t want to spend money on others who are undeserving. For example, why should we pay higher taxes to educate kids from outside our communities? Or fund infrastructure projects that are wasteful and corrupt?

This type of welfare queen argument can be quite powerful. Although actual welfare fraud has been shown to be incredibly rare, there are many who believe that the public sector is inherently wasteful and money would be more productively invested elsewhere. This belief doesn’t only apply to low-income people, but also to “elites” such as scientists.

Essentially, this is a form of kin selection. We are more willing to invest in the future of people we see as similar to ourselves, because that is a form of self-survival. However, when we are asked to invest in the future of those we see as different from ourselves, whether that difference is one of race, social class or even profession, we balk.

Yet here again, a closer look shows the facts don’t quite fit the narrative. Charitable giving, for example, has risen almost every year since 1977. So it’s strange that we’re increasingly generous in giving to those in need, but stingy when it comes to things like infrastructure and education.

A New Age of Superstition

What’s especially strange about our inability to plan for the future is that it’s relatively new. In fact, after World War II, we invested heavily in the future. We created new avenues for scientific investment at agencies like the National Science Foundation and the National Institutes of Health, rebuilt Europe with the Marshall Plan and educated an entire generation with the GI Bill.

It wasn’t until the 1980s that our willingness to plan for and invest in the future began to wane, mostly due to two ideas that warped decision making. The first, called the Laffer Curve, argued that by lowering taxes we can increase revenue and that tax cuts, essentially, pay for themselves. The second, shareholder value, argued that whatever was best for shareholders is also best for society.

Both ideas have been partially or thoroughly debunked. Over the past 40 years, lower tax rates have consistently led to lower revenues and higher deficits. The Business Roundtable, an influential group of almost 200 CEOs of America’s largest companies, recently denounced the concept of shareholder value. Yet strangely, many still use both to support anti-future decisions.

We seem to be living in a new era of superstition, where mere belief is enough to inspire action. So projects which easily capture the imagination, such as colonizing Mars, are able to garner fairly widespread support, while basic things like infrastructure, debt reduction and the environment are neglected.

The problem, in other words, seems to be mostly in the realm of a collective narrative. We are more than capable of enduring privation today to benefit tomorrow, just as businesses routinely accept lower profits today to invest in tomorrow. We are even capable of giving altruistically to others in need. All we need is a story to believe in.

There is, however, the possibility that it is not the future we really have a problem with, but each other, and that our lack of a common story arises from a lack of shared values, which leads to major differences in how we view the same facts. In any case, the future suffers.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


We Were Wrong About What Drove the 21st Century

GUEST POST from Greg Satell

Every era contains a prism of multitudes. World War I gave way to the “Roaring 20s” and a 50-year boom in productivity. The Treaty of Versailles sowed the seeds of the Second World War, which in turn gave way to the peace and prosperity of the post-war era. Vietnam and the rise of the Baby Boomers unlocked a cultural revolution that created new freedoms for women and people of color.

Our current era began in the 80s, with the rise of Ronald Reagan and a new confidence in the power of markets. Genuine achievements of the Chicago School of economics, led by Milton Friedman, along with the weakness of the Soviet system, led to an enthusiasm for market fundamentalism that dominated policy circles.

So it shouldn’t be that surprising that veteran Republican strategist Stuart Stevens wrote a book denouncing that orthodoxy as a lie. He has a point. But politicians can only convince us of things we already want to believe. The truth is that we were fundamentally mistaken in our understanding of how the world works. It’s time that we own up to it.

Mistake #1: The End Of The Cold War Would Strengthen Capitalism

When the Berlin Wall came down in 1989, the West was triumphant. Communism was shown to be a corrupt system bereft of any real legitimacy. A new ideology took hold, often called the Washington Consensus, that preached fiscal discipline, free trade, privatization and deregulation. The world was going to be remade in capitalism’s image.

Yet for anybody who was paying attention, communism had been shown to be bankrupt and illegitimate since the 1930s when Stalin’s failed collectivization effort and industrial plan led him to starve his own people. Economists have estimated that, by the 1970s, Soviet productivity growth had gone negative, meaning more investment actually brought less output. The system’s collapse was just a matter of time.

At the same time, there were early signs that there were serious problems with the Washington Consensus. Many complained that bureaucrats at the World Bank and the IMF were mandating policies for developing nations that citizens in their own countries would not accept. So called “austerity programs” led to human costs that were both significant and real. In a sense, the error of the Soviets was being repeated—ideology was put before people.

Today, instead of a capitalist utopia and an era of peace and prosperity, we got a global rise in authoritarian populism, stagnant wages, reduced productivity growth and weaker competitive markets. In the United States in particular, by almost every metric imaginable, capitalism has been weakened.

Mistake #2: Digital Technology Would Make Everything Better

In 1989, the same year that the Berlin Wall fell, Tim Berners-Lee proposed the World Wide Web and ushered in a new technological era of networked computing that we now know as the “digital revolution.” Much like the ideology of market fundamentalism that took hold around the same time, technology was seen as the determinant of a new, brighter age.

By the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance.

Yet by 2004, productivity growth had slowed again to its earlier lethargic pace. Today, despite very real advances in processing speed, broadband penetration, artificial intelligence and other things, we seem to be in the midst of a second productivity paradox in which we see digital technology everywhere except in the economic statistics.

Digital technology was supposed to empower individuals and reduce the dominance of institutions, but just the opposite has happened. Income inequality in advanced economies markedly increased. In America wages have stagnated and social mobility has declined. At the same time, social media has been destroying our mental health.

When Silicon Valley told us they intended to “change the world,” is this what they meant?

Mistake #3: Medical Breakthroughs Would Automatically Make Us Healthier

Much like the fall of the Berlin Wall and the rise of the Internet, the completion of the Human Genome Project in 2003 promised great things. No longer would we be at the mercy of terrible diseases such as cancer and Alzheimer’s; instead, we would design genetic therapies that would rewire our bodies to fend off disease by themselves.

The advances since then have been breathtaking. The Cancer Genome Atlas, which began in 2005, helped enable doctors to develop therapies targeted at specific mutations, rather than where in the body a tumor happened to be found. Later, CRISPR revolutionized synthetic biology, bringing down costs exponentially.

The rapid development of Covid-19 vaccines has shown how effective these new technologies are. Scientists essentially engineered vaccines containing just enough of the viral genome to produce a few proteins, enough to provoke an immune response but not nearly enough to make us sick. Twenty years ago, this would have been considered science fiction. Today, it’s a reality.

Yet we are not healthier. Worldwide obesity has tripled since 1975 and has become an epidemic in the United States. Anxiety and depression have as well. American healthcare costs continue to rise even as life expectancy declines. Despite the incredible advance in our medical capability, we seem to be less healthy and more miserable.

Worse Than A Crime, It Was A Blunder

Whenever I bring up these points among technology people, they vigorously push back. Surely, they say, you can see the positive effects all around you. Can you imagine what the global pandemic would be like without digital technologies? Without videoconferencing? Hasn’t there been a significant global decline in extreme poverty and violence?

Yes. There have absolutely been real achievements. As someone who spent roughly half my adult life in Eastern Bloc countries, I can attest to how horrible the Soviet system was. Digital technology has certainly made our lives more convenient and, as noted above, medical advances have been very real and very significant.

However, technology is a process that involves both revealing and building. Yes, we revealed the power of market forces and the bankruptcy of the Soviet system, but failed to build a more prosperous and healthy society. In much the same way, we revealed the power of the microchip, miracle cures and many other things, but failed to put them to use in such a way that would make us measurably better off.

When faced with a failure this colossal, people often look for a villain. They want to blame the greed of corporations, the arrogance of Silicon Valley entrepreneurs or the incompetence of government bureaucrats. The truth is, as the old saying goes, it was worse than a crime, it was a blunder. We simply believed that market forces and technological advancement would work their magic and all would be well.

By now we should know better. We need to hold ourselves accountable, make better choices and seek out greater truths.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Confessions of a Business Artist

I am an artist.

There, I’ve said it. This statement may confuse some people who know me, and come as a shock to others.

Braden, what do you mean you’re an artist? You’ve got an MBA from London Business School, you’ve led change programs for global organizations, helped companies build their innovation capabilities and cultures, are an expert in digital transformation, and you can’t even draw a straight line without a ruler. What makes you think you’re an artist?

Well, okay, that may all be true, but there are lots of different kinds of artists. I may not be a painter, a sculptor, a musician, an illustrator, or even a singer, but I am an artist, a business artist.

What is a business artist you ask?

A business artist sees through complexity to what matters most. A business artist loves working with PowerPoint and telling stories, often through keynote speeches and training facilitation, or through writing. A business artist loves to share, often doing so for the greater good, sometimes to their own financial detriment, in an effort to accelerate knowledge, learning, and the creation of new capabilities in others. A business artist is a builder, often creating new businesses, new web sites, and new thinking. A business artist is comfortable stepping into a number of different business contexts and bringing a different energy and a different approach to creating solutions to complex requirements. Part of the reason a business artist can do this is that they value their intuitive skills just as much as their intellectual skills, and may also consciously invest in getting in touch with higher levels of intuitive capability, enabling them to excel in roles that involve a great deal of what might be termed ‘organizational psychology’.

A business artist often appears to be a jack of all trades, sometimes bordering on what was portrayed in the television show The Pretender, and can be an incredibly powerful addition to any team tackling a big challenge. Yet a business artist's ability to contribute to the success of an organization is often discounted by the traditional recruiting processes of most human resource organizations, whose emphasis on skill matching and experience skews hiring in favor of someone with a lot of experience at being mediocre at a certain skillset over someone with limited experience but greater capability. A business artist also often appears to be ahead of the curve, frequently to their own detriment, arriving too early to the party by grasping where an organization needs to go before the rest of the organization is willing to accept the new reality. This is a real problem for business artists.

Now is the time for a change. Given our increasing access to knowledge, and the shorter time now required to acquire the knowledge and skills needed to perform a task, people who are comfortable with complexity and ambiguity, and capable of learning quickly, are incredibly valuable to organizations as continual change becomes the new normal. Because the accelerating pace of innovation and change is making experience a liability rather than a long-lived asset, we need business artists now more than ever.

So how do we create more business artists?

Unfortunately our public schools are far more focused on indoctrination than education, on repetition over discovery. Our educational system specializes in creating trivia masters and kids who hate school, instead of building a new generation of creative problem solvers who love to learn and explore new approaches rather than defending status conferred by mastery of current truths (which may be tomorrow's fallacies). We are far too obsessed with STEM (Science, Technology, Engineering, and Math) when we should be focused on STEAM (Science, Technology, Engineering, Art, and Music). Music is creative math, after all. My daughter's school has a limited music program and NO ART. How is this possible?

To create more business artists we need to shift our focus towards art, creative problem solving and demonstrated learning, and away from memorization, metrics, and repetition. Can we do this?

Can we create an environment where power comes not from mastery of the status quo, but from improving upon it?

Organizations that want to survive will do so. Countries that want to stay at the top of the economic pyramid will do so. So what kind of country do you want to live in? What kind of company do you want to be part of?

Do you have the courage to join me as a business artist or to help create a new generation of them?

Image credit: blogs.nd.edu

This article originally appeared on LinkedIn


Accelerate your change and transformation success


Whither Innovation in Indiana?

Now that I've got your attention, let's talk about homosexuality and whether it has any impact on innovation. There are probably no two more polarizing topics in the United States than homosexuality and abortion. But the truth is that if both sides of the political and religious spectrum focused on the golden rule, there would be less corruption, we'd all be a lot happier, we'd probably have more innovation, and our politics would be more productive.

Today we have another great case study for how short people’s attention spans have gotten, how the government can help or hinder innovation, how little investigative journalism still remains in the United States, and how easily people are swayed by a soundbite that runs contrary to (or in support of) their own personal religious or political beliefs.

But this article isn't going to be a diatribe in support of or opposition to Indiana's Religious Freedom Restoration Act (RFRA) legislation (referred to by the media as an anti-gay law), because I freely admit I don't fully understand all of the implications of the similar federal law and whether federal protections for gays apply to the state law.

Instead I'd like to focus briefly on what this controversy brings to mind for me with regard to the efforts of hard-working folks attempting to stimulate innovation in Indiana (and elsewhere).

Point #1: People Must Feel Safe to Innovate

If we take Maslow's Hierarchy of Needs as gospel (okay, maybe that's a dangerous word choice), then safety is one of people's most important needs, and in order to innovate people must feel safe. True innovation usually requires taking risks and doing things in a new way, and if people feel that trying something new, or even just being different, carries a high price, then they won't step out of their comfort zone and push the boundaries of conventional wisdom. So if we are truly trying to do everything we can to inspire innovation in our region, shouldn't we also try to do everything we can to make it feel like a place where it is safe to be different, and where that difference is potentially even celebrated?

Point #2: Diversity is Important (to a point)

We all look at the same situation through different eyes and a different history of experiences, values and beliefs. This diversity can help create different idea fragments that can be connected together to create revolutionary new ideas with the potential to become innovations. But at the same time, having some shared experiences helps to make it easier to communicate and to have a higher level of trust (assuming those experiences were good ones). So if we are truly trying to do everything we can to inspire innovation in our region, shouldn’t we also do everything we can to make different groups of people look to our region as a good place to move to so we have a diverse talent pool?

Conclusion: If Culture Trumps Strategy, Environment Trumps Startups

The world is changing. It used to be that companies started and grew in the community where they were founded, hiring increasing numbers of people from the surrounding areas and attracting others from elsewhere. Now, an increasing number of companies (especially digital ones) are moving to more distributed models where they create satellite offices where the talent is rather than trying to attract all of the talent to a single location.

Economically, this means it is becoming less important that the next Facebook starts in your town than that the next Facebook wants to have an office in your town. For cities, counties, states, and countries, the greater economic impact is likely to come not from trying to encourage lots of startups, but from trying to create an environment that young, talented people choose to live in.

And when you create a place that is attractive for smart, creative people to move to, you're likely to end up not just with more growing digital companies seeking a presence, but also with a larger number of startups than if you had started with the specific goal of encouraging startups.

Does your region focus on creating startups as the primary goal or on making itself an attractive place for a young, diverse and talented population to live?

Does this uproar help Indiana establish itself as an attractive place to be, or work against that perception?

I’ll let you decide!

P.S. If you’re curious, here are The Metro Areas With the Largest, and Smallest, Gay Populations (for what it’s worth, Indianapolis isn’t on either list)



Why the Maker Movement Matters

The Maker movement is steadily gaining steam, and some cities, seeing it as an opportunity to inspire artists and entrepreneurs, are looking to help it grow and thrive. One such city is Edmonton, in the Canadian province of Alberta, whose public library system provides maker spaces staffed with library employees and equipped with 3D printers, computers with Apple's GarageBand and Adobe's Creative Suite, and more.

Here is a video of Peter Schoenberg of the Edmonton Public Library introducing the EPL MakerSpace:



If you’re not familiar with the Maker movement, then check out these pages:

Maker Faire
Maker Culture – Wikipedia

Or check out these quotes from Time magazine’s article titled “Why the Maker Movement is Important to America’s Future”:

“According to Atmel, a major backer of the Maker movement, there are approximately 135 million U.S. adults who are makers, and the overall market for 3D printing products and various maker services hit $2.2 billion in 2012. That number is expected to reach $6 billion by 2017 and $8.41 billion by 2020. According to USA Today, makers fuel business with some $29 billion poured into the world economy each year.”

“As someone who has seen firsthand what can happen if the right tools, inspiration and opportunity are available to people, I see the Maker Movement and these types of Maker Faires as being important for fostering innovation. The result is that more and more people create products instead of only consuming them, and it’s my view that moving people from being only consumers to creators is critical to America’s future. At the very least, some of these folks will discover life long hobbies, but many of them could eventually use their tools and creativity to start businesses. And it would not surprise me if the next major inventor or tech leader was a product of the Maker Movement.”

So what do you think?

How much of a contribution to the future of innovation will the Maker Movement make?

How important is supporting the maker movement to the future of an economy?

Is this trend sustainable?



Global Innovation Index 2011 – Country Rankings

This article will be the first in a series digging into the recently released Global Innovation Index 2011. In this installment we will give you just the country rankings. In the coming days I will dig through the Global Innovation Index 2011 report, see what interesting insights I can uncover about innovation in different regions, and report back here on Human-Centered Change & Innovation.

The Global Innovation Index 2011 was put together by INSEAD along with knowledge partners Alcatel-Lucent, Booz & Co., the Confederation of Indian Industry (CII), and the World Intellectual Property Organization (WIPO).

So without further ado, here are the Global Innovation Index 2011 rankings of the world's most innovative countries (based on inputs and outputs):

Global Innovation Index 2011 Country Rankings

Check back in the coming days for additional articles highlighting whatever insights I can extract from the Global Innovation Index 2011 report. In the meantime, feel free to sound off in the comments about whether you believe your country’s position is justified or off base.

Also consider following the Human-Centered Change & Innovation page on LinkedIn.
