Tag Archives: productivity

9 of 10 Companies Requiring Employees to Return to the Office in 2024

GUEST POST from Shep Hyken

Happy employees mean more engaged and productive employees. I’ve written many times that what’s happening inside an organization will be felt on the outside by customers. A good employee experience (EX) will positively impact the customer experience (CX). And of course, the opposite is true. A “ripple effect” of employee satisfaction or dissatisfaction will inevitably reach your customers, impacting their overall experience.

As a result of the Covid-19 pandemic, which forced a shutdown, many companies and organizations realized, or at least thought, that their employees could work remotely. Many walked away from their offices and didn’t renew their leases. The traditional in-office, five-day-a-week schedule was either eliminated or modified, and many workers discovered they enjoyed working from home. However, it looks as if this “experiment” didn’t work out as planned, and many companies will start requiring RTO (return to office) on a schedule that looks similar to pre-pandemic office hours and attendance requirements.

In August, ResumeBuilder surveyed 1,000 corporate decision-makers about their RTO plans. Here are the main results:

    • 90% of companies will return to the office by 2024.
    • Only 2% say their company never plans to require employees to return to work in person.
    • 72% say RTO has improved revenue.
    • 28% will threaten to fire employees who don’t comply with RTO policies.

The Opportunity

Why return to the traditional office environment? The answer is something we already know: because companies potentially make more money.

The move to return to the office started in 2021, just after the lockdown. That year, 31% of companies required employees to return to their offices, followed by 41% in 2022 and 27% in 2023. Most of the respondents to the survey claimed they saw an improvement in revenue, productivity and worker retention.

And for those companies that plan to demand RTO in 2024, 81% say it will improve revenue, 81% believe it will improve the company culture and 83% say it will improve worker productivity.

These decision-makers aren’t making an arbitrary determination. They recognize the negative impact an RTO policy can have. Many of them (72%) said their company would offer commuter benefits, 57% would help with child-care costs and 64% would provide catered meals. But are the perks enough?

The Danger

There is concern that a shift back to full-time office hours could cause a company to lose good employees in a hiring environment in which candidates are “calling the shots” and working for companies that not only give them a steady paycheck and traditional benefits, but also a work schedule and in-office policy that aligns with their need for work/life balance. Even so, according to the survey, 28% of the decision-makers surveyed claimed they would fire employees for not complying with their RTO policies.

As we navigate the complexities of a post-pandemic working world, companies face a tough choice that will shape and impact both the employee and customer experiences. Suppose a company decides to require a 100% return to the office. It must recognize and weigh the opportunities—primarily, increased productivity and revenue—with the negatives—less-than-enthusiastic employees and the potential (even probable) loss of employees.

This article originally appeared on Forbes.com

Image Credits: Shep Hyken

Subscribe to Human-Centered Change & Innovation Weekly. Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

AI and the Productivity Paradox

GUEST POST from Greg Satell

In the 1970s and ’80s, business investment in computer technology was increasing by more than twenty percent per year. Strangely, though, productivity growth decreased during the same period. Economists found this turn of events so strange that they called it the productivity paradox to underline their confusion.

Productivity growth would take off in the late 1990s, but then mysteriously drop again during the mid-aughts. At each juncture, experts would debate whether digital technology produced real value or if it was all merely a mirage. The debate would continue even as industry after industry was disrupted.

Today, that debate is over, but a new one is likely to begin over artificial intelligence. Much like in the early 1970s, we have increasing investment in a new technology, diminished productivity growth and “experts” predicting massive worker displacement. Yet now we have history and experience to guide us and can avoid making the same mistakes.

You Can’t Manage (Or Evaluate) What You Can’t Measure

The productivity paradox dumbfounded economists because it violated a basic principle of how a free market economy is supposed to work. If profit-seeking businesses continue to make substantial investments, you expect to see a return. Yet with IT investment in the 70s and 80s, firms continued to increase their investment with negligible measurable benefit.

A paper by researchers at the University of Sheffield sheds some light on what happened. First, productivity measures were largely developed for an industrial economy, not an information economy. Second, the value of those investments, while substantial, was a small portion of total capital investment. Third, the aggregate productivity numbers didn’t reflect differences in management performance.

Consider a widget company in the 1970s that invested in IT to improve service so that it could ship out products in less time. That would improve its competitive position and increase customer satisfaction, but it wouldn’t produce any more widgets. So, from an economic point of view, it wouldn’t be a productive investment. Rival firms might then invest in similar systems to stay competitive but, again, widget production would stay flat.

So firms weren’t investing in IT to increase productivity, but to stay competitive. Perhaps even more importantly, investment in digital technology in the 70s and 80s was focused on supporting existing business models. It wasn’t until the late 90s that we began to see significant new business models being created.

The Greatest Value Comes From New Business Models—Not Cost Savings

Things began to change when firms began to see the possibilities to shift their approach. As Josh Sutton, CEO of Agorai, an AI marketplace, explained to me, “The businesses that won in the digital age weren’t necessarily the ones who implemented systems the best, but those who took a ‘digital first’ mindset to imagine completely new business models.”

He gives the example of the entertainment industry. Sure, digital technology revolutionized distribution, but merely putting your programming online is of limited value. The ones who are winning are reimagining storytelling and optimizing the experience for binge watching. That’s the real paradigm shift.

“One of the things that digital technology did was to focus companies on their customers,” Sutton continues. “When switching costs are greatly reduced, you have to make sure your customers are being really well served. Because so much friction was taken out of the system, value shifted to who could create the best experience.”

So while many companies today are attempting to leverage AI to provide similar service more cheaply, the really smart players are exploring how AI can empower employees to provide a much better service or even to imagine something that never existed before. “AI will make it possible to put powerful intelligence tools in the hands of consumers, so that businesses can become collaborators and trusted advisors, rather than mere service providers,” Sutton says.

It Takes An Ecosystem To Drive Impact

Another aspect of digital technology in the 1970s and 80s was that it was largely made up of standalone systems. You could buy, say, a mainframe from IBM to automate back-office systems or, later, Macintoshes or PCs with some basic software to sit on employees’ desks, but that did little more than automate basic clerical tasks.

However, value creation began to explode in the mid-90s when the industry shifted from systems to ecosystems. Open source software, such as Apache and Linux, helped democratize development. Application developers began offering industry and process specific software and a whole cadre of systems integrators arose to design integrated systems for their customers.

We can see a similar process unfolding today in AI, as the industry shifts from one-size-fits-all systems like IBM’s Watson to a modular ecosystem of firms that provide data, hardware, software and applications. As the quality and specificity of the tools continues to increase, we can expect the impact of AI to increase as well.

In 1987, Robert Solow quipped that, “You can see the computer age everywhere but in the productivity statistics,” and we’re at a similar point today. AI permeates our phones, smart speakers in our homes and, increasingly, the systems we use at work. However, we’ve yet to see a measurable economic impact from the technology. Much like in the 70s and 80s, productivity growth remains depressed. But the technology is still in its infancy.

We’re Just Getting Started

One of the most salient, but least discussed aspects of artificial intelligence is that it’s not an inherently digital technology. Applications like voice recognition and machine vision are, in fact, inherently analog. The fact that we use digital technology to execute machine learning algorithms is actually often a bottleneck.

Yet we can expect that to change over the next decade as new computing architectures, such as quantum computers and neuromorphic chips, rise to the fore. As these more powerful technologies replace silicon chips computing in ones and zeroes, value will shift from bits to atoms and artificial intelligence will be applied to the physical world.

“The digital technology revolutionized business processes, so it shouldn’t be a surprise that cognitive technologies are starting from the same place, but that’s not where they will end up. The real potential is driving processes that we can’t manage well today, such as in synthetic biology, materials science and other things in the physical world,” Agorai’s Sutton told me.

In 1987, when Solow made his famous quip, there was no consumer Internet, no World Wide Web and no social media. Artificial intelligence was largely science fiction. We’re at a similar point today, at the beginning of a new era. There’s still so much we don’t yet see, for the simple reason that so much has yet to happen.

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels

Productive Disagreement Requires Trust

GUEST POST from Mike Shipulski

When there’s disagreement between words and behavior, believe the behavior. This is especially true when the words deny the behavior.

When there’s disagreement between the data and the decision, the data is innocent.

When there’s agreement that there’s insufficient data but a decision must be made, there should be no disagreement that the decision is judgment-based.

When there’s disagreement on the fact that there’s no data to support the decision, that’s a problem.

When there’s disagreement on the path forward, it’s helpful to have agreement on the process to decide.

When there’s disagreement among professionals, there is no place for argument.

When there’s disagreement, there is respect for the individual and a healthy disrespect for the ideas.

When there’s disagreement, the decisions are better.

When there’s disagreement, there’s independent thinking.

When there’s disagreement, there is learning.

When there’s disagreement, there is vulnerability.

When there’s disagreement, there is courage.

When there’s disagreement, there is trust.

Image credit: Pixabay

How to Make Your Customers Hate You, One Zone at a Time

GUEST POST from Geoffrey A. Moore

My most recent book, Zone to Win, lays out a game plan for digital transformation based on organizing your enterprise around four zones. They are the:

  1. Performance Zone, where you make, sell, and deliver the products and services that constitute your core business.
  2. Productivity Zone, where you host all the cost centers that support the Performance Zone, functions like finance, HR, IT, marketing, legal, customer support, and the like.
  3. Incubation Zone, where you experiment with next-generation technologies to see if and how they might play a role in your future.
  4. Transformation Zone, which you bring into existence on a temporary basis for the sole purpose of driving a digital transformation to completion.

The book uses these four zones to help you understand your own company’s dynamics. In this blog, however, we are going to use them to help you understand your customer’s company dynamics.

Here is the key insight. Customers buy your product to create value in one, and normally only one, zone. Depending on which zone they are seeking to improve, their expectations of you will vary dramatically. So, if you really want to get your customers to hate you, you have to know what zone they are targeting with your product or service.

To start with, if your customer is buying your product for their Productivity Zone, they want it to make them more efficient. Typically, that means taking cost out of their existing operations by automating one or more manual tasks, thereby reducing labor, improving quality, and speeding up cycle time. So, if you want to make this customer hate you, load up your overall offer with lots of extras that require additional training, have features that can confuse or distract end users, and generally just gum up the works. Your product will still do what you said it would do, but with any luck, they won’t save a nickel.

Now, if instead they are buying your product to experiment with in their Incubation Zone, they are looking to do some kind of proof of concept project. Of course, real salespeople never sell proofs of concepts, so continue to insist that they go all in for the full Monty. That way, when they find out they can’t actually do what they were hoping to, you will have still scored a good commission, and they will really hate you.

Moving up in the world, perhaps your customer has bought from you to upgrade their Performance Zone by making their operations more customer-focused. This is serious stuff because you are messing with their core business. What an opportunity! All you have to do is over-promise just a little bit, then put in a few bits that are not quite fully baked, turn the whole implementation over to a partner, and then, if the stars align, you can bring down their whole operation and blame it entirely on someone else. That really does get their dander up.

But if you really want to extract the maximum amount of customer vitriol, the best place to engage is in their Transformation Zone. Here the CEO has gone on record that the company will transform its core business to better compete in the digital era. This is the mother lode. Budget is no object, so soak it to the max. Every bell, whistle, doo-dad, service, product—you name it, load it into the cart. Guarantee a transformational trip to the moon and back. Just make sure that the timeline for the project is two years. That way you will be able to collect and cash your commission check before you have to find other employment.

Of course, if for some reason you actually wanted your customer to like you, I suppose you could reverse these recommendations. But where’s the fun in that?

That’s what I think. What do you think?

Image Credit: Pexels

The Digital Revolution Has Been A Giant Disappointment

GUEST POST from Greg Satell

One of the most often repeated episodes in the history of technology is when Steve Jobs was recruiting John Sculley from his lofty position as CEO at Pepsi to come to Apple. “Do you want to sell sugar water for the rest of your life,” Jobs asked, “or do you want to come with me and change the world?”

It’s a strange conceit of digital denizens that their businesses are something nobler than other industries. While it is true that technology can do some wonderful things, if the aim of Silicon Valley entrepreneurs was truly to change the world, why wouldn’t they apply their formidable talents to something like curing cancer or feeding the hungry?

The reality, as economist Robert Gordon explains in The Rise and Fall of American Growth, is that the measurable impact has been relatively meager. According to the IMF, except for a relatively short burst in growth between 1996 and 2004, productivity has been depressed since the 1970s. We need to rethink how technology impacts our world.

The Old Productivity Paradox

In the 1970s and 80s, business investment in computer technology was increasing by more than 20% per year. Strangely, though, productivity growth decreased during the same period. Economists found this turn of events so strange that they called it the productivity paradox to underline their confusion.

The productivity paradox dumbfounded economists because it violated a basic principle of how a free market economy is supposed to work. If profit-seeking businesses continue to make substantial investments, you expect to see a return. Yet with IT investment in the 70s and 80s, firms continued to increase their investment with negligible measurable benefit.

A paper by researchers at the University of Sheffield sheds some light on what happened. First, productivity measures were largely developed for an industrial economy, not an information economy. Second, the value of those investments, while substantial, was a small portion of total capital investment. Third, businesses weren’t necessarily investing to improve productivity, but to survive in a more demanding marketplace.

Yet by the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage would lead to market dominance. The mystery of the productivity paradox, it seemed, had been solved. We just needed to wait for the technology to hit critical mass.

The New Productivity Paradox

By 2004, the law of increasing returns was there for everyone to see. Google already dominated search, Amazon ruled e-commerce, Apple would go on to dominate mobile computing and Facebook would rule social media. Yet as the dominance of the tech giants grew, productivity would once again fall to depressed levels.

Today, more than a decade later, we’re in the midst of a second productivity paradox, just as mysterious as the first one. New technologies like mobile computing and artificial intelligence are there for everyone to see, but they have done little, if anything, to boost productivity.

At the same time, the power of digital technology is diminishing. Moore’s law, the decades-old paradigm of continuous doubling in the power of computer processing, is slowing down and will soon end completely. Without advancement in the underlying technology, it is hard to see how digital technology will ever power another productivity boom.

Considering the optimistic predictions of digital entrepreneurs like Steve Jobs, this is incredibly disappointing. Compare the meager eight years of elevated productivity that digital technology produced with the 50-year boom in productivity created in the wake of electricity and internal combustion and it’s clear that digital technology simply doesn’t measure up.

The Baumol Effect, The Clothesline Paradox and Other Headwinds

Much like the first productivity paradox, it’s hard to determine exactly why the technological advancement over the last 15 years has amounted to so little. Most likely, it is not one factor in particular, but the confluence of a number of them. Increasing productivity growth in an advanced economy is no simple thing.

One possibility for the lack of progress is the Baumol effect, the principle that some sectors of the economy are resistant to productivity growth. For example, despite the incredible efficiency that Jeff Bezos has produced at Amazon, his barber still only cuts one head of hair at a time. In a similar way, sectors like healthcare and education, which require a large amount of labor inputs that resist automation, will act as a drag on productivity growth.

Another factor is the Clothesline paradox, which gets its name from the fact that when you dry your clothes in a machine, it figures into GDP data, but when you hang them on a clothesline, no measurable output is produced. In much the same way, when you use a smartphone to take pictures or to give you directions, there is considerable benefit that doesn’t result in any financial transactions. In fact, because you use less gas and don’t develop film, GDP decreases somewhat.

Additionally, the economist Robert Gordon, mentioned above, notes six headwinds to economic growth, including aging populations, limits to increasing education, income inequality, outsourcing, environmental costs due to climate change and rising household and government debt. It’s hard to see how digital technology will make a dent in any of these problems.

Technology is Never Enough to Change the World

Perhaps the biggest reason that the digital revolution has been such a big disappointment is because we expected the technology to largely do the work for us. While there is no doubt that computers are powerful tools, we still need to put them to good use and we have clearly missed opportunities in that regard.

Think about what life was like in 1900, when the typical American family didn’t have access to running water, electricity or gas-powered machines such as tractors or automobiles. Even something simple like cooking a meal took hours of backbreaking labor. Yet investments in infrastructure and education combined with technology to produce prosperity.

Today, however, there is no comparable effort to invest in education and healthcare for those who cannot afford it, to limit the effects of climate change, to reduce debt or to do anything of significance to mitigate the headwinds we face. We are awash in nifty gadgets, but in many ways we are no better off than we were 30 years ago.

None of this was inevitable; rather, it is the result of choices that we have made. We can, if we really want to, make different choices in the days and years ahead. What I hope we have learned from our digital disappointments is that technology itself is never enough. We are truly the masters of our fate, for better or worse.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Bringing Yin and Yang to the Productivity Zone

GUEST POST from Geoffrey A. Moore

Digital transformation is hardly new. Advances in computing create more powerful infrastructure which in turn enables more productive operating models which in turn can enable wholly new business models. From mainframes to minicomputers to PCs to the Internet to the World Wide Web to cloud computing to mobile apps to social media to generative AI, the hits just keep on coming, and every IT organization is asked to both keep the current systems running and to enable the enterprise to catch the next wave. And that’s a problem.

The dynamics of productivity involve a yin and yang exchange between systems that improve efficiency and programs that improve effectiveness. Systems, in this model, are intended to maintain state, with as little friction as possible. Programs, in this model, are intended to change state, with maximum impact within minimal time. Each has its own governance model, and the two must not be blended.

It is a rare IT organization that does not know how to maintain its own systems. That’s Job 1, and the decision rights belong to the org itself. But many IT organizations lose their way when it comes to programs—specifically, the digital transformation initiatives that are re-engineering business processes across every sector of the global economy. They do not lose their way with respect to the technology of the systems. They are missing the boat on the management of the programs.

Specifically, when the CEO champions the next big thing, and IT gets a big chunk of funding, the IT leader commits to making it all happen. This is a mistake. Digital transformation entails re-engineering one or more operating models. These models are executed by organizations outside of IT. For the transformation to occur, the people in these organizations need to change their behavior, often drastically. IT cannot—indeed, must not—commit to this outcome. Change management is the responsibility of the consuming organization, not the delivery organization. In other words, programs must be pulled. They cannot be pushed. IT in its enthusiasm may believe it can evangelize the new operating model because people will just love it. Let me assure you—they won’t. Everybody endorses change as long as other people have to be the ones to do it. No one likes to move their own cheese.

Given all that, here’s the playbook to follow:

  1. If it is a program, the head of the operating unit that must change its behavior has to sponsor the change and pull the program in. Absent this commitment, the program simply must not be initiated.
  2. To govern the program, the Program Management Office needs a team of four, consisting of the consuming executive, the IT executive, the IT project manager, and the consuming organization’s program manager. The program manager, not the IT manager, is responsible for change management.
  3. The program is defined by a performance contract that uses a current state/future state contrast to establish the criteria for program completion. Until the future state is achieved, the program is not completed.
  4. Once the future state is achieved, then the IT manager is responsible for securing the system that will maintain state going forward.

Delivering programs that do not change state is the biggest source of waste in the Productivity Zone. There is an easy fix for this. Just say No.

That’s what I think. What do you think?

Image Credit: Unsplash

Vacations and Holidays Are the Best Productivity Hack

GUEST POST from Mike Shipulski

It’s not a vacation unless you forget about work.

It’s not a holiday unless you leave your phone at home.

If you must check in at work, you’re not on vacation.

If you feel guilty that you did not check in at work, you’re not on holiday.

If you long for work while you’re on vacation, do something more interesting on vacation.

If you wish you were at work, you get no credit for taking a holiday.

If people know you won’t return their calls, they know you are on vacation.

If people would rather make a decision than call you, they know you’re on holiday.

If you check your voicemail, you’re not on vacation.

If you check your email, you’re not on holiday.

If your company asks you to check in, they don’t understand vacation.

If people at your company invite you to a meeting, they don’t understand holiday.

Vacation is productive in that you return to work and you are more productive.

Holiday is not wasteful because when you return to work you don’t waste time.

Vacation is profitable because when you return you make fewer mistakes.

Holiday is skillful because when you return your skills are dialed in.

Vacation is useful because when you return you are useful.

Holiday is fun because when you return you bring fun to your work.

If you skip your vacation, you cannot give your best to your company and to yourself.

If you neglect your holiday, you neglect your responsibility to do your best work.

Don’t skip your vacation and don’t neglect your holiday. Both are bad for business and for you.

Image credit: Pixabay

Globalization and Technology Have Failed Us

GUEST POST from Greg Satell

In November 1989, there were two watershed events that would change the course of world history. The fall of the Berlin Wall would end the Cold War and open up markets across the world. That very same month, Tim Berners-Lee would create the World Wide Web and usher in a new technological era of networked computing.

It was a time of great optimism. Books like Francis Fukuyama’s The End of History predicted a capitalist, democratic utopia, while pundits gushed over the seemingly never-ending parade of “killer apps,” from email and e-commerce to social media and the mobile web. The onward march of history seemed unstoppable.

Today, 30 years on, it’s time to take stock and the picture is somewhat bleak. Instead of a global technological utopia, there are a number of worrying signs ranging from income inequality to the rise of popular authoritarianism. The fact is that technology and globalization have failed us. It’s time to address some very real problems.

Where’s the Productivity?

Think back, if you’re old enough, to before this all started. Life prior to 1989 was certainly less modern; we didn’t have mobile phones or the Internet, but for the most part it was fairly similar to today. We rode in cars and airplanes, watched TV and movies, and enjoyed the benefits of home appliances and air conditioners.

Now try to imagine what life was like in 1900, before electricity and internal combustion gained wide adoption. Even doing a simple task like cooking a meal or cleaning the house took hours of backbreaking labor to haul wood and water. While going back to living in the 1980s would involve some inconvenience, we would struggle to survive before 1920.

The productivity numbers bear out this simple observation. The widespread adoption of electricity and internal combustion led to a 50-year boom in productivity between 1920 and 1970. The digital revolution, on the other hand, created only an 8-year blip between 1996 and 2004. Even today, with artificial intelligence on the rise, productivity remains depressed.

At this point, we have to conclude that despite all the happy talk and grand promises of “changing the world,” the digital revolution has been a huge disappointment. While Silicon Valley has minted billionaires at record rates, digital technology has not made most of us measurably better off economically.

Winners Taking All

The increase of globalization and the rise of digital commerce was supposed to be a democratizing force, increasing competition and breaking the institutional monopoly on power. Yet just the opposite seems to have happened, with a relatively small global elite grabbing more money and more power.

Consider market consolidation. An analysis published in the Harvard Business Review showed that from airlines to hospitals to beer, market share is increasingly concentrated in just a handful of firms. A more expansive study of 900 industries conducted by The Economist found that two-thirds have become more dominated by larger players.

Perhaps not surprisingly, we see the same trends in households as we do with businesses. The OECD reports that income inequality is at its highest level in over 50 years. Even in emerging markets, where millions have been lifted out of poverty, most of the benefits have gone to a small few.

The consequences of growing inequality are concrete and stark. Social mobility has been declining in America for decades, transforming the “land of opportunity” into what is increasingly a caste system. Anxiety and depression are rising to epidemic levels. Life expectancy for the white working class is actually declining, mostly due to “deaths of despair” from drugs, alcohol and suicide. The overall picture is dim and seemingly getting worse.

The Failure Of Freedom

Probably the biggest source of optimism in the 1990s was the end of the Cold War. Capitalism was triumphant and many of the corrupt, authoritarian societies of the former Soviet Union began embracing democracy and markets. Expansion of NATO and the EU brought new hope to more than a hundred million people. China began to truly embrace markets as well.

I moved to Eastern Europe in the late 1990s and was able to observe this amazing transformation for myself. Living in Poland, it seemed like the entire country was advancing as if in time-lapse photography. Old, gray concrete buildings gave way to modern offices and apartment buildings. A prosperous middle class began to emerge.

Yet here as well things now seem to be going the other way. Anti-democratic regimes are winning elections across Europe while rising resentment against immigrant populations takes hold throughout the Western world. In America, we are increasingly mired in a growing constitutional crisis.

What is perhaps most surprising about the retreat of democracy is that it is happening not in the midst of some sort of global depression, but during a period of relative prosperity and low unemployment. Nevertheless, positive economic data cannot mask the basic truth that a significant portion of the population feels that the system doesn’t work for them.

It’s Time To Start Taking Responsibility For A Messy World

Looking back, it’s hard to see how an era that began with such promise turned out so badly. Yes, we’ve got cooler gadgets and streaming video. There have also been impressive gains in the developing world. Yet in so-called advanced economies, we seem to be worse off. It didn’t have to turn out this way. Our current predicament is the result of choices that we made.

Put simply, we have the problems we have today because they are the problems we have chosen not to solve. While the achievements of technology and globalization are real, they have also left far too many behind. We focused on simple metrics like GDP and shareholder value, but unfortunately the world is not so elegant. It’s a messy place and doesn’t yield so easily to reductionist measures and strategies.

There has, however, been some progress. The Business Roundtable, an influential group of almost 200 CEOs of America’s largest companies, in 2019 issued a statement that discarded the old notion that the sole purpose of a business is to provide value to shareholders. There are also a number of efforts underway to come up with broader measures of well-being to replace GDP.

Yet we still need to learn an important lesson: technology alone will not save us. To solve complex challenges like inequality, climate change and the rise of authoritarianism we need to take a complex, network-based approach. We need to build ecosystems of talent, technology and information. That won’t happen by itself; we have to make better choices.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly

Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

When You Have No Slack Time

GUEST POST from Mike Shipulski

When you have no slack time, you can’t start new projects.

When you have no slack time, you can’t run toward the projects that need your help.

When you have no slack time, you have no time to think.

When you have no slack time, you have no time to learn.

When you have no slack time, there’s no time for concern for others.

When you have no slack time, there’s no time for your best judgment.

When there is no slack time, what used to be personal becomes transactional.

When there is no slack time, any hiccup creates project slip.

When you have no slack time, the critical path will find you.

When no one has slack time, one project’s slip ripples delay into all the others.

When you have no slack time, excitement withers.

When you have no slack time, imagination dies.

When you have no slack time, engagement suffers.

When you have no slack time, burnout will find you.

When you have no slack time, work sucks.

When you have no slack time, people leave.

I have one question for you. How much slack time do you have?

Image credit: Pixabay

Silicon Valley Has Become a Doomsday Machine

GUEST POST from Greg Satell

I was working on Wall Street in 1995 when the Netscape IPO hit like a bombshell. It was the first big Internet stock and, although originally priced at $14 per share, it opened at double that amount and quickly zoomed to $75. By the end of the day, it had settled back at $58.25 and, just like that, a tiny company with no profits was worth $2.9 billion.

It seemed crazy, but economists soon explained that certain conditions, such as negligible marginal costs and network effects, would lead to “winner take all markets” and increasing returns to investment. Venture capitalists who bet on this logic would, in many cases, become rich beyond their wildest dreams.
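The winner-take-all logic those economists described can be illustrated with a toy simulation (my own sketch, not from the article, with all names and parameters invented for illustration): two platforms compete for users, each new user tends to join the one with the larger Metcalfe-style network value (taken here as proportional to the square of its user base), and even a small, random early lead snowballs into near-total dominance.

```python
import random

def simulate_market(steps=10_000, base_appeal=1.0, seed=42):
    """Toy winner-take-all dynamic under network effects.

    Each arriving user picks one of two platforms with probability
    proportional to its value, where value = base appeal plus a
    Metcalfe-style network effect (users squared). Early choices are
    essentially coin flips; once one platform pulls ahead, its value
    advantage compounds and it absorbs nearly all later arrivals.
    """
    rng = random.Random(seed)
    users = [0, 0]  # users on platform A and platform B
    for _ in range(steps):
        values = [base_appeal + n * n for n in users]
        pick = 0 if rng.random() < values[0] / sum(values) else 1
        users[pick] += 1
    return users

shares = simulate_market()
leader_share = max(shares) / sum(shares)
print(f"final split: {shares}, leader share: {leader_share:.1%}")
```

Under these assumptions the leader typically ends up with well over 90% of the market, which is the “increasing returns” intuition behind the valuations venture capitalists were betting on.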

Yet as Charles Duhigg explained in The New Yorker, things have gone awry. Investors who preach prudence are deemed to be not “founder friendly” and cut out of deals. Evidence suggests that the billions wantonly plowed into massive failures like WeWork and Quibi are crowding out productive investments. Silicon Valley is becoming a ticking time bomb.

The Rise Of Silicon Valley

In Regional Advantage, author AnnaLee Saxenian explained how the rise of the computer can be traced to the buildup of military research after World War II. At first, most of the entrepreneurial activity centered around Boston, but the scientific and engineering talent attracted to labs based in Northern California soon began starting their own companies.

Back east, big banks were the financial gatekeepers. In the Bay Area, however, small venture capitalists, many of whom were ex-engineers themselves, invested in entrepreneurs. Stanford Provost Frederick Terman, as well as existing companies, such as Hewlett Packard, also devoted resources to broaden and strengthen the entrepreneurial ecosystem.

Saxenian would later point out to me that this was largely the result of an unusual confluence of forces. Because there was a relative dearth of industry in Northern California, tech entrepreneurs tended to stick together. In a similar vein, Stanford had few large corporate partners to collaborate with, so sought out entrepreneurs. The different mixture produced a different brew and Silicon Valley developed a unique culture and approach to business.

The early success of the model led to a process that was somewhat self-perpetuating. Engineers became entrepreneurs and got rich. They, in turn, became investors in new enterprises, which attracted more engineers to the region, many of whom became entrepreneurs. By the 1980s, Silicon Valley had surpassed Route 128 outside Boston to become the center of the technology universe.

The Productivity Paradox and the Dotcom Bust

As Silicon Valley became ascendant and information technology gained traction, economists began to notice something strange. Although businesses were increasing investment in computers at a healthy clip, there seemed to be negligible economic impact. As Robert Solow put it, “You can see the computer age everywhere but in the productivity statistics.” This came to be known as the productivity paradox.

Things began to change around the time of the Netscape IPO. Productivity growth, which had been depressed since the early 1970s, began to surge and the idea of “increasing returns” began to take hold. Companies such as Webvan and Pets.com, with no viable business plan or path to profitability, attracted hundreds of millions of dollars from investors.

By 2000, the market hit its peak and the bubble burst. While some of the fledgling Internet companies, such as Cisco and Amazon, did turn out well, thousands of others went down in flames. Other more conventional businesses, such as Enron, WorldCom and Arthur Andersen, got caught up in the hoopla, became mired in scandal and went bankrupt.

When it was all over there was plenty of handwringing, a small number of prosecutions, some reminiscing about the Dutch tulip mania of 1637 and then everybody went on with their business. The Federal Reserve Bank pumped money into the economy, the Bush Administration pushed big tax cuts and within a few years things were humming again.

Web 2.0, the Great Recession and the Rise of the Unicorns

Out of the ashes of the dotcom bubble arose Web 2.0, which saw the emergence of new social platforms like Facebook, LinkedIn and YouTube that leveraged their own users to create content and grew exponentially. The launch of the iPhone in 2007 ushered in a new mobile era and, just like that, techno-enthusiasts were once again back in vogue. Marc Andreessen, who co-founded Netscape, would declare that software was eating the world.

Yet trouble was lurking under the surface. Productivity growth disappeared in 2005 just as mysteriously as it appeared in 1996. All the money being pumped into the economy by the Fed and the Bush tax cuts had to go somewhere and found a home in a booming housing market. Mortgage bankers, Wall Street traders, credit raters and regulators all looked the other way while the bubble expanded and then, somewhat predictably, imploded.

But this time, there were no zany West Coast startup entrepreneurs to blame. It was, in fact, the establishment that had run us off the cliff. The worthless assets at the center didn’t involve esoteric new business models, but the brick and mortar of our homes and workplaces. The techno-enthusiasts could whistle past the graveyard, pitying the poor suckers who got caught up in a seemingly anachronistic fascination with things made with atoms.

Repeating a now-familiar pattern, the Fed pumped money into the economy to fuel the recovery. Establishment industries, such as the auto companies in Detroit, were discredited, and a superabundance of capital needed a place to go. Silicon Valley looked attractive.

The era of the unicorns, startup companies worth more than a billion dollars, had begun.

Charting A New Path Forward

In his inaugural address, Ronald Reagan declared that, “Government is not the solution to our problem, government is the problem.” In his view, bureaucrats were the enemy and private enterprise the hero, so he sought to dismantle federal regulations. This led to the Savings and Loan crisis that exploded, conveniently or inconveniently, during the first Bush administration.

So small-town bankers became the enemy while hotshot Wall Street traders and, after the Netscape IPO, Internet entrepreneurs and venture capitalists became heroes. Wall Street would lose its luster after the global financial meltdown, leaving Silicon Valley’s venture-backed entrepreneurship as the only model left with any genuine allure.

That brings us to now and “big tech” is increasingly under scrutiny. At this point, the government, the media, big business, small business, Silicon Valley, venture capitalists and entrepreneurs have all been somewhat discredited. There is no real enemy left besides ourselves and there are no heroes coming to save us. Until we learn to embrace our own culpability we will never be able to truly move forward.

Fortunately, there is a solution. Consider the recent Covid crisis, in which unprecedented collaboration between governments, large pharmaceutical companies, innovative startups and academic scientists developed a life-saving vaccine in record time. Similar, albeit fledgling, efforts have been going on for years.

Put simply, we have seen the next big thing and it is each other. By discarding childish old notions about economic heroes and villains we can learn to collaborate across historical, organizational and institutional boundaries to solve problems and create new value. It is in our collective ability to solve problems that we will create our triumph or our peril.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay
