Tag Archives: productivity

Silicon Valley Has Become a Doomsday Machine

GUEST POST from Greg Satell

I was working on Wall Street in 1995 when the Netscape IPO hit like a bombshell. It was the first big Internet stock and, although originally priced at $14 per share, it opened at double that amount and quickly zoomed to $75. By the end of the day, it had settled back at $58.25 and, just like that, a tiny company with no profits was worth $2.9 billion.

It seemed crazy, but economists soon explained that certain conditions, such as negligible marginal costs and network effects, would lead to “winner take all markets” and increasing returns to investment. Venture capitalists who bet on this logic would, in many cases, become rich beyond their wildest dreams.

Yet as Charles Duhigg explained in The New Yorker, things have gone awry. Investors who preach prudence are deemed to be not “founder friendly” and cut out of deals. Evidence suggests that the billions wantonly plowed into massive failures like WeWork and Quibi are crowding out productive investments. Silicon Valley is becoming a ticking time bomb.

The Rise Of Silicon Valley

In Regional Advantage, author AnnaLee Saxenian explained how the rise of the computer can be traced to the buildup of military research after World War II. At first, most of the entrepreneurial activity centered around Boston, but the scientific and engineering talent attracted to labs based in Northern California soon began starting their own companies.

Back east, big banks were the financial gatekeepers. In the Bay Area, however, small venture capitalists, many of whom were ex-engineers themselves, invested in entrepreneurs. Stanford Provost Frederick Terman, as well as existing companies, such as Hewlett Packard, also devoted resources to broaden and strengthen the entrepreneurial ecosystem.

Saxenian would later point out to me that this was largely the result of an unusual confluence of forces. Because there was a relative dearth of industry in Northern California, tech entrepreneurs tended to stick together. In a similar vein, Stanford had few large corporate partners to collaborate with, so sought out entrepreneurs. The different mixture produced a different brew and Silicon Valley developed a unique culture and approach to business.

The early success of the model led to a process that was somewhat self-perpetuating. Engineers became entrepreneurs and got rich. They, in turn, became investors in new enterprises, which attracted more engineers to the region, many of whom became entrepreneurs. By the 1980s, Silicon Valley had surpassed Route 128 outside Boston to become the center of the technology universe.

The Productivity Paradox and the Dotcom Bust

As Silicon Valley became ascendant and information technology gained traction, economists began to notice something strange. Although businesses were increasing investment in computers at a healthy clip, there seemed to be negligible economic impact. As Robert Solow put it, “You can see the computer age everywhere but in the productivity statistics.” This came to be known as the productivity paradox.

Things began to change around the time of the Netscape IPO. Productivity growth, which had been depressed since the early 1970s, began to surge and the idea of “increasing returns” began to take hold. Companies such as Webvan and Pets.com, with no viable business plan or path to profitability, attracted hundreds of millions of dollars from investors.

By 2000, the market hit its peak and the bubble burst. While some Internet companies, such as Cisco and Amazon, did survive and prosper, thousands of others went down in flames. More conventional businesses, such as Enron, WorldCom and Arthur Andersen, also got caught up in the hoopla, became mired in scandal and collapsed.

When it was all over there was plenty of handwringing, a small number of prosecutions, some reminiscing about the Dutch tulip mania of 1637 and then everybody went on with their business. The Federal Reserve Bank pumped money into the economy, the Bush Administration pushed big tax cuts and within a few years things were humming again.

Web 2.0, the Great Recession and the Rise of the Unicorns

Out of the ashes of the dotcom bubble arose Web 2.0, which saw the emergence of new social platforms like Facebook, LinkedIn and YouTube that leveraged their own users to create content and grew exponentially. The launch of the iPhone in 2007 ushered in a new mobile era and, just like that, techno-enthusiasts were once again back in vogue. Marc Andreessen, who co-founded Netscape, would declare that software was eating the world.

Yet trouble was lurking under the surface. Productivity growth disappeared in 2005 just as mysteriously as it appeared in 1996. All the money being pumped into the economy by the Fed and the Bush tax cuts had to go somewhere and found a home in a booming housing market. Mortgage bankers, Wall Street traders, credit raters and regulators all looked the other way while the bubble expanded and then, somewhat predictably, imploded.

But this time, there were no zany West Coast startup entrepreneurs to blame. It was, in fact, the establishment that had run us off the cliff. The worthless assets at the center didn’t involve esoteric new business models, but the brick and mortar of our homes and workplaces. The techno-enthusiasts could whistle past the graveyard, pitying the poor suckers who got caught up in a seemingly anachronistic fascination with things made with atoms.

Repeating a now-familiar pattern, the Fed pumped money into the economy to fuel the recovery. Establishment industries, such as the auto companies in Detroit, were discredited, and a superabundance of capital needed a place to go. Silicon Valley looked attractive.

The era of the unicorns, startup companies worth more than a billion dollars, had begun.

Charting A New Path Forward

In his inaugural address, Ronald Reagan declared that “government is not the solution to our problem; government is the problem.” In his view, bureaucrats were the enemy and private enterprise the hero, so he sought to dismantle federal regulations. This led to the Savings and Loan crisis that exploded, conveniently or inconveniently, during the first Bush administration.

So small town bankers became the enemy while hotshot Wall Street traders and, after the Netscape IPO, Internet entrepreneurs and venture capitalists became heroes. Wall Street would lose its luster after the global financial meltdown, leaving Silicon Valley’s venture-backed entrepreneurship as the only model left with any genuine allure.

That brings us to now, when “big tech” is increasingly under scrutiny. At this point, the government, the media, big business, small business, Silicon Valley, venture capitalists and entrepreneurs have all been somewhat discredited. There is no real enemy left besides ourselves and there are no heroes coming to save us. Until we learn to embrace our own culpability we will never be able to truly move forward.

Fortunately, there is a solution. Consider the recent Covid crisis, in which unprecedented collaboration between governments, large pharmaceutical companies, innovative startups and academic scientists developed a life-saving vaccine in record time. Similar, albeit fledgling, efforts have been going on for years.

Put simply, we have seen the next big thing and it is each other. By discarding childish old notions about economic heroes and villains we can learn to collaborate across historical, organizational and institutional boundaries to solve problems and create new value. It is in our collective ability to solve problems that we will create our triumph or our peril.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

New Skills Needed for a New Era of Innovation

GUEST POST from Greg Satell

The late Clayton Christensen had a theory about “jobs to be done.” In his view, customers don’t buy products as much as they “hire” companies to do specific “jobs” for them. To be competitive, firms need to understand what that job is and how to do it well. In other words, no one wants a quarter-inch drill bit, they want a quarter-inch hole.

The same can be said for an entire society. We need certain jobs to be done and will pay handsomely for ones that we hold in high regard, even as we devalue others. Just as being the best blacksmith in town won’t earn you much of a living today, great coding skills wouldn’t do you much good in a medieval village.

This is especially important to keep in mind today as the digital revolution comes to an end and we enter a new era of innovation in which some tasks will be devalued and others will be increasingly in demand. Much like Christensen said about firms, we as a society need to learn to anticipate which skills will lose value in future years and which will be considered critical.

The Evolution of Economies

The first consumer product was most likely the Acheulean hand axe, invented by some enterprising stone age entrepreneur over 100,000 years ago. Evidence suggests that, for the most part, people made stone axes themselves, but as technology evolved, some began to specialize in different crafts, such as smithing, weaving, cobbling and so on.

Inventions like the steam engine, and then later electricity and the internal combustion engine, brought about the industrial revolution, which largely put craftsmen out of work and reshaped society around cities that could support factories. It also required new skills to organize work, leading to the profession of management and the knowledge economy.

The inventions of the microchip and the internet have led to an information economy in which even a teenager with a smartphone has better access to knowledge than a specialist working in a major institution a generation ago. Much like the industrial era automated physical tasks, the digital era has automated many cognitive tasks.

Now, as the digital era ends, we are entering a new era of innovation in which we will shift to post-digital computing architectures, such as quantum computing and neuromorphic chips. Enormous value will be created through bits powering atoms in fields like synthetic biology and materials science.

Innovation, Jobs and Wages

As economies evolved, some tasks became devalued as others increased in importance. When people could go to a smith for metal tools, they had no need to create stone axes. In much the same way, the industrial revolution put craft guilds out of business and technologies like tractors and combine harvesters drastically reduced the number of people working on farms.

Clearly replacing human labor with technology is disruptive, but it has historically led to dramatic increases in productivity. So labor displacement effects have been outweighed by greater wages and new jobs created by new industries. For the most part, innovation has made all of us better off, even, to a great extent, the workers who were displaced.

Consider the case of Henry Ford. Because technology replaced many tasks on the family farm, he didn’t need to work on it and found a job as an engineer for Thomas Edison, where he earned enough money and had enough leisure time to tinker with engines. That led him to create his own company, pioneer an industry and create good jobs for many others.

Unfortunately, there is increasing evidence that more recent innovations may not be producing comparable amounts of productivity and that’s causing problems. For example, when a company replaces a customer service agent with an automated system, it’s highly doubtful that the productivity gains will be enough to finance entire new industries that will train that call center employee to, say, design websites or run marketing campaigns.

Identifying New Jobs To Be Done

To understand the disconnect between technological innovation and productivity, it’s helpful to look at some underlying economic data. In US manufacturing, for instance, productivity has skyrocketed: output roughly doubled in the 30 years between 1987 and 2017, even as employment in the sector decreased by roughly a third, which implies output per worker roughly tripled.

It is the increased productivity growth in manufacturing that has fueled employment growth in the service sector. However, productivity gains in service jobs have been relatively meager and automation through technological innovation has not resulted in higher wages, but greater income inequality as returns to capital dwarf returns to labor.

Further economic analysis shows that the divide isn’t so much between “white collar” and “blue collar” jobs, but between routine and non-routine tasks. So warehouse workers and retail clerks have suffered, but designers and wedding planners have fared much better. In other words, technological automation is creating major shifts in the “jobs to be done.”

A recent analysis by the McKinsey Global Institute bears this out. It identified 56 “foundational skills” that are crucial to the future of work but don’t fall into traditional categories such as “engineering” or “sales”; rather, they are things like self-awareness, emotional intelligence and critical thinking.

Collaboration Is The New Competitive Advantage

The industrial revolution drove a shift from animal power to machine power and from physical skills to cognitive skills. What we’re seeing now is a similar shift from cognitive skills to social skills. As automation takes over many routine cognitive tasks, increasingly the “job” that humans are valued for is relating to other humans.

There are some things a machine will never do. An algorithm will never strike out at a Little League game, see its child born or have a bad day at work. We can, of course, get computers to mimic these things by training them on data, but they will never actually have the experience, and that limits their ability to fully relate to human emotions.

To see how this is likely to play out, simply go and visit your local Apple Store. It is a highly automated operation, without traditional checkout aisles or cash registers. Still, the first thing that catches your eye is a sea of blue shirts waiting to help you. They are not there to execute transactions, which you can easily do online, but to engage with you, understand what you’re trying to achieve and help you get it done.

We’ve seen similar trends at work even in highly technical fields. A study of 19.9 million scientific papers found that not only has the percentage of papers published by teams steadily increased over the past 50 years, the size of those teams has also grown and their research is more highly cited. The journal Nature got similar results and also found that the work being done is far more interdisciplinary and done at greater distances.

What’s becoming clear is that collaboration is increasingly a competitive advantage. The ultimate skill is no longer knowledge or proficiency in a particular domain, but the ability to build a shared purpose with others who possess a diverse set of skills and perspectives, in order to solve complex problems. In other words, the most important jobs are the ones we do in the service of a common objective.

— Article courtesy of the Digital Tonto blog
— Image credit: Unsplash

We’re Disrupting People Instead of Industries Now

In 1997, when Clayton Christensen first published The Innovator’s Dilemma and introduced the term “disruptive innovation,” it was a clarion call. Business leaders were put on notice: It is no longer enough to simply get better at what you already do, you need to watch out for a change in the basis of competition that will open the door for a disruptive competitor.

Today, it’s become fashionable for business pundits to say that we live in a VUCA era, one that is volatile, uncertain, complex and ambiguous, but the evidence says otherwise. Increasingly researchers are finding that businesses are enjoying a period that is less disruptive, less competitive and less dynamic.

The truth is that we don’t really disrupt businesses anymore, we disrupt people and that’s truly becoming a problem. As businesses are increasingly protected from competition, they are becoming less innovative and less productive. Americans, meanwhile, are earning less and paying more. It’s time we stop doubling down on failed ideas and begin to right the ship.

The Productivity Paradox

In the 1920s two emerging technologies, internal combustion and electricity, finally began to hit their stride and kicked off a 50-year boom in productivity growth. During that time things changed dramatically. We shifted from a world where few Americans had indoor plumbing, an automobile or electrical appliances to one in which the average family had all of these things.

Technology enthusiasts like to compare the digital revolution with that earlier era, but that’s hardly the case. If anybody today were magically transported 50 years back to 1970, they would see much they would recognize. Yet if most modern people had to live in 1920, when even something as simple as cooking a meal required hours of backbreaking labor, they would struggle to survive.

The evidence is far more than anecdotal however. Productivity statistics clearly show that productivity growth started to slow in the early 1970s, just as computer investment began to rise. With the introduction of the Internet, there was a brief bump in productivity between 1996 and 2004, but then it disappeared again. Today, even with the introduction of social media, mobile Internet and artificial intelligence, we appear to be in a second productivity paradox.

Businesses can earn an economic profit in one of two ways. They can unlock new value through innovation or they can seek to reduce competition. In an era of diminished productivity, it shouldn’t be surprising that many firms have chosen the latter. What is truly startling is the ease and extent to which we have let them get away with it.

Rent Seeking And Regulatory Capture

Investment decisions are driven by profit expectations. If, for instance, a firm sees great potential in a new technology, it will invest in research and development. On the other hand, if it sees greater potential in influencing governments, it will invest in that. So it is worrying that lobbying expenditures have more than doubled since 1998.

The money goes towards two basic purposes. The first, called rent seeking, involves businesses increasing profits by getting the law to work in their favor, as when car dealerships in New Jersey fought Tesla’s direct sales model. The second, regulatory capture, seeks to co-opt the agencies that are supposed to govern industry.

It seems like they’re getting their money’s worth. Corporate tax rates in the US have steadily decreased and are now among the lowest in the developed world. Occupational licensing, often the result of lobbying by trade associations, has increased fivefold since the 1950s. Antitrust regulation has become virtually nonexistent, while competition has been reduced.

The result is that while corporations earn record profits, we pay more and get less. This is especially clear in some highly visible industries, such as airlines, cable and mobile carriers, but the effect is much more widespread than that. Keep in mind that, in many states, legislators earn less than $20,000 per year. It’s easy to see how a little investment can go a long way.

Decreasing Returns To Labor

With businesses facing less competition and a more favorable regulatory environment, which not only lowers costs but raises barriers to new market entrants, it shouldn’t be surprising that the stock market has hit record highs. Ordinarily that would be something to cheer, but evidence suggests that the gains are coming at the expense of the rest of us.

A report from the McKinsey Global Institute finds that labor’s share of income has been declining rapidly since 2000, especially in the United States. This is, of course, due to a number of factors, such as low productivity, automation and globalization. Decreased labor bargaining power due to the increased market power of employers, however, has been shown to play an especially significant role.

At the same time that our wages have been reduced, the prices we pay have increased, especially in education and healthcare. A study from Pew shows that, for most Americans, real wages have hardly budged since 1964. Instead of becoming better off over time, many families are actually doing worse.

The effects of this long-term squeeze have become dire. Increasingly, Americans are dying deaths of despair from things like alcohol abuse, drug overdose, and suicide. Recent research has also shown that the situation has gotten worse during Covid.

We Are Entering A Dangerous Decade

Decades of disruption have left us considerably worse off. Income inequality is at record highs. Anxiety and depression, already at epidemic levels, have worsened during the Covid-19 pandemic. These trends are most acute in the US, but are essentially global in nature and have contributed to the rise of populist authoritarianism around the world.

Things are likely to get worse over the next decade as we undergo profound shifts in technology, resources, migration and demographics. To put that in perspective, a demographic shift alone was enough to make the 1960s a tumultuous era. Clearly, our near future is fraught with danger.

Yet history is not destiny. We have the power to shape our path by making better choices. A good first step would be to finally abandon the cult of disruption that’s served us so poorly and begin to once again invest in stability and resilience, by creating better, safer technology, more competitive and stable markets and a happier, more productive workforce.

Perhaps most of all, we need to internalize the obvious principle that systems and ideologies should serve people, not the other way around. If we increase GDP and the stock market hits record highs, but the population is poorer, less healthy and less happy, then what have we won?

— Article courtesy of the Digital Tonto blog
— Image credit: Pexels

Get More Done

What Matters Most Management (WMMM) is the Key to Success

Most times you’ll see this posed as a question “What matters most?” as people grapple with finding the meaning of life. That is not the case here…

Instead I would like to share with you my simple management philosophy that will help you be more successful in today’s sometimes overwhelming, chaotic world of too many competing demands on your time.

I will help you succeed on a whim! (well, okay a WMMM)

Your success in this case comes from following the whim (or WMMM) of What Matters Most Management. It can be tailored for use in managing your time, a project, etc. For simplicity we’ll look at time management today by popular request (people ask me all the time how I manage to get so much done).

It involves quite simply making a quick inventory of all of the things that you could focus on today, or that you’re being asked to focus on, and identifying three key things:

1. How big of an impact will completing this task have? (Hi/Med/Lo)

2. How big of an effort will it take to complete this task? (Hi/Med/Lo)

3. When will my energy be best for completing this task? (Morning/Afternoon/Evening)

This daily inventory of tasks can be done in your head, or on paper, depending on how detail-oriented you are. After you have your mental or written list, plan your day, prioritizing of course any tasks with a low effort/high impact combination (often very rare).

You will also want to prioritize any tasks that involve getting others to do work. Getting others started on their work sooner rather than later will lead to those tasks getting done faster because they are not sitting in your inbox.
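
For the more detail-oriented, this inventory is simple enough to sketch in code. Below is a minimal illustration in Python, not part of the WMMM method itself: the Task structure, the numeric Hi/Med/Lo scores and the exact sort order are my assumptions about one reasonable way to encode the three questions and the delegate-first rule above.

```python
from dataclasses import dataclass

# Map the Hi/Med/Lo ratings to numbers so tasks can be compared.
SCORE = {"Hi": 3, "Med": 2, "Lo": 1}

@dataclass
class Task:
    name: str
    impact: str              # Hi / Med / Lo
    effort: str              # Hi / Med / Lo
    energy: str              # Morning / Afternoon / Evening
    delegated: bool = False  # involves getting others to do the work

def priority(task: Task):
    # Delegated tasks sort first so others can get started sooner;
    # after that, high impact wins and low effort breaks ties, so the
    # rare low effort/high impact items float to the top.
    return (not task.delegated, -SCORE[task.impact], SCORE[task.effort])

def plan_day(tasks):
    return sorted(tasks, key=priority)

today = [
    Task("Answer routine email", impact="Lo", effort="Lo", energy="Evening"),
    Task("Draft product strategy", impact="Hi", effort="Hi", energy="Morning"),
    Task("Send brief to design agency", impact="Hi", effort="Lo",
         energy="Morning", delegated=True),
]

for task in plan_day(today):
    print(f"{task.name}: {task.impact} impact, {task.effort} effort, "
          f"best done in the {task.energy.lower()}")
```

Run as-is, this prints the delegated brief first, then the high-impact strategy work, and leaves the routine email for last, when the energy rating suggests doing it in the evening anyway.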

Consider also whether it makes sense to start a task you can’t finish today or not. Sometimes there is no advantage to starting something today instead of tomorrow if you’ll end up finishing it tomorrow either way. Other times there will be tasks you need to finish tomorrow that you’ll have to start today to make it work. Going through this exercise is how you’ll identify What Matters Most (WMM).

I find this method suits an organic person like me much better than a rigid system like Franklin Covey. Systems like that also don’t take into account when the ideal time might be to do a certain type of work, based on the composition of your day and your personal energy patterns. Save the somewhat mindless, administrative work for when your brain is tired and do your more creative, intense work when your mind is fresh.

It’s also amazing how frequently the Pareto Principle proves out (the items that deliver 80% of the value often require only 20% of your effort, and vice versa). Focus on the 20% that will drive 80% of the positive perception in the minds of others and of the tangible impact in your life.

The WMMM approach works the same on projects, and can be super powerful when a family, project team, etc. all follow a similar philosophy.

The WMMM approach can also be used by product managers and entrepreneurs to create more successful products and services!

Go ahead! Try it! I think you’ll find that you’ll get more done, and sometimes more importantly, people will notice.

Image credit: earningmoneytoday.com

This article originally appeared on Linkedin


Microsoft’s Latest Vision of Future Productivity

I came across the latest vision of future productivity from Microsoft today and thought I would share it with you, along with a whole series of previous videos from Microsoft looking at the same subject area, ranging from 2009 to 2015. It is interesting to see what has changed and what has stayed the same over those six years in their view of the future.

So, here is Microsoft’s latest vision of future productivity:

And here is a closer-in, more present-oriented view of changes in how people think about technology, collaboration and productivity, from Julia White, General Manager, WW Office Marketing, Microsoft:

(sorry, someone made this video private)

It can also be interesting to see how visions of the future evolve over time, so here is Microsoft’s vision of the future from October 2011:

And their 2009 vision:

Does anything notable jump out that has either worked its way into Microsoft’s vision of the future of productivity, or worked its way out of it?

I’d be curious to hear your thoughts and reactions to this series of videos and where you think things are going in the near term and longer term.

