Category Archives: Government

Uber Economy is Killing Innovation, Prosperity and Entrepreneurship

GUEST POST from Greg Satell

Today, it seems that almost everyone wants to be the “Uber” of something, and why not? With very little capital investment, the company completely disrupted the taxicab industry and attained a market value of over $100 billion. In an earlier era, it would have taken decades to create that kind of impact on a global scale.

Still, we’re not exactly talking about Henry Ford and his Model T here. Or even the Boeing 707 or the IBM 360. Like Uber, those innovations quickly grew to dominance, but they also unleashed incredible productivity. Uber, on the other hand, gushed red ink for more than a decade despite $25 billion invested. In 2021 it lost more than $6 billion; it made progress in 2022 but still lost money, and only in 2023 did it finally turn a profit.

The truth is that we have a major problem and, while Uber didn’t cause it, the company is emblematic of it. Put simply, a market economy runs on innovation. It is only through consistent gains in productivity that we can create real prosperity. The data and evidence strongly suggest that we have failed to do that for the past 50 years. We need to do better.

The Productivity Paradox Writ Large

The 20th century was, for the most part, an era of unprecedented prosperity. The emergence of electricity and internal combustion kicked off a 50-year productivity boom between 1920 and 1970. Yet after that, gains in productivity mysteriously disappeared even as business investment in computing technology increased, causing economist Robert Solow to observe that “You can see the computer age everywhere but in the productivity statistics.”

When the internet emerged in the mid-90s, things improved and everybody assumed that the mystery of the productivity paradox had been resolved. However, after 2004 productivity growth disappeared once again. Today, despite the hype surrounding things such as Web 2.0, the mobile Internet and, most recently, artificial intelligence, productivity continues to slump.

Take a closer look at Uber and you can begin to see why. Compare the $25 billion invested in the ride-sharing company with the $5 billion (worth about $45 billion today) IBM invested to build its System 360 in the early 1960s. The System 360 was considered revolutionary, changed computing forever and dominated the industry for decades.

Uber, on the other hand, launched with no hardware or software that was particularly new or revolutionary. In fact, the company used fairly ordinary technology to disintermediate relatively low-paid taxi dispatchers. The money invested was largely used to fend off would-be competitors through promoting the service and discounting rides.

Maybe the “productivity paradox” isn’t so mysterious after all.

Two Paths To Profitability

Anybody who’s ever taken an Economics 101 course knows that, under conditions of perfect competition, the forces of supply and demand are supposed to drive markets toward equilibrium. It is at this magical point that prices are high enough to attract supply sufficient to satisfy demand, but not any higher.

Unfortunately for anyone running a business, that equilibrium point is the same point at which economic profit disappears. So to make a profit over the long-term, managers need to alter market dynamics either through limiting competition, often through strategies such as rent seeking and regulatory capture, or by creating new markets through innovation.
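
To make the equilibrium logic concrete, here is a minimal sketch with purely illustrative linear demand and supply curves (the numbers and the model are my own assumptions, not from the article). It solves for the market-clearing price and shows why per-unit economic profit vanishes there.

```python
# A toy Econ-101 market, assuming illustrative linear curves:
# demand Qd = a - b*P, supply Qs = c + d*P (all numbers made up).
a, b = 100.0, 2.0   # demand intercept and slope
c, d = 10.0, 1.0    # supply intercept and slope

# Setting Qd = Qs and solving for P gives the equilibrium price:
# a - b*P = c + d*P  =>  P* = (a - c) / (b + d)
p_star = (a - c) / (b + d)
q_star = a - b * p_star
print(f"equilibrium price: {p_star:.2f}, quantity: {q_star:.2f}")

# Under perfect competition, price is driven down to marginal cost,
# so per-unit economic profit at equilibrium is zero.
marginal_cost = p_star
print(f"economic profit per unit: {p_star - marginal_cost:.2f}")
```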

As should be clear by now, the digital revolution has been relatively ineffective at creating meaningful innovation. Economists Daron Acemoglu and Pascual Restrepo refer to technologies like Uber, as well as things like automated customer service, as “so-so technologies,” because they displace workers without significantly increasing productivity.

As Joseph Schumpeter pointed out long ago, market economies need innovation to fuel prosperity. Without meaningful innovation, managers are left with only strategies that limit competition, undermine markets and impoverish society, which is largely what seems to have happened over the past few decades.

The Silicon Valley Doomsday Machine

The arrogance of Silicon Valley entrepreneurs seems so outrageous, and so childishly naive, that it is scarcely believable. How could an industry that has produced so little in terms of productivity be so sure that it has been “changing the world” for the better? And how has it made so much money?

The answer lies in something called increasing returns. As it turns out, under certain conditions, namely high up-front investment, negligible marginal costs, network effects and “winner-take-all markets,” the normal laws of economics can be somewhat suspended. In these conditions, it makes sense to pump as much money as possible into an early Amazon, Google or Facebook.
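
To see how these conditions can suspend the normal rules, consider a toy simulation (my own illustration, not from the article) in the spirit of a Polya urn: each new user joins one of two platforms with probability proportional to its current user base, a crude stand-in for network effects.

```python
# A toy winner-take-all market, assuming a simple Polya-urn model of
# network effects (illustrative only): each new user picks a platform
# with probability proportional to its existing user base.
import random

def simulate(users: int = 100_000) -> tuple[int, int]:
    a, b = 1, 1  # both platforms start with a single user
    for _ in range(users):
        if random.random() < a / (a + b):
            a += 1  # platform A's lead makes it more attractive
        else:
            b += 1
    return a, b

for trial in range(5):
    random.seed(trial)
    a, b = simulate()
    print(f"trial {trial}: A ends with {a / (a + b):.0%} of users")

# Final shares vary wildly from run to run and are often lopsided:
# early luck compounds, which is why pouring money into an early
# Amazon, Google or Facebook can pay off so spectacularly.
```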

However, this seemingly happy story has a few important downsides. First, to a large extent these technologies do not create new markets so much as they disrupt or displace old ones, which is one reason why productivity gains are so meager. Second, the conditions apply to a small set of products, namely software and consumer gadgets, which makes the Silicon Valley model a bad fit for many groundbreaking technologies.

Still, if the perception is that you can make a business viable by pumping a lot of cash into it, you can actually crowd out a lot of good businesses with bad, albeit well-funded, ones. In fact, there is increasing evidence that this is exactly what is happening. Rather than an engine of prosperity, Silicon Valley is increasingly looking like a doomsday machine.

Returning To An Innovation Economy

Clearly, we cannot continue “Ubering” ourselves to death. We must return to an economy fueled by innovation, rather than disruption, which produces the kind of prosperity that lifts all boats, rather than outsized profits for a meager few. It is clearly in our power to do that, but we must begin to make better choices.

First, we need to recognize that innovation is something that people do, but instead of investing in human capital, we are actively undermining it. In the US, food insecurity has become an epidemic on college campuses. To make matters worse, the cost of college has created a student debt crisis, essentially condemning our best and brightest to decades of indentured servitude. To add insult to injury, healthcare costs continue to soar. Should we be at all surprised that entrepreneurship is in decline?

Second, we need to rebuild scientific capital. As Vannevar Bush once put it, “There must be a stream of new scientific knowledge to turn the wheels of private and public enterprise.” To take just one example, it is estimated that the $3.8 billion invested in the Human Genome Project generated nearly $800 billion of economic activity as of 2011. Clearly, we need to renew our commitment to basic research.

Finally, we need to rededicate ourselves to free and fair markets. In the United States, by almost every metric imaginable (industry concentration, occupational licensing, higher prices, lower wages, or whatever else you want to look at), capitalism has been weakened by poor regulation and oversight. Not surprisingly, innovation has suffered.

Perhaps most importantly, we need to shift our focus from disrupting markets to creating them, from “The Hacker Way” to tackling grand challenges, and from a reductionist approach to an economy based on dignity and well-being. Make no mistake: the “Uber Economy” is not the solution, it’s the problem.

— Article courtesy of the Digital Tonto blog
— Image credits: Pixabay

We Are Starving Our Innovation Economy

GUEST POST from Greg Satell

The Cold War was fundamentally different from any conflict in history. It was, to be sure, fought less over land, blood and treasure than over ideas. Communist countries believed that their ideology would prevail. They were wrong. The Berlin Wall fell and capitalism, it seemed, was triumphant.

Today, however, capitalism is in real trouble. Besides the threat of a rising China, the system seems to be crumbling from within. Income inequality in developed countries is at 50-year highs. In the US, the bastion of capitalism, markets have weakened by almost every imaginable metric. This wasn’t what we imagined winning would look like.

Yet we can’t blame capitalism. The truth is that its earliest thinkers warned about the potential for excesses that lead to market failure. The fact is that we did this to ourselves. We believed that we could blindly leave our fates to market and technological forces. We were wrong. Prosperity doesn’t happen by itself. We need to invest in an innovation economy.

Capitalism’s (Seemingly) Fatal Contradiction

Anyone who’s taken an “Economics 101” course knows about Adam Smith and his invisible hand. Essentially, the forces of self-interest, by their very nature, work to identify the optimal price that attracts just enough supply of a particular good or service to satisfy demand. This magical equilibrium point creates prosperity through an optimal use of resources.

However, some argued that the story wasn’t necessarily a happy one. After all, equilibrium implies a lack of economic profit and certainly businesses would want to do better than that. They would seek to gain a competitive advantage and, in doing so, create surplus value, which would then be appropriated to accumulate power to rig the system further in their favor.

Indeed, Adam Smith himself was aware of this danger. “People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices,” he wrote. In fact, the preservation of free markets was a major concern that ran throughout his work.

Yet as the economist Joseph Schumpeter pointed out, with innovation the contradiction dissipates. As long as we have creative destruction, market equilibriums are constantly shifting and don’t require capitalists to employ extractive, anti-competitive practices in order to earn excellent profits.

Two Paths To Profit

Anyone who manages a business must pursue at least one of two paths to profit. The first is to innovate. By identifying and solving problems in a competitive marketplace, firms can find new ways to create, deliver and capture value. Everybody wins.

Google’s search engine improved our lives in countless ways. Amazon and Walmart have dramatically improved distribution of goods throughout the economy, making it possible for us to pay less and get more. Pfizer and Moderna invested in an unproven technology that uses mRNA to deliver life-saving molecules and saved us from a deadly pandemic.

Still, the truth is that the business reality is not “innovate or die,” but rather “innovate or find ways to reduce competition.” There are some positive ways to tilt the playing field, such as building a strong brand or specializing in some niche market. However, other strategies are not so innocent. They seek to profit by imposing costs on the rest of us.

The first, called rent seeking, involves businesses increasing profits by getting legislation passed in their favor, as when car dealerships in New Jersey moved to block Tesla’s direct sales model. The second, regulatory capture, seeks to co-opt agencies that are supposed to govern industry, resulting in favorable implementation and enforcement of the legal code.

Why “Pro-Business” Often Means Anti-Market

Corporations lobby federal, state and local governments to advance their interests and there’s nothing wrong with that. Elected officials should be responsive to their constituents’ concerns. That is, after all, how democracy is supposed to work. However, very often business interests try to maintain that they are arguing for the public good rather than their own.

Consider the issue of a minimum wage. Businesses argue that government regulation of wages is an imposition on the free market and that, given the magical forces of the invisible hand, letting the market set the price for wages would produce optimal outcomes. Artificially increasing wages, on the other hand, would unduly raise prices on the public and reduce profits needed to invest in competitiveness.

This line of argument is nothing new, of course. In fact, Adam Smith addressed it in The Wealth of Nations nearly 250 years ago:

Our merchants and master-manufacturers complain much of the bad effects of high wages in raising the price, and thereby lessening the sale of their goods both at home and abroad. They say nothing concerning the bad effects of high profits. They are silent with regard to the pernicious effects of their own gains. They complain only of those of other people.

At the same time, corporations have themselves been undermining the free market for wages through the abuse of non-compete agreements. Incredibly, 38% of American workers have signed some form of non-compete agreement. Of course, most of these are illegal and wouldn’t hold up in court, but they serve to intimidate employees, especially low-wage workers.

That’s just for starters. Everywhere you look, free markets are under attack. Occupational licensing, often the result of lobbying by trade associations, has increased five-fold since the 1950s. Antitrust regulation has become virtually nonexistent, while competition has been reduced in the vast majority of American industries.

Perhaps not surprisingly, while all this lobbying has been going on, recent decades have seen business investment and innovation decline, and productivity growth falter while new business formation has fallen by 50%. Corporate profits, on the other hand, are at record highs.

Getting Back On Track

At the end of World War II, America made important investments to create the world’s greatest innovation economy. The GI Bill made what is perhaps the biggest investment ever in human capital, sending millions to college and creating a new middle class. Investments in institutions such as the National Science Foundation (NSF) and the National Institutes of Health (NIH) would create scientific capital that would fuel US industry.

Unfortunately, we abandoned that very successful playbook. College tuition in the US has roughly doubled over the past 20 years. Perhaps not surprisingly, we’ve fallen to ninth among OECD countries for post-secondary education. Those who do graduate are often forced into decades of indentured servitude in the form of student loans.

At the same time, government investment in research as a percentage of GDP has been declining for decades, limiting our ability to produce the kinds of breakthrough discoveries that lead to exciting new industries. What passes for innovation these days displaces workers, but does not lead to significant productivity gains. Legislation designed to rectify the situation and increase our competitiveness stalled in the Senate.

So after 250 years, capitalism remains pretty much as Adam Smith first conceived it: powerful yet fragile, always at risk of being undermined and corrupted by the same animal spirits it depends on to set prices efficiently. He never wrote, nor is there any indication he ever intended, that markets should be left to their own devices. In fact, he and others warned us that markets need to be actively promoted and protected.

We are free to choose. We need to choose more wisely.

— Article courtesy of the Digital Tonto blog
— Image credits: Microsoft CoPilot

Department Of Energy Programs Helping to Create an American Manufacturing Future

GUEST POST from Greg Satell

In the recession that followed the dotcom crash in 2000, the United States lost five million manufacturing jobs and, while there has been an uptick in recent years, all indications are that they may never be coming back. Manufacturing, perhaps more than any other sector, relies on deep networks of skills and assets that tend to be highly regional.

The consequences of this loss are deep and pervasive. Losing a significant portion of our manufacturing base has led not only to economic vulnerability, but also to political polarization. Clearly, it is important to rebuild our manufacturing base. But to do that, we need to focus on new, more advanced technologies.

That’s the mission of the Advanced Manufacturing Office (AMO) at the Department of Energy. By providing a crucial link between the cutting-edge science done at the National Labs and private industry, it has been able to make considerable progress. As the collaboration between government scientists and private industry widens and deepens over time, US manufacturing may well be revived.

Linking Advanced Research To Private Industry

The origins of the Department of Energy date back to the Manhattan Project during World War II. The immense project was, in many respects, the start of “big science.” Hundreds of top researchers, used to working in small labs, traveled to newly established outposts to collaborate at places like Los Alamos, New Mexico and Oak Ridge, Tennessee.

After the war was over, the facilities continued their work and similar research centers were established to expand the effort. These National Labs became the backbone of the US government’s internal research efforts. In 1977, the National Labs, along with a number of other programs, were combined to form the Department of Energy.

One of the core missions of the AMO is to link the research done at the National Labs to private industry, and the Lab Embedded Entrepreneurship Programs (LEEP) have been particularly successful in this regard. Currently, there are four such programs: Cyclotron Road, Chain Reaction Innovations, West Gate and Innovation Crossroads.

I was able to visit Innovation Crossroads at Oak Ridge National Laboratory and meet the entrepreneurs in its current cohort. Each is working to transform a breakthrough discovery into a market-changing application, yet, due to technical risk, would not be able to attract funding in the private sector. The LEEP program offers a small amount of seed money, access to lab facilities and scientific and entrepreneurial mentorship to help them get off the ground.

That’s just one of the ways that the AMO opens up the resources of the National Labs. It also helps businesses get access to supercomputing resources (5 of the 10 fastest computers in the world are located in the United States, most of them at the National Labs) and conducts early-stage research to benefit private industry.

Leading Public-Private Consortia

Another area in which the AMO supports private industry is by taking a leading role in consortia, such as the Manufacturing Institutes that were set up to give American companies a leg up in advanced areas such as clean energy, composite materials and chemical process intensification.

The idea behind these consortia is to create hubs that provide a critical link with government labs, top scientists at academic universities and private companies looking to solve real-world problems. It both helps firms advance in key areas and allows researchers to focus their work on where they will have the greatest possible impact.

For example, the Critical Materials Institute (CMI) was set up to develop alternatives to materials that are subject to supply disruptions, such as the rare earth elements that are critical to many high tech products and are largely produced in China. A few years ago it developed, along with several National Labs and Eck Industries, an advanced alloy that can replace more costly materials in components of advanced vehicles and aircraft.

“We went from an idea on a whiteboard to a profitable product in less than two years and turned what was a waste product into a valuable asset,” Robert Ivester, Director of the Advanced Manufacturing Office told me.

Technology Assistance Partnerships

In 2011, the International Organization for Standardization released its ISO 50001 guidelines. Like previous guidelines that focused on quality management and environmental impact, ISO 50001 recommends best practices to reduce energy use. These can benefit businesses through lower costs and result in higher margins.

Still, for harried executives facing cutthroat competition and demanding customers, figuring out how to implement new standards can easily get lost in the mix. So a third key role that the AMO plays is to assist companies who wish to implement new standards by providing tools, guides and access to professional expertise.

The AMO offers similar support for a number of critical areas, such as prototype development and also provides energy assessment centers for firms that want to reduce costs. “Helping American companies adopt new technology and standards helps keep American manufacturers on the cutting edge,” Ivester says.

“Spinning In” Rather Than Spinning Out

Traditionally, we think of the role of government in business largely in terms of regulation. Legislatures pass laws and watchdog agencies enforce them so that we can have confidence in the food we eat, the products we buy and the medicines that are supposed to cure us. While that is clearly important, we often overlook how government can help drive innovation.

Inventions spun out of government labs include the Internet, GPS and laser scanners, just to name a few. Many of our most important drugs were also originally developed with government funding. Still, traditionally the work has mostly been done in isolation and only later offered to private companies through licensing agreements.

What makes the Advanced Manufacturing Office different from most scientific programs is that it is more focused on “spinning in” private industry than on spinning out technologies. That enables executives and entrepreneurs with innovative ideas to develop them with some of the best minds and most advanced equipment in the world.

As Ivester put it to me, “Spinning out technologies is something that the Department of Energy has traditionally done. Increasingly, we want to spin ideas from industry into our labs, so that companies and entrepreneurs can benefit from the resources we have here. It also helps keep our scientists in touch with market needs and helps guide their research.”

Make no mistake, innovation needs collaboration. Combining ideas from the private sector with cutting-edge science from government labs can help American manufacturing compete in the 21st century.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay

Simple Innovations Sometimes Are the Best

by Braden Kelley

Innovations don’t have to be complicated to be impactful. They just need to deliver enough additional value that existing solutions become widely replaced or, to flip it around, that the new solution becomes widely adopted.

Recently I have been seeing a new simple, yet elegant, solution driving around the streets of Seattle.

It’s pictured in the photo above, and it is quite simply a temporary license plate for a newly purchased vehicle that can be printed and installed in a license plate holder, the same way the eventual traditional license plate will be.

Now, perhaps your state or country already has this, but for me, every vehicle I have ever purchased was instantly defiled by a piece of paper and tape, or tape residue that could be difficult to remove after a couple of months baking in the sun (especially in the summer).

This instant cheapening of a brand new vehicle is now a thing of the past!

Some may say that this is not really that big of a deal because you’re just moving the temporary registration from the back window to the license plate frame, but there are several tangible benefits for multiple parties from this seemingly small change:

  1. Car Owner – improved aesthetics – the car just looks better!
  2. Car Owner – improved safety from increased visibility while driving
  3. State and Car Owner – increased toll revenue so everyone is paying their fair share
  4. Car Owner – improved safety – easier to identify hit and run drivers
  5. Police – improved safety – easier to identify vehicle during traffic stops
  6. Car Owner – improved convenience – easier to quickly find license number when it’s requested

What is your favorite simple innovation that you’ve seen or experienced recently?

Is China Our New Sputnik Moment?

GUEST POST from Greg Satell

When the Soviets launched Sputnik, the first space satellite, into orbit in 1957, it was a wake-up call for America. Over the next year, President Eisenhower would sign the National Defense Education Act to spur science education, increase funding for research and establish NASA and DARPA to spur innovation.

A few years ago, a report by the Council on Foreign Relations (CFR) argued that we are at a similar point today, but with China. While we have been steadily decreasing federal investment in R&D over the past few decades, our Asian rival has been ramping up and now threatens our leadership in key technologies such as AI, genomics and quantum information technology.

Clearly, we need to increase our commitment to science and innovation and that means increasing financial investment. However, what the report makes clear is that money alone won’t solve the problem. We are, in several important ways, actually undermining our ability to innovate, now and in the future. We need to renew our culture of innovation in America.

Educating And Attracting Talent

The foundation of an innovation economy is education, especially in STEM subjects. Historically, America has been the world’s best educated workforce, but more recently we’ve fallen to fifth among OECD countries for post-secondary education. That’s alarming and something we will certainly need to reverse if we are to compete effectively.

Our educational descent can be attributed to three major causes. First, the rest of the world has become more educated, so the competition has become stiffer. Second is financing. Tuition has nearly tripled in the last decade and student debt has become so onerous that it now takes about 20 years to pay off four years of college. Third, we need to work harder to attract talented people to the United States.

The CFR report recommends developing a “21st century National Defense Education Act” to create scholarships in STEM areas and making it easier for foreign students to get Green Cards when they graduate from our universities. It also points out that we need to work harder to attract foreign talent, especially in high impact areas like AI, genomics and quantum computing.

Unfortunately, we seem to be going the other way. The number of international students at American universities is declining. Policies like the Muslim ban and concerns about gun violence are deterring scientific talent from coming here. The denial rate for those on H-1B visas increased from 4% in 2016 to 18% in the first quarter of 2019.

Throughout our history, it has been our openness to new people and new ideas that has made America exceptional. It’s a legitimate question whether that’s still true.

Building Technology Ecosystems

In the 1980s, the US semiconductor industry was on the ropes. Due to increased competition from low-cost Japanese manufacturers, American market share in the DRAM market fell from 70% to 20%. The situation not only had a significant economic impact, there were also important national security implications.

The federal government responded with two initiatives, the Semiconductor Research Corporation and SEMATECH, both of which were nonprofit consortiums that involved government, academia and industry. By the 1990s, American semiconductor manufacturers were thriving again.

Today, we have similar challenges with rare earth elements, battery technology and many manufacturing areas. The Obama administration responded by building similar consortiums to those that were established for semiconductors: The Critical Materials Institute for rare earth elements, JCESR for advanced batteries and the 14 separate Manufacturing Institutes.

Yet here again, we seem to be backsliding. The current administration has sought to slash funding for the Manufacturing Extension Partnership that supports small and medium sized producers. An addendum to the CFR report also points out that the administration has pushed for a 30% cut in funding for the national labs, which support much of the advanced science critical to driving American technology forward.

Supporting International Trade and Alliances

Another historical strength of the US economy has been our open approach to trade. The CFR report points out that our role as a “central node in a global network of research and development,” gave us numerous advantages, such as access to foreign talent at R&D centers overseas, investment into US industry and cooperative responses to global challenges.

However, the report warns that “the Trump administration’s indiscriminate use of tariffs against China, as well as partners and allies, will harm U.S. innovative capabilities.” It also faults the Trump administration for pulling out of the Trans-Pacific Partnership trade agreement, which would have bolstered our relationship with Asian partners and increased our leverage over China.

The tariffs undermine American industry in two ways. First, because many of the tariffs are on intermediate goods which US firms use to make products for export, we’re undermining our own competitive position, especially in manufacturing. Second, because trade partners such as Canada and the EU have retaliated against our tariffs, our position is weakened further.

Clearly, we compete in an ecosystem driven world in which power does not come from the top, but emanates from the center. Traditionally, America has positioned itself at the center of ecosystems by constantly connecting out. Now that process seems to have reversed itself and we are extremely vulnerable to others, such as China, filling the void.

We Need to Stop Killing Innovation in America

The CFR report, whose task force included such luminaries as Admiral William McRaven, former Google CEO Eric Schmidt and economist Laura Tyson, should set alarm bells ringing. Although the report was focused on national security issues, it pertains to general competitiveness just as well and the picture it paints is fairly bleak.

After World War II, America stood almost alone in the world in terms of production capacity. Through smart policy, we were able to transform that initial advantage into long-term technological superiority. Today, however, we have stiff competition in areas ranging from AI to synthetic biology to quantum systems.

At the same time, we seem to be doing everything we can to kill innovation in America. Instead of working to educate and attract the world’s best talent, we’re making it harder for Americans to attain higher education and for top foreign talent to come and work here. Instead of ramping up our science and technology programs, presidential budgets regularly recommend cutting them. Instead of pulling our allies closer, we are pushing them away.

To be clear, America is still at the forefront of science and technology, vying for leadership in every conceivable area. However, as global competition heats up and we need to be redoubling our efforts, we seem to be doing just the opposite. The truth is that our prosperity is not a birthright to which we are entitled, but a legacy that must be lived up to.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Elevating the Importance of Construction and Manufacturing

GUEST POST from Mike Shipulski

Restaurants aren’t open as much as they used to be because they cannot hire enough people to do the work. Simply put, there are too few people who want to take the orders, cook the food, deliver food to the tables, clear the tables and wash the dishes. Sure, it’s an inconvenience that we can’t get a table, but because there are other ways to get food, no one will starve just because restaurants aren’t open as much. And while some restaurants will go out of business, this situation doesn’t fundamentally constrain the economy.

And the situation is similar with manufacturing and construction: no one wants those jobs either. But that’s where the similarities end. The shortfall of people willing to work in manufacturing and construction will constrain the economy and prevent the renewal of our infrastructure. Gone are the days of relying on other countries to make all your products, because we now know it’s not the most cost-effective way to go. But if there is no one willing to make the products, there will be no products made. And if there is no one willing to build the roads and bridges, the roads and bridges will suffer. And if there are no products, no good roads and no safe bridges, there can be no strong economy.

While there is disagreement around why people don’t want to work in manufacturing and construction, I will propose three reasons for your consideration.

Firstly, the manufacturing and construction sectors have an image problem. People don’t see these jobs as high-tech, high-status jobs where the working environment is clean and safe. In short, people don’t see these jobs as jobs they can be proud of, and they don’t think others will think highly of them if they say they work in manufacturing or construction. And because of the history of layoffs, people don’t see these jobs as secure and predictable or as reliable sources of income. This may not be the case for all people, but I think it applies to a lot of people.

Secondly, the manufacturing and construction sectors don’t pay enough. People don’t see these jobs as viable mechanisms to provide a solid standard of living for themselves and their families. This is a generalization, but I think it holds true.

Thirdly, the manufacturing and construction sectors require specialized knowledge, skills, and abilities that are not taught in traditional high schools or colleges. And without these qualifications, people are reluctant to apply. And if they do apply and a company hires them even though they don’t have the knowledge, skills, and abilities, the company must invest in training, which creates a significant cost hurdle.

So, what are we to do?

To improve their image, the manufacturing and construction trade organizations and professional societies can come together and create a coordinated education program to change what people think about their industries. And states can help by educating their citizens on the importance of manufacturing and construction to the health of the states’ economies. This will be a long road, but I think it’s time to start.

To attract new talent, the manufacturing and construction sectors must pay a higher wage. In the short term, profits may be reduced, but imagine how much profits will be reduced if there are no people to build the products or fix the bridges. And over the long term, with improved business processes and working methods, profits will grow.

To train people to work in manufacturing and construction, we can reinstitute the Training Within Industry program of the 1940s. The Manufacturing Extension Partnership programs within the states can be a center of mass for this work along with the Construction Industry Institute and other construction trade organizations.

It’s time to join forces to make this happen.

Image credit: Pixabay

Our Fear of China is Overblown

GUEST POST from Greg Satell

The rise of China over the last 40 years has been one of history’s great economic miracles. According to the World Bank, since it began opening up its economy in 1979, China’s GDP has grown from a paltry $178 billion to a massive $13.6 trillion. At the same time, research by McKinsey shows that its middle class is expanding rapidly.
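
For a sense of scale, a quick calculation using the World Bank figures quoted above (and assuming the endpoint is 2018, when GDP was about $13.6 trillion; the span is my assumption) shows what kind of growth rate that implies.

```python
# Compound annual growth rate implied by the World Bank figures above:
# China's GDP grew from ~$178 billion (1979) to ~$13.6 trillion (2018).
start, end, years = 178e9, 13.6e12, 39  # 1979 -> 2018 (assumed span)

cagr = (end / start) ** (1 / years) - 1
print(f"implied compound annual growth: {cagr:.1%}")  # ~11.8%

# For comparison, an economy growing at a typical mature-economy rate
# of 2% a year would roughly double, not grow ~76-fold, over the span.
doubling = 1.02 ** years
print(f"2% growth over {years} years: {doubling:.1f}x")
```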

What’s more, it seems like the Asian giant is just getting started. China has become increasingly dominant in scientific research and has embarked on two major initiatives: Made in China 2025, which aims to make it the leading power in 10 emerging industries, and a massive Belt and Road infrastructure initiative that seeks to shore up its power throughout Asia.

Many predict that China will dominate the 21st century in much the same way that America dominated the 20th. Yet I’m not so sure. First, American dominance was due to an unusual confluence of forces unlikely to be repeated. Second, China has weaknesses—and we have strengths—that aren’t immediately obvious. We need to be clear headed about China’s rise.

The Making of an American Century

America wasn’t always a technological superpower. In fact, at the turn of the 20th century, much like China at the beginning of this century, the United States was largely a backwater. Still mostly an agrarian nation, the US lacked the industrial base and intellectual heft of Europe. Bright young students would often need to go overseas for advanced degrees. With no central bank, financial panics were common.

Yet all that changed quickly. Industrialists like Thomas Edison and Henry Ford put the United States at the forefront of the two most important technologies of the time, electricity and internal combustion. Great fortunes produced by a rising economy endowed great educational institutions. In 1913 the Federal Reserve Act was passed, finally bringing financial stability to a growing nation. By the 1920s, much like China today, America had emerged as a major world power.

Immigration also played a role. Throughout the early 1900s immigrants coming to America provided enormous entrepreneurial energy as well as cheap labor. With the rise of fascism in the 1930s, our openness to new people and new ideas attracted many of the world’s greatest scientists to our shores and created a massive brain drain in Europe.

At the end of World War II, the United States was the only major power left with its industrial base still intact. We seized the moment wisely, using the Marshall Plan to rebuild our allies and creating scientific institutions, such as the National Science Foundation (NSF) and the National Institutes of Health (NIH) that fueled our technological and economic dominance for the rest of the century.

There are many parallels between the 1920s and the historical moment of today, but there are also many important differences. It was a number of forces, including our geography, two massive world wars, our openness as a culture and a number of wise policy choices that led to America’s dominance. Some of these factors can be replicated, but others cannot.

MITI and the Rise of Japan

Long before China loomed as a supposed threat to American prosperity and dominance, Japan was considered to be a chief economic rival. Throughout the 1970s and 80s, Japanese firms came to lead in many key industries, such as automobiles, electronics and semiconductors. The United States, by comparison, seemed feckless and unable to compete.

Key to Japan’s rise was a long-term industrial policy. The Ministry of International Trade and Industry (MITI) directed investment and funded research that fueled an economic miracle. Compared to America’s haphazard policies, Japan’s deliberate and thoughtful strategy seemed like a decidedly more rational and wiser model.

Yet before long things began to unravel. While Japan continued to perform well in many of the industries and technologies that the MITI focused on, it completely missed out on new technologies, such as minicomputers and workstations in the 1980s and personal computers in the 1990s. As MITI continued to support failing industries, growth slowed and debt piled up, leading to a lost decade of economic malaise.

At the same time, innovative government policy in the US also helped turn the tide. For example, in 1987 a non-profit consortium made up of government labs, research universities and private sector companies, called SEMATECH, was created to regain competitiveness in the semiconductor industry. America soon retook the lead, which continues even today.

China 2025 and the Belt and Road Initiative

While the parallels with America in the 1920s underline China’s potential, Japan’s experience in the 1970s and 80s highlight its peril. Much like Japan, it is centralizing decision-making around a relatively small number of bureaucrats and focusing on a relatively small number of industries and technologies.

Much like Japan back then, China seems wise and rational. Certainly, the technologies it is targeting, such as artificial intelligence, electric cars and robotics would be on anybody’s list of critical technologies for the future. The problem is that the future always surprises us. What seems clear and obvious today may look ridiculous and naive a decade from now.

To understand the problem, consider quantum computing, which China is investing heavily in. However, the technology is far from monolithic. In fact, there are a wide variety of approaches being championed by different firms, such as IBM, Microsoft, Google, Intel and others. Clearly, some of these firms are going to be right and some will be wrong.

The American firms that get it wrong will fail, but others will surely succeed. In China, however, the ones that get it wrong will likely be government bureaucrats who will have the power to prop up state supported firms indefinitely. Debt will pile up and competitiveness will decrease, much like it did in Japan in the 1990s.

This is, of course, speculation. However, there are indications that it is already happening. A recent bike sharing bubble has ignited concerns that similar over-investment is happening in artificial intelligence. Many investors have also become concerned that China’s slowing economy will be unable to support its massive debt load.

The Path Forward

The rise of China presents a generational challenge. Clearly, we cannot ignore a rising power, yet we shouldn’t overreact either. While many have tried to cast China as a bad actor, engaging in intellectual theft, currency manipulation and other unfair trade policies, others point out that it is wisely investing for the long-term while the US manages by the quarter.

Interestingly, as Fareed Zakaria recently pointed out, the same accusations made about China’s unfair trade policies today were leveled at Japan 40 years ago. In retrospect, however, our fears about Japan seem almost quaint. Not only were we not crushed by Japan’s rise, we are clearly better for it, incorporating Japanese ideas like lean manufacturing and combining them with our own innovations.

I suspect, or at least I hope, that we will benefit from China’s rise much as we did from Japan’s. We will learn from its innovations and be inspired to develop more of our own. If a Chinese scientist invents a cure for cancer, American lives will be saved. If an American scientist invents a better solar panel, fewer Chinese will be choking on smog.

Perhaps most of all, we need to remember that what made the 20th Century the American Century was our ability to rise to the challenges that history presented. Whether it was rebuilding Europe in the 40s and 50s, or Sputnik in the 50s and 60s or Japan in the 70s and 80s, competition always brought out the best in us. Then, as now, our destiny was our own to determine.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Four Ways Governments Can Accelerate the Digital Transformation of Their Economies

GUEST POST from Art Inteligencia

In today’s digital world, governments have a critical role to play in accelerating digital transformation. As technology continues to evolve, governments must find ways to embrace and apply new technologies, while also ensuring that their citizens have access to the most advanced digital services.

To ensure success, there are several key steps that the government should take.

1. Governments Should Invest in Digital Infrastructure

By investing in the infrastructure necessary to support digital transformation, the government can create a platform for innovation and adoption of new technologies. This includes things like high-speed broadband, 5G networks, and cloud computing capabilities.

2. Governments Should Provide Incentives to Spur Digital Adoption

This could come in the form of tax breaks, grants, and other incentives to organizations that are investing in digital transformation. This will help create a climate of investment and innovation, which will in turn help accelerate the transformation process.

3. Governments Should Create a Supportive Regulatory Environment

This means creating laws and regulations that are conducive to digital transformation, such as data privacy and security laws. This will help ensure that organizations can safely and securely adopt new technologies and services.

4. Governments Should Invest in Digital Literacy and Education

By investing in digital literacy and education, the government can ensure that citizens have the tools and knowledge necessary to take advantage of the digital transformation. This can include programs such as coding boot camps and digital literacy courses for adults.

Conclusion

By taking these steps, the government can create an environment that is conducive to digital transformation and help accelerate the process. In doing so, the government can ensure that its citizens have access to the most advanced digital services and technologies, and that organizations can take advantage of the opportunities that come with digital transformation.

Image credit: Pixabay

3 Things Politicians Can Do to Create Innovation

GUEST POST from Greg Satell

In the 1960s, the federal government accounted for more than 60% of all research funding, yet by 2016 that had fallen to just over 20%. During the same time, businesses’ share of R&D investment more than doubled from about 30% to almost 70%. Government’s role in US innovation, it seems, has greatly diminished.

Yet new research suggests that the opposite is actually true. Analyzing all patents since 1926, researchers found that the number of patents that relied on government support has risen from 12% in the 1980s to almost 30% today. Interestingly, the same research found that startups benefitted the most from government research.

As we struggle to improve productivity from historical lows, we need the public sector to play a part. The truth is that the government has a unique role to play in driving innovation and research is only part of it. In addition to funding labs and scientists, it can help bring new ideas to market, act as a convening force and offer crucial expertise to private businesses.

1. Treat Knowledge As A Public Good

By 1941, it had become clear that the war raging in Europe would soon envelop the US. With this in mind, Vannevar Bush went to President Roosevelt with a visionary idea — to mobilize the nation’s growing scientific prowess for the war effort. Roosevelt agreed and signed an executive order that would create the Office of Scientific Research and Development (OSRD).

With little time to build labs, the OSRD focused on awarding grants to private organizations such as universities. It was, by all accounts, an enormous success and led to important breakthroughs such as the atomic bomb, the proximity fuze and radar. As the war was winding down, Roosevelt asked Bush to write a report on how to continue OSRD’s success in peacetime.

That report, titled Science, The Endless Frontier, was delivered to President Truman and would set the stage for America’s lasting technological dominance. It set forth a new vision in which scientific advancement would be treated as a public good, financed by the government, but made available for private industry. As Bush explained:

Basic research leads to new knowledge. It provides scientific capital. It creates the fund from which the practical applications of knowledge must be drawn. New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science.

The influence of Bush’s idea cannot be overstated. It led to the creation of new government agencies, such as the National Science Foundation (NSF), the National Institutes of Health (NIH) and, later, the Defense Advanced Research Projects Agency (DARPA). These helped to create a scientific infrastructure that has no equal anywhere in the world.

2. Help to Overcome the Valley of Death

Government has a unique role to play in basic research. Because fundamental discoveries are, almost by definition, widely applicable, they are much more valuable if they are published openly. At the same time, because private firms have relatively narrow interests, they are less able to fully leverage basic discoveries.

However, many assume that because basic research is a primary role for public investment, it is the public sector’s only relevant function. Clearly, that’s not the case. Another important role government has to play is helping to overcome the gap between the discovery of a new technology and its commercialization, which is so fraught with peril that it’s often called the “Valley of Death.”

The oldest and best known of these initiatives is the SBIR/STTR program, which is designed to help startups commercialize cutting-edge research. Grants are given in two phases. In the first, a proof-of-concept phase, grants are capped at $150,000. If that’s successful, up to $1 million more can be awarded. Some SBIR/STTR companies, such as Qualcomm, iRobot and Symantec, have become industry leaders.

Other more focused programs have also been established. ARPA-e focuses exclusively on advanced energy technologies. Lab Embedded Entrepreneurship Programs (LEEP) give entrepreneurs access to the facilities and expertise of the National Labs in addition to a small grant. The Manufacturing Extension Program (MEP) helps smaller companies build the skills they need to be globally competitive.

3. Act As a Convening Force

A third role government can play is that of a convening force. For example, in 1987 a non-profit consortium made up of government labs, research universities and private sector companies, called SEMATECH, was created to regain competitiveness in the semiconductor industry. America soon regained its lead, which continues even today.

The reason that SEMATECH was so successful was that it combined the scientific expertise of the country’s top labs with the private sector’s experience in solving real world problems. It also sent a strong signal that the federal government saw the technology as important, which encouraged private companies to step up their investment as well.

Today, a number of new initiatives have been launched that follow a similar model. The most wide-ranging is the Manufacturing USA Institutes, which are helping drive advancement in everything from robotics and photonics to biofabrication and composite materials. Others, such as JCESR and the Critical Materials Institute, are more narrowly focused.

Much like its role in supporting basic science and helping new technologies get through the “Valley of Death,” acting as a convening force is something that, for the most part, only the federal government can do.

Make No Mistake: This Is Our New Sputnik Moment

In the 20th century, three key technologies (electricity, internal combustion and computing) drove economic advancement, and the United States led in each one. That is why it is often called the “American Century.” No country, perhaps since the Roman Empire, has ever so thoroughly dominated the known world.

Yet the 21st century will be different. The most important technologies will be things like synthetic biology, materials science and artificial intelligence. These are largely nascent and it’s still not clear who, if anybody, will emerge as a clear leader. It is very possible that we will compete economically and technologically with China, much like we used to compete politically and militarily with the Soviet Union.

Yet back in the Cold War, it was obvious that the public sector had an important role to play. When Kennedy vowed to go to the moon, nobody argued that the effort should be privatized. It was clear that such an enormous undertaking needed government leadership at the highest levels. We pulled together and we won.

Today, by all indications, we are at a new Sputnik moment in which our global scientific and technological leadership is being seriously challenged. We can respond with imagination, creating novel ways to, as Bush put it, “turn the wheels of private and public enterprise,” or we can let the moment pass us by and let the next generation face the consequences.

One thing is clear. We will be remembered for what we chose to do.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

We Must Prepare for Future Crises Like We Prepare for War

GUEST POST from Greg Satell

In a 2015 TED talk, Bill Gates warned that “if anything kills ten million people in the next few decades, it’s most likely to be a highly infectious virus rather than a war. Not missiles, but microbes.” He went on to point out that we have invested enormous amounts of money in nuclear deterrents, but relatively little to battle epidemics.

It’s an apt point. In the US, we enthusiastically spend nearly $700 billion on our military, but cut corners on nearly everything else. Major breakthroughs, such as GPS satellites, the Internet and transistors, are merely offshoots of budgets intended to help us fight wars more effectively. At the same time, politicians gleefully propose budget cuts to the NIH.

A crisis, in one sense, is like anything else. It eventually ends and, when it does, we hope to be wiser for it. No one knows how long this epidemic will last or what the impact will be, but one thing is for sure — it will not be our last crisis. We should treat this as a new Sputnik moment and prepare for the next crisis with the same vigor with which we prepare for war.

Getting Artificial Intelligence Under Control

In the Terminator series, an automated defense system called Skynet becomes “self aware” and launches a nuclear attack to end humanity. Machines called “cyborgs” are created to hunt down the survivors that remain. Clearly it is an apocalyptic vision. Not completely out of the realm of possibility, but very unlikely.

The dangers of artificial intelligence, however, are very real, although not nearly so dramatic. Four years ago, in 2016, I published an article in Harvard Business Review outlining the ethical issues we need to address, ranging from long standing thought experiments like the trolley problem to issues surrounding accountability for automated decisions.

Unlike the Terminator scenario, these issues are clear and present. Consider the problem of data bias. Increasingly, algorithms determine what college we attend, whether we get hired for a job and even who goes to prison and for how long. Unlike human decisions, these mathematical models are rarely questioned, yet they materially affect people’s lives.

The truth is that we need our algorithms to be explainable, auditable and transparent. Just because the possibility of our machines turning on us is fairly remote doesn’t mean we don’t need to address more subtle, but all too real, dangers. We should build our systems to serve humanity, not the other way around.

The Slow-Moving Climate Crisis

Climate change is an issue that seems distant and political. To most people, basic needs like driving to work, heating their homes and doing basic household chores are much more top of mind than the abstract dangers of a warming planet. Yet the perils of climate change are, in fact, very clear and present.

Consider that the National Oceanic and Atmospheric Administration has found that, since 1980, there have been at least 258 weather and climate disasters where overall damages reached or exceeded $1 billion and that the total cost of these events has been more than $1.7 trillion. That’s an enormous amount of money.

Yet it pales in comparison to what we can expect in the future. A 2018 climate assessment published by the US government warned that we can expect climate change to “increasingly affect our trade and economy, including import and export prices and U.S. businesses with overseas operations and supply chains,” and had similar concerns with regard to our health, safety and quality of life.

There have been, of course, some efforts to slow the increase of carbon in our atmosphere that causes climate change such as the Paris Climate Agreement. However, these efforts are merely down payments to stem the crisis and, in any case, few countries are actually meeting their Paris targets. The US pulled out of the accord entirely.

The Debt Time Bomb

The US national debt today stands at about 23.5 trillion dollars or roughly 110% of GDP. That’s a very large, but not catastrophic number. The deficit in 2020 was expected to be roughly $1 trillion, or about four percent of GDP, but with the impact of the Coronavirus, we can expect it to be at least two to three times that now.

Considering that the economy of the United States grows at about two percent a year on average, any deficit above that level is unsustainable. Clearly, we are far beyond that now and, with baby boomers beginning to retire in massive numbers, Medicare spending is set to explode. At some point, these bills will have to be paid.
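
Here is a back-of-the-envelope sketch using the rough figures above (110% debt-to-GDP, 2% growth, a 4% deficit); the model itself is a deliberate simplification of mine, but it shows how the ratio climbs when deficits outrun growth.

```python
# A rough debt-dynamics sketch, assuming the article's approximate
# figures: debt at ~110% of GDP, ~2% average growth, and a deficit
# running at ~4% of GDP every year (a deliberate simplification).
debt, gdp = 1.10, 1.0      # normalize GDP to 1.0
growth, deficit_share = 0.02, 0.04

for year in range(1, 31):
    debt += deficit_share * gdp  # each year's deficit adds to the debt
    gdp *= 1 + growth            # the economy grows, but more slowly
    if year % 10 == 0:
        print(f"year {year}: debt/GDP = {debt / gdp:.0%}")

# Prints roughly 126%, 139% and 150% at years 10, 20 and 30: because
# the numerator grows faster than the denominator, the ratio keeps
# climbing, which is the sense in which these bills must be paid.
```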

Yet focusing solely on financial debt misses a big part of the picture. Not only have we been overspending and under-taxing, we’ve also been massively under-investing. Consider that the American Society of Civil Engineers has estimated that we need to spend $4.5 trillion to repair our broken infrastructure. Add that infrastructure debt to our financial and environmental debt and it likely adds up to $30-$40 trillion, or roughly 150%-200% of GDP.

Much like the dangers of artificial intelligence and the climate crisis, not to mention the other inevitable crises like the new pandemics that are sure to come, we will eventually have to pay our debts. The only question is how long we want to allow the interest to pile up.

The Visceral Abstract

Some years ago, I wrote about a concept I called the visceral abstract. We often fail to realize how obscure concepts affect our daily lives. The strange theories of quantum mechanics, for example, make modern electronics possible. Einstein’s relativity helps calibrate our GPS satellites. Darwin’s natural selection helps us understand diseases like the Coronavirus.

In much the same way, we find it easy to ignore dangers that don’t seem clear and present. Terminator machines hunting us down in the streets are terrifying, but the very real dangers of data bias in our artificial intelligence systems are easy to dismiss. We worry about how to pay the mortgage next month, while the other debts we are amassing fade into the background.

The news isn’t all bad, of course. Clearly, the Internet has made it far easier to cope with social distancing. Technologies such as gene sequencing and supercomputing simulations make it more likely that we will find a cure or a vaccine. We have the capacity for both petty foolishness and extreme brilliance.

The future is not inevitable. It is what we make it. We can choose, as we have in the past, to invest in our ability to withstand crises and mitigate their effects, or we can choose to sit idly by and give ourselves up to the whims of fate. We pay the price either way. How we pay it is up to us.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay
