Category Archives: Government

Department Of Energy Programs Helping to Create an American Manufacturing Future

GUEST POST from Greg Satell

In the recession that followed the dotcom crash in 2000, the United States lost five million manufacturing jobs and, while there has been an uptick in recent years, all indications are that most may never come back. Manufacturing, perhaps more than any other sector, relies on deep networks of skills and assets that tend to be highly regional.

The consequences of this loss are deep and pervasive. Losing a significant portion of our manufacturing base has led not only to economic vulnerability, but to political polarization. Clearly, it is important to rebuild our manufacturing base. But to do that, we need to focus on new, more advanced technologies.

That’s the mission of the Advanced Manufacturing Office (AMO) at the Department of Energy. By providing a crucial link between the cutting-edge science done at the National Labs and private industry, it has been able to make considerable progress. As the collaboration between government scientists and private industry widens and deepens over time, US manufacturing may well be revived.

Linking Advanced Research To Private Industry

The origins of the Department of Energy date back to the Manhattan Project during World War II. The immense project was, in many respects, the start of “big science.” Hundreds of top researchers, used to working in small labs, traveled to newly established outposts to collaborate at places like Los Alamos, New Mexico and Oak Ridge, Tennessee.

After the war was over, the facilities continued their work and similar research centers were established to expand the effort. These National Labs became the backbone of the US government’s internal research efforts. In 1977, the National Labs, along with a number of other programs, were combined to form the Department of Energy.

One of the core missions of the AMO is to link the research done at the National Labs to private industry, and the Lab Embedded Entrepreneurship Programs (LEEP) have been particularly successful in this regard. Currently, there are four such programs: Cyclotron Road, Chain Reaction Innovations, West Gate and Innovation Crossroads.

I was able to visit Innovation Crossroads at Oak Ridge National Laboratory and meet the entrepreneurs in its current cohort. Each is working to transform a breakthrough discovery into a market-changing application, yet, due to technical risk, none would be able to attract funding in the private sector. The LEEP program offers a small amount of seed money, access to lab facilities and scientific and entrepreneurial mentorship to help them get off the ground.

That’s just one of the ways that the AMO opens up the resources of the National Labs. It also helps businesses get access to supercomputing resources (5 of the 10 fastest computers in the world are located in the United States, most of them at the National Labs) and conducts early-stage research to benefit private industry.

Leading Public-Private Consortia

Another area in which the AMO supports private industry is by taking a leading role in consortia, such as the Manufacturing Institutes that were set up to give American companies a leg up in advanced areas such as clean energy, composite materials and chemical process intensification.

The idea behind these consortia is to create hubs that provide a critical link with government labs, top scientists at academic universities and private companies looking to solve real-world problems. It both helps firms advance in key areas and allows researchers to focus their work on where they will have the greatest possible impact.

For example, the Critical Materials Institute (CMI) was set up to develop alternatives to materials that are subject to supply disruptions, such as the rare earth elements that are critical to many high tech products and are largely produced in China. A few years ago it developed, along with several National Labs and Eck Industries, an advanced alloy that can replace more costly materials in components of advanced vehicles and aircraft.

“We went from an idea on a whiteboard to a profitable product in less than two years and turned what was a waste product into a valuable asset,” Robert Ivester, Director of the Advanced Manufacturing Office told me.

Technology Assistance Partnerships

In 2011, the International Organization for Standardization released its ISO 50001 guidelines. Like previous guidelines that focused on quality management and environmental impact, ISO 50001 recommends best practices to reduce energy use. These can benefit businesses through lower costs and result in higher margins.

Still, for harried executives facing cutthroat competition and demanding customers, figuring out how to implement new standards can easily get lost in the mix. So a third key role that the AMO plays is to assist companies who wish to implement new standards by providing tools, guides and access to professional expertise.

The AMO offers similar support for a number of critical areas, such as prototype development and also provides energy assessment centers for firms that want to reduce costs. “Helping American companies adopt new technology and standards helps keep American manufacturers on the cutting edge,” Ivester says.

“Spinning In” Rather Than Spinning Out

Traditionally we think of the role of government in business largely in terms of regulation. Legislatures pass laws and watchdog agencies enforce them so that we can have confidence in the food we eat, the products we buy and the medicines that are supposed to cure us. While that is clearly important, we often overlook how government can help drive innovation.

Inventions spun out of government labs include the Internet, GPS and laser scanners, just to name a few. Many of our most important drugs were also originally developed with government funding. Still, traditionally the work has mostly been done in isolation and only later offered to private companies through licensing agreements.

What makes the Advanced Manufacturing Office different from most scientific programs is that it is more focused on “spinning in” private industry rather than spinning out technologies. That enables executives and entrepreneurs with innovative ideas to power them with some of the best minds and most advanced equipment in the world.

As Ivester put it to me, “Spinning out technologies is something that the Department of Energy has traditionally done. Increasingly, we want to spin ideas from industry into our labs, so that companies and entrepreneurs can benefit from the resources we have here. It also helps keep our scientists in touch with market needs and helps guide their research.”

Make no mistake, innovation needs collaboration. Combining the ideas from the private sector with the cutting edge science from government labs can help American manufacturing compete for the 21st century.

— Article courtesy of the Digital Tonto blog
— Image credits: Pixabay

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

Simple Innovations Sometimes Are the Best

by Braden Kelley

Innovations don’t have to be complicated to be impactful. They just need to deliver enough additional value that existing solutions become widely replaced, or, to flip that around, that the new solution becomes widely adopted.

Recently I have been seeing a new simple, yet elegant, solution driving around the streets of Seattle.

It’s pictured in the photo above and it is quite simply the delivery of a temporary license for a newly purchased vehicle that can be printed and installed in a license plate holder in the same way that the eventual traditional license plate will be.

Now, perhaps your state or country already has this, but for me, every vehicle I have ever purchased was instantly defiled by a piece of paper and tape or tape residue that could be difficult to remove after a couple of months baking in the sun (especially in the summer).

This instant cheapening of a brand new vehicle is now a thing of the past!

Some may say that this is not really that big of a deal because you’re just moving the temporary registration from the back window to now live in the license plate frame, but there are several tangible benefits for multiple parties from this seemingly small change:

  1. Car Owner – improved aesthetics – the car just looks better!
  2. Car Owner – improved safety from increased visibility while driving
  3. State and Car Owner – increased toll revenue so everyone is paying their fair share
  4. Car Owner – improved safety – easier to identify hit and run drivers
  5. Police – improved safety – easier to identify vehicle during traffic stops
  6. Car Owner – improved convenience – easier to quickly find license number when it’s requested

What is your favorite simple innovation that you’ve seen or experienced recently?

Is China Our New Sputnik Moment?

GUEST POST from Greg Satell

When the Soviets launched Sputnik, the first space satellite, into orbit in 1957, it was a wake-up call for America. Over the next year, President Eisenhower would sign the National Defense Education Act to spur science education, increase funding for research and establish NASA and DARPA to spur innovation.

A few years ago, a report by the Council on Foreign Relations (CFR) argued that we are at a similar point today, but with China. While we have been steadily decreasing federal investment in R&D over the past few decades, our Asian rival has been ramping up and now threatens our leadership in key technologies such as AI, genomics and quantum information technology.

Clearly, we need to increase our commitment to science and innovation and that means increasing financial investment. However, what the report makes clear is that money alone won’t solve the problem. We are, in several important ways, actually undermining our ability to innovate, now and in the future. We need to renew our culture of innovation in America.

Educating And Attracting Talent

The foundation of an innovation economy is education, especially in STEM subjects. Historically, America has been the world’s best educated workforce, but more recently we’ve fallen to fifth among OECD countries for post-secondary education. That’s alarming and something we will certainly need to reverse if we are to compete effectively.

Our educational descent can be attributed to three major causes. First, the rest of the world has become more educated, so the competition has become stiffer. Second is financing. Tuition has nearly tripled in the last decade and student debt has become so onerous that it now takes about 20 years to pay off four years of college. Third, we need to work harder to attract talented people to the United States.

The CFR report recommends developing a “21st century National Defense Education Act” to create scholarships in STEM areas and making it easier for foreign students to get Green Cards when they graduate from our universities. It also points out that we need to work harder to attract foreign talent, especially in high impact areas like AI, genomics and quantum computing.

Unfortunately, we seem to be going the other way. The number of international students at American universities is declining. Policies like the Muslim ban and concerns about gun violence are deterring scientific talent from coming here. The denial rate for those on H1-B visas has increased from 4% in 2016 to 18% in the first quarter of 2019.

Throughout our history, it has been our openness to new people and new ideas that has made America exceptional. It’s a legitimate question whether that’s still true.

Building Technology Ecosystems

In the 1980s, the US semiconductor industry was on the ropes. Due to increased competition from low-cost Japanese manufacturers, American market share in the DRAM market fell from 70% to 20%. The situation not only had a significant economic impact, there were also important national security implications.

The federal government responded with two initiatives, the Semiconductor Research Corporation and SEMATECH, both of which were nonprofit consortiums that involved government, academia and industry. By the 1990s, American semiconductor manufacturers were thriving again.

Today, we have similar challenges with rare earth elements, battery technology and many manufacturing areas. The Obama administration responded by building similar consortiums to those that were established for semiconductors: The Critical Materials Institute for rare earth elements, JCESR for advanced batteries and the 14 separate Manufacturing Institutes.

Yet here again, we seem to be backsliding. The current administration has sought to slash funding for the Manufacturing Extension Partnership that supports small and medium sized producers. An addendum to the CFR report also points out that the administration has pushed for a 30% cut in funding for the national labs, which support much of the advanced science critical to driving American technology forward.

Supporting International Trade and Alliances

Another historical strength of the US economy has been our open approach to trade. The CFR report points out that our role as a “central node in a global network of research and development” gave us numerous advantages, such as access to foreign talent at R&D centers overseas, investment into US industry and cooperative responses to global challenges.

However, the report warns that “the Trump administration’s indiscriminate use of tariffs against China, as well as partners and allies, will harm U.S. innovative capabilities.” It also faults the Trump administration for pulling out of the Trans-Pacific Partnership trade agreement, which would have bolstered our relationship with Asian partners and increased our leverage over China.

The tariffs undermine American industry in two ways. First, because many of the tariffs are on intermediate goods which US firms use to make products for export, we’re undermining our own competitive position, especially in manufacturing. Second, because trade partners such as Canada and the EU have retaliated against our tariffs, our position is weakened further.

Clearly, we compete in an ecosystem driven world in which power does not come from the top, but emanates from the center. Traditionally, America has positioned itself at the center of ecosystems by constantly connecting out. Now that process seems to have reversed itself and we are extremely vulnerable to others, such as China, filling the void.

We Need to Stop Killing Innovation in America

The CFR report, whose task force included such luminaries as Admiral William McRaven, former Google CEO Eric Schmidt and economist Laura Tyson, should set alarm bells ringing. Although the report was focused on national security issues, it pertains to general competitiveness just as well and the picture it paints is fairly bleak.

After World War II, America stood almost alone in the world in terms of production capacity. Through smart policy, we were able to transform that initial advantage into long-term technological superiority. Today, however, we have stiff competition in areas ranging from AI to synthetic biology to quantum systems.

At the same time, we seem to be doing everything we can to kill innovation in America. Instead of working to educate and attract the world’s best talent, we’re making it harder for Americans to attain higher education and for top foreign talent to come and work here. Instead of ramping up our science and technology programs, presidential budgets regularly recommend cutting them. Instead of pulling our allies closer, we are pushing them away.

To be clear, America is still at the forefront of science and technology, vying for leadership in every conceivable area. However, as global competition heats up and we need to be redoubling our efforts, we seem to be doing just the opposite. The truth is that our prosperity is not a birthright to which we are entitled, but a legacy that must be lived up to.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Elevating the Importance of Construction and Manufacturing

GUEST POST from Mike Shipulski

Restaurants aren’t open as much as they used to be because they cannot hire enough people to do the work. Simply put, there are too few people who want to take the orders; cook the food; deliver food to the tables; clear the tables; and wash the dishes. Sure, it’s an inconvenience that we can’t get a table, but because there are other ways to get food, no one will starve if restaurants aren’t open. And while some restaurants will go out of business, this situation doesn’t fundamentally constrain the economy.

And the situation is similar with manufacturing and construction: no one wants those jobs either. But that’s where the similarities end. The shortfall of people who want to work in manufacturing and construction will constrain the economy and prevent the renewal of our infrastructure. Gone are the days of relying on other countries to make all our products, because we now know it’s not the most cost-effective way to go. But if there is no one willing to make the products, there will be no products made. And if there is no one willing to build the roads and bridges, roads and bridges will suffer. And if there are no products, no good roads, and no safe bridges, there can be no strong economy.

While there is disagreement around why people don’t want to work in manufacturing and construction, I will propose three reasons for your consideration.

Firstly, the manufacturing and construction sectors have an image problem. People don’t see these jobs as high-tech, high-status jobs where the working environment is clean and safe. In short, people don’t see these jobs as jobs they can be proud of, and they don’t think others will think highly of them if they say they work in manufacturing or construction. And because of the history of layoffs, people don’t see these jobs as secure and predictable and don’t see them as reliable sources of income. This may not be the case for all people, but I think it applies to a lot of people.

Secondly, the manufacturing and construction sectors don’t pay enough. People don’t see these jobs as viable mechanisms to provide a solid standard of living for themselves and their families. This is a generalization, but I think it holds true.

Thirdly, the manufacturing and construction sectors require specialized knowledge, skills, and abilities that are not taught in traditional high schools or colleges. And without these qualifications, people are reluctant to apply. And if they do apply and a company hires them even though they don’t have the knowledge, skills, and abilities, companies must invest in training, which creates a significant cost hurdle.

So, what are we to do?

To improve their image, the manufacturing and construction trade organizations and professional societies can come together and create a coordinated education program to change what people think about their industries. And states can help by educating their citizens on the importance of manufacturing and construction to the health of the states’ economies. This will be a long road, but I think it’s time to start.

To attract new talent, the manufacturing and construction sectors must pay a higher wage. In the short term, profits may be reduced, but imagine how much profits will be reduced if there are no people to build the products or fix the bridges. And over the long term, with improved business processes and working methods, profits will grow.

To train people to work in manufacturing and construction, we can reinstitute the Training Within Industry program of the 1940s. The Manufacturing Extension Partnership programs within the states can be a center of mass for this work along with the Construction Industry Institute and other construction trade organizations.

It’s time to join forces to make this happen.

Image credit: Pixabay

Our Fear of China is Overblown

GUEST POST from Greg Satell

The rise of China over the last 40 years has been one of history’s great economic miracles. According to the World Bank, since it began opening up its economy in 1979, China’s GDP has grown from a paltry $178 billion to a massive $13.6 trillion. At the same time, research by McKinsey shows that its middle class is expanding rapidly.

What’s more, it seems like the Asian giant is just getting started. China has become increasingly dominant in scientific research and has embarked on two major initiatives: Made in China 2025, which aims to make it the leading power in 10 emerging industries, and a massive Belt and Road infrastructure initiative that seeks to shore up its power throughout Asia.

Many predict that China will dominate the 21st century in much the same way that America dominated the 20th. Yet I’m not so sure. First, American dominance was due to an unusual confluence of forces unlikely to be repeated. Second, China has weaknesses—and we have strengths—that aren’t immediately obvious. We need to be clear headed about China’s rise.

The Making of an American Century

America wasn’t always a technological superpower. In fact, at the turn of the 20th century, much like China at the beginning of this century, the United States was largely a backwater. Still mostly an agrarian nation, the US lacked the industrial base and intellectual heft of Europe. Bright young students would often need to go overseas for advanced degrees. With no central bank, financial panics were common.

Yet all that changed quickly. Industrialists like Thomas Edison and Henry Ford put the United States at the forefront of the two most important technologies of the time, electricity and internal combustion. Great fortunes produced by a rising economy endowed great educational institutions. In 1913 the Federal Reserve Act was passed, finally bringing financial stability to a growing nation. By the 1920s, much like China today, America had emerged as a major world power.

Immigration also played a role. Throughout the early 1900s immigrants coming to America provided enormous entrepreneurial energy as well as cheap labor. With the rise of fascism in the 1930s, our openness to new people and new ideas attracted many of the world’s greatest scientists to our shores and created a massive brain drain in Europe.

At the end of World War II, the United States was the only major power left with its industrial base still intact. We seized the moment wisely, using the Marshall Plan to rebuild our allies and creating scientific institutions, such as the National Science Foundation (NSF) and the National Institutes of Health (NIH) that fueled our technological and economic dominance for the rest of the century.

There are many parallels between the 1920s and the historical moment of today, but there are also many important differences. It was a number of forces, including our geography, two massive world wars, our openness as a culture and a number of wise policy choices that led to America’s dominance. Some of these factors can be replicated, but others cannot.

MITI and the Rise of Japan

Long before China loomed as a supposed threat to American prosperity and dominance, Japan was considered to be a chief economic rival. Throughout the 1970s and 80s, Japanese firms came to lead in many key industries, such as automobiles, electronics and semiconductors. The United States, by comparison, seemed feckless and unable to compete.

Key to Japan’s rise was a long-term industrial policy. The Ministry of International Trade and Industry (MITI) directed investment and funded research that fueled an economic miracle. Compared to America’s haphazard policies, Japan’s deliberate and thoughtful strategy seemed like a decidedly more rational and wiser model.

Yet before long things began to unravel. While Japan continued to perform well in many of the industries and technologies that the MITI focused on, it completely missed out on new technologies, such as minicomputers and workstations in the 1980s and personal computers in the 1990s. As MITI continued to support failing industries, growth slowed and debt piled up, leading to a lost decade of economic malaise.

At the same time, innovative government policy in the US also helped turn the tide. For example, in 1987 a non-profit consortium made up of government labs, research universities and private sector companies, called SEMATECH, was created to regain competitiveness in the semiconductor industry. America soon retook the lead, which continues even today.

China 2025 and the Belt and Road Initiative

While the parallels with America in the 1920s underline China’s potential, Japan’s experience in the 1970s and 80s highlights its peril. Much like Japan, China is centralizing decision-making around a relatively small number of bureaucrats and focusing on a relatively small number of industries and technologies.

Much like Japan back then, China seems wise and rational. Certainly, the technologies it is targeting, such as artificial intelligence, electric cars and robotics would be on anybody’s list of critical technologies for the future. The problem is that the future always surprises us. What seems clear and obvious today may look ridiculous and naive a decade from now.

To understand the problem, consider quantum computing, which China is investing heavily in. However, the technology is far from monolithic. In fact, there are a wide variety of approaches being championed by different firms, such as IBM, Microsoft, Google, Intel and others. Clearly, some of these firms are going to be right and some will be wrong.

The American firms that get it wrong will fail, but others will surely succeed. In China, however, the ones that get it wrong will likely be government bureaucrats who will have the power to prop up state supported firms indefinitely. Debt will pile up and competitiveness will decrease, much like it did in Japan in the 1990s.

This is, of course, speculation. However, there are indications that it is already happening. A recent bike sharing bubble has ignited concerns that similar over-investment is happening in artificial intelligence. Many investors have also become concerned that China’s slowing economy will be unable to support its massive debt load.

The Path Forward

The rise of China presents a generational challenge. Clearly, we cannot ignore a rising power, yet we shouldn’t overreact either. While many have tried to cast China as a bad actor, engaging in intellectual theft, currency manipulation and other unfair trade policies, others point out that it is wisely investing for the long-term while the US manages by the quarter.

Interestingly, as Fareed Zakaria recently pointed out, the same accusations made about China’s unfair trade policies today were leveled at Japan 40 years ago. In retrospect, however, our fears about Japan seem almost quaint. Not only were we not crushed by Japan’s rise, we are clearly better for it, incorporating Japanese ideas like lean manufacturing and combining them with our own innovations.

I suspect, or at least I hope, that we will benefit from China’s rise much as we did from Japan’s. We will learn from its innovations and be inspired to develop more of our own. If a Chinese scientist invents a cure for cancer, American lives will be saved. If an American scientist invents a better solar panel, fewer Chinese will be choking on smog.

Perhaps most of all, we need to remember that what made the 20th Century the American Century was our ability to rise to the challenges that history presented. Whether it was rebuilding Europe in the 40s and 50s, or Sputnik in the 50s and 60s or Japan in the 70s and 80s, competition always brought out the best in us. Then, as now, our destiny was our own to determine.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay

Four Ways Governments Can Accelerate the Digital Transformation of Their Economies

GUEST POST from Art Inteligencia

In today’s digital world, governments have a critical role to play in accelerating digital transformation. As technology continues to evolve, governments must find ways to embrace and apply new technologies, while also ensuring that their citizens have access to the most advanced digital services.

To ensure success, there are several key steps that governments should take.

1. Governments Should Invest in Digital Infrastructure

By investing in the infrastructure necessary to support digital transformation, the government can create a platform for innovation and adoption of new technologies. This includes things like high-speed broadband, 5G networks, and cloud computing capabilities.

2. Governments Should Provide Incentives to Spur Digital Adoption

This could come in the form of tax breaks, grants, and other incentives to organizations that are investing in digital transformation. This will help create a climate of investment and innovation, which will in turn help accelerate the transformation process.

3. Governments Should Create a Supportive Regulatory Environment

This means creating laws and regulations that are conducive to digital transformation, such as data privacy and security laws. This will help ensure that organizations can safely and securely adopt new technologies and services.

4. Governments Should Invest in Digital Literacy and Education

By investing in digital literacy and education, the government can ensure that citizens have the tools and knowledge necessary to take advantage of the digital transformation. This can include programs such as coding boot camps and digital literacy courses for adults.


By taking these steps, the government can create an environment that is conducive to digital transformation and help accelerate the process. In doing so, the government can ensure that its citizens have access to the most advanced digital services and technologies, and that organizations can take advantage of the opportunities that come with digital transformation.

Image credit: Pixabay

3 Things Politicians Can Do to Create Innovation

GUEST POST from Greg Satell

In the 1960s, the federal government accounted for more than 60% of all research funding, yet by 2016 that had fallen to just over 20%. During the same time, businesses’ share of R&D investment more than doubled from about 30% to almost 70%. Government’s role in US innovation, it seems, has greatly diminished.

Yet new research suggests that the opposite is actually true. Analyzing all patents since 1926, researchers found that the number of patents that relied on government support has risen from 12% in the 1980s to almost 30% today. Interestingly, the same research found that startups benefitted the most from government research.

As we struggle to improve productivity from historical lows, we need the public sector to play a part. The truth is that the government has a unique role to play in driving innovation and research is only part of it. In addition to funding labs and scientists, it can help bring new ideas to market, act as a convening force and offer crucial expertise to private businesses.

1. Treat Knowledge As A Public Good

By 1941, it had become clear that the war raging in Europe would soon envelop the US. With this in mind, Vannevar Bush went to President Roosevelt with a visionary idea — to mobilize the nation’s growing scientific prowess for the war effort. Roosevelt agreed and signed an executive order that would create the Office of Scientific Research and Development (OSRD).

With little time to build labs, the OSRD focused on awarding grants to private organizations such as universities. It was, by all accounts, an enormous success and led to important breakthroughs such as the atomic bomb, the proximity fuze and radar. As the war was winding down, Roosevelt asked Bush to write a report on how to continue OSRD’s success in peacetime.

That report, titled Science, The Endless Frontier, was delivered to President Truman and would set the stage for America’s lasting technological dominance. It set forth a new vision in which scientific advancement would be treated as a public good, financed by the government, but made available for private industry. As Bush explained:

Basic research leads to new knowledge. It provides scientific capital. It creates the fund from which the practical applications of knowledge must be drawn. New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science.

The influence of Bush’s idea cannot be overstated. It led to the creation of new government agencies, such as the National Science Foundation (NSF), the National Institutes of Health (NIH) and, later, the Defense Advanced Research Projects Agency (DARPA). These helped to create a scientific infrastructure that has no equal anywhere in the world.

2. Help to Overcome the Valley of Death

Government has a unique role to play in basic research. Because fundamental discoveries are, almost by definition, widely applicable, they are much more valuable if they are published openly. At the same time, because private firms have relatively narrow interests, they are less able to fully leverage basic discoveries.

However, many assume that because basic research is a primary role for public investment, it must be its only relevant function. Clearly, that’s not the case. Another important role for government is helping to bridge the gap between the discovery of a new technology and its commercialization, a gap so fraught with peril that it’s often called the “Valley of Death.”

The oldest and best known of these initiatives is the SBIR/STTR program, which is designed to help startups commercialize cutting-edge research. Grants are given in two phases. In the first, a proof-of-concept phase, grants are capped at $150,000. If that’s successful, up to $1 million more can be awarded. Some SBIR/STTR companies, such as Qualcomm, iRobot and Symantec, have become industry leaders.

Other, more focused programs have also been established. ARPA-E focuses exclusively on advanced energy technologies. Lab Embedded Entrepreneurship Programs (LEEP) give entrepreneurs access to the facilities and expertise of the National Labs in addition to a small grant. The Manufacturing Extension Partnership (MEP) helps smaller companies build the skills they need to be globally competitive.

3. Act As a Convening Force

A third role government can play is that of a convening force. For example, in 1987 a non-profit consortium made up of government labs, research universities and private sector companies, called SEMATECH, was created to regain competitiveness in the semiconductor industry. America soon regained its lead, which continues even today.

The reason that SEMATECH was so successful was that it combined the scientific expertise of the country’s top labs with the private sector’s experience in solving real world problems. It also sent a strong signal that the federal government saw the technology as important, which encouraged private companies to step up their investment as well.

Today, a number of new initiatives have been launched that follow a similar model. The most wide-ranging is the Manufacturing USA Institutes, which are helping drive advancement in everything from robotics and photonics to biofabrication and composite materials. Others, such as JCESR and the Critical Materials Institute, are more narrowly focused.

Much like its role in supporting basic science and helping new technologies get through the “Valley of Death,” acting as a convening force is something that, for the most part, only the federal government can do.

Make No Mistake: This Is Our New Sputnik Moment

In the 20th century, three key technologies, electricity, internal combustion and computing, drove economic advancement, and the United States led each one. That is why it is often called the “American Century.” No country, perhaps since the Roman Empire, has ever so thoroughly dominated the known world.

Yet the 21st century will be different. The most important technologies will be things like synthetic biology, materials science and artificial intelligence. These are largely nascent and it’s still not clear who, if anybody, will emerge as a clear leader. It is very possible that we will compete economically and technologically with China, much like we used to compete politically and militarily with the Soviet Union.

Yet back in the Cold War, it was obvious that the public sector had an important role to play. When Kennedy vowed to go to the moon, nobody argued that the effort should be privatized. It was clear that such an enormous undertaking needed government leadership at the highest levels. We pulled together and we won.

Today, by all indications, we are at a new Sputnik moment in which our global scientific and technological leadership is being seriously challenged. We can respond with imagination, creating novel ways to, as Bush put it, “turn the wheels of private and public enterprise,” or we can let the moment pass us by and let the next generation face the consequences.

One thing is clear. We will be remembered for what we chose to do.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


We Must Prepare for Future Crises Like We Prepare for War

GUEST POST from Greg Satell

In a 2015 TED talk, Bill Gates warned that “if anything kills ten million people in the next few decades, it’s most likely to be a highly infectious virus rather than a war. Not missiles, but microbes.” He went on to point out that we have invested enormous amounts of money in nuclear deterrents, but relatively little to battle epidemics.

It’s an apt point. In the US, we enthusiastically spend nearly $700 billion on our military, but cut corners on nearly everything else. Major breakthroughs, such as GPS satellites, the Internet and transistors, are merely offshoots of budgets intended to help us fight wars more effectively. At the same time, politicians gleefully propose budget cuts to the NIH.

A crisis, in one sense, is like anything else. It eventually ends and, when it does, we hope to be wiser for it. No one knows how long this epidemic will last or what the impact will be, but one thing is for sure — it will not be our last crisis. We should treat this as a new Sputnik moment and prepare for the next crisis with the same vigor with which we prepare for war.

Getting Artificial Intelligence Under Control

In the Terminator series, an automated defense system called Skynet becomes “self aware” and launches a nuclear attack to end humanity. Machines called “cyborgs” are created to hunt down the survivors that remain. Clearly it is an apocalyptic vision. Not completely out of the realm of possibility, but very unlikely.

The dangers of artificial intelligence, however, are very real, although not nearly so dramatic. Four years ago, in 2016, I published an article in Harvard Business Review outlining the ethical issues we need to address, ranging from long standing thought experiments like the trolley problem to issues surrounding accountability for automated decisions.

Unlike the Terminator scenario, these issues are clear and present. Consider the problem of data bias. Increasingly, algorithms determine what college we attend, if we get hired for a job and even who goes to prison and for how long. Unlike human decisions, these mathematical models are rarely questioned, yet they materially affect people’s lives.

The truth is that we need our algorithms to be explainable, auditable and transparent. Just because the possibility of our machines turning on us is fairly remote doesn’t mean we don’t need to address more subtle, but all too real, dangers. We should build our systems to serve humanity, not the other way around.
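To make “explainable and auditable” concrete, consider a minimal sketch (the feature names and weights here are invented purely for illustration): a transparent linear scoring model that records each feature’s contribution to the final score, so that any individual decision can be inspected and contested after the fact.

```python
from dataclasses import dataclass


@dataclass
class Decision:
    score: float
    contributions: dict  # per-feature contribution: the audit trail


def explainable_score(features, weights):
    """Score a case with a transparent linear model.

    Unlike a black box, every feature's contribution to the final
    score is recorded, so the decision can be explained, audited
    and challenged."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    return Decision(score=sum(contributions.values()),
                    contributions=contributions)


# Hypothetical example: the contributions dict answers the question
# "why did this applicant get this score?"
decision = explainable_score({"income": 2.0, "tenure": 5.0},
                             {"income": 0.5, "tenure": 0.1})
```

The point of the sketch is not the model, which is trivial, but the audit trail: an affected person can see exactly which inputs drove the outcome, which is precisely what most deployed black-box systems deny them.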

The Slow-Moving Climate Crisis

Climate change is an issue that seems distant and political. To most people, basic needs like driving to work, heating their homes and doing basic household chores are much more top of mind than the abstract dangers of a warming planet. Yet the perils of climate change are, in fact, very clear and present.

Consider that the National Oceanic and Atmospheric Administration has found that, since 1980, there have been at least 258 weather and climate disasters where overall damages reached or exceeded $1 billion and that the total cost of these events has been more than $1.7 trillion. That’s an enormous amount of money.

Yet it pales in comparison to what we can expect in the future. A 2018 climate assessment published by the US government warned that we can expect climate change to “increasingly affect our trade and economy, including import and export prices and U.S. businesses with overseas operations and supply chains,” and had similar concerns with regard to our health, safety and quality of life.

There have been, of course, some efforts to slow the increase of carbon in our atmosphere that causes climate change such as the Paris Climate Agreement. However, these efforts are merely down payments to stem the crisis and, in any case, few countries are actually meeting their Paris targets. The US pulled out of the accord entirely.

The Debt Time Bomb

The US national debt today stands at about 23.5 trillion dollars or roughly 110% of GDP. That’s a very large, but not catastrophic number. The deficit in 2020 was expected to be roughly $1 trillion, or about four percent of GDP, but with the impact of the Coronavirus, we can expect it to be at least two to three times that now.

Considering that the economy of the United States grows at about two percent a year on average, any deficit above that level is unsustainable. Clearly, we are far beyond that now and, with baby boomers beginning to retire in massive numbers, Medicare spending is set to explode. At some point, these bills will have to be paid.

Yet focusing solely on financial debt misses a big part of the picture. Not only have we been overspending and under-taxing, we’ve also been massively under-investing. Consider that the American Society of Civil Engineers has estimated that we need to spend $4.5 trillion to repair our broken infrastructure. Add that infrastructure debt to our financial and environmental debt and it likely adds up to $30-$40 trillion, or roughly 150%-200% of GDP.
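The sustainability arithmetic behind these figures can be sketched in a few lines. As a rough illustration only (holding the cited 110% starting ratio, 4% deficits and 2% growth constant, which real budgets never do), the debt-to-GDP ratio each year is last year’s debt plus the deficit, restated as a share of the now-larger GDP:

```python
def project_debt_ratio(ratio, deficit_share, growth, years):
    """Project the debt-to-GDP ratio forward, assuming a constant
    deficit (as a share of GDP) and constant GDP growth."""
    for _ in range(years):
        # New debt is old debt plus this year's deficit; dividing by
        # (1 + growth) restates it as a share of the larger GDP.
        ratio = (ratio + deficit_share) / (1 + growth)
    return ratio


# Starting at 110% of GDP with 4% deficits and 2% growth:
decade = project_debt_ratio(1.10, 0.04, 0.02, 10)      # roughly 126% of GDP
long_run = project_debt_ratio(1.10, 0.04, 0.02, 500)   # approaches 200% of GDP
```

Under these assumptions the ratio climbs steadily toward deficit divided by growth, here 0.04 / 0.02, or 200% of GDP, which is the arithmetic sense in which deficits that outrun growth are unsustainable.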

Much like the dangers of artificial intelligence and the climate crisis, not to mention the other inevitable crises like the new pandemics that are sure to come, we will eventually have to pay our debts. The only question is how long we want to allow the interest to pile up.

The Visceral Abstract

Some years ago, I wrote about a concept I called the visceral abstract. We often fail to realize how obscure concepts affect our daily lives. The strange theories of quantum mechanics, for example, make modern electronics possible. Einstein’s relativity helps calibrate our GPS satellites. Darwin’s natural selection helps us understand diseases like the Coronavirus.

In much the same way, we find it easy to ignore dangers that don’t seem clear and present. Terminator machines hunting us down in the streets is terrifying, but the very real danger of data bias in our artificial intelligence systems is easy to dismiss. We worry how to pay the mortgage next month, but other mounting debts fade into the background.

The news isn’t all bad, of course. Clearly, the Internet has made it far easier to cope with social distancing. Technologies such as gene sequencing and supercomputing simulations make it more likely that we will find a cure or a vaccine. We have the capacity for both petty foolishness and extreme brilliance.

The future is not inevitable. It is what we make it. We can choose, as we have in the past, to invest in our ability to withstand crises and mitigate their effects, or we can choose to sit idly by and give ourselves up to the whims of fate. We pay the price either way. How we pay it is up to us.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


How COVID-19 Has Exposed Us

GUEST POST from Greg Satell

The moon landing in 1969 was, in many ways, the high point of the American century. Since then, we’ve been beset by scandals like Watergate, Iran-Contra and two presidential impeachments, mired in never-ending wars that we don’t win, while increasingly encumbered by rising debts and income inequality amid falling productivity growth. Incomes have stagnated while education and healthcare costs have soared.

Yet in an essay written back in February, just before the Covid-19 crisis, Ross Douthat wrote that these apparent woes are actually signs of success. In effect, he argued that we lack major technological breakthroughs because we’ve become so technologically advanced, and we lack economic progress because we’ve become so prosperous.

Even then, it was a strange and somewhat maddening position to take. Why would Douthat, an intelligent and insightful man, write such things? Because he so wanted to believe them that he went in search of facts to support them. Many of us have been doing the same. Yet the Covid-19 crisis has unmasked us and it’s time to start facing up to the truth.

A Failed Market Revolution

In 1954, the eminent economist Paul Samuelson came across an obscure dissertation written around the turn of the century by a French graduate student named Louis Bachelier. The paper, which anticipated Einstein’s later breakthrough on Brownian motion, declared somewhat innocently that “the mathematical expectation of the speculator is zero.”

Samuelson’s discovery launched a revolution in mathematical finance models based on Bachelier’s assumption, including the Efficient Market Hypothesis, portfolio theory, the Capital Asset Pricing Model (CAPM) and the Black-Scholes model. The underlying assumption was that markets were rational, and risk could be quantified and managed effectively.

The flaws in these models should have been obvious even at the time and some, including the mathematician Benoit Mandelbrot, pointed out that markets were far more volatile than the financial engineering models predicted. Nevertheless, policymakers chose to ignore the warnings and put their faith in the “magic of the market.”

Probably the biggest failure of market fundamentalism is that, as economist Thomas Philippon points out in his book The Great Reversal, over the past 40 years markets in the United States have become significantly weaker. In a similar vein, a study published in Harvard Business Review that examined 893 industries found that two thirds had become more concentrated.

The truth is that we’ve chosen weaker markets and less competition, which has led to less dynamism and innovation. That’s no accident.

Digital Disruption

In Regional Advantage, AnnaLee Saxenian describes how Silicon Valley replaced Boston’s “Technology Highway” as the center of the digital universe. While Boston was corporate and hierarchical, Silicon Valley was freewheeling and networked. The Silicon Valley ethos was very much the counterculture.

So, it was no accident that when Steve Jobs flew to New York to recruit John Sculley, who was at the time President of Pepsi, to lead Apple, he asked him, “Do you want to sell sugar water for the rest of your life, or do you want to come with me and change the world?” The implication was that selling computers was a higher calling than selling soft drinks.

That was nearly 40 years ago and while the Covid-19 crisis has certainly highlighted some benefits of digital technology, such as cheap and effective teleconferencing, it’s also become clear that the digital revolution has largely been a disappointment. Productivity growth, except for a relatively brief period in the late nineties and early aughts, has been depressed since the 1970s.

Compare the iPhone to the breakthroughs of the mid-twentieth century, such as Bell Labs’ transistor, Boeing’s 707 and IBM’s System/360, and it becomes clear that while digital technology has done much to disrupt industries, it’s done relatively little to create significant new value, at least in comparison to earlier technologies.

The Uncertain Promise of Globalization

The aftermath of the fall of the Berlin Wall was a time of great optimism. With the Cold War over, books like Francis Fukuyama’s The End of History predicted a capitalist, democratic utopia in which free markets would conquer the world, making everyone more prosperous. Those that refused to reform would be unable to compete.

While there were genuine achievements, especially in lifting up the world’s poorest, it’s hard to see how globalization has made us significantly better off. In fact, rather than the triumph of freedom, we’ve seen a global rise in populist authoritarian movements, the polar opposite of what intellectuals like Fukuyama predicted.

In the United States, the situation has become especially dire. Social mobility and life expectancy in the white working class are declining, while anxiety and depression are rising to epidemic levels. While wages have stagnated, the cost of healthcare and education has soared, squeezing the middle class. Income inequality is at its highest level in 50 years.

So, while it’s true that there have been real benefits from globalization, such as curbing inflation, we’ve done little to mitigate the costs to the average citizen. That didn’t just happen but was the result of choices that we made.

We Need to Choose Resilience and Grand Challenges Over Output and Disruption

The Covid-19 crisis has unmasked us. We thought that markets, technology and globalization would save us, that we could just set up some sensible rules of the road and everything would run on autopilot. That’s clearly untrue. We took short-term profits while ignoring long-term costs, loaded up on debt and hoped for the best.

The current crisis has followed the same pattern. We simply failed to prepare for known risks because it seemed expedient not to. George Bush warned about the possibility of a pandemic, as did his Health and Human Services Secretary. Jay Leno mocked them. The Obama administration set up a step-by-step playbook and it was ignored. The long list of failures goes on.

Yet we don’t have to be victims of our failed choices. We can learn to make better ones. After the 1918 Spanish Flu pandemic, we embarked on a 70-year productivity boom. Out of the ashes of World War II, we built a new era of peace and prosperity that was unprecedented in world history. We can do so again. We have that power.

New technologies, under development as we speak, will likely give us the power to cure cancer, create clean energy, save the environment and colonize space. We can rebuild the middle class, usher in a new era of peace and prosperity, increase life expectancy while improving quality of life. These are all things we may be able to achieve in the next decade or two.

Yet those possibilities are merely potential that we can succeed or fail to actualize. We can, as we did after World War II, choose to invest in the future and tackle grand challenges. We can build new infrastructure, spawn new industries and create an educated workforce. Or we can, as we did after the end of the Cold War, choose disruption over construction.

What’s clear is that nothing is inevitable. The digital revolution didn’t have to be a dud. The Great Recession didn’t have to happen. The Covid-19 Pandemic could have been, at the very least, greatly mitigated. We are responsible for the choices we make. Now is the time to shoot for the moon (and Mars), not to grade ourselves on a curve.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay


Why Revolutions Fail

GUEST POST from Greg Satell

I still remember the feeling of triumph I felt in the winter of 2005, in the aftermath of the Orange Revolution in Ukraine. During the fall, we readied ourselves for what proved to be a falsified election. In November, when the fraudulent results were announced, we took to the streets and the demonstrations lasted until new elections were called in January.

We had won, or so we thought. Our preferred candidate was elected and it seemed like a new era had dawned. Yet soon it became clear that things were not going well. Planned reforms stalled in a morass of corruption and incompetence. In 2010, Viktor Yanukovych, the same man we marched against, rose to the presidency.

The pattern repeats with almost metronomic regularity. Egyptian dictator Hosni Mubarak was ousted in the Arab Spring, only to be replaced by the equally authoritarian Abdel Fattah el-Sisi. George W. Bush gave way to Barack Obama, who set the stage for Donald Trump. Revolutions sow the seeds for their own demise. We need to learn to break the cycle.

The Physics Of Change And The Power Of Shared Values

In Rules for Radicals, the legendary activist Saul Alinsky observed that every revolution inspires a counterrevolution. That is the physics of change. Every action provokes a reaction because, if an idea is important, it threatens the status quo, which never yields its power gracefully. If you seek to make change in the world, you can be sure that some people aren’t going to like it and will fight against it.

For example, President Bush’s endorsement of a constitutional amendment to ban gay marriage inspired then San Francisco Mayor Gavin Newsom to unilaterally begin performing weddings for gay and lesbian couples at City Hall, in what was termed the Winter of Love. 4,027 couples were married before their nuptials were annulled by the California Supreme Court a month later.

The backlash was fierce. Conservative groups swung into action to defend the “sanctity of marriage” and in 2008 were successful in placing Proposition 8, an amendment to the California Constitution that prohibited gay marriage, on the ballot. It passed with a narrow majority of 52% of the electorate, which only further galvanized LGBTQ activists and led, eventually, to legalized gay marriage.

In our work helping organizations drive transformation, we find similar dynamics at play. Corporate revolutionaries tend to assume that once they get their budget approved or receive executive sponsorship, everything will go smoothly. The reality is that’s the point when things often get bogged down, because those who oppose change see that it has actually become possible and redouble their efforts to undermine it.

The Differentiation Trap

Many revolutionaries, corporate and otherwise, are frustrated marketers. They want to differentiate themselves in the marketplace of ideas through catchy slogans that “cut through.” It is by emphasizing difference that they seek to gin up enthusiasm among their most loyal supporters.

That was certainly true of LGBTQ activists, who marched through city streets shouting slogans like “We’re here, we’re queer and we’d like to say hello.” They led a different lifestyle and wanted to demand that their dignity be recognized. More recently, Black Lives Matter activists made calls to “defund the police,” which many found to be shocking and anarchistic.

Corporate change agents tend to fall into a similar trap. They rant on about “radical” innovation and “disruption,” ignoring the fact that few like to be radicalized or disrupted. Proponents of agile development methods often tout their manifesto, ignoring the fact many outside the agile community find the whole thing a bit weird and unsettling.

While emphasizing difference may excite people who are already on board, it is through shared values that you bring people in. So it shouldn’t be a surprise that the fight for LGBTQ rights began to gain traction when activists started focusing on family values. Innovation doesn’t succeed because it’s “radical,” but when it solves a meaningful problem. The value of Agile methods isn’t a manifesto, but the fact that they can improve performance.

Learning To Love Your Haters

Once you understand that shared values are key to driving change forward, it becomes clear that those who oppose the change you seek can actually help you break the cycle of revolution and counter-revolution and begin to drive change forward. That’s why you need to learn to love your haters.

By listening to people who hate your idea you can identify early flaws and fix them before it’s too late. Yet even more importantly they can help you identify shared values because they are trying to persuade many of the same people you are. Often, if not always, you can use their own arguments against them.

That’s exactly what happened in the fight for LGBTQ rights. The central argument against the movement was that the gay lifestyle was a threat to family values. So it was no accident that it prevailed on the basis of living in committed relationships and raising happy families. In a similar way, Black Lives Matter activists would do much better focusing on the shared value of safe neighborhoods than on a crusade against police officers.

To be clear, listening to your opposition doesn’t mean engaging directly with them. That’s a mistake Barack Obama made far too often. He would appear on Bill O’Reilly’s show on Fox News, only to be ridiculed as soon as he was off camera. He would have been much better off watching at home and using the bombastic TV host’s remarks for his own purposes.

Achieving Schwerpunkt

In the final analysis, the reason that most would-be revolutionaries fail is that they assume that the righteousness of their cause will save them. It will not. Injustice, inequity and ineffectiveness can thrive for decades and even centuries, far longer than a human lifespan. If you think that your idea will prevail simply because you believe in it you will be sorely disappointed.

Tough, important battles can only be won with good tactics, which is why successful change agents learn how to adopt the principle of Schwerpunkt. The idea is that instead of trying to defeat your enemy with overwhelming force generally, you want to deliver overwhelming force and win a decisive victory at a particular point of attack.

Thurgood Marshall did not seek to integrate all schools, at least not at first. He started with graduate schools, where the “separate but equal” argument was most vulnerable. More recently, Stop Hate For Profit attacked Facebook not by asking users to boycott, but by focusing on advertisers, who were themselves vulnerable to activist action.

Yet Schwerpunkt is a dynamic, not a static concept. You have to constantly innovate your approach as your opposition adapts to whatever success you may achieve. For example, the civil rights movement had its first successes with boycotts, but eventually moved on to sit-ins, “Freedom Rides,” community actions and eventually, mass marches.

The key to success wasn’t any particular tactic, leader or slogan but strategic flexibility. Unfortunately, that’s exactly what most movements lack. All too often they get caught up in a strategy and double down, because it feels good to believe in something, even if it’s a failure. They would rather make a point than make a real difference.

Successful revolutionaries, on the other hand, understand that power will not fall simply because you oppose it, but it will crumble if you bring those who support it over to your side. That’s why lasting change is always built on the common ground of shared values.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay
