GUEST POST from Greg Satell
In a 2015 TED talk, Bill Gates warned that “if anything kills ten million people in the next few decades, it’s most likely to be a highly infectious virus rather than a war. Not missiles, but microbes.” He went on to point out that we have invested enormous amounts of money in nuclear deterrents, but relatively little to battle epidemics.
It’s an apt point. In the US, we enthusiastically spend nearly $700 billion on our military, but cut corners on nearly everything else. Major breakthroughs, such as GPS satellites, the Internet and transistors, are merely offshoots of budgets intended to help us fight wars more effectively. At the same time, politicians gleefully propose budget cuts to the NIH.
A crisis, in one sense, is like anything else. It eventually ends and, when it does, we hope to be wiser for it. No one knows how long this epidemic will last or what the impact will be, but one thing is for sure — it will not be our last crisis. We should treat this as a new Sputnik moment and prepare for the next crisis with the same vigor with which we prepare for war.
Getting Artificial Intelligence Under Control
In the Terminator series, an automated defense system called Skynet becomes “self-aware” and launches a nuclear attack to end humanity. Machines called “cyborgs” are created to hunt down the survivors. It is an apocalyptic vision: not completely out of the realm of possibility, but very unlikely.
The dangers of artificial intelligence, however, are very real, although not nearly so dramatic. In 2016, I published an article in Harvard Business Review outlining the ethical issues we need to address, ranging from long-standing thought experiments like the trolley problem to issues surrounding accountability for automated decisions.
Unlike the Terminator scenario, these issues are clear and present. Consider the problem of data bias. Increasingly, algorithms determine what college we attend, whether we get hired for a job and even who goes to prison and for how long. Unlike human decisions, these mathematical models are rarely questioned, but they materially affect people’s lives.
The truth is that we need our algorithms to be explainable, auditable and transparent. Just because the possibility of our machines turning on us is fairly remote doesn’t mean we don’t need to address more subtle, but all too real, dangers. We should build our systems to serve humanity, not the other way around.
The Slow-Moving Climate Crisis
Climate change is an issue that seems distant and political. To most people, everyday needs like driving to work, heating their homes and doing basic household chores are much more top of mind than the abstract dangers of a warming planet. Yet the perils of climate change are, in fact, very clear and present.
Consider that the National Oceanic and Atmospheric Administration has found that, since 1980, there have been at least 258 weather and climate disasters where overall damages reached or exceeded $1 billion and that the total cost of these events has been more than $1.7 trillion. That’s an enormous amount of money.
Yet it pales in comparison to what we can expect in the future. A 2018 climate assessment published by the US government warned that we can expect climate change to “increasingly affect our trade and economy, including import and export prices and U.S. businesses with overseas operations and supply chains,” and had similar concerns with regard to our health, safety and quality of life.
There have been, of course, some efforts to slow the increase of carbon in our atmosphere that causes climate change such as the Paris Climate Agreement. However, these efforts are merely down payments to stem the crisis and, in any case, few countries are actually meeting their Paris targets. The US pulled out of the accord entirely.
The Debt Time Bomb
The US national debt today stands at about $23.5 trillion, or roughly 110% of GDP. That’s a very large, but not catastrophic, number. The deficit in 2020 was expected to be roughly $1 trillion, or about four percent of GDP, but with the impact of the Coronavirus, we can expect it to be at least two to three times that now.
Considering that the economy of the United States grows at about two percent a year on average, any deficit above that level is unsustainable. Clearly, we are far beyond that now and, with baby boomers beginning to retire in massive numbers, Medicare spending is set to explode. At some point, these bills will have to be paid.
Yet focusing solely on financial debt misses a big part of the picture. Not only have we been overspending and under-taxing, we’ve also been massively under-investing. Consider that the American Society of Civil Engineers has estimated that we need to spend $4.5 trillion to repair our broken infrastructure. Add that infrastructure debt to our financial and environmental debt and it likely adds up to $30-$40 trillion, or roughly 150%-200% of GDP.
Much like the dangers of artificial intelligence and the climate crisis, not to mention inevitable crises like the new pandemics that are sure to come, we will eventually have to pay our debts. The only question is how long we want to allow the interest to pile up.
The Visceral Abstract
Some years ago, I wrote about a concept I called the visceral abstract. We often fail to realize how obscure concepts affect our daily lives. The strange theories of quantum mechanics, for example, make modern electronics possible. Einstein’s relativity helps calibrate our GPS satellites. Darwin’s natural selection helps us understand diseases like the Coronavirus.
In much the same way, we find it easy to ignore dangers that don’t seem clear and present. Terminator machines hunting us down in the streets is terrifying, but the very real dangers of data bias in our artificial intelligence systems are easy to dismiss. We worry about how to pay the mortgage next month, while the other debts we are accumulating fade into the background.
The news isn’t all bad, of course. Clearly, the Internet has made it far easier to cope with social distancing. Technologies such as gene sequencing and supercomputing simulations make it more likely that we will find a cure or a vaccine. We have the capacity for both petty foolishness and extreme brilliance.
The future is not inevitable. It is what we make it. We can choose, as we have in the past, to invest in our ability to withstand crises and mitigate their effects, or we can choose to sit idly by and give ourselves up to the whims of fate. We pay the price either way. How we pay it is up to us.
Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.