In 1973, in the wake of the Arab defeat in the Yom Kippur War with Israel, OPEC instituted an oil embargo on America and its allies. The immediate effects of the crisis were a surge in gas prices and a recession in the West. The ripple effects, however, were far more complex and played out over decades.
The rise in oil prices brought much-needed hard currency to the Soviet Union, prolonging its existence and setting the stage for its later demise. The American auto industry, with its passion for big, gas-guzzling cars, lost ground to emerging Japanese automakers and their smaller, more fuel-efficient models. The new consciousness of conservation led to the establishment of the Department of Energy.
Today the Covid-19 crisis has delivered a shock to the system, and we're at a similar inflection point. The most immediate effects have been economic recession and the rapid adoption of digital tools, such as video conferencing. Over the next decade or so, however, these short-term impacts will combine with longer-standing trends to reshape technology and society.
Pervasive Transformation
We tend to think about innovation as if it were a single event, but the truth is that it's a process of discovery, engineering and transformation that takes decades to run its course. For example, Alan Turing discovered the principles of a universal computer in 1936, but it wasn't until the 1950s and 60s that digital computers became commercially available.
Even then, digital technology didn't become truly transformational until the mid-90s. By then, it was well enough understood to make the leap from highly integrated systems to modular ecosystems, making the technology cheaper, more functional and more reliable. The number of applications exploded and the market grew quickly.
Still, as the Covid-19 crisis has made clear, we've really just been scratching the surface. Although digital technology certainly accelerated the pace of work, it did fairly little to fundamentally change the nature of work itself. People still commuted to an office, where they would attend meetings in person, losing hours of productive time every day.
Over the next decade, we will see pervasive transformation. As Mark Zuckerberg has pointed out, once people can work remotely, they can work from anywhere, which will change the nature of cities. Instead of “offsite” meetings, we may very well have “onsite” meetings, where people travel from their home cities to headquarters for more active collaboration.
These trends will combine with nascent technologies like artificial intelligence and blockchain to revolutionize business processes and supply chains. Organizations that cannot adopt key technologies will very likely find themselves unable to compete.
The Rise Of Heterogeneous Computing
The digital age did not begin with personal computers in the 70s and 80s, but back in the 1950s, with the shift from electromechanical calculating machines to transistor-based mainframes. However, because so few people used computers back then (they were largely relegated to obscure back-office tasks and complex scientific calculations), the transformation took place largely out of public view.
A similar process is taking place today with new architectures such as quantum and neuromorphic computing. While these technologies are not yet commercially viable, they are advancing quickly and will eventually become thousands, if not millions, of times more effective than digital systems.
However, what’s most important to understand is that they are fundamentally different from digital computers and from each other. Quantum computers will create incredibly large computing spaces that can handle unimaginable complexity. Neuromorphic systems, modeled on the human brain, will be massively powerful, vastly more efficient and far more responsive.
So over the next decade we’ll be shifting to a heterogeneous computing environment, where we use different architectures for different tasks. Most likely, we’ll still use digital technology as an interface to access systems, but increasingly performance will be driven by more advanced architectures.
A Shift From Bits to Atoms
The digital revolution created a virtual world. My generation was the first to grow up with video games, and our parents worried that we were becoming detached from reality. Then computers entered offices and Dan Bricklin created VisiCalc, the first spreadsheet program. Eventually smartphones and social media appeared, and we began spending almost as much time in the virtual world as we did in the physical one.
Essentially, what we created was a simulation economy. We could experiment with business models in our computers, find flaws and fix them before they became real. Computer-aided design (CAD) software allowed us to quickly and cheaply design products in bits before we got down to the hard, slow work of shaping atoms. Because it’s much cheaper to fail in the virtual world than in the physical one, this made our economy more efficient.
Today we’re doing similar things at the molecular level. For example, digital technology was combined with synthetic biology to quickly sequence the Covid-19 virus. These same technologies then allowed scientists to design vaccines in days and to bring them to market in less than a year.
A parallel revolution is taking place in materials science, while at the same time digital technology is beginning to revolutionize traditional industries such as manufacturing and agriculture. The expanded capabilities of heterogeneous computing will accelerate these trends over the next few decades.
What’s important to understand is that we spend vastly more money on atoms than bits. Even at this advanced stage, information technologies only make up about 6% of GDP in advanced economies. Clearly, there is a lot more opportunity in the other 94%, so the potential of the post-digital world is likely to far outstrip anything we’ve seen in our lifetimes.
Collaboration Is The New Competitive Advantage
When I think back to our first computer in the 1980s, I marvel at how different the world was then. We didn’t have email or mobile phones, so unless someone was at home or in the office, they were largely unreachable. Without GPS, we had to either remember where things were or ask for directions.
These technologies have clearly changed our lives dramatically, but they were also fairly simple. Email, mobile and GPS were largely standalone technologies. There were, of course, technical challenges, but these were relatively narrow. The “killer apps” of the post-digital era will require a much higher degree of collaboration across a much more diverse set of skills.
To understand how different this new era of innovation will be, consider how IBM developed the PC. Essentially, it sent some talented engineers to Boca Raton for a year and, in that time, they developed a marketable product. For quantum computing, however, IBM is building a vast network that includes national labs, research universities, startups and industrial partners.
The same will be true of the post-Covid world. It’s no accident that Zoom has become the killer app of the pandemic. The truth is that the challenges we will face over the next decade will be far too complex for any one organization to tackle alone. That’s why collaboration is becoming the new competitive advantage. Power will reside not at the top of hierarchies, but at the center of networks and ecosystems.