For the past 50 years, innovation has largely been driven by our ability to cram more transistors onto a silicon wafer. That's what has allowed us to double the power of our technology every two years or so and has kept new products and services streaming out of innovative organizations.
Perhaps not surprisingly, over the past few decades agility has become a defining competitive attribute. Because the fundamentals of digital technology have been so well understood, much of the value has shifted to applications and things like design and user experience. Yet that will change in the years ahead.
Over the next few decades we will struggle to adapt to a post-digital age and we will need to rethink old notions about agility. To win in this new era of innovation we will have to do far more than just move fast and break things. Rather, we will have to manage four profound shifts in the basis of competition that will challenge some of our most deeply held notions.
Shift 1: From Transistor-Based Computers To New Computing Architectures
In 1965, Gordon Moore, who would go on to co-found Intel, published the paper that predicted Moore's Law: the regular doubling of the number of transistors that can fit on an integrated circuit. With a constant stream of chips that were not only more powerful but also cheaper, successful firms could rapidly prototype and iterate to speed new applications to market.
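To get a sense of the compounding involved, here is a minimal sketch in Python, assuming an idealized two-year doubling period (actual intervals varied over the decades):

```python
# Back-of-the-envelope sketch of Moore's Law compounding, assuming a
# clean two-year doubling period (real intervals varied somewhat).
BASE_YEAR = 1965
DOUBLING_PERIOD_YEARS = 2

def growth_factor(year: int) -> float:
    """Growth factor in transistor counts since BASE_YEAR."""
    doublings = (year - BASE_YEAR) / DOUBLING_PERIOD_YEARS
    return 2.0 ** doublings

for year in (1975, 1995, 2015):
    print(f"{year}: ~{growth_factor(year):,.0f}x the 1965 transistor count")
```

Twenty-five doublings over fifty years works out to a factor of more than 30 million, which is why the winning strategy was to iterate on applications rather than reinvent the underlying hardware.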
Yet now Moore's Law is ending. Despite the amazing ingenuity of engineers, the simple reality is that every technology eventually hits theoretical limits. Atoms are only so small and the speed of light is only so fast, and that constrains what we can do with transistors. To advance further, we will have to find a different way to compute things.
The two most promising candidates are quantum computing and neuromorphic chips. Both are vastly different from digital computing: they rely on different logic and require different programming languages and algorithmic approaches than classical computers. The transition to these architectures won't be seamless.
We will also use these architectures in much different ways. Quantum computers will be able to handle almost incomprehensible complexity, generating computing spaces larger than the number of atoms in the known universe. Neuromorphic chips are potentially millions of times more efficient than conventional chips and handle continuous streams of data far more effectively, so they may be well suited for edge computing and tasks like machine vision.
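That claim about computing spaces is easy to sanity-check with back-of-the-envelope arithmetic: an n-qubit register is described by 2^n amplitudes, and a common order-of-magnitude estimate puts the number of atoms in the observable universe at roughly 10^80. A minimal Python sketch:

```python
import math

# Common order-of-magnitude estimate for atoms in the observable universe.
ATOMS_IN_UNIVERSE = 1e80

# An n-qubit register is described by 2**n complex amplitudes, so its
# state space doubles with every qubit added.
qubits = math.ceil(math.log2(ATOMS_IN_UNIVERSE))
print(f"A register of ~{qubits} qubits spans more than 10^80 states")
```

By this count, roughly 266 ideal, error-free qubits would already suffice, though practical machines will need many more physical qubits for error correction.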
Shift 2: From Bits To Atoms
The 20th century saw two major waves of innovation. The first, dominated by electricity and internal combustion, revolutionized how we could manipulate the physical world. The second, driven by quantum physics, microbial science and computing, transformed how we could work with the microscopic and the virtual.
The past few decades have been dominated by the digital revolution, and it seems like things have been moving very fast, but looks can be deceiving. If you walked into an average 1950s-era household, you would see much that you would recognize, including home appliances, a TV and an automobile. On the other hand, if you had to live in a 1900s-era home, with no running water or electricity, you would struggle to survive.
The next era will combine aspects of both waves, essentially using bits to drive atoms. We're building vast databases of genes and materials, cataloging highly specific aspects of the physical world. We are also using powerful machine learning algorithms to analyze these vast troves of data and derive insights. The revolution underway is so profound that it's reshaping the scientific method.
In the years to come, new computing architectures are likely to accelerate this process. Simulating chemistry is one of the first applications being explored for quantum computers, which will help us build larger and more detailed databases. Neuromorphic technology will allow us to shift from the cloud to the edge, enabling factories to get much smarter.
The way we interface with the physical world is changing as well. New techniques such as CRISPR help us edit genes at will. There is also an emerging revolution in materials science that will transform areas like energy and manufacturing. These trends are still somewhat nascent, but they have truly transformative potential.
Shift 3: From Rapid Iteration To Exploration
Over the past 30 years, we’ve had the luxury of working with technologies we understand extremely well. Every generation of microchips opened vast new possibilities, but worked exactly the same way as the last generation, creating minimal switching costs. The main challenge was to design applications.
So it shouldn’t be surprising that rapid iteration emerged as a key strategy. When you understand the fundamental technology that underlies a product or service, you can move quickly, trying out nearly endless permutations until you arrive at an optimized solution. That’s often far more effective than a planned, deliberate approach.
Over the next decade or two, however, the challenge will be to advance technology that we don’t understand well at all. As noted above, quantum and neuromorphic computing are still in their nascent stages. Improvements in genomics and materials science are redefining the boundaries of those fields. There are also ethical issues involved with artificial intelligence and genomics that will require us to tread carefully.
So in the future, we will need to put greater emphasis on exploration to understand these new technologies and how they relate to our businesses. Instead of looking to disrupt markets, we will need to pursue grand challenges to solve fundamental problems. Most of all, it’s imperative to start early. By the time many of these technologies hit their stride, it will be too late to catch up.
Shift 4: From Hypercompetition To Mass Collaboration
The competitive environment we’ve become used to has been relatively simple. For each particular industry, there have been distinct ecosystems based on established fields of expertise. Competing firms raced to transform fairly undifferentiated inputs into highly differentiated products and services. You needed to move fast to get an edge.
This new era, on the other hand, will be one of mass collaboration in which government partners with academia and industry to explore new technologies in the pre-competitive phase. For example, the Joint Center for Energy Storage Research combines the work of five national labs, a dozen or so academic institutions and hundreds of companies to develop advanced batteries. The Covid pandemic has further redefined how scientists collaborate across institutional barriers.
Or consider the Manufacturing Institutes set up under the Obama administration. Focusing on everything from advanced fabrics to biopharmaceuticals, these institutes allow companies to collaborate with government labs and top academics to develop the next generation of technologies. They also operate dozens of testing facilities to help bring new products to market faster.
I've visited some of these facilities and have had the opportunity to talk with executives from participating companies. What struck me was the palpable excitement about the possibilities of this new era. Agility for them didn't mean learning to run faster down a chosen course, but widening and deepening connections throughout a technological ecosystem.
Over the past few decades, we have largely been moving faster and faster down a predetermined path. Over the next few decades, however, we’ll increasingly need to explore multiple domains at once and combine them into something that produces value. We’ll need to learn how to go slower to deliver much larger impacts.