These 4 Major Paradigm Shifts Will Transform The Future Of Technology


Greg Satell

May 18, 2016


image credit: Pixabay

For the past fifty years or so, technology has followed a fairly predictable path. We squeeze more transistors onto silicon wafers, which makes chips more powerful and devices smaller. Manual processes become automated, productivity increases and life gets better. Rinse and repeat.

Today, we’re at an inflection point and that predictable path to progress will soon be closed off. What lies ahead is a period of extreme disruption in which most of what we’ve come to expect from technology is becoming undone. What replaces it will be truly new and different.

Over the next decade, Moore’s Law will end. Instead of replacing manual labor, technology will automate routine cognitive work. As information technology fades into the background, second-order technologies, such as genomics, nanotechnology and robotics, will take center stage. Here are the four major paradigm shifts that we need to watch and prepare for.

From The Chip to The System

In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper observing that the number of transistors on an integrated circuit was doubling roughly every year, a pace he later revised to every two years. He also predicted that this trend would lead to computers becoming embedded in homes, cars and communication systems.

That simple idea, known today as Moore’s Law, has helped power the digital revolution. As computing performance has become exponentially cheaper and more robust, we have been able to do a lot more with it. Even a basic smartphone today is more powerful than the supercomputers of past generations.
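The power of Moore’s Law is just compounding arithmetic. A minimal sketch of the doubling math (the starting count of ~2,300 transistors, roughly the Intel 4004 of 1971, is an illustrative assumption, not a figure from this article):

```python
def transistors(initial: int, years: int, doubling_period: float = 2.0) -> int:
    """Project a transistor count forward under Moore's Law-style doubling."""
    return int(initial * 2 ** (years / doubling_period))

# Illustrative starting point: ~2,300 transistors (Intel 4004, 1971).
start = 2_300
for years in (10, 20, 40):
    print(f"after {years} years: ~{transistors(start, years):,} transistors")
```

Doubling every two years for forty years multiplies the count by about a million, which is why a modern smartphone chip outclasses the supercomputers of past generations.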

Yet Moore’s Law is now nearing its end. The problem is twofold. First, there are only so many transistors you can squeeze onto a chip before quantum effects cause them to malfunction. Second is the problem known as the von Neumann bottleneck: simply put, it doesn’t matter how fast chips can process if they have to wait too long to communicate with memory and with each other.

So we have to shift our approach from the chip to the system. One approach, called 3D stacking, would combine integrated circuits into a single three-dimensional chip. This is harder than it sounds, because entirely new chip designs have to be devised, but it could increase speeds significantly and allow progress to continue.

From Applications To Architectures

Since the 1960s, when Moore wrote his article, the ever-expanding power of computers has made new applications possible. For example, after the relational model for databases was introduced in 1970, it became possible to store and retrieve massive amounts of information quickly and easily. That, in turn, dramatically changed how organizations could be managed.

Later innovations, like graphic displays, word processors and spreadsheets, set the stage for personal computers to be widely deployed. The Internet led to email, e-commerce and, eventually, mobile computing. In essence, the modern world is little more than the applications that make it possible.

Until now, all of these applications have run on von Neumann machines—devices with a central processing unit paired with data and applications stored in a separate place. So far, that’s worked well enough, but for the things we’ve begun asking computers to do, like power self-driving cars, the von Neumann bottleneck is proving to be too large a constraint.

So the emphasis is moving from developing new applications to developing new architectures that can handle them better. Neuromorphic chips, modeled on the brain itself, promise to be thousands of times more efficient than conventional chips. Quantum computers, which IBM has recently made available in the cloud, are far better suited to certain tasks, such as security applications. FPGA chips can be reconfigured and optimized for still other workloads.

Soon, when we choose to use a specific application, our devices will automatically be switched to the architecture—often, but not always, made available through the cloud—that can run it best.

From Products To Platforms

It used to be that firms looked to launch hit products. If you look at the great firms of the last century, they often rode to prominence on the back of a single great product, like IBM’s System/360, the Apple II or Sony’s Walkman. Those first successes could then lead to secondary ones—like the PC and the Macintosh—and lead to further dominance.

Yet look at successful companies today and they make their money off of platforms. Amazon makes the bulk of its profits from third party sellers, Amazon Prime and cloud computing, all of which are platforms. And what would Apple’s iPhone be without the App Store, where so much of its functionality comes from?

Platforms are important because they allow us to access ecosystems. Amazon’s platform connects ecosystems of retailers to ecosystems of consumers. The App Store connects ecosystems of developers to ecosystems of end users. IBM has learned to embrace open technology platforms because they give it access to capabilities far beyond its own engineers.

The rise of platforms makes it imperative that managers learn to think differently about their businesses. While in the 20th century, firms could achieve competitive advantage by optimizing their value chains, the future belongs to those who can widen and deepen connections.

From Bits To Atoms

In The Rise and Fall of American Growth, economist Robert Gordon argues that the rapid productivity growth the US experienced from 1920 to 1970 is largely a thing of the past. While there may be short spurts of growth, like the one in the late 1990s, we’re not likely to see a sustained period of progress anytime soon.

Among the reasons he gives is that, while earlier innovations such as electricity and the internal combustion engine had broad implications, the impact of digital technology has been fairly narrow. The evidence bears this out. We see, to paraphrase Robert Solow, digital technology just about everywhere except in the productivity statistics.

Still, there are indications that the future may look a good deal different than the past. Digital technology is beginning to power new areas, such as genomics, nanotechnology and robotics, which in turn are already making a deep impact on fields with enormous potential, from renewable energy to medical research to logistics.

It is all too easy to get caught up in old paradigms. When progress is powered by chip performance and the increased capabilities of computer software, we tend to judge the future by those same standards. What we often miss is that paradigms shift and the challenges—and opportunities—of the future are likely to be vastly different.

In an age of disruption, the only viable strategy is to adapt.

This article was written by Greg Satell from Forbes and was legally licensed through the NewsCred publisher network.

There are 2 comments

  • Charles Rosenbury - 06/03/2016 14:46
    An interesting article which presents some perspectives which may not have been obvious to many readers. I question whether these are paradigm shifts. We may look back on them and agree, or we may find that what we are seeing now is simply the next hype. The one comment that I find which seems to be out of place is that technology is not affecting productivity. I find that to be a shallow analysis, as technology is definitely affecting productivity. The problem is that we are already too productive for the consumption rate, so the excess productivity does not show up as productivity, it shows up as unemployment, which simply fuels a decrease in the need for productivity. The paradigm shift which is likely going to emerge is that we need to find a way to deal with the unemployment caused by technology.

  • Phil Richardson CD, CSM-Senior Vice President - 06/03/2016 10:21
    Like many such articles which begin with both a challenging title and fascinating analysis, this one too leaves most readers unsatisfied. The author appears to be knowledgeable, thoughtful and well-intentioned which underscores my comment because he is seeking to share his perception but not really helping the reader to react to it. By this I mean he doesn't offer solutions other than prepare to "adapt". He needs to offer suggestions as to how to do so against the backdrop of disruption in order to complete the article in a way that truly helps the reader. Perhaps he's already planning to do so in an upcoming article. If not, I suggest he does, because otherwise he only leaves us with confusion and fear. I therefore further recommend that to increase the usefulness of his original article, he brings the opinion and wisdom of several specialists such as himself to imagine and propose ways of "adapting" that they believe will work, thereby helping us all to do so. Thanks Phil Richardson
