NEWS

Moore's Law at 50: How much longer can it last?

Scott Tilley

In technology it is axiomatic that the Apple computer you want will always cost $2,500. This has been true for decades. But while the cost may remain the same, the computer itself changes dramatically: it's always faster than the one you currently own.

This increase in computational power has tracked a trend known as Moore's Law, which holds that the number of transistors on a chip doubles approximately every two years. The prediction of exponential growth has held true since Intel co-founder Gordon Moore first made it in 1965.

In 1971, Intel's first microprocessor, the 4004, had about 2,300 transistors. Today, Intel makes processors with over 5 billion transistors – all crammed into a space about the size of your fingernail.
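
To see how quickly that doubling compounds, here is a short Python sketch. It is purely illustrative: the 2,300-transistor starting point and the strict two-year doubling are simplifying assumptions, not Intel's actual product history.

# Back-of-the-envelope Moore's Law projection (illustrative only).
# Assumes ~2,300 transistors in 1971 and one doubling every 2 years.

START_YEAR = 1971
START_TRANSISTORS = 2_300   # roughly the Intel 4004
DOUBLING_PERIOD = 2         # years per doubling

def projected_transistors(year):
    """Project the transistor count for a given year under Moore's Law."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1985, 2000, 2015):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")

Run for 2015, the projection lands near 10 billion – within a factor of two of the 5-billion-transistor chips Intel actually ships, which is striking accuracy for a 50-year-old extrapolation.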

Every year it gets harder to maintain this incredible pace of miniaturization. Engineers at Intel and other companies are running up against the laws of physics. As transistors get smaller, the wires connecting them get smaller too, and the electrons that flow through the chips start to "leak" across boundaries that are now just a few atoms wide. Commercial chips are now manufactured at 14 nanometers – a feature size about 7,000 times smaller than the width of a human hair, which is roughly 100,000 nanometers across.

Increased chip density and fast clock speeds cause these tiny chips to run very hot. When you hear the fan blowing inside your laptop computer, it's because the chip inside is getting too warm. This is why you rarely hear about GHz in advertising anymore: individual chips are not getting much faster. Instead, they rely on parallelism to increase processing throughput. My MacBook Air is quad-core, which means the CPU actually has four separate processors on the same physical chip. More cores means more work done at the same time.
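
For readers who program, here is a minimal Python sketch of what relying on parallelism looks like in practice. The prime-counting task and the four-way split are illustrative stand-ins for any processor-heavy job, not a real benchmark.

# Minimal sketch: spreading a CPU-bound task across four cores.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division - deliberately CPU-heavy."""
    lo, hi = bounds
    return sum(
        n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
        for n in range(lo, hi)
    )

if __name__ == "__main__":
    # Split the range 0..200,000 into 4 chunks, one per core.
    chunks = [(i * 50_000, (i + 1) * 50_000) for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(f"primes below 200,000: {total}")

Each worker process runs on its own core, so in the best case the job finishes roughly four times sooner than on a single core; in practice, coordination overhead and the parts of a program that can't be split keep real speedups lower.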

Every time pundits predict the end of Moore's Law, the industry finds a way to innovate around the problems. Sophisticated manufacturing processes and increasingly esoteric chip materials are used to shrink transistor sizes ever further. But eventually we'll hit a wall, and no more miniaturization or parallelization will be possible. So what will we do then?

The solution may be to reexamine the basic design of today's microprocessors. They may be faster, smaller, and more powerful than the chips of the past, but they are still fundamentally the same device: they rely on binary digits and a traditional model of computation implemented in the von Neumann architecture. A dramatically different – and potentially far more powerful – model of computation is quantum computing. That's where the technological future may lead.

Scott Tilley is a professor at the Florida Institute of Technology in Melbourne. Contact him at TechnologyToday@srtilley.com.