Moore's law


Moore's law is the observation that the number of transistors in a dense integrated circuit doubles about every two years. Moore's law is an observation and projection of a historical trend. Rather than a law of physics, it is an empirical relationship linked to gains from experience in production.
The observation is named after Gordon Moore, the co-founder of Fairchild Semiconductor and Intel and former CEO of the latter, who in 1965 posited a doubling every year in the number of components per integrated circuit, and projected this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years, a compound annual growth rate of about 41%. While Moore did not use empirical evidence in forecasting that the historical trend would continue, his prediction has held since 1975 and has since become known as a "law."
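As a quick arithmetic check (not part of the original sources), the growth rate implied by a doubling every two years works out to roughly 41% per year:

```latex
% Compound annual growth rate implied by a doubling every two years
\[
  \text{CAGR} \;=\; 2^{1/2} - 1 \;\approx\; 1.414 - 1 \;\approx\; 41\%.
\]
```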
Moore's prediction has been used in the semiconductor industry to guide long-term planning and to set targets for research and development, thus functioning to some extent as a self-fulfilling prophecy. Advancements in digital electronics, such as the reduction in quality-adjusted microprocessor prices, the increase in memory capacity, the improvement of sensors, and even the number and size of pixels in digital cameras, are strongly linked to Moore's law. These step changes in digital electronics have been a driving force of technological and social change, productivity, and economic growth.
Industry experts have not reached a consensus on exactly when Moore's law will cease to apply. Microprocessor architects report that semiconductor advancement has slowed industry-wide since around 2010, below the pace predicted by Moore's law. However, as of 2018, leading semiconductor manufacturers have developed IC fabrication processes in mass production which are claimed to keep pace with Moore's law.

History of the concept

In 1959, Douglas Engelbart discussed the projected downscaling of integrated circuit size in the article "Microelectronics, and the Art of Similitude". Engelbart presented his ideas at the 1960 International Solid-State Circuits Conference, where Moore was present in the audience.
That same year, Mohamed Atalla and Dawon Kahng invented the MOSFET, also known as the MOS transistor, at Bell Labs. The MOSFET was the first truly compact transistor that could be miniaturized and mass-produced for a wide range of uses, with its high scalability and low power consumption resulting in a higher transistor density and making it possible to build high-density IC chips. In the early 1960s, Gordon E. Moore recognized that the ideal electrical and scaling characteristics of MOSFET devices would lead to rapidly increasing integration levels and unparalleled growth in electronic applications.
In 1965, Gordon Moore, who at the time was working as the director of research and development at Fairchild Semiconductor, was asked to contribute to the thirty-fifth anniversary issue of Electronics magazine with a prediction on the future of the semiconductor components industry over the next ten years. His response was a brief article entitled "Cramming more components onto integrated circuits". Within his editorial, he speculated that by 1975 it would be possible to contain as many as 65,000 components on a single quarter-square-inch semiconductor.
The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years.

Moore posited a log-linear relationship between device complexity and time. In a 2015 interview, Moore noted of the 1965 article: "...I just did a wild extrapolation saying it’s going to continue to double every year for the next 10 years."
In 1974, Robert H. Dennard at IBM recognized the rapid scaling of MOSFET technology and formulated what became known as Dennard scaling, which states that as MOS transistors get smaller, their power density stays constant, so that power use remains in proportion to area. MOSFET scaling and miniaturization have been the key driving forces behind Moore's law. Evidence from the semiconductor industry shows that this inverse relationship between power density and areal density broke down in the mid-2000s.
At the 1975 IEEE International Electron Devices Meeting, Moore revised his forecast rate, predicting semiconductor complexity would continue to double annually until about 1980, after which it would decrease to a rate of doubling approximately every two years. He outlined several contributing factors for this exponential behavior.
Shortly after 1975, Caltech professor Carver Mead popularized the term "Moore's law". Moore's law eventually came to be widely accepted as a goal for the semiconductor industry, and it was cited by competitive semiconductor manufacturers as they strove to increase processing power. Moore viewed his eponymous law as surprising and optimistic: "Moore's law is a violation of Murphy's law. Everything gets better and better." The observation was even seen as a self-fulfilling prophecy.
The doubling period is often misquoted as 18 months because of a prediction by Moore's colleague, Intel executive David House. In 1975, House noted that Moore's revised law of doubling transistor count every two years in turn implied that computer chip performance would roughly double every 18 months. Moore's law is closely related to MOSFET scaling, as the rapid scaling and miniaturization of MOSFETs is the key driving force behind Moore's law. Mathematically, Moore's law predicted that transistor count would double every two years due to shrinking transistor dimensions and other improvements. As a consequence of shrinking dimensions, Dennard scaling predicted that power consumption per unit area would remain constant. Combining these effects, David House deduced that computer chip performance would roughly double every 18 months. Also due to Dennard scaling, this increased performance would not be accompanied by increased power; in other words, the energy efficiency of silicon-based computer chips roughly doubles every 18 months. Dennard scaling ended in the 2000s. Koomey later showed that a similar rate of efficiency improvement predated silicon chips and Moore's law, for technologies such as vacuum tubes.
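A rough reconstruction of House's estimate (a sketch under the assumptions above, not House's own derivation): if transistor count doubles every 24 months and Dennard scaling adds roughly 40% clock frequency per generation, performance grows about 2.8-fold per generation, which corresponds to a doubling time of roughly a year and a half.

```latex
% Performance gain per 24-month generation: more transistors times higher clock
\[
  \underbrace{2\times}_{\text{transistor count}} \;\times\; \underbrace{1.4\times}_{\text{clock frequency}} \;\approx\; 2.8\times
\]
% Equivalent doubling time for performance
\[
  T_{2} \;=\; 24\ \text{months} \times \frac{\ln 2}{\ln 2.8} \;\approx\; 16\ \text{months},
\]
% which is close to House's commonly quoted figure of 18 months.
```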
Image: An Osborne Executive portable computer, from 1982, with a Zilog Z80 4 MHz CPU, and a 2007 Apple iPhone with a 412 MHz ARM11 CPU; the Executive weighs 100 times as much, is nearly 500 times the volume, cost approximately 10 times as much, and has 1/103rd the clock frequency of the smartphone.
Microprocessor architects report that since around 2010, semiconductor advancement has slowed industry-wide below the pace predicted by Moore's law. Brian Krzanich, the former CEO of Intel, cited Moore's 1975 revision as a precedent for the current deceleration, which results from technical challenges and is "a natural part of the history of Moore's law". The rate of improvement associated with shrinking physical dimensions, known as Dennard scaling, also ended in the mid-2000s. As a result, much of the semiconductor industry has shifted its focus to the needs of major computing applications rather than semiconductor scaling. Nevertheless, leading semiconductor manufacturers TSMC and Samsung Electronics have claimed to keep pace with Moore's law with 10 nm and 7 nm nodes in mass production and 5 nm nodes in risk production.

Moore's second law

As the cost of computer power to the consumer falls, the cost for producers to fulfill Moore's law follows an opposite trend: R&D, manufacturing, and test costs have increased steadily with each new generation of chips. Rising manufacturing costs are an important consideration for the sustaining of Moore's law. This has led to the formulation of Moore's second law, also called Rock's law, which is that the capital cost of a semiconductor fab also increases exponentially over time.

Major enabling factors

Numerous innovations by scientists and engineers have sustained Moore's law since the beginning of the IC era. Some of the key innovations are listed below, as examples of breakthroughs that have advanced integrated circuit and semiconductor device fabrication technology, allowing transistor counts to grow by more than seven orders of magnitude in less than five decades.
Computer industry technology road maps predicted in 2001 that Moore's law would continue for several generations of semiconductor chips.

Recent trends

One of the key challenges of engineering future nanoscale transistors is the design of gates. As device dimensions shrink, controlling the current flow in the thin channel becomes more difficult. Compared to FinFETs, which have gate dielectric on three sides of the channel, the gate-all-around MOSFET structure provides even better gate control, since the gate surrounds the channel entirely.
Microprocessor architects report that semiconductor advancement has slowed industry-wide since around 2010, below the pace predicted by Moore's law. Brian Krzanich, the former CEO of Intel, announced, "Our cadence today is closer to two and a half years than two." Intel stated in 2015 that improvements in MOSFET devices have slowed, starting at the 22 nm feature width around 2012, and continuing at 14 nm.
The physical limits to transistor scaling have been reached due to source-to-drain leakage, limited gate metals, and limited options for channel material. Other approaches, which do not rely on physical scaling, are being investigated. These include spintronics (exploiting the spin state of electrons), tunnel junctions, and advanced confinement of channel materials via nanowire geometry. Spin-based logic and memory options are being actively developed in laboratories.

Alternative materials research

The vast majority of current transistors on ICs are composed principally of doped silicon and its alloys. As silicon is fabricated into single-nanometer transistors, short-channel effects adversely change the desired material properties of silicon as a functional transistor. Several non-silicon substitutes for the fabrication of small-nanometer transistors are described below.
One proposed material is indium gallium arsenide, or InGaAs. Compared to their silicon and germanium counterparts, InGaAs transistors are more promising for future high-speed, low-power logic applications. Because of intrinsic characteristics of III-V compound semiconductors, quantum well and tunnel effect transistors based on InGaAs have been proposed as alternatives to more traditional MOSFET designs.
Biological computing research shows that biological material has superior information density and energy efficiency compared to silicon-based computing.
Image: graphene in its hexagonal lattice structure.
Various forms of graphene are being studied for graphene electronics; for example, graphene nanoribbon transistors have shown great promise since their appearance in publications in 2008. More research will need to be performed, however, on sub-50 nm graphene layers, as their resistivity increases and electron mobility therefore decreases.

Forecasts and roadmaps

In April 2005, Gordon Moore stated in an interview that the projection cannot be sustained indefinitely: "It can't continue forever. The nature of exponentials is that you push them out and eventually disaster happens." He also noted that transistors eventually would reach the limits of miniaturization at atomic levels.
In 2016 the International Technology Roadmap for Semiconductors, after using Moore's Law to drive the industry since 1998, produced its final roadmap. It no longer centered its research and development plan on Moore's law. Instead, it outlined what might be called the More than Moore strategy in which the needs of applications drive chip development, rather than a focus on semiconductor scaling. Application drivers range from smartphones to AI to data centers.
IEEE began a road-mapping initiative in 2016, named the International Roadmap for Devices and Systems.
Most forecasters, including Gordon Moore, expect Moore's law will end by around 2025.

Consequences

Digital electronics have contributed to world economic growth in the late twentieth and early twenty-first centuries. The primary driving force of economic growth is the growth of productivity, and Moore's law factors into productivity. Moore expected that "the rate of technological progress is going to be controlled from financial realities". The reverse could and did occur around the late 1990s, however, with economists reporting that "Productivity growth is the key economic indicator of innovation." Moore's law describes a driving force of technological and social change, productivity, and economic growth.
An acceleration in the rate of semiconductor progress contributed to a surge in U.S. productivity growth, which reached 3.4% per year in 1997–2004, outpacing the 1.6% per year during both 1972–1996 and 2005–2013. As economist Richard G. Anderson notes, "Numerous studies have traced the cause of the productivity acceleration to technological innovations in the production of semiconductors that sharply reduced the prices of such components and of the products that contain them."
An alternative source of improved performance is in microarchitecture techniques exploiting the growth of available transistor count. Out-of-order execution and on-chip caching and prefetching reduce the memory latency bottleneck at the expense of using more transistors and increasing the processor complexity. These increases are described empirically by Pollack's Rule, which states that performance increases due to microarchitecture techniques approximate the square root of the complexity of a processor.
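As a concrete illustration of Pollack's Rule (illustrative arithmetic, not a figure from a specific source): doubling the transistor budget devoted to a single core yields only about a 40% performance gain.

```latex
\[
  \text{performance} \;\propto\; \sqrt{\text{complexity}}
  \quad\Longrightarrow\quad
  \frac{P_{\text{new}}}{P_{\text{old}}} \;=\; \sqrt{\frac{2N}{N}} \;=\; \sqrt{2} \;\approx\; 1.41.
\]
```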
For years, processor makers delivered increases in clock rates and instruction-level parallelism, so that single-threaded code executed faster on newer processors with no modification. Now, to manage CPU power dissipation, processor makers favor multi-core chip designs, and software has to be written in a multi-threaded manner to take full advantage of the hardware. Many multi-threaded development paradigms introduce overhead, and will not see a linear increase in speed versus the number of processors. This is particularly true while accessing shared or dependent resources, due to lock contention. This effect becomes more noticeable as the number of processors increases. There are cases where a roughly 45% increase in processor transistors has translated to only a 10–20% increase in processing power.
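The diminishing returns from adding cores can be illustrated with Amdahl's law, the standard model for this effect (not named in the text above): if a fraction s of the work is inherently serial, the speedup on n cores is bounded, no matter how many cores are added.

```latex
\[
  S(n) \;=\; \frac{1}{\,s + \dfrac{1-s}{n}\,},
  \qquad
  \text{e.g. } s = 0.1:\quad S(4) \approx 3.1,\quad S(16) \approx 6.4,\quad S(\infty) = 10.
\]
```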
On the other hand, manufacturers are adding specialized processing units to deal with features such as graphics, video, and cryptography. For one example, Intel's Parallel JavaScript extension not only adds support for multiple cores, but also for the other non-general processing features of their chips, as part of the migration in client side scripting toward HTML5.
A negative implication of Moore's law is obsolescence, that is, as technologies continue to rapidly "improve", these improvements may be significant enough to render predecessor technologies obsolete rapidly. In situations in which security and survivability of hardware or data are paramount, or in which resources are limited, rapid obsolescence may pose obstacles to smooth or continued operations.
Because of the toxic materials used in the production of modern computers, obsolescence, if not properly managed, may lead to harmful environmental impacts. On the other hand, obsolescence may sometimes be desirable to a company which can profit immensely from the regular purchase of what is often expensive new equipment instead of retaining one device for a longer period of time. Those in the industry are well aware of this, and may utilize planned obsolescence as a method of increasing profits.
Moore's law has affected the performance of other technologies significantly: Michael S. Malone wrote of a Moore's War following the apparent success of shock and awe in the early days of the Iraq War. Progress in the development of guided weapons depends on electronic technology. Improvements in circuit density and low-power operation associated with Moore's law also have contributed to the development of technologies including mobile telephones and 3-D printing.

Other formulations and similar observations

Several measures of digital technology are improving at exponential rates related to Moore's law, including the size, cost, density, and speed of components. Moore wrote only about the density of components, "a component being a transistor, resistor, diode or capacitor", at minimum cost.
Transistors per integrated circuit – The most popular formulation is of the doubling of the number of transistors on ICs every two years. At the end of the 1970s, Moore's law became known as the limit for the number of transistors on the most complex chips. The graph at the top shows this trend holds true today. As of 2017, the commercially available processor possessing the highest number of transistors is the 48-core Qualcomm Centriq with over 18 billion transistors.
Density at minimum cost per transistor – This is the formulation given in Moore's 1965 paper. It is not just about the density of transistors that can be achieved, but about the density of transistors at which the cost per transistor is the lowest.
As more transistors are put on a chip, the cost to make each transistor decreases, but the chance that the chip will not work due to a defect increases. In 1965, Moore examined the density of transistors at which cost is minimized, and observed that, as transistors were made smaller through advances in photolithography, this number would increase at "a rate of roughly a factor of two per year".
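The trade-off Moore described can be sketched numerically. The exponential yield model, the constants, and the function name below are illustrative assumptions, not figures from Moore's 1965 paper; they only show why the cost per working component has a minimum at a finite component count.

```python
# Sketch of the "minimum cost per component" argument. All constants and the
# yield model are hypothetical; only the shape of the curve matters.
import math

DIE_COST = 100.0        # assumed fixed cost of processing one die
DEFECT_RATE = 0.00005   # assumed defect probability per component

def cost_per_good_component(n_components: int) -> float:
    """Cost per *working* component on a die carrying n_components.

    Packing more components spreads the fixed die cost over more parts,
    but yield (modeled as an exponential in component count) falls, so the
    cost per good component passes through a minimum.
    """
    yield_fraction = math.exp(-DEFECT_RATE * n_components)
    return DIE_COST / (n_components * yield_fraction)

if __name__ == "__main__":
    for n in (1_000, 10_000, 20_000, 50_000, 100_000):
        print(f"{n:>7} components: {cost_per_good_component(n):.4f} per good component")
```

With these illustrative numbers the minimum falls at 1/DEFECT_RATE = 20,000 components; improvements such as finer photolithography effectively lower the defect rate per component, pushing the cost-minimizing component count higher over time, which is the trend Moore extrapolated.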
Dennard scaling – This posits that power usage would decrease in proportion to the area of transistors. Combined with Moore's law, performance per watt would grow at roughly the same rate as transistor density, doubling every 1–2 years. According to Dennard scaling, transistor dimensions would be scaled by 30% every technology generation, thus reducing their area by 50%. This would reduce the delay by 30% and therefore increase operating frequency by about 40%. Finally, to keep the electric field constant, voltage would be reduced by 30%, reducing energy by 65% and power by 50%. Therefore, in every technology generation transistor density would double and circuits become 40% faster, while power consumption stays the same. Dennard scaling came to an end in 2005–2010, due to leakage currents.
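These percentages follow from the classical Dennard scaling rules; a sketch of the arithmetic, assuming every linear dimension and the supply voltage scale by a factor k ≈ 0.7 per generation:

```latex
\[
\begin{aligned}
  \text{area per transistor}  &\;\propto\; k^{2} \approx 0.5 &&\text{(density doubles)}\\
  \text{gate delay}           &\;\propto\; k \approx 0.7 &&\text{(frequency } 1/k \approx 1.4\text{, i.e. about 40\% faster)}\\
  \text{switching energy}     &\;\propto\; C V^{2} \;\propto\; k^{3} \approx 0.35 &&\text{(about 65\% less energy)}\\
  \text{power per transistor} &\;\propto\; \text{energy} \times f \;\propto\; k^{2} \approx 0.5 &&\text{(about 50\% less power)}\\
  \text{power density}        &\;\propto\; \frac{\text{power per transistor}}{\text{area per transistor}} = \text{constant.}
\end{aligned}
\]
```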
The exponential processor transistor growth predicted by Moore does not always translate into exponentially greater practical CPU performance. Since around 2005–2007, Dennard scaling has ended, so even though Moore's law continued for several years after that, it has not yielded dividends in improved performance. The primary reason cited for the breakdown is that at small sizes, current leakage poses greater challenges, and also causes the chip to heat up, which creates a threat of thermal runaway and therefore, further increases energy costs.
The breakdown of Dennard scaling prompted a greater focus on multicore processors, but the gains offered by switching to more cores are lower than the gains that would be achieved had Dennard scaling continued. In another departure from Dennard scaling, Intel microprocessors adopted a non-planar tri-gate FinFET at 22 nm in 2012 that is faster and consumes less power than a conventional planar transistor. The rate of performance improvement for single-core microprocessors has slowed significantly. Single-core performance was improving by 52% per year in 1986–2003 and 23% per year in 2003–2011, but slowed to just 7% per year in 2011–2018.
Quality adjusted price of IT equipment – The price of information technology, computers and peripheral equipment, adjusted for quality and inflation, declined 16% per year on average over the five decades from 1959 to 2009. The pace accelerated, however, to 23% per year in 1995–1999, triggered by faster IT innovation, and later slowed to 2% per year in 2010–2013.
While quality-adjusted microprocessor price improvement continues, the rate of improvement likewise varies, and is not linear on a log scale. Microprocessor price improvement accelerated during the late 1990s, reaching 60% per year versus the typical 30% improvement rate during the years earlier and later. Laptop microprocessors in particular improved 25–35% per year in 2004–2010, and slowed to 15–25% per year in 2010–2013.
The number of transistors per chip cannot explain quality-adjusted microprocessor prices fully. Moore's 1995 paper does not limit Moore's law to strict linearity or to transistor count, "The definition of 'Moore's Law' has come to refer to almost anything related to the semiconductor industry that on a semi-log plot approximates a straight line. I hesitate to review its origins and by doing so restrict its definition."
Hard disk drive areal density – A similar prediction was made in 2005 for hard disk drive areal density. The prediction was later viewed as over-optimistic. Several decades of rapid progress in areal density slowed around 2010, from 30–100% per year to 10–15% per year, because of noise related to smaller grain size of the disk media, thermal stability, and writability using available magnetic fields.
Fiber-optic capacity – The number of bits per second that can be sent down an optical fiber increases exponentially, faster than Moore's law; this observation is known as Keck's law, in honor of Donald Keck.
Network capacity – According to Gerald Butters, the former head of Lucent's Optical Networking Group at Bell Labs, there is another version, called Butters' Law of Photonics, a formulation that deliberately parallels Moore's law. Butters' law says that the amount of data coming out of an optical fiber is doubling every nine months. Thus, the cost of transmitting a bit over an optical network decreases by half every nine months. The availability of wavelength-division multiplexing increased the capacity that could be placed on a single fiber by as much as a factor of 100. Optical networking and dense wavelength-division multiplexing are rapidly bringing down the cost of networking, and further progress seems assured. As a result, the wholesale price of data traffic collapsed in the dot-com bubble. Nielsen's Law says that the bandwidth available to users increases by 50% annually.
Pixels per dollar – Similarly, Barry Hendy of Kodak Australia has plotted pixels per dollar as a basic measure of value for a digital camera, demonstrating the historical linearity of this market and the opportunity to predict the future trend of digital camera price, LCD and LED screens, and resolution.
The great Moore's law compensator, also known as Wirth's law – generally is referred to as software bloat and is the principle that successive generations of computer software increase in size and complexity, thereby offsetting the performance gains predicted by Moore's law. In a 2008 article in InfoWorld, Randall C. Kennedy, formerly of Intel, introduced this term using successive versions of Microsoft Office between the year 2000 and 2007 as his premise. Despite the gains in computational performance during this time period according to Moore's law, Office 2007 performed the same task at half the speed on a prototypical year-2007 computer as compared to Office 2000 on a year-2000 computer.
Library expansion – was calculated in 1945 by Fremont Rider to double in capacity every 16 years, if sufficient space were made available. He advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on-demand for library patrons or other institutions. He did not foresee the digital technology that would follow decades later to replace analog microform with digital imaging, storage, and transmission media. Automated, potentially lossless digital technologies allowed vast increases in the rapidity of information growth in an era that now sometimes is called the Information Age.
Carlson curve – is a term coined by The Economist to describe the biotechnological equivalent of Moore's law, and is named after author Rob Carlson. Carlson accurately predicted that the doubling time of DNA sequencing technologies would be at least as fast as Moore's law. Carlson Curves illustrate the rapid decreases in cost, and increases in performance, of a variety of technologies, including DNA sequencing, DNA synthesis, and a range of physical and computational tools used in protein expression and in determining protein structures.
Eroom's law – is a pharmaceutical drug development observation which was deliberately written as Moore's Law spelled backwards in order to contrast it with the exponential advancements of other forms of technology over time. It states that the cost of developing a new drug roughly doubles every nine years.
Experience curve effects – Each doubling of the cumulative production of virtually any product or service is accompanied by an approximately constant percentage reduction in the unit cost. The acknowledged first documented qualitative description of this dates from 1885. A power curve was used to describe this phenomenon in a 1936 discussion of the cost of airplanes.
Edholm's law – Phil Edholm observed that the bandwidth of telecommunication networks is doubling every 18 months. The bandwidths of online communication networks have risen from bits per second to terabits per second. The rapid rise in online bandwidth is largely due to the same MOSFET scaling that enables Moore's law, as telecommunications networks are built from MOSFETs.
Haitz's law predicts that the brightness of LEDs increases as their manufacturing cost goes down.