History of electronic engineering


This article details the history of electronic engineering. Chambers Twentieth Century Dictionary defines electronics as "The science and technology of the conduction of electricity in a vacuum, a gas, or a semiconductor, and devices based thereon".
Electronic engineering as a profession sprang from technological improvements in the telegraph industry during the late 19th century and in the radio and telephone industries during the early 20th century. People gravitated to radio, attracted by the technical fascination it inspired, first in receiving and then in transmitting. Many who went into broadcasting in the 1920s had been "amateurs" in the period before World War I. The modern discipline of electronic engineering was to a large extent born out of telephone-, radio-, and television-equipment development and the large amount of electronic-systems development during World War II of radar, sonar, communication systems, and advanced munitions and weapon systems. In the interwar years, the subject was known as radio engineering. The word electronics began to be used in the 1940s; in the late 1950s, the term electronic engineering started to emerge.
Electronics laboratories, created and subsidized by large corporations in the radio, television, and telephone-equipment industries, began churning out a series of electronic advances. The electronics industry was revolutionized by the inventions of the first transistor in 1947, the integrated circuit chip in 1959, and the silicon MOSFET in 1959. In the UK, the subject of electronic engineering became distinct from electrical engineering as a university-degree subject around 1960.
Electronic engineering facilitated the development of many technologies including wireless telegraphy, radio, television, radar, computers and microprocessors.

Wireless telegraphy and radio

Some of the devices which would enable wireless telegraphy were invented before 1900. These include the spark-gap transmitter and the coherer, with early demonstrations and published findings by David Edward Hughes and Heinrich Rudolf Hertz and further additions to the field by Édouard Branly, Nikola Tesla, Oliver Lodge, Jagadish Chandra Bose, and Ferdinand Braun. In 1896, Guglielmo Marconi went on to develop the first practical and widely used radio-wave-based communication system.
Millimetre-wave communication was first investigated by Jagadish Chandra Bose during 1894–1896, when he reached an extremely high frequency of up to 60 GHz in his experiments. He also introduced the use of semiconductor junctions to detect radio waves when he patented the radio crystal detector in 1901.
In 1904, John Ambrose Fleming, the first professor of electrical engineering at University College London, invented the first radio tube, the diode. Then, in 1906, Robert von Lieben and Lee De Forest independently developed the amplifier tube, called the triode. Electronics is often considered to have begun with the invention of the triode. Within 10 years, the device was used in radio transmitters and receivers, as well as systems for long-distance telephone calls.
The invention of the triode amplifier, generator, and detector made audio communication by radio practical. In 1912, Edwin H. Armstrong invented the regenerative feedback amplifier and oscillator; he also invented the superheterodyne radio receiver and could be considered the father of modern radio.
The first known radio news program was broadcast on 31 August 1920 by station 8MK, the unlicensed predecessor of WWJ in Detroit, Michigan. Regular wireless broadcasts for entertainment commenced in 1922 from the Marconi Research Centre at Writtle near Chelmsford, England. The station was known as 2MT and was followed by 2LO, broadcasting from the Strand, London.
While some early radios used some form of amplification powered by electric current or batteries, through the mid-1920s the most common type of receiver was the crystal set. In the 1920s, amplifying vacuum tubes revolutionized both radio receivers and transmitters.
Vacuum tubes remained the preferred amplifying device for 40 years, until researchers working for William Shockley at Bell Labs invented the transistor in 1947. In the following years, transistors made small portable radios, or transistor radios, possible, as well as allowing more powerful mainframe computers to be built. Transistors were smaller and required lower voltages than vacuum tubes to work.
Before the invention of the integrated circuit in 1959, electronic circuits were constructed from discrete components that could be manipulated by hand. These non-integrated circuits consumed much space and power and were prone to failure and limited in speed, although they are still common in simple applications. By contrast, integrated circuits packed a large number of tiny electrical components, often millions, mainly transistors, into a small chip around the size of a coin.

Television

In 1927, Philo Farnsworth made the first public demonstration of a purely electronic television. During the 1930s several countries began broadcasting, and after World War II television spread to millions of receivers, eventually worldwide. Ever since, electronics have been an integral part of television devices.
Modern televisions and video displays have evolved from bulky electron-tube technology to more compact devices, such as plasma and liquid-crystal displays (LCDs). The trend is toward even lower-power devices such as organic light-emitting diode (OLED) displays, which are likely to replace LCD and plasma technologies.

Radar and radio location

During World War II, much effort was expended on the electronic location of enemy targets and aircraft, including radio-beam guidance of bombers, electronic countermeasures, and early radar systems. During this time, very little if any effort was expended on consumer electronics development.

Transistors and integrated circuits

The first working transistor was a point-contact transistor invented by John Bardeen and Walter Houser Brattain at the Bell Telephone Laboratories in 1947. William Shockley then invented the bipolar junction transistor at BTL in 1948. While early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, they opened the door for more compact devices.
The surface passivation process, which electrically stabilized silicon surfaces via thermal oxidation, was developed by Mohamed M. Atalla at BTL in 1957. This led to the development of the monolithic integrated circuit chip. The first integrated circuits were the hybrid integrated circuit invented by Jack Kilby at Texas Instruments in 1958 and the monolithic integrated circuit chip invented by Robert Noyce at Fairchild Semiconductor in 1959.
The MOSFET was invented by Mohamed Atalla and Dawon Kahng at BTL in 1959. It was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses. It revolutionized the electronics industry, becoming the most widely used electronic device in the world. The MOSFET is the basic element in most modern electronic equipment and has been central to the electronics revolution, the microelectronics revolution, and the Digital Revolution. The MOSFET has thus been credited with the birth of modern electronics and is possibly the most important invention in electronics.
The MOSFET made it possible to build high-density integrated circuit chips. Atalla first proposed the concept of the MOS integrated circuit chip in 1960, followed by Kahng in 1961. The earliest experimental MOS IC chip to be fabricated was built by Fred Heiman and Steven Hofstein at RCA Laboratories in 1962. MOS technology enabled Moore's law, the doubling of transistors on an IC chip every two years, predicted by Gordon Moore in 1965. Silicon-gate MOS technology was developed by Federico Faggin at Fairchild in 1968. Since then, the mass-production of silicon MOSFETs and MOS integrated circuit chips, along with continuous MOSFET scaling miniaturization at an exponential pace, has led to revolutionary changes in technology, economy, culture and thinking.
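As a rough numerical sketch of what Moore's law implies (the 1971 baseline of 2,300 transistors, roughly that of the Intel 4004, and a strict two-year doubling period are illustrative assumptions rather than historical data):

```python
# Illustrative sketch of Moore's law: transistor count per chip doubling
# every two years. The 1971 baseline of 2,300 transistors (roughly the
# Intel 4004) is used here only as a convenient reference point.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under a strict two-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Even under this simplified model, counts grow from thousands to tens of millions of transistors within three decades, which is the exponential scaling described above.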

Computers

A computer is a programmable machine that receives input, stores and manipulates data, and provides output in a useful format.
Although mechanical examples of computers have existed through much of recorded human history, the first electronic computers were developed in the mid-20th century. These were the size of a large room, consuming as much power as several hundred modern personal computers. Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space. Simple computers are small enough to fit into small pocket devices, and can be powered by a small battery. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices from MP3 players to fighter aircraft and from toys to industrial robots are the most numerous.
The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, computers ranging from a netbook to a supercomputer are all able to perform the same computational tasks, given enough time and storage capacity.
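As a minimal sketch of this stored-program idea (the toy instruction set and the example program below are invented purely for illustration):

```python
# A tiny stored-program machine: the program is just data, a list of
# instructions that one fixed machine can execute. The instruction set
# (PUSH, ADD, MUL, PRINT) is invented for illustration only.

def run(program):
    stack = []
    for op, *args in program:
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "PRINT":
            print(stack[-1])

# Computes (2 + 3) * 4 = 20. Changing the list changes the machine's
# behaviour without changing the machine itself, which is what
# distinguishes a computer from a fixed-function calculator.
run([("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",), ("PRINT",)])
```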

Microprocessors

The origins of the microprocessor can be traced back to the invention of the MOSFET, also known as the MOS transistor. It was invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959, and first demonstrated in 1960. The same year, Atalla proposed the concept of the MOS integrated circuit, which was an integrated circuit chip fabricated from MOSFETs. By 1964, MOS chips had reached higher transistor density and lower manufacturing costs than bipolar chips. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration with hundreds of transistors on a single MOS chip by the late 1960s. The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on a single MOS LSI chip.
The first multi-chip microprocessors, the Four-Phase Systems AL1 in 1969 and the Garrett AiResearch MP944 in 1970, were developed with multiple MOS LSI chips. The single-chip microprocessor was conceived in 1969 by Marcian Hoff; his concept was part of an order by the Japanese company Busicom for a desktop programmable electronic calculator, which Hoff wanted to build as cheaply as possible. The first realization of the single-chip microprocessor was the Intel 4004, a 4-bit processor released on a single MOS LSI chip in 1971. It was developed by Federico Faggin, using his silicon-gate MOS technology, along with Intel engineers Hoff and Stan Mazor and Busicom engineer Masatoshi Shima. This ignited the development of the personal computer. In 1974, the Intel 8080, an 8-bit processor, made possible the building of the first personal computer, the MITS Altair 8800, which was announced to the general public on the cover of the January 1975 issue of Popular Electronics.
Many electronics engineers today specialize in the development of programs for microprocessor-based electronic systems, known as embedded systems. Hybrid specializations such as computer engineering have emerged because of the detailed hardware knowledge required to work on such systems. Software engineers typically do not study microprocessors at the same level as computer and electronics engineers. Engineers who exclusively carry out the role of programming embedded systems or microprocessors are referred to as "embedded systems engineers" or "firmware engineers".