Integrated circuit: the story on HearLore
Integrated circuit
In September 1958, Jack Kilby held a small, rectangular piece of germanium in his hand that contained an entire electronic circuit, yet it was not a single, unified object in the way modern engineers understand it. This device, the first working integrated circuit, relied on external gold wires to connect its components, making it fragile and impractical for mass production. Kilby, a newly employed engineer at Texas Instruments, had recorded his initial ideas in July of that year, but the demonstration on the 12th of September 1958 marked the true birth of the concept. The US Air Force became the first customer for this invention, which would eventually earn Kilby the Nobel Prize in Physics in 2000. However, the device was not a true monolithic integrated circuit chip because it lacked the internal connections that would allow it to be manufactured on a large scale. The external wiring was a necessary compromise that prevented the technology from becoming the foundation of the modern world immediately. It was a proof of concept, a spark that needed a different kind of fuel to ignite the revolution.
The Planar Revolution
Six months after Kilby's demonstration, Robert Noyce at Fairchild Semiconductor developed the first practical monolithic integrated circuit chip, solving the critical problem of external connections. Unlike Kilby's germanium-based design, Noyce's version was fabricated from silicon and created a structure in which all components were inseparably associated and electrically interconnected on a single piece of material. Two inventions made this possible: the planar process, developed by Noyce's colleague Jean Hoerni, which allowed reliable on-chip aluminum interconnections, and p–n junction isolation, developed by Kurt Lehovec. Hoerni's process in turn built on Carl Frosch and Lincoln Derick's work on surface protection and passivation by silicon dioxide masking. The result was a chip that could be mass-produced, leading to the modern integrated circuit design that dominates the industry today. The Apollo Program became the largest single consumer of integrated circuits between 1961 and 1965, proving the technology's viability for critical aerospace applications. The shift from germanium to silicon, and from external wires to internal connections, was the turning point that allowed the technology to scale.
The Logic of Scale
The evolution of integrated circuits followed a predictable yet explosive trajectory known as Moore's Law: the observation that the number of transistors on a chip doubles roughly every two years. Gordon Moore originally predicted a doubling every year, revising the figure to every two years in 1975, and the trend drove the development of technologies from small-scale integration to ultra-large-scale integration. In 1964, Frank Wanlass demonstrated a single-chip 16-bit shift register with 120 MOS transistors, a feat that seemed impossible at the time. By 1986, one-megabit random-access memory chips were introduced, containing more than one million transistors. The complexity grew so rapidly that by 2005 microprocessor chips passed the billion-transistor mark, and by 2016 a typical chip could hold up to 25 million transistors per square millimeter. Through the relationships described by Dennard scaling, the cost per transistor and the switching power per transistor fell even as memory capacity and speed rose. This relentless scaling allowed billions of transistors to fit into an area the size of a human fingernail.
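The doubling described above is simple exponential growth, and a small sketch makes the trajectory concrete. The function below projects a transistor count forward under an idealized two-year doubling; the 1971 starting figure of roughly 2,300 transistors (the Intel 4004) is a well-known benchmark, not taken from this article, and real products deviate from the idealized curve.

```python
# Illustrative sketch of Moore's Law as an exponential doubling curve.
# Assumes an idealized doubling every two years; real chips deviate.

def moores_law(count_start: int, year_start: int, year_end: int,
               doubling_period: float = 2.0) -> float:
    """Project a transistor count forward by repeated doubling."""
    doublings = (year_end - year_start) / doubling_period
    return count_start * 2 ** doublings

# Starting from roughly 2,300 transistors (Intel 4004, 1971),
# project to 2005, when chips passed the billion-transistor mark.
projected = moores_law(2_300, 1971, 2005)
print(f"{projected:,.0f}")  # → 301,465,600
```

Seventeen doublings turn a few thousand transistors into hundreds of millions; real 2005 chips beat this idealized projection, reaching a billion transistors, which illustrates how close the industry tracked, and at times outpaced, the trend.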
Common questions
When did Jack Kilby demonstrate the first working integrated circuit?
Jack Kilby demonstrated the first working integrated circuit on 12 September 1958. The device, made from a small rectangular piece of germanium, relied on external gold wires to connect its components. The US Air Force became the first customer for the invention, which later earned Kilby the Nobel Prize in Physics in 2000.
How did Robert Noyce improve the integrated circuit design compared to Jack Kilby?
Robert Noyce developed the first practical monolithic integrated circuit chip six months after Kilby's demonstration. Unlike Kilby's externally wired design, Noyce's version used silicon and the planar process to create internal aluminum interconnections suitable for mass production. This innovation enabled the Apollo Program to become the largest single consumer of integrated circuits between 1961 and 1965.
What is Moore's Law and how has it affected integrated circuit development?
Moore's Law states that the number of transistors on a chip doubles every two years. Gordon Moore originally predicted a doubling every year, revising the figure to every two years in 1975. This trend drove the development from small-scale integration to ultra-large-scale integration and allowed microprocessor chips to pass the billion-transistor mark by 2005.
How much does it cost to construct a modern semiconductor fabrication facility?
A modern semiconductor fabrication facility can cost over US$12 billion to construct. The cost rises over time due to the increasing complexity of new products, a trend known as Rock's law. These facilities use wafers up to 300 mm in diameter and produce transistors as small as 5 nm.
What are the different types of integrated circuit packaging used over time?
The earliest integrated circuits were packaged in ceramic flat packs which continued to be used by the military for many years. Commercial packaging shifted to the dual in-line package and later adopted pin grid array and leadless chip carrier packages in the 1980s. Surface-mount technology emerged in the early 1980s and the flip-chip BGA developed in the 1990s enables much higher pin counts.
What is chip art and why do designers embed hidden images in integrated circuits?
Chip art refers to surreptitious non-functional images or words embedded into the silicon surface area of integrated circuits. Designers use this practice to leave their mark on devices and turn the rigid constraints of the fabrication process into a canvas for expression. These hidden messages can be seen under a microscope and have become a cultural phenomenon from the early days of integrated circuits to the present.
The manufacturing of integrated circuits requires a level of precision that defies human intuition, using processes that involve layers thinner than the wavelength of visible light. A modern fabrication facility, commonly known as a semiconductor fab, can cost over US$12 billion to construct, and the cost rises over time because of the increasing complexity of new products, a phenomenon known as Rock's law. The process centers on photolithography, in which ultraviolet light of shorter wavelength is used to expose each layer, because visible-light wavelengths are too large to pattern the tiny features. The wafers used in these facilities can be up to 300 mm in diameter, wider than a common dinner plate, while the transistors themselves can be as small as 5 nm. The fabrication process comprises three key steps: photolithography, deposition, and etching, supplemented by doping and cleaning. Each device is tested before packaging using automated test equipment, and the wafer is then cut into rectangular blocks, each known as a die. The cost of designing and developing a complex integrated circuit is high, normally in the multiple tens of millions of dollars, so it only makes economic sense to produce integrated circuit products with high production volume.
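The economics of cutting a wafer into dies can be sketched with a standard gross-die estimate. The formula below is a common industry approximation, not something stated in this article, and the 100 mm² die size is an illustrative assumption; it shows why larger wafers and smaller dies matter so much to per-chip cost.

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Estimate usable dies on a circular wafer.

    Uses the common approximation
        dies ≈ π·(d/2)² / A  −  π·d / sqrt(2·A)
    where the first term is wafer area over die area and the second
    term discounts partial dies lost around the wafer's edge.
    """
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

# Illustrative: a 300 mm wafer and a hypothetical 100 mm² die.
print(gross_dies_per_wafer(300, 100))  # → 640
```

Because fab cost is largely fixed per wafer, hundreds of dies per wafer amortize that cost; the same reasoning explains why only high-volume products justify the tens of millions of dollars in design cost.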
The Packaging Paradox
While the internal complexity of integrated circuits has grown exponentially, the physical packaging has evolved to protect these delicate components while connecting them to the outside world. The earliest integrated circuits were packaged in ceramic flat packs, which continued to be used by the military for many years due to their reliability and compact size. Commercial packaging rapidly shifted to the dual in-line package, first in ceramic and later in plastic, typically a cresol-formaldehyde novolac resin. In the 1980s, the pin count of VLSI circuits exceeded the practical limit of DIP packaging, leading to the adoption of pin grid array and leadless chip carrier packages. Surface-mount technology emerged in the early 1980s and gained popularity by the late 1980s, offering finer lead pitch and using leads formed as either gull-wing or J-lead. By the late 1990s, plastic quad flat pack and thin small-outline package designs became the most common for high pin-count devices. The flip-chip BGA, developed in the 1990s, enables much higher pin counts than most other package types, allowing an array of input/output connections to be distributed across the entire die instead of being limited to its edges. This evolution in packaging has been essential to maintaining the performance and reliability of integrated circuits as they have become smaller and more complex.
The Art of the Chip
Beneath the utilitarian surface of integrated circuits lies a hidden world of creativity and rebellion, where designers have used the silicon surface area for surreptitious, non-functional images or words. These additions, sometimes referred to as chip art, silicon art, silicon graffiti, or silicon doodling, are often created with great attention to detail and add a touch of personality to otherwise functional components. The practice began as a way for engineers to leave their mark on the devices they created, turning the rigid constraints of the fabrication process into a canvas for expression. Some chips feature hidden images, while others contain messages that can only be seen under a microscope. This tradition has continued from the early days of integrated circuits to the present, with designers finding ways to embed their identity into the very fabric of the technology. In "The Secret Art of Chip Graffiti," H. Goldstein highlights these hidden messages as a reminder that the people behind the technology are human and creative. The practice has become a cultural phenomenon, with some chips featuring entire scenes or characters hidden within the circuitry.
The Future of Integration
As the physical limits of transistor scaling approach, the industry is turning to new technologies to continue the trend of increasing complexity and performance. Three-dimensional integrated circuits, or 3D-ICs, stack two or more layers of active electronic components, integrated both vertically and horizontally into a single circuit. Because communication between layers uses on-die signaling, power consumption is much lower than in equivalent separate circuits. Through-silicon vias and monolithic 3D approaches allow multiple layers of transistors to be stacked, creating a more compact and efficient device. The industry is also exploring materials other than silicon, such as graphene transistors, molybdenite transistors, and carbon nanotube field-effect transistors. These materials offer the potential for higher performance and lower power consumption, but they also present significant manufacturing and integration challenges. Advanced packaging techniques, such as 2.5D and 3D packaging, are likewise increasing performance and reducing size without shrinking the transistors themselves. The future of integrated circuits lies in combining these new technologies with the existing silicon-based infrastructure, creating a new generation of devices that are more powerful and efficient than ever before.