How are Computer Chips Made?
Computer chips are made in facilities known as fabrication plants or fabs, and they start as silicon, a material commonly derived from sand. Silicon is special because it’s a semiconductor, meaning its ability to conduct electricity is between that of a conductor (like copper) and an insulator (like glass).
Here’s a breakdown of how chips are manufactured:
Silicon extraction and shaping:
The process begins by refining sand into high-purity silicon, which is melted and drawn into single-crystal ingots that are roughly 99.9999999% pure. These ingots are then sliced into thin wafers, which are meticulously cleaned, polished, and coated with a layer of silicon dioxide. To make them photosensitive, a further layer called photoresist is applied. It’s crucial to keep the wafers free from dust and contaminants during this stage, since even a microscopic particle can ruin a circuit. Once prepared, the wafers are ready to have electronic circuits patterned onto them.
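To put that purity figure in perspective, here is a back-of-the-envelope calculation in Python. The nine-nines purity level and silicon’s atomic density of roughly 5 × 10^22 atoms per cubic centimetre are standard ballpark figures assumed for illustration, not values taken from any specific fab’s process.

```python
# Toy calculation: what "roughly 99.9999999% pure" means in practice.
# Assumes ~9N (nine nines) purity, a common figure for electronic-grade
# silicon, and silicon's atomic density of about 5e22 atoms per cm^3.

SI_ATOMS_PER_CM3 = 5.0e22    # approximate atomic density of crystalline silicon
IMPURITY_FRACTION = 1.0e-9   # 99.9999999% pure -> 1 impurity atom per billion

impurities_per_cm3 = SI_ATOMS_PER_CM3 * IMPURITY_FRACTION
print(f"Silicon atoms per cm^3:  {SI_ATOMS_PER_CM3:.1e}")
print(f"Impurity atoms per cm^3: {impurities_per_cm3:.1e}")
print(f"That is 1 impurity for every {1 / IMPURITY_FRACTION:.0e} silicon atoms.")
```

Even at that extraordinary purity, a cubic centimetre of the crystal still contains tens of trillions of foreign atoms, which is why keeping further contamination out of the fab matters so much.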
Circuit etching:
The silicon wafer is then placed under a circuit-patterned plate, known as a mask, and exposed to ultraviolet light. The light hardens the photoresist in the pattern of the circuit. Hot gases then etch away the unhardened material, exposing the silicon dioxide layer beneath and leaving behind a 3D replica of the mask’s circuit design.
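The exposure-and-etch step can be pictured as a stencil transferring a pattern. The toy Python sketch below models it on a small 2D grid; the negative-resist behaviour (exposed resist hardens and survives the etch) follows the description above, while the mask pattern and grid are invented simplifications of a process that really happens at nanometre scale.

```python
# A minimal sketch of the photolithography idea on a toy 2D grid.
# 1 = opaque mask region (blocks UV light), 0 = transparent region.
MASK = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 1, 1],
    [0, 0, 1, 1, 0],
]

def expose_and_etch(mask):
    """Return the wafer pattern after exposure and etching.

    Resist under the transparent mask regions is hardened by UV light
    and survives the etch; resist under opaque regions stays soft and
    is etched away, exposing the layer beneath.
    """
    wafer = []
    for row in mask:
        # 'X' marks hardened resist left standing; '.' marks etched areas
        wafer.append(["X" if cell == 0 else "." for cell in row])
    return wafer

for row in expose_and_etch(MASK):
    print(" ".join(row))
```

The real process repeats this expose-etch-deposit cycle dozens of times, building the chip up layer by layer from a stack of such 2D patterns.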
What is a Computer Chip?
A computer chip, also known as a microchip, is the heart of modern electronic devices. This tiny component is a marvel of engineering, packed with millions, sometimes billions, of microscopic components that execute the computing tasks essential to digital life.
A computer chip is essentially a small piece of semiconducting material, typically silicon, onto which vast numbers of transistors are built. These transistors act as on/off switches controlling the flow of electrical signals through the chip’s circuits. The evolution of these chips has revolutionized computing, making it possible to hold powerful computers in the palms of our hands.
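To make the on/off-switch idea concrete, here is a minimal Python sketch that treats a transistor as an idealized switch and wires two of them into a NAND gate. The function names and the series-switch wiring are illustrative abstractions, not an electrical model; the point is that simple switches compose into logic, and NAND alone suffices to build every other gate.

```python
def transistor(gate: bool, source: bool) -> bool:
    """An idealized n-type switch: current flows from source to drain
    only while the gate input is high."""
    return source and gate

def nand(a: bool, b: bool) -> bool:
    # Two idealized switches in series: the path conducts (pulling the
    # output low) only when both gate inputs are high.
    path_conducts = transistor(b, transistor(a, True))
    return not path_conducts

# NAND is functionally complete: other gates, and ultimately an entire
# processor, can be composed from it alone.
def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

for a in (False, True):
    for b in (False, True):
        print(f"NAND({a}, {b}) = {nand(a, b)}")
```

A modern chip simply repeats this trick at staggering scale, combining billions of such switches into the arithmetic units, memory, and control logic of a complete computer.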