First Published: Fri, May 23 2014, 08:17 AM IST

Christie’s to auction the chip that Jack Kilby built

The prototype up for auction was built to demonstrate Kilby’s invention of the IC on a single chip, a move that helped spawn the modern computing era
Estimated at $1-2 million, Kilby’s prototype integrated circuit was built between 18 July and 12 September 1958. Photo: AFP
Mumbai: Christie’s, the world’s leading art business, said on Thursday it will auction the Nobel Prize-winning prototype integrated circuit (IC) developed by Jack Kilby at Texas Instruments in 1958 to demonstrate his invention of the IC on a single chip, a move that helped spawn the modern computing era.
The IC, also called a chip or microchip, turned 50 on 12 September 2008 and is today a key part of any electronic device, including smartphones, tablets, desktops, television sets and credit cards. It is a semiconductor—primarily silicon—wafer on which billions of tiny resistors, capacitors, and transistors are fabricated.
Estimated at $1-2 million, Kilby’s prototype IC was built between 18 July and 12 September 1958 from a doubly diffused germanium wafer with flying gold wire and four leads by Tom Yeargan (1920-2001), a member of the team that executed Kilby’s theories on how to bring miniaturization to the giant computers of the first half of the 20th century.
The chip is mounted on glass and enclosed in a plastic case belonging to Yeargan, with a label signed by Kilby, and is accompanied by another prototype—a silicon circuit with five gold and platinum leads—and a three-page statement by Yeargan on the chronology and building of the invention of the IC, dated 6 March 1964.
In May 1959, the discovery was announced to the public. Mark Shepherd, the head of semiconductor research, called it “the most significant development by Texas Instruments since we divulged the commercial availability of the silicon transistor”, according to Christie’s.
Kilby, who died in 2005, never claimed sole credit for his breakthrough but credited “the contributions of thousands of engineers and scientists in laboratories and production facilities all over the world”.
In 2000, Kilby was awarded the Nobel Prize in physics for his part in the invention of the integrated circuit. In his acceptance speech, he credited technicians Pat Harbrecht and Yeargan for their roles in the construction of the first integrated circuits and quoted the physicist Charles Townes: “No. I didn’t build it myself. But it’s based on an idea of mine!”
Dropping from a unit cost of $450 in 1961 to a fraction of a cent today, chips are at the heart of all modern electronic devices. The early computers, however, built on principles devised by Alan Turing, depended on vacuum tubes and basic telephone circuit relays. Vacuum tubes were bulky, expensive and ran hot, guzzling large amounts of electricity.
Early chips, called small scale integration (SSI), contained a few tens of transistors. By the 1980s, large scale integration (LSI) circuits carried thousands of transistors. Today’s very large scale integration, or VLSI, circuits carry billions of them.
The processing power of the silicon chip has grown in line with a prediction made in 1965 by Intel co-founder Gordon Moore, who said the number of transistors that can be placed on a chip at the same cost would double roughly every two years.
The trick has been achieved by shrinking transistors, and how small they can get is a question the industry has been asking for a while.
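As a rough illustration of what that doubling rule implies, the sketch below projects transistor counts forward from a starting point of about 2,300 transistors in 1971 (the figure for Intel’s first microprocessor, used here only as an assumed anchor; it is not from the article).

```python
# A back-of-the-envelope illustration of the "doubling every two years" rule
# described above. The 1971 starting point of ~2,300 transistors is an
# assumption used only to anchor the projection; it is not from the article.

def projected_transistors(start_year: int, start_count: int, year: int) -> int:
    """Project a transistor count assuming a doubling every two years."""
    doublings = (year - start_year) / 2
    return int(start_count * 2 ** doublings)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(year, f"{projected_transistors(1971, 2300, year):,}")
```

Run for 50 years, the same doubling rule that started in the thousands lands in the tens of billions, which is broadly the scale of today’s VLSI chips.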
Making a chip typically involves etching a circuit onto a silicon wafer and putting the components on it layer by layer. This has prompted some scientists to suggest that the only way to put more components on a chip is to change the way chips are made. Intel announced in May 2011 that it had found a way to use 3D design to crowd more transistors onto a single chip.
International Business Machines Corp. (IBM) scientists, meanwhile, have developed a way to manufacture a new breed of computer chips that use carbon nanotubes in place of silicon.
According to a 20 May article in Wired, Google teamed up with Nasa last August to work on a box called D-Wave, which “is the world’s first practical quantum computer, a device that uses radical new physics to crunch numbers faster than any comparable machine on earth”. While ordinary computers use “bits” that flip between 1 and 0, each representing a single number in a calculation, quantum computers use quantum bits, or qubits, which can exist as 1s and 0s simultaneously, allowing them to process a great deal more information.
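To make the qubit idea concrete, here is a toy sketch that represents a single qubit as a pair of amplitudes over 0 and 1, with the measurement outcome drawn according to the squared magnitudes of those amplitudes. It is a conceptual illustration only, not a description of how the D-Wave machine (a quantum annealer) actually operates.

```python
import math
import random

# Toy illustration of the qubit idea described above: a qubit's state is a
# pair of complex amplitudes over |0> and |1>, and measuring it yields 0 or 1
# with probabilities given by the squared magnitudes of those amplitudes.

def measure(amp0: complex, amp1: complex) -> int:
    """Collapse a single-qubit state amp0|0> + amp1|1> to a classical bit."""
    p0 = abs(amp0) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition: measuring gives 0 or 1 with probability 0.5 each.
amp0 = amp1 = 1 / math.sqrt(2)
samples = [measure(amp0, amp1) for _ in range(10_000)]
print("fraction measured as 1:", sum(samples) / len(samples))
```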
Meanwhile, DNA chips are finding applications throughout the field of molecular biology. They were initially developed to enhance genomic sequencing projects, especially the Human Genome Project.
And then there is talk of neuristors, devices that capture the ability of a brain’s neuron to generate a spike, or impulse of activity, when a threshold is exceeded, eventually resulting in a brain-like chip.
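The threshold-and-spike behaviour being described can be pictured with a minimal software sketch of a leaky, integrate-and-fire-style unit. This is a conceptual toy only, not a model of an actual neuristor, which is built from memristive hardware.

```python
# Minimal sketch of the threshold-and-spike behaviour described above: a unit
# accumulates leaky input and emits a spike once a threshold is crossed, then
# resets. Conceptual illustration only, not a model of a neuristor device.

def run_neuron(inputs, threshold=1.0, leak=0.9):
    """Return a list of 0/1 spikes for a stream of input currents."""
    potential, spikes = 0.0, []
    for current in inputs:
        potential = potential * leak + current   # leaky integration
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                      # reset after the spike
        else:
            spikes.append(0)
    return spikes

print(run_neuron([0.3, 0.3, 0.3, 0.3, 0.0, 1.2, 0.1]))
```

However the hardware is eventually realised, this spike-on-threshold behaviour is the core idea behind such brain-like chips.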