From Volume 81, Number 5 (July 2008)
In 1945, two engineers at the University of Pennsylvania invented the first general-purpose electronic computing device, the Electronic Numerical Integrator and Computer ("ENIAC"). The ENIAC was capable of 5,000 simple calculations per second, yet it took up the space of an entire room, "weighed 30 tons, and contained over 18,000 vacuum tubes, 70,000 resistors, and almost 5 million hand-soldered joints." The machine cost over $1 million, equivalent to roughly $9 million today. Over the next thirty years, integrated circuits shrank, yielding microprocessors able to perform millions, and later billions, of calculations per second, along with new storage media able to hold megabits and gigabits of data. As a result, computers became smaller, more advanced, and dramatically less expensive. Still, prior to the late 1980s, these and other computers were "solely the tool[s] of a few highly trained technocrats." In the mid-1980s, only 8.2 percent of American households contained computers. American public businesses, universities, and research organizations used only 56,000 large "general purpose" computers and 213,000 smaller "business computers"; private businesses used another 570,000 "mini-computers" and 2.4 million desktop computers; and the federal government operated between 250,000 and 500,000 computers.