The History of Computers in America
One of the most popular and rapidly advancing inventions of the 20th century is the computer. Computers have been integrated into nearly every aspect of life, from business to science to entertainment. From the abacus to Charles Babbage's Difference Engine to today's high-speed microprocessors, the computer has been the single most important tool of the "Information Age" and has simplified countless everyday tasks.
The earliest ancestor of the computer was the abacus, a calculator used by the Chinese as early as 500 BC. It consists of a frame and eight columns of beads. The beads are shifted up or down to do addition or subtraction. The abacus is still used today in China and Japan (History of Abacus).
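The bead arithmetic described above can be sketched in a few lines of modern code. This is only an illustrative model, not a description of any historical device: digits are kept as little-endian lists (one entry per column of beads), and a carry moves one "bead" into the next column when a column overflows past nine.

```python
def abacus_add(columns, digits):
    """Add `digits` to `columns`; both are little-endian lists of decimal digits."""
    carry = 0
    result = []
    for i in range(len(columns)):
        d = digits[i] if i < len(digits) else 0
        total = columns[i] + d + carry      # shift beads onto the column
        result.append(total % 10)           # beads remaining after overflow
        carry = total // 10                 # one bead carried to the next column
    return result

# 1,234 + 5,678 on an eight-column frame (digits stored least-significant first):
frame = [4, 3, 2, 1, 0, 0, 0, 0]
addend = [8, 7, 6, 5]
print(abacus_add(frame, addend))  # [2, 1, 9, 6, 0, 0, 0, 0], i.e. 6,912
```

As on a real abacus, the frame's capacity is fixed: a sum too large for the available columns simply overflows off the end.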
The basic principles of the modern computer were developed by the English mathematician Charles Babbage (1792-1871). Babbage has been called "the father of modern computing" for his ideas on adding machines (Stine, 21). His design, which he called an "analytical engine," would be capable of any mathematical operation. It could add, subtract, multiply, and divide numbers of up to 50 digits (most modern calculators can only compute 8-digit numbers) and could store up to 1,000 of these numbers on cards with small holes in them, called punch cards. Babbage's design also included a "store" to hold variables and a "mill" in which the operations were carried out. The modern-day equivalent of the store is memory, and that of the mill is the processor (Stine, 26). Perhaps the most remarkable part of Babbage's plan was that it was completely mechanical and ran on steam power. "But in 1833, Babbage didn't have electronics. He didn't even have electricity. Nobody did. Electricity was fifty years in the future, and electronics a century away. So, like all engineers, technologists, and inventors, Babbage was constrained by the technology of the time: he had steam power, mechanical actions, levers, gears, cams, and other mechanical motions. Thus, the speed of his analytical engine was excruciatingly slow by today's standards, although it had the capacity to do much more." (Stine, 23)
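Babbage's division of labor between a "store" and a "mill" maps directly onto today's memory and processor. The sketch below models that division in modern terms; the class names, the 1,000-cell capacity, and the operation set are illustrative assumptions drawn from the description above, not from Babbage's own notation.

```python
class Store:
    """Babbage's 'store': numbered cells holding numbers (today's memory)."""
    def __init__(self, size=1000):          # Babbage planned roughly 1,000 numbers
        self.cells = [0] * size

    def save(self, address, value):
        self.cells[address] = value

    def load(self, address):
        return self.cells[address]

class Mill:
    """Babbage's 'mill': the unit that performs operations (today's processor)."""
    def operate(self, op, a, b):
        if op == "add":
            return a + b
        if op == "subtract":
            return a - b
        if op == "multiply":
            return a * b
        if op == "divide":
            return a // b                   # whole-number division, as a mechanical engine would
        raise ValueError(op)

store = Store()
mill = Mill()
store.save(0, 12345678901234567890)         # Python integers cover Babbage's 50-digit numbers
store.save(1, 98765432109876543210)
result = mill.operate("add", store.load(0), store.load(1))
store.save(2, result)
print(store.load(2))                        # 111111111011111111100
```

The point of the sketch is the separation itself: numbers move from the store into the mill, an operation is applied, and the result returns to the store, exactly the fetch-compute-writeback cycle of a modern processor.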
For nearly 50 years after Babbage's idea, computing machines were almost completely forgotten. The technology of the time was simply not good enough, and there were few general applications for such machines beyond solving equations. But in 1890, a reason for building a working adding machine arrived in the form of the United States census. The U.S. Government needed an efficient way to categorize the nearly 60 million people living in America by sex, birthplace, occupation, and many other attributes. After holding a competition for a solution, it chose the design of a young mining engineer named Herman Hollerith (Stine, 40). He combined punched cards with devices that could read the cards electrically. Hollerith's "tabulator" did the job 3 to 4 times faster than the government estimated its team of 100 clerks could have done by hand (Snyder, "Computer"). The machine punched holes into cards that had 192 different spaces. Each person's information was punched into a card, and each space on the card represented an answer to a particular question depending on whether or not there was a hole. Not only did it do the job, it was the first data-processing machine to use electricity. "This was the start of the computer revolution. Even in 1890, it was revolutionary." Hollerith went on to form the Tabulating Machine Company and sold his invention to governments as far away as Russia (Stine, 42).
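Hollerith's card format amounts to 192 yes/no positions, each recording one answer to one question: a hole means "yes," no hole means "no." The toy model below makes that encoding concrete; the field layout (which position means what) is invented for illustration and is not the actual 1890 census layout.

```python
CARD_SIZE = 192                       # spaces on a Hollerith-style card

def blank_card():
    """A fresh card: no holes punched anywhere."""
    return [False] * CARD_SIZE

def punch(card, position):
    """Punch a hole, recording a 'yes' answer at this position."""
    card[position] = True

def is_punched(card, position):
    """What the tabulator senses: does a contact close through this space?"""
    return card[position]

# Hypothetical layout: position 0 = "male", position 1 = "female",
# positions 2-9 = one of eight birthplace categories.
card = blank_card()
punch(card, 0)                        # record "male"
punch(card, 5)                        # record birthplace category 3

print(is_punched(card, 0))            # True: circuit closes, counter advances
print(is_punched(card, 1))            # False: no hole, no contact
```

In the real tabulator the "read" was electrical: spring-loaded pins passed through the holes into cups of mercury, closing a circuit and advancing a counting dial for each punched answer.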
Hollerith's tabulator started a boom in business machines. Government agencies were not the only ones that needed to add huge lists of numbers. Banks, insurance agencies, and other big corporations had collections of books filled with numbers that needed to be added. Several companies appeared in the market offering better and better adding machines, among them International Business Machines (IBM), Burroughs Corporation, and Remington. The Burroughs Arithmometer was "…as ubiquitous in 1905 businesses as the desktop computer is today." (Stine, 54)
The last great mechanical computer was Howard Aiken's Mark I, developed by IBM. It used electromagnetic components to replace mechanical parts (Snyder, "Computer"). Although it was the most advanced mechanical computer in history, "…it epitomized the computer because it was obsolete at the moment the IBM engineers first switched it on… in 1943. The vacuum tube was taking over, making computers totally electronic." (Stine, 95)
The vacuum tube was the first completely electronic switch. Put simply, this meant speed: mechanical computers could only work as fast as their moving parts. One of the first computers to use vacuum tubes was the Electronic Discrete Variable Automatic Computer (EDVAC), developed by the Hungarian-American mathematician John von Neumann. It was one of the first computers used in mathematics, meteorology, economics, and hydrodynamics, and it was the first electronic computer to use a program stored entirely in its own memory (Snyder, "Computer"). Von Neumann had been hired by the War Department to build an electronic computer that could accurately simulate the course of a shell fired from a cannon at different angles and velocities. The EDVAC was used to produce these "ballistics charts." It was also used in aircraft and atomic weapons design, fire control, and logistics (Aspray, 25).
Around the same time von Neumann was working on the EDVAC, an American physicist named John Mauchly was working on the Electronic Numerical Integrator and Computer (ENIAC), which is regarded as the first successful general-purpose digital computer. It was completed in 1945 at the Moore School of Electrical Engineering at the University of Pennsylvania. Built by Mauchly and an engineer named J. Presper Eckert, ENIAC weighed over 60,000 lb. and contained more than 18,000 vacuum tubes. Like EDVAC, it was used by the military for calculating ballistics tables and designing atomic weapons. The biggest difference between the two computers, however, was that ENIAC did not store programs in its memory the way EDVAC did, so it had to be reprogrammed for every new task (Snyder, "Computer").
Eckert and Mauchly eventually started their own company and developed the first computer to handle both numbers and letters: the Universal Automatic Computer (UNIVAC). Following in the footsteps of Herman Hollerith and his tabulator, they delivered the machine to the Census Bureau, where it was used for the 1950 census. UNIVAC was completely electronic and much smaller than ENIAC. On top of that, it was fast, able to do 1,000 operations a second. UNIVAC is most famous for predicting the winner of the 1952 presidential election, in which Eisenhower ran against Stevenson. UNIVAC's prediction of 438 electoral votes for Eisenhower to 93 for Stevenson was very close to the actual 442 to 89. "The electronic digital computer was suddenly perceived as a super, omniscient electrical brain." (Stine, 110)
By 1948, scientists began to realize what a problem vacuum tubes were becoming. They were very large and required enormous amounts of power; one SAGE model computer used as much energy in a day as a town of 15,000 people (Stine, 111). The solution was finally found at Bell Laboratories, where a team of American physicists developed the transistor. It acts like an electric switch but is far more reliable, smaller, and cost-effective than the vacuum tube. Even today, transistors are a fundamental component of almost all modern electronic devices (Snyder, "Computer").
With the invention of transistors came the integrated circuit, invented by Robert Noyce in 1959. It consisted of a group of very small transistors and other electrical components placed on a single chip of silicon. By 1970, engineers were able to fit thousands of transistors on a single chip. This led to the development of modern microprocessors, which contain over 10 million transistors (Snyder, "Computer").
As integrated circuits became smaller, so did computers, both in size and price. Many companies envisioned "personal computers" (PCs), computers small enough to be operated by a single user at work or home. The first PC was sold by Micro Instrumentation and Telemetry Systems (MITS) in 1975 and was called the Altair 8800.
While this PC was incredibly basic (it had neither a monitor nor a keyboard), it paved the way for today's modern PCs. It also created a fierce battleground in the computer industry. IBM, which had been the standard in the industry, was now feeling pressure from an upstart company founded by two men, Steve Jobs and Steve Wozniak. This company was Apple Computer, and its Apple II model was proving to be quite advanced, more so than the IBM Model 60. This was the beginning of what would become an explosion of competition in the PC market (Stine, 214).
The main goal of computer manufacturers was to make computers as affordable as possible while also making them faster, more reliable, and able to store more information. Almost every manufacturer accomplished these goals, so naturally computers began to appear everywhere: in businesses keeping track of inventories and prices, in colleges helping students do research, and in labs doing calculations for scientists and physicists.
The computer has evolved enormously over the years, and in doing so has made many aspects of life better and easier. The technology is advancing so rapidly that a computer is considered obsolete within months of reaching store shelves. At the current rate of change, it is amazing to think of the computers that will be developed in the next decade alone. The computer has helped revolutionize entire industries and is truly one of the greatest inventions in history.
Works Cited

Aspray, William. John von Neumann and the Origins of Modern Computing. Cambridge: MIT Press, 1990.
Carberry, M. S., et al. Foundations of Computer Science. Potomac: Computer Sciences Press, Inc., 1979.
"History of Abacus." Online posting. Virtual Abacus. 10 Oct. 1998.
Sides, Charles H. How to Write Papers and Reports About Computer Technology. Philadelphia: iSi Press, 1984.
Snyder, Timothy Law. "Computer." Microsoft Encarta 98. CD-ROM. Microsoft Corporation, 1998.
Stine, G. Harry. The Untold Story of the Computer Revolution. New York: Arbor House, 1985.