The development of the computer has been an incremental process. Humans have used various methods to represent numbers, probably beginning with stones; the word "calculate" derives from the Latin calculus, meaning "small stone." Eventually these stones were arranged in rows and columns, but as numbers grew, the quantity of stones required became prohibitive. This led to the creation of counting boards.
The first surviving written record of an abacus comes from the ancient Greek historian Herodotus, who mentioned its use by the Greeks and Egyptians.
The Chinese counting board used rods and beads in a wooden frame. A famous Chinese text from the first century B.C. explains algebraic equations using the device.
In Russia, the abacus was modified so that colored beads in the middle of each row marked the decimal place. The abacus is still widely used at markets in the former Soviet Union.
Leonardo da Vinci sketched a design for a calculating machine in 1490-1491, but given the technology of the time, the machine does not appear to have been built. Wilhelm Schickard designed a "calculating clock" around 1623; although the machine appears to have been built, it did not survive to modern times.
Blaise Pascal produced a machine that could perform arithmetic. Pascal built approximately 50 variations in his lifetime, of which eight or nine survive today. As with several subsequent designs, the devices were simplified by limiting operations to addition.
In 1834 Charles Babbage designed a device called the Analytical Engine, which anticipated many of the innovations later incorporated in electronic computers.
Data could be input via metal punch cards and output either as punch cards or as printed copy. The device had a mechanical memory, and its central processing unit (CPU) used registers and drums to translate user instructions into control of the hardware.
Many of the innovations from 1939 to 1950 were related to government projects.
Bell Telephone Labs demonstrated the Complex Number Calculator in 1940. The demonstration included performing calculations remotely via a Teletype and is considered the first demonstration of remote-access computing.
The British built the Colossus computer in 1944 to break German codes. The ENIAC, unveiled in 1946, could perform 5,000 operations per second. By 1952, IBM had used the Selective Sequence Electronic Calculator to create moon-position tables that would later be used in the 1969 Apollo flight. At Cambridge in 1949, short reusable programs called subroutines were used in programming for a U.S. military project.
Computers for Institutions
The size and cost of computers limited the technology primarily to governments, businesses, and universities in the period from 1950 to 1975. In 1950, the first computer to use the more reliable diodes in place of vacuum tubes was put into service. The UNIVAC I, introduced in 1951, performed 1,905 operations per second. IBM mass-produced 450 computers in 1954.
The CDC 6600 computer processed 3 million instructions per second in 1964. The PDP-8, priced at $18,000, was the first commercially successful minicomputer for small businesses and labs.
The University of Illinois constructed a computer that ran 300 million operations per second in 1966. A computerized guidance system debuted on Apollo 7 in 1968.
The term “personal computer” was coined when the Altair 8800 computer kit was marketed in 1975.
Atari marketed computers with video game capabilities in 1979. The Osborne 1, introduced in 1981 and weighing 24 pounds, was the first portable computer. IBM also unveiled its first PC in 1981. The Commodore 64, introduced in 1982, went on to become the best-selling computer model in history. Compaq then marketed a clone that could run IBM's software.
Building Upon Technology
The World Wide Web became possible in 1990 with a new computer language called HTML. The Pentium processor brought faster operation in 1993. Improvements in storage, speed, size, and networking have continued, building on many of these early technologies.