Computer technology has advanced remarkably quickly. The term "computer" originally referred to people: it was a job title for those who performed repetitive mathematical calculations by hand. The first programmable digital computers were built in the 1940s; they filled entire rooms yet were only about as powerful as a modern calculator. The personal computer, or PC, became prominent in the 1980s. Today computers are used for nearly everything imaginable, and their future is certain to bring further changes to society.
As far back as 300 B.C., humans had tools to help with computations. The first gear-driven calculating machine was most likely the "Calculating Clock," created in 1623 by Wilhelm Schickard. It and similar inventions lacked accuracy because gears of the required precision could not be manufactured at the time.
Electronic Digital Computers
In order to solve the mathematical equations used to aim artillery shells, the U.S. military invested heavily in computing. With that funding, IBM and Harvard collaborated to complete the Mark I in 1944, one of the first programmable digital computers. It was not electronic, however: it relied on electromechanical switches, relays, rotating shafts and clutches, weighed 5 tons and used 500 miles of wire. Unlike later machines, it stored numbers in decimal form; the computers that followed adopted binary code, the system of 1s and 0s that remains the basis of digital technology today.
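To illustrate the binary system of 1s and 0s mentioned above, here is a minimal sketch in Python (a modern illustration only, not code from any historical machine) that converts a decimal number into its binary digits by repeated division by two:

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its binary (base-2) string.

    Repeatedly divide by 2; the remainders, read in reverse order,
    are the binary digits (bits).
    """
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder is the next bit
        n //= 2                  # integer-divide to move to the next place
    return "".join(reversed(bits))

print(to_binary(13))  # prints "1101", i.e. 8 + 4 + 0 + 1
```

Every value a digital computer stores, from numbers to text to images, is ultimately encoded as such strings of bits.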
The First Computer Language
Programming early computers quickly became a chore, since instructions had to be written in low-level machine code. Grace Hopper's A-0 system of 1952, one of the first compilers, let programmers describe a computation in more human-readable terms and have it translated into machine instructions automatically. Hopper went on to develop Flow-Matic for the UNIVAC I, the first programming language to use English-like statements. Flow-Matic heavily influenced COBOL, an extremely popular language released in 1959 and revised over the years to keep pace with changing technology.
In 1971 Intel created the first microprocessor, the Intel 4004, which allowed computers to become far smaller than earlier machines. Intel followed with the 8080, the first widely adopted commercial microprocessor. The Intel 8080 sold for $360, a fraction of the millions of dollars charged for an IBM System/360 mainframe. In 1975 the 8080 powered the MITS Altair 8800, widely regarded as the first personal computer.
In 1981 IBM released a more affordable line of computers built around Intel microprocessors. These came equipped with the Microsoft Disk Operating System (MS-DOS), which let users interact with their machines far more easily and was a forerunner of modern operating systems such as Microsoft Windows.