What Is the History of the Computer?


Computers are a part of everyday life and are nearly ubiquitous in the sciences, in business and in schools. Yet this commonplace item took its modern form only in the middle of the twentieth century, and the quest to build one stretches back centuries further. The history and evolution of computers have influenced, and been influenced by, almost all scientific advancements from World War II onward.

1 History

Attempts to create a self-operating, calculating machine that would emulate the thinking patterns of human beings have been a centuries-long quest for inventors. The eighteenth-century Turk was alleged to be a chess-playing automaton that defeated the likes of Napoleon Bonaparte and Benjamin Franklin in chess matches; it was later revealed that the Turk was a hoax controlled by a man hidden inside the machine. Still, the idea of creating a human-like thinking machine remained alive well into the twentieth century. Konrad Zuse, a German engineer, developed the first truly programmable computer in 1941; it read its instructions from a stream of tape with holes punched into it. The first commercially available computer was the famous UNIVAC of 1951, designed by J. Presper Eckert and John W. Mauchly. It used thousands of vacuum tubes to perform its calculations and filled a large room with its bulk; the computing power of UNIVAC was roughly equivalent to that of a modern pocket calculator.

2 Time Frame

In the 1950s most computers relied on vacuum tubes to perform their electronic calculations, which kept each machine the size of a large room. If the tubes were not given enough space for air to cool them, they would burn out and shut the computer down completely until the defective tube was found and replaced, a difficult prospect with 20,000 or more tubes to choose from. By the close of the 1950s, Jack Kilby and Robert Noyce had independently invented the integrated circuit, which reduced the size (and cost) of computers. Earlier in the decade, IBM had entered the computer market, but it sold mainly to businesses (such as banks) that had room to spare in their buildings. The invention and adoption of the chip allowed IBM to become the premier producer of computing machines throughout the 1960s and 1970s.

3 Identification

IBM offered the first computer aimed at the individual consumer by the middle of the 1970s. The 5100 series computers were large, sensitive to movement and to changes in temperature, extremely expensive, and used magnetic media in the form of audio-style tape cartridges to record and save information. IBM was soon joined by a new start-up company called Apple, whose first machines were wooden boxes filled with hand-soldered components bought at Radio Shack. By 1977, Apple had introduced two models of its computer, dropped the wood in favor of plastic and metal, and offered cheaper alternatives to IBM. Another company, Commodore, released its PET computer the same year. The entrance of Apple and Commodore into the market forced all competitors to adopt new technologies such as floppy disks for removable program and data storage.

4 Significance

The proliferation of computer platforms in the 1980s allowed another small start-up company to flourish. Specializing in what it called an "operating system," Microsoft found its footing in 1981 by selling software to IBM. The ease with which the operating system allowed a user to interact with the computer, along with many people's familiarity with IBM products from the workplace, allowed IBM to retain its market share and make important inroads against Apple and Commodore. Meanwhile, Microsoft was slowly amassing a fortune in software fees and royalties. IBM's machines were so popular that several clones appeared, built on hardware similar to IBM's so that they could run the increasingly popular Microsoft Disk Operating System (MS-DOS). Radio Shack carried Tandy computers, for instance, which closely mimicked the operations of more expensive IBM machines but offered better graphics, sound and memory. These cheap imitators allowed the IBM platform to proliferate and prompted Commodore and Apple to counter with better, cheaper computers of their own. Apple introduced the Macintosh in 1984; it was the first widely marketed home computer with a graphical user interface (GUI). Commodore's VIC-20 and Commodore 64 line was augmented when the company bought Amiga Corporation and offered its own line of GUI-based computers in 1985. These developments prompted Microsoft to release its Windows platform late in 1985.

5 Size

As IBM machines and their clones spread, they began to answer their rivals, Commodore and Apple, by offering better graphics, better sound, more user-friendly applications and a wider selection of programs. By the 1990s, both Apple and Commodore faced stiff competition, even though their computers were lauded by critics worldwide. Critical praise was not enough for Commodore, which filed for bankruptcy in 1994 while its Amiga computer line lingered on; by the end of the decade Commodore had completely shuttered its doors, effectively killing the Amiga line. Apple was in better shape, but only slightly. Having lost market share to the increasingly cheap, Windows-bundled IBM compatibles, Apple tried to expand its appeal by producing PDAs (personal digital assistants), but continued to hemorrhage money. By 1997, several CEOs had tried and failed to revive the Apple brand, which opened the way for Steve Jobs to take over as CEO. Jobs immediately struck a deal with Microsoft to bring the latter's Office products to the Macintosh, which boosted the brand's appeal. Later that year, Apple copied the direct-sales model of IBM-clone producers Dell and Gateway, offering custom-built Macintoshes through its online Apple Store.

6 Considerations

By 2001, IBM clones (now called simply PCs) were the dominant computing platform, yet, ironically, IBM itself had been reduced over those decades to a minor player in the personal computing market. Microsoft's adaptation of an Apple-like GUI had made it one of the most powerful corporations in the world, leading many critics to charge that Microsoft was a monopoly. While Microsoft was embroiled in legal battles in the United States and Europe, Apple expanded its profile by acquiring film-editing software companies and programs, but it truly revived itself with the advent of the iPod personal music player. iPod sales gave Apple the funds necessary to revamp its Macintosh brand and begin to compete on a serious level with its rival, the Microsoft-based PC. At about the same time, open-source operating systems such as Linux (including distributions like Red Hat) allowed PC users to move away from Microsoft's stranglehold on the personal computing market. The internet, which had proliferated in the late 1990s, exploded throughout the 2000s despite the bursting of the "dot-com bubble," turning once-inconsequential companies specializing in search engines and browsing software into household names. Many of the functions once reserved for Microsoft software became available on the internet for less (or even for free), which reduced demand for many of Microsoft's products such as Word, Outlook and Excel. At the beginning of the decade, Apple computers were in about 2 percent of all households, schools and other institutions; by 2007, Apple controlled almost 8 percent of the personal computing market, making it a serious competitor to the PC platform.

Michael Hinckley received a Bachelor of Arts degree in US history from the University of Cincinnati and a Master of Arts degree in Middle East history from the University of California at Santa Barbara. Hinckley is conversant in Arabic and is a part-time lecturer at two Midwestern universities.
