Today’s personal computers are drastically different from the massive, hulking machines that emerged out of World War II—and the difference isn’t only in their size. By the 1970s, technology had evolved to the point that individuals—mostly hobbyists and electronics buffs—could purchase unassembled PCs or “microcomputers” and program them for fun, but these early PCs could not perform many of the useful tasks that today’s computers can. Users could do mathematical calculations and play simple games, but most of the machines’ appeal lay in their novelty. Today, hundreds of companies sell personal computers, accessories and sophisticated software and games, and PCs are used for a wide range of functions from basic word processing to editing photos to managing budgets. At home and at work, we use our PCs to do almost everything. It is nearly impossible to imagine modern life without them.
Invention of the PC: The Computer Age
The earliest electronic computers were not “personal” in any way: They were enormous and hugely expensive, and they required a team of engineers and other specialists to keep them running. One of the first and most famous of these, the Electronic Numerical Integrator and Computer (ENIAC), was built at the University of Pennsylvania to do ballistics calculations for the U.S. military during World War II. ENIAC cost $500,000, weighed 30 tons and took up nearly 2,000 square feet of floor space. On the outside, ENIAC was covered in a tangle of cables, hundreds of blinking lights and nearly 6,000 mechanical switches that its operators used to tell it what to do. On the inside, almost 18,000 vacuum tubes carried electrical signals from one part of the machine to another.
Invention of the PC: Postwar Innovations
ENIAC and other early computers proved to many universities and corporations that the machines were worth the tremendous investment of money, space and manpower they demanded. (For example, ENIAC could solve in 30 seconds a missile-trajectory problem that could take a team of human “computers” 12 hours to complete.) At the same time, new technologies were making it possible to build computers that were smaller and more streamlined. In 1948, Bell Labs introduced the transistor, an electronic device that carried and amplified electrical current but was much smaller than the cumbersome vacuum tube. Ten years later, scientists at Texas Instruments and Fairchild Semiconductor came up with the integrated circuit, an invention that incorporated all of the computer’s electrical parts (transistors, capacitors, resistors and diodes) into a single silicon chip.
But one of the most significant inventions that paved the way for the PC revolution was the microprocessor. Before microprocessors were invented, computers needed a separate integrated-circuit chip for each one of their functions. (This was one reason the machines were still so large.) Microprocessors were the size of a thumbnail, and they could do things the integrated-circuit chips could not: They could run the computer’s programs, remember information and manage data all by themselves.
The first microprocessor on the market was developed in 1971 by an engineer at Intel named Ted Hoff. (Intel was located in California’s Santa Clara Valley, a place nicknamed “Silicon Valley” because of all the high-tech companies clustered around the Stanford Industrial Park there.) Intel’s first microprocessor, a 1/16-by-1/8-inch chip called the 4004, had the same computing power as the massive ENIAC.
The Invention of the PC
These innovations made it cheaper and easier to manufacture computers than ever before. As a result, the small, relatively inexpensive “microcomputer” (soon known as the “personal computer”) was born. In 1974, for instance, a company called Micro Instrumentation and Telemetry Systems (MITS) introduced a mail-order build-it-yourself computer kit called the Altair. Compared to earlier microcomputers, the Altair was a huge success: Thousands of people bought the $400 kit. However, it really did not do much. It had no keyboard and no screen, and its output was just a bank of flashing lights. Users input data by flipping toggle switches.
In 1975, MITS hired Harvard student Bill Gates and his friend Paul G. Allen to adapt the BASIC programming language for the Altair. The software made the computer easier to use, and it was a hit. In April 1975 the two young programmers took the money they made from “Altair BASIC” and formed a company of their own, Microsoft, which soon became an empire.
The year after Gates and Allen started Microsoft, two members of the Homebrew Computer Club in Silicon Valley named Steve Jobs and Stephen Wozniak built a homemade computer that would likewise change the world. This computer, called the Apple I, was more sophisticated than the Altair: It had more memory, a cheaper microprocessor and video output that could drive a monitor. In April 1977, Jobs and Wozniak introduced the Apple II, which had a keyboard and a color screen. Also, users could store their data on an external cassette tape. (Apple soon swapped those tapes for floppy disks.) To make the Apple II as useful as possible, the company encouraged programmers to create “applications” for it. For example, a spreadsheet program called VisiCalc made the Apple II a practical tool for all kinds of people (and businesses), not just hobbyists.
The PC Revolution
The PC revolution had begun. Soon companies like Xerox, Tandy, Commodore and IBM entered the market, and computers became ubiquitous in offices and eventually homes. Innovations like the graphical user interface, which allows users to select icons on the computer screen instead of typing complicated commands, and the computer mouse made PCs even more convenient and user-friendly. Today, laptops, smartphones and tablet computers allow us to have a PC with us wherever we go.