With the digital computer now more than 60 years old, there has been growing interest in its history and development. Although it would take a library of books to do the subject justice, providing a summary of the main themes and trends of each decade of computing will give readers of this book some helpful context for understanding the other entries.
Early History
In a sense, the idea of mechanical computation emerged in prehistory when early humans discovered that they could use physical objects such as piles of stones, notches, or marks as a counting aid. The ability to perform computation beyond simple counting extends back to the ancient world: For example, the abacus developed in ancient China could still beat the best mechanical calculators as late as the 1940s (see calculator). Mechanical calculators first appeared in the West in the 17th century, most notably the machines created by the philosopher-scientist Blaise Pascal. Other devices such as “Napier’s bones” (an ancestor of the slide rule) depended on proportional or logarithmic relationships (see analog computer).
While the distinction between a calculator and a true computer is subtle, Charles Babbage’s work in the 1830s delineated the key concepts. His “analytical engine,” conceived but never built, would have incorporated punched cards for data input (an idea taken over from the weaving industry), a central calculating mechanism (the “mill”), a memory (“store”), and an output device (printer). The ability to input both program instructions and data would enable such a device to solve a wide variety of problems (see Babbage, Charles).
Babbage’s thought represented the logical extension of the worldview of the industrial revolution to the problem of calculation. The computer was a “loom” that wove mathematical patterns. While Babbage’s advanced ideas became largely dormant after his death, the importance of statistics and information management would continue to grow with the development of the modern industrial state in Europe and the United States throughout the 19th century. The punch card as data store and the creation of automatic tabulation systems would reemerge near the end of the century (see Hollerith, Herman).
During the early 20th century, mechanical calculators and card tabulation and sorting machines made up the data processing systems for business, while researchers built special-purpose analog computers for exploring problems in physics, electronics, and engineering. By the late 1930s, the idea of a programmable digital computer emerged in the work of theoreticians (see Turing, Alan and von Neumann, John).
1940s
The highly industrialized warfare of World War II required the rapid production of a large volume of accurate calculations for such applications as aircraft design, gunnery control, and cryptography. Fortunately, the field was now ripe for the development of programmable digital computers. Many reliable components were available to the computer designer, including switches and relays from the telephone industry, card readers and punches (manufactured by Hollerith’s corporate descendant, IBM), and vacuum tubes used in radio and other electronics.
Early computing machines included the Mark I (see Aiken, Howard), a huge calculator driven by electrical relays and controlled by punched paper tape. Another machine, the prewar Atanasoff-Berry Computer (see Atanasoff, John), was never completed but demonstrated the use of electronic (vacuum tube) components, which were much faster than electromechanical relays. Meanwhile, a German inventor built a programmable binary computer that combined a mechanical number storage mechanism with telephone relays (see Zuse, Konrad). Zuse also proposed building an electronic (vacuum tube) computer, but the German government decided not to support the project.
During the war, British code breakers built a specialized electronic computer called Colossus, which read encoded teleprinter transmissions from tape and helped break the German high command’s Lorenz cipher, a system even more elaborate than the better-known Enigma.
The most viable general-purpose computers were developed by J. Presper Eckert and John Mauchly starting in 1943 (see Eckert, J. Presper and Mauchly, John). The first, ENIAC, was completed in 1946 and had been intended to perform ballistic calculations. While its programming facilities were primitive (programs had to be set up via a plugboard), ENIAC could perform 5,000 arithmetic operations per second, about a thousand times faster than the electromechanical Mark I. ENIAC had about 19,000 vacuum tubes and consumed as much power as perhaps a thousand modern desktop PCs.
1950s
The 1950s saw the establishment of a small but viable commercial computer industry in the United States and parts of Europe. Eckert and Mauchly formed a company to design and market the UNIVAC, based partly on work on the experimental EDVAC. This new generation of computers would incorporate the key concept of the stored program: Rather than the program being set up by wiring or simply read sequentially from tape or cards, the program instructions would be stored in memory just like any other data. Besides allowing a computer to fetch instructions at electronic rather than mechanical speeds, storing programs in memory meant that one part of a program could refer to another part during operation, allowing for such mechanisms as branching, looping, the running of subroutines, and even the ability of a program to modify its own instructions.
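To make the stored-program idea concrete, the short Python sketch below (a modern illustration, not modeled on any particular 1950s machine; the instruction names are invented) keeps instructions and data in one shared memory, so an instruction can name another address and the program can branch and loop.

```python
# A toy stored-program machine: instructions and data share one memory.
memory = [
    ("LOAD", 7),           # 0: load the value at address 7 into the accumulator
    ("ADD", 8),            # 1: add the value at address 8
    ("STORE", 7),          # 2: store the accumulator back to address 7
    ("JUMPLT", 0, 7, 9),   # 3: if memory[7] < memory[9], jump back to address 0 (a loop)
    ("HALT",),             # 4: stop
    None, None,            # 5-6: unused
    0,                     # 7: running total (data)
    2,                     # 8: increment (data)
    10,                    # 9: loop limit (data)
]

pc, acc = 0, 0             # program counter and accumulator
while True:
    instr = memory[pc]
    op = instr[0]
    if op == "LOAD":
        acc = memory[instr[1]]
    elif op == "ADD":
        acc += memory[instr[1]]
    elif op == "STORE":
        memory[instr[1]] = acc
    elif op == "JUMPLT":
        if memory[instr[2]] < memory[instr[3]]:
            pc = instr[1]   # branch: the program counter is just another number
            continue
    elif op == "HALT":
        break
    pc += 1

print(memory[7])  # 10: the loop ran until the total reached the limit
```

Because the loop limit and the running total live in the same memory as the instructions, such a program could in principle also overwrite its own instructions, which is the self-modification mentioned above.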
The UNIVAC became a hit with the public when it was used to correctly predict the outcome of the 1952 presidential election. Government offices and large corporations began to look toward the computer as a way to solve their increasingly complex data processing needs. Forty UNIVACs were eventually built and sold to such customers as the U.S. Census Bureau, the U.S. Army and Air Force, and insurance companies. Sperry (having bought the Mauchly-Eckert company), Bendix, and other companies had some success in selling computers (often for specialized applications), but it was IBM that eventually captured the broad business market for mainframe computers.
The IBM 701 (marketed to the government and defense industry) and 702 (for the business market) incorporated several emerging technologies, including a fast electronic (tube) memory that could store 4,096 36-bit data words, a rotating magnetic drum for data that was not immediately needed, and magnetic tape for backup. The IBM 650, marketed starting in 1954, became the (relatively) inexpensive workhorse computer for businesses (see mainframe). The IBM 704, introduced in 1955, incorporated magnetic core memory and also featured floating-point calculations.
1960s
The 1960s saw the advent of a “solid state” computer design featuring transistors in place of vacuum tubes and the use of ferrite magnetic core memory (introduced commercially in 1955). These innovations made computers more compact (although they were still large by modern standards), more reliable, and less expensive to operate (due to lower power consumption). The IBM 1401 was a typical example of this new technology: It was compact, relatively simple to operate, and came with a fast printer that made it easy to produce printed output.
There was a natural tendency to increase the capacity of computers by adding more transistors, but the hand-wiring of thousands of individual transistors was difficult and expensive. As the decade progressed, however, the concept of the integrated circuit began to be implemented in computing. The first step in that direction was to attach a number of transistors and other components to a ceramic substrate, creating modules that could be handled and wired more easily during the assembly process.
IBM applied this technology to create what would become one of the most versatile and successful lines in the history of computing, the IBM System/360 computer. This was actually a series of 14 models that offered successively greater memory capacity and processing speed while maintaining compatibility, so that programs developed on a smaller, cheaper model would also run on the more expensive machines. Compatibility was ensured by devising a single 360 instruction set that was implemented at the machine level by microcode stored in ROM (read-only memory) and optimized for each model. By 1970 IBM had sold more than 18,000 360 systems worldwide.
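The following toy Python sketch suggests how a single instruction set can be shared across differently built machines; the opcodes and “models” are invented for illustration and are not the real System/360 architecture, but the per-model dispatch tables play roughly the role that microcode played.

```python
# Two "models" accept exactly the same programs because each maps the shared
# opcodes to its own internal operations (a stand-in for per-model microcode).
def make_model(micro_ops):
    """Return an interpreter for the shared instruction set, backed by this
    model's own table of opcode -> internal operation."""
    def run(program, registers):
        for opcode, *args in program:
            micro_ops[opcode](registers, *args)   # model-specific implementation
        return registers
    return run

fast_model = make_model({
    "LOAD": lambda r, reg, val: r.__setitem__(reg, val),
    "ADD":  lambda r, dst, src: r.__setitem__(dst, r[dst] + r[src]),
})
cheap_model = make_model({
    "LOAD": lambda r, reg, val: r.update({reg: val}),
    "ADD":  lambda r, dst, src: r.update({dst: r[dst] + r[src]}),
})

program = [("LOAD", "R1", 2), ("LOAD", "R2", 3), ("ADD", "R1", "R2")]
print(fast_model(program, {}))   # {'R1': 5, 'R2': 3}
print(cheap_model(program, {}))  # same result: the program runs unchanged on both
```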
By the mid-1960s, however, a new market segment had come into being: the minicomputer. Pioneered by Digital Equipment Corporation (DEC) with its PDP line, the minicomputer was made possible by rugged, compact solid-state (and increasingly integrated) circuits. Architecturally, the mini usually had a shorter data word length than the mainframe and used indirect addressing (see addressing) for flexibility in accessing memory. Minis were practical for offices and research labs that could not afford (or house) a mainframe (see minicomputer). They were also a boon to the emerging use of computers in automating manufacturing, data collection, and other activities, because a mini could fit into a rack with other equipment (see also embedded systems). In addition to DEC, Control Data Corporation (CDC) produced both minis and large high-performance machines (the 6600 and later the Cyber series), the first truly commercially viable supercomputers (see supercomputer).
In programming, the main innovation of the 1960s was the promulgation of the first widely used high-level programming languages, COBOL (for business) and FORTRAN (for scientific and engineering calculations), the result of research in the late 1950s. While some progress had been made earlier in the decade in using symbolic names for quantities and memory locations (see assembler), the new higher-level languages made it easier for professionals outside the computer field to learn to program and made the programs themselves more readable, and thus easier to maintain. The invention of the compiler (a program that could read other programs and translate them into low-level machine instructions) was yet another fruit of the stored program concept.
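As a rough illustration of what a compiler does (the “instruction set” below is invented, and real compilers of the era emitted actual machine code rather than a Python list), the sketch translates a high-level arithmetic expression into a sequence of low-level, stack-machine-style instructions.

```python
# A toy "compiler": translate a high-level expression into low-level steps.
import ast

def compile_expr(source):
    """Translate an expression like "a + b * 2" into stack-machine code."""
    ops = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL", ast.Div: "DIV"}
    code = []

    def emit(node):
        if isinstance(node, ast.BinOp):      # compile operands, then the operator
            emit(node.left)
            emit(node.right)
            code.append((ops[type(node.op)],))
        elif isinstance(node, ast.Name):     # variable reference
            code.append(("LOAD", node.id))
        elif isinstance(node, ast.Constant): # literal value
            code.append(("PUSH", node.value))

    emit(ast.parse(source, mode="eval").body)
    return code

print(compile_expr("a + b * 2"))
# [('LOAD', 'a'), ('LOAD', 'b'), ('PUSH', 2), ('MUL',), ('ADD',)]
```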
1970s
The 1970s saw minis becoming more powerful and versatile. The DEC VAX (“Virtual Address Extension”) series allowed larger amounts of memory to be addressed and increased flexibility. Meanwhile, at the high end, Seymour Cray left CDC to form Cray Research, a company that would produce the world’s fastest supercomputer, the compact, Freon-cooled Cray-1. In the mainframe mainstream, IBM’s 370 series maintained that company’s dominant market share in business computing.
The most striking innovation of the decade, however, was the microprocessor (now often called simply the “computer chip”), which combined three basic ideas: an integrated circuit so compact that it could be laid out on a single silicon chip, the design of that circuit to perform the essential addressing and arithmetic functions required for a computer, and the use of microcode to embody the fundamental instructions. Intel’s 4004, introduced in late 1971, was originally designed for a calculator company. When that deal fell through, Intel began distributing the microprocessors in developers’ kits to encourage innovators to design computers around them. Soon Intel’s upgraded 8008 and 8080 microprocessors were available, along with offerings from Rockwell, Texas Instruments, and other companies.
Word of the microprocessor spread through the electronic hobbyist community and was given a boost by the January 1975 issue of Popular Electronics, which featured the Altair computer kit, available from an Albuquerque company called MITS for about $400. Designed around the Intel 8080, the Altair featured an expansion bus (an idea borrowed from minis).
The Altair was hard to build and had very limited memory, but it was soon joined by ready-to-use microcomputer systems from other companies; these machines became known as personal computers (PCs). By 1980, entries in the field included Apple (Apple II), Commodore (PET), and Radio Shack (TRS-80). These computers shared certain common features: a microprocessor, memory in the form of plug-in chips, read-only memory chips containing a rudimentary operating system and a version of the BASIC language, and an expansion bus to which users could connect peripherals such as disk drives or printers.
The spread of microcomputing was considerably aided by the emergence of a technical culture where hobbyists and early adopters wrote and shared software, snatched up a variety of specialized magazines, talked computers in user groups, and evangelized for the cause of widespread personal computing.
Meanwhile, programming and the art of software development did not stand still. Innovations of the 1970s included the philosophy of structured programming (featuring well-defined control structures and methods for passing data to and from subroutines and procedures). New languages such as Pascal and C, building on the earlier Algol, supported structured programming design to varying degrees (see structured programming). Programmers on college campuses also had access to UNIX, a powerful operating system containing a relatively simple kernel, a shell for interaction with users, and a growing variety of utility programs that could be connected together to solve data processing problems (see unix). It was in this environment that the government-funded ARPANET developed protocols for communicating between computers and allowing remote operation of programs. Along with this came e-mail, the sharing of information in newsgroups (Usenet), and a growing web of links between networks that would eventually become the Internet (see internet).
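A brief Python sketch of the structured style mentioned above (the example itself is invented): a subroutine receives its data as parameters, returns a result, and is built from explicit loop and selection structures rather than unconstrained jumps.

```python
# Structured programming in miniature: one entry, one exit, data passed in
# and out through parameters and a return value.
def average_above(values, threshold):
    """Return the average of the values greater than threshold, or None."""
    total, count = 0.0, 0
    for v in values:          # well-defined loop structure
        if v > threshold:     # well-defined selection structure
            total += v
            count += 1
    return total / count if count else None

print(average_above([3, 8, 10, 1], 5))  # 9.0
```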
1980s
In the 1980s, the personal computer came of age. IBM broke from its methodical corporate culture and allowed a design team to come up with a PC that featured an open, expandable architecture. Other companies such as Compaq legally created compatible systems (called “clones”), and “PC-compatible” machines became the industry standard. Under the leadership of Bill Gates, Microsoft gained control of the operating system market and also became the dominant competitor in applications software (particularly office software suites).
Although unable to gain market share comparable to that of the PC and its clones, Apple’s innovative Macintosh, introduced in 1984, adapted user interface research from the Xerox PARC laboratory. At a time when PC compatibles were still using Microsoft’s text-based MS-DOS, the Mac sported a graphical user interface featuring icons, menus, and buttons, controlled by a mouse (see user interface). Microsoft responded by developing the broadly similar Windows operating environment, which started out slowly but had become competitive with Apple’s by the end of the decade.
The 1980s also saw great growth in networking. University computers running UNIX were increasingly linked through what was becoming the Internet, while office computers increasingly used local area networks (LANs) such as those based on Novell’s NetWare system. Meanwhile, PCs were also being equipped with modems, enabling users to dial up a growing number of on-line services ranging from giants such as CompuServe to a diversity of individually run bulletin board systems (see bulletin board systems).
In the programming field a new paradigm, object-oriented programming (OOP), was offered by languages such as Smalltalk and C++, a variant of the popular C language. The new style of programming focused on programs as embodying relationships between objects that are responsible for both private data and a public interface represented by methods, or capabilities offered to users of the object. Both structured and object-oriented methods attempted to keep up with the growing complexity of large software systems that might incorporate millions of lines of code. The federal government adopted the Ada language with its ability to precisely manage program structure and data operations. (See object-oriented programming and ada.)
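A minimal Python sketch of the object-oriented idea (the class is invented for illustration; Smalltalk and C++ express the same pattern in their own syntax): the object keeps its data private by convention and exposes a public interface of methods.

```python
# An object bundles private data with the public methods that manage it.
class BankAccount:
    def __init__(self, owner):
        self._owner = owner       # private data, managed only by the object
        self._balance = 0

    def deposit(self, amount):    # public method: part of the interface
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    def balance(self):            # callers query state only through methods
        return self._balance

acct = BankAccount("Ada")
acct.deposit(100)
print(acct.balance())  # 100
```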
1990s
By the 1990s, the PC was a mature technology dominated by Microsoft’s Windows operating system. UNIX, too, had matured and become the system of choice for university computing and the worldwide Internet. Although the potential of the Internet for education and commerce was beginning to be explored, at the beginning of the decade the network was far from friendly for the average consumer user.
This changed when Tim Berners-Lee, a researcher at the CERN physics laboratory near Geneva, combined hypertext (a way of linking documents together) with the Internet’s protocols to implement the World Wide Web. By 1994, Web browsing software that could display graphics and play sounds was available for Windows-based and other computers (see World Wide Web and Web browser). The remainder of the decade became a frenzied rush to identify and exploit business plans based on e-commerce, the buying and selling of goods and services on-line (see e-commerce). Meanwhile, educators demanded Internet access for schools.
In the office, the intranet (a LAN based on the Internet’s TCP/IP protocols) began to supplant earlier networking schemes. Belatedly recognizing the threat and potential posed by the Internet, Bill Gates plunged Microsoft into the Web server market, included the free Internet Explorer browser with Windows, and vowed that all Microsoft programs would work seamlessly with the Internet.
Moore’s Law, the dictum that computer power roughly doubles every 18 months, continued to hold true as PCs went from clock rates of a few tens of MHz to more than 1 GHz. RAM and hard disk capacity kept pace, while low-cost color printers, scanners, digital cameras, and video systems made it easier than ever to bring rich media content into the PC and the on-line world.
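As a back-of-the-envelope check of that dictum, the arithmetic below assumes a hypothetical 33 MHz PC at the start of the decade; the starting figure is chosen only for illustration.

```python
# Rough arithmetic for the "doubling every 18 months" dictum over the 1990s.
# The 33 MHz starting clock rate is a hypothetical figure for illustration.
start_mhz = 33
doublings = (10 * 12) / 18            # about 6.7 doublings in ten years
end_mhz = start_mhz * 2 ** doublings
print(round(end_mhz))                 # 3353 -- i.e., well past 1 GHz
```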
Beyond 2000
The new decade began with great hopes, particularly for the Web and multimedia “dot-coms,” but their stocks, inflated by unsustainable expectations, collapsed in 2000–2001. By the middle of the decade the computing industry had largely recovered and in many ways was stronger than ever. On the Web, new software approaches (see Ajax, application service provider, and service-oriented architecture) are changing the way services and even applications are delivered. The integration of search engines, mapping, local content, and user participation (see blogging, user-created content, and social networking) is changing the relationship between companies and their customers.
In hardware, Moore’s Law is now expressed not through faster single processors but through processors with two, four, or more processing “cores,” challenging software designers (see multiprocessing). Mobile computing is one of the strongest areas of growth (see pda and smartphone), with devices combining voice telephony, text messaging, e-mail, and Web browsing.
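To suggest why multiple cores challenge software designers, the small Python sketch below (the task and the worker count are invented for illustration) splits a CPU-bound job across several worker processes instead of relying on one faster processor; the work must be explicitly divided and the partial results recombined.

```python
# Splitting a CPU-bound task across worker processes (standard library only).
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division (deliberately CPU-bound)."""
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    # Divide the range into chunks, one per worker, then combine the counts.
    chunks = [(i, i + 50_000) for i in range(0, 200_000, 50_000)]
    with Pool(processes=4) as pool:                 # e.g., one worker per core
        print(sum(pool.map(count_primes, chunks)))  # total primes below 200,000
```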
The industry continues to face formidable challenges, ranging from mitigating environmental impact (see green pc) to the shifting of manufacturing and even software development to rapidly growing countries such as India and China (see globalism and the computer industry). Thus far, each decade has brought new technologies and methods to the fore, and few observers doubt that this will be true in the future.
Note: For a more detailed chronology of significant events in computing, see Appendix 1: “Chronology of Computing.” For more on emerging technologies, see trends and emerging technologies.