Today, playing games is one of the most popular computing activities. In the early days of computing, games offered a way to test AI techniques (see artificial intelligence). Games have also encouraged the development of more sophisticated graphics (see computer graphics) and ways of interacting with the machine (see user interface).
Games and AI
Although modern computer games often blend genres, several recognizably distinct types of game have developed over the past half century or so. The first were computer versions of existing board games. “Deterministic” games (those with no element of chance) such as tic-tac-toe and, more important, checkers and chess offered a challenge to the first computer scientists, who were seeking to make machines perform tasks usually attributed to human intelligence. Alan Turing and Claude Shannon both developed chess-playing programs, although Turing’s came at a time when computers were still too primitive to handle the volume of calculation required, so he executed it by hand. By the time a computer program (Deep Blue) defeated the world chess champion in 1997, the AI field had long since left the game behind (see chess and computers).
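At the heart of these early efforts was exhaustive search of the game tree, later formalized as the minimax algorithm. The following Python sketch (illustrative only, not any historical program) shows the idea for tic-tac-toe, which is small enough to search completely:

    # Minimax for tic-tac-toe: a minimal, illustrative sketch.
    # The board is a list of 9 cells holding 'X', 'O', or None.
    LINES = [(0,1,2), (3,4,5), (6,7,8), (0,3,6), (1,4,7), (2,5,8), (0,4,8), (2,4,6)]

    def winner(board):
        for a, b, c in LINES:
            if board[a] is not None and board[a] == board[b] == board[c]:
                return board[a]
        return None

    def minimax(board, player):
        """Score a position for 'X': +1 win, 0 draw, -1 loss, assuming best play."""
        w = winner(board)
        if w is not None:
            return 1 if w == 'X' else -1
        moves = [i for i, cell in enumerate(board) if cell is None]
        if not moves:
            return 0  # board full: a draw
        other = 'O' if player == 'X' else 'X'
        scores = []
        for i in moves:
            board[i] = player              # try the move...
            scores.append(minimax(board, other))
            board[i] = None                # ...then undo it
        return max(scores) if player == 'X' else min(scores)

Tic-tac-toe’s entire game tree can be searched this way; chess cannot, which is why programs like Turing’s and Shannon’s substituted limited-depth search and heuristic evaluation for exhaustive analysis.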
Simulation Games
Military planners had devised war games since the 19th century, but the complexity of modern warfare (logistics as well as tactics) cried out for the help of the computer. By 1955 the U.S. military was running large-scale cold war simulations pitting NATO against the USSR and the Eastern bloc. Unlike deterministic games such as chess, war games generally use complex rules to capture many interacting factors, such as a military unit’s morale, experience, and firepower, or the performance of an air defense system against different types of targets. The results are more or less realistic depending on how many factors are properly accounted for; often only later combat experience will tell.
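To illustrate how such rules might combine deterministic factors with chance, here is a deliberately simplified, hypothetical combat-resolution routine in Python (the factor names and weightings are invented for the example):

    import random

    def resolve_attack(firepower, morale, experience, defense):
        """Hypothetical combat resolution: deterministic unit factors
        scaled by a random roll that supplies the element of chance."""
        effectiveness = firepower * (0.5 + 0.5 * morale) * (0.8 + 0.4 * experience)
        roll = random.uniform(0.5, 1.5)  # the fortunes of war
        return max(0.0, effectiveness * roll - defense)

    # A veteran, high-morale unit attacking a dug-in defender:
    damage = resolve_attack(firepower=10, morale=0.9, experience=0.8, defense=6)

Real military simulations chain many such rules together; the modeling question is always which factors matter and how heavily to weight them.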
Game theory (the mathematics of competitive situations) and economics also proved to be fruitful areas for computer simulation. In 1959 Carnegie Tech (later Carnegie Mellon University) introduced a simulation called “The Management Game.” Until the 1980s, however, the lack of inexpensive computing power kept sophisticated simulations limited to large institutions such as the military, government, universities, and major corporations.
Today simulation games are popular in both the educational and consumer markets. They include flight simulators; a variety of sports, including baseball, football, soccer, and golf; and games in which the player strives to build a 19th-century railroad empire or run a modern city. Indeed, some games, such as the popular kingdom-building simulator Civilization or the complex SimCity, while marketed primarily as entertainment, could easily fit into a social studies curriculum.
Arcade and Graphic Games
Starting in the 1960s, CRT (television-like) displays gave the new minicomputers the means to display simple graphics. In 1962, an intrepid band of hackers at MIT created Spacewar, the first interactive graphical game and the forerunner of the arcade boom of the 1970s. When the first home computers from Apple, Commodore, Atari, and IBM hit the market in the late 1970s and early 1980s, they included rudimentary (but often colorful) graphics capabilities. Many amateur programmers used the computers’ built-in BASIC language to create games such as lunar lander simulators and Star Trek–style space battles. Around the same time, Atari and other companies introduced home game cartridge machines, while the arcade game Pac-Man became a phenomenal success in 1980 (see game consoles).
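Those BASIC lunar lander games boiled down to a few lines of physics applied once per turn; a rough Python sketch of the same logic (with assumed constants) looks like this:

    # Sketch of the classic turn-based lunar-lander logic.
    altitude, velocity, fuel = 1000.0, 0.0, 150.0   # m, m/s (down is positive), fuel units
    GRAVITY, THRUST_PER_UNIT, DT = 1.6, 0.3, 1.0    # lunar gravity; other constants assumed

    while altitude > 0:
        burn = max(0.0, min(fuel, float(input("Fuel to burn this second: ") or 0)))
        fuel -= burn
        accel = GRAVITY - burn * THRUST_PER_UNIT    # burning fuel slows the descent
        velocity += accel * DT
        altitude -= velocity * DT
        print(f"alt={altitude:7.1f}  vel={velocity:6.1f}  fuel={fuel:5.1f}")

    print("Landed safely." if velocity < 5 else "Crashed.")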
Role-playing, Real-time, and Social Worlds
Around the time of the first home computers, a noncomputer game called Dungeons and Dragons became extremely popular. “D&D” and other role-playing games allowed players to create and portray characters, with elaborate rules used to resolve events such as battles. Role-playing games soon began to appear on PCs; early examples include the Wizardry and Ultima series. Meanwhile, text-based adventure games were becoming popular on early computer networks, particularly at universities. These evolved into MUDs (Multi-User Dungeons), in which players’ characters could interact with one another. Eventually many of these programs went beyond their adventuring roots to create a variety of social worlds in a sort of text-based virtual reality.
By the 1990s, the typical PC had a special circuit (see graphics card) capable of displaying millions of colors, together with video memory (now 256 MB or more) that could hold the complex images needed for high-resolution animation. Computer game graphics have become increasingly complex (see computer graphics), including realistic textures, shading and lighting, smooth animation, and special effects rivaling Hollywood’s. (Compare, for example, the early wireframe graphics of the original Wizardry with games such as Diablo II and Warcraft, whose animated characters move through richly textured worlds.)
The way players interact with the game world has also changed significantly. The first computer games tended to divide into turn-based strategy and role-playing games on the one hand and real-time, arcade-style “shoot ’em ups” on the other. Today, however, most games, regardless of genre, run in real time, requiring players to interact continuously with the unfolding game situation.
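The difference shows up in the program’s main loop: a turn-based game blocks waiting for the player’s command, while a real-time game advances the world on a fixed clock whether or not the player acts. A schematic Python sketch (the world, input, update, and render functions are placeholders):

    import time

    TICK = 1.0 / 30.0   # a common fixed timestep: 30 world updates per second

    def real_time_loop(world, get_input, update, render):
        """Schematic real-time loop. Unlike a turn-based loop, update()
        runs every tick even when get_input() returns nothing."""
        previous = time.monotonic()
        lag = 0.0
        while world.running:
            now = time.monotonic()
            lag += now - previous
            previous = now
            commands = get_input()        # non-blocking; may be empty
            while lag >= TICK:            # catch up in fixed steps
                update(world, commands)
                lag -= TICK
            render(world)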
By the late 1990s gaming was no longer a solitary pursuit. The Internet made it possible to offer game worlds in which thousands of players could participate simultaneously (see online games). Games such as EverQuest and Asheron’s Call have thousands of devoted players who spend many hours developing their characters’ skills, while open-ended worlds such as Second Life seem no longer to be games at all but virtual, parallel universes with a full range of social interaction. However, the increased realism of modern games has also heightened controversy about in-game violence and other antisocial behavior, as in the Grand Theft Auto series. (Although there is a rating system for games similar to that for movies, its effectiveness in keeping adult-themed games out of the hands of young children seems limited.)
Game Development
The emphasis on state-of-the-art animation, graphics, and multiplayer design has changed the way games are developed. The earliest home computer games were typically the product of a single designer’s vision, such as Chris Crawford’s Balance of Power or Richard Garriott’s (“Lord British”) Ultima series. Today, however, commercially competitive games are the product of teams that include graphics, animation, and sound specialists, actors and voice talent, and other experts in addition to the game designers. While earlier games might be compared to books with single authors, modern game developers often compare their industry to the movie industry with its dominant studios. And, as with the movie industry, critics have argued that the high cost of development and of access to the market has led to much imitation of successful titles and less innovation.
On the other hand, a variety of modern programming environments (such as Visual Basic or even Macromedia Flash) make it easy for young programmers to get a taste of game programming, and for amateur programmers to create games that can be distributed via the Internet (see shareware and freeware). Although computer science departments have been slow to recognize the attraction and value of game programming, a variety of academic programs are now emerging. These range from computer arts, graphics, and animation programs to a full-fledged four-year degree in game design at the University of California, Santa Cruz, which includes not only courses in game design and programming but also courses on the game business and even ethics.