Making Video Games is a tricky business. While they're most often compared to movies, they have both their own culture and very different technologies driving them. A game is not only a complex computer program, it can have a full-blown GUI and render 3D graphics in real time (as opposed to the nearest non-interactive equivalent, CGI movies, which have the luxury of using existing software to "film" and can spend hours or even days on a single frame, rather than 1/60 of a second). The technologies involved also shift much faster than in movies. TV and movie writers, however, have very little in the way of first-hand experience with their sister industry (even game writing, which arguably overlaps the most with "normal" scriptwriting, requires the writer to make the gameplay and narrative complement each other). The result is this trope; other media tend to misunderstand the complex process of making a game. A few simple points that address common misconceptions are below:
The days of commercial games being made by a guy in his basement on his own are gone. A commercially viable game made from scratch requires a large team to put together thanks to the many skill sets needed: graphic programmers, physics programmers, AI designers, level designers, character modelers...
- Various "libraries" (collections of pre-written code), some open-source and some paid, have been developed to handle part of this task. Games with smaller teams often use these libraries (for example, a standard API to handle the graphics) or license an existing Game Engine.
- More recently, this idea has made a comeback, since Digital Distribution services like Steam and the iPhone app store allow smaller developers to produce and sell simpler games. The prohibitive cost of "dev stations" (modified consoles which allow prototype versions of a game to be played on them) keeps this from working for big console releases or multiplayer Party Games, and so-called "indie" games tend to be relatively short, with a simple (often retro) aesthetic.
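The point of licensing a library or engine is that game code talks to a standard interface instead of low-level hardware details. Here is a toy Python sketch of that idea (every name here is invented for illustration; real graphics APIs are vastly more involved): the game calls an abstract renderer, so the backend can be swapped or licensed rather than written from scratch.

```python
from abc import ABC, abstractmethod

class Renderer(ABC):
    """Toy stand-in for a graphics library's drawing API."""
    @abstractmethod
    def draw_sprite(self, name, x, y): ...

class ConsoleRenderer(Renderer):
    """One possible backend; a real library would drive the GPU."""
    def __init__(self):
        self.calls = []
    def draw_sprite(self, name, x, y):
        self.calls.append((name, x, y))

def render_frame(renderer):
    # Game code only knows the abstract interface, so swapping the
    # backend (or the whole library) doesn't mean rewriting the game.
    renderer.draw_sprite("player", 4, 2)
    renderer.draw_sprite("enemy", 9, 7)

r = ConsoleRenderer()
render_frame(r)
```

This is why small teams can ship at all: the hard, reusable parts live behind the interface, and the team writes only the game-specific calls on top.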
There are a number of distinct groups involved in the technical side of game production. These are not formal or standardized, so the lines between them can be blurred (for example, programmers might know something about design, and designers probably know the basics of programming).
- Game Designers develop the rules, mechanics, and systems of the game. In an RTS, they decide what stats units have; in an FPS, what each gun does; and so forth. Generally they are led by a "Lead Designer" or similar, who is roughly analogous to a film director. Though they are arguably the most essential element in production, they tend not to show up much in media, since (unlike programmers, artists, or executives) most people don't have a clear handle on what they actually do.
- Programmers write the skeleton and muscles of the game (as it were); they create the actual program that makes everything move, allows enemies to think, and so on. When their job isn't done, you end up with Vaporware (i.e. nothing) or, at best, an obvious Game Maker work.
- Artists (including visual artists, writers, animators, musicians, etc.) put the flesh and skin on this skeleton; without their sounds and visuals Pac-Man Fever would still be a Truth in Television. One of the major debates in game design is whether good games tend to start with designers or these guys -- obviously, designers are essential for good gameplay, but starting with a solid story or art tends to make a more "cinematic" experience.
- Quality Assurance, as their name suggests, extensively test the game for bugs, balance issues, and hardware/software compatibility. While sometimes dismissed as glorified testers, a good QA tester is invaluable for making sure the game's design and programming are all working like they should. When their job isn't done right, you end up with an Obvious Beta. Note that being a QA tester is not a slacker's dream job where you get to play awesome video games all day and get paid fat bucks for it. This Penny Arcade comic (and its accompanying news item) provides some insight.
- Producers foot the bill for the project as a whole and have various oversight roles, making sure that everybody else is doing what they're supposed to be, and is on time and budget. They have a reputation for being curmudgeonly bastards with no respect for Art, but they have an important role keeping the project grounded in reality. Often the producers are a separate company from the developers, which certain people -- fans, the media, Jack Thompson -- are apt to forget.
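The "skeleton" the programmers build is usually organized around a central game loop: read input, update the simulation, render, and repeat, typically 60 times per second. A minimal, purely illustrative Python sketch of a fixed-timestep loop (the state and numbers are invented; a real loop would also poll input and draw to the screen):

```python
def run_game(total_frames, dt=1/60):
    """Toy fixed-timestep loop: update the simulation, then 'render'."""
    x, vx = 0.0, 3.0   # toy state: position and velocity (units/second)
    frames = []
    for _ in range(total_frames):
        x += vx * dt               # update step: physics, AI, input go here
        frames.append(round(x, 4)) # render step: draw the current state
    return frames

frames = run_game(60)
# after 60 frames at 1/60 s each, the object has moved ~3.0 units
```

Everything else the programmers write -- enemy AI, physics, menus -- is ultimately called from inside a loop like this, which is why a single slow routine can drag the whole game below its target frame rate.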
Game developers have a variety of working environments (as one might imagine given the number of different jobs).
- Programmers have Stereotype A, the screen full of unreadably-tiny code; most programmers are given small chunks of the game to work on, so only the head honchos usually have actual working copies of everything. Later in development, once the game has reached the alpha/beta stage, programmers will be called on to bug-hunt or otherwise make small modifications in the code, but the ability to see those changes in real time remains rare. (Most games take a non-trivial amount of time to compile, especially if changes have been made to art.)
- Some artists and designers have Stereotype B, the on-screen 3D model. This one shows up especially often in media, but isn't all that common in actual game design. Some games use a setup like this to design their levels, maps, and so forth, but unless it is part of an existing engine, creating utilities for level creation is the responsibility of the programmers. Such utilities tend to be very game-specific and not suited for use with other projects; dress them up a little and they become "map editors" and the like for making user-created content.
- For the most part, artists such as animators and musicians use their own sets of industry-standard tools, not too different from what they might use in film or their own work. Many of these specialists are freelancers, contracted by the development studio on a project-by-project basis.
- Designers have unusual workspaces, which tend to vary depending on the type of game and personal preference. Whiteboards and other means of mapping out information are common, since the designers often deal with problems such as interactions between rules, "cycles" of gameplay actions, and other processes that lend themselves well to visual representation.
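Those game-specific level utilities are often no fancier than a one-off data format plus a small loader that turns it into game data. A toy Python sketch (the format and all names are invented for illustration): `#` is a wall, `P` is the player spawn, `.` is open floor.

```python
# Hypothetical one-off level format for some tile-based game.
LEVEL = """\
#####
#P..#
#####"""

def load_level(text):
    """Parse the ASCII grid into wall coordinates and a spawn point."""
    walls, spawn = set(), None
    for y, row in enumerate(text.splitlines()):
        for x, ch in enumerate(row):
            if ch == "#":
                walls.add((x, y))
            elif ch == "P":
                spawn = (x, y)
    return walls, spawn

walls, spawn = load_level(LEVEL)
```

Wrap a GUI around a format like this and you have exactly the sort of in-house "map editor" described above -- useful for this one game, and useless for any other.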
Making a game follows several stages, which determine who is working on the game and what they're doing at any particular point. Specific steps vary from company to company, but typically include:
- Pre-alpha: This covers the beginning of the game's development, starting from the basic idea. In early pre-alpha, the game generally doesn't exist outside of prototypes and concepts, which means it can be easy to make sweeping changes ("what if it was co-op?" "what if the player was a rabbit?" "what if we added aliens?"). Eventually, a core concept and feature set emerge and are agreed upon, and programmers begin to hash out the essential features -- the basic functions the game will need to do what they want it to. At this point, the in-house artist(s), if any, will work on concept art that defines the visual and audio styles.
- Alpha: The alpha stage begins once the game is "feature-complete" and has its essential framework in place. Writers, artists, and sound designers are called in to begin fleshing out the game. Levels, maps, etc. are designed and implemented as well. On the programming side, bugs are the rule rather than the exception at this point, and much effort is put toward weeding them out.
- Beta: At this point, the product is sufficiently developed to allow play, if not necessarily smooth play. Generally, a beta will include not just gameplay but also a first pass on story, art, and sound. Testing is the focus at this point, to the extent that "beta" often refers specifically to the testing process. Beta tests can be "in-house", which includes developers and full-time testers, or "open", which recruit much more broadly (often from players of the company's other games). In either case, beta-testers are often required to sign agreements to the effect that they won't reveal any details about the game.
- Release: After a protracted beta period, spent pounding bugs flat and polishing the assets, the game is finally ready to sell. However, this isn't necessarily the end of the job. Often the team is kept on to work on Downloadable Content or sequels. In addition, modern games are expected to be "supported" for a period after launch, meaning that programmers will be kept on to fix bugs and other problems that show up after release.
- One episode of Law and Order: Criminal Intent featured a game designer as the murderer of the week. It portrayed him as one of only two people working on a game, asking whether the lighting on a level he was designing was OK (a designer places lamps in a level, and the engine uses them to light it), and a review apparently not only mentioned him by name but criticized his programming as "sterile". As anyone who's read a game review will know, an individual programmer won't be known by name, nor would a programmer be criticized for the graphics (that would be the job of the designer). It turned out, however, that the level contained an Easter Egg of the designer's name, and that the designer was one of the owners of the company.
- The 2001 horror movie How To Make A Monster was based around the premise of a monster from a game killing off its creators. As a remake of a movie that originally used a movie monster, it makes some mistakes. According to the typically vitriolic Something Awful review, a team of three people (responsible for AI, sounds, and weapons, respectively) is given a month to make a computer game. No wonder they manage such an Epic Failure that the game actually starts killing them.
- CSI: Miami had an episode where a game executive provided some teenagers with TEC-9s and encouraged them to act out events from his GTA clone to build up hype.
- And to add insult to injury, the episode made him much more of a Jerkass than even the most hated of real game industry suits.
- The chick-lit novel Lucy Crocker 2.0 by Caroline Preston. The heroine is a housewife and one-time artist who helps her programmer husband make wildly successful computer games. Unfortunately, Preston can't even accurately describe a woman checking her e-mail, much less what goes into designing a game. The process seems to consist of Lucy Crocker painting something with watercolors, and her husband scanning the image into his computer.
- There was an episode of Veronica Mars where two geeks make a world-class video game in their dorm room, all by themselves.
- Grandma's Boy is essentially this trope layered over a Stoner Flick.
- The live action version of One Hundred and One Dalmatians had Roger's profession updated from music composer to game designer. The process of making a game apparently involved him taking a finished game (which he presumably made on his own) to a group of suits, who let an obnoxious child review it (after apparently playing it for a few minutes) and give him feedback.
- Commercials for Westwood College and Collins College follow this trope, as seen here and here. It's a good thing that guy was there to notice the graphics need tightening up.
- In an episode of King of the Hill, two nerds at the community college make a full-length Grand Theft Auto clone, with Hank as the protagonist, just to mock him. After giving Hank the only copy (which naturally works perfectly), they apparently just get bored of it and never release the game at all.
- It could be a mod of an existing game, which would be much more believable for two people to produce. And Hank is shown playing online with at least some other people.
- The eponymous game of Stay Alive was apparently made by one guy drawing creepy pictures in a notebook. Over the course of the movie we see almost his entire house and he doesn't even have a computer.
- Averted in John Sandford's Prey series, where Lucas Davenport only comes up with the storylines and rules (he started out doing wargame scenarios) and leaves the coding first to one expert, and later to an entire building of them. He initially tried to do all the coding himself, but quickly realized that it was beyond him.
- NCIS: Los Angeles had Callen go undercover as a game tester. It quickly becomes apparent that he is terrible at the job, so the team cuts the power to the building before the company's real testers can discover that he is faking.
- In addition to this, it ignores the fact that a mixed pool of testers (including less skilled players) would be quite desirable for QA purposes.
- The NCIS episode "Kill Screen" has the lead programmer for an online game insert a sophisticated piece of code into the game with the intent of creating a botnet supercomputer able to hack into the Pentagon. The company he works for is portrayed as a fairly large and successful organization, so it might seem implausible that he could sneak something like that into the program without it being detected by other programmers and testers (particularly since the software would be tested, as noted above). However, this is perfectly possible in reality, especially for a lead programmer: programmers are usually responsible for a distinct part of the app and are too busy to check what others are doing. Of course, the ideas that a) the Pentagon would be vulnerable to such a botnet and b) an MMO would be the best way to create one are... dubious at best.
- The David Cronenberg film eXistenZ depicts Allegra Geller as the world's premier designer of the eponymous game. Aside from playing No Plans, No Prototype, No Backup egregiously straight, none of her associates even seem to have the slightest idea what eXistenZ is actually about. Justified, as none of the people playing transCendenZ are professional game designers.