Let me preface this blog by thanking MobyGames for all of their high quality screenshots of classic games. Without their hard work and their kind permission to repost the screenshots in question, this post would not have been possible. Thanks a million!
Regulars here at Avault will have recently seen my review of Crysis and the very specific warnings I related about the game’s stiff system requirements. While there’s no denying that the trend in PC gaming is always toward programs that utilize more and more processing power, memory and storage space, Crysis bore special mention because its requirements are beyond even those of its contemporaries, in some cases significantly so. For many hardcore gamers, whose systems easily exceed the minimum requirements for Crysis, I’m betting my warning was merely seen as a justification for their extremely high end systems and then ignored. In fact, I’d be disappointed if anyone with a top of the line Alienware system felt intimidated by my statements.
However, we few, we happy few, who are PC gamers are not just gamers — we are also consumers of a variety of technological products that go beyond the games themselves, including operating systems, hard drives, RAM, CPUs, video cards, sound cards, network cards, modems, Internet connections, mice, keyboards, speakers (or a connection to a real stereo if you’re an audiophile like me), surge protectors, monitors, CD-RW, DVD and DVD-RW drives, thumb drives and probably a few other things I’ve left out. Each of these items has some impact on the quality of your PC gaming experience. Some, such as DVD and CD drives, are not a big deal anymore since games tend to store almost everything on the hard drive. Others, such as the Internet connection, may be extremely important for gamers who think multiplayer is the be-all and end-all of gaming, while unimportant to other gamers.
However, as consumers, when new games come along that require substantial upgrades to even play, we should ask ourselves how much better the new features are versus how many more computing resources it takes to achieve this new level of gaming. Having watched the best graphics games have to offer since the ’80s, I’m beginning to wonder if graphics have reached a plateau in terms of quality versus performance.
Let’s start by tracing the development of graphics in PC gaming over the past 30 years by using the easiest barometer available: screenshots. And where better to start than with one of the most popular games of all time, a game that started an entire franchise that continues to this day? I’m talking about the original Ultima, known variously as Ultima I: The First Age of Darkness, Ultima I, and even just Ultima. Originally released in 1980 for the venerable Apple II, it was so popular it was ported, recoded and re-released several times, most recently in a collection published in 1998 for DOS. Ultima set the standard for every CRPG to follow, having more impact than even venerable franchises such as Final Fantasy. But what did computer graphics look like in 1980? See for yourself:
That giant rat looks frightening, doesn’t it? Well, maybe not to jaded tastes that have seen Oblivion, but it’s not too bad, considering the computing power of an Apple II. According to Wikipedia, the Apple II+ had a 1 MHz processor, 48 KB RAM, a cassette drive (remember cassettes as storage media?) and an old school 5 1/4 inch disk drive. The cost? $1,200 in 1979-1980, which is roughly $3,400 after adjusting for inflation. Now, before my fellow hardware nuts start pointing out how processor cycles, RAM and storage media capacity are not meaningful when compared to the same stats today, let me say I know. I’m merely simplifying things to keep the argument simple and to point out very broad trends. (Believe me, I know what total, ahem, bull, it was to compare MHz when looking at the performance of RISC and CISC processors back in the day.)
Such difficulties in comparison don’t obviate my claim: compared with computer capabilities today, you could store the Apple II’s OS and its entire game library, and run several copies at the same time… with your video card. And this isn’t to say that the Apple II was a piece of junk; I’m just pointing out how far we’ve come in terms of hardware and in terms of graphics. Alright, so that’s what top of the line gaming looked like in 1980.
By 1986, graphics had improved somewhat, but more important, the games had gotten better. That fateful year in gaming saw the release of The Legend of Zelda and Metroid for consoles in Japan (yet two more franchises that are still making money), but for the PC, the best of the best was Starflight. This open-ended science fiction game included starship battles, roleplaying, exploration and a non-linear plot, all of which are the sorts of gaming elements we still talk about. It also looked pretty good in DOS:
If you played this game in DOS, you might have done it on an IBM PC AT with an Intel 80286 6 MHz processor, 512 KB RAM, perhaps a 10 or 20 MB hard drive and an EGA graphics card. So, for a 6x increase in processor power and a 10x increase in RAM, you got those graphics. The cost? As much as $6,000 in 1984, which is about $11,500 when adjusting for inflation.
The years rolled by until a little company called id was founded by John Carmack, John Romero, Tom Hall and Adrian Carmack. In 1992, they released Wolfenstein 3D and made history. First person shooters were now a mainstream genre and the first choice for those craving cutting edge graphics. And in 1992, this is what cutting edge looked like. Note the pseudo-3D perspective:
By this time, PCs and their ilk were running around 25 MHz, but by the time Doom came out, you could find machines running at 50 or even 60 MHz. As for RAM, you might have two megs. Cost? Well, by this point, prices were dropping dramatically. An Amiga 1200 sold for $599 in 1992, which is $900 when adjusted for inflation. Competition in the market was good for consumers, obviously.
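Since I keep quoting prices “adjusted for inflation,” here’s a minimal sketch of how that adjustment works. The CPI-U annual averages below are approximate historical figures I’ve filled in for illustration; the article rounds its results to the nearest hundred, so the outputs here will differ slightly.

```python
# Approximate CPI-U annual averages (illustrative figures, not official data).
CPI = {1980: 82.4, 1984: 103.9, 1992: 140.3, 2007: 207.3}

def adjust(price, year, target=2007):
    """Convert a historical price into target-year dollars via the CPI ratio."""
    return price * CPI[target] / CPI[year]

systems = [
    ("Apple II+", 1200, 1980),
    ("IBM PC AT", 6000, 1984),
    ("Amiga 1200", 599, 1992),
]

for name, price, year in systems:
    # e.g. the IBM PC AT's $6,000 comes out near $12,000 in 2007 dollars
    print(f"{name}: ${price} in {year} is about ${adjust(price, year):,.0f} today")
```

The takeaway matches the article’s broad trend: even with rough CPI figures, the real price of a top-of-the-line gaming machine has fallen steadily.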
By 1996, Blizzard Entertainment was making money hand over fist, and it was because of Diablo. While the reasons for Diablo‘s success are legion, its graphics represented a gold standard for what we expected from all of our games. They look dated now, but the isometric perspective would serve Blizzard well for years to come, including use in games such as Starcraft and Diablo II:
Diablo required only modest improvements in system requirements: a 60 MHz processor, 8 MB RAM (16 MB for multiplayer), an SVGA video card, a 2X CD-ROM drive and Windows 95.
But within three years, both games and computers began ramping up their resources. 1999 saw the release of Aliens versus Predator for the PC. In the old days of Avault, this game garnered the Seal of Approval. I know, because I didn’t purchase it until the reviewers gave it the thumbs up. The graphics for it were, and are, simply wonderful. Not only do they have decent polygon counts and good color depth, but you really get three different video display modes because each race views the game world differently. Furthermore, Predators and Marines have extra vision modes, further multiplying the graphics wonders of this title. And the screenshots prove it. Take a look:
Thanks to the combination of excellent graphics and some brilliant sound, the game was very scary when you were on the defensive. I never did, and still will not, play the game as the Marine with the lights off. You also felt the distinct rush of slowly, stealthily stalking your prey while playing as the Predator and Alien. Great graphics also helped this game earn its Mature ESRB rating (head-bites, dismembering, blood splattering everywhere). All of these good graphics required hefty system requirements. Three years after Diablo, AvP required a 200 MHz processor, 32 MB RAM, 64 MB of hard drive space and Windows 98.
For the next few years, clones of Doom were everywhere. But in 2004, id proved it still had the Midas touch when it released Doom 3. Doom 3 garnered nearly unanimous praise for a variety of things, including a return to Doom‘s simpler gameplay in the face of ever more complex shooters. The worst thing anyone could say about Doom 3 was that duct tape should have been standard issue for any Marine who might have to fight in the dark. The graphics were great and required some hefty system requirements: a 1.5 GHz CPU, 384 MB RAM, a 64 MB video card and 2.2 GB of hard drive space. These are the graphics that resulted:
Notice the attention to detail. Also notice the nice polygon counts. You can’t say you didn’t get your money’s worth; however, you’ll notice that the progress of graphical improvement has slowed since AvP. The system requirements are nearly seven times as high, but are the graphics seven times better?

By 2006, another landmark title had been released: The Elder Scrolls IV: Oblivion. I could spend a few hours citing various reviewers who loved this game, but who has time to do that? Instead, let’s just summarize it by saying Bethesda Game Studios earned every single penny of profit they have received for this game. And no wonder: between open-ended gameplay, a robust character generation and advancement system, and excellent voice acting from such notable actors as Patrick Stewart and Sean Bean, the game did everything right. The lack of multiplayer did not seem to hurt its sales in the least. Furthermore, the graphics are gorgeous. I just played it again last night, and they’re still engaging. Yet the system requirements have gotten even harsher. Here’s what a 2 GHz processor, 512 MB RAM, a 128 MB video card and 4.6 GB of hard drive space would get you:
Are these graphics better than Doom 3‘s? I think so, and probably some of you would agree with me. But the system requirements are roughly twice those of Doom 3. Are the graphics twice as good? It’s a hard call.
All of this discussion of system requirements and screenshots has been leading up to a comparison with Crysis. This game needs no introduction; if, for some reason, it does, go read my review. Even if you think I need my head examined, Metacritic indicates that Crysis has a combined score of 91/100. In academic circles, we’d call that “critical consensus,” and it’s very favorable. Like it or not, this title is going to become a benchmark for both shooters in general and graphics in particular. There’s even some talk that it has the best graphics of any game to date. We may not know for a few months, if only because even top of the line systems have a hard time going beyond “medium” for graphics. Take a look at the screenshots and make your own decision:
And what kind of system do you need in order to take advantage of these graphics? A 2.8 GHz processor, 1 GB RAM, a 256 MB video card and 12 GB of hard drive space. However, if you’re not running Windows Vista and DirectX 10, you miss out on some DirectX 10-only features of the game engine (I’ll rant about DirectX 10 and non-Vista users later). Those requirements are about twice those of Oblivion. Yes, they’re impressive. But does anyone reading this think they’re actually twice as good?

I don’t think so; however, the point of this article is not to begin bashing EA for releasing a game with high system requirements. In fact, if anything, their decision to release the game in its current form seems to be a sound business decision. Crysis will continue to generate sales for years, and EA has nothing to fear from another game coming out and rendering their top title obsolete.
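The requirement multipliers I’ve been quoting can be tabulated in a quick sketch. The figures below are the minimum CPU and RAM specs listed in this article (Python purely for illustration; hard drive and video requirements would show a similar curve):

```python
# Minimum specs quoted in this article: (title, year, CPU MHz, RAM MB).
reqs = [
    ("Diablo",   1996,   60,    8),
    ("AvP",      1999,  200,   32),
    ("Doom 3",   2004, 1500,  384),
    ("Oblivion", 2006, 2000,  512),
    ("Crysis",   2007, 2800, 1024),
]

# Print the generation-over-generation multiplier for CPU speed and RAM.
for prev, cur in zip(reqs, reqs[1:]):
    cpu_x = cur[2] / prev[2]
    ram_x = cur[3] / prev[3]
    print(f"{prev[0]} -> {cur[0]}: CPU x{cpu_x:.1f}, RAM x{ram_x:.1f}")
```

Notice the shape of the output: the AvP-to-Doom 3 jump is the steepest (7.5x the CPU, 12x the RAM), while the later generations level off to modest 1.3x-2x steps even as the on-screen improvement becomes ever harder to call.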
Furthermore, as my previous statements about the price of computers make clear, computer prices have tended to drop over the years once you adjust for inflation. Even Alienware’s most overpriced models are comparable in price to the Apple II and IBM PC AT when they were first released.
Instead, I’m trying to make a very different point. Specifically, that while Moore’s Law may predict that the number of transistors that can be inexpensively placed on an integrated circuit increases exponentially, the improvement of graphics quality over time does not. In fact, it seems that the rate of improvement in graphics has slowed down ever since games gained 3D graphics, while their system requirements have kept up with Moore’s Law. Part of this is because other aspects of games have gotten more bloated as time has worn on. Voice acting eats up the hard drive space, while detailed inventories and item creation routines in CRPGs utilize more RAM.
And, of course, Windows has gotten more bloated over time as well, meaning that more processor cycles and RAM are used just to turn on the computer and get to your desktop.
But on the other hand, we seem to have reached a point where PC games, despite the demand for titles that look stunning, are not going to improve their appearance by leaps and bounds every generation. The hardware doesn’t seem to be able to support it. Yes, I concede that CGI in Hollywood movies will continue to get better, but coding a CGI fight scene in a movie is not the same as coding a 3D first person shooter wherein the physics, camera views, object shapes, decals and lighting are all variable and subject to change at a moment’s notice and at the whim of a player.
If the hardware isn’t going to support leaps and bounds in graphics, then what have we to hope for? That depends. If Rock’s Law is correct, then the cost of a semiconductor plant will double every four years. While the adage’s veracity has been challenged, the general trend does seem to be that it is getting more expensive to improve the production of integrated circuits and research new techniques. Furthermore, Moore’s Law may become obsolete as researchers finally get a handle on nanotechnology and build integrated circuits that utilize molecular-size components. Keeping these things in mind, on the one hand, we have the possibility for soaring production costs to slow down the hardware race in PCs, which will further plateau improvements in graphics. On the other, we have technological limitations that will slow down improvements in processors. These two factors conspire to arrest the development of graphics in PCs.
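The two laws above are easy to put side by side. Here’s a minimal sketch, using the commonly cited doubling periods (roughly two years for Moore’s Law, four for Rock’s Law) and arbitrary starting values for illustration:

```python
def doubling(start, years, period):
    """Exponential growth: start * 2^(years / period)."""
    return start * 2 ** (years / period)

# Over a 12-year span, transistor counts grow 2^6 = 64x (Moore's Law)...
transistors_factor = doubling(1, 12, 2)   # 64x

# ...while fab costs grow 2^3 = 8x (Rock's Law) -- and that 8x is the
# up-front bill someone must pay before the 64x can reach your video card.
fab_cost_factor = doubling(1, 12, 4)      # 8x
```

The point of the comparison is that both curves are exponential, just with different exponents: the capability curve races ahead only as long as someone keeps funding the ever-more-expensive cost curve underneath it.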
The question seems to be whether or not there will be some sort of breakthrough in either computer design or programming that will grant us the next massive improvement in graphics within the next few years. There will have to be; we cannot rely on the old trend of faster CPUs and more efficient memory to enable programmers and developers to satisfy our insatiable desire for graphically intense titles.
Review these screenshots and tell me if you think I’m right: