Call of Juarez: Bound in Blood Screenshot

As a dyed-in-the-wool PC gamer and hardware enthusiast, I regularly visit [H]ardOCP, a website dedicated to all things PC gaming hardware. They recently did a performance evaluation of Call of Juarez: Bound in Blood using a number of modern graphics cards across various price ranges, and found, as they have with many games in the last year or two, that it performed very well even on lower-end graphics cards. They conclude:

As the prevalence of cross-platform game development increases, it seems that the PC's role as a gaming technology showcase is diminishing. Gone are the days when games were developed for the PC first, the consoles later. It is simply more profitable to focus on consoles, and then throw the PC gamers a bone (or not). There are various and numerous reasons for this, but the reason most cited by game developers and publishers is the persistence and relative ease of game piracy on the PC. It just makes sense that these businesses shift their focus in the face of unswerving opposition.

But whatever the cause, there is no doubt that PC games are getting lighter, and their hardware requirements are becoming less and less stringent. Once upon a time, PC gaming was about tweaking and modifying games to run and look better on the staggering variety of hardware in the wild. But now, almost every game we have seen in the past year has run beautifully out of the box on even the least expensive video cards.

Call of Juarez: Bound in Blood Screenshot

This is by no stretch of the imagination the first time I've heard such things. When I was active on the now-defunct Tweakguides.com forums, I debated the apparent decline in cutting-edge PC technology with PC gamers many times over. It is inarguable that in many respects, it has never been easier on the wallet to be a PC gamer. Many games do indeed perform exceptionally well across a large variety of cards; the high-end configurations seem more suited to those who want to run very high levels of anti-aliasing and/or ultra-high resolutions. My own video card configuration, a pair of nVidia GTX 260s—a reasonably high-end setup—allows me to run even the most demanding games with extremely high image quality on my 22" monitor. While ATI and nVidia are preparing to release their next-generation DirectX 11 cards this fall, I truly see no need for an upgrade, particularly since it will likely be at least a couple of years before DirectX 11 is widely used. 

But I think the reasons for this lessened pressure to buy expensive upgrades are more complex than the proliferation of multiplatform development. And I think that, despite the historical performance-per-dollar ratio we see in the video card market, games are continuing to push technological boundaries. Let's consider some of the factors.

Crysis Warhead Screenshot

1. A competitive GPU market

There is no denying that the video card market has matured greatly since its dawn in the mid-90s; it's hard to believe that ten years ago, the Voodoo3 was considered a high-performance card. The nVidia–ATI rivalry hadn't even taken shape at that point, much less matured into the fiercely competitive market we know today.

Over the last eight years or so, nVidia and ATI have continued to attempt to out-do one another, and each generation brings more performance across a greater variety of price ranges. We've seen monster dual-GPU cards that cost over $600, and lean machines that offer surprising performance for a low price. The last year has been particularly good for ATI, which, despite still holding a relatively small share of the GPU marketplace compared to nVidia, has leveraged its efficient, powerful GPUs to force nVidia to drop prices and offer more cards targeted at more price points.

If Intel's forthcoming Larrabee makes an impact in the GPU marketplace, we may see even more performance-per-dollar.

Crysis Warhead Screenshot

2. Reusable, highly optimized game engines

Every so often, a game comes along that really pushes graphical boundaries. Doom 3, Far Cry and Half-Life 2 did it in 2004; The Elder Scrolls IV: Oblivion did it in 2006, and Crysis did it in 2007. But these types of games are few and far between. It makes more sense for developers to reuse existing engines with minor tweaks and optimizations rather than attempt to build a cutting-edge engine from the ground up for every game. Over time, these engines can offer very impressive performance as they become increasingly optimized. Valve's Source engine, which was introduced in 2004 with Half-Life 2, is still widely used nearly five years later. The famous Unreal Engine 3 has likewise been in use for a few years now, and provides very impressive visuals while keeping hardware demands remarkably reasonable.

Currently, the most advanced engine is Crytek's CryEngine2, introduced in 2007's Crysis. At the time, it was so advanced that even the highest-end dual-GPU configurations could not run it at maximum settings. A year later, the expansion Crysis Warhead introduced numerous performance optimizations which, along with the increasingly competitive GPU market, allowed it to be played at high or even maximum settings without an ultra-expensive PC. And while we will undoubtedly see advanced engines trickle in occasionally, given the immense risk in game development, the use of a well-established high-performance engine will remain preferable to ground-up engine development for many studios.

Alan Wake Screenshot

3. Multiplatform development, piracy or not

There is little doubt that more developers are focusing on multiplatform development. Genres that were once solely the domain of hardcore PC gamers have branched into consoles as well—first-person shooters were the first to go, and even MMORPGs and real-time strategy games, while still well-established on the PC, have begun to trickle into the console space as well.

The real extent of piracy is unknown, as is its exact impact on sales. But piracy aside, the ease of development across multiple platforms (the Xbox and its successor are DirectX-based platforms—hence the name, which was derived from "DirectX Box" during the original's development) and the wide audience granted by multiple platforms make a singular focus on any one platform seem somewhat short-sighted. And while Microsoft and Sony are free to throw incentives at developers to keep games platform-exclusive, the PC remains an open platform for developers.

Call of Duty: World at War Screenshot

We should also keep in mind that multiplatform development is by no means some sort of new trend. It took hold with the first Xbox, and nothing has been the same since. As far back as The Elder Scrolls 3: Morrowind, big-name PC developers have been developing PC and console versions of their games simultaneously, and this has not stopped the PC versions of such games from being cutting-edge. Flexible game engines allow for scaling across a wide variety of hardware, and as we saw with Oblivion, engines can be optimized to take advantage of both consoles and a wide array of PC configurations. 

I can't help but think that there really isn't much to complain about. I'm occasionally frustrated when a developer delays or nixes a PC version of a game (as was recently the case with Alan Wake), but in most cases the wait doesn't faze me. There are still many excellent PC exclusives and multiplatform games that take fine advantage of modern PC technology. The competitive GPU marketplace, along with the rise of digital distribution platforms like Steam and the gaming-friendly features of Windows 7, are tearing down some of the entry barriers to PC gaming, which may be vital for the long-term viability of the platform. And ultimately, whether a game was developed for this platform or that often has very little bearing on its quality—and for those of us who are willing to shell out for a more customizable, higher-fidelity experience, the performance is still there. A game like Call of Duty: World at War may not be as demanding as Crysis, but when you see it in native high-resolution with anti-aliasing and every whiz-bang visual effect cranked up and play it with a 1600 DPI laser mouse and customized key maps, the console versions just seem flaccid by comparison. The PC gaming landscape is changing, but it's still going strong and, with inexpensive hardware, digital distribution and a vast catalog of games and mods, there has perhaps never been a better time to jump on board.


3 Comments on "Why isn’t PC gaming pushing technological boundaries?"

ZippyDSMlee
7 years 8 days ago
When PCs lead the industry in power and software development things were catered to the PC gamers but over the last 5 if not 10 years with since the days of the Xbox MS and the game industry itself focused more and more on consoles and when they matched the broad spectrum of mid rang rigs there was just no turning back since consoles just sale more titles. I picked the PC for 2 reasons control over the control mice and keyboard variety and control mapping variety(unless its some console hand me down that dose not have the IQ to… Read more »
redrain85
7 years 22 days ago
There are definitely some pros to the current situation, with the focus on multiplatform development. It’s forced the hardware manufacturers to lower their prices in order to compete more effectively with consoles. Both ATI and nVidia have dramatically dropped the prices of their cards. You can now actually buy a really well performing video card for under $200. This was unthinkable a few years ago. But when developers don’t develop at least the occasional PC-exclusive title that pushes the boundaries, then obviously there’s little incentive to upgrade. Why the hardware manufacturers don’t give the game developers a kick in the… Read more »
Jones
7 years 30 days ago

Thank you for taking the time to write this informative article.
