Kevin (erf_) wrote,

spawn more overlords!

Game industry, I love you, but you're bringing me down.

Do you remember when the Xbox 360 launched at $400? And the PS3 at $500? It never crossed my mind to buy either of those consoles at launch. Not because the launch titles were disappointing (promises of then-distant Lost Planet, Halo 3, and Gears of War for the Xbox 360 and Metal Gear Solid 4, LittleBigPlanet, and Mirror's Edge for the PS3 would have been more than enough to justify a purchase), but because those price tags were so far out of my range that, lifelong passion aside, it was inconceivable that I could ever drop that kind of money on something that wouldn't get me through school or keep me fed and sheltered.

The price of a launch PS3, two controllers, a memory card, and one game would have come out to about the same as one month's rent in the house I shared with three other gamers on South Pleasant St. (Granted, my desktop computer cost even more, but that was a machine I needed for coding, writing papers, and yes, making games--and it was barely powerful enough to play even games that had come out years earlier.) We talked about pooling our funds and buying a PS3 at some point; instead I ended up using the money my parents had sent me over the holidays to buy a slim PS2, as even that console had been unaffordable to me until the launch of its successor sent prices tumbling down.

As for the Wii...we could have afforded one. But that console very much falls into a niche of its own, one that, at the time, I had no desire to inhabit.

In 1977, the Atari 2600 launched at $200. In 1985, the NES launched at about $180. In 1989 and 1991, the Sega Genesis and the Super Nintendo launched at about $190 and $200. In 1995, the original PlayStation launched at $300; in 2000, the PlayStation 2 launched at $300.

In 2005, the Xbox 360 launched at $400. In 2006, the PS3 launched at $500.

"Always $200" is neither a fair comparison nor a sustainable pricing scheme, I know--inflation, the 1980s video game crash, the dot-com boom and a variety of other factors have taken the value of the dollar on a plunge since then, and the $200 that could have been used to buy an Atari 2600 in 1977 would have been just enough to buy a PS3 in 2006. Perhaps the higher pricing was a step in the right direction. Competition between Sony and Microsoft meant there was intense pressure for each company to engineer the higher-end, more powerful system; at GDC 2007, for example, a dev amusedly remarked to me that the sound hardware on the PS3 had more DSPs than anyone knew what to do with. (Do gamers really care that they can hear 64 simultaneous sound channels at once?) The Wii launched at $250 in 2006 dollars and is vastly inferior, hardware-wise, to its competitors--its dated graphics and mediocre processing power are a pretty good picture of where games would be right now if the price point hadn't been kicked up a notch. But the implication is that a gaming console is now no longer a mere appliance. The decision to buy a seventh-gen system is no longer that of buying an NES or a VCR, of whether or not to eat ramen and skip movie tickets for a month to save up, of whether to hold off on buying that new hard drive for your computer. It's approaching the territory of a new laptop, or half a used automobile, or a custom electric guitar, or making your share of rent in a New York apartment. For someone who has to make every dollar count, spending that amount of money on an entertainment device is simply not justifiable.

To put things in perspective: in late 2008, I was working on Wall Street, making good money in a cushy upper-middle-class software engineering job. After spending virtually nothing on anything I didn't need since graduation, I had finally paid off most of my student loans--and I still couldn't afford a PS3. Sure, it was an entry-level job. But it was an entry-level job on Wall Street.

I am certainly not the first--maybe I am the last--to notice. Penny Arcade and Kotaku were talking about the price jump in distanced, abstract terms before the seventh generation even launched--worrying about the Poor Gamers who wouldn't be able to continue their hobby anymore. It's taken me the better part of half a decade to realize that the Poor Gamers they were talking about aren't the teenage kids of single moms, or my buddies who are working minimum wage (were working minimum wage, before the recession eliminated those jobs). The Poor Gamer category now includes people like me--college students, recent grads, and pretty much anyone in whose living room you'd be unlikely to find an HDTV and a fancy 5.1 surround audio system. (Both of which, not surprisingly, the PS3 and Xbox 360 support.)

This isn't an "oh, look at me, I'm so poor" sob story. This is a "hey, I'm a 15-to-35-year-old middle-class single worker, I'm your target demographic, and I cannot afford your product" story.

To its credit, the game industry has listened. Consoles no longer sell for 2006 prices, because in a recession no one can afford consoles at 2006 prices--especially when the PS3 and Xbox 360 break down as often as they do. (Just look at all the class-action lawsuits over Red Rings of Death and scratched discs that Microsoft has had to deal with since 2005. And just a couple of days ago, early-model PS3s all over the world stopped functioning due to a leap-year bug...) Both Sony and Microsoft have announced significant price reductions over the last year. Last year Sony, in a nod to the strategy that finally let me afford a PS2, released a downmarket "slim" version of the PS3 that costs significantly less, at the expense of some features (like PS2 backward compatibility). Microsoft has also demonstrated exceptional business savvy in making its product more accessible, refurbishing thousands of Red-Ringed early-model Xbox 360s replaced under warranty and reselling them through online retailers at a drastically reduced price.

To find the true market price of these systems, without the retail overhead applied by the likes of GameStop, I turned to eBay--the most reliable indicator I've found so far of how much stuff is actually worth. The market price for a used slim 120GB PS3? $250-350. The market price for a refurbished 20GB Xbox 360 (one that has already exploded once)? $175-200. The prices of these consoles are finally approaching sanity. I can almost afford them now.

But the damage has been done.

I am going to PAX East at the end of this month. As a lifelong gamer and an aspiring game-industry engineer, I am familiar with the games most other attendees will be talking about. I've studied the rapid prototyping process used for Army of Two, the groundbreaking dynamic narrative mechanics of Heavy Rain, the clever system innovations of BioShock 2 and Modern Warfare 2 and their predecessors. I've read the abandoned design docs for Fallout 3 (back when Black Isle was on the project instead of Bethesda) and followed the revision process for Team Fortress 2; I've studied balance issues, unit design, and bits of the development pipeline of the upcoming StarCraft 2. I've read up on improvements to the A* pathfinding algorithm used in Supreme Commander and the Voronoi fragmentation models considered for Final Fantasy XIII. I've followed the marketing and business strategy of Nintendo (post-Game Over), I've had drinks with dozens of developers, from one-man studios to the avatars of behemoths, I've volunteered at GDC and two SIGGRAPHs, and I trade industry gossip with extremely minor players in the field. I am by no means an industry expert (a games journalist or industry analyst would not be impressed), but I am enough of a wallflower that I can impress the living shit out of your average 17-year-old Halo 3: ODST junkie. And I have not played one, not one, of the games I have been talking about. Not a single one.

I am reminded of chemistry and physics students at underfunded small colleges who, not having the resources to perform actual experiments, must simply read other people's results and take everything on faith. It's like that. These games have become theoretical concepts to me, exercises in engineering and design--not something a sixteen-year-old sits down to after a day of trigonometry and vague sexual awkwardness. B.A. in computer science and two years of professional programming experience aside, this makes me less qualified for the industry than your typical 18-year-old, Electronic Gaming Monthly-reading DigiPen aspirant.

Software engineering applications at major studios, ostensibly designed to separate the true believers from engineers in other fields who have never played a game in their lives, often ask: "Name a triple-A title you have played recently, and tell us why you enjoyed it." If it weren't for the $199 Nintendo DS trade-in promotion last year, I would be unable to answer this question at all. (I've played hundreds of budget titles--in my lifetime I've played more games than most people know exist--but oh, that "triple-A" qualifier always has to be in there, doesn't it.) And somehow I doubt that my answers, as is, are ever satisfactory. I am trying to convince employers that I can craft experiences I have never had the pleasure of enjoying firsthand. This is, as they are well aware, bullshit. Would you hire a writer who doesn't read, or a chef who doesn't eat?

How can I work in games, if I can't even afford to play them?
If I work in games, will I be able to afford the games I make?

I know the industry's hands are tied on all this. Gamer paychecks have dwindled but gamer expectations have not; at GDC '07 there was already considerable anxiety over the rising cost of development and how it simply could not continue at that rate. A multitude of reorganizations, layoffs (hi, EA!), and other market-contraction measures have helped reduce the damage, as has the influx of free time that an increasingly unemployed (yet somehow either richer or more irresponsible with their money than I am) gamer population can spend on video games. Sales have even been good in the waning days of the recession, after a rocky start. But the reality of the situation is that it hasn't been enough. On one occasion I was even told (by a small developer), "We'd much more seriously consider hiring you if we knew we wouldn't have to lay you off in a couple of months." I don't blame them. I wouldn't risk hiring me either, if I were them.

So they can't afford to hire entry-level programmers (my two years of LAMP / Java development experience, while valuable for web development, is worth jack for the kind of stuff they do), and they can't afford to make consoles cheaper. Ultimately, this means I get screwed over both ways.

This is the sole reason I am moving from the mainstream games job market into the casual games job market. It's why my resumes these days are going to advergame, iPhone, and Facebook app developers instead of the likes of Atari, EA, and BioWare. It's why I'm making Nintendo DS homebrew games instead of trying to develop for Xbox Live Arcade. It's why I'm putting aside six years of C/C++ and the two years of linear algebra, 3D graphics algorithms, and OpenGL I did in school to learn Flash.

My love for mainstream games has not diminished. But I have, in a very literal sense, been classed out of the biz.
Tags: essays, games, the root of all evil, work
