I didn’t grow up rich, but I didn’t exactly grow up poor. I grew up in the Northwest suburbs of Chicago, around absurdly rich kids. We did alright, but our area’s high cost of living presented plenty of challenges. My friends growing up didn’t appreciate the value of money like I did. They’d never really had to worry about it.
One Christmas, my friends, feeling bad that I was the one person in the group whose house didn’t feature an Xbox and a copy of Halo for all of us to play, pitched in and bought me a used Xbox. They also threw in a copy of Halo 2, which had just come out. It was a gift that truly moved me, and not just because my friends had gotten me something expensive for Christmas. It mattered because, in doing so, they acknowledged that I didn’t have all the nice things they took for granted, and they wanted me to have them.
Unfortunately, this gap between richer and poorer gamers hasn’t gone away. If anything, it’s widened. At one end of the spectrum, you have the least wealthy kids, huddled around their big brother’s Nintendo 64, playing games they’ve inherited because it’s their only option. At the other extreme, you have the proverbial rich kids with decked-out gaming rigs, complete with VR attachments and all the latest titles loaded up and ready to go on Steam; in other words, the “PC Master Race.”
The phrase “PC Master Race” was coined by notable video game critic and author Yahtzee Croshaw as a throwaway joke in his 2008 video review of The Witcher (the first one). Ironically, given how earnestly the label is worn today, he meant it as a jab at the UI of many PC-exclusive games, accusing them of being deliberately convoluted so as to weed out all but those most committed to the pretension of PC-gaming superiority. Some PC games of bygone eras were known for their difficult gameplay; it used to be a point of pride to have beaten certain titles purely because of their extreme difficulty curves and incomprehensible mechanics. This was also when PC gaming was more of a niche, and console gaming was the more popular, “less nerdy” option.
VR equipment and computers capable of the high performance necessary to play current-gen mainstream titles like “Assassin’s Creed” cost hundreds, if not thousands, of dollars. Consoles are an affordable alternative that lets players enjoy most mainstream titles, but there is some stigma against them in certain circles; in addition to showcasing the best graphics and smoothest gameplay, top-end gaming rigs enjoy a wider library of titles to choose from, thanks to platforms like Steam and Blizzard’s Battle.net. They also afford the player greater hardware flexibility – you can plug an Xbox controller into a PC, for example, or use a Steam controller, or all kinds of alternatives. Xbox, Wii, and PlayStation consoles require a controller with a single layout.
With the economy such as it is, those who can play all the most recent titles on the best hardware tend to be either younger players supported by financially secure parents, or adult players with high incomes…but there’s also a huge population of players in the middle, struggling to pay bills (including substantial student loans), who can’t afford the latest and greatest hardware and software.
There’s a lot of tension not just between PC and console gamers, but between gamers who can afford the latest and greatest hardware and those who aren’t so fortunate. These gaps are problematic mostly because they reflect a lack of access to one’s choice of gaming content and hardware, rather than the choices or tastes themselves. You can choose which consoles and games you want to play – you can’t necessarily choose whether you’re wealthy enough to buy all the ones you want. Or any, in some cases.
Many who play video games for a living, such as professional Let’s Players, sometimes lose sight of how expensive – even financially unrealistic – it is for some people to afford a console to play the latest games at all. Because such personalities have significant influence on the gaming subculture, they create cultural pressure on gamers to stay up to date. Which leaves a lot of wallets empty and a lot of less fortunate gamers feeling ostracized.
Hyperbole aside, I know people still getting hours of happiness from their old "Ocarina of Time" cartridges from back in the late ’90s. I’m one of them. Hell, I recently downloaded "Lords of Magic", a turn-based strategy game by Sierra from back in the day, because I found out it was on Steam. I’ve played more of it so far than I have of "Undertale".
This may make me sound like an old man, but the industry is much larger these days. With so many fun, functional 20-to-60-hour games coming out every month, binging full-length titles devalues what might otherwise be a 60-hour (or longer) experience. More importantly, it creates the expectation that “real” gamers will have played at least a handful of the latest releases every financial quarter.
Perhaps indie games, including those with lower graphical requirements, have seen such a resurgence because of this problem – not only are they cheaper (many indie games on Steam run around $15-30), but they’re easier on the ol’ hard drive. "Chivalry: Medieval Warfare", for example, is a multiplayer game centered around medieval melee combat, where players can block each other’s swings – even dodge, if you’re super Shredder status. And because it runs on the Unreal engine, it’s optimized for a wide range of operating systems and hardware loadouts. Even meager ones, like mine.