Sometime around mid-December, parts of my family abandoned their game consoles and went very, very PC.
It’s not that we don’t love our game consoles: I’m finally getting around to building out island homes in Animal Crossing: New Horizons for Nintendo Switch with my daughters. My brother Pablo scored a PlayStation 5 against all odds before the holidays were in full swing. Mario Kart Live was on some of our wishlists.
But several nights a week since then, my brother, his fiancée, and some of our friends and family have been getting online and playing PC games together, such as the old zombie-shooting classic Left 4 Dead 2 or the more recent co-op horror game GTFO. After the holidays, Red Dead Online went on sale for $5 and we all jumped in. Now, we’ve spent dozens of hours completing bounty-hunting missions, purchasing ugly ponchos, and getting mauled by bears in our pursuit of big-game meats to cook. As we’ve all been spending more time on PC games, I began to wonder whether newcomers or upgraders really need a super-expensive, high-end machine to keep up.
My friend Jeana, who wanted to join in on some zombie-gaming after dipping her toe into multiplayer games such as Jackbox and Among Us, invested in a well-equipped gaming rig and a huge, curved monitor. Just before that, my dad finally got a decent gaming setup, handed down from my brother with a few new parts. My brother handed down the machine to indulge my dad’s decades-long love of Microsoft’s Flight Simulator franchise. Dad has since added a VR headset, a Christmas gift from my brother, to play Flight Sim and other sims such as Star Wars: Squadrons in even more immersive virtual reality.
We get glitches sometimes, which comes with the territory amid the ever-changing drivers for high-end graphics cards and Windows 10, which has its own set of quirks. But for the most part, we are all running pretty powerful Intel-based systems with Nvidia graphics cards that start at the GeForce GTX 1070 level and top out at around the RTX 3070 after some post-holiday upgrades by my brother and his fiancée, Linh.
I’m certain that those new graphics cards can absolutely crush my 1070 on any speed or performance metric, but the remarkable thing is that except for some eye candy and better frame rates (frames per second are one way to measure graphics performance in games), I’m having a hard time seeing a huge difference in all of our gameplay experiences.
The hand-me-down PC my dad runs, which we cobbled together for under $200 in new parts, including some RAM and a sizable solid-state drive, runs the very bloated, very demanding Flight Simulator and VR apps just as effectively as the $1,700 system I purchased three and a half years ago. Jeana’s brand-new system, which was in the $1,300 range, monitor not included, may not show every sliver of ray-traced sunlight in Red Dead Online, but it hasn’t hampered her enthusiasm for the game, and to my eyes, it looks fantastic at her monitor’s native 1440p resolution on an RTX 2060 graphics card.
PC graphics made an evolutionary leap when Nvidia introduced its line of 1000x graphics cards, which helped kick-start a resurgence in PC gaming. The new graphics cards from Nvidia, and new generations of cards from its main competitor, AMD, were more efficient and added more graphics capabilities for game developers to deploy. I happened to be in Austin at the event that launched Nvidia’s 1000x generation of graphics cards and as jaded as I had become to corporate-speak and big-budget event production, it was hard not to feel the electricity in the air signaling that this wasn’t just another product launch. When I played Overwatch for the first time on one of the PCs equipped with these new Nvidia cards, I knew something had changed and that we were in for all new kinds of gaming experiences that game consoles, at least at the time, couldn’t match.
But as we have evolved from the 1000x line to Nvidia’s inevitable supercharged TI versions of its cards to the 2000x generation and the current 3000x generation, it feels like the company is making an annual show of eking out performance gains and increasing efficiencies on what is essentially the same kind of hardware.
The price tags for these upgrades suggest otherwise. For a top-of-the-line $1,600 graphics card, you’d expect an exponentially better gameplay experience, the kind of jump that would be worth the investment, not “extreme pricing for very modest gains.” And your satisfaction really depends on how you define “gains” going from, say, a GTX 1080 to a new RTX 3090.
We’re in a plateau, which is not a bad place to be for gamers with aging hardware or who are on a budget for upgrades. It means whatever graphics card or gaming PC you choose to buy new right now, you’re probably going to have a good experience with the latest games. (Minus the bugs you might encounter in games such as the troubled Cyberpunk 2077.)
Are you really going to notice the difference between Very High and Ultra graphics in most games? Will a card that runs three degrees cooler in your PC chassis increase your gaming enjoyment? Do most gamers care about overclocking their graphics cards to eke out a 10% increase in frames per second? I’m willing to bet most gamers don’t even have monitors that can keep up with the kinds of frame rates these newer cards put out, and the fast pace of graphics card development is bottlenecked by other PC hardware. Gamers are in a good place, unless you’re outraged by the costs of some of these hardware upgrades (thanks, bitcoin).
That’s great news for gamers playing on PC systems at a variety of price points. We just want to hop online, play together with our cowboy posse, and know that we’re all having comparable gameplay experiences. The best kind of gaming is when we can forget about the hardware and just have a good, rootin’-tootin’, horse-ridin’ time together.