The next generation of graphics cards is on the horizon, with both Nvidia and AMD working on new chips rumored to launch before the end of this year. Couple that with the release of the PS5 and Xbox Series X – which will contain graphics tech from AMD’s new chips – and it’s a big year for gaming hardware.

Gamers love to debate the merits of console hardware vs. its PC counterparts, especially at times like this, when both platforms take big leaps forward. But with factors like high refresh rates, HDR, ray tracing, and variable refresh technologies like FreeSync, it’s getting harder to compare the two, as each provides a unique experience that’s tough to match on the other. Let’s take a look back at the last few console launches to see how PC hardware fared – and why things are different in 2020.

Compelling Consoles in an Age of Pricey PCs: Xbox 360 and PlayStation 3

It feels like a million years ago that Microsoft launched the Xbox 360, but back in 2005, it was an exciting step forward for consoles. (You could play games in crystal-clear 720p!) Comparing it to PCs, though, was difficult – the Xbox 360 was a computer at heart, but it used such different hardware that there weren’t really off-the-shelf equivalents. Most people looked to then-midrange cards like the Nvidia GeForce 7800 GT and 7900 GT for similar performance at around $250-$300. Others argued the 360 GPU’s architecture was more similar to the higher-end ATI Radeon X1900 XT, which was priced closer to $400-$450 – and that’s not even considering the multi-core CPU inside the 360.

Most of that generation’s games rendered at 720p, usually with a framerate of 30 frames per second. Benchmarks for those GPUs suggest they could hit similar performance in F.E.A.R. on PC, but given that they cost as much as the entire Xbox 360 – a $300-$400 console, depending on the package you bought – the 360 was rather tempting. It’s likely that Windows and other software took up enough resources that running games at similar speeds required pretty beefy PC hardware at the time, especially compared to self-contained game consoles like the 360 and PS3.

Here’s the catch, though: these consoles stuck around for seven or eight years before their successors launched, which means PCs had plenty of time to catch up. If you were buying an Xbox 360 in 2006, you were getting a good gaming machine at a compelling price. If you looked at that same console in 2010, the gap between it and a comparably priced PC would have been much smaller. (And, of course, PCs became capable of so much more, if you had the money – just look at our comparison of Crysis 3 on PC, PS3, and Xbox 360.)

A New Era: PlayStation 4 and Xbox One

When the PlayStation 4 and Xbox One finally launched in late 2013, the scene had changed dramatically. These consoles’ internal hardware was much more comparable to PCs, and while it still wasn’t a perfect one-to-one match, it was much easier to draw parallels with newly launched PC hardware. The consensus seems to be that the PS4’s GPU was most comparable to AMD’s Radeon HD 7850 or 7870, midrange cards you could buy for as low as $140 and $170, respectively, at the time of the PS4’s launch. That meant you could build a PC that compared to the new consoles more affordably than at the Xbox 360’s launch.

PC benchmarks from the time seem to back that up, too. In Battlefield 4 (a launch game for the PS4 and Xbox One), Guru3D pegged the Radeon HD 7850 at around 40 frames per second at 1080p with Ultra settings. Considering the PS4 rendered Battlefield 4 at 900p to hit 60 frames per second, likely with lower-than-Ultra settings, I’d call that comparable with a few settings tweaks. The same goes for The Witcher 3: Wild Hunt two years later, where a similar-tier card could get you graphical fidelity and performance that came close to matching the PS4. Of course, if you had a better card, you had a lot more room for improvement, as The Witcher 3 really pushed boundaries on PC – but even if you didn’t, the value proposition of a PC was better than ever.

Closing the Gap: PlayStation 4 Pro and Xbox One X

If we hadn’t gotten the PS4 Pro and Xbox One X, PCs would have been an even better relative value in the second half of this console generation’s lifespan, much like in the Xbox 360 and PS3’s later days. But instead, we got a mid-cycle refresh that shook things up a bit thanks to a big push for 4K in the console space.

While the Jaguar CPUs in these consoles were pretty long in the tooth by 2016, the GPU got a decent upgrade, drawing comparisons to AMD’s RX 470 and 480. Those cards were supposed to cost $180-$200 in 2016, but came closer to $200-$270 at the time thanks to limited supply (thanks, cryptocurrency miners). Playing 4K games at 30 frames per second was possible with an RX 470, especially if you were willing to turn settings down – after all, consoles are rarely running at what PC gamers would consider Ultra quality – but consoles had the advantage of some unique optimizations that kept things playable.

In its analysis of Red Dead Redemption 2, Digital Foundry found that the Xbox One X had graphical quality equivalent to the PC port with many settings at medium or low. In fact, certain tweaks on the Xbox version dialed shadows, reflections, and volumetric lighting down even lower than was possible on PC. Couple that with checkerboard rendering and other tricks to display games like RDR2 at “4K,” and the two become harder to compare – many PC games don’t offer these “Faux-K” features, which makes apples-to-apples comparisons difficult. Don’t even get me started on the mess that is HDR in Windows, either.
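
The checkerboard idea is simple in principle: each frame shades only about half of the screen’s pixels in an alternating checkerboard pattern, and fills in the rest from the previous frame, roughly halving per-frame shading cost. Here’s a purely illustrative Python sketch – real engines use motion vectors and far smarter reconstruction, and every name below is hypothetical:

```python
# Toy checkerboard rendering: shade half the pixels each frame,
# carry the other half over from the previous frame's result.
WIDTH, HEIGHT = 8, 4  # tiny "framebuffer" for illustration

def shade(x, y, frame):
    """Stand-in for the expensive per-pixel shading work."""
    return (x + y * WIDTH) * 1000 + frame

def checkerboard_frame(frame, previous):
    out = [[None] * WIDTH for _ in range(HEIGHT)]
    for y in range(HEIGHT):
        for x in range(WIDTH):
            if (x + y + frame) % 2 == 0:   # shade half the pixels this frame
                out[y][x] = shade(x, y, frame)
            elif previous is not None:     # reuse last frame's result
                out[y][x] = previous[y][x]
            else:                          # very first frame: shade everything
                out[y][x] = shade(x, y, frame)
    return out

f0 = checkerboard_frame(0, None)
f1 = checkerboard_frame(1, f0)
# In frame 1, half the pixels are freshly shaded and half are
# carried over from frame 0 -- that's the "Faux-K" trade-off.
```

The catch, of course, is that stale pixels smear on fast motion, which is exactly why checkerboarded “4K” doesn’t compare cleanly to native 4K on PC.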

So from a strict cost perspective, these half-step consoles kept the value gap between consoles and PCs from getting too wide, particularly if you were looking to game at 4K in HDR. But that wasn’t necessarily what PC gamers were gunning for, either. Which makes things complicated.

PCs and Consoles Are Aiming for Different Things Than They Used To

Comparing console hardware to PCs has never been truly fair. Sure, raw compute power is nice, but you also have to consider the advantages of mouse and keyboard, the power-on-and-go nature of consoles, and – if multiplayer is your thing – what device your friends use.

But even if we just look at the internal hardware, these platforms feel more divided than ever, as both markets aim for vastly different experiences. Current consoles play games at 4K with 30 to 60 frames per second, because that’s what modern TVs are built for. But PC monitors haven’t pushed 4K as hard as TVs have, instead opting to raise refresh rates for smoother motion. Many consider 1080p or 1440p at 144Hz to be more enjoyable than 4K at 60Hz – myself included – because smoother motion makes for a more noticeable improvement than extra pixels, especially with anti-aliasing in the mix. And even today’s high-end GPUs don’t offer enough power to get the best of both worlds, forcing you to choose between resolution and refresh rate. Oh, and I haven’t even mentioned new features like ray tracing, variable refresh rate, and HDR.

The two platforms may re-converge, at least to a certain degree, with this next generation of gaming hardware. 4K monitors are coming down in price, and while 4K/144Hz still has some serious quirks, HDMI 2.1 and DisplayPort 2.0 should make this high bar more achievable. In fact, we already know the PlayStation 5 and Xbox Series X will be capable of outputting a 4K image at 120Hz with variable refresh rate (VRR) – though we don’t yet know what resolution and refresh rate games will actually aim for in practice. That makes it tough to say how next year’s budget and midrange GPUs will stack up. Will Nvidia’s new cards push 4K-capable models down to more reasonable prices? Or will consoles step up their game and push higher refresh rates to close the gap? Oh, and don’t forget about the super fast SSDs console makers have been touting, either – that’s a whole separate conversation.
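
A quick back-of-the-envelope calculation shows why those newer standards matter for 4K at 144Hz. This sketch ignores blanking intervals and link-encoding overhead, so real-world requirements run somewhat higher:

```python
# Rough pixel-data bandwidth for 4K/144Hz with 10-bit-per-channel RGB (HDR).
width, height, refresh_hz = 3840, 2160, 144
bits_per_pixel = 30  # 10 bits each for R, G, B

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"{raw_gbps:.1f} Gbit/s")  # prints "35.8 Gbit/s"
```

Even that optimistic ~36 Gbit/s figure blows past HDMI 2.0’s 18 Gbit/s link rate, but fits comfortably within HDMI 2.1’s 48 Gbit/s and DisplayPort 2.0’s 80 Gbit/s – which is why those standards are what finally make 4K at high refresh rates practical.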

So if you’re itching to get on board with the latest and greatest gaming hardware coming out this year, your decision may make a bigger difference than ever before. I don’t usually tell people to wait for the next big thing – if you’re always holding out for what’s coming, you’ll never upgrade at all – but with such big leaps in both PC and console hardware, it may behoove you to wait until we’ve seen what both platforms can do. You don’t want to be stuck on the wrong side of the fence before you know what each is aiming for. If consoles stick to 4K/60 while PCs continue to stress lower resolutions at higher refresh rates, raw power may not be a deciding factor after all.

Source: IGN.com Consoles vs PCs in 2020: Why Raw Power Isn't Everything