The Division struggles on the world’s most popular GPUs


The Division and its struggles on PC

Nvidia really hit it out of the park with the GTX 970 at launch, and the card became a sales phenomenon despite the controversy over its segmented memory. It’s so popular that it’s currently Steam’s most used GPU, claiming a massive 5% of the entire user base. That’s nothing to scoff at, which makes it all the more interesting to see The Division struggle to maintain performance on what is essentially its biggest target.

Analysing details from the recently closed beta, Digital Foundry pitted the GTX 970 and AMD’s R9 390 against each other to see which came out on top. The R9 390 is already the stronger card on paper (being AMD’s answer to the 970, arriving almost a year later), so it’s no surprise that it comes out ahead. But a whole 20 frames ahead in some instances? That’s a bit of a shocker. Here’s how the game fares on both the Ultra and High presets.

As the figures show, both cards fail to achieve a locked 60FPS on either preset (which is a little disheartening), with an overclock on the GTX 970 required to keep things stable even at High. What is interesting is the disparity between the 970 and the R9 390 at stock, which shows a huge gap in performance, especially in more closed-off areas. The R9 390 is certainly the faster card, but in theory it shouldn’t be that much faster. It seems there’s still some heavy optimisation needed on Nvidia’s side.

As Digital Foundry puts it: “Ramping up the game to ultra settings, and moving on to an i7 4790K paired with GTX 970 and R9 390, two observations are clear: firstly, we’re looking at minimum frame-rates on both cards in 35fps territory, but the AMD hardware shows a clear advantage over Nvidia in engine-driven cut-scenes and lower detail locations. This can rise to anything up to 15-20fps. It’s a remarkable advantage, and one that is sustained for much of the run of play in the beta, since most of the action occurs indoors. The advantage is even more noteworthy in that Nvidia released a game-ready driver for the beta, whereas AMD didn’t – we used the most recent 16.1 Radeon Crimson hotfix driver for our testing.”

And truth be told, the beta only gives an early indication of overall performance. The last few weeks before release are usually spent deep in optimisation and bug squashing, so the performance exhibited here on both AMD and Nvidia cards could improve considerably by the time launch day rolls around. That’ll also include new drivers from both teams, which should further improve the situation (well, more often than not).


So yes, some of the most popular cards on the market might be starting to show their age when it comes to maxing out games at 1080p, but I think we’re still a while away from calling them obsolete.

Last Updated: February 5, 2016

Alessandro Barbosa

You can all call me Sandy until I figure out how to edit this thing, which is probably never. Sandy not good enough? Call me xXx_J0k3R_360degreeN0Sc0pe_xXx. Also, Geoff's a bastard.
