Nvidia’s GTX 970 is the current price-to-performance darling, offering incredible visuals at an incredible price. It seems, however, that it’s harbouring a dark secret. It’s a 4GB card, but it looks like a significant chunk of that VRAM doesn’t work properly.
A number of rather angry people on Reddit, Overclock.net and the Guru3D forums have noticed their shiny, powerful GTX 970s come to a screeching halt when maxing out their cards. Clever people, using VRAM benchmarking software, have discovered that when the last 500-700MB of VRAM gets accessed, memory performance drops significantly. Some users have even found that their cards go belly up when hitting 3GB.
The issue doesn’t seem to affect the more expensive, more powerful 980 – so it’s not a fault with the Maxwell architecture as a whole, but rather one specific to the 970. Memory bandwidth drops from a peak of over 150GB/s right down to 19.88GB/s.
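The tools used to spot this are CUDA-based and run on the GPU itself, but the idea behind them is simple: grab memory one chunk at a time, hammer each chunk with writes, and record the throughput per chunk – a healthy card shows flat numbers, while the 970 falls off a cliff near the end. Here’s a rough host-RAM sketch of that per-chunk measurement loop in Python (the chunk size and write pattern are illustrative assumptions, not what the actual benchmark uses):

```python
import time

CHUNK_MB = 128   # per-chunk granularity (assumption for this sketch)
NUM_CHUNKS = 8   # how much memory to scan in total


def measure_chunk_bandwidth(num_chunks=NUM_CHUNKS, chunk_mb=CHUNK_MB):
    """Allocate memory chunk by chunk and time a full write pass over each.

    Real VRAM benchmarks do this with CUDA kernels against device memory;
    this host-RAM version only illustrates the measurement idea.
    """
    chunks, results = [], []
    pattern = b"\xAB" * (1024 * 1024)  # 1MB write pattern
    for i in range(num_chunks):
        buf = bytearray(chunk_mb * 1024 * 1024)  # claim the next chunk
        chunks.append(buf)  # keep every chunk allocated, like the GPU tools do
        start = time.perf_counter()
        for off in range(0, len(buf), len(pattern)):
            buf[off:off + len(pattern)] = pattern  # sequential write pass
        elapsed = time.perf_counter() - start
        gbps = (chunk_mb / 1024) / elapsed
        results.append((i * chunk_mb, gbps))  # (offset in MB, GB/s)
    return results


if __name__ == "__main__":
    for offset_mb, gbps in measure_chunk_bandwidth():
        print(f"chunk at {offset_mb:4d}MB: {gbps:6.2f} GB/s")
```

On a healthy card every chunk reports roughly the same figure; on an affected 970, the chunks sitting in the final slice of VRAM report a fraction of the rest.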
“Going beyond 3.5GB vram usage in games like Hitman Absolution, Call of Duty Advanced Warfare severely degrades the performance, as if the last 512mb is actually being swapped from the RAM,” says Reddit user nanogenesis in a thread on the subject.
The problem seems to be widespread – and Nvidia has admitted that it affects every single 970 (to varying degrees). They’re looking into the issue – but unless it’s something that can be resolved via a driver or firmware update, a recall may be on the cards.
“We are still looking into this and will have an update as soon as possible,” said Nvidia community liaison ManuelG on the GeForce forums.
We asked our resident cabbage, Alessandro – who owns a GTX 970 – to test his card, and he was hit with the same results. You can download the benchmark to try it yourself, here. If you require the extra library to run the benchmark, you can get it here.
While the card is still a performance champ, this issue makes it hard to recommend right now, as it’ll never reach its full potential. If you have money burning a hole in your pocket and you’re aching to buy a new card, you may prefer to opt for the 980, which isn’t affected by this. Or you could, loath as I am to say it, wait and see what AMD has up its sleeve.