
Nvidia’s Maxwell cards may not be fully DirectX 12 compliant


It all started with Ashes of the Singularity, the first game that can functionally be used as a DirectX 12 benchmark. It demonstrated that with DirectX 12 enabled, AMD’s GPUs stand to gain far more than Nvidia’s counterparts. In fact, enabling the newer API often lowers performance on Nvidia’s cards.

From there it’s turned into a he-said, she-said mud-slinging match between the two corporations, but Oxide Games, the developer of the engine powering Ashes of the Singularity, has revealed a little more – and it doesn’t look good for Nvidia. Apparently Nvidia’s newest cards, including its top-of-the-line flagships, aren’t capable of asynchronous compute, one of the more important features enabled by the new low-level API.
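
For readers unfamiliar with the term, asynchronous compute refers to the way D3D12 exposes independent command queues: a game can submit compute work on its own queue so that it overlaps with rendering, rather than serialising behind it as under D3D11. As a rough illustration only – the `device` parameter below is simply assumed to be an already-created Direct3D 12 device, and this is nobody’s shipping engine code – creating such a queue looks something like this:

```cpp
// Rough sketch: creating a dedicated compute queue in Direct3D 12.
// `device` is assumed to be an already-initialised ID3D12Device.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12CommandQueue> CreateComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    // COMPUTE-type queues accept compute (and copy) work only; the GPU
    // scheduler is free to run it alongside rendering submitted to the
    // main DIRECT (graphics) queue.
    desc.Type     = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    desc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;

    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}
```

Whether the hardware then genuinely runs the two queues concurrently – rather than quietly serialising them in the driver – is exactly what this dispute is about.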

In a response posted on the overclock.net forums, Oxide Games said that Nvidia’s cards don’t natively support the feature at all.

“Maxwell doesn’t support Async Compute, at least not natively,” says the Oxide representative, posting as Kollock. “We disabled it at the request of Nvidia, as it was much slower to try to use it than to not.”

He said that Nvidia’s driver reported that the feature was enabled, but using it proved otherwise.

“Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware. As far as I know, Maxwell doesn’t really have Async Compute so I don’t know why their driver was trying to expose that.”

As noted above, AMD’s cards benefit from the use of asynchronous compute in the Ashes benchmark – and it’s a technique that will likely be used more in the future.

“I suspect that one thing that is helping AMD on GPU performance is D3D12 exposes Async Compute, which D3D11 did not. Ashes uses a modest amount of it, which gave us a noticeable perf improvement. It was mostly opportunistic where we just took a few compute tasks we were already doing and made them asynchronous, Ashes really isn’t a poster-child for advanced GCN features.

“Our use of Async Compute, however, pales with comparisons to some of the things which the console guys are starting to do. Most of those haven’t made their way to the PC yet, but I’ve heard of developers getting 30% GPU performance by using Async Compute.”
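
Kollock’s “opportunistic” approach is straightforward to picture: take a compute pass the engine was already running inline, submit it on the compute queue instead, and have the graphics queue wait on a fence only at the point it actually consumes the results. A minimal sketch of the idea (the queue, fence and command-list names here are illustrative, not Oxide’s actual code):

```cpp
// Sketch: opportunistically moving an existing compute pass onto a
// separate queue, synchronised with a fence. All names are illustrative.
#include <d3d12.h>

void SubmitAsyncCompute(ID3D12CommandQueue* computeQueue,
                        ID3D12CommandQueue* graphicsQueue,
                        ID3D12CommandList*  computePass,
                        ID3D12Fence*        fence,
                        UINT64              fenceValue)
{
    // Kick off the compute pass; on hardware with working async compute
    // it executes alongside whatever the graphics queue is rendering.
    computeQueue->ExecuteCommandLists(1, &computePass);
    computeQueue->Signal(fence, fenceValue);

    // GPU-side wait: the graphics queue stalls only at the point it
    // actually needs the compute results, not before.
    graphicsQueue->Wait(fence, fenceValue);
}
```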

AMD’s thrown in a word or two as well. Speaking on Reddit, AMD’s Robert Hallock said that Nvidia has essentially lied about Maxwell’s DirectX 12 capabilities.

“NVIDIA claims “full support” for DX12, but conveniently ignores that Maxwell is utterly incapable of performing asynchronous compute without heavy reliance on slow context switching.

“GCN has supported async shading since its inception, and it did so because we hoped and expected that gaming would lean into these workloads heavily. Mantle, Vulkan and DX12 all do. The consoles do (with gusto). PC games are chock full of compute-driven effects.”

In effect, Nvidia’s newest cards aren’t as DX12 compliant as it would like people to believe. AMD admits that even its own Fury X isn’t “fully” DX12 compliant – because no fully compliant card exists right now.

“I think gamers are learning an important lesson: there’s no such thing as “full support” for DX12 on the market today.

“There have been many attempts to distract people from this truth through campaigns that deliberately conflate feature levels, individual untiered features and the definition of “support.” This has been confusing, and caused so much unnecessary heartache and rumor-mongering.

“Here is the unvarnished truth: Every graphics architecture has unique features, and no one architecture has them all. Some of those unique features are more powerful than others.

“Yes, we’re extremely pleased that people are finally beginning to see the game of chess we’ve been playing with the interrelationship of GCN, Mantle, DX12, Vulkan and LiquidVR.”

Nvidia’s staying quiet on the issue at the moment, but it’s starting to look like AMD’s cards may be better suited to future-proofing than Nvidia’s right now. That is, of course, dependent on just how quickly developers transition to DirectX 12 in the short term, and how much they’ll utilise async compute.

Last Updated: September 1, 2015

39 Comments

  1. Admiral Chief Returns

    September 1, 2015 at 12:11

AWWWWJISSSSS RED TEAM!


    • Hammersteyn

      September 1, 2015 at 12:20

      MAKE WAY


  2. Hammersteyn

    September 1, 2015 at 12:13

    They lied again?


    • Admiral Chief Returns

      September 1, 2015 at 12:19

      Where is that crazy Mexican man with his laugh?


      • Greylingad[CNFRMD]

        September 1, 2015 at 12:47

        Compiling a video about the MGSV micro transactions that slipped in?


    • Alien Emperor Trevor

      September 1, 2015 at 12:24

      No they didn’t. All the truth is there, they just disabled some of it.


    • Greylingad[CNFRMD]

      September 1, 2015 at 12:34

To be quite frank, I’m really disappointed in nVidia over the last year or so. First there was the “It has 4GB” scandal, then this whole slouching about their “true intentions”. The wheel is turning…slowly, but it’s still turning….


  3. Greylingad[CNFRMD]

    September 1, 2015 at 12:30

A short-lived gain for nVidia with the 9 series; a future-proof, secured climb in AMD sales is now to be expected. Even as an nVidia user, I hope this changes things up a little…


    • RustedFaith

      September 1, 2015 at 12:52

Yeh and I feel like an idiot getting a 980 Ti now …


      • Greylingad[CNFRMD]

        September 1, 2015 at 12:54

        Ouch…Well I’m an early adopter of the 970, pre 4GB fallout…I’ve been quite pissed off since…


  4. Captain JJ Fantasticus

    September 1, 2015 at 12:54

    but HAIRWORKS!


    • Deceased

      September 1, 2015 at 13:23

      “So thanks to a wonderful user over at Guru3D, it’s been discovered that hairworks can in fact run relatively smoothly on AMD cards. I’m talking 60fps smooth guys, and the trade-off is barely noticeable.

      All you need to do is create a profile for witcher3.exe in your Catalyst Control Center, then set Tessellation Mode to “Override application settings”, and finally set the Maximum Tessellation Level to either 2x, 4x, 8x or 16x. Picture guide

      Important: Depending on what tessellation level you set, the hair quality AND performance will vary. For best performance while maintaining realistic hair I recommend 4x or 8x if your card can handle it. I would recommend against using 2x as it severely reduces quality and looks terrible. Here’s a comparison photo that lets you see the differences between 2x, 4x, 8x and 16x tessellation levels. (Thanks to nzweers from G3D for the comparison photo!)

      For reference I’m using a r9 290 and I have little to no performance impact if I use 8x, if I bump it up to 16x then I regularly drop to 50 in intense areas (e.g. wolf packs). So there you have it, let me know if this also works for you guys – and also if you happen to find any issues that arise from doing this tweak.

      Update: Hairworks performance can be further improved by reducing or removing the Anti-Aliasing of the hairs, read this thread for instructions on how to do so. Also, several users and myself included have came to the conclusion that the “AMD-Optimized” setting may actually work even better while providing the same or even better visual quality, I gained a minor fps increase by going from 8x to AMD-Optimized. I recommend you experiment with the settings to find your own sweet spot as every system is different and every user will have their own preferences.”

      Source : https://www.reddit.com/r/witcher/comments/36jpe9/how_to_run_hairworks_on_amd_cards_without/


      • Captain JJ Fantasticus

        September 1, 2015 at 13:25

        Hairworks looks ridiculous. I’d prefer to just leave it off. I like to play Witcher 3 without it looking like everything is underwater.


        • Deceased

          September 1, 2015 at 13:27

          Also not enabling it at all… The tessellation does, in fact, still cause a bit of a knock ( not nearly as hard as nVidia Hairworks option in-game ), and the game is perfect without it…

( how to bold the part where I say The Witcher 3 is perfect? )


          • Captain JJ Fantasticus

            September 1, 2015 at 14:21

            Agreed.
            Ask Admiral about the bolding, I have no idea.

  5. Kromas untamed

    September 1, 2015 at 13:21

    HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAH

    /takes deep breath

    HASHAHAHAHAHAHAHAHAHAHAAHAHAHAHAHAHAHAHAHAHAHA


  6. Frost

    September 1, 2015 at 13:33

    I want to buy a new PC. So what do I do now? I was sure I wanted the GTX970, not so much anymore.

    Also, do I get an i5 4690? Or do I look at the Skylake series?

    Edit: Or do I decide screw it all and get a PS4. This seriously seems like a much easier option right now.


    • Captain JJ Fantasticus

      September 1, 2015 at 14:22

The i5 is still a bulletproof piece of tech, but if you’re looking at longevity I’d suggest Skylake. Easier to keep up with the changes coming then.
About the GTX970 I can’t really say – I’ve lost touch with the Nvidia cards a bit, so I’m not sure what’s good and what’s not.


      • Frost

        September 1, 2015 at 14:27

        Thanks. If you were to recommend an ATI card in that same price bracket?


        • FoxOneZA

          September 1, 2015 at 15:59

          R9 280 or R9 380 3GB would be sufficient. The R9 380 being more efficient and native DX12 hardware compatible.


          • Frost

            September 1, 2015 at 17:42

            Quick search and I have to say the R9 380 4GB does seem attractive.

          • Captain JJ Fantasticus

            September 2, 2015 at 07:00

The R9 280 and R9 380 are both incredibly decent cards. I’ve got the R9 280 and for that price it’s unmatched imo.

        • love !

          September 3, 2015 at 02:59

          * Ci5 6600K
          * Sapphire Nitro R9 380 4gb gddr5
          * 8gb ddr4 (1 stick)
          * Z170 basic mobo
          * 256gb ssd
          * 600watts psu

          Happy gaming at 1080p60

          * This rig will last for 3 years considering respective res and fps


    • Dawid Eduard Roestorf

      September 1, 2015 at 15:24

      Go for the PS4, no more hassles.


    • CrazyCompPretendingToBeHuman

      September 7, 2015 at 16:45

      hey Frost. The gtx 970 has been replaced as the king of price to performance card. You should go for a r9 390 which outperforms the 970 in almost all games and costs about the same if not a teeeeeny bit less.


  7. Dawid Eduard Roestorf

    September 1, 2015 at 15:26

Async compute has been used on the consoles for a while now. It’s the reason the consoles keep up even with hardware nowhere close to that of a PC.

    In the end, this will catch on, very soon (next year I suspect half the games already) and then you need to choose.

    Nvidia should take about 6 months to change their hardware (I think), till then, AMD rocks


    • love !

      September 3, 2015 at 03:04

      Nvidia really has dropped the ball…
      2014 – gtx 970 controversy
      2015 – no native dx12 gpus

It’s high time they worked on those TFLOPs…Pascal has already been taped out…so 2016 seems unlikely…maybe 2017? God knows…

      Good days for AMD ahead…they nailed it with the consoles and now their PC gpus are rocking

      Owner of a green card :/

* I’m glad that I have not upgraded to the 900 series yet…though I was planning to lol


  8. FoxOneZA

    September 1, 2015 at 16:01

This is from the same dudes that believe 3.5GB of RAM is 4GB and brought us that turd Gameworks :/


  9. Pansyfaust

    September 1, 2015 at 21:09

No card out right now will be fully DX12 compliant. It seems that in this case the one thing Nvidia’s Maxwell cards can’t do well (or at all) is asynchronous compute, which is one of many DX12 features.


  10. OSiRiS

    September 2, 2015 at 09:25

No need to worry, it was disproven last night. Just google nvidia dx 12 and you should see a few links to tech sites that ran benchmarks and whatnot to disprove these guys. I would’ve pasted the link here but couldn’t get on this site last night for some reason


  11. Warren Meinking

    September 2, 2015 at 15:46

    As an nVidia lover, I am quite disappointed in them to be honest.


  12. UltimateNinjaPandaDudeGuy

    September 2, 2015 at 16:39

    lol! I am pretty excited to see what happens with Pascal cards and also what Nvidia is going to do about this situation. Next year round about this time I am upgrading and reviews will be the deciding factor!


  13. hellowalkman

    September 4, 2015 at 12:04

The Maxwell series of cards should be renamed to “GIMPWELL” since most of the cards or their features are gimped one way or another… 😛


  14. Josh Morland

    September 4, 2015 at 14:01

    nVidia Gimpwell™


