
SLI and CrossFire setups getting memory stacking soon


Having run a dual-card setup for the past three years, I can honestly say that SLI/CrossFire configurations should be reserved for PC gamers with way too much money to spend on their rigs. There are obvious performance gains, but the price of an additional card comes with many handicaps. The inability to stack memory is the most glaring one, made even more disappointing given the memory requirements of modern games. But it’s going to change very soon.

Right now, two cards configured in SLI/CrossFire don’t stack memory, meaning two 4GB cards still effectively give you 4GB of usable memory instead of 8GB. It’s something most seasoned PC gamers have come to accept, on the understanding that the performance gain is enough of a boost to justify the cost. But AMD isn’t satisfied with that answer, and rightly so considering its massive push into dual-card territory recently. The limitation has already been addressed, however, with Mantle offering options to let the stacking begin.
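
To make the limitation concrete, here’s a minimal sketch of the idea (plain Python as a toy model; the GPU class and upload functions are illustrative inventions, not any real driver API). Traditional SLI/CrossFire uses alternate-frame rendering, where each card draws whole frames in turn and therefore needs its own full copy of every resource:

```python
# Toy model of alternate-frame rendering (AFR), the traditional
# SLI/CrossFire mode: each GPU renders complete frames in turn,
# so every resource must be mirrored onto every card.

class GPU:
    def __init__(self, vram_gb):
        self.vram_gb = vram_gb
        self.used_gb = 0.0

    def upload(self, size_gb):
        if self.used_gb + size_gb > self.vram_gb:
            raise MemoryError("out of VRAM")
        self.used_gb += size_gb

def upload_afr(gpus, size_gb):
    # AFR: an identical copy of the resource goes to every GPU.
    for gpu in gpus:
        gpu.upload(size_gb)

gpus = [GPU(4.0), GPU(4.0)]
upload_afr(gpus, 3.0)  # fine: 3GB mirrored onto both cards
upload_afr(gpus, 2.0)  # MemoryError: each card is already at 3GB of 4GB,
                       # even though 8GB of VRAM exists in total
```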

Robert Hallock, head of global technical marketing at AMD, has explained via Twitter that their low-level API can already bridge the gap to allow memory stacking. With “to-the-metal” access to GPUs, developers can have multiple cards add to the memory pool, without each card needing its own copy of the game’s data. It all comes down to optimisation, which will become easier with future APIs like DirectX 12.

Mantle is the first graphics API to transcend this behaviour and allow that much-needed explicit control. For example, you could do split-frame rendering with each GPU and its respective framebuffer handling 1/2 of the screen. In this way, the GPUs share extremely minimal information, allowing both GPUs to effectively behave as a single large/faster GPU with a correspondingly large pool of memory.
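
As a companion sketch of the split-frame idea Hallock describes (again a toy Python model, not Mantle or DirectX 12 code; upload_sfr is a hypothetical helper), an explicit API lets the engine place each resource on exactly one GPU, so two 4GB cards behave like a single pool of roughly 8GB:

```python
# Toy model of explicit split-frame rendering (SFR) under a low-level
# API: each resource lives on exactly one GPU (the one whose slice of
# the screen needs it), so the cards' memory pools add up.

gpus = [{"vram": 4.0, "used": 0.0}, {"vram": 4.0, "used": 0.0}]

def upload_sfr(size_gb):
    # Place the resource on whichever GPU has room; no mirror copy.
    for gpu in gpus:
        if gpu["used"] + size_gb <= gpu["vram"]:
            gpu["used"] += size_gb
            return
    raise MemoryError("out of VRAM on every GPU")

for _ in range(7):
    upload_sfr(1.0)               # 7GB spread across two 4GB cards: it fits

print([g["used"] for g in gpus])  # [4.0, 3.0]
```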

That doesn’t mean it’s in effect yet, especially since this is a feature developers themselves need to add. However, the idea of multiple cards being hindered by memory could be a distant memory in the near future, which opens up PC gaming in a significant way.

Ultimately the point is that gamers believe that two 4GB cards can’t possibly give you the 8GB of useful memory. That may have been true for the last 25 years of PC gaming, but that’s not true with Mantle and it’s not true with the low-overhead APIs that follow in Mantle’s footsteps.

With DirectX 12 creeping its way into titles soon, it could be just a matter of months before memory stacking for multiple graphics cards becomes a reality. That makes offerings like the GTX 970 and R9 290X even more enticing, especially since cards exceeding 4GB of memory are still quite expensive. But it would be a game changer in the mid-range market. Imagine being able to stack the memory you need for full texture settings without breaking the bank. That’s the future right there.

Last Updated: February 3, 2015

86 Comments

  1. So 4GB+4GB will equal 7GB?

    Llew must be so excited!

    Reply

    • HairyEwok

      February 3, 2015 at 12:15

      Those are 980’s not 970’s

      Reply

      • Neil Mathieson

        May 16, 2015 at 23:22

        How do they get any air to cool down? This is not good

        Reply

    • Raptor Rants A Lot:Original #7

      February 3, 2015 at 12:16

      ……..

      Reply

      • Admiral Chief

        February 3, 2015 at 12:20

        lol 7 dots

        Reply

        • Raptor Rants A Lot:Original #7

          February 3, 2015 at 12:21

          Dude seriously… Stop counting the stuff! Don’t you have work?

          Reply

          • Admiral Chief

            February 3, 2015 at 12:23

            Pattern recognition, also, yes, but you know me, mr multitask

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 12:24

            lol

        • Tosh MA

          February 3, 2015 at 12:30

          It’s 8 dots. But it’s still funny that he doesn’t even check any more.

          Reply

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 12:30

            It is???? *sigh* I should have counted 🙁

          • Admiral Chief

            February 3, 2015 at 12:31

            SHHHHSSSHHHHHHHH

            😛

    • Raptor Rants A Lot:Original #7

      February 3, 2015 at 12:28

      lol it’s funny coz 3.5 + 3.5 is actually 7 hehehehe

      Reply

      • Blahblahblahblah

        April 7, 2015 at 09:22

        Hey the other 1gb is stored for caching and will be available for use when directx 12 comes out in August

        Reply

    • Tosh MA

      February 3, 2015 at 12:28

      Well Nvidia seem to think so. 😛

      Reply

    • FoxOneZA

      February 3, 2015 at 15:37

      All arrows point towards stacked memory and dual GPU cards from the RED Team in the near future.

      Reply

  2. Matthew Holliday

    February 3, 2015 at 12:10

    fantastic.

ive been wondering about my crossfire build since i installed my second card. the FPS gain has been incredible, but the inconsistency and jittering have been pretty damn annoying in certain games.

    still confused about mantle though, do i need to switch ingame from directX to mantle manually? ive noticed that certain games have that option, or should i just leave it on directX?

    Reply

    • Raptor Rants A Lot:Original #7

      February 3, 2015 at 12:16

      If you have a mantle option then use it

      Reply

  3. Admiral Chief

    February 3, 2015 at 12:10

    Provided your mobo can handle it…

    Reply

    • Tosh MA

      February 3, 2015 at 12:11

      And if you buy AMD, your power supply…

      Reply

      • Admiral Chief

        February 3, 2015 at 12:12

        And if you buy Intel/NVeeedia, your soul

        Reply

        • Tosh MA

          February 3, 2015 at 12:12

          I’ll trade my soul for a superior experience any day. 😛

          Reply

          • Admiral Chief

            February 3, 2015 at 12:16

            Then you will lose twice

          • Tosh MA

            February 3, 2015 at 12:17

            Joke’s on you man, I’m a ginger. 😉

          • Admiral Chief

            February 3, 2015 at 12:18

            Man, you are pretty screwed then, and I was lying about the pretty part 😛

        • Hammersteyn

          February 3, 2015 at 12:14

          So Rince is out of luck

          Reply

          • Admiral Chief

            February 3, 2015 at 12:41

            He’d have NO idea what to do with a GPU

          • Blood Emperor Trevor

            February 3, 2015 at 12:46

            Of course he would. He knows they make gfx better, so he’d tape it to his monitor.

          • Admiral Chief

            February 3, 2015 at 12:55

            XD

      • Matthew Holliday

        February 3, 2015 at 12:13

        • Tosh MA

          February 3, 2015 at 12:14

          • Matthew Holliday

            February 3, 2015 at 12:15

            the r9 270 was released within the 700 series of cards though.
            comparing it to latest tech isnt fair.

            it would be better to compare the 900 series when the new AMD line comes out.

          • Tosh MA

            February 3, 2015 at 12:16

            http://www.hwcompare.com/18035/geforce-gtx-970-vs-radeon-r9-290x/

            How about that then. Similar real world performance across games, but just over half the TDP. One R9 290X uses as much power as 2 GTX970 in SLI.

          • Matthew Holliday

            February 3, 2015 at 12:24

            thats still within the R9 series though. which uses the same tech as the 270.
            the 970 is a brand new card.

            a better argument would be to compare the power consumption cuts between the nvidia 500-900 series then the AMD HD 5000 – R9 series.

            point was, the mid-high range cards like the 760 and r9 270 power consumption is comparable. there are obvious examples of nvidia doing better, but the flagship models are close.

          • Tosh MA

            February 3, 2015 at 12:28

            See, while your argument works on paper, it means nothing to the person who’s looking to buy a new card now. Right now, the 2 competing cards are the 970 and R9 290, within that budget range. My point stands.

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 12:30

            Didn’t AMD say their answer to the 900 series is coming in the next 2 months or so? If you were looking to buy right now, I’d caution the person to wait for the AMD solution. Just in case

          • Tosh MA

            February 3, 2015 at 12:32

            Weird “person” phrasing there, but I get it. I will have to wait anyway, because budget doesn’t allow for a quick purchase, Save save save.

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 12:35

            Sorry I switched tenses halfway through…. That is a bit… odd…. Where’s the grammar police when you need them. Eish.

            Also, please don’t be one of those that buy a piece now and then a piece later. Just save and buy all in one go. Otherwise you end up with a 6 month old CPU for example that could have been better if one had just waited.

          • Tosh MA

            February 3, 2015 at 12:37

            oh all I need is the graphics update. I spent a lot of money 2 years ago on the MB/CPU and RAM to make sure all I needed was a GPU upgrade to extend the life of my PC.

            Oddly, my MB doesn’t support SLI, but does support Xfire. So it’s funny that I’m a bit of an Nvidia fan at the moment XD

          • Admiral Chief

            February 3, 2015 at 12:40

            Bwhahahaha

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 12:53

            lol. That’s that “Oops” purchase moment where you realise afterwards that you didn’t check if it supported both lol.

            I bought a whole new PC just over 3 years back to ensure I have a good long PC life again. Cost me R10k (R10.7 with the extra GPU 2 1/2 years later) Can still play everything on max 🙂 Woooo

          • Tosh MA

            February 3, 2015 at 12:58

            Yeah but tbh I still am not overly keen on 2 cards on 1 board. So while I knew that was the case, I didn’t mind because I didn’t plan on putting 2 cards in.

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 13:02

            Ah ok well that’s ok then. I dig the option of 2 cards simply because it can help increase a PC’s lifespan without too much of an upgrade. Especially if the memory stacking comes to pass

          • Lothy

            February 3, 2015 at 13:25

look having 2 cards is awesome. It’s sometimes a cheaper alternative to get that boost u want. I used to have 2 5870s and man it was great.

          • Lothy

            February 3, 2015 at 13:24

            *Bruce almighty voice* REDEMPTION IS AT HAND!

          • Spaffy

            February 4, 2015 at 07:58

            haha ditto, my MB does Xfire, but I run Nvidia XD

          • Spaffy

            February 4, 2015 at 07:57

Which is how I paid over 9k for my 780 a few months before the 980 released for 7k.
            🙁

          • Raptor Rants A Lot:Original #7

            February 4, 2015 at 12:25

            Oh crap that sux

          • zeemonkeyman

            February 4, 2015 at 13:19

Yeah I reckon the successor to the 290 could be a gem. It’s overlooked if you ask me, it is a fair bit cheaper than the 970 where I come from and is quite comparable in most games but yeah it does use a lot more power. Maybe its successor will fix that

          • Lothy

            February 3, 2015 at 13:11

I agree, but the nvidian fanatic followers are super quick to compare their deity god to the old range of AMD cards. And the blasphemy that ensues if u dare mention this error in their beliefs. Suddenly they start running around shouting “Viridis akbhar” while setting off green bombs of mist strapped to their chests. All very dramatic.

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 13:19

            can you buy better at the current price? No. When AMD releases a better card it will also be compared to nvidias current top tier cards. It’s how it works.

            You compare current best with current best. It shouldn’t be about team green or red. It should be “What currently gets you the best performance for best price” currently that’s the 970.

          • Lothy

            February 3, 2015 at 13:22

            I agree, but the over zealous support that NVidia gets here is really funny. Heck I mean Tosh is a perfect example. Matthew makes a legit statement and suddenly he is seen as a heathen (again over dramatizing for fun) and another commandment is thrown his way.

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 13:26

            The fact is Matthew compared the 700 series with the R9xxx series. Now while that is comparing gen vs same gen, in pc gaming one should never do that because it should always be about what is best now at the current price brackets. PC gaming unfortunately will always be price bracket vs price bracket. Not gen vs gen because let’s face it. PC hardware moves too fast to keep it gen vs gen.

            Not that Matthew’s statement was wrong. AMD did far better in the power department in the R9xxx series than the nVidia 700 series did. But unfortunately nVidia has taken the lead on both performance and power consumption efficiency

            But I am sure AMD will bring their next solution to beat nVidia soon. Vote with the wallet, not with your brand choice

          • Matthew Holliday

            February 3, 2015 at 16:01

            i wouldnt debate if this was a performance debate.
            the debate was about power consumption, which is… less relevant, in terms of sales.

            nvidia usually do win when it comes to power consumption, i was just pointing out that its not always the case anymore.

            performance wise, i agree, its price vs price.
            power consumption wise, i disagree, that is gen appropriate.

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 17:39

            Have to agree with you there. 100%

  4. HairyEwok

    February 3, 2015 at 12:14

    For what reason would a person want 16 gigs of VRAM besides for 3D rendering? Please humor me on what game will ever use that much VRAM.

    Reply

    • Hammersteyn

      February 3, 2015 at 12:14

      Crysis 6

      Reply

      • Admiral Chief

        February 3, 2015 at 12:17

        BUT CAN IT BE RUN?

        Reply

        • Kensei Seraph

          February 3, 2015 at 12:19

          With 16 gigs of VRAM… probably not.

          Reply

    • Sylvex Dragonskin

      June 3, 2015 at 11:53

      It’s cheap and 8 isn’t gonna last long enough.

      Reply

  5. Darren Peach

    February 3, 2015 at 12:14

I started reading the article until it became gibberish. Not the writing or the content, I think it was the distinct lack of relevance which rendered the topic pointless. Even if you are gaming on a PC, SLI and Crossfire are overkill. Like trying to eradicate an ant with a hydrogen bomb. I suppose you could argue that people who drive supercars and have private jets also need to have the fastest computers on the planet.

    Reply

    • Raptor Rants A Lot:Original #7

      February 3, 2015 at 12:21

I disagree. SLI can be overkill when you are stacking multiple top tier cards yes, but in the mid-low range bracket it can be the difference between forking out R3000 for a new card or getting a second-hand copy of the mid-range card you already have to extend your PC’s life.

      Mine for example is running SLI 560Ti’s. Now before the SLI’s I was running games on high but I could see some games already starting to tax it a little (BF4 and such) where some settings had to be bumped down to medium to prevent severe frame drops in hectic situations.

      By spending R700 on a second hand 560Ti I am able to play games on ultra again and I will now be set for another 2 -3 years easy before games are so demanding I have to play them on low. SLI is a great way to extend your gaming PC’s life

      Reply

      • Tosh MA

        February 3, 2015 at 12:21

        R700 hey?

        Reply

        • Raptor Rants A Lot:Original #7

          February 3, 2015 at 12:24

          Yes, I paid R700 for my 2nd 560Ti! I should have lied and said R800…. *grumble*

          Reply

      • Darren Peach

        February 3, 2015 at 12:24

Good point. Another use would be the physics aspect. When I was still a high end card junkie, a dedicated card for physics was all the rage. Weird how the sands of time make all that important stuff seem so irrelevant. What’s happening with the physics thing these days?

        Reply

        • Raptor Rants A Lot:Original #7

          February 3, 2015 at 12:27

PhysX is still a thing but I always caution people against using a card dedicated to PhysX. The gains are really not worth it as you effectively cut your SLI bus from 16x to 8x just so you can have a dedicated PhysX processor.

          The cards nowadays are so good that they can handle PhysX and rendering just fine even on a single card without too much performance hits.

          My setup runs better in PhysX games in SLI mode than it does in dedicated PhysX mode (Batman Arkham City is the best example of this)

          Reply

          • Darren Peach

            February 3, 2015 at 12:31

            Also, Multicore processors are a thing now.

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 12:33

            Yeah but I reckon multicore is here to stay. There’s no way to go back to single cores simply because of the huge benefits of multicore rendering /calculations allowing a CPU to do multitasking. Single cores can’t do that so well

          • Darren Peach

            February 3, 2015 at 12:37

            Unless there is a paradigm shift in tech, Like organic processors.

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 12:50

            Which would in theory be super multicore as the pathways won’t be static and would allow multiple bits of traffic to be transmitted at the same time. Which is the whole idea behind multicore – to simulate “organic like” multitasking processing

          • Darren Peach

            February 3, 2015 at 12:54

            How bout quantum processing, calculate everything at once.

          • Raptor Rants A Lot:Original #7

            February 3, 2015 at 12:55

            BOOYEAH!!!! Now we are talking! How many FPS do you want? Oh? Infinite? Sure, let me just calculate ALL THE FRAMES POSSIBLE!!!

          • Tosh MA

            February 3, 2015 at 12:33

            They’ve been around for years. Games are finally starting to use them now, that’s all.

            Thank consoles for finally catching up.

          • Darren Peach

            February 3, 2015 at 12:38

            Yep.

    • Tosh MA

      February 3, 2015 at 12:21

      It’s a syndrome my gf likes to call the “whipping out your penis” syndrome. It affects a lot of gamers.

      Reply

      • Darren Peach

        February 3, 2015 at 12:42

Yeah, I know the feeling or syndrome. I remember playing a Quake match against my buddy’s boss. While we were playing, he was smack-talking me. I was very shocked at that experience. Here we are, sitting in a tiny little office about the size of a bedroom…. I am playing this guy and it almost sounds like he is talking dirty to his girlfriend in the bedroom. Funny as hell.

        Reply

    • FSR

      February 3, 2015 at 13:09

      Also, not everyone only games in 1080p at 60fps. When you start pushing 2560 x 1440 @ 140fps, you need more VRAM bandwidth. That’s where multi-card setups have value, at the higher than normal res’s.

      Reply

    • Lothy

      February 3, 2015 at 13:28

      Well think of it like this. A new card would cost you R3500 (hypothetically for something decent), but a 2nd card like the 1 u own might cost you R600 (generally under R1000) and having these cards in SLI/Xfire will give you almost the same performance as that new card. Thus allowing you to get a very cheap upgrade for really good performance that will last u at least another year or 2.

      Reply

      • Alessandro Barbosa

        February 3, 2015 at 14:02

        Sadly though, SLI only works with identical cards 🙁 CrossFire is far superior in this regard

        Reply

  6. Raptor Rants A Lot:Original #7

    February 3, 2015 at 12:16

    I’m calling it. AMD won’t allow it with anything less than the R2xxx series and nVidia won’t allow it with anything less than a 660

    Reply

    • Tosh MA

      February 3, 2015 at 12:19

      Looking at that header, I wonder what it’ll be like running 7 cards at once…

      Reply

      • Raptor Rants A Lot:Original #7

        February 3, 2015 at 12:21

        *sigh*

        Reply

  7. Patrik Ugrin

    February 5, 2015 at 01:28

Does that mean i can have 2x 780 = 6gb!
    Screw you Titan! 😀

    Reply

  8. Lord Xantosh

    April 26, 2015 at 13:58

    so wait for HBM on R9 3XXX and DX12 HBM 4gb+4gb with the I/O that HBM has, ZOMG 9001 FPS on Minesweeper!

    Reply

  9. BC Shelby

    July 22, 2015 at 22:33

…so if this will indeed be true, it would be a huge boost to those who work with 3D rendering outside of games.

The concept of having 24 GB of video memory from, say, four 980 Tis to throw at a render process, so that a scene “heavy” with polys and texture maps is easily handled in video memory rather than crashing or dropping to CPU mode, sounds too good to be true.

    Reply

  10. Tom

    July 30, 2015 at 20:47

    Just installed windows 10 with latest nvidia drivers and my sli setup is stacked now.

    Reply

  11. ezeriv

    June 28, 2016 at 06:51

    and??? we are still waiting…

    Reply
