
Will AMD reclaim the performance crown with the R9 380X?


Mugabe! Hnggggh! Say it again! MUGABE!

AMD’s had a troubling year – at least as far as its desktop cards have gone. They’ve essentially had nothing to compete with Nvidia’s enthusiast powerhouse – the Maxwell-based GTX 900 series – so that’s where all the PC gamers with cash to burn have gone. It makes sense. For once, Nvidia’s cards aren’t the most expensive, and actually offer better price-to-performance. With the fastest cards in their segment, it’s natural that people would opt for Nvidia.

It’s become so bad that, according to Digitimes, AIB vendors have lowered their purchases of AMD GPUs to prevent a stock build-up. Those same vendors, the site says, “are also concerned that the market could gradually lean towards Nvidia, causing them to lose bargaining chips with GPU makers.”

AMD needs something big to shake up the market. And they could have that in their impending R9 380X. Alleged – and thus to be taken with great big grains of salt – benchmarks of the card as it stands now show a behemoth, knocking the GTX 980 from its lofty perch. If they’re to be believed, AMD could soon reclaim the performance crown – and with its penchant for a better price-to-performance ratio than Team Green, that could provide the impetus AMD needs to get back in the game.

Chinese-language site Chiphell posted the following benchmarks.

[Alleged R9 380X benchmark screenshots, via Chiphell]

The R9 380X is said to sport 4096 stream processors and 4GB of next-generation stacked high-bandwidth memory (HBM), giving the card up to 640GB/s of bandwidth. According to the benchmarks, the card is up to 58 percent faster than AMD’s current top-end, the R9 290X, and Nvidia’s GTX 970 – which, if true, would make this one heck of a rendering champion.
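For the curious, that 640GB/s figure is just bus width multiplied by per-pin data rate. A minimal sketch of the arithmetic, assuming a hypothetical 4096-bit HBM interface running at 1.25Gbps per pin (neither figure is confirmed by the leak):

```python
def hbm_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * gbps_per_pin / 8

# A 4096-bit interface at 1.25 Gbps per pin lands on the rumoured figure
print(hbm_bandwidth_gbs(4096, 1.25))  # 640.0
```

For comparison, a 290X-style 512-bit GDDR5 bus at 5Gbps works out to 320GB/s through the same formula, which is why the stacked-memory rumour makes the leak plausible on paper.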

It does, however, seem that the card is as power-hungry and hot as it is powerful. Continuing AMD’s strain on power grids across the globe, this beast could end up drawing around 295W – which may add a little credence to previous rumours of the card using a hybridised water-cooling solution.
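If you’re wondering what a ~295W card means for your power supply, a rough sizing sketch is just worst-case component draw plus headroom. The non-GPU figures below are purely illustrative, not from the leak:

```python
def recommended_psu_watts(component_watts: dict, headroom: float = 0.30) -> float:
    """Sum worst-case component draw and add headroom for transients and efficiency."""
    total = sum(component_watts.values())
    return total * (1 + headroom)

# Hypothetical build around a ~295W card (CPU and platform numbers are made up)
build = {"gpu": 295, "cpu": 95, "board_ram_drives_fans": 80}
print(round(recommended_psu_watts(build)))  # 611
```

By this back-of-the-envelope maths a quality 650W unit is borderline and an overclocker would want more, which tracks with the PSU debate in the comments below.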

Unfortunately, we have no idea when the card is set for public availability, though suggestions point to a Q2 2015 release. The wait may be too long for AMD fans with itchy trigger fingers.

Last Updated: February 5, 2015

89 Comments

  1. Pieter Kruger

    February 5, 2015 at 13:36

    Sticking with Nvidia, AMD just not so reliable!

    Reply

    • Hammersteyn

      February 5, 2015 at 13:54

      What if I told you that there’s an AMD in your Eggsbox and one in my Ponystation?

      Reply

      • Admiral Chief

        February 5, 2015 at 14:44

        XD

        Reply

      • Tosh MA

        February 5, 2015 at 14:45

        Kind of proves his point really.

        Choose, 1080p or 60fps.

        Reply

        • Hammersteyn

          February 5, 2015 at 14:58

          Still, it’s reliable

          Reply

          • Tosh MA

            February 5, 2015 at 15:04

            Damn it. Those loopholes in my argument. 🙁

          • Hammersteyn

            February 5, 2015 at 15:12

            XD

        • Kriss Prolls Crispy

          February 10, 2015 at 17:40

          Don’t blame AMD man, blame intel and nvidia for sponsoring game devs that optimize games only for their hardware. Well intel and nvidia, it’s also the game devs fault for accepting this kind of shit. But it’s just about money 🙂 That’s also what made them blind and what will make ATI take a jump ahead of nvidia i think for a whille.

          You can say thank you rockstar for accepting these deals from nvidia and now on my painintheassstation 4 i can only play games at 30 fps, buhuuu.

          I shit on consoles, especially the new ones, haha pay every month to play?! are you fucking retarded? Anyways enjoy.

          Reply

          • TLDR

            March 6, 2015 at 17:16

            What does it matter why it runs better if it’s due to superior hardware, “paid” optimizations, or the fact that they send magical elves to your house while you play to give you a blowjob.

            If it runs better as a consumers thats the only thing you should care about.

            And FYI considering that both consoles run AMD CPU’s and GPU’s and that console games are a much bigger market it’s a bit too tinfoil to think that Intel and Nvidia could ever mast the resources needed to make the games run better on their hardware. When they are not only competing with Sony and MSFT, but also with a consumer market 10 times bigger than the PC gaming one 🙂

      • Pieter Kruger

        February 5, 2015 at 15:10

        Very aware of that, just saying if I had a choice I’d go for Nvidia every time!

        Reply

        • Hammersteyn

          February 5, 2015 at 15:13

          Lies! Red TEAM

          Reply

          • Pieter Kruger

            February 5, 2015 at 15:18

            Thought you were blue team? Anyway it’s more of a pinkish hue really, only thing red about AMD is their cards running red hot! ????

          • Hammersteyn

            February 5, 2015 at 15:34

            Well blue and red makes purple….
            PURPLE team.

        • rockstarfish

          February 5, 2015 at 17:28

          Well that’s dumb… you will get a worse card because it says Nvidia on it?

          Reply

          • Pieter Kruger

            February 5, 2015 at 18:28

            I believe that reliability count just as much as performance so taking that into account I wouldn’t say a slight drop in performance makes it the worse option.

          • amilayajr

            February 5, 2015 at 20:03

            GTX970 is Reliable as long as that 3.5GB memory is kept at the bay. If it goes beyond that, it’s gimped lol. I’d still buy AMD Cards, I’d pick 290X over 970 anytime.

            Only fool thinks they’re saving a lot of money in electricity bill. People are stupid. If you’re going to play great games, you want it in max settings. If that’s the case, start watching how 970 starts consuming a lot of electricity. I’d rather take the savings I can get now with 290X since it’s cheaper than 970. It’s sad but people don’t realize, it will take them few years before they recoup the money they save from electricity bill; by that time, you’re looking for upgrades or new card. LOL. At least I know 290X will be able to handle 4K gaming and it’s future proof better than gimped Nvidia GTX970. Research first on reliable testings. Nvidia knew the 3.5GB limit from the get go, that’s why they requested reviewers to ensure only use selected games that won’t utilize beyond 3.5GB memory. Have you notice any reviews that used any game that uses more than 3.5GB? None, because Nvidia requested them as part of review to show how strong their cards but hide the weakness. Lol, they should start doing bench mark again with 4K gaming that utilizes more than 3.5GB memory and will see how GTX970 starts dumping bricks.

          • Jack

            February 6, 2015 at 08:52

            cheaper?? what ??
            100 rand cheaper WOW
            The only one who is stupid here is you my friend 🙂

          • amilayajr

            February 6, 2015 at 15:40

            Lol, yeap it’s cheaper bud. Because I know I can play in higher resolution and not having to worry about 3.5GB gimped memory. If that’s ok with you, have fun with your GTX970. and yes I still believe AMD offering is decent and having mantle is pretty cool as well as Free synch. In addition, 290x is more future proof for 4K Gaming compare to GTX970x. I hate it when gamers complain about noise and heat about their gaming rig as well as efficiency. Go BIG or go home, silent cooling is way to go! Besides when you’re gaming do you really worry about electricity bill when you want everything in Max settings as high as possible?. that’s pure stupidity if you worry about electricity, might as well not game at all.
            Regardless of the price….. either way, Nvidia GTX970 is still gimped, that is a fact and nothing you can do about it. AMD might be hot and noisy but you can do something about it easily……Now, who’s stup!d here…….I’ll leave it up to you.
            Still an owner of NVIDA Card here btw…… and upgrading soon…..to AMD.

          • Mark Lum

            February 6, 2015 at 07:03

          • Jack

            February 6, 2015 at 08:53

            Nice try troll
            He’s talking about now aka the present
            Stop living in denial 🙂

    • Captain JJ the crafter

      February 5, 2015 at 14:01

      The very reason I switched from Nvidia to AMD.

      Reply

    • 86james randy

      February 5, 2015 at 20:14

      well my R9 290X 4GB is 4GB… that seem to be reliable… wonder what else Gpu company lie about if they lie about vram…

      Reply

    • George Costanza

      February 6, 2015 at 09:18

      at least they don’t outright lie

      Reply

    • Kriss Prolls Crispy

      February 10, 2015 at 17:33

      haha that’s sad that you believe that… Or are you just a nvidia fanboy trying to troll? Anyways amd might not have been the best for some years ago, but at the same time if intel and nvidia didn’t spend most of their time trying to ruin it for amd and ati well they might have been able to come up wit something better. But this time ATI is going to be ahead of nvidia And it’s not the first time 🙂 Oh and not to mention all the codes nvidia throw into game so that ati users get bad perfs. They probably still do it in benchmarks like they did before #intel Hey ATI has mantle and every one is free to use it 🙂 Nvidia uses their locked thing and only they can use it. A bunch of selfish egocentric cunts, that’s what they are.

      One day i’m sure truth will be told and seen, amd will be as good as intel if not better and for lower prices of course as always. I’ve had intel and amd and so far the most reliable pc i had was a full amd/ati setup sooooo…. yeahhhh.

      Reply

    • Kriss Prolls Crispy

      February 10, 2015 at 17:41

      Oh amd not reliable. but at least when they make a video card with 4gb of ram well it really has 4gb of WORKING ram, amazing right?

      Reply

    • canopus72

      February 19, 2015 at 15:53

      Total rubbish. Nvidiot are liars and criminals, as they conned customers into believing the 970GTX has 4GB memory when in reality it has 3.5GB of memory (which cannot perform well at 4K resolution). I used to be an Nvidiot fanboy for the last 10 years but since January 2014, I got fed up with their extortionate prices, marginal performance gain and poor resale value. In July 2014 I bought a pair of Sapphire 290 TRIX and I was bloody amazed by their performance. I now have a 295X2 purchased at sale price on black Friday (£480) and I love this gfx card. Im quietly confident the 380X and 390X will destroy anything Nvidiot has to offer due to the new HBM that has been paired to the Hawaii gpu (380X) and brand new arch FIJI gpu (390/X). Since the 970GTX farce, AMD should now focus on aggressive marketing and publicly confirm their 380X and 390X has genuine 4GB memory and new tech HBM. That is their strongest selling point.

      Reply

    • Javier Turok

      May 12, 2015 at 20:04

      Nvidia reliable? So gtx 970 3.5gb hahaha.

      Reply

  2. Raptor Rants A Lot:Original #7

    February 5, 2015 at 13:44

    Well, not that hot considering it’s cooler than almost all the other cards.

    But I don’t know. Those benches… Such a massive leap? It seems a bit hard to believe. If it’s true though… It may be a good shake up of the market…. IF they can match the price range nVidia is hitting lately. Also, they would need a mid range version of the card too

    Reply

    • Blood Emperor Trevor

      February 5, 2015 at 14:12

      If it’s that cool while drawing so much power then maybe there’s some truth to the rumour of water-cooling.

      Reply

      • Tosh MA

        February 5, 2015 at 14:26

        Good point actually.

        Reply

    • Lord Chaos

      February 5, 2015 at 14:13

      I’ve been saying it for a while now, the new CEO is gonna make some big changes

      Reply

      • Raptor Rants A Lot:Original #7

        February 5, 2015 at 15:20

        Doubt it will be massive changes but who knows. AMD does need to pull up its socks a bit though

        Reply

      • 86james randy

        February 5, 2015 at 20:16

        well she is very well educated… i can trust on that to make change.

        Reply

    • RustedFaith

      February 5, 2015 at 22:55

      I think its due to the new HBM memory we will have to wait and see.

      Reply

  3. Sgt. M

    February 5, 2015 at 13:45

    I’d be very worried about Mugabe clouds overhead

    Reply

    • Captain JJ the crafter

      February 5, 2015 at 13:59

      It brings a sh!tstorm

      Reply

      • Sgt. M

        February 5, 2015 at 15:22

        *Jacob Zuma Laugh*

        Reply

    • hikingmike

      February 5, 2015 at 18:04

      I was wondering wtf that was

      Reply

  4. Ryanza

    February 5, 2015 at 13:45

    Nvidia always releases their cards first and the Nvidia fanboy’s scream how much power it has. Then AMD releases their cards and then all the Nvidia fanboy’s can say is power consumption, power consumption, power consumption, drivers.

    The last batch of AMD cards were mainly re-branded except for the R9 290x and R9 295 x2. The R9 295 x2 was the fastest card.

    I have been waiting for years for AMD to properly update their cards. These new cards are going to be a killer. Power supple killer but still they going to run really fast and have a decent price tag. I wouldn’t be surprised if AMD has the fastest card again.

    Don’t Support Gameworks.

    Reply

    • Mossel

      February 5, 2015 at 13:59

      haha nice twist there at the end!

      Reply

    • geel slang

      February 5, 2015 at 14:18

      The fat lady have not sung untill team green drops their ti card.

      Reply

      • Stewie

        February 5, 2015 at 15:28

        Remeber this is the 380X not the 390x,there holding that back for the gm200 or the ti

        Reply

    • amilayajr

      February 6, 2015 at 15:43

      We don’t really know if this card needs that much power, hopefully not too much that you need a 1000W PSU. We just love competition, it’s good for us consumers… means lower prices :). I wonder if 295X2 will drop prices as well, that’s not a bad pick at all as well considering it’s still the fastest offering out there. Max every settings baby!

      Reply

  5. Blood Emperor Trevor

    February 5, 2015 at 13:46

    My trigger finger is just fine, honed by years of not buying games day one. It’s between a 300 series card & a 970, so I’m waiting to see actual price/performance.

    Reply

    • Tosh MA

      February 5, 2015 at 14:25

      Just look at your PSU as well, that power draw on the 300 AMD series will mean I need to factor the price of a new PSU in to the purchase price of that card. I can put a 970 or 980 in right now. So there’s that.

      Reply

      • Blood Emperor Trevor

        February 5, 2015 at 14:41

        I’m building a complete new system & that’s exactly why I haven’t bought a PSU yet. 650W should be sufficient anyway if I go the i5 route.

        Reply

        • Tosh MA

          February 5, 2015 at 14:43

          You’ll probably have to go 800W. I’m on a 650W and I just won’t have enough to power an AMD card. :/

          Reply

          • Frik van der Hewerskink

            February 5, 2015 at 14:49

            I own a R9 290 which is powered by a 650w PSU :/

          • Tosh MA

            February 5, 2015 at 14:55

            Ok. I have 2 hard drives and 7 case fans, along with a GTX 560, 4x4gb RAM and an i7 3770k. It’s not just the card that draws power sadly.

            So sure, with 2 case fans, 1 hard drive and 2x8gb chips of ram then it’s feasable.

          • Spaffy

            February 5, 2015 at 15:19

            560 pulls 150w, max.
            I have a 780 on a 600w. Wattage is not that important, Amps are. Especially on your 12v rail.

          • Frik van der Hewerskink

            February 6, 2015 at 07:20

            Probably, but i don’t have 2 case fans, 1 hard drive and 2x8gb chips of ram either.

          • 200380051

            February 5, 2015 at 19:50

            http://www.tomshardware.com/answers/id-2196225/290-psu-requirement.html

            650w seems enough. For overclocking though i would get 750w or more.

  6. Raptor Rants A Lot:Original #7

    February 5, 2015 at 13:50

    Mugabe….

    Oooooh….. Do it again…

    Mugabe….

    Oooooh

    Mugabe Mugabe Mugabe!!!

    http://a.dilcdn.com/bl/wp-content/uploads/sites/2/2014/02/Banzai-Shenzi-The-Lion-King-Mufasa.gif

    Reply

    • Hammersteyn

      February 5, 2015 at 13:54

      XD

      Reply

  7. konfab aka derp

    February 5, 2015 at 13:51

    Heading of the week right there.

    Reply

  8. Hammersteyn

    February 5, 2015 at 13:55

    We know James Earl Jones played Mufasa, so Mugabe was Mufasas father then?

    Reply

    • Captain JJ the crafter

      February 5, 2015 at 14:00

      What does that make Darth Vader then?

      Reply

      • Blood Emperor Trevor

        February 5, 2015 at 14:01

        The dress-up doll, Manikin Skywalker.

        Reply

        • Captain JJ the crafter

          February 5, 2015 at 14:01

          Haha. Excellent.

          Reply

      • Hammersteyn

        February 5, 2015 at 14:02

        Not my father. Though I wish he was

        Reply

        • Captain JJ the crafter

          February 5, 2015 at 14:03

          I’m not sure hey. I don’t think a metal bikini would suit you. ;P

          Reply

          • Blood Emperor Trevor

            February 5, 2015 at 14:03

            That’s sexist.

          • Captain JJ the crafter

            February 5, 2015 at 14:04

            o_O

          • Hammersteyn

            February 5, 2015 at 14:05

            Agreed, why cant I wear a bikini if I wanted too?

          • Captain JJ the crafter

            February 5, 2015 at 14:07

            You can wear it, I suppose. If you really want to. I won’t judge.
            I will comment on it though.

          • Hammersteyn

            February 5, 2015 at 14:09

            Bet I’d even look better than Darryn in one.

          • Blood Emperor Trevor

            February 5, 2015 at 14:10

            Oh, oh… CAT FIGHT!

          • Captain JJ the crafter

            February 5, 2015 at 14:11

            LOL

          • Tosh MA

            February 5, 2015 at 16:01

            A Cthon would look better than Darryn in one. *RUNS

            http://tor.zamimg.com/uploads/images/4174.jpg

          • Tosh MA

            February 5, 2015 at 15:58

            *to

  9. BurnZ

    February 5, 2015 at 13:56

    Wow, it uses as much power and gets as hot as my stove at home! wicked so i can sell the stove to help pay for the card, and cook supper on it at the same time! hell ya, two for one baby!

    Reply

  10. CrasH

    February 5, 2015 at 14:16

    Soo… your saying the one the released their card last… after knowing the exact performance of the competitors card… is faster….

    Reply

    • Archzion

      February 5, 2015 at 14:40

      Your English hurts my brain.

      Reply

  11. Ghost In The Rift

    February 5, 2015 at 14:17

    Its gonna be a very interesting Year for nVidia and AMD, just wish they can get a move on with it.

    Reply

  12. Tosh MA

    February 5, 2015 at 14:24

    With that kind of power draw I will need to buy a new PSU just for this card. I could put a 980 in now without touching my PSU.

    Sigh, numbers are great but damn.

    Reply

  13. Kromas Votes LAG WCMovie Event

    February 5, 2015 at 14:40

    New AMD slogan will up sales quite a bit. “We don’t lie to our customers.”

    Reply

  14. Ardz

    February 5, 2015 at 14:43

    I hope AMD will go with the hybrid cooling system. Or a Non reference R9 290/290X cooler. They were god awful. Hitting 90 degrees on a card whilst hearing a jet engine was not an ideal gaming experience. Especially when you game without a headset.
    I live on the Gold Coast, Australia. It gets too hot in the summer to wear a headset.

    Reply

    • Spaffy

      February 5, 2015 at 15:20

      What’s it like playing survivor everyday? 😛

      Reply

  15. Pieter Kruger

    February 5, 2015 at 15:25

    All water-cooled AMD cards now come with a free kettle whistle…… ????

    Reply

    • Blood Emperor Trevor

      February 5, 2015 at 19:56

      See, they CAN make coffee too 😀

      Reply

  16. rockstarfish

    February 5, 2015 at 17:24

    7 watts more than 290x and more than 50% faster. Yes please! This makes maxwell “efficentcy” look like a joke.

    Reply

  17. Ironbunny IonBunny

    February 5, 2015 at 17:58

    I bet they’re going to demolish 980 with 370 or 370x that uses 200w ish and costs 100 bucks less.

    Reply

  18. HairyEwok

    February 5, 2015 at 18:14

    And then pascal comes out again and Nvidia rains supreme for a couple of months….. It’s the same bloody cycle each year…
    Remember these 300x series have High Bandwidth Memory (HBM) which makes a huge difference in performance, and Nvidia cards have none of that now. You will only be able to compare apples against apples when pascal comes out with HBM as well.

    Reply

  19. Zander Boshoff

    February 5, 2015 at 19:40

    So get your brand new R9 380x, load crysis to benchmark, Boom stage 3 load shedding

    Reply

  20. Wraith

    February 6, 2015 at 09:08

    Those temps don’t look too bad actually. +/- 70 degrees is actually pretty good, especially if this is the AMD reference cooler. Third-party cooling would mostly bring that down even lower.

    Reply

  21. sath

    February 7, 2015 at 16:35

    This questions seems very redundant the 980 has been out for a while and its performance is well known, AMD would have to be brain dead to release a flagship weaker than the 980.

    Reply

  22. SaucyJack42

    February 10, 2015 at 02:34

    Well, we all know who signs your paychecks now. AMD has had the crown since April 2014 with the 295×2 (hilariously included in the article’s benchmarks) and not a single offering from Nvidia can match it (and its reasonably priced on top of that). Stock has been building up for AMD GPUs because of the hype over HBM (whether true or not). People don’t want to upgrade with a major new technology entering the field. Myself included.

    Reply

  23. canopus72

    February 19, 2015 at 16:00

    The author of this article is wrong about specs for 380X. It will NOT have 4096 stream processors. the 380X will feature a rebranded and overclocked Hawaii gpu (as found in 290/X) which has 2816 stream processors. The rebranded Hawaii gpu will be paired to the new tech 2.5D HBM. This new gen of memory is up to 9x faster than standard GDDR5 RAM and it is due to the new HBM, that the 380X is giving amazing performance that is destroying the 980GTX.

    The 390/X will feature a brand new gpu called FIJI-XT (390X) and FIJI-PRO (390). The 390 variant will be the same as the 390X but with lesser stream processors. It is the 390X that will have 4096 stream processors and I think the 390 will have about 3800 stream processors.

    Reply

  24. canopus72

    February 19, 2015 at 16:09

    As for power, well the TDP for 380X is 300W and the TDP for 290X is 290W. Not that big a difference and I reckon the 380X will come with standard air coolers and perhaps an AIO liquid cooled variant. The 390X however, will come with an AIO liquid cooled solution. I have a feeling the 390X may even beat the 295X2 in benchmarks by a decent margin. I would like to know what the TDP and RRP will be for the 390X, but if the leaked shipping reports from the Mumbai port is correct, then it will cost INR 760,000 (£760). A premium gfx card at a premium price.

    Reply

