
MSI Radeon R9 280X Gaming 6 GB

Discussion in 'Reviews' started by W1zzard, May 12, 2014.

  1. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    15,043 (3.88/day)
    Thanks Received:
    11,951
    Last edited by a moderator: May 26, 2014
    Ikaruga, MrGrammaX and Mathragh say thanks.
  2. Hilux SSRG

    Hilux SSRG

    Joined:
    May 1, 2012
    Messages:
    1,023 (1.06/day)
    Thanks Received:
    170
    Location:
    New Jersey, USA
    Nice review.

    $120 extra for a 50 MHz base clock bump and 3 GB more memory is a huge waste of money. The price can't be right.
     
    Aquinus and techy1 say thanks.
  3. Suka

    Suka

    Joined:
    May 6, 2013
    Messages:
    54 (0.09/day)
    Thanks Received:
    7
    Location:
    Steam Servers
    spot on.
     
  4. Sony Xperia S

    Joined:
    Mar 28, 2014
    Messages:
    231 (0.85/day)
    Thanks Received:
    22
    It is the almost three-year-old Tahiti, not Hawaii. :)

    "AMD's Hawaii graphics processor uses the GCN shader architecture. It is produced on a 28 nm process at TSMC, Taiwan, with 6.2 billion transistors on a 438 mm² die."

    http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming_6_GB/4.html

    :(

    Man, progress with GPUs is terrible. Moore's "law" is long gone, dead and buried. :)
     
  5. Fluffmeister

    Fluffmeister

    Joined:
    Dec 22, 2011
    Messages:
    730 (0.66/day)
    Thanks Received:
    211
    Again, people insist on overstating VRAM requirements. Sure, more VRAM isn't a bad thing, but I would say even 6 GB on a 780 Ti would be overkill.
     
  6. HumanSmoke

    HumanSmoke

    Joined:
    Sep 7, 2011
    Messages:
    1,537 (1.27/day)
    Thanks Received:
    547
    4K yo

    Nothing says future-proofing like a three-year-old GPU using an output that is about to be superseded by DisplayPort 1.3/HDMI 2.0, and one it couldn't drive at 4K60 anyway (assuming firmware were offered) without a guarantee that its pixel clock could run reliably at 600 MHz (for 60 Hz operation).
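    For reference, here's where that ~600 MHz figure comes from; a minimal back-of-the-envelope sketch, assuming the standard CEA-861 4K timing of 4400 x 2250 total pixels including blanking (an assumption on my part, not something stated in the review):

    ```python
    # Rough pixel-clock estimate for 3840x2160 @ 60 Hz.
    # Totals of 4400 x 2250 (active + blanking) follow CEA-861 timing -- an assumption here.
    active_only = 3840 * 2160 * 60      # ~497.7 MHz worth of visible pixels per second
    with_blanking = 4400 * 2250 * 60    # 594 MHz once blanking intervals are included

    print(f"active pixels only : {active_only / 1e6:.1f} MHz")
    print(f"with blanking      : {with_blanking / 1e6:.1f} MHz")  # ~594 MHz, hence the ~600 MHz requirement
    ```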
     
  7. Tsukiyomi91

    Tsukiyomi91

    Joined:
    Feb 18, 2013
    Messages:
    86 (0.13/day)
    Thanks Received:
    19
    Location:
    KL, Malaysia
    I doubt that a soon-to-be three-year-old chip that's being "reused" is going to be a good investment for "future proofing". The benchmarks already show that having more video RAM does not yield higher frame rates, and I agree that spending an extra $120 for a minor core speed bump (a measly 50 MHz) and 3 GB of extra video RAM is a waste of money (and time). I'd rather spend my dough on a GTX 780 Ti 3GD5, since it has 1.) a full-blown GK110 chip, 2.) 7 GHz of effective memory speed, 3.) a lowered optimal temperature range for GPU Boost to kick in, and lastly 4.) it kicks the AMD R9 290X's arse in many ways despite being $100 more.
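    (As a quick sanity check of that "7 GHz effective" figure, here's the bandwidth it works out to; the 384-bit bus width is the 780 Ti's published spec, assumed here rather than taken from this review:)

    ```python
    # GTX 780 Ti memory bandwidth from published specs (assumed, not from this review):
    # 7 Gbps effective per pin across a 384-bit bus, divided by 8 bits per byte.
    effective_gbps = 7        # GDDR5 effective data rate per pin
    bus_width_bits = 384      # memory interface width

    bandwidth_gbs = effective_gbps * bus_width_bits / 8
    print(f"{bandwidth_gbs:.0f} GB/s")   # -> 336 GB/s
    ```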
     
  8. HumanSmoke

    HumanSmoke

    Joined:
    Sep 7, 2011
    Messages:
    1,537 (1.27/day)
    Thanks Received:
    547
    It's sarcasm. The fact that I referenced Tahiti's nominal 400 MHz pixel clock not being able to reach 600 MHz (as is the case with Hawaii) for 4K60 operation should have been a big tell.
     
  9. techy1

    Joined:
    Jan 20, 2014
    Messages:
    100 (0.29/day)
    Thanks Received:
    12
    Nice - a 280X (that is barely better than any other 280X) for the price of an R9 290 - well... thx MSI, but no thx.
     
  10. walterg74 New Member

    Joined:
    May 24, 2014
    Messages:
    2 (0.01/day)
    Thanks Received:
    0
    I like the reviews here, but seriously:

    Pro: 6 GB of RAM
    Con: Extra RAM doesn't improve performance...

    So wth? That's a little (actually a lot) contradictory...

    Trying to decide best value/perf between these: 270X/760/280X/770, to replace my old HD4870.

    Don't wanna go higher than that, since later on I will just get a new build with either an i5 4670K or an i7 4770K, etc. (currently running an older Phenom II X4 3.0 GHz BE on a Gigabyte 790X series motherboard, and still DDR2 RAM...)
     
  11. MrGrammaX New Member

    Joined:
    May 24, 2014
    Messages:
    1 (0.00/day)
    Thanks Received:
    0
    Meh.. the R9 290 4 GB ($400) is a better choice :)
     
  12. Sony Xperia S

    Joined:
    Mar 28, 2014
    Messages:
    231 (0.85/day)
    Thanks Received:
    22
    R9 270X, or even better the R7 260X (at 1920×1080). :)

    http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming_6_GB/26.html
     
  13. pky New Member

    Joined:
    May 24, 2014
    Messages:
    3 (0.01/day)
    Thanks Received:
    0
    What's more interesting than the 6 GB of memory is the cooler, which is much quieter and more effective than the one on the 3 GB MSI Gaming. It would be best if they made a 3 GB model with the updated cooler at the same price tag as the previous one.
     
  14. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    15,043 (3.88/day)
    Thanks Received:
    11,951
    Yup, that would be nice. Or upgrade the existing 3 GB model with the design of the 6 GB card.

    People have different requirements, so if you play nothing but Skyrim with texture mods you might actually need 6 GB (I'd rather suggest playing a different game :D )
     
  15. GhostRyder

    GhostRyder

    Joined:
    Apr 29, 2014
    Messages:
    1,486 (6.17/day)
    Thanks Received:
    604
    Location:
    Texas
    You only really need the extra 3 GB of VRAM for 4K. So if you want to invest in 3-4 of these cards for 4K you will at least have enough RAM, but unless the game you're playing scales well enough to make use of a 3-4 card setup it becomes a waste. Plus, as seen here, you might as well grab the R9 290; it will be enough for 4K at the same price in the end, with more performance to boot.

    lol, so many ways with that game to wreck your computer. I think Skyrim became an experiment to see how far they could make a computer cry before the game became unplayable.
     
  16. alwayssts

    alwayssts

    Joined:
    May 13, 2008
    Messages:
    390 (0.16/day)
    Thanks Received:
    87
    QFT. That said, cards like this signal that 4 Gbit RAM is getting ready for primetime, which is a big deal (and, I think, largely the takeaway from this product).

    Now, I don't know if Tonga (iirc my codenames) will have either of those connectivity options (probably not), but I would be willing to bet its pixel clock is fixed and there are 8-chip 4 GB models from the get-go, regardless of the big push in the rumor mill being a 2 GB reference card (implying AMD wants them to make a big splash for typical 1080p users).
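    (For illustration, the chip-count math behind an "8-chip 4 GB" card; a rough sketch assuming a 256-bit bus and standard 32-bit GDDR5 chip interfaces, where the bus width is my assumption rather than a confirmed spec for that part:)

    ```python
    # Generic GDDR5 layout arithmetic -- assumptions, not a confirmed board design.
    bus_width_bits = 256      # assumed bus width
    chip_iface_bits = 32      # each GDDR5 chip has a 32-bit interface
    chips = bus_width_bits // chip_iface_bits          # -> 8 chips

    for density_gbit in (2, 4):                        # 2 Gbit vs 4 Gbit chip densities
        capacity_gb = chips * density_gbit / 8         # Gbit -> GB
        print(f"{chips} x {density_gbit} Gbit = {capacity_gb:.0f} GB")
    # -> 8 x 2 Gbit = 2 GB, 8 x 4 Gbit = 4 GB
    ```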

    A couple of those cards, at the right price, could be kinda-sorta interesting versus Hawaii, just as 2x Pitcairn was to Tahiti, although in the latter example 4 GB was a very rare sight.

    On a side note, I'm glad to see (at least with this sample) that 4 Gbit Elpida chips are not as crappy as some of their 2 Gbit offerings. Perhaps their binning standards changed with new products because of the ability to differentiate chips for new standards (i.e. low-power), perhaps the chips themselves are just so new (implied by the price) that higher-leakage parts might flow into lower-end products (as is usually true in the beginning) before they are eventually binned more thoroughly, or perhaps they were just tired of being crapped on by geeks versus Samsung/Hynix. I'm always glad W1z calls them out in his reviews. Who knows in this case, but if repeated enough times, things like that can have an effect on products. :lovetpu:
     
    Last edited: May 26, 2014
  17. HumanSmoke

    HumanSmoke

    Joined:
    Sep 7, 2011
    Messages:
    1,537 (1.27/day)
    Thanks Received:
    547
    The 400 MHz pixel clock is a holdover from the old RAMDAC standard. Since VGA has gone the way of the dodo, and with the advent of faster-than-60 Hz monitors, I wouldn't think keeping the old 400 MHz limit has any relevance for today's cards.
    Probably safe to say that even if the 2 GB model is the reference standard, you should see 4 GB cards at the same time, unless AMD sees the 4 GB variant as the salve for AIBs forced to toe the line with a reference cooler for a few months. One of the truly bizarre business strategies of AMD seems to be the "reference only" board at launch. You might have thought they'd have learned by now that a site will only do a single review for a reference board (maybe two if they do a separate CrossFireX review), yet Nvidia reaps the full benefit of PR on day one thanks to a slew of DCII, Windforce, Jetstream, TwinFrozr, and SuperClockedFTWiChillAMP'ed versions.
    Well, with GDDR5 turning into the new DDR3 with the advent of HBM / Wide I/O / HMC and DDR4/GDDR6, maybe the incentive to lower prices to keep the old production lines running might make 4 GB of GDDR5 cost effective. 5 Gbps chips are already (seemingly) dirt cheap, so there's no reason 7 Gbps can't follow suit over the next couple of years.
     
  18. GhostRyder

    GhostRyder

    Joined:
    Apr 29, 2014
    Messages:
    1,486 (6.17/day)
    Thanks Received:
    604
    Location:
    Texas
    It's mostly because it's what's necessary; more RAM has become a huge requirement since we are jumping up to higher resolutions faster than the hardware has a chance to evolve. We never really even got a full introduction to the world of 1440p/1600p before 4K was the next big thing.

    It will probably become a new standard for at least the middle-ground offerings to have at least 4 GB, but it's going to become one of those standards just to handle new-gen games, and 1080p will soon be considered a lowball in the gaming world.

    In the end though, the 6 GB seems like a necessity while being a waste at the same time, at least with this price in mind. You might as well just buy the 4 GB R9 290 and gain a nice performance advantage while still being able to handle the current lineup of resolutions (if you invest in CFX).

    Oh, I'm with you on that; I hate the sneaky random assortment of RAM you can end up with. It's nice to be able to see what a card has on it so you can at least assume it's going to contain that grade of RAM, especially when people like to overclock their RAM to the limits. I'm surprised, though, because Elpida has taken such a hit with how bad their chips have been for the overclocking community; I guess I got lucky with my cards and didn't run into any.
     
  19. techy1

    Joined:
    Jan 20, 2014
    Messages:
    100 (0.29/day)
    Thanks Received:
    12
    I like these discussions about how extra VRAM could give a real advantage in "some scenarios" and most definitely in "4K"... Now let's look at real life (because W1zz made real-life tests for us): 1) is there any advantage from the extra memory? 2) are there any benefits at the highest settings and/or higher resolutions? 3) are there any gains at 4K (there is 4K in this test!!!)? The answer to all of these questions is NO, NO and NO... So where does someone get the idea that extra VRAM could give an advantage, if the tests show otherwise?
     
  20. pky New Member

    Joined:
    May 24, 2014
    Messages:
    3 (0.01/day)
    Thanks Received:
    0
    @techy1 Dude, it's not all about frames per second. Resolution and settings affect the image you see on the screen. Example image. If you just care about the FPS, then you can play on lower res/settings, but people want to play the game as it's meant to be played, not to see blurriness everywhere.
     
  21. SmokingCrop

    Joined:
    Nov 16, 2013
    Messages:
    25 (0.06/day)
    Thanks Received:
    8
    Go play at the lowest resolution on the integrated GPU of your CPU then.
    k, tnx bai. <3
     
  22. techy1

    Joined:
    Jan 20, 2014
    Messages:
    100 (0.29/day)
    Thanks Received:
    12
    "Go play on the lowest resolution with the integrated gpu of your cpu then.
    k, tnx bai. <3" - only if my iGPU would have doulbe amout of vram I should be fine, right? :D

    "@@techy1 Dude, it's not all about frames per second. Resolution and settings affect the image you see on the screen. Example image. If you just care about the FPS, then you can play on lower res/settings, but people want to play the game as it's meant to be played, not to see blurriness everywhere."
    I am talking about reallife tests (that are made on high settings - all equal for all tested GPUs - you can read description) and there double vRam did not show any benefit whatsoever... it is not that wizz tests "double vRam" GPU's on high settings and normal GPU's un normal settings
     
  23. pky New Member

    Joined:
    May 24, 2014
    Messages:
    3 (0.01/day)
    Thanks Received:
    0
    Oh, I got it... I guess I misunderstood your previous post. :)
     
  24. thebluebumblebee

    thebluebumblebee

    Joined:
    Jul 2, 2008
    Messages:
    3,319 (1.40/day)
    Thanks Received:
    1,867
    o_O
    Listed price: $400
    Price today at Newegg: $330. After MIR: $299.
     
    Crunching for Team TPU More than 25k PPD
  25. GhostRyder

    GhostRyder

    Joined:
    Apr 29, 2014
    Messages:
    1,486 (6.17/day)
    Thanks Received:
    604
    Location:
    Texas
    Well, at that price it's actually a pretty good deal if you want to run a multi-card setup at a high resolution!
     
