Friday, November 9th 2018

XFX Radeon RX 590 Fatboy Smiles For The Camera

The flood of leaked AMD Radeon RX 590 graphics cards continues, with the latest coming from XFX. Sporting a new naming scheme, the XFX Radeon RX 590 Fatboy is very similar to the RX 580 GTS series. It features the same dual-fan cooler used on the RX 580 GTS, which takes up roughly 2.5 slots. Even the backplate remains unchanged, meaning that side by side you wouldn't be able to tell the two apart until you look at power delivery, which is where the designs diverge: the RX 590 Fatboy utilizes an 8+6 pin configuration compared to the single 8-pin design of the RX 580 GTS series. Display outputs remain the same between the two, with three DisplayPorts, one HDMI port, and one DVI-I port being standard.

When it comes to clock speeds, the XFX RX 590 Fatboy OC+, at least according to Videocardz, will come with a 1600 MHz boost clock. That is an increase of roughly 200 MHz over XFX's highest-clocked RX 580. With such a high boost clock, the additional 6-pin power connector is likely included for improved power delivery and, depending on luck, may allow for more overclocking headroom. Considering no vendor releases just one version of a graphics card, it is likely that a few more variants will be available at launch. Sadly, no pricing information is available as of yet.
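For context, the spec-level power math behind those connector configurations is simple addition. A minimal sketch (connector limits are the PCI-SIG specification values; the function name is just for illustration):

```python
# Spec-level power budget for a PCIe graphics card (PCI-SIG limits;
# board partners can and do exceed these in practice).
PCIE_SLOT_W = 75   # x16 slot limit
PIN6_W = 75        # 6-pin auxiliary connector
PIN8_W = 150       # 8-pin auxiliary connector

def max_board_power(pin8=0, pin6=0):
    """Theoretical spec ceiling for a card's power draw, in watts."""
    return PCIE_SLOT_W + pin8 * PIN8_W + pin6 * PIN6_W

print(max_board_power(pin8=1, pin6=1))  # RX 590 Fatboy, 8+6 pin: 300 W
print(max_board_power(pin8=1))          # RX 580 GTS, single 8-pin: 225 W
```

By this math, the extra 6-pin lifts the Fatboy's theoretical ceiling from 225 W to 300 W, which lines up with the headroom argument above.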
Source: Videocardz

60 Comments on XFX Radeon RX 590 Fatboy Smiles For The Camera

#2
dj-electric
That is by far the best name for a GPU I've ever heard.

I love you, XFX
Posted on Reply
#3
bug
Cool, this already matches the RTX 2080. In power connectors :D
Posted on Reply
#5
eidairaman1
The Exiled Airman
Call it trislot, just like a VaporX/Nitro or Strix card
Posted on Reply
#6
xkm1948
AMD is gonna ask at least $300 for this
Posted on Reply
#7
RealNeil
It's fat... like my Red Devil Vega-64.

Posted on Reply
#8
bug
RealNeil said:
It's fat... like my Red Devil Vega-64.

But this is weird. PCIe power spec hasn't changed, so these can't draw more power than cards did five years ago. Why the oversized cooling? Is it just to shave off a few dB?
Posted on Reply
#9
INSTG8R
My Custom Title
I like the name but hate the look of the cooler.
Posted on Reply
#10
theoneandonlymrk
Quite disappointed there is only one HDMI; I won't be buying one, though I'm intrigued by that clock speed. I think it's a good job Nvidia are nearly out of 1080s.
Posted on Reply
#11
micropage7
In the real world a fat boy can't run fast :D
Posted on Reply
#12
EntropyZ
So this is going to draw more power than an overclocked non-Founders GTX 1070 while having the same number of PCI-E connectors and more current draw, and still end up being slower than that card in most games. Hmmmm.

Why do I get the creeping feeling Navi isn't going to be an impressive improvement over Vega? While undoubtedly both are more future-proof, I have to question why they can't finally move off the GCN arch, which needs more power to get the same performance as Nvidia. I think RTG needs to take a risk like the CPU division did with Ryzen and put all their cards on the table. Trouble is... they are losing talent and money over there.
Posted on Reply
#13
RealNeil
bug said:
But this is weird. PCIe power spec hasn't changed, so these can't draw more power than cards did five years ago. Why the oversized cooling? Is it just to shave off a few dB?
The PCI-E power spec governs how much power the card draws from the PCI-E slot it is plugged into (a standard amount on all boards). The PCI-E power cables from your PSU supplement that, depending on what your GPU's real requirements are.
The combination of both may require more robust cooling.

I have a GTX 980 Ti that uses two 8-pin and one 6-pin power connectors, as well as what the PCI-E bus provides.

My two GTX 1080s have just one 8-pin each.
Posted on Reply
#14
Lionheart
EntropyZ said:
So this is going to draw more power than an overclocked non-Founders GTX 1070 while having the same number of PCI-E connectors and more current draw, and still end up being slower than that card in most games. Hmmmm.

Why do I get the creeping feeling Navi isn't going to be an impressive improvement over Vega? While undoubtedly both are more future-proof, I have to question why they can't finally move off the GCN arch, which needs more power to get the same performance as Nvidia. I think RTG needs to take a risk like the CPU division did with Ryzen and put all their cards on the table. Trouble is... they are losing talent and money over there.
Lol, why comment if you're just going to be whiny & negative while knowing the answers? AMD/RTG has low R&D funding, and even if AMD came out with a high-end line of video cards that beat Nvidia, the sheep would still flock to Nvidia due to mindshare.
Posted on Reply
#15
RealNeil
Lionheart said:
even if AMD came out with a high-end line of video cards that beat Nvidia, the sheep would still flock to Nvidia due to mindshare.
I have both brands here. The way I see it, if either company wants loyalty, they should buy a dog.
Posted on Reply
#16
R0H1T
RealNeil said:
I have both brands here. The way I see it, if either company wants loyalty, they should buy a dog.
All brands want loyalty; that's part of the reason the likes of Apple/Intel/Nvidia churn out record profits year after year. I'd add Tesla in there as well, but it's unclear whether their profit streak will last more than a single quarter. It's (always) up to the buyers to ensure that they aren't beholden to one brand just because it's their favorite.
Posted on Reply
#17
lexluthermiester
R0H1T said:
All brands want loyalty
This is true. All brands want brand-loyalty to ensure some level of return business.

As for this card, I think the cooling solution implies that the card will OC/boost itself into performance ranges that require the oversized heatsink/fan combo used.
Posted on Reply
#18
Vya Domus
EntropyZ said:
I have to question why they can't finally move off the GCN arch, which needs more power to get the same performance as Nvidia.
You don't have to ask that, because that's not the case; there is nothing about GCN that makes it inherently use more power. AMD has been stuck on an inferior node; Vega 20 is proof of that. That's all there is to it.

EntropyZ said:
Trouble is... they are losing talent and money over there.
Here's something I don't understand about you RTG haters: how come AMD's GPU division is in such an appalling state according to you all, yet they actually have "talent and money" to lose? Are they failing hard, or do they have great talent there? Which is it? You can't have both.

RealNeil said:
The way I see it, if either company wants loyalty, they should buy a dog.
They get awfully close to that, actually.
Posted on Reply
#19
EntropyZ
Vya Domus said:
You don't have to ask that, because that's not the case; there is nothing about GCN that makes it inherently use more power. AMD has been stuck on an inferior node; Vega 20 is proof of that. That's all there is to it.
14 nm has existed for a long time, and the industry has been using it to better effect. I don't think some process optimization or die shrink is going to give them a decent advantage. They're testing the waters with 14nm++ or 12nm and either succeeding or just wasting their time. They should have been working on Navi ever since Polaris came out; if RTG didn't do their homework, well, everyone knows what happens.

Vya Domus said:
Here's something I don't understand about you RTG haters: how come AMD's GPU division is in such an appalling state according to you all, yet they actually have "talent and money" to lose? Are they failing hard, or do they have great talent there? Which is it? You can't have both.
I'm not a hater; those are the people who just type "amd sucks". I'm being critical of their decisions. They're doing slightly better than before, but nowhere near where they should be if they want to compete. I don't like them being a whole GPU arch behind all the time; why are they waiting so long to change that? Maybe RTG is going for a war of attrition by staying behind Nvidia for a while, since Moore's Law is dead. Since you mentioned Vega 20, their next releases are a stopgap before Navi, which is supposed to impress everyone. But in the meantime, most people will still keep it in their heads that RTG doesn't have something better than the competition, or that they were too late to be there when people needed an upgrade. (Let's not forget mining gave everyone a good slap.)

Anyway, I never said they have someone outstanding there, but clearly some companies would want their engineers working on something better, so they jumped ship. And money-wise they are playing it safe as far as we can tell, right? But when they look at their less-than-impressive revenue, I imagine they take it as a loss. RTG isn't failing; they're just a disappointment in the eyes of the average consumer. First impressions matter, and only the naive will buy something on a company's promise that their product will get better later on; it's almost like pre-ordering (and I am strict on pre-orders). They were too late. Vega and Polaris are pretty strong now, nobody should deny that, but the people who wanted those GPUs a long time ago are most likely happily gaming on their Pascals, including me. It's a big shame.

As mentioned before mindshare is playing a big part on what is happening right now. Which, let's not forget, also affects game optimizations in a lot of cases.

I only know what is already fact and hearsay, plus my own prediction of the future. You can point out where I am terribly wrong. Real talk: what I want to say is that they just haven't been doing enough since 2012, or maybe they overhyped everything, which made anything less competitive seem worse than it really was. We need them to come out on top, for consumers. I want to wake up one morning, see something that is good on launch day and priced at what the previous generation cost, and have my mind blown. A reasonable expectation, I hope?

Instead we get 10-25% improvements while paying double; that's where we are heading at this rate. People are expecting the RX 590 to be twice as powerful as an RX 570 in the best possible scenario; otherwise no sane person (maybe someone with a $200-400 budget) will upgrade if the improvement over previous generations is small. The people who bought the high-end stuff can easily skip a generation today unless they want that sweet, sweet 4K at 60 FPS. In the end, the majority vote with their wallets.

While this doesn't affect my life in any significant way, I have to see the bigger picture: this is setting a precedent for the industry. Well... at least re-brands aren't happening... for now, but they'll be back.
Posted on Reply
#20
Turmania
I would have preferred the new card be done on a 7 nm process, to squeeze out more speed and land power consumption somewhere between a 1060 and a 1070. But I don't think many would complain, as most are waiting for next year's Navi. I'm waiting for Sapphire's Nitro version.
Posted on Reply
#21
oxidized
ZoneDymo said:
dat boi so fat
dat boi so ugly
Posted on Reply
#22
Kamgusta
bug said:
But this is weird. PCIe power spec hasn't changed, so these can't draw more power than cards did five years ago. Why the oversized cooling? Is it just to shave off a few dB?
The latest AMD cards violate the PCI-E specification, drawing much more power than allowed. This also results in cases where the PCI-E slot voltage drops significantly, from 12 V to 11.5 V.
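One side effect of that droop, worth spelling out: at a fixed power draw, a lower rail voltage means proportionally more current (I = P / V). A hypothetical sketch with illustrative figures only:

```python
# At fixed power, current rises as the 12 V rail droops (I = P / V).
def amps(power_w, rail_v):
    """Current in amperes for a given power draw and rail voltage."""
    return power_w / rail_v

# Illustrative numbers: the 75 W spec limit pulled through the slot.
print(round(amps(75, 12.0), 2))   # 6.25 A at a nominal 12 V
print(round(amps(75, 11.5), 2))   # 6.52 A at a drooped 11.5 V
```

So a droop to 11.5 V pushes a few percent more current through the slot traces for the same power, on top of any out-of-spec draw.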
Posted on Reply
#23
Imsochobo
theoneandonlymrk said:
Quite disappointed there is only one HDMI; I won't be buying one, though I'm intrigued by that clock speed.
Use DisplayPort; you can convert DP -> HDMI if your device doesn't support DP, but if it has DP, use it!
Posted on Reply
#25
theoneandonlymrk
EntropyZ said:
So this is going to draw more power than an overclocked non-Founders GTX 1070 while having the same number of PCI-E connectors and more current draw, and still end up being slower than that card in most games. Hmmmm.

Why do I get the creeping feeling Navi isn't going to be an impressive improvement over Vega? While undoubtedly both are more future-proof, I have to question why they can't finally move off the GCN arch, which needs more power to get the same performance as Nvidia. I think RTG needs to take a risk like the CPU division did with Ryzen and put all their cards on the table. Trouble is... they are losing talent and money over there.
Lack of understanding much?

Different arch, different tactics. AMD has one arch for many markets; no gaming-specific arch.

Nvidia has multiple archs tiered for gaming alone and a separate design for data.

GCN does it all; Nvidia makes special bits to do special shit, i.e. 64- or 32-bit CUDA cores, Tensor cores.

And for power, the RTX cards suck plenty. Get off that high horse; the Nvidia fanboy days of riding power comments are over.
Posted on Reply