Wednesday, May 27th 2015

AMD Fiji XT Pictured Some More

In the latest leaked picture of AMD's upcoming flagship graphics card, codenamed "Fiji XT," we get final confirmation of the reference-design card's length, particularly its short PCB. Since this card uses a factory-fitted AIO liquid-cooling solution, and since the Fiji XT package is effectively smaller than that of Hawaii, with the surrounding memory chips gone (moved onto the GPU package as HBM stacks), the PCB is extremely compact, holding just the GPU package and its VRM. Speaking of which, the card draws power from a pair of 8-pin PCIe power connectors. The coolant tubes exit the rear of the card and run to a 120 x 120 mm radiator with a single included 120 mm PWM fan. With this card, AMD is doing away with DVI altogether; the outputs will be a mixture of DisplayPort 1.2a and HDMI 2.0.
AMD teased this video on its Twitch channel.

Source: OCN Forums

57 Comments on AMD Fiji XT Pictured Some More

#26
Fluffmeister
One thing is for sure, AMD will always be the PR slide king... that's a safe job right there.
#27
Assimilator
UbersonicThe only reason cards still had DVI ports was because they were necessary for connecting VGA adaptors (you can't go direct from HDMI/DP to VGA with a passive adapter like you can with DVI-I; it requires an active converter), but since AMD ditched DVI-I ports on the R9 290 in favour of digital-only DVI-D, they became superfluous, as you can go direct from HDMI/DP to DVI-D.
If you intend to connect a monitor that only supports VGA to a high-end graphics card, you need to be shot.
FluffmeisterOne thing is for sure, AMD will always be the PR slide king... that's a safe job right there.
If only their products were half as good as their marketing.
#28
Ubersonic
AssimilatorIf you intend to connect a monitor that only supports VGA to a high-end graphics card, you need to be shot.
You would be surprised how many people use old/bad monitors as secondary screens where resolution/quality/etc. are irrelevant. I know at least two guys who game on decent gaming monitors and use VGA screens beside them for non-gaming stuff (looking up something in a web browser or watching the football during a game, IM windows, etc.); one uses a 1280x1024 VGA panel, the other a 1600x900 VGA panel. I know dozens more who use DVI screens that are no better than VGA.
#29
looniam
UbersonicYou would be surprised how many people use old/bad monitors as secondary screens where resolution/quality/etc. are irrelevant. I know at least two guys who game on decent gaming monitors and use VGA screens beside them for non-gaming stuff (looking up something in a web browser or watching the football during a game, IM windows, etc.); one uses a 1280x1024 VGA panel, the other a 1600x900 VGA panel. I know dozens more who use DVI screens that are no better than VGA.
to be fair, a lot of socket 115x motherboards have either d-sub or dvi, if not both, for a second screen. that's how my "media viewing screen" is connected. however, this better have a working DP/DVI adapter in the box, because i can't see spending additional $$$ until i have also upgraded my monitor for this. (still 1080/60Hz)
#30
chinmi
No DVI = no VGA :(
This is a no-buy for me, I prefer my GTX 970 atm; it's fully compatible with my SVGA CRT monitor :)
#31
ZoneDymo
UbersonicYou would be surprised how many people use old/bad monitors as secondary screens where resolution/quality/etc. are irrelevant. I know at least two guys who game on decent gaming monitors and use VGA screens beside them for non-gaming stuff (looking up something in a web browser or watching the football during a game, IM windows, etc.); one uses a 1280x1024 VGA panel, the other a 1600x900 VGA panel. I know dozens more who use DVI screens that are no better than VGA.
Yeah, they use old stuff that still happens to work, because why get anything newer if it works?
That is a mentality some have, but certainly not one a high-end graphics card maker seeking the edge of current tech should keep in mind.

There are adapters if you want to keep using the old gear that still works; there always are, for everything.
And otherwise just buy a second-hand simple LCD screen for 30 bucks or so; we are not talking huge investments here to get something slightly newer that will work. And let's be honest, if you are going to spend 600 or so dollars on a graphics card, I'm sure an extra 30 won't be a big deal.
#32
ZoneDymo
chinmiNo DVI = no VGA :(
This is a no-buy for me, I prefer my GTX 970 atm; it's fully compatible with my SVGA CRT monitor :)
Seems odd to me that you would even consider upgrading so quickly after getting a card from the current gen (i.e. recently).
Also, there are HDMI-to-VGA converters.
HDMI and DVI are pretty much the same thing.
#34
Brusfantomet
ZoneDymoSeems odd to me that you would even consider upgrading so quickly after getting a card from the current gen (i.e. recently).
Also, there are HDMI-to-VGA converters.
HDMI and DVI are pretty much the same thing.
An HDMI to VGA adapter needs to be active, though. The passive DVI to VGA adapters require the analog DVI-A pins to work, and HDMI only copied the digital part of DVI.

Had it not been for the VRM on the card, a universal water block would have worked wonders for the Fiji card.
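The passive-vs-active adapter rule Brusfantomet describes can be summarized in a small sketch. This is an illustrative simplification of my own, not from the thread; it deliberately omits DisplayPort, whose passive HDMI/DVI adapters additionally depend on dual-mode (DP++) support on the source:

```python
# A passive adapter just rewires pins, so it only works when both connectors
# carry a compatible signal type. VGA is analog-only, so any digital-only
# source (HDMI, DVI-D) needs an active converter to reach it.

# Signal types carried by each connector (simplified, illustrative).
SIGNALS = {
    "DVI-I": {"digital", "analog"},
    "DVI-D": {"digital"},
    "DVI-A": {"analog"},
    "HDMI":  {"digital"},
    "VGA":   {"analog"},
}

def adapter_type(source: str, sink: str) -> str:
    """Passive if source and sink share a signal type, otherwise active."""
    return "passive" if SIGNALS[source] & SIGNALS[sink] else "active"

print(adapter_type("DVI-I", "VGA"))   # passive: DVI-I carries the analog pins
print(adapter_type("HDMI", "VGA"))    # active: digital-only source, analog sink
print(adapter_type("HDMI", "DVI-D"))  # passive: both digital
```

Which is exactly why dropping DVI-I on the R9 290 made the cheap VGA dongles stop working: the analog signal they passed through simply is not there anymore.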
#35
Ubersonic
BrusfantometHad it not been for the VRM on the card a universal water block would have worked wonders for the Fiji card.
I actually expect to see a couple of people use universal blocks on this card combined with a spare motherboard VRM block. That would have been stupid previously as it would leave the VRAM bare, but now that's no longer an issue...
#36
KarymidoN
My only concern is the power consumption of these cards... Performance should not disappoint, but I don't expect it to beat NVIDIA by much; at least we will have competition.
#37
Captain_Tom
xkm1948I am seriously wishing it not gonna go over 800 dollars.
All rumors are saying $850 right now.
#38
GhostRyder
People, to those complaining about the lack of a DVI port on the card: you do realize we have DisplayPort adapters to just about any type of port you could possibly want, including DVI and VGA. I am personally glad to see it go and make room for just DP and HDMI all on one row, as that makes for a much cleaner card and allows easier single-slot conversion for those who would want it.

Personally I think the card looks great. Dual 8-pins is a bit of foreshadowing on its power consumption, unless they just chose to add another 8-pin for the heck of it (which I doubt), but as long as the performance holds then I am fine with it. We will just have to wait and see if the rumors and hype are all true...

Last, just give us an official announcement with specs and such already!
#39
64K
imo the two 8-pins don't necessarily mean a lot. I know this gives the possibility of up to 375 watts, but I'm guessing the card will use somewhere around 300 watts peak out of the box, and we don't know the overclocking potential with the liquid cooler yet. I think they are just giving some room for overclocking. It's going to be interesting when a review goes up here, hopefully soon, so people can decide.
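64K's 375-watt figure follows from the PCIe spec ceilings (75 W from the x16 slot, 150 W per 8-pin connector). A quick sketch of the arithmetic, using spec limits rather than any measured draw:

```python
# PCIe power-delivery ceilings per the spec, in watts.
SLOT_W = 75        # x16 slot
SIX_PIN_W = 75     # 6-pin auxiliary connector (unused here, for reference)
EIGHT_PIN_W = 150  # 8-pin auxiliary connector

# Rumored Fiji XT configuration: slot plus two 8-pin connectors.
ceiling = SLOT_W + 2 * EIGHT_PIN_W
print(ceiling)        # 375

# Headroom over the guessed ~300 W peak board power.
print(ceiling - 300)  # 75
```

These are connector ratings, not what the card actually draws, so a ~300 W card on this configuration has roughly 75 W of spec headroom left for overclocking, as 64K suggests.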
#40
GhostRyder
64Kimo the two 8-pins don't necessarily mean a lot. I know this gives the possibility of up to 375 watts, but I'm guessing the card will use somewhere around 300 watts peak out of the box, and we don't know the overclocking potential with the liquid cooler yet. I think they are just giving some room for overclocking. It's going to be interesting when a review goes up here, hopefully soon, so people can decide.
I had a similar thought; it would make sense based on where this card is aimed. This also means that the type of PSU needed for this is a bit higher up, which could also be a point they are trying to make (like they did with the 295X2) to keep "low quality" PSUs from being able to power the card (although this is a complete guess on my part and nothing more than that).
#41
moproblems99
Anybody else been to the AMD driver support page recently? They now have a 12 page questionnaire. Topics include what games you play, are you happy with their driver support, CCC, etc. Since the last editorial about AMD driver support was such a hot topic, maybe people should go fill it out so AMD can get direct, constructive feedback about these topics.
#42
EarthDog
64Kimo the two 8-pins don't necessarily mean a lot. I know this gives the possibility of up to 375 watts, but I'm guessing the card will use somewhere around 300 watts peak out of the box, and we don't know the overclocking potential with the liquid cooler yet. I think they are just giving some room for overclocking. It's going to be interesting when a review goes up here, hopefully soon, so people can decide.
300W and a 120mm AIO... yikes... At least it is not as bad as the 295X2 and its 120mm AIO that got hot to the touch. But that is not enough rad to cool 300W much better than air; push/pull is likely needed for really pushing things.

I don't understand, Ghost, what you mean by needing a higher-up PSU, like they did with the 295X2, simply from it using 8-pins. Any PSU that has two PCIe power leads has two 6+2 pins... the quality of the PSU and the number of PCIe connectors are not related.
#43
Ubersonic
Hang on, I didn't realise the orientation before: these images are "fan side" up, so there is no active VRM fan?!

Is it possible the AIO block on the GPU also flows through a VRM block? O.O

If so then TAKE MY MONEY!
#44
GhostRyder
EarthDog300W and a 120mm AIO... yikes... At least it is not as bad as the 295X2 and its 120mm AIO that got hot to the touch. But that is not enough rad to cool 300W much better than air; push/pull is likely needed for really pushing things.

I don't understand, Ghost, what you mean by needing a higher-up PSU, like they did with the 295X2, simply from it using 8-pins. Any PSU that has two PCIe power leads has two 6+2 pins... the quality of the PSU and the number of PCIe connectors are not related.
Well, my wording was pretty bad, but what I meant was keeping the PSU requirements a bit high to avoid off-brands and such that could cause problems on such a "high end" card. I mostly meant that I've met plenty of odd off-brand PSUs rated 500-700 watts that actually only include one 6+2 pin and maybe another 6-pin, so this could again make these more aimed at the enthusiast market, similar to the 295X2, which had special requirements on the PSU, otherwise you ran into problems (but that was mostly because of its ridiculously high power draw over two 8-pin connectors). It was just a random guess, so take it as nothing more than an inference.

My way of thinking was just to avoid the people who say "since I have the connectors, my PSU must be enough to run it". But again, it was just another way of looking at it, not backed by any actual proof; just another guess.
#45
Brusfantomet
UbersonicI actually expect to see a couple of people use universal blocks on this card combined with a spare motherboard VRM block. That would have been stupid previously as it would leave the VRAM bare, but now that's no longer an issue...
Well, GDDR at 1250 MHz does not need active cooling; it might be needed if you up the RAM voltage, but at stock volts and 1250 (effectively 5000) MHz, GDDR runs about as warm as skin. I was running two HD 6950s with a universal block for two years, and the cards still work to this day.
#46
Casecutter
AssimilatorIf you intend to connect a monitor that only supports VGA to a high-end graphics card, you need to be shot.
Exactly! Not "taken out and shot"... shot on the spot, as you imply. :toast:
UbersonicHang on, I didn't realise the orientation before: these images are "fan side" up, so there is no active VRM fan?!

Is it possible the AIO block on the GPU also flows through a VRM block? O.O
As we've yet to see the radiator explicitly, I'm considering that the pump might not be combined with the waterblock. Since there's no fan on the card, the block would have to cool the chip, and perhaps the HBM stacks and VRMs. I don't see all of that plus the pump fitting under that shroud, though I'm not saying it couldn't be done. This time I could conceivably imagine something other than a traditional AIO cooler: a plain waterblock, with the pump built into the radiator tank.
#47
GhostRyder
CasecutterExactly! Not "taken out and shot"... shot on the spot, as you imply. :toast:



As we've yet to see the radiator explicitly, I'm considering that the pump might not be combined with the waterblock. Since there's no fan on the card, the block would have to cool the chip, and perhaps the HBM stacks and VRMs. I don't see all of that plus the pump fitting under that shroud, though I'm not saying it couldn't be done. This time I could conceivably imagine something other than a traditional AIO cooler: a plain waterblock, with the pump built into the radiator tank.
If it is a pump on the radiator (which I don't see in the pictures yet, so it's unknown), then that would be, in my opinion, the best-case scenario. Just imagine if you could then pull the tubes off and slap it into your own custom water loop!

If they did that, I might actually be willing to break my rules (again) and buy some cards this round.
#48
Solaris17
Super Dainty Moderator
I still use DVI on my monitors; they are 22" IPS displays, both of which still cost over $100 apiece. I don't use VGA anymore, but I also see DP as the 4K of gaming: rather gimmicky. I see both sides of the fence, but I would have to say DVI stands somewhere in between and should at least still be included, even if it's with one header.
#49
Captain_Tom
Solaris17I still use DVI on my monitors; they are 22" IPS displays, both of which still cost over $100 apiece. I don't use VGA anymore, but I also see DP as the 4K of gaming: rather gimmicky. I see both sides of the fence, but I would have to say DVI stands somewhere in between and should at least still be included, even if it's with one header.
Worst comparison of all time.
#50
Solaris17
Super Dainty Moderator
Captain_TomWorst comparison of all time.
Why exactly? I'm curious, because at the moment only flagship cards costing several hundred dollars apiece, in multi-GPU configurations, can sustain the FPS necessary to play most popular games at high detail levels. That makes it impractical for the norm, not to mention it simply hasn't been pushed hard enough to be much more than a novelty item for the market. Likewise, while current GPUs including flagships have DP outputs, most of those connections coincide with the high-resolution, high-Hz monitors they would "look best on". However, monitors even in high price segments are still made with VGA, which, as outdated as it is, is still used in the majority of affordable monitors and TVs. DVI never even managed to take its crown.

I would like your input on why exactly 4K gaming and DP-only GPUs are not only somehow superior for all intents and purposes in today's saturated 1080p market, but also why you might think the market is full enough of these products to warrant such a change from DVI.

I will ask @EarthDog as well. Reviewing aside of course. I did that as well and consumerism and elitism are very different things.