
AMD Fiji XT Pictured Some More

I like DVI too, only because I don't need a higher resolution and I like that the cables are secured by screws, so they can't short a port out by moving.

DisplayPort is also secured; the connector latches in place.
 
One thing is for sure, AMD will always be the PR slide king... that's a safe job right there.
 
The only reason cards still had DVI ports was that they were necessary for connecting VGA adapters: you can't go straight from HDMI/DP to VGA with a passive adapter like you can with DVI-I; that requires an active converter. But since AMD ditched DVI-I ports on the R9 290 in favour of digital-only DVI-D, they became superfluous, as you can go direct from HDMI/DP to DVI-D.
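For anyone keeping score, the passive-vs-active distinction boils down to whether the source port already carries the signal type the monitor needs. Here is a rough sketch of that logic in Python; the port names and capability sets are just illustrative, and it glosses over details like DP++ dual-mode:

```python
# Sketch of passive-adapter logic: a passive adapter only rewires pins,
# so the source port must natively output a signal type the sink
# understands. Capability sets below are illustrative, not a spec.
PORTS = {
    "DVI-I": {"digital", "analog"},  # carries TMDS plus VGA-style analog
    "DVI-D": {"digital"},
    "HDMI": {"digital"},
    "DisplayPort": {"digital"},      # passive DVI/HDMI relies on DP++ mode
    "VGA": {"analog"},
}

def passive_adapter_ok(source: str, sink: str) -> bool:
    """True if a passive adapter can work: the ports share a signal type."""
    return bool(PORTS[source] & PORTS[sink])

print(passive_adapter_ok("DVI-I", "VGA"))          # True  - analog pins present
print(passive_adapter_ok("HDMI", "VGA"))           # False - needs an active converter
print(passive_adapter_ok("DisplayPort", "DVI-D"))  # True  - digital to digital
```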

If you intend to connect a monitor that only supports VGA to a high-end graphics card, you need to be shot.

One thing is for sure, AMD will always be the PR slide king... that's a safe job right there.

If only their products were half as good as their marketing.
 
If you intend to connect a monitor that only supports VGA to a high-end graphics card, you need to be shot.

You would be surprised how many people use old/bad monitors as secondary screens where resolution/quality/etc. are irrelevant. I know at least two guys who game on decent gaming monitors and use VGA screens beside them for non-gaming stuff (looking something up in a web browser, watching the football during a game, IM windows, etc.); one uses a 1280x1024 VGA panel, the other a 1600x900 VGA panel. I know dozens more who use DVI screens that are no better than VGA.
 
You would be surprised how many people use old/bad monitors as secondary screens where resolution/quality/etc. are irrelevant. I know at least two guys who game on decent gaming monitors and use VGA screens beside them for non-gaming stuff (looking something up in a web browser, watching the football during a game, IM windows, etc.); one uses a 1280x1024 VGA panel, the other a 1600x900 VGA panel. I know dozens more who use DVI screens that are no better than VGA.

To be fair, a lot of socket 115x motherboards have either D-sub or DVI, if not both, for a second screen; that's what my "media viewing screen" is connected to. However, this had better have a working DP/DVI adapter in the box, because I can't see spending additional $$$ until I have also upgraded my monitor (still 1080p/60 Hz).
 
No DVI = no VGA :(
This is a no-buy for me; I prefer my GTX 970 atm, it's fully compatible with my SVGA CRT monitor :)
 
You would be surprised how many people use old/bad monitors as secondary screens where resolution/quality/etc. are irrelevant. I know at least two guys who game on decent gaming monitors and use VGA screens beside them for non-gaming stuff (looking something up in a web browser, watching the football during a game, IM windows, etc.); one uses a 1280x1024 VGA panel, the other a 1600x900 VGA panel. I know dozens more who use DVI screens that are no better than VGA.

Yeah, they use old stuff that still happens to work, because why get anything newer if it works?
That is a mentality some have, but certainly not one a high-end graphics card maker seeking the edge of current tech should keep in mind.

There are adapters if you want to keep using the old gear that still works; there always are, for everything.
Otherwise, just buy a second-hand simple LCD screen for 30 bucks or so. We are not talking huge investments here to get something slightly newer that will work, and let's be honest: if you are going to spend 600 or so dollars on a graphics card, I'm sure an extra 30 won't be a big deal.
 
No DVI = no VGA :(
This is a no-buy for me; I prefer my GTX 970 atm, it's fully compatible with my SVGA CRT monitor :)

Seems odd to me that you would even consider upgrading so quickly after getting a card from the current gen (i.e. recently).
Also, there are HDMI-to-VGA converters.
HDMI and DVI are pretty much the same thing.
 
More importantly, where is 15.5 beta????????
 
Seems odd to me that you would even consider upgrading so quickly after getting a card from the current gen (i.e. recently).
Also, there are HDMI-to-VGA converters.
HDMI and DVI are pretty much the same thing.

An HDMI-to-VGA adapter needs to be active, though. Passive DVI-to-VGA adapters require the analog pins of DVI-A/DVI-I to work; HDMI only copied the digital part of DVI.

Had it not been for the VRM on the card, a universal water block would have worked wonders for the Fiji card.
 
Had it not been for the VRM on the card, a universal water block would have worked wonders for the Fiji card.

I actually expect to see a couple of people use universal blocks on this card combined with a spare motherboard VRM block. That would have been stupid previously, as it would leave the VRAM bare, but now that's no longer an issue...
 
My only concern is the power consumption of these cards... Performance should not disappoint, but don't expect it to overtake Nvidia by much; at least we will have competition.
 
People, to those complaining about the lack of a DVI port on the card: you do realize there are DisplayPort adapters to just about any type of port you could possibly want, including DVI and VGA. I am personally glad to see it go and make room for just DP and HDMI on one row, as that makes for a much cleaner card and allows easier single-slot conversion for those who want it.

Personally I think the card looks great. Dual 8-pins foreshadows its power consumption a bit, unless they just chose to add another 8-pin for the heck of it (which I doubt), but as long as the performance holds, I am fine with it. We will just have to wait and see if the rumors and hype are all true...

Lastly, just give us an official announcement with specs and such already!
 
IMO the two 8-pins don't necessarily mean a lot. I know this allows up to 375 watts, but I'm guessing the card will use somewhere around 300 watts peak out of the box, and we don't know the overclocking potential with the liquid cooler yet. I think they are just giving some room for overclocking. It's going to be interesting when a review goes up here, hopefully soon, so people can decide.
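For reference, the 375 W ceiling is just the PCIe power budget added up; here's the quick back-of-the-envelope version (spec limits, plus the ~300 W peak guessed above):

```python
# PCIe power budget arithmetic behind the 375 W figure (spec limits)
PCIE_SLOT_W = 75    # max draw through the x16 slot itself
EIGHT_PIN_W = 150   # max per 8-pin PCIe connector

max_board_power = PCIE_SLOT_W + 2 * EIGHT_PIN_W
print(max_board_power)  # 375 W for a dual 8-pin card

stock_peak = 300  # the ~300 W guess from this post, not a measured figure
print(max_board_power - stock_peak)  # ~75 W of headroom for overclocking
```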
 
IMO the two 8-pins don't necessarily mean a lot. I know this allows up to 375 watts, but I'm guessing the card will use somewhere around 300 watts peak out of the box, and we don't know the overclocking potential with the liquid cooler yet. I think they are just giving some room for overclocking. It's going to be interesting when a review goes up here, hopefully soon, so people can decide.
I had a similar thought; it would make sense based on where this card is aimed. This also means that the type of PSU needed is a bit higher up, which could also be a point they are trying to make (like they did with the 295X2) to keep "low quality" PSUs from being able to power the card (although this is a complete guess on my part and nothing more).
 
Anybody else been to the AMD driver support page recently? They now have a 12 page questionnaire. Topics include what games you play, are you happy with their driver support, CCC, etc. Since the last editorial about AMD driver support was such a hot topic, maybe people should go fill it out so AMD can get direct, constructive feedback about these topics.
 
IMO the two 8-pins don't necessarily mean a lot. I know this allows up to 375 watts, but I'm guessing the card will use somewhere around 300 watts peak out of the box, and we don't know the overclocking potential with the liquid cooler yet. I think they are just giving some room for overclocking. It's going to be interesting when a review goes up here, hopefully soon, so people can decide.
300 W on a 120 mm AIO... yikes. At least it is not as bad as the 295X2 and its 120 mm AIO, which got hot to the touch. But that is not enough rad to cool 300 W much better than air; push/pull is likely needed for really pushing things.
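Rough numbers on that, using the common watercooling rule of thumb of roughly 100-150 W comfortably dissipated per 120 mm of radiator at moderate fan speeds; these are ballpark community figures, not measurements:

```python
# Ballpark radiator maths: assume ~100-150 W dissipated comfortably per
# 120 mm radiator section at moderate fan speeds (rough rule of thumb).
LOW_W, HIGH_W = 100, 150   # per 120 mm section, assumed range
card_heat = 300            # assumed peak board power in watts

print(card_heat / HIGH_W)  # 2.0 sections needed, optimistic estimate
print(card_heat / LOW_W)   # 3.0 sections needed, pessimistic estimate
# Either way a single 120 mm AIO is short, hence push/pull fans (or
# simply running hotter coolant) to close the gap.
```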

I don't understand, Ghost, what you mean by needing a higher-up PSU, like what they did with the 295X2, simply by using 8-pins. Any PSU that has two PCIe power leads has two 6+2-pins... the quality of the PSU and the number of PCIe connectors are not related.
 
Hang on, I didn't realise the orientation before; these images are "fan side" up, so there is no active VRM fan?!

Is it possible the AIO block on the GPU also flows through a VRM block? O.O

If so then TAKE MY MONEY!
 
300 W on a 120 mm AIO... yikes. At least it is not as bad as the 295X2 and its 120 mm AIO, which got hot to the touch. But that is not enough rad to cool 300 W much better than air; push/pull is likely needed for really pushing things.

I don't understand, Ghost, what you mean by needing a higher-up PSU, like what they did with the 295X2, simply by using 8-pins. Any PSU that has two PCIe power leads has two 6+2-pins... the quality of the PSU and the number of PCIe connectors are not related.
Well, my wording was pretty bad, but what I meant was keeping the PSU requirements a bit high to avoid the off-brands that could cause problems on such a high-end card. I've seen plenty of odd off-brand PSUs rated 500-700 W that actually include only one 6+2-pin and maybe another 6-pin, so this could again make the card more aimed at the enthusiast market, similar to the 295X2, which had special requirements on the PSU or you ran into problems (but that was mostly because of its ridiculously high power draw over two 8-pin connectors). It was just a random guess, so take it as nothing more than an inference.

My way of thinking was just to avoid the people with the "since I have the connectors, my PSU must be enough to run it" mindset. But again, it was just another way of looking at it, not backed by any actual proof; just another guess.
 
I actually expect to see a couple of people use universal blocks on this card combined with a spare motherboard VRM block. That would have been stupid previously, as it would leave the VRAM bare, but now that's no longer an issue...
Well, GDDR at 1250 MHz does not need active cooling. It might be needed if you up the RAM voltage, but at stock volts and 1250 (effectively 5000) MHz, GDDR is about as warm as skin. I was running two HD 6950s with a universal block for two years, and the cards still work to this day.
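The "1250, effectively 5000" bit is just GDDR5's 4x data rate; quick arithmetic, with the bus width below only as an illustrative example:

```python
# GDDR5 "effective" speed: the write clock (WCK) runs at twice the
# command clock, and data moves on both WCK edges, so the effective
# rate is 4x the quoted memory clock.
memory_clock_mhz = 1250
wck_mhz = memory_clock_mhz * 2   # 2500 MHz write clock
effective_mt_s = wck_mhz * 2     # 5000 MT/s, the "effectively 5000" figure
print(effective_mt_s)

bus_bits = 256  # illustrative bus width, not any specific card's spec
bandwidth_gb_s = effective_mt_s * 1e6 * bus_bits / 8 / 1e9
print(bandwidth_gb_s)  # 160.0 GB/s for this example
```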
 
If you intend to connect a monitor that only supports VGA to a high-end graphics card, you need to be shot.
Exactly! Not "taken out and shot"... shot on the spot, as you imply. :toast:


Hang on, I didn't realise the orientation before; these images are "fan side" up, so there is no active VRM fan?!

Is it possible the AIO block on the GPU also flows through a VRM block? O.O
As we've yet to see the radiator explicitly, I'm considering that the pump might not be combined with the waterblock. As there's no card fan, the block would have to cool the chip, and perhaps the HBM stacks and VRMs. I don't see all that packaged together with the pump under that shroud, though I'm not saying it couldn't be. This time I could conceivably imagine something other than a traditional AIO cooler: a waterblock, with the pump built into the radiator tank.
 
Exactly! Not "taken out and shot"... shot on the spot, as you imply. :toast:



As we've yet to see the radiator explicitly, I'm considering that the pump might not be combined with the waterblock. As there's no card fan, the block would have to cool the chip, and perhaps the HBM stacks and VRMs. I don't see all that packaged together with the pump under that shroud, though I'm not saying it couldn't be. This time I could conceivably imagine something other than a traditional AIO cooler: a waterblock, with the pump built into the radiator tank.
If there is a pump on the radiator (I don't see it in the pictures yet, so it's unknown), then that would be, in my opinion, the best-case scenario. Just imagine if you could pull the tubes off and slap it into your own custom water loop!

If they did that, I might actually be willing to break my rules (again) and buy some cards this round.
 
I still use DVI on my monitors; they are 22" IPS displays, both of which are still over $100 apiece. I don't use VGA anymore, but I also see DP as the 4K of gaming: rather gimmicky. I see both sides of the fence, but I would have to say DVI stands somewhere in between and should at least still be included, even if it's with one header.
 
I still use DVI on my monitors; they are 22" IPS displays, both of which are still over $100 apiece. I don't use VGA anymore, but I also see DP as the 4K of gaming: rather gimmicky. I see both sides of the fence, but I would have to say DVI stands somewhere in between and should at least still be included, even if it's with one header.

Worst comparison of all time.
 