
Next AMD Flagship Single-GPU Card to Feature HBM

ATI was a Canadian company... a heater during cold winters... actually a two-in-one :D

And they really must sell their R9 290s no matter what; unsold silicon is a bigger loss for them than silicon sold at a bargain. I bet they calculated everything as well as they could.

I am Swiss, I live in the mountains... I fit the customer profile for AMD/ATI cards (even if 44° is the max for my little princess now... thanks Kryografics). Thanks to that I will be able to get a 2nd 290 + a PSU for cheaper than if I wanted to jump on Maxwell; even ordering another block + backplate + an auxiliary 140x60/65 mm rad would set the total a bit below a 980 :D And no way would a 970 be tempting over an "already owned" 290 (I was tempted in the beginning, but on a second look it proved not to be a good idea, i.e. a side-grade).

If I can get a card with low noise, high performance and a good price, I'm in.
Well... as I paid for my 290 second hand (not used too long, fortunately), and if I take into account the price of the loop/block/backplate added to the total, it's still under the price of a 970 for me... so judging by the temps and silence of operation (on a 240x60 mm rad), my card fits that description :D But if you mean "straight out of the box", then yes, a 970 is fine. (Not blaming NVIDIA... blaming the greedy retailers/e-tailers around me :roll: )
 

Busy putting Lollipop on my M8... And I also live up north; it is cold enough here...

But I mostly play Skyrim lately... and... you know... 2.5 years... and CFX still sucks there... 2.5 years... I have also lived with an SLI setup. I no longer suggest a multi-card setup to anyone unless they run a triple-monitor setup, i.e. they have no choice because it needs the horsepower.

I would sell the old one and get the 980, though...
 
Well, I said I could get one... but a single 290 is way more than enough for 2015. I will jump on a 2nd-hand 390X or 980 Ti, 1080, whatever it will be called :D

I also had a 580 SLI and tested a 7870 GHz CFX... well, dual-card is not for me either. I am on a single 1080p monitor, so no need for more (planning eventually to go 21:9 29" for fun, or a 27" 1440p, but... for now I am fine :) )
 
I am just trying to understand why the quote appeared on LinkedIn.
LinkedIn is a networking and job marketplace. Given that AMD's workforce has a certain fluidity to it, maybe every scintilla of CV-worthy background puts her one step closer to her next position.
OTOH, maybe she, like many otherwise level-headed people, suddenly turns into a gabbling fool as soon as she is let loose on a social networking site.
Nobody says she didn't work on such a project, but nobody has said what tech node it used; that could be the catch, and the blooper in this news. Second, they are using GloFo now, and we have no hard info on them or their silicon leakage at this stage. There may be many variables.
Undoubtedly. Of the obvious candidates (assuming UMC's 28nm HLP/HPM isn't a consideration):
TSMC 28nm HPL/HPC - both mature with good yields
TSMC 20nm SOC - Very unlikely for a large chip with a large power budget
GloFo 28nm SHP - Already in use for GCN (Kaveri APU), and reported process for AMD GPUs in 2015
GloFo 20nm LPM - Very unlikely for a large chip with a large power budget, and reportedly being sidelined by GloFo as it concentrates on licensed 14nm-XM.
And about the 380X speculation, it's funny that it doesn't have the R9 class in front of it, isn't it?
Maybe just a timesaving abbreviation. I know I do it myself, as do many others, so I would expect engineers who have to deal with the nomenclature constantly would do likewise.
 
maybe she, like many otherwise level-headed people, suddenly turns into a gabbling fool as soon as she is let loose on a social networking site.

That may be the sad story indeed.

Maybe just a timesaving abbreviation. I know I do it myself, as do many others, so I would expect engineers who have to deal with the nomenclature constantly would do likewise.

I am an engineer myself, and I seldom fire off shortened part numbers in the area I work in and specialize in, because it causes so many misunderstandings; we tend to correct each other in these cases and even note the specific revision on each part number (as the number alone is not enough). I may have used a codename, but yes, that may be an option too...
 
"Fiji could feature TDP hovering the 300W mark, because AMD will cram in all the pixel-crunching muscle it can, at the expense of efficiency from other components, such as memory." Could you be any more obvious with your bias? You don't know if the TDP will be 300W, and you certainly don't know what reason it may have for reaching that TDP. How about you wait until it has been tested before spouting nonsense.
 
AMD has a number of options for production of these cards which most people will be very happy with. There is a lot more than just stacked RAM and a smaller node to be had.
 
Could you be any more obvious with your bias? [...] How about you wait until it has been tested before spouting nonsense.


So you haven't read the sources, and instead joined to attack the wording of the report?

Good job.
 
AMD has a number of options for production of these cards which most people will be very happy with. There is a lot more than just stacked RAM and a smaller node to be had.
Thanks for vague unquantifiable promises. I foresee a bright future for you at AMD's PR department.
 
Could you be any more obvious with your bias? [...] How about you wait until it has been tested before spouting nonsense.

Lolz wtf? o_O
 
Could you be any more obvious with your bias? [...] How about you wait until it has been tested before spouting nonsense.
I am pretty sure the thread OP has no bias and only based this on the source, so I would think twice, if I were you, before pinning an opinion like that on him.
 
Could you be any more obvious with your bias? [...] How about you wait until it has been tested before spouting nonsense.

There are fanboys, and then there are guys who post responses like yours...
 
This is becoming a trend: AMD goes all out, NVIDIA reacts with something they've been holding back. Seems like something has to give here. Hopefully getting stuck at 28nm hurts NVIDIA a little more than AMD, and they can gain some ground.
 
I was thinking of just ordering a GTX 970, but now I'm hesitating again. Argh.
 
This news came out a while ago. The design stage of these cards is past; they must be in manufacturing right now, with samples floating around in weeks...
 
Yes, but can it play at 4K ?
 
Yes, but can it play at 4K ?

Most likely yes. With such memory it will have tons of bandwidth to support it. It's just up to the GPU design to utilize it now...
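To put rough numbers on that "tons of bandwidth" claim, here is a back-of-envelope sketch. The figures are assumptions based on published first-gen HBM numbers (1024-bit interface per stack at roughly 1 Gb/s per pin, four stacks) and the R9 290X's known 512-bit GDDR5 setup at 5 Gb/s, not confirmed specs for this card:

```python
# Back-of-envelope peak memory bandwidth comparison.
# All figures below are assumptions (first-gen HBM rumors vs. the R9 290X),
# not confirmed specs for the upcoming card.

def bandwidth_gb_s(bus_width_bits: int, rate_gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * rate_gbps_per_pin / 8

# R9 290X: 512-bit GDDR5 at 5 Gb/s per pin
gddr5 = bandwidth_gb_s(512, 5.0)

# Hypothetical HBM card: 4 stacks x 1024-bit at ~1 Gb/s per pin
hbm = bandwidth_gb_s(4 * 1024, 1.0)

print(f"GDDR5: {gddr5:.0f} GB/s, HBM: {hbm:.0f} GB/s")
```

Even at a much lower per-pin rate, the enormously wide bus is what gives HBM the headroom: on these assumed numbers it lands around 512 GB/s versus the 290X's 320 GB/s.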
 
Not sure if anyone here mentioned this, but 300W board power is no more than a baseline, if anything, because GPU designs generally don't go past 300W due to PCIe spec and power supply limitations.

To be honest I am not expecting miracles, if Tonga was anything to go by.
 
GPU designs generally don't go past 300W due to PCIe spec and power supply limitations

The 295X2 obliterated all PCIe power specifications and recommendations, so if they really wanted to, they could go past 300W.
 
The 295X2 obliterated all PCIe power specifications and recommendations, so if they really wanted to, they could go past 300W.

Yes, but that's a dual-GPU card...

The issue is that if you push a single GPU past the 300W mark, many people will run into issues with their power supplies, for example. It hurts sales; many systems would be incompatible.
 
Yes, but that's a dual-GPU card...

The issue is that if you push a single GPU past the 300W mark, many people will run into issues with their power supplies, for example. It hurts sales; many systems would be incompatible.

The stock 290X hits 282W peak and surpasses 300W in FurMark (which is unrealistic). That didn't hurt sales too much, and that was only 18W below 300 on a single-GPU card. They can get away with it, just not much higher.
 
I am pretty sure the thread OP has no bias and only based this on the source, so I would think twice, if I were you, before pinning an opinion like that on him.

Well, you KNOW there is always an extreme fanboy who joins just to troll and dump on a thread, completely unaware that bta is not biased. It happens on both sides of the fence, sadly, depending on whether the news is about the green side or the red side.
 
The stock 290X hits 282W peak and surpasses 300W in FurMark (unrealistic). That didn't hurt sales too much
...till the day NVIDIA released the GTX 970 and 980.
 
...till the day NVIDIA released the GTX 970 and 980.

Either AMD addresses performance, or they address power consumption. I can't imagine them addressing both in full.
 
...till the day NVIDIA released the GTX 970 and 980.

You can't base that conclusion on GPUs that launched about a year apart...
 