
AMD Fiji XT R9 390X Specs Leak

They care as much as Nvidia does.
The roles currently seem the reverse of what they were when that video came out, however. Back then you had the GTX 480, which used a lot of power for its time, while AMD had the HD 5870, which was comparable in performance and about 1.5 times as power-efficient on the same process.
AMD made use of that situation by producing the video. This time around, Nvidia seems to refrain from that sort of thing, while also lowering the prices of their cards far more than product positioning and demand would require.

Absolutely, and now as then, as stated in this very thread: if the performance is there, people will buy a card regardless of power consumption, and for that matter price.

If people don't like the video they should take it up with AMD.
 
Can't we discuss AMD, Nvidia, power, etc without insults and name calling?

Boy I regret my initial comment in this thread :oops:

Eh, someone had to say it :P I'm sticking with my power gobbling 290x's for a few years. Until then, AMD does have their work cut out for them ;)
 
If it is really 20nm, I am sold; at last, a real upgrade... old-school brute force that guarantees better specs without much thinking.

Otherwise I don't really care, especially power-wise, just like most of us. It has just become a popular, archetypal argument to use on tech forums. It draws more power, yes, but can we actually feel it? No. Open the darn window if you're really that anemic and live in a matchbox-sized flat. Can you feel a 50-100 W difference in the room? Then quad-SLI and CFX users should order some ice cubes to sit on. The second is the driver bashing: despite the fact that both have problems, it's always the same critters whining and whining about obscure issues, and I bet 90% are hardware faults. RMA your PC.

Philosophers sigh...
 
Good thing the industry does. Lower power requirements are currently driving most new development - everything is about power reduction as portability becomes more and more relevant. But this is beside the point: this topic is a rumour, as much as the Titan II one in the news section of TPU. There's nothing in this OP that suggests anything about power draw - we don't know what architecture it is. This could be more efficient than Hawaii - we don't know.

Let's hope the 390X is a Maxwell killer - otherwise we're facing more hideous prices from Nvidia's top range.
Industry and consumer are two different things. Unless it's something ridiculous, I don't care about power consumption either.
 
Fudzilla says 390X will also be on a 28nm node.

It's all rumors, of course, but if it's true, it kinda sucks since there will be no 20nm GPUs in 2015, and we'll have to wait until 2016 before we get something on a 14/16nm process. :(
 
So they're thinking 7990 performance in single-GPU form? Interesting.
 
Fudzilla says 390X will also be on a 28nm node.

It's all rumors, of course, but if it's true, it kinda sucks since there will be no 20nm GPUs in 2015, and we'll have to wait until 2016 before we get something on a 14/16nm process. :(
Not sure if I believe it, mostly because it would make the idea of AMD holding out on releasing a new GPU all the more fruitless if it is also a 28nm chip. The whole point, from what I had read, was to wait for 20nm to be completely ready so they could undercut things with it. The problem is that if this is true, all that waiting will have been in vain and completely pointless (unless, of course, the Fiji chips really are that far behind).

But that's just my opinion, nothing more.
 
It depends on where in the USA you live and how many cards you're running, but to answer your question in general: say your AMD card uses 150 watts more than the equivalent Nvidia card. The average cost in the US is 12 cents per kWh. If you game for an average of 20 hours a week, your monthly electric bill is going to go up by about $1.56.

So, not much.
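For anyone who wants to plug in their own numbers, here's a quick Python sketch of that arithmetic (the wattage, hours, and rate below are just the example values from this post, not anything official):

# Extra monthly cost of a card that draws more power than another.
# All inputs are example values; substitute your own.
extra_watts = 150          # additional draw vs. the comparison card
hours_per_week = 20        # average gaming time
rate_per_kwh = 0.12        # average US electricity price, $/kWh

weeks_per_month = 52 / 12  # ~4.33 weeks in an average month
extra_kwh = (extra_watts / 1000) * hours_per_week * weeks_per_month
print(f"~${extra_kwh * rate_per_kwh:.2f} per month")  # ~$1.56

At 34 cents per kWh, the same math gives about $4.42 a month.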

Where we live in Europe, we pay 34 cents per kWh, and that's actually cheap since we get a 25% deduction due to yearly use; I just checked our prices a few seconds ago. Reducing power is in everyone's interest, but for a computer running 24/7/365 it doesn't really matter taken over a whole year.
As written in the quote, the price difference is $1.56 a week; do that for 52 weeks and you get roughly $80 a year. I don't know about you, but I don't have time to care about $80 over a whole year; it simply doesn't matter. Where we live the difference would be roughly 3 times higher, about $240 a year. Come on, it still doesn't matter.
 
Where we live in Europe, we pay 34 cents per kWh, and that's actually cheap since we get a 25% deduction due to yearly use; I just checked our prices a few seconds ago. Reducing power is in everyone's interest, but for a computer running 24/7/365 it doesn't really matter taken over a whole year.
As written in the quote, the price difference is $1.56 a week; do that for 52 weeks and you get roughly $80 a year. I don't know about you, but I don't have time to care about $80 over a whole year; it simply doesn't matter. Where we live the difference would be roughly 3 times higher, about $240 a year. Come on, it still doesn't matter.

I'm not sure that I'm understanding you, Ebo. If you are paying 34 cents per kWh and use an AMD card that might draw 150 watts more than an Nvidia equivalent and game ~20 hours a week on average, then it would increase your electric bill by $4.42 a month, or $53 per year.
 
I'm not sure that I'm understanding you, Ebo. If you are paying 34 cents per kWh and use an AMD card that might draw 150 watts more than an Nvidia equivalent and game ~20 hours a week on average, then it would increase your electric bill by $4.42 a month, or $53 per year.
He used the monthly rate increase as the weekly rate increase. Still, $50 is a new game, so there's that. IMO, power consumption goes out the window when you compare it to A/C, electric heaters, electric dryers, pool pumps, household lighting/fans, etc.

13 kWh a month is nothing when you're normally billed for over 1,200. So while power is a consideration, it won't make or break my choice. I will grant, though, that I'd likely have gotten a 970 rather than an R9 290 had they been available when I was buying in July.
 
I hate it when people completely freak out when I recommend an AMD CPU or GPU and they say "OMG IF U BUY TEH AMD UR GUNNA SPEND MILLIONS OF DOLLARS ON POWAR AND U GUNNA HAVE TO GET A 1500 WATT POWAH SUPPLY BRO". Anyone else had this happen to them?
 
I hate it when people completely freak out when I recommend an AMD CPU or GPU and they say "OMG IF U BUY TEH AMD UR GUNNA SPEND MILLIONS OF DOLLARS ON POWAR AND U GUNNA HAVE TO GET A 1500 WATT POWAH SUPPLY BRO". Anyone else had this happen to them?
Well... exactly what a friend said when I got my 290. He runs a 780 with an 850 W PSU; I run a 290 with a 650 W PSU. We both have the same MVIIR and i5-4690K... you get the idea? And yes, my 290 at 1000/1300 outperforms his stock 780 in nearly every test we ran. He has 200 W more PSU headroom but doesn't OC his 780 :rolleyes:
 
I hate it when people completely freak out when I recommend an AMD CPU or GPU and they say "OMG IF U BUY TEH AMD UR GUNNA SPEND MILLIONS OF DOLLARS ON POWAR AND U GUNNA HAVE TO GET A 1500 WATT POWAH SUPPLY BRO". Anyone else had this happen to them?

Definitely, and now we can finally confirm the GTX 480 hate was just fanboyism too.
 
Well... exactly what a friend said when I got my 290. He runs a 780 with an 850 W PSU; I run a 290 with a 650 W PSU. We both have the same MVIIR and i5-4690K... you get the idea? And yes, my 290 at 1000/1300 outperforms his stock 780 in nearly every test we ran. He has 200 W more PSU headroom but doesn't OC his 780 :rolleyes:
Well, of course power consumption is the hot topic right now, and it will probably stay that way for another 2-3 months, depending on the upcoming releases.

I'm not sure that I'm understanding you, Ebo. If you are paying 34 cents per kWh and use an AMD card that might draw 150 watts more than an Nvidia equivalent and game ~20 hours a week on average, then it would increase your electric bill by $4.42 a month, or $53 per year.
Yeah, it's not enough to really shake a fist at... Most people would not notice a difference in their electric bill between the high-end cards (depending on location, of course, and excepting some high-cost areas). I also still stand by the point that if electricity is a concern, you should look towards a lower-end video card instead of the top anyway, as that is a much better way to save money on all fronts, especially seeing how many cards will handle 1080p Ultra gaming even at the lower end.

Either way, Fiji will be the thing that either ends this debate or keeps it going, depending on how much better it is (if at all) and how much power it uses.
 
Definitely, and now we can finally confirm the GTX 480 hate was just fanboyism too.

So true.

It will be interesting to see if the news that Nvidia reworked the silicon for GK210 for their newest compute part means that any further Maxwell development is 'distant'. Or was it just far cheaper for them to refine a known quantity and keep GM200 held back for consumer later on? This is relevant for the 390X as it could mean perhaps GM200 isn't as good as people are hoping, maybe the 390X will surprise us all in a nice way?

Time will tell. In the meantime, let the misguided brand loyalists spout their nonsense, be it green or red.
 
It will be interesting to see if the news that Nvidia reworked the silicon for GK210 for their newest compute part means that any further Maxwell development is 'distant'. Or was it just far cheaper for them to refine a known quantity and keep GM200 held back for consumer later on?
Parallel development I believe. The Kepler development team and the Maxwell team aren't a unified research pool as I understand it. The Kepler team will move to Pascal, the Maxwell team to Volta. Obviously with sharing of information and movements within the teams.
GK210 looks like a natural development of GK110 (area-ruled refinements, doubled L1 cache and register file, etc.), and was probably taped out around the same time (April) as GM200. GK210 might also be a "Plan B", but I'm thinking it might equally be a way to strengthen the existing Kepler-based enterprise market (higher GPU density per rack, variable-load programming through dynamic boost, etc.). I don't think it's a coincidence that the K80 is aimed at forestalling Xeon Phi adoption, judging by the launch partners.
This is relevant for the 390X as it could mean perhaps GM200 isn't as good as people are hoping, maybe the 390X will surprise us all in a nice way?
Anything is possible, although the 390X and its Fiji-based FirePro (which are probably more relevant given AMD's HSA compute commitment) aren't going to cure too many woes. Power usage shouldn't be an issue for the vast majority of potential users, but it is indicative of a company buying into the compute-at-any-cost ethos that Nvidia pioneered with the G80, GT200, and GF100. Where Nvidia seem to be actively bifurcating their product line, I'm more interested to see how AMD's Bermuda pans out. After all, a large die and large power budget are fine for enthusiasts and enterprise, but they don't make for a healthy profit line when priced down the consumer product stack. For my money, AMD desperately need a second-tier chip that competes with GM204, especially for the mobile market, since the second-tier GPU becomes the halo product in the laptop market.
Time will tell. In the meantime, let the misguided brand loyalists spout their nonsense, be it green or red.
Of whom 99.99% won't ever buy Fiji or GM200... at least, not while it is the current flagship.
 
Don't see why people are so fixated on 20nm ... HBM is a much bigger advancement than a node drop.

20nm will be 'ready' for this... however, it depends on whether AMD designed the new series to work on a 'low power' process or not, because there is no high-power 20nm bulk process anywhere. TSMC cancelled theirs, and with it went any possibility of 20nm NVIDIA parts, and maybe AMD ones too.
 
If you game for an average of 20 hours a week, your monthly electric bill is going to go up by about $1.56.

So, not much.
Let's say I run a computer x hours a day for one month with a 150-watt difference versus another machine with AMD hardware:

Energy cost here: $0.32/kWh
12 hours = +$17.28 (or $207.36/year)
16 hours = +$23.04 (or $276.48/year)
24 hours = +$34.56 (or $414.72/year)

That would have a big impact on my monthly bill; I pay about $60/month. Power inefficiency costs a lot. :O
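Those figures check out (assuming a 30-day month); here's a minimal sketch of the same calculation for anyone who wants to try a different rate:

# Monthly and yearly cost of a constant 150 W difference at $0.32/kWh,
# for a machine running x hours a day (30-day months assumed).
rate_per_kwh = 0.32
extra_kw = 0.150
for hours_per_day in (12, 16, 24):
    monthly = extra_kw * hours_per_day * 30 * rate_per_kwh
    print(f"{hours_per_day} h/day: +${monthly:.2f}/month (+${monthly * 12:.2f}/year)")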
 
I'll wait for the actual release before I get excited about it.
Power is pretty cheap where I live, so 150W doesn't bring the bill up noticeably.

I'll just continue to buy the best that I can afford at the time. Today's GPUs are already pretty freakin' nice...

Oh, and I like this little runner.

[attached image: 144079.jpg]


 
Sonic 1/2.

I'll wait for the actual release before I get excited about it.
Power is pretty cheap where I live, so 150W doesn't bring the bill up noticeably.

I'll just continue to buy the best that I can afford at the time. Today's GPUs are already pretty freakin' nice...

Oh, and I like this little runner.

[attached image: 144079.jpg]


 
Can't we discuss AMD, Nvidia, power, etc without insults and name calling?

Boy I regret my initial comment in this thread :oops:

This site used to be like that, but lately the fanboys & trolls have been ruining it :banghead:
 
This site used to be like that, but lately the fanboys & trolls have been ruining it :banghead:
yep... all the kids rolling in as computers become more popular. :/
 
I am a green team fanboy! But I might buy one of these.
 