
Kepler Unbeatable: NVIDIA

Who cares about power usage?
People who buy high-end GPUs don't care about power usage; most hardware junkies get water blocks or aftermarket coolers.

I know I can't wait, and at the end of this year I will grab one. I don't care about price; my GTX 275 is getting old as shit.
 
Who cares about power usage?
People who buy high-end GPUs don't care about power usage; most hardware junkies get water blocks or aftermarket coolers.

I know I can't wait, and at the end of this year I will grab one. I don't care about price; my GTX 275 is getting old as shit.

Well, my Q9650 + X48 chipset are getting old as shit and bottlenecking my GPU.
 
Talk is unbeatably cheap.
 
NVidia is far more competent at making powerful GPUs than AMD is at making powerful CPUs...:laugh:

It also helps if you have been in the lead the past two generations already, which AMD wasn't when Bulldozer came out, but nVidia is with Kepler.



A lead in maximum performance, yes, but not in sales. If you look at the market share, AMD is in front of nVidia.

And I don't know if there was an official announcement, but the Xbox 720 and the new PlayStation will use AMD graphics, and the Wii 2 will also use AMD graphics. This is bad news for nVidia :(

Anyway, AMD was good at making CPUs; they just sat on their ass a few years ago, and now you see what that does.

But for now, the HD 7970 is here, way faster than anything nVidia has, and that's it. It has powerful overclocking headroom, which leaves room for newer cards or super-overclocked editions. While nVidia shows nothing of Kepler, AMD might already be preparing a new revision of the Tahiti GPU for the next generation; the HD 8000 will not be a new architecture, I'm sure. So it's all speculation about Kepler for now: they talk but show nothing.

I'm still waiting, as I want a cheaper card that will perform fine :) (or I might get a second HD 6950 at a low price).


edit: on power consumption, AMD controls it way better than nVidia. If you run your rig 24/7, this could make a difference at the end of the year. Anyway, mine runs smooth at idle, but the CPU sits at 100% for BOINC :)
 
I think as PSU prices go up, people may start having more concern over power consumption. Regardless, even if I am wrong... if you had two GPUs with the same performance/price, why buy the one that uses more power? I would take a slight hit on performance for a good power saving.
 
Wow, AMD graphics fanboys out in force today.

Heck, I switch between sides often enough, but AMD hasn't launched anything since the 5000 series that has really been worth it to me. The 6000 series was a bunch of rebadges in the midrange and disappointments in other segments. The 7000 series hasn't fared much better. Fermi was a breath of fresh air after the G92 era.

NV makes a lot of bonehead moves, which is why they lost all the contracts for the next consoles. But I can't fault them on building good GPUs. With Kepler moving them away from the monolithic monster-GPU design, I can't wait to see it.
 
I would take a slight hit on performance for a good power saving.

I would also take a single, more powerful card over SLI/CrossFire any day with a single monitor.

Why would you do that? Have you ever sat down and actually calculated the difference between a GPU using 225W and one using 300W over the course of a year? I would bet my paycheck that it would barely take your family to McDonald's* with typical GPU usage.


*Unless you participate in a distributed platform using the GPU. ;)
 
A lead in maximum performance, yes, but not in sales. If you look at the market share, AMD is in front of nVidia.
That's not true for the discrete market; NV has a 10% lead over AMD.
nVidia is the #1 seller of discrete desktop cards in the world.

http://techreport.com/discussions.x/22543
It does not help one bit in market terms that AMD has the first top-dollar card out.
Now, if they had launched their 7870 priced at $250 and as fast as a 6970, they would have grabbed market share... not with $470+ cards they won't.
$470 cards are less than 3% of total sales.
It's not the first blow but the last blow that decides the market winner.
If you look at the whole picture, IGPs and CPU/GPU combos included, then you would be right :)
 
I would take a slight hit on performance for a good power saving.

I would also take a single, more powerful card over SLI/CrossFire any day with a single monitor.

Why would you do that? Have you ever sat down and actually calculated the difference between a GPU using 225W and one using 300W over the course of a year? I would bet my paycheck that it would barely take your family to McDonald's* with typical GPU usage.


*Unless you participate in a distributed platform using the GPU. ;)

I pay $0.15/kWh. That means it would cost me nearly $36/month for a 300W GPU if it ran 24/7. My power bill for January was $437.30. You bet performance/watt matters, because over 12 months that's $432 to buy McDonald's with. I'll take that cheque, please, as it's easy enough for me personally to make it worthwhile. For the average user, it still might buy that McDonald's. I can't be bothered to guess how long a GPU is at full load on average... it depends on the app and such, but it'd be interesting to get a real number.
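For reference, a rough sketch of that figure in Python (assuming a constant 300W draw and the $0.15/kWh rate above; real draw varies with load, so treat it as a ballpark):

# Ballpark monthly electricity cost of a GPU at constant load (figures from the post above).
RATE = 0.15   # $/kWh
WATTS = 300   # assumed constant draw

kwh_per_month = WATTS / 1000 * 24 * 30        # ~216 kWh over a 30-day month at 24/7
print(f"~${kwh_per_month * RATE:.2f}/month")  # roughly $32/month, same ballpark as the ~$36 above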
 
Don't compare that POS faildozer to Kepler. If Kepler is even half of what it's claimed to be, it'll be a good improvement over Fermi. Most probably it'll slaughter AMD once again, but I guess that's just too sensitive for the red fanboys, so they won't accept it.
Why not? Remember when Fermi came out? All the nVidia fanboys were like "wait for Fermi, wait for Fermi," so a lot of people did. Then when it tanked, arguably as hard as Bulldozer, ALL of the ATi video cards sold out overnight. Literally. I watched the prices for the AMD cards rise, also overnight, by $50+.

So I'm willing to bet that nVidia saw how well the 79xx cards perform and realizes they have work to do yet.
Wow, AMD graphics fanboys out in force today.

Heck, I switch between sides often enough, but AMD hasn't launched anything since the 5000 series that has really been worth it to me. The 6000 series was a bunch of rebadges in the midrange and disappointments in other segments. The 7000 series hasn't fared much better. Fermi was a breath of fresh air after the G92 era.

NV makes a lot of bonehead moves, which is why they lost all the contracts for the next consoles. But I can't fault them on building good GPUs. With Kepler moving them away from the monolithic monster-GPU design, I can't wait to see it.
More like hot air. Ever see the YouTube video where a guy cooked an egg on his 480?
I pay $0.15/kWh. That means it would cost me nearly $36/month for a 300W GPU if it ran 24/7. My power bill for January was $437.30. You bet performance/watt matters, because over 12 months that's $432 to buy McDonald's with. I'll take that cheque, please, as it's easy enough for me personally to make it worthwhile. For the average user, it still might buy that McDonald's. I can't be bothered to guess how long a GPU is at full load on average... it depends on the app and such, but it'd be interesting to get a real number.
Bah you're better off spending that money on power than McDonalds anyway. ;)
 
Like the HD 6970? We saw how the HD 6900 series beat Fermi :lol:

This time Kepler has an extremely powerful SM architecture and many more CUDA cores... so we saw the GK107 at 75W beating the HD 7770 and being much more powerful than the HD 6850.

Kepler is the clear winner.

You simply forget that the 6990 did beat the 590. So Fermi was beaten in the end. Somehow, I suspect the same thing is going to happen again this time...
 
I pay $0.15/kWh. That means it would cost me nearly $36/month for a 300W GPU if it ran 24/7. My power bill for January was $437.30. You bet performance/watt matters, because over 12 months that's $432 to buy McDonald's with. I'll take that cheque, please, as it's easy enough for me personally to make it worthwhile. For the average user, it still might buy that McDonald's. I can't be bothered to guess how long a GPU is at full load on average... it depends on the app and such, but it'd be interesting to get a real number.

You are absolutely right. However, most don't run distributed platforms or their GPU 24/7/365. You made an example out of the worst-case scenario, which I specifically mentioned was an exception. Good job! :p

So now, do the math and help this guy out... a 75W difference (225W vs 300W). Let's just say 100W to make it easy on me (college is over, and so is mathssssssssssssz). So knock about two-thirds off your numbers. That's the difference between a 225W card and a 300W card running 24/7/365 ($142.xx/year, or ~$12/month at your rate, assuming my math is correct).

Now, if someone plays games 2 hours/day for 30 days (so 60 hours vs. 720 hours/month), you can see the McDonald's analogy coming CLEARLY into focus, I would imagine... which is why I put the "*" disclaimer there in the first place, to prevent replies like yours!
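If anyone wants to redo that math, here's a rough sketch (assuming the $0.15/kWh rate and the 225W vs 300W figures from the posts above; the hours of use and constant draw are assumptions that vary per user):

# Rough yearly cost difference between a 225W and a 300W GPU (rate from the posts above).
RATE = 0.15  # $/kWh

def yearly_cost(watts, hours_per_day):
    # Electricity cost per year for a part drawing `watts` whenever it is in use.
    return watts / 1000 * hours_per_day * 365 * RATE

for hours in (24, 2):  # 24/7 folding rig vs. ~2 hours of gaming per day
    diff = yearly_cost(300, hours) - yearly_cost(225, hours)
    print(f"{hours} h/day: the 300W card costs about ${diff:.2f} more per year")

Plug in your own wattage gap and hours to get a number for your setup.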
 
If some rumours and slides are to be believed, Kepler will have a 100% increase in performance over the 5** series for the respective replacements. I seriously hope this is true, as I would buy double the performance of a 570 for the same price. Never mind TWIMTBP; TWTUTBM = the way they used to be made :P
 
I would take a slight hit on performance for a good power saving.

I would also take a single, more powerful card over SLI/CrossFire any day with a single monitor.

Why would you do that? Have you ever sat down and actually calculated the difference between a GPU using 225W and one using 300W over the course of a year? I would bet my paycheck that it would barely take your family to McDonald's* with typical GPU usage.


*Unless you participate in a distributed platform using the GPU. ;)

Why would you not??? Keep in mind I said slight hit for GOOD power savings.
 
I'm not sure you understand how negligible the differences are for 'average' users. If someone wants to save a few dollars a year on their electric bill, why the hell are they buying $500 GPUs in the first place?

Do the math to see what the differences actually are (it's about $1/month if used ~2 hours/day at $0.15/kWh... again, assuming my math from above is correct). :)

If you are trying to save $12 a year, I would say don't buy a $500 GPU in the first place. :p
 
You simply forget that the 6990 did beat the 590. So Fermi was beaten in the end. Somehow, I suspect the same thing is going to happen again this time...

Really?

On the summary page of the 7970 CrossFire review, the 590 beats the 6990 at every resolution. Here's the 2560 summary.

[perfrel_2560.gif: TPU relative performance summary at 2560x1600]


I'm only putting this in to stop blatant mistruths. Lots of people give the 590 a hard time, but it runs cooler and quieter by most accounts, and TPU's own round-up linked above puts the 590 ahead at every resolution. But as always, it's really game dependent.

I'm pissed NV is holding back info on Kepler, as I'm looking to upgrade, but it's so close that I need to wait and see how Kepler performs, since I'm keen to see a 7970 price drop. Unless Kepler is way better (doubt it).

If Kepler bombs, I'm buying two 7970s just as a capitalist reaction!!
 
Wow, AMD graphics fanboys out in force today.

Heck, I switch between sides often enough, but AMD hasn't launched anything since the 5000 series that has really been worth it to me. The 6000 series was a bunch of rebadges in the midrange and disappointments in other segments. The 7000 series hasn't fared much better. Fermi was a breath of fresh air after the G92 era.


:) Exactly what I've been thinking, and there's no need for bickering: AMD is winning now for a bit, then it's nVidia's turn for a bit, then lo and behold it's AMD's turn again, ad nauseam :D

I just hope nVidia doesn't fully drop the ball, as AMD did with its hype management when releasing BD; the fallout of such a thing could prove expensive to us enthusiasts.

i.e., Kepler needs to be good and worthy of such hype. Either way, though, I will definitely be buying a low- to mid-range Kepler card for some folding/hybrid PhysX action (gits will obviously make this bit unnecessarily hard) :)

I'm not sure you understand how negligible the differences are for 'average' users.

To be fair, dude, you are on TPU, and this is a place not often visited by the average user. Personally, I want it all: max performance and minimal power draw. The power pulled can be highly regarded by some, as it is by me; if you fold and can run two cards 24/7 with a smaller PSU and a cheaper case, doors can open (well, not doors, more folding opportunities :))
 
The 6000 series was a bunch of rebadges in the midrange and disappointments in other segments

Not completely. The CrossFire scaling was much improved and overall great, which is what led me to go through the hassle and hit of selling my 5850 to buy two 6850s instead of just getting another of the former. Very happy with that $300 purchase for the power over the last year, and I likely would still be using them for a while, but I wanted to try the NV drivers again for a change and also want more VRAM than 1GB. So I just found a cheap 480 for now (lots out there too; soon to be even more, surely, come April...).
 
Reading the original post here, I can't help but feel like we gained no new information. This is just nvidia saying: "Hey guys! The stuff we make will be awesome!"
 
Reading the original post here, I can't help but feel like we gained no new information. This is just nvidia saying: "Hey guys! The stuff we make will be awesome!"

Bang on.

70 posts later, the debate rages on :D
 
I don't think anyone can really refute that Nvidia will offer more powerful cards; they have done so pretty reliably. But the real question is both cost and relative performance. If you could pay $1000 for a 680 that has 30-40% higher performance than an HD 7970, you're not exactly compelled to buy it. If Nvidia can match AMD at its price points and offer much higher performance, they will crush the competition.

I have high hopes for Kepler and plan on getting a 6xx series GPU to give Nvidia a shot, since the last time I had an Nvidia card was a 7900 GS a few years back. Nvidia is just trying to keep people in anticipation of their new line, but given the problems with the 7xxx series and drivers I've seen around, I don't think they really have much to worry about.
 
What you will see in April, or maybe even sooner, won't destroy the 7970. Instead it will bring something where NV was behind: perf/watt, and maybe also better perf/dollar. Don't expect a "performance" chip to beat AMD's top dog; that will come later with the GK110. I think the GK104 will be (at stock clocks) between the 7950 and the 7970.
 
I'm not sure you understand how negligible the differences are for 'average' users. If someone wants to save a few dollars a year on their electric bill, why the hell are they buying $500 GPUs in the first place?

Do the math to see what the differences actually are (it's about $1/month if used ~2 hours/day at $0.15/kWh... again, assuming my math from above is correct). :)

If you are trying to save $12 a year, I would say don't buy a $500 GPU in the first place. :p

Power consumption boils down to more than energy savings per month. I don't even look at it that way. I think about a smaller power supply, a power supply lasting longer, and a quieter system due to less heat output.

This argument is pointless though, for all we truly know Kepler could use very little energy.
 
I like how everyone is ranting about a chip for which there is almost no factual information available yet. How many times do I have to say that we're still waiting on Kepler and that it does no one any good until it is released? The 7970 is here and it is doing great; that is more than I can say for Kepler at the moment...
 