Discussion in 'News' started by btarunr, Nov 2, 2013.
We really need 20nm to move forward at this point. 28nm is holding AMD and Nvidia back.
When 20nm comes, it's going to be a lot more costly for both AMD and NVidia. 28nm is refined and, as far as they're concerned, dirt cheap to produce, with far lower failure rates - one of the many reasons 20nm is taking so long. If and when 20nm does arrive, the GPUs are probably going to be pretty expensive.
It's Radeon HD 7970 vs GeForce GTX 680 all over again...
I see this being mostly a driver war... again, which is great; before that generation we didn't have drivers that squeezed out every last drop of performance (at least, to the limited extent high-level APIs allow it).
It's fair to think that AMD might release a newer stepping of Hawaii Pro/XT (Pro-H2/XT2? Ha...) with better yields and maybe higher stock clocks (for one thing, though, the memory could be clocked a lot higher).
R9 290X w/ increased efficiency and 1075MHz reference boost clock & 8GB of 6500MHz GDDR5s please...
BTW, anyone know if Hawaii XT is full Hawaii chip? 3072 ALUs / 192 TMUs sounds like way better numbers... Maybe only for the Pro market.
All in all, performance is on the way up and prices on the way down, and the (GP)GPU landscape isn't boring anymore, for the moment.
Yayyy it looks like..... the other gpus.....
A "shitty" card that tackles $650 and even $1000 cards.
Yes sir. It's a shitty card indeed.. driver blah blah classy...
Are you mad bro?
Faster performance at higher power (noise usually goes hand in hand with power, so don't get your hopes up) and at a higher price, while AMD offered higher performance at a lower price.
Next thing you know, you'll also say the higher power consumption is a plus.
The driver point is moot since both sides have their fair share of problems.
It's just funny to see how very few mention the power consumption now like they did when the 290x was tested. I personally don't care but it's funny to read.
I might hold out for the R9 290, but with the performance I get now it's questionable why I shouldn't just wait for 20nm. All these "new" cards are 28nm pushed to its limits, and while impressive, there are key areas (in my book) these cards sacrificed for their sheer performance. So whatever, I'll wait; there are certain things I want from my cards.
I think this card is better: faster and cheaper than the R9 290X.
When any company unleashes a faster, more powerful GPU, think about
power consumption, noise, and heat.
I don't think any 20nm GPU is coming soon.
Wait for more benchmarks and you'll see the GTX 780 Ti beat all AMD GPUs, single and dual.
Cheaper no. Faster yes.
NVidia already confirmed a $699 price for the 780 Ti ($150 more than the R9 290X). It also probably won't beat dual-GPU cards, but if it even gets near one it'll be right around GTX 690 performance.
Are those really leaks? As they say in the description, I think it's been done on purpose just to add some hype for potential customers, so they're calling it a "leak" just to hide that it's an advert. But it looks like a monster.
Conversely, the $550 290X offers 4% (quiet) / 11.1% (scream) more performance than the reference 780 for a 10% higher price and three fewer games ( or 16% higher price w/ 2 fewer games)...and that's assuming that you could find a 290X in stock.
Nice cherry picking. Even nicer incoherent rant.
Kirk Lazarus doesn't approve.
Looks like smaller die is useless die.
I don't get what you people gain by throwing gasoline on threads, especially you.
You should know by now Rad that some people can only post in certain colours. I like my gfx cards like i like my hoes - hot, expensive and composed of quality silicon.
Supposedly an Nvidia gaming performance chart comparing the 780 Ti and 290X (Quiet), courtesy of videocardz:
I'm calling BS on their power consumption numbers; something is seriously wrong.
W1zzard tested the GTX 690 at 274W peak and the R9 290X at 282W peak. Yet somehow the 780Ti in these charts consumes roughly 75W more than either of those. That would put it around 357W, which it couldn't be since it only has a 300W power design (6+8pin PCIe connectors).
I'm not expecting the 780Ti to use less power than either of the aforementioned cards, but it cannot be more than 15-20W above those while still remaining in spec. These power consumption numbers can't be correct for the benchmarks.
The only way I could possibly see those numbers making sense is if they were taken from Furmark or OCCT. AMD clamps down on power much harder than NVidia in those power viruses, so the results are really unrepresentative of the card's normal power consumption.
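The 300W figure above follows straight from the PCIe power budget. A minimal sketch of that arithmetic, assuming the usual PCI Express allowances (75W from the slot, 75W per 6-pin, 150W per 8-pin connector):

```python
# Assumed per the PCI Express spec: slot and auxiliary connector power allowances.
SLOT_W = 75
CONNECTOR_W = {"6pin": 75, "8pin": 150}

def board_power_limit(connectors):
    """Maximum in-spec board power for a given set of PCIe power connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

# A 6+8-pin card like the 780 Ti: 75 + 75 + 150
print(board_power_limit(["6pin", "8pin"]))  # 300
```

So any chart showing a 6+8-pin reference card pulling ~350W+ would mean it's running far out of spec.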
NVIDIA wins in everything. Nice.
Apparently they have official specs of the card. Another interesting thing is:
The amount of ignorance and fanboyism in this thread is too damn high!
The amount of worthiness in this post is too damn low.
WRONG; Nvidia clamps down on power, whereas AMD does nothing to throttle power in most if not all benchmarks, especially Furmark. See TPU reviews.
This so-called leak is fake. If it were real, I'm sure it's being shown in its best-case scenario (power included).
Are you sure you don't prefer your ladies with "quality silicone" instead?
Are you looking at the same TPU data I am? It's obvious that both AMD and NVidia clamp board power consumption to the specification. Look at the maximum power consumption chart from the R9 290X review. Titan and the 780 are both 250W cards, and they are held to within 7% of that specification. R9 290X is a 300W card, and it is held to 5% of that specification. The leaked specifications call the 780Ti a 250W card. There is no way that the 780Ti is drawing 75W more than the R9 290X even though other 250W NVidia cards draw 50W less.
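The "held to within X% of spec" claim above is just measured peak vs. rated board power. A quick sketch of that check (the measured wattages here are illustrative placeholders, not actual TPU data):

```python
def pct_over_spec(measured_w, rated_w):
    """How far a measured peak exceeds the rated board power, as a percentage."""
    return (measured_w - rated_w) / rated_w * 100

# Hypothetical peaks: a 250W-rated card measured at 267W, a 300W-rated card at 315W.
print(round(pct_over_spec(267, 250), 1))  # 6.8
print(round(pct_over_spec(315, 300), 1))  # 5.0
```

By that yardstick, a 250W-rated 780Ti drawing 75W more than a 290X would be ~40% over spec, which no reference card comes close to.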
I share your skepticism, but unlike you I don't think this is in any way the best-case power consumption scenario for the 780Ti; in fact, I would argue it's much worse than the worst case. It looks like the tester tried to overclock a reference board to get better performance results, drawing vastly more power because of it. I would expect actual 780Ti performance to be slightly lower than these results, but board power to be significantly lower.
Release a card for $500 over all other GPUs. Some buy it, some are on the fence, some think it's a ripoff.
...Then release a card for $200 over all other GPUs and it's a bargain!!
Where's the uber mode chart?