Monday, February 20th 2017

NVIDIA to Steal AMD's Ryzen Limelight on Feb 28

NVIDIA could attempt to steal the limelight from AMD's 2017 "Capsaicin & Cream" launch event for its Ryzen desktop processors, slated for February 28, with a parallel GeForce GTX event on the sidelines of the 2017 Game Developers Conference (GDC). At this event, the company is expected to launch its next enthusiast-segment graphics card, the GeForce GTX 1080 Ti. This could at least be a paper launch, with market availability following in March.

While the GTX 1080 Ti is a graphics card, and Ryzen a processor (they don't compete), NVIDIA's choice of launch date could certainly steal some attention away from AMD's big day. Besides launching Ryzen, it wouldn't surprise us if AMD teases its upcoming Radeon "Vega" graphics cards a little more. The GeForce GTX 1080 Ti is expected to be based on the same "GP102" silicon as the company's flagship TITAN X Pascal graphics card, and could be positioned very close to the USD $1,000 mark, given that NVIDIA priced the TITAN X Pascal at a wallet-scorching $1,199.
Sources: IO-Tech, NordicHardware

78 Comments on NVIDIA to Steal AMD's Ryzen Limelight on Feb 28

#51
kruk
thesmokingman, post: 3604636, member: 91203"
The ironic thing about the licenses right now is that x86 is pretty much dead so Intel is like living off AMD in a roundabout way.
Isn't AMD64 just an extension of x86 and cannot exist without the latter?
#52
eidairaman1
The Exiled Airman
kruk, post: 3604653, member: 168606"
Isn't AMD64 just an extension of x86 and cannot exist without the latter?
It can exist on its own; x86 is just for legacy stuff now. Once everything with a CPU in it is pushed away from 32-bit instructions and memory spaces, it will just be called x64. x86 started with the 8086, followed by the 80286, 80386, 80486 and 80586 (Pentium).

https://en.wikipedia.org/wiki/X86
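As an aside, the AMD64 name survives in how operating systems report the architecture today. A minimal sketch, assuming the machine strings commonly reported by Linux ("x86_64") and Windows ("AMD64"); the helper name is purely illustrative:

```python
import platform

def describe_arch(machine: str) -> str:
    """Map a platform.machine() string to a rough architecture family.

    'x86_64' (Linux/macOS) and 'AMD64' (Windows) both denote AMD's 64-bit
    extension of x86; 'i386'..'i686' denote legacy 32-bit x86.
    """
    m = machine.lower()
    if m in ("x86_64", "amd64"):
        return "x86-64 (AMD64, 64-bit extension of x86)"
    if m in ("i386", "i486", "i586", "i686", "x86"):
        return "legacy 32-bit x86"
    return "not an x86 family machine"

# Report what the current host calls itself
print(describe_arch(platform.machine()))
```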
#53
ADHDGAMING
FordGT90Concept, post: 3604192, member: 60463"
Ha, NVIDIA wishes! GTX 1080 Ti is too expensive for most gamers to care. Ryzen, on the other hand, is upsetting the almost 11 year status quo. GTX 1080 Ti will be a footnote while Ryzen gets mainstream press coverage.

This behavior coming out of NVIDIA is expected: very unsportsmanlike.
Don't we mean Vega???
#54
mouacyk
efikkan, post: 3604523, member: 150226"
It will be ~30% faster than GTX 1080 and cost ~40% more, so a good deal for demanding gamers.
By pricing the TXP at $1200, NVidia just cleared a pricing spot for the Ti. They are regretting the killing they could've done with the 980 ti but didn't because they practically re-offered the TXM at $650. It's called price anchoring, and the TXP pricing gave it all away when it came out.

I'm not budging from my 980 TI, unless Vega has a SKU that can offer +50%.
#55
efikkan
kruk, post: 3604541, member: 168606"
Do you really think nVidia will charge only $800 for a card allegedly faster than the $1200 Titan X Pascal while Vega is missing? I will be surprised if the card is under $900.
GTX 1080 Ti will not be faster than Titan X (Pascal), it will use a lower binning but with slightly higher clocks, giving marginally lower performance than the Titan X.

kruk, post: 3604541, member: 168606"
You basically know nothing about Vega and yet you keep telling us how fast it is and how much power it consumes. Please stop doing this until it releases, thanks!
AMD has demonstrated the performance of Vega 10, slightly beating GTX 1080 in AMD favoring titles like Doom, so we actually know. This is the Vega chip which will be just as big as a GP102, while matching the GP104 in performance.

BTW; The most efficient Pascal is >80% more efficient than the most efficient Polaris. There is no way Vega will close the whole gap.

phanbuey, post: 3604548, member: 45008"
A significant upcoming product it is, but not compared to Ryzen... Not anywhere close. You're looking at the first mass market octacore vs. a cheaper variant of the existing Titan line.

One product is ~30% faster and 40% more expensive, and the other is ~50% cheaper and beats/matches the fastest cpu on the market.
I never said it was, and if you actually read my post you'll see I argued it would be unfortunate for Nvidia to release it at the same time.
#57
FordGT90Concept
"I go fast!1!11!1!"
ADHDGAMING, post: 3604666, member: 168394"
Don't we mean Vega???
Nope, Vega is gonna be a while yet (1H 2017 last I heard). NVIDIA is trying to rain on Ryzen's parade.
#58
Hotobu
FordGT90Concept, post: 3604728, member: 60463"
Nope, Vega is gonna be a while yet (1H 2017 last I heard). NVIDIA is trying to rain on Ryzen's parade.
I don't see how. They don't compete in the same sector. You could argue that money spent on a processor/motherboard combo leaves less wallet share for a video card, but it would be very rare for someone to say, "Let me skip this top-of-the-line GPU to go buy a new motherboard/processor." My guess is that people who can afford to drop close to $1K on a video card are very rarely CPU-bottlenecked enough to have to make that decision. If anything, new systems should encourage the purchase of video cards.

I think it's more so the case that they're piggybacking off of AMD and not trying to steal the limelight.
#59
jabbadap
FordGT90Concept, post: 3604728, member: 60463"
Nope, Vega is gonna be a while yet (1H 2017 last I heard). NVIDIA is trying to rain on Ryzen's parade.
Well, you could say that if Ryzen is a great success, then Nvidia is trying to ride on it by offering top-performing graphics cards.
#60
kruk
efikkan, post: 3604695, member: 150226"
GTX 1080 Ti will not be faster than Titan X (Pascal), it will use a lower binning but with slightly higher clocks, giving marginally lower performance than the Titan X.
Maybe not the reference versions, but last time I checked the partner 980 Ti's were faster than the Titan X Maxwell, and that's what most people will buy.

efikkan, post: 3604695, member: 150226"
AMD has demonstrated the performance of Vega 10, slightly beating GTX 1080 in AMD favoring titles like Doom, so we actually know. This is the Vega chip which will be just as big as a GP102, while matching the GP104 in performance.

BTW; The most efficient Pascal is >80% more efficient than the most efficient Polaris. There is no way Vega will close the whole gap.
I'm not going through this again. You have a black-box GPU running a single game and you are able to predict its performance and power consumption? No, just no.
#61
medi01
uuuaaaaaa, post: 3604720, member: 98273"
AMD machine right now:


Thanks, though I'm not sure what Raja has to do with Ryzen. He'd better work harder on Vega; Polaris was meh.
#62
medi01
efikkan, post: 3604695, member: 150226"
BTW; The most efficient Pascal is >80% more efficient than the most efficient Polaris. There is no way Vega will close the whole gap.
Fury X, a 28nm product, consumes 70% more than the 1070, which it occasionally beats.

Power consumption difference between 480 and 1060 is about 20%.

"most efficient" vs "most efficient" is a silly comparison, as one might even not be comparing cards from the same tier.
#63
Zeki
coolernoob, post: 3604200, member: 169961"
That is right - a 1-year-old architecture, a $1000 GPU (that is not even the top card by any means) will steal the show. I and many others will buy that $1000 card and pair it with an old Intel i5-2500K, because there has been little to no improvement in these last 6 "generations" - and none of us cares about CPUs and platforms, but that is ok, cuz nvidia told us so.
Lol so true, I went from a gtx980ti & i5-2500k to a gtx980ti & i7-6700k to gain a few fps. Well, I hope I did; it's hard to tell.
#64
efikkan
medi01, post: 3604939, member: 158537"
Fury X, a 28nm product, consumes 70% more than 1070, that it occasionally beats.

Power consumption difference between 480 and 1060 is about 20%.
That's not even remotely close to the truth.



medi01, post: 3604939, member: 158537"
"most efficient" vs "most efficient" is a silly comparison, as one might even not be comparing cards from the same tier.
You are missing the whole point. The least efficient Pascal (GTX 1060) is still ~55% more efficient than the most efficient Polaris (RX 480). Vega 10 has to compete with GP104, which is 80% more efficient than Polaris, so that is the benchmark!
#65
medi01
efikkan, post: 3605126, member: 150226"
That's not even remotely close to the truth.
Besides being apples to oranges (perf/watt is not the same as absolute power consumption; good job trying to twist reality there), that's an ancient review and a sad example of why you shouldn't trust TPU much:

1) it shows a rather remarkable power consumption difference (compared to the majority of sites), with the 480 consuming even more than the 1070
2) it has messed-up charts (arithmetic mean instead of geometric)
3) the 480 gained a lot of performance after that

Check recent reviews of AIB cards for power consumption in actual games with actual drivers, e.g. on ComputerBase.

efikkan, post: 3605126, member: 150226"
The least efficient Pascal (GTX 1060) is still ~55% more efficient than the most efficient Polaris (RX 480). Vega 10
Bullshit, even if you'd omit vega, which you can't know shit for sure about, in that ridiculous sentence.
#66
bobalazs
Or the other way around; Amd steals the spotlight away from *vidia.
#67
efikkan
medi01, post: 3605335, member: 158537"
1) it shows a rather remarkable power consumption difference (compared to the majority of sites), with the 480 consuming even more than the 1070
2) it has messed-up charts (arithmetic mean instead of geometric)
3) the 480 gained a lot of performance after that

Check recent reviews of AIB cards for power consumption in actual games with actual drivers, e.g. on ComputerBase.
That's completely untrue. AMD fanboys keep telling a fairytale that the first reviews and the first cards were bad. I've actually checked >80 reviews of RX 480, there is no significant variance between the initial reviews and the more recent ones, just the normal few percent of variance you should expect.

You claim RX 480 has gained a lot since then; well, it has not. It has gained a few percent here and there in select games through driver tweaks, but so has the competition. The performance gap between RX 480 and GTX 1060 remains unchanged.

medi01, post: 3605335, member: 158537"
Bullshit, even if you'd omit vega, which you can't know shit for sure about, in that ridiculous sentence.
What? Try reading your own post before submitting.

Vega 10 has to compete with GP104, so the efficiency of GP104 is the benchmark. GTX 1080 is 80% more efficient than RX 480, so that's the gap AMD needs to close. I also want to remind you that even though AMD promised ~2.5× performance/watt with Polaris, it was just a few percent more efficient than Fiji, and that's even with a node shrink! For comparison, the "boring" Pascal achieved a massive gain in efficiency over Maxwell. Try reading the numbers yourself. :rolleyes:
#68
medi01
efikkan, post: 3605388, member: 150226"
I've actually checked >80 reviews of RX 480, there is no significant variance between the initial reviews and the more recent
Oh really:
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73945-gtx-1060-vs-rx-480-updated-review.html

It improved for two reasons:
1) Ref card was poor (throttling)
2) Driver bump

Oh, 290x is faster than 780Ti nowadays, isn't it? Miracle.


efikkan, post: 3605388, member: 150226"
You claim RX 480 has gained a lot since then; well, it has not.
Yeah, "has not". And for the power consumption, Anno game, total system power consumption (in watts):



From https://www.computerbase.de/2017-02/palit-geforce-gtx-1050-ti-kalmx-test/2/#abschnitt_leistungsaufnahme



efikkan, post: 3605388, member: 150226"
1080 is 80% more
Oh dear.
1) %-s don't work like you think
2) With your approach you have 2 different values from your own slides.
3) For real figures of real cards in real game see chart above.
#70
londiste
medi01, post: 3605640, member: 158537"
Oh really:
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73945-gtx-1060-vs-rx-480-updated-review.html

It improved for two reasons:
1) Ref card was poor (throttling)
2) Driver bump
it improved primarily for one reason - a different selection of benchmarked games.
in games benchmarked both in this review and previous ones, rx480 improved 2-3% averaged across games (which is a pretty good result in itself).

regarding these power consumption graphs, anno 2205 is a pretty strange game to use for a power consumption test. if nothing else, that game pretty consistently favors nvidia cards.
look at the previous page in both these reviews - the 1060 gets 44 fps, rx480 gets 33 fps, that is a 33% difference.
in a test machine with an overclocked 6700k, 4 ram sticks and 3 drives (this setup should consume above 100w at load even without a graphics card), it would seem that neither card is fully utilized by anno 2205.
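As an aside, the "33% difference" figure depends entirely on which card is chosen as the baseline. A minimal sketch of the arithmetic, using the 44 fps and 33 fps figures from the post above (function name is illustrative):

```python
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, in percent, with B as the baseline."""
    return (fps_a / fps_b - 1.0) * 100.0

# 44 fps vs 33 fps from the Anno 2205 page cited above
print(round(percent_faster(44, 33), 1))  # 1060 relative to rx480 -> 33.3
print(round(percent_faster(33, 44), 1))  # rx480 relative to 1060 -> -25.0
```

The asymmetry (33% one way, 25% the other) is exactly why arguing over bare percentages without naming the baseline goes nowhere.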
#71
medi01
londiste, post: 3605692, member: 169790"
it would seem that neither card is fully utilized by anno 2205.
Mind blown.
#72
medi01
Note Vishera in this list

#73
efikkan
medi01, post: 3605640, member: 158537"
Oh dear.
1) %-s don't work like you think
Look at the link I provided:

The other one:


As everyone can see, GTX 1080 is ~80% more efficient, since it provides ~80% more performance per watt. This is the very definition of energy efficiency. You seriously can't disagree with basic math! I'm sorry, but this is completely ludicrous.
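Every "X% more efficient" claim in this thread reduces to a ratio of performance-per-watt figures. A minimal sketch with made-up illustrative numbers (not actual review data; the 80% result is contrived to match the claim being discussed):

```python
def efficiency_gain_percent(fps_a: float, watts_a: float,
                            fps_b: float, watts_b: float) -> float:
    """Relative perf/watt advantage of card A over card B, in percent."""
    eff_a = fps_a / watts_a  # performance per watt of card A
    eff_b = fps_b / watts_b  # performance per watt of card B
    return (eff_a / eff_b - 1.0) * 100.0

# Illustrative (made-up) numbers: card A does 100 fps at 100 W,
# card B does 90 fps at 162 W -> A is 80% more efficient.
print(round(efficiency_gain_percent(100, 100, 90, 162), 0))  # -> 80.0
```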

medi01, post: 3605640, member: 158537"
2) With your approach you have 2 different values from your own slides.
No, again, see the link I provided and this one:
GTX 1080 has 182% performance per watt compared to RX 480 in one, and 178% in the other. Different benchmarks showing exactly the same thing.
I have not manipulated any data, just compressed the charts to show the relevant models. Being dishonest does not help your case.

medi01, post: 3605640, member: 158537"
3) For real figures of real cards in real game see chart above.
So you are accusing Techpowerup and the sites behind 80+ other reviews, which arrived at the same conclusion, of faking their reviews? :rolleyes:
I think you should learn what statistical outliers are before you start cherry-picking.
#74
medi01
efikkan, post: 3605872, member: 150226"
As everyone can see, GTX 1080 is ~80%
You have been told what was wrong with that benchmark.
You ignored it or didn't comprehend it.
You were told %-s don't work like you think.
You somehow got the idea that repeating the same statements makes them better.

You were provided with benchmarks and links showing actual power consumption in an actual game across AIB 480s and AIB 1060s being roughly in the same ballpark, which you again ignored.

What do you expect me to tell you now?
#75
efikkan
medi01, post: 3605979, member: 158537"
You have been told what was wrong with that benchmark.
You ignored it or didn't comprehend it.
You were told %-s don't work like you think.
You somehow got the idea that repeating the same statements makes them better.

You were provided with benchmarks and links showing actual power consumption in an actual game across AIB 480s and AIB 1060s being roughly in the same ballpark, which you again ignored.

What do you expect me to tell you now?
Please stop filling the forum with trash, BS and false information. You've already been proven wrong multiple times, even by Techpowerup's own benchmarks. If you can't accept basic facts, please go play elsewhere!

This thread was supposed to be about GTX 1080 Ti and whatever Nvidia will showcase at GDC.