Friday, February 22nd 2019

AMD Partners Cut Pricing of Radeon RX Vega 56 to Preempt GeForce GTX 1660 Ti

AMD cut pricing of the Radeon RX Vega 56 in select markets to preempt the GeForce GTX 1660 Ti and help the market digest inventory. The card can be had for as little as €269 (including VAT) for an MSI RX Vega 56 Air Boost, a close-to-reference product. The GTX 1660 Ti reportedly has a starting price of $279.99 (excluding taxes). This development is significant given that the GTX 1660 Ti is rumored to perform on par with the GTX 1070, a card the RX Vega 56 outperforms. The RX Vega series is still very much a part of AMD's product stack, and AMD continues to release new game optimizations for the card. NVIDIA is expected to launch the GeForce GTX 1660 Ti later in February. Although based on the "Turing" architecture, it lacks the real-time ray tracing and AI acceleration features, yet retains the increased IPC of the new generation's CUDA cores.
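Note that the two prices are not directly comparable: the €269 figure includes VAT, while the $279.99 figure excludes sales tax. Below is a rough normalization sketch in Python, assuming a 19% VAT rate and a placeholder EUR/USD exchange rate; neither assumption comes from this article.

# Strip VAT from the EUR street price and convert to USD so it can be
# compared with the pre-tax US price. Both constants below are assumptions.
VAT_RATE = 0.19        # assumed VAT rate
EUR_TO_USD = 1.13      # assumed exchange rate
vega56_eur_incl_vat = 269.00      # MSI RX Vega 56 Air Boost street price
gtx1660ti_usd_pre_tax = 279.99    # reported GTX 1660 Ti starting price
vega56_usd_pre_tax = vega56_eur_incl_vat / (1 + VAT_RATE) * EUR_TO_USD
print(f"RX Vega 56, pre-tax USD:  {vega56_usd_pre_tax:.2f}")   # roughly 255
print(f"GTX 1660 Ti, pre-tax USD: {gtx1660ti_usd_pre_tax:.2f}")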

120 Comments on AMD Partners Cut Pricing of Radeon RX Vega 56 to Preempt GeForce GTX 1660 Ti

#76
bug
efikkanI'm not convinced it will be easy for AMD to sell Vega 56 at a competitive price and make some profits.
The profit margins of Vega were an issue at launch, and I don't think production costs have dropped enough to sell it way below $300.
At this point it may not be about making a profit, but minimizing losses. Because just making the cards and storing them still costs money.
#77
efikkan
bugAt this point it may not be about making a profit, but minimizing losses. Because just making the cards and storing them still costs money.
Sure, it's about dumping excess stock. This was probably overstock of some MSI cards, and they exploited this price dump for PR purposes.

Vega (14nm) is practically "unsalable" at this point, so I don't expect them to do more production runs of these chips, even though the bigger Navi chip might be many months away. Similar price dumps can be expected if the cards sell too slowly.
#78
Super XP
kastriotRemoving old stocks and prepping for Navi in July, good move.
Well, why else would Nvidia release the 1660 Ti? I don't see any other reason but to somehow prepare for a Navi July launch. The 7-7 2019, i.e. 7nm Navi and 7nm Zen 2. This is going to be a calculated launch by AMD, one they cannot mess up, and cannot delay either. lol

Comparing the two cards, Vega 56 and the 1660 Ti, the Vega 56 wins out hardware-wise any day.
It's a compute card and a gaming card in one.
#79
las
Well, it's going to be ugly when Nvidia puts out 7nm+ chips.

After all, Turing was never meant for 12nm, so I'd expect 7nm+ GPUs this year.
#80
efikkan
Super XPWell, why else would Nvidia release the 1660 Ti? I don't see any other reason but to somehow prepare for a Navi July launch.
Expected competition is of course part of the reason, especially since the planning and design of a chip like this happens years in advance.

But you can't ignore the technological improvements of the new architecture, like the "small" optimization of splitting the FP32 and INT units in the shader processors, which gives a nice uplift in games whose shaders rely heavily on integer math, something more and more games do.
lasAfter all, Turing was never meant for 12nm, so I'd expect 7nm+ GPUs this year.
From TSMC or Samsung?
TSMC 7nm+ (EUV) is starting tapeouts now, so finished chips could theoretically arrive in very late 2019 or early 2020. I believe Samsung is on roughly the same schedule, or a little later.

I still haven't heard anything about a new consumer lineup from Nvidia this year, and I don't expect one, but something limited like a new Titan based on a potential new Tesla lineup is entirely possible.
#81
las
efikkanExpected competition is of course part of the reason, especially since the planning and design of a chip like this happens years in advance.

But you can't ignore the technological improvements of the new architecture, like the "small" optimization of splitting the FP32 and INT units in the shader processors, which gives a nice uplift in games whose shaders rely heavily on integer math, something more and more games do.


From TSMC or Samsung?
TSMC 7nm+ (EUV) is starting tapeouts now, so finished chips could theoretically arrive in very late 2019 or early 2020. I believe Samsung is on roughly the same schedule, or a little later.

I still haven't heard anything about a new consumer lineup from Nvidia this year, and I don't expect one, but something limited like a new Titan based on a potential new Tesla lineup is entirely possible.
Well, Nvidia's 2000 series sales are pretty bad. The performance uplift over the 1000 series was not big enough, and people are holding on to their cards. Nvidia had to cut back on specs because of 12nm. So I'd think Nvidia wants to make people start buying cards again ASAP. I'll change my 1080 Ti only when the 3000 series comes out or AMD surprises us with a high-end GPU, which I sadly don't think they will.
#82
efikkan
lasWell, Nvidia's 2000 series sales are pretty bad. The performance uplift over the 1000 series was not big enough, and people are holding on to their cards. Nvidia had to cut back on specs because of 12nm. So I'd think Nvidia wants to make people start buying cards again ASAP. I'll change my 1080 Ti only when the 3000 series comes out or AMD surprises us with a high-end GPU, which I sadly don't think they will.
A performance uplift of ~35-40% is not bad at all. Most gamers don't upgrade every generation anyway, so that's just a poor excuse.

The poor sales of Turing come down to a couple of things. Firstly, a higher-than-normal overstock of old cards on discount led to an undeservedly negative response from reviewers; the RTX 2070 and RTX 2080 are not that badly priced, but look bad next to Pascal on discount. Secondly, availability of the new cards has been a little low.
#83
notb
bugAt this point it may not be about making a profit, but minimizing losses. Because just making the cards and storing them still costs money.
Precisely that.
It's not RX480. They can't just push the clocks and recycle it. Unless they make Navi just another GCN refresh on the old node.
lasWell, Nvidia's 2000 series sales are pretty bad.
This is based on...? Do we have some official data?
2000 series are high-end cards and they're meant to sell in relatively low volume. But from what I've heard, sales are fine in the target group. Even looking at this forum, you can see many people decided to replace their 1080.
Super XPWell, why else would Nvidia release the 1660 Ti? I don't see any other reason but to somehow prepare for a Navi July launch. The 7-7 2019, i.e. 7nm Navi and 7nm Zen 2. This is going to be a calculated launch by AMD, one they cannot mess up, and cannot delay either. lol
Because RTX cards are just for the high end at this point. Nvidia released the 2060 first, hoping some people would stretch their budget by those extra $50. It's like smartphone manufacturers making their flagship phones a bit more expensive with every generation.
Now it's time for the mainstream cards.
Super XPComparing the two cards, Vega 56 and the 1660 Ti, the Vega 56 wins out hardware-wise any day.
It's a compute card and a gaming card in one.
"if you tell a lie a thousand times"...
#84
Vayra86
efikkanA performance uplift of ~35-40% is not bad at all. Most gamers don't upgrade every generation anyway, so that's just a poor excuse.

The poor sales of Turing come down to a couple of things. Firstly, a higher-than-normal overstock of old cards on discount led to an undeservedly negative response from reviewers; the RTX 2070 and RTX 2080 are not that badly priced, but look bad next to Pascal on discount. Secondly, availability of the new cards has been a little low.
What the hell are you talking about? Between release and EOL, Pascal was on discount for maybe 1-3 months, and even then it was priced above MSRP. The only real discounts were second-hand GPUs that people had mined with.

Furthermore, the only performance uplift is found in the 2080 Ti. The rest of the stack is identical to Pascal in terms of raw performance, and costs as much or more, except for the 2060. This isn't progress; it's just Nvidia pushing an overpriced Titan variant to the market when it comes to performance. Been there, done that; it was never a good buy, and it isn't today.

Realistically, what we're looking at since Pascal's release is complete stagnation for the overwhelming majority of the target audience.
#85
efikkan
Vayra86Furthermore, the only performance uplift is found in the 2080 Ti. The rest of the stack is identical to Pascal in terms of raw performance, and costs as much or more, except for the 2060. This isn't progress; it's just Nvidia pushing an overpriced Titan variant to the market when it comes to performance. Been there, done that; it was never a good buy, and it isn't today.

Realistically, what we're looking at since Pascal's release is complete stagnation for the overwhelming majority of the target audience.
You have a very interesting relationship with the truth.:wtf:
Rough estimates from Techpowerup's reviews:
GTX 1060 => RTX 2060: +~63%(1440p) +~60%(1080p)
GTX 1070 => RTX 2070: +~44%(4K) +~41%(1440p)
GTX 1080 => RTX 2080: +~43%(4K) +~41%(1440p)
GTX 1080 Ti => RTX 2080 Ti: +~38%(4K) +~33%(1440p)
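These uplift figures are just the ratio of the two cards' relative-performance scores, expressed as a percentage. A minimal Python sketch of the arithmetic; the 70% input is an illustrative placeholder roughly consistent with the +43% figure above, not a value pulled from a specific review.

# Convert a "relative performance" reading (older card as a % of the newer
# card) into a generational uplift percentage.
def uplift_percent(old_relative: float) -> float:
    return (100.0 / old_relative - 1.0) * 100.0
# e.g. if a GTX 1080 sits at ~70% of an RTX 2080 at 4K, the 2080 is ~43% faster:
print(f"{uplift_percent(70.0):.0f}%")  # -> 43%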

If only the shoe was on the other foot, you would have praised Turing.
#86
Fluffmeister
efikkanYou have a very interesting relationship with the truth.:wtf:
Rough estimates from Techpowerup's reviews:
GTX 1060 => RTX 2060: +~63%(1440p) +~60%(1080p)
GTX 1070 => RTX 2070: +~44%(4K) +~41%(1440p)
GTX 1080 => RTX 2080: +~43%(4K) +~41%(1440p)
GTX 1080 Ti => RTX 2080 Ti: +~38%(4K) +~33%(1440p)

If only the shoe was on the other foot, you would have praised Turing.
Hey, at least we have the world's first 7nm GPU that has finally put Pascal to bed:

Radeon VII 16GB (February 2019, 7nm) => GTX 1080 Ti (March 2017, 16nm): -8% (1080p), -5% (1440p), ±0% (4K)
#87
efikkan
FluffmeisterHey, at least we have the world's first 7nm GPU that has finally put Pascal to bed:

Radeon VII 16GB (February 2019, 7nm) => GTX 1080 Ti (March 2017, 16nm): -8% (1080p), -5% (1440p), ±0% (4K)
At least the red team is innovating!:rolleyes:
#88
bug
efikkanYou have a very interesting relationship with the truth.:wtf:
Rough estimates from Techpowerup's reviews:
GTX 1060 => RTX 2060: +~63%(1440p) +~60%(1080p)
GTX 1070 => RTX 2070: +~44%(4K) +~41%(1440p)
GTX 1080 => RTX 2080: +~43%(4K) +~41%(1440p)
GTX 1080 Ti => RTX 2080 Ti: +~38%(4K) +~33%(1440p)

If only the shoe was on the other foot, you would have praised Turing.
The thing is, the numbering is a little misleading. If we look at price, performance, and power draw, the RTX 2060 is the successor to the GTX 1070 (Ti), the RTX 2070 is the successor to the GTX 1080, and the RTX 2080 replaces the GTX 1080 Ti. Even then, the new cards are faster. How someone can look at these cards and conclude "the only performance uplift is found in the 2080 Ti" is beyond me. Yet it happens.
#89
Super XP
notbPrecisely that.
It's not RX480. They can't just push the clocks and recycle it. Unless they make Navi just another GCN refresh on the old node.

This is based on...? Do we have some official data?
2000 series are high-end cards and they're meant to sell in relatively low volume. But from what I've heard, sales are fine in the target group. Even looking at this forum, you can see many people decided to replace their 1080.

Because RTX cards are just for the high end at this point. Nvidia released the 2060 first, hoping some people would stretch their budget by those extra $50. It's like smartphone manufacturers making their flagship phones a bit more expensive with every generation.
Now it's time for the mainstream cards.

"if you tell a lie a thousand times"...
It's a good thing I stated Hard Facts then.
efikkanAt least the red team is innovating!:rolleyes:
:toast: :laugh:
#90
Assimilator
Super XPIt's a good thing I stated Hard Facts then.
Hard facts that don't mean s**t in the market that Vega is playing in. The only people who ever bought those cards for compute were miners; miners aren't buying them anymore, so compute on Vega is effectively wasted die area at this point.
#91
Vayra86
efikkanYou have a very interesting relationship with the truth.:wtf:
Rough estimates from Techpowerup's reviews:
GTX 1060 => RTX 2060: +~63%(1440p) +~60%(1080p)
GTX 1070 => RTX 2070: +~44%(4K) +~41%(1440p)
GTX 1080 => RTX 2080: +~43%(4K) +~41%(1440p)
GTX 1080 Ti => RTX 2080 Ti: +~38%(4K) +~33%(1440p)

If only the shoe was on the other foot, you would have praised Turing.
GTX 1060 ($249) >> RTX 2060 ($349) (you will find me saying the 2060 is the only interesting release this gen, but comparing it to a 1060 is not correct)
GTX 1070 ($379) >> RTX 2070 ($499)

Should I continue? This isn't rocket science. Perf/dollar is 100% stagnant from Pascal to Turing, and the only interesting release is in the midrange, with performance that has been available for over two years now.

There is absolutely nothing to praise here, and this has been clear as day since this generation was announced. Nvidia does its usual twisting and bending of the numbers to extract more cash from a market that has been starving for upgrades for years; the question is whether you fall for it, or use common sense and hard numbers that factor in price. The reality is, if you look at TPU reviews, barely anything has changed across the board in terms of realistic purchase options: the stack got updated and we barely gained anything. A few 5-8% perf jumps at the same price point is hopeless (that is what AMD managed with just driver releases over Pascal's lifetime, for some perspective), especially with twice as much time between releases as usual.

But let's drive this home anyway, because apparently I'm saying strange things to you:

GTX 970 >> GTX 1070: +40% perf / $329 vs. $379 (the usual generational +30% got a little bonus, and it costs us 50 bucks extra; reasonable!)
GTX 980 >> GTX 1080: +51% perf / $549 vs. $599 (same deal, bigger jump)

Yeah, business as usual hm? :p My god... I thought we had smart people here.
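To put numbers on the perf/dollar point, here is a minimal Python sketch using the launch MSRPs above and the rough uplift estimates quoted earlier in this thread; illustrative arithmetic, not a formal benchmark comparison.

# Perf-per-dollar change across a generation: scale the performance gain by
# the price increase. Uplift estimates and MSRPs are the rough figures cited
# in this thread, used purely for illustration.
def perf_per_dollar_change(uplift_pct: float, old_price: float, new_price: float) -> float:
    return ((1 + uplift_pct / 100) / (new_price / old_price) - 1) * 100

pairs = [
    ("GTX 970  -> GTX 1070", 40, 329, 379),
    ("GTX 980  -> GTX 1080", 51, 549, 599),
    ("GTX 1060 -> RTX 2060", 60, 249, 349),
    ("GTX 1070 -> RTX 2070", 41, 379, 499),
]
for name, uplift, old_p, new_p in pairs:
    print(f"{name}: {perf_per_dollar_change(uplift, old_p, new_p):+.0f}% perf per dollar")

With these inputs, the Pascal-era pairs land around +22% and +38% perf per dollar, while the Turing pairs come out in the single digits to low teens.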
bugThe thing is, the numbering is a little misleading. If we look at price, performance, and power draw, the RTX 2060 is the successor to the GTX 1070 (Ti), the RTX 2070 is the successor to the GTX 1080, and the RTX 2080 replaces the GTX 1080 Ti. Even then, the new cards are faster. How someone can look at these cards and conclude "the only performance uplift is found in the 2080 Ti" is beyond me. Yet it happens.
Titan was faster too back in the Kepler days, but nobody told you it was an interesting GPU to buy for gaming.

I think it's about time people cooled down their upgrade itch a bit and looked at the hard numbers, most notably the one with the dollar sign in front. There is little to be gained here for a high-end Pascal user, which most of us are.
#92
bug
Vayra86Titan was faster too back in the Kepler days, but nobody told you it was an interesting GPU to buy for gaming.

I think it's about time people cooled down their upgrade itch a bit and looked at the hard numbers, most notably the one with the dollar sign in front. There is little to be gained here for a high-end Pascal user, which most of us are.
You are not wrong (and I have always bought mid-range cards because perf/$), but at the same time, cost isn't everything to everyone.
#93
Vayra86
bugYou are not wrong (and I have always bought mid-range cards because perf/$), but at the same time, cost isn't everything to everyone.
Cost is a thing for 95% or more of the target market; it's too big to ignore. The exotic (questionable) forum statements of a tiny minority of enthusiasts don't change a thing about how a GPU generation is received.

Note: this does NOT mean there is a hard upper limit to the cost of a GPU. A good example is the 1080 Ti: lots of those have been sold in spite of a pretty hefty price tag. It was simply great price/perf. And this is where Turing (most of it) falls flat on its face.

Price is a key factor; there is no denying it. If it weren't, we'd have had our ray tracing decades ago, and a computer farm in the basement to run it.
#94
bug
Vayra86Cost is a thing for 95% or more of the target market; it's too big to ignore. The exotic (questionable) forum statements of a tiny minority of enthusiasts don't change a thing about how a GPU generation is received.

Note: this does NOT mean there is a hard upper limit to the cost of a GPU. A good example is the 1080 Ti: lots of those have been sold in spite of a pretty hefty price tag. It was simply great price/perf. And this is where Turing (most of it) falls flat on its face.

Price is a key factor; there is no denying it. If it weren't, we'd have had our ray tracing decades ago, and a computer farm in the basement to run it.
Yeah, I won't argue with that.
The only thing I was trying to correct was when you said "I think it's about time people cooled down their upgrade itch a bit and looked at the hard numbers". Then you also said "Cost is a thing for 95% or more of the target market; it's too big to ignore."
The 95% is debatable, but then again not everybody lives in a first-world country. My point was, there aren't many who need to "cool down their upgrade itch". If you're already price-conscious, you weren't planning to upgrade anyway. That's all. I hope that makes sense now.
#95
las
efikkanA performance uplift of ~35-40% is not bad at all. Most gamers don't upgrade every generation anyway, so that's just a poor excuse.

The poor sales of Turing come down to a couple of things. Firstly, a higher-than-normal overstock of old cards on discount led to an undeservedly negative response from reviewers; the RTX 2070 and RTX 2080 are not that badly priced, but look bad next to Pascal on discount. Secondly, availability of the new cards has been a little low.
I have a 1080 Ti. The RTX 2080 is a side-grade, and the 2080 Ti is expensive and not that impressive.

How can you say the 2070 and 2080 are not badly priced? Haha, the 2080 took over the 1080 Ti's pricing while performing the same.

I'll be buying Nvidia GPUs again when 7nm chips are out.
#96
Super XP
AssimilatorHard facts that don't mean s**t in the market that Vega is playing in. The only people who ever bought those cards for compute were miners; miners aren't buying them anymore, so compute on Vega is effectively wasted die area at this point.
Miners? I'm talking about professional use.
#97
Assimilator
Super XPMiners? I'm talking about professional use.
Nobody uses consumer cards in professional applications, mostly because you can't: the professional features are locked to actual professional SKUs, so that graphics card companies can make money off said professional SKUs.
#98
bug
Super XPComparing the two cards, Vega 56 and the 1660 Ti, the Vega 56 wins out hardware-wise any day.
It's a compute card and a gaming card in one.
Super XPMiners? I'm talking about professional use.
It could have a place in rigs that both game and use OpenCL for stuff like photo editing. But Nvidia has the nasty habit of beating the crap out of AMD's OpenCL implementation. Not always, but often enough.
#99
medi01
Vayra86GTX 980 >> GTX 1080: +51% perf / $549 vs. $599
The 980 was a piece of shit at $549 for quite some time, before the 1080 was in sight, let alone "Fools Edition" pricing.
But good job greentwisting reality.

Another great excuse is how expensive the new process node is.
#100
Vayra86
medi01The 980 was a piece of shit at $549 for quite some time, before the 1080 was in sight.
But good job greentwisting reality.

Another great excuse is how expensive the new process node is.
Damn! Call the news, because we agree on a point here! Still doesn't change a thing about the 970 vs 1070, both high-end-value kings.

I'm also not presenting excuses, quite the opposite, in fact. Do you even recognize an ally when he drops a care package at your door, or are you simply blinded by anger?