Sunday, July 12th 2020

NVIDIA Prepares to Stop Production of Popular RTX 20-series SKUs, Raise Prices

With its GeForce RTX 30-series "Ampere" graphics cards on the horizon, NVIDIA has reportedly taken the first steps toward discontinuing popular SKUs in its current RTX 20-series graphics cards. Chinese publication ITHome reports that several premium RTX 20-series SKUs, which include the RTX 2070, RTX 2070 Super, RTX 2080 Super, and the RTX 2080 Ti, are on the chopping block, meaning that NVIDIA partners are placing the last orders with upstream suppliers for parts that make up their graphics cards based on these GPUs.

It is a slow process toward product discontinuation from this point, which usually takes 6-9 months, as the market is left to soak up leftover inventory. Another juicy bit of information from the ITHome report is NVIDIA allegedly guiding its partners to increase prices of its current-gen high-end graphics cards in response to a renewal in interest in crypto-currency, which could drive up demand for graphics cards. NVIDIA is expected to announce its GeForce RTX 30-series on September 17, 2020.
Sources: ITHome, Tom's Hardware

66 Comments on NVIDIA Prepares to Stop Production of Popular RTX 20-series SKUs, Raise Prices

#26
fynxer
Here is how it works:

NVIDIA wants to raise the price of the RTX 2000 series with all partners; this way the RTX 3000 series can be introduced at a higher price, giving the RTX 3000 series more apparent value compared to the RTX 2000 series.

Then after the RTX3000 is introduced they can make a FAKE price cut to the RTX2000 series based on the new inflated price.

This way they get a higher introduction price for the RTX 3000 without it seeming too weird, AND a sell-out price for the RTX 2000 that is basically the same as today's.

Let me show you an example with the EVGA RTX 2070 Super Gaming, priced at approx. $519 on Amazon.com today:

1. Today they raise the price of the RTX 2070 from $519 to $650
2. At the end of August they introduce the RTX 3070 at $699
3. At the beginning of September they lower the price of the RTX 2070 to $499 as a "SALE" to dump stock
4. At the end of September they release the RTX 3070 at $699

Now you have a BIG FAKE SALE to dump RTX 2070 stock in September at the same price you can actually buy the card for today.

They created an illusion of dropping a price that they just inflated. THIS IS CLASSIC PRICE MANIPULATION TO SCREW YOUR CUSTOMERS.
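The shuffle described above boils down to which baseline a "discount" is measured against; here is a minimal sketch using the hypothetical prices from the example:

```python
# Sketch of the alleged price shuffle, using the hypothetical prices above.
street_price_today = 519   # EVGA RTX 2070 Super on Amazon today
inflated_price = 650       # price after the alleged pre-launch hike
sale_price = 499           # later "SALE" price used to dump stock

# The advertised discount is measured against the inflated price...
advertised_discount = (inflated_price - sale_price) / inflated_price
# ...but the real saving versus today's street price is far smaller.
real_discount = (street_price_today - sale_price) / street_price_today

print(f"advertised: {advertised_discount:.0%}, real: {real_discount:.0%}")
# advertised: 23%, real: 4%
```

A headline 23% off the inflated price works out to only about 4% off what the card actually sells for today.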

On top of all this they are trying to use the FAKE EXCUSE of "increased mining demand", which is nonexistent at this point, as cover to raise the price of the RTX 2000 series.

Since NVIDIA is involving its partners, it must be seen as an ILLEGAL PRICE SYNDICATE manipulating prices with a fake cover story; a class-action suit waiting to happen for sure.

I have now made a snapshot of prices with all major vendors within the EU; if NVIDIA goes ahead with this price increase before the RTX 3000 release I will report them to the appropriate EU authorities.
Posted on Reply
#27
InVasMani
watzupken
I think AMD have been quite tight-lipped about the performance of Big Navi, so it is hard to tell how fast it will be. It is not that Nvidia is confident it can beat Big Navi, but rather the lack of competition from AMD on the high end right now. But if Nvidia increases prices and AMD is able to fill that supply void, it may benefit AMD instead.
It would be funny if AMD slashed pricing should Nvidia actually go through with this price-raising nonsense, taking sales and market share away from them in the process while still turning a healthy profit because 7nm yields happen to be as good as or better than expected.
bug
Funny how we went from the initial sticker shock to "the popular SKU series".
If you recall that Turing dies are pushing the maximum size allowable by the process, you won't be surprised Nvidia is moving to discontinue these parts ASAP.
That could simply be an excuse for Nvidia to phase them out while jacking up prices due to yields. Those chips are larger and on an older node relative to the AMD chips they compete against, which isn't an advantage on yields.
Posted on Reply
#28
hat
Enthusiast
What crypto boom? :kookoo:
Posted on Reply
#29
cucker tarlson
Would it be funny if AMD yet again followed the business practices of Nvidia but fanboys turned a blind eye again?
Posted on Reply
#30
medi01
cucker tarlson
samsung 10nm
8nm, but you can call it 28nm if it makes you feel better.
cucker tarlson
is rdna the reason they made the 2080Ti a 4300-cuda 750mm monster?
RDNA is why you got the 2060 Super and 2070 Super.
Just by coincidence, it's where the 5700 series sits performance-wise.

Hopefully, when next-gen cards come, NV will have better RT performance; this would reduce the number of green meltdowns.

AMD will have great products in the under-$1k space, which is where 99% of the market is.
Posted on Reply
#31
GreiverBlade
a price raise on something already overpriced? NICE!
well ... my 1070 was 526chf/$ ...

for sure I will not care about benches and performance and will take whatever the red team puts out as the RX 5700 XT successor ... (with the "issues" to tinker with ... which I will probably never have to ... no ATI/AMD card I owned ever needed a driver rollback, unlike NV's)

someone wants a new leather jacket, even before the 30XX launch .... :ohwell:
Posted on Reply
#32
cucker tarlson
medi01
8nm, but you can call it 28nm if it makes you feel better.


RDNA is why you got the 2060 Super and 2070 Super.
Just by coincidence, it's where the 5700 series sits performance-wise.

Hopefully, when next-gen cards come, NV will have better RT performance; this would reduce the number of green meltdowns.

AMD will have great products in the under-$1k space, which is where 99% of the market is.
It makes me feel better to be factually correct. Samsung's 8nm is not an 8nm process, the same way 12nm was a more mature 16nm.

If expectations for a 505mm² 7nm chip are so low that you're not even thinking it can match Nvidia's 630mm² 10nm part, it's sad, especially with the 3080 possibly being a cut-down 102 this time.
So yeah, AMD's best 7nm against Nvidia's middle 10nm; that's competition these days. So much RDNA2 pressure.
Posted on Reply
#33
medi01
cucker tarlson
nvidias 10nm
It's OK to call Samsung 8nm "nvidias 10nm", no worries.
(oh, just realized that it would also be a way to survive next gen, if RT disparity ain't there)
cucker tarlson
505mm 7nm expectation
I expect rough perf/transistor parity (Navi <=> Tesla) to persist into next gen.
Posted on Reply
#34
cucker tarlson
medi01
It's OK to call Samsung 8nm "nvidias 10nm", no worries.


I expect rough perf/transistor parity (Navi <=> Tesla) to persist into next gen.
Yes.
Nvidia's 10nm card. AMD's 7nm card.
Not Nvidia's 10nm process or AMD's 7nm process.
Too hard for you?

lol, perf/transistor.
Posted on Reply
#35
medi01
cucker tarlson
nvidia's 10nm card.amd's 7nm card.
That bit is missing the chip size.
cucker tarlson
lol,perf/transistor.
This bit does not.

Although, yeah, apples-to-oranges comparisons (a 250mm² 7nm chip vs a 750mm² 12nm one) look funny when this metric is applied. But so they should.


And, speaking of confidence, I find it apparent that NV will be getting more and more upset by AMD offerings as time goes on. Shortly, AMD will also go for oversized chips, covering the entire GPU market. Intel, on the other hand, will disrupt NV's GPU market.
Posted on Reply
#36
dyonoctis
cucker tarlson
so what you're saying is AMD's 505mm² 7nm+ will lose to Nvidia's 630mm² 10nm and that'll be a win for AMD, but kids don't know.
I feel like you are not speaking the same language as the people you are arguing with. While you are looking at who gets the absolute technological lead, all they care about is how many fps they get for a given amount of money. If AMD's 7nm midrange manages to compete with Nvidia's 10nm midrange for $50 less, that's enough for them.
Posted on Reply
#37
cucker tarlson
medi01
And, speaking of confidence, I find it apparent, that NV will be getting more and more upset by AMD offerings, as time goes.
Upset as time goes by? :roll:
How about being beaten by a card a node behind? They have a 7nm process available at 95MT/mm²; why not beat the living crap out of Nvidia's cards made at 64MT/mm²?
Sad.
dyonoctis
I feel like you are not speaking the same language as the people you are arguing with. While you are looking at who gets the absolute technological lead, all they care about is how many fps they get for a given amount of money. If AMD's 7nm midrange manages to compete with Nvidia's 10nm midrange for $50 less, that's enough for them.
It's enough if you want to cede the entire upper end to Nvidia, apparently.
So look at AMD's non-existent offerings next time you whine about 102 cards being expensive.
Posted on Reply
#38
watzupken
jax
Where is that high crypto value? It should be 10-20x higher in order to start another mining round.
Gungar
Crypto mining hasn't used standard GPUs for some time now, and no, cryptocurrency isn't high-value at all; it can't even hold half the peak value it had...
You don't need the price to go through the roof before people react. If you wait for that, it will be too late.

Picking up does not mean a mining craze, by the way. Given the lower energy prices, it may become more attractive to start some mining activity.

As to whether people still use GPUs to mine, I am not sure, since I don't do crypto mining myself. But I believe it will still make sense for some people to mine on a GPU. After all, if one chooses not to continue mining, they can still sell the GPU.
Posted on Reply
#39
bug
Two things:

1. We only have a rumor of a rumor Nvidia wants to raise prices. Prices are what the market dictates. Initially I thought Turing was way too expensive, but it seems Nvidia had no problem moving the parts. So don't worry about that.
2. Process names are mostly indicative. Everybody knows they don't tell much (if anything) and you have to be familiar with each process to be able to compare between them.
Posted on Reply
#40
watzupken
bug
Two things:

1. We only have a rumor of a rumor Nvidia wants to raise prices. Prices are what the market dictates. Initially I thought Turing was way too expensive, but it seems Nvidia had no problem moving the parts. So don't worry about that.
2. Process names are mostly indicative. Everybody knows they don't tell much (if anything) and you have to be familiar with each process to be able to compare between them.
Turing is too expensive. However, there isn't much competition coming from AMD to put pressure on them. While AMD eventually did have decent RDNA graphics cards to compete, the initial launch was, based on what I read, marred by poor driver stability that persisted for some time, which basically continued to fuel the sale of Turing cards.
Posted on Reply
#41
dyonoctis
cucker tarlson
it's enough if you want entire upper end to nvidia apparently.
so look at amd's non-existent offerings next time you whine about g102 cards being expensive.
Yeah, but it's not like AMD is doing that on purpose. It's better than when they talk big (poor Volta) and launch a product that is nothing like how it was marketed.

If RDNA cannot scale to the same magnitude as Turing, it's better to just own it rather than trying to make the tech do something it wasn't meant to do. At the end of the day the Radeon wing still needs to make money to not be dead weight and keep funding the R&D; they cannot just keep scrapping every design that isn't up to par with Nvidia's offerings.

But it's true that the comments about Nvidia sweating over a product we know nothing about are somewhat wacky.
Posted on Reply
#42
RedelZaVedno
I have a simple buying rule: I buy a GPU when I can get at least a 50% performance increase for the same price.
I'll buy my next GPU when I can get 1080 Ti + 50% performance for $600, the price I paid for my 1080 Ti. If that means skipping another generation of GPUs, so be it. I can wait.
Posted on Reply
#43
R0H1T
bug
Prices are what the market dictates. Initially I thought Turing was way too expensive, but it seems Nvidia had no problem moving the parts. So don't worry about that.
Not necessarily. I know for a fact that companies do this sh!t to test the waters, though generally through official channels. OnePlus just sent us a survey (not me personally, but people I know) for their upcoming mobile launch in India; guess what the price of that thing will be?

I have a strong feeling that nCoV will render the JHH Kool-Aid worthless this round if they indeed go for a much higher price than the already overpriced Turing. You simply can't beat nature or pandemics; eventually the real economy and the global downturn will catch up, and then, who knows, Nvidia may not have the time or enough room to move chips through the consumer market! Nvidia's fall is gonna be spectacular and it will be their own doing, just like Intel's. Only a matter of when, not if.
Posted on Reply
#44
bonehead123
Soooo glad I got my 2080's when I did (last year) :clap:

Now I guess we can wait and see everyone trying to sell them for like 2-3x what they paid, so they can afford the 3x cards when they come out....
Posted on Reply
#45
bug
watzupken
Turing is too expensive. However there isn't much competition coming from AMD to put pressure on them.
If it were too expensive, it would collect dust on shelves. It's just more expensive than many would like it to be.
watzupken
While AMD eventually did have decent RDNA graphics cards to compete, the initial launch was, based on what I read, marred by poor driver stability that persisted for some time, which basically continued to fuel the sale of Turing cards.
Navi's issues are still unresolved. But they affect few cards; if you're willing to deal with an RMA in case you draw the short straw, you should be OK.
Posted on Reply
#46
medi01
Where does the "but it doesn't scale" concept come from, pretty please?
When did AMD ever roll out a bigger chip that remarkably failed to gain performance over its smaller chips?
cucker tarlson
upset as time goes by ?
No, not just upset, as Tesla was already disrupted ("upset") by Navi, but more upset. Which will inevitably include the datacenter business.

You know, "GSync is dead" kind of upset.
cucker tarlson
with a node down ?
That excuse could have been applied to most of GPU history. AMD was notoriously the first to embrace the next node (and it hit them harder past 28nm).

NV chose Samsung's 8nm over TSMC 7nm; surely, had the gap been too big, they would not have.
cucker tarlson
they have 7nm process available with 95MT/mm why not beat the living crap out of Nvidia's cards made on 64MT/mm ?
Intel is above 100MT/mm² (that's where their 10nm sits), so tell me how that works out.
I guess it's much more nuanced than a single figure tells.
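The "single figure" caveat is easy to demonstrate: quoted process densities sit far above what shipped gaming chips actually achieve. A small sketch using approximate public launch figures for the two chips discussed in this thread:

```python
# Achieved transistor density of shipped GPUs vs. quoted process density.
# Transistor counts (billions) and die areas (mm^2) are approximate public specs.
chips = {
    "TU102 (RTX 2080 Ti, TSMC 12nm)": (18.6, 754),
    "Navi 10 (RX 5700 XT, TSMC 7nm)": (10.3, 251),
}
for name, (transistors_bn, area_mm2) in chips.items():
    density = transistors_bn * 1000 / area_mm2  # MT/mm^2
    print(f"{name}: {density:.1f} MT/mm^2")
# Both land well below headline process figures (e.g. the ~95 MT/mm^2 quoted
# for TSMC 7nm above), since big GPUs mix logic, SRAM, and I/O regions whose
# densities differ wildly.
```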

Had the gap between Samsung 8 and TSMC 7 been really big, leather man would have stayed with Taiwan.
cucker tarlson
sad.
Actually dumb. Fanboyism does not make people smarter.
Posted on Reply
#47
cucker tarlson
medi01
Where does the "but it doesn't scale" concept come from, pretty please?
When did AMD ever roll out a bigger chip that remarkably failed to gain performance over its smaller chips?


No, not just upset, as Tesla was already disrupted ("upset") by Navi, but more upset. Which will inevitably include the datacenter business.

You know, "GSync is dead" kind of upset.


That excuse could have been applied to most of GPU history. AMD was notoriously the first to embrace the next node (and it hit them harder past 28nm).

NV chose Samsung's 8nm over TSMC 7nm; surely, had the gap been too big, they would not have.


Intel is above 100MT/mm² (that's where their 10nm sits), so tell me how that works out.
I guess it's much more nuanced than a single figure tells.

Had the gap between Samsung 8 and TSMC 7 been really big, leather man would have stayed with Taiwan.


Actually dumb. Fanboyism does not make people smarter.
Tesla was what, lol.
They made the most advanced and expensive card ever; they just got rid of the Tesla name. That counts as beating Tesla to you, lol.
Same way AMD beat GTX when RTX launched. Oh wait, they are still chasing the 1080 Ti.
Posted on Reply
#48
medi01
dyonoctis
When you are looking at who get the absolute technological lead , all they care about is how much fps they get for a said amount of money.
This would make sense applied to Vega: an expensive-to-manufacture product sold at a loss or with tiny margins.

This doesn't apply at all to the 250mm² Navi, which ties with the 330mm² Vega 7 (on the same process) equipped with much more expensive memory.

The "oh, but node" argument doesn't apply here: the chips are smaller than the competition's, and it is just a pit-stop node on the way to 7nm EUV.
Posted on Reply
#49
cucker tarlson
RedelZaVedno
I have a simple buying rule: I buy a GPU when I can get at least a 50% performance increase for the same price.
I'll buy my next GPU when I can get 1080 Ti + 50% performance for $600, the price I paid for my 1080 Ti. If that means skipping another generation of GPUs, so be it. I can wait.
It does.
The 2080 Ti is 1.35-1.40x.
You'll need a 3070 Ti to get to 1.5x.
Wait, that actually might be under $600 if AMD plays ball. If not, then $699 for sure.
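The buying rule being debated here is mechanical enough to write down; a small sketch, with the relative-performance numbers taken as the commenters' rough estimates:

```python
# The "+50% performance for the same price" upgrade rule, sketched.
BASELINE_PRICE = 600      # what the commenter paid for a 1080 Ti
REQUIRED_GAIN = 1.5       # minimum relative performance vs. the 1080 Ti

def worth_upgrading(relative_perf: float, price: float) -> bool:
    """relative_perf: performance vs. the 1080 Ti (1.0 = equal)."""
    return relative_perf >= REQUIRED_GAIN and price <= BASELINE_PRICE

print(worth_upgrading(1.40, 999))   # 2080 Ti estimate: fails on both counts
print(worth_upgrading(1.50, 599))   # a hypothetical 3070 Ti that meets the rule
```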
Posted on Reply
#50
chrcoluk
watzupken
I am not surprised that cryptocurrency mining would pick up due to (1) low energy prices and (2) the high value of cryptocurrency. But if there is anything we learned from the last crypto-fueled GPU price hikes, it is that anything that goes up will come down. Even if the graphics card makers are now more prepared and thus unlikely to go overboard with supply like they did last time, they may not be able to avoid the fact that people will start dumping mining cards cheap, which will affect their sales and will require a price cut to make buying a new card more attractive.
I keep hearing about energy being cheap.

Here in the UK I live alone and my electric bill is £90-£110 a month for a single person; it's a quarter of my rent.

In 2010 I was paying around 9p per unit of electricity; now I pay 15p, and my standing charge has tripled over the same 10-year period.
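Those unit rates are enough to sanity-check the "cheap energy" claim; a quick sketch using the figures above (the 200 W card is a made-up illustrative number, not a quote from the thread):

```python
# Ten-year change in the quoted UK unit rate, plus what it means for mining.
rate_2010 = 0.09   # GBP per kWh, 2010
rate_2020 = 0.15   # GBP per kWh, now
increase = rate_2020 / rate_2010 - 1
print(f"unit rate up {increase:.0%} over the decade")

# Running a hypothetical 200 W mining card around the clock at today's rate:
card_watts = 200
monthly_kwh = card_watts / 1000 * 24 * 30
monthly_cost = monthly_kwh * rate_2020
print(f"roughly GBP {monthly_cost:.2f}/month in electricity alone")
```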
Posted on Reply