
NVIDIA's SUPER Tease Rumored to Translate Into an Entire Lineup Shift Upwards for Turing

Wait... isn't that what AMD has been doing with every release post-HD 7970? Maxwell? "Here, have this rebrand." And isn't Navi the exact same thing?

GPUs are supposed to devalue fast; if they don't, it means progress is stalling. That's what we've been looking at post-Pascal to the present... 1080 Ti performance still sells for about MSRP.

This SUPER feels a bit like a Kepler refresh to me. Not a bad thing IMO, more cards = lower prices.
The only reason for this "not bad thing" is the release of Navi.
You may have had to wait 1-2 years to see a "real" refresh, or 3 years for something genuinely new from Nvidia. AMD, meanwhile, kept refreshing because they couldn't make new cards; they had to focus on the CPU segment and on console graphics. AND the mining thing just saved AMD's ass in recent years, or they'd have had a very tough period.
 
Do you have proof that "they cost just as much"?
Just don't eat up anything Nvidia throws at you, like "our cards cost too much, that's why they are expensive," or their stated reason for not moving to 7nm. They are just dishonest: not only are they not telling the truth, they are obviously and intentionally lying. (Yes, there is a huge difference between keeping silent and lying.) Nvidia didn't move to 7nm because there was no NEED for it (yet). That is the one and only reason.
Is this proof enough for you? https://www.anandtech.com/show/14528/amd-announces-radeon-rx-5700-xt-rx-5700-series
 
Guess we'll have to wait and see what happens. If they can get current "2080 Ti" performance into a card that costs around $700, I'll buy two of them and corresponding waterblock kits to go with them. If they don't, then I'll keep waiting for Intel Xe and Nvidia's 3000 series.
 
Devalued by the same exact card now being a tier lower in pricing.

I bought an X1800 XT and lo, the X1900 XT displaced it and devalued it, so it lost a couple hundred bucks in resale value.

Did it still work? Yes.
Was it worth the cost? No.

It's about consumer value: if I buy a premium product, I expect it to be valued as a premium product.

Just my opinion.

Did you buy the card as an investment? Or did you, like most everyone else, buy the card to play games or do some other kind of work?

If it's the first one, then I'm sorry. If it's the second one, then whether or not something else came out later doesn't affect your ability to use the card. You still get the same "value" out of the card as you got before the new one came out.
 
It's interesting that Nvidia feels the need to come out with a whole new product stack instead of just dropping prices on their current offerings. Perhaps these SUPER cards are going to hit more cost-efficient configurations of the current chips. Or maybe Nvidia just felt like "new" would sell better than "now cheaper." Whatever the case, the whole GPU space is going to be a nightmare now: 1650, 1660, 1660 Ti, 2060, 2060 Super, 2070, 2070 Super, rumored 2070 Ti, 2080, 2080 Super, 2080 Ti, 2080 Ti Super. Twelve GPUs, not counting laptop parts. That's obscene. And without a bump in the product numbers, there are going to be so many confused people buying inferior cards. It's going to take a while for retailers to discount the "old" GPUs, so I'd guess some kids are going to buy the non-SUPER parts for more than they should.

I really hope Nvidia is flushing their non-SUPER chips out of the market, because 12 GPUs is just ridiculous.

EDIT: Forgot about the rumored 1650 Ti, so that's at least 13 GPUs to keep track of! Pray for @W1zzard, those GPU tests are not going to be fun...
 
I really hope Nvidia is flushing their non-SUPER chips out of the market, because 12 GPUs is just ridiculous.

The last quarterly report showed inventory turnover at 143 days; normal is 60-ish. It's been above 120 for 3 quarters. It will be interesting to see what effect adding a new lineup has.
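For context, "days of inventory" is just inventory divided by daily cost of goods sold. A rough sketch of that metric, using made-up figures (not Nvidia's actual financials), chosen only to land near the quoted 143 days:

# Hypothetical numbers for illustration -- not real financial data.
def days_inventory_outstanding(inventory, quarterly_cogs):
    """Average number of days a dollar of inventory sits before being sold."""
    daily_cogs = quarterly_cogs / 91  # roughly 91 days per quarter
    return inventory / daily_cogs

print(round(days_inventory_outstanding(1.4e9, 0.9e9)))  # ~142 days

The higher that number climbs above the ~60-day norm, the more unsold stock is piling up relative to sales.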
 
How exactly is the card they just bought devalued? Does it all of a sudden get less fps in games, now that a new card is out?

Of course, you can always wait and get something newer and faster. But that will always be true. Should people not buy Zen 2 processors, because Zen 3 will be coming down the road, not to mention Zen 4 or 5?

I think it's the timing and the high price at release. The Turing cards were released ~6 months ago at astronomical prices, and now a new card is coming that is faster and cheaper. At least with the Zen chips, people knew the schedule was about a year out.
 
Devalued by the same exact card now being a tier lower in pricing.

I bought an X1800 XT and lo, the X1900 XT displaced it and devalued it, so it lost a couple hundred bucks in resale value.

Did it still work? Yes.
Was it worth the cost? No.

It's about consumer value: if I buy a premium product, I expect it to be valued as a premium product.

Just my opinion.

But it is - the 2080 Ti is not getting the SUPER treatment, as far as I can read...

So the 'premium' product is still up there in its number one halo spot. The rest of the stack is not 'premium'; an x80 is nothing special in the usual scheme of things. The price is special, sure... :D
 
But it is - the 2080 Ti is not getting the SUPER treatment, as far as I can read...

So the 'premium' product is still up there in its number one halo spot. The rest of the stack is not 'premium'; an x80 is nothing special in the usual scheme of things. The price is special, sure... :D

According to the current rumors, the 2080 Ti should get a SUPER release too, just not at the same time as the 2060, 2070, and 2080 SUPERs. I'm sure Nvidia wants to keep milking the Titan RTX for $2,500 for a little while longer.
 
How exactly is the card they just bought devalued? Does it all of a sudden get less fps in games, now that a new card is out?

Of course, you can always wait and get something newer and faster. But that will always be true. Should people not buy Zen 2 processors, because Zen 3 will be coming down the road, not to mention Zen 4 or 5?

Value is directly tied to performance per dollar.

The value of every frame per second the card delivers is now cut in half, because the card itself is now worth half what it was before.

The value (i.e., what the card could be sold to someone else for) has decreased, and their "investment" is going to return half as much as it could have if they had sold it before these cards existed.

That's the "value" most would be upset about losing.

Did you buy the card as an investment? Or did you, like most everyone else, buy the card to play games or do some other kind of work?

If it's the first one, then I'm sorry. If it's the second one, then whether or not something else came out later doesn't affect your ability to use the card. You still get the same "value" out of the card as you got before the new one came out.
Sorry, but for many it's actually a bit of both: we buy the card to play games or whatnot, but we also take into account the resale value of the card and how long it can last without losing too much of that value.

That process has allowed me to upgrade almost yearly while losing very little value in my parts over the years.

This move means it will most likely be the first time I have to spend double what I normally pay to upgrade, due to the quick devaluation of the current 20 series.

It is a big deal to some of us for more than "clout" reasons.
 
Sorry, but for many it's actually a bit of both: we buy the card to play games or whatnot, but we also take into account the resale value of the card and how long it can last without losing too much of that value.

That process has allowed me to upgrade almost yearly while losing very little value in my parts over the years.
Upgrading graphics cards yearly is probably the most expensive thing you can do, unless you rely on buying during heavy discounts. Resale value is also very hard to predict: 1.5 years ago resale value was good due to the mining mania, lately it has been very bad, and it also depends on the region/country.

If you buy a new card at x dollars every year and sell the old one for 67% of the price (which is optimistic), after three years you've spent 1.67x the original card price (probably >2x if we account for shipping), for very minimal upgrades. In general you would be much better off buying one or two tiers up and keeping the card for three years.
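A quick sketch of that arithmetic (prices normalized so the card costs 1.0):

# Buy a new card each year; each upgrade costs the new price minus
# the 67% resale of the old card (treating 67% as 2/3).
x = 1.0
total = x                      # year 1: initial purchase
for year in (2, 3):
    total += x - (2 / 3) * x   # each yearly upgrade nets 1/3 of the price
print(round(total, 2))         # 1.67: you've spent ~1.67x the card's price

Shipping, fees, and a less optimistic resale rate only push that multiplier higher, which is the point.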
 
Upgrading graphics cards yearly is probably the most expensive thing you can do, unless you rely on buying during heavy discounts. Resale value is also very hard to predict: 1.5 years ago resale value was good due to the mining mania, lately it has been very bad, and it also depends on the region/country.

If you buy a new card at x dollars every year and sell the old one for 67% of the price (which is optimistic), after three years you've spent 1.67x the original card price (probably >2x if we account for shipping), for very minimal upgrades. In general you would be much better off buying one or two tiers up and keeping the card for three years.
Well, for one, it's not really yearly; I tend to sell right before a new release while value is still high, using backup cards in the meantime or just going without gaming. And I'm not overspending on crazy cards, as I only need base cards and I watercool.

My last investment was about $1,500 for my two 1080 Tis. I sold them for $1,250 two years later, took that $1,250, got a base 2080 Ti, and watercooled it.

All in all, I only "spent" about $250 for 2 years' worth of top-end gaming.

Now I'll be lucky to clear $700, and then I'll be reinvesting another $1,200 or more, meaning I'll have spent over $500 for 6 months or less of similar top-end gaming.

You see the issue I'm facing? If I were to upgrade (not saying I would), the "value" of my setup is absolutely trashed by this.
 
You see the issue I'm facing?

Drag that 6 months out to 12 and you will have doubled your return. The fact that you sold your 1080 Tis for basically what you paid for them is a fluke and shouldn't be used as the basis for any other transactions.

Also, by purchasing a top-end GPU yearly, you should never expect any sort of return and should always expect to lose your ass.
 
Drag that 6 months out to 12 and you will have doubled your return. The fact that you sold your 1080 Tis for basically what you paid for them is a fluke and shouldn't be used as the basis for any other transactions.

Also, by purchasing a top-end GPU yearly, you should never expect any sort of return and should always expect to lose your ass.
I've managed a similar loss on each generation before, if just a little worse than the $125 a year I lost on the 1080 Tis. And again, it's not exactly yearly: I owned 780 SLI, 980 Ti SLI, 1080 SLI, and 1080 Ti SLI, and each time I sold the previous cards right before their price fell, but between most of those it was a year or more. I did get the most out of the 1080 Ti setup for sure, but my losses were never as bad as they would be if I were to follow the same path I've taken before, now with the 2080 Ti.

The only thing that will force my hand to upgrade at this point is HDMI 2.1 support, so if these cards have it I will upgrade; if not, I'll get to hold out for the next gen. As someone with a new OLED TV that has HDMI 2.1 and no sources that currently support it, the next-gen consoles and a video card that offers it are the only things I'm looking forward to right now.
 
Have you noticed an elephant in the room?
A whole tier bump for the same price; when was the last time you saw that from greedy green?

Something is going on... hm... what could it be, hehehe...

TSMC 7nm is at least twice as expensive for the same density, probably more, since the old "16/12nm" node has reached its full potential while the 7nm node is still maturing.

It MUST BE. Otherwise we'd need to call BS on the whole "pretty cool margins" line.
 
Soo... for Palit...

That's going to be:
GeForce RTX 2080 Super Super JetStream!? :D :laugh:

 
Have you noticed an elephant in the room?
A whole tier bump for the same price; when was the last time you saw that from greedy green?
Save for an odd release here and there, I've only seen this on every single launch I care to remember.
 
Save for an odd release here and there, I've only seen this on every single launch I care to remember.

The only difference is that this isn't really a 'release' in the true sense of the word. Same arch, same node. All they are doing is releasing what they probably had planned 6 months ago, before AMD decided to show up to the potluck without a dish again.
 
The only difference is that this isn't really a 'release' in the true sense of the word. Same arch, same node. All they are doing is releasing what they probably had planned 6 months ago, before AMD decided to show up to the potluck without a dish again.
Eh, solid arguments are lost on medi; no need to bother with little details like these.
 
The only difference is that this isn't really a 'release' in the true sense of the word. Same arch, same node. All they are doing is releasing what they probably had planned 6 months ago, before AMD decided to show up to the potluck without a dish again.
It's a product refresh. It's at least a change in the binning, if not a new stepping.
Yields were initially a little "sub-optimal", leading Nvidia to create the "A" and "non-A" versions of the chips.
 
It's a product refresh. It's at least a change in the binning, if not a new stepping.
Yields were initially a little "sub-optimal", leading Nvidia to create the "A" and "non-A" versions of the chips.
Given the maturity of TSMC 12nm at the time of launch, I guess that gives us a pretty good indication of just how troublesome the gigantic die sizes of Turing were - and seemingly still are, or at least have continued to be for quite a while.
 
Given the maturity of TSMC 12nm at the time of launch, I guess that gives us a pretty good indication of just how troublesome the gigantic die sizes of Turing were - and seemingly still are, or at least have continued to be for quite a while.
It's the same trouble as producing any large die: you get fewer dies from a wafer. If you get 20 dies out of a wafer, losing one die means losing 5% of the output. If you only get 10 dies, then one defective transistor costs 10% instead. Imprinting the pattern is pretty much copy/paste; the limiting factor here is the fab's ability to keep defects to a minimum. And while 12nm was over a year old by the time we got Turing, I don't think that's long enough to reach the optimum output for a given process.
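A quick sketch of that arithmetic, using the die counts from the paragraph above (real wafers hold far more dies; the ratio is what matters):

# The fewer dies a wafer holds, the larger the share of output one defect kills.
for dies_per_wafer in (20, 10):
    loss_per_defect = 1 / dies_per_wafer
    print(f"{dies_per_wafer} dies/wafer -> one defective die = {loss_per_defect:.0%} of the wafer")
# 20 dies/wafer -> one defective die = 5% of the wafer
# 10 dies/wafer -> one defective die = 10% of the wafer

And since defects land roughly uniformly across the wafer, a die twice the size is also roughly twice as likely to catch one, so big dies lose on both ends.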

Regardless, we're getting cheaper Turing now.
 
Regardless, we're getting cheaper Turing now.
It's easy to release something cheaper when the whole lineup is hella expensive :D
 
It's the same trouble as producing any large die: you get fewer dies from a wafer. If you get 20 dies out of a wafer, losing one die means losing 5% of the output. If you only get 10 dies, then one defective transistor costs 10% instead. Imprinting the pattern is pretty much copy/paste; the limiting factor here is the fab's ability to keep defects to a minimum. And while 12nm was over a year old by the time we got Turing, I don't think that's long enough to reach the optimum output for a given process.

Regardless, we're getting cheaper Turing now.
A year into volume production, a process node as similar to its predecessor as TSMC's 12nm really ought to be quite mature - but as you say, even a single defect can have big consequences when the die is big enough, and Nvidia is really pushing things in that regard with Turing. Even the RTX 2070 and 2060 dies are gigantic compared to previous products in the same market segment (note: not price segment). Kind of funny how people were shocked at the size of the Fury X die back in the day, at 596 mm² on 28nm, yet here Nvidia is pushing out significantly bigger dice than that on 12nm, with "mid-range" dice more than 2/3 of that size. Times are changing, I suppose.
 
It's easy to release something cheaper when the whole lineup is hella expensive :D
I didn't say it was hard ;)
Hell, till I see benchmarks I'm not even sure we're back into sane territory.
 