
NVIDIA's SUPER Tease Rumored to Translate Into an Entire Lineup Shift Upwards for Turing

I thought I saw something on the Videocardz site about price reductions on the Supers. I can't find it now, though. Maybe I just imagined it. It wouldn't make much sense for Nvidia to lower the price below what the cards are selling for now, unless the stockholders are getting nervous about the RTX cards not selling as well as expected, according to Mr. Huang.
 
I thought I saw something on the Videocardz site about price reductions on the Supers. I can't find it now, though. Maybe I just imagined it. It wouldn't make much sense for Nvidia to lower the price below what the cards are selling for now, unless the stockholders are getting nervous about the RTX cards not selling as well as expected, according to Mr. Huang.
This is the most reliable (if it can be called that) rumor we have so far: https://hothardware.com/news/nvidia-geforce-rtx-super-family-next-week
So probably the same prices combined with better specs. The 2060 was the best deal and will become way sweeter with a 256-bit memory bus and 8 GB of VRAM.
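Just to put rough numbers on why the wider bus matters, here's a quick back-of-the-envelope sketch. It assumes the rumored card keeps the 2060's 14 Gbps GDDR6, which the leak doesn't confirm:

```python
# Back-of-the-envelope GDDR6 bandwidth comparison: current RTX 2060 vs. the rumored 256-bit SUPER.
# Assumption (not confirmed by the leak): the SUPER keeps the 2060's 14 Gbps memory speed.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_2060       = bandwidth_gbs(192, 14.0)  # current card: 192-bit bus
rtx_2060_super = bandwidth_gbs(256, 14.0)  # rumored card: 256-bit bus, same memory speed assumed

print(f"RTX 2060:       {rtx_2060:.0f} GB/s")        # 336 GB/s
print(f"RTX 2060 SUPER: {rtx_2060_super:.0f} GB/s")  # 448 GB/s, about a third more
```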

Edit: Managed to dig up the version that includes pricing: https://www.techradar.com/news/nvidia-super-rtx
 
And thus the cycle begins again. Welcome back, LE cards!
 
Hell, till I see benchmarks I'm not even sure we're back into sane territory.

Yep, that's what I'm watching for. At this point I fully expect Nvidia to disappoint me. However, my wallet is on standby on the off chance that they don't.

My expectations at this point are simply a small performance bump while maintaining the same unreasonable price scheme. Now, if the "2080 Ti Super" is, say, 80% or more faster than a 1080 Ti, I may be willing to buy a couple of them at $1,200 per unit, but if it's only 50-60% faster, then they can keep 'em and I'll stay in my holding pattern until Intel Xe / Nvidia's 3000 series.
 
This is the most reliable (if it can be called that) rumor we have so far: https://hothardware.com/news/nvidia-geforce-rtx-super-family-next-week
So probably the same prices combined with better specs. The 2060 was the best deal and will become way sweeter with a 256-bit memory bus and 8 GB of VRAM.
Their actual price will depend on the price and performance of AMD's new cards. And no, Nvidia's current cards are way overpriced; the 2060 should've been $250 at best and the 2070 should've been $350. Stop justifying the ultimate greediness of Nvidia and its SUPER EXPENSIVE, OVERPRICED PRODUCTS.
 
Their actual price will depend on the price and performance of AMD's new cards. And no, Nvidia's current cards are way overpriced; the 2060 should've been $250 at best and the 2070 should've been $350. Stop justifying the ultimate greediness of Nvidia and its SUPER EXPENSIVE, OVERPRICED PRODUCTS.
Stop confusing wishful thinking with market rules. Why should the 2060 go for $250? Why should the 2070 go for $350?
Turing is expensive for me, but the right price is the one the market is willing to pay.
 
Stop confusing wishful thinking with market rules. Why should the 2060 go for $250? Why should the 2070 go for $350?
Turing is expensive for me, but the right price is the one the market is willing to pay.
It is not wishful, it is the truth: they are mid and upper-mid range cards, and their prices were always like that until Lisa Su, the niece of Jensen Huang, took over AMD and killed the GPU department of AMD. (I don't care if it is a coincidence or a conspiracy, but that's what happened.)
 
It is not wishful, it is the truth: they are mid and upper-mid range cards, and their prices were always like that until Lisa Su, the niece of Jensen Huang, took over AMD and killed the GPU department of AMD. (I don't care if it is a coincidence or a conspiracy, but that's what happened.)
They're only mid and upper-mid rangers if you look at their names. Otherwise, 2060 replaces the 1070 in both HP and TDP, while 2070 does the same for 1080.
But if you think the box defines a product better than its capabilities, well, you'll probably keep complaining.
 
They're only mid and upper-mid rangers if you look at their names. Otherwise, 2060 replaces the 1070 in both HP and TDP, while 2070 does the same for 1080.
But if you think the box defines a product better than its capabilities, well, you'll probably keep complaining.
So according to your logic the 2060's price should've been $20,000, because it has 20x the performance of a 7800 GTX?
 
So according to your logic the 2060's price should've been $20,000, because it has 20x the performance of a 7800 GTX?
I don't think what I wrote can lead you to this conclusion.
 
It is not wishful, it is the truth: they are mid and upper-mid range cards, and their prices were always like that until Lisa Su, the niece of Jensen Huang, took over AMD and killed the GPU department of AMD.
The market defines what is low-end, mid-range and high-end, and the only qualifier that makes sense is performance vs. the rest of the market, not price. E.g. if AMD priced the RX 570 at $2,000, it would still be a low-end card. The typical split into low-end, mid-range and high-end divides the market into three bins, which would put the transition between upper mid-range and high-end at or above the RTX 2080 in the current market.
 
I don't think what I wrote can lead you to this conclusion.

Well, that depends.

Otherwise, 2060 replaces the 1070 in both HP and TDP, while 2070 does the same for 1080.

If just HP and TDP determine the price compared to previous generations, then yes, in a far-stretched, obtuse way.
 
They're only mid and upper-mid rangers if you look at their names. Otherwise, 2060 replaces the 1070 in both HP and TDP, while 2070 does the same for 1080.
But if you think the box defines a product better than its capabilities, well, you'll probably keep complaining.
Expecting equal performance to shift down a full product tier per generation is a reasonable expectation, and has always been so, and "product tier" in this case means "price tier" - as that's really the only sensible differentiator given that there's neither a set maximum nor a minimum for performance, while price brackets change far more slowly. The issue with Turing is that we got somewhat less than a full tier down in terms of naming, while at the same time prices didn't budge whatsoever. While the argument can be made that "the market sets the prices", that's also rather tautological - after all, any market actor with enough power has the power to determine prices, and consumers have zero say in the matter. Consumers are the weakest actors in any market, period. "Voting with your wallet" isn't a thing, as there's no way of voting no, just yes or an uncounted and thus unnoticeable "option" to abstain.
 
Consumers are the weakest actors in any market, period.

Yes and no. Yes if you can get enough consumers to band together. No if the consumers are fractured.
 
The market defines
The market in which we have a little thing called duopoly in action. Not much of a market when you've got that, especially when one of the two doesn't even bother to compete at the high end.

Worse yet, one of the two players is actively working against the PC gaming platform by promoting the nonsense known as the console (for a total of three artificially incompatible x86 platforms: unnecessary platform fragmentation). The situation will be less dire once Jaguar finally gets the boot it should have gotten when it was first suggested for adoption, but the problem remains. It's actually in AMD's interest not to compete at the high end, in order to make consoles look better. Let Nvidia keep prices in the stratosphere for the best hardware; then people of modest means will have little choice but to buy either the mid-range AMD cards on offer or a console. Neat, huh? Everyone wins but the consumer.

Monopolization eats markets and spits out inefficiency in the form of yachts and swallows' nest dinners.
 
Yes and no. Yes if you can get enough consumers to band together. No if the consumers are fractured.
Consumers are notoriously fractured, and the vast majority of attempts to organize them fail miserably (and the ones that don't are short-lived). The reason is simple: most consumers don't have the time, money or energy to put much effort into ensuring their consumption is ethically sound, as they have other things (e.g. life) that take precedence. The myth of the empowered consumer is propagated by the powerful in capitalist systems to make the system seem fair and make consumers seem equal, when the core of capitalism is inequality and concentration of power.

The only effective example of a large-scale boycott or similar consumer action over the past few decades is the one against South African apartheid, and that was quite a while ago.
 
Expecting equal performance to shift down a full product tier per generation is a reasonable expectation, and has always been so, and "product tier" in this case means "price tier" - as that's really the only sensible differentiator given that there's neither a set maximum nor a minimum for performance, while price brackets change far more slowly. The issue with Turing is that we got somewhat less than a full tier down in terms of naming, while at the same time prices didn't budge whatsoever. While the argument can be made that "the market sets the prices", that's also rather tautological - after all, any market actor with enough power has the power to determine prices, and consumers have zero say in the matter. Consumers are the weakest actors in any market, period. "Voting with your wallet" isn't a thing, as there's no way of voting no, just yes or an uncounted and thus unnoticeable "option" to abstain.
Well, the 2060 is slightly faster than a 1070Ti and the 2070 is better than a 1080. The reason we're not seeing the whole expected performance shift is RTRT. So for Turing, we're trading some of the expected performance for a whole new GFX tech. I just don't see the fuss.

If you take RTRT out of the equation, even the 1660 is a good deal faster than the 1060 for about the same price.
 
Well, the 2060 is slightly faster than a 1070Ti and the 2070 is better than a 1080. The reason we're not seeing the whole expected performance shift is RTRT. So for Turing, we're trading some of the expected performance for a whole new GFX tech. I just don't see the fuss.

If you take RTRT out of the equation, even the 1660 is a good deal faster than the 1060 for about the same price.
The 1660 is a decent deal, absolutely, but sadly the clear exception to the rule in terms of Turing and value. The 1650 is downright terrible value, and the higher tiers provide no real-world value gains for normal users. You're right that we gain new tech, but that tech works in, what, five games now, 6 months after the launch of the "people's RTX" 2060? I doubt RTX will provide any actual value for users before the majority of these cards are obsolete, sadly - which means buyers are giving up on a generational perf/$ gain for a feature with very, very limited use. And in two years or so, when we have far more games with RTRT, will an RTX 2060 be powerful enough to run them at 1080p60 without very significant compromises? That sounds unlikely to me - but of course, predicting the future is impossible, and I might be entirely wrong. It just seems like a poor bet from an end-user standpoint, and a poor showing from Nvidia, in that they're essentially saying "No, you're not getting a faster GPU this time around, but you're getting a feature that you might - if you're lucky - get some IQ gains from in games in the future." It's also rather odd given that GPUs tend to be upgraded on a 2-3 year cycle, meaning that first-gen products like these inevitably get superseded by newer solutions long before the tech actually becomes relevant or useful.

Now, don't get me wrong, I'm all for more realistic lighting, reflections and spatial audio (realism can foster immersion, after all), but I'm not a fan of paying a(n effective) premium for the promise that this might become a reality at some point in the future.
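To put the "giving up on a generational perf/$ gain" point in rough numbers, here's a quick sketch using ballpark launch MSRPs and an assumed ~55% average uplift for the 2060 over the 1060 6GB (illustrative figures, not exact review data):

```python
# Rough perf-per-dollar comparison, GTX 1060 6GB -> RTX 2060, at launch MSRP.
# The 1.55x uplift is an assumed ballpark, not a measured number; swap in your own benchmark data.

launch_msrp   = {"GTX 1060 6GB": 249, "RTX 2060": 349}    # USD, approximate launch prices
relative_perf = {"GTX 1060 6GB": 1.00, "RTX 2060": 1.55}  # assumed relative performance

perf_per_dollar = {card: relative_perf[card] / launch_msrp[card] for card in launch_msrp}

gain = perf_per_dollar["RTX 2060"] / perf_per_dollar["GTX 1060 6GB"] - 1
print(f"Perf/$ change vs. the previous x60 card: {gain:+.0%}")  # roughly +11% with these assumptions
```

With those assumptions, the x60 tier's perf/$ barely moved generation over generation, which is the core of the complaint.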
 
@Valantar I was just pointing out we get new tech supported by new hardware and yet everybody is dismissing Turing for not following the price/perf ratio perfectly. We do get a more than decent perf boost, but the large silicon costs $$$. That's all there is to it.

And yes, RTRT is only present in a handful of games, but guess what? So was PS3.0 or PS2.0 that came before. And I'm pretty sure Nvidia would have kept the lid on RTRT for one more generation till they could introduce it a more palatable price point. But with AMD pretty much a no show for years now, they would have been stupid not to gain a foothold in the new tech as soon as they could.
 
Better hardware plus faking it the way CryTek/ReShade do is still the best option, considering the pace of advancements in refresh rates and resolution options on the market and on the near horizon. Those are easier options that will perform better and won't look a whole lot worse in the end.
 
From WCCyouknowheweare:

[attached image]
 
Appreciate you sharing that Medi -

Will be curious to see performance. Wonder how long before they release the 2080 Ti Super; sounds like the Ti models will be the last to be released.
 