Friday, September 7th 2018

NVIDIA's 20-series Could be Segregated via Lack of RTX Capabilities in Lower-tier Cards

NVIDIA's Turing-based RTX 20-series graphics cards have been announced to begin shipping on the 20th of September. The most compelling argument for users to buy them is the leap in ray-tracing performance, enabled by hardware-based acceleration via the RT cores that have been added to NVIDIA's core design. NVIDIA has been pretty bullish about how this development reinvents graphics as we know it, and is quick to point out the benefits of this approach against other, shader-based approximations of real, physics-based lighting. In a Q&A at the Citi 2018 Global Technology Conference, NVIDIA's Colette Kress expounded on the new architecture's strengths - but also touched upon a possible segmentation of graphics cards by raytracing capabilities.

During that Q&A, NVIDIA's Colette Kress put Turing's performance at a cool 2x improvement over their 10-series graphics cards, discounting any raytracing performance uplift - and when raytracing is indeed brought into consideration, she said performance has increased by up to 6x compared to NVIDIA's last generation. There's some interesting wording when it comes to NVIDIA's 20-series lineup, though; as Kress puts it, "We'll start with the ray-tracing cards. We have the 2080 Ti, the 2080 and the 2070 overall coming to market," which, in context, seems to point towards a lack of raytracing hardware in lower-tier graphics cards (apparently, those based on the potential TU106 silicon and lower-level variants).
This is just speculation based on Kress's comments, but if it translates to reality, it would be a tremendous misstep for NVIDIA and for raytracing in general. The majority of the market games on sub-x70-tier graphics cards (and the 20-series has even seen a price hike, up to $499 for the RTX 2070...), and failing to add RT hardware to lower-tier graphics cards would exclude a huge portion of the playerbase from raytracing effects. This would mean that developers adding NVIDIA's RTX technologies and implementing Microsoft's DXR would be spending development resources catering to the smallest portion of gamers - the ones with high-performance discrete solutions. And we've seen in the past what developers think of devoting their precious time to such features.
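If that segmentation happens, every DXR title would have to gate its raytraced effects behind a per-GPU capability check and keep a shader-based fallback path alive. As a minimal, illustrative sketch of what that gate looks like against the DXR-enabled Direct3D 12 SDK (the helper name is ours, not from any shipping engine):

```cpp
#include <windows.h>
#include <d3d12.h>

// Hypothetical helper: returns true when the D3D12 device reports any DXR
// support. An engine would use this to choose between its raytraced and
// traditional shader-based lighting paths.
bool SupportsRaytracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false; // older runtime/SDK: no raytracing support at all
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

The check itself is cheap; the expensive part is everything behind the `false` branch - a second lighting path that has to be built, tuned and QA'd for the bulk of the installed base.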
Additionally, if this graphics card segregation by RTX support (or lack thereof) were to happen, what would become of NVIDIA's lineup? GTX graphics cards up to the GTX 2060 (and maybe a 2060 Ti), and RTX from there upwards? Diluting NVIDIA's branding across GTX and RTX doesn't seem like a sensible choice, but of course, if it came to that, it would be much better than keeping the RTX prefix across the board.

It could also simply be a case of it not being feasible to include RT hardware on smaller, lower-performance GPUs. As performance leaks and previews have been showing us, even NVIDIA's top-of-the-line RTX 2080 Ti can only deliver 40-60 FPS at 1080p in games such as the upcoming Shadow of the Tomb Raider and Battlefield V (DICE has even said they had to tone down levels of raytracing to achieve playable performance). Performance improvements before release could bring FPS up to a point, but all signs point towards a needed decrease in rendering resolution for NVIDIA's new 20-series to cope with the added raytracing compute. And if performance looks like this on NVIDIA's biggest (revealed) Turing die, with its full complement of RT cores, we can only extrapolate what raytracing performance would look like on cut-down dies with fewer RT execution units. Perhaps it really wouldn't make much sense to add the increased cost and die area of this dedicated hardware if raytracing could only be supported at playable levels at 720p.
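To put a rough number on that extrapolation, here is a naive back-of-envelope sketch - not a benchmark - that assumes raytraced frame rate scales linearly with RT core count. The 68-core figure matches the RTX 2080 Ti's announced configuration (one RT core per SM) and the 40-60 FPS range is the one cited above; the smaller core counts are hypothetical lower-tier configurations, not announced products.

```cpp
#include <cstdio>

int main()
{
    // RTX 2080 Ti: 68 RT cores, reportedly 40-60 FPS at 1080p with RTX on.
    const double tiCores = 68.0;
    const double tiFpsLow = 40.0, tiFpsHigh = 60.0;

    // Hypothetical cut-down configurations (e.g. a 2070-class part and a
    // smaller die below it) - purely illustrative numbers.
    const double cutDownCores[] = { 36.0, 24.0 };

    for (double cores : cutDownCores) {
        double scale = cores / tiCores; // naive linear-scaling assumption
        std::printf("%2.0f RT cores -> roughly %2.0f-%2.0f FPS at 1080p\n",
                    cores, tiFpsLow * scale, tiFpsHigh * scale);
    }
    return 0;
}
```

Even under this optimistic linear model, a 24-core part lands in the mid-teens to low-20s FPS at 1080p - exactly the territory where a drop to 720p starts to look unavoidable.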
All in all, it seems to this editor that segregation of graphics cards via RTX capabilities would be a mistake, not only because of userbase fracturing, but also because the majority of players game at the x60 tier and below. Developers wouldn't be inclined to add RTX to their games for such a small userbase, and NVIDIA would be looking at diluting its gaming brand across RTX and GTX - or risk confusing customers by putting the RTX prefix on cards that can't do raytracing. If any of these scenarios come to pass, I risk saying it might have been too soon for the raytracing push - even as I applaud NVIDIA for making it anyway and pushing graphics rendering further. Perhaps timing and technology could have been better, though? I guess we'd all better wait for actual performance reviews, right?
Source: NVIDIA via Seeking Alpha

132 Comments on NVIDIA's 20-series Could be Segregated via Lack of RTX Capabilities in Lower-tier Cards

#51
Totally
Durvelle27: Well if someone thought RT would be on lower tier GPUs, that's their fault.
This. Especially when the little info NVIDIA has released shows performance taking a massive hit with RT on versus off, and being borderline acceptable even then, it is easy to see that RT running on less powerful cards will be unplayable.
Posted on Reply
#52
B-Real
Actually, in the case of a $1,050-1,100+ card (2080 Ti), where game developers were super happy to tell us that one game can run at 1080p 60 fps (Metro), and we saw a game that cannot hit 60 fps at FHD, even dipping into the 30s (SotTR), why does it matter whether a card with half the performance or less (2060) supports RT? Because if we calculate a bit (half of 40-60 fps is 20-30 fps), we see that it may run RT games at 30-ish fps at best - like having the performance of an x50 card in normal circumstances.

And I bet that most gamers would like to see better facial (and body) animations, no corpses clipping through textures, better A.I., etc., instead of realtime lighting and reflections over the ones we have now - which you may not even be able to tell are realtime or not...
Posted on Reply
#53
dick_cheney
Gamers: I want advanced graphics and faster video cards

Company: Here is a card that is faster and has advanced features

Gamers: WHAT?!?! Why does this cost more!!!

The lack of competition does not help, I must admit, but Nvidia is a for-profit business and not a charity, so vote with your wallets, I suppose.
Posted on Reply
#54
GoldenX
Next year Vulkan and DirectX 12 will get ray tracing extensions and this will be FreeSync all over again.
Posted on Reply
#55
Apocalypsee
This gives me flashbacks to the GeForce4 era, where the MX cards were DirectX 7 parts with software vertex shaders while the Ti cards were fully DirectX 8 compliant, with pixel and vertex shaders.
Posted on Reply
#56
GoldenX
A lot of low-end 600-, 700- and 900-series cards are Fermi, without Vulkan support or even modern drivers.
Posted on Reply
#57
Captain_Tom
dick_cheney: Gamers: I want advanced graphics and faster video cards

Company: Here is a card that is faster and has advanced features

Gamers: WHAT?!?! Why does this cost more!!!

The lack of competition does not help, I must admit, but Nvidia is a for-profit business and not a charity, so vote with your wallets, I suppose.
These "features" are only advanced if they are usable. They are not.

If Nvidia actually wanted raytracing to be a big deal, there would be 4 RT cores per SM instead of 1. Oh, and those "RT cores" are really just re-purposed cores meant for general compute use. They are selling cut-down compute cards to gamers. That's it, and it's hilarious there are people defending these shenanigans. Turing is a cheaper-to-manufacture Volta meant to compete with Vega (which is a huge threat to Nvidia's professional market).

You are correct that people will NEED to vote with their wallets this year. At a certain point, though, I worry - it seems like somehow Nvidia really has convinced a wide swath of sheep that they should just keep mailing them their money for almost zero gains every year. Now they are actually trying to convince them that they should pay twice as much for a downgrade to 1080p gaming again.

But hey - if you pay them $2400 I am sure you can game in 1440p on a nice $1000 G-sync monitor with 2080 Ti SLI. Right sheep? Right? Pathetic...
Posted on Reply
#58
timta2
GoldenX: Wow, this release is getting worse and worse.
The fanboy wishful thinking is nothing new. I've been watching it with every new Nvidia release, for longer than a lot of the fanboys have been alive.
Posted on Reply
#59
jmcosta
Captain_Tom: If Nvidia actually wanted raytracing to be a big deal, there would be 4 RT cores per SM instead of 1. Oh, and those "RT cores" are really just re-purposed cores meant for general compute use. They are selling cut-down compute cards to gamers. That's it, and it's hilarious there are people defending these shenanigans.
It has always been like that for Nvidia, AMD and Intel.
At least with Nvidia we sometimes get to see the full chip on a mainstream card: the Titan, which was never an ideal card for gaming, but it's there.
AMD also tried recently with the Frontier Edition, with the full chip and a stack of memory, but it was more of a concept card in the end.

Don't forget that we consumers in this particular market are the "2nd class citizens".
Posted on Reply
#60
GoldenX
At the very least with AMD and Intel, all GPUs get the full specs of their series.
Posted on Reply
#61
FordGT90Concept
"I go fast!1!11!1!"
AdoredTV said as much (RTX for top cards, GTX for bottom). As long as they keep going with the pattern of RTX establishing a baseline of raytracing performance, it's a good idea. If it's purely marketing fluff, it's rubbish.
Posted on Reply
#62
moproblems99
Captain_Tom: It seems like somehow Nvidia really has convinced a wide swath of sheep that they should just keep mailing them their money for almost zero gains every year.
So, do you really call the difference between Maxwell and Pascal 'almost zero gain'? If so, what was 480 to 580?

EDIT: Quick piece of additional info, I was referring to RX 480 to RX 580.

EDIT2: I have a Vega 56 and love it.
Posted on Reply
#63
R0H1T
moproblems99: So, do you really call the difference between Maxwell and Pascal 'almost zero gain'? If so, what was 480 to 580?

EDIT: Quick piece of additional info, I was referring to RX 480 to RX 580.

EDIT2: I have a Vega 56 and love it.
Nearly all the gains came from higher clocks; there were virtually zero "IPC" gains. The higher clocks were due to TSMC's 16nm FF process, and as for the 580, that's just a respin of the 480 - not unlike Zen+ ;)
Posted on Reply
#64
Captain_Tom
jmcosta: It has always been like that for Nvidia, AMD and Intel.
At least with Nvidia we sometimes get to see the full chip on a mainstream card: the Titan, which was never an ideal card for gaming, but it's there.
AMD also tried recently with the Frontier Edition, with the full chip and a stack of memory, but it was more of a concept card in the end.

Don't forget that we consumers in this particular market are the "2nd class citizens".
What on Earth are you talking about? There have been full chips from both companies for YEARS lol.
Posted on Reply
#65
Batailleuse
Captain_Tom: But hey - if you pay them $2400 I am sure you can game in 1440p on a nice $1000 G-sync monitor with 2080 Ti SLI. Right sheep? Right? Pathetic...
Well... Technically SLI doesn't do much since the 10 series: 2x 1080 Ti in SLI is barely 5-10% more FPS in 90%+ of games. And SLI brings frame lag, desync and random graphical bugs. I'd assume two 2080 Tis would be the same, so you'd basically be throwing $2400 at performance you could achieve for way cheaper with a simple overclock under watercooling. So far nothing suggests SLI raytracing would act differently. If it does, yeah, maybe you could play with raytracing; otherwise I'd rather just buy a single 2080, overclock the fuck out of it with watercooling to play everything at 1440p at 120-144 FPS, and forget about 4K and raytracing altogether.
Posted on Reply
#66
GoldenX
Captain_Tom: What on Earth are you talking about? There have been full chips from both companies for YEARS lol.
He must be young.
Posted on Reply
#67
B-Real
dick_cheney: Gamers: I want advanced graphics and faster video cards

Company: Here is a card that is faster and has advanced features

Gamers: WHAT?!?! Why does this cost more!!!

The lack of competition does not help, I must admit, but Nvidia is a for-profit business and not a charity, so vote with your wallets, I suppose.
Being faster doesn't mean it has to be more expensive. You should be aware that NV is milking you. That's the fact.
Posted on Reply
#68
Aquinus
Resident Wat-man
So what you're saying is that nVidia just took their professional lineup and rebranded it for the consumer. Now I really want to see benchmarks because I'm feeling far more skeptical than I was before.
Posted on Reply
#69
Vayra86
R0H1T: Nearly all the gains came from higher clocks; there were virtually zero "IPC" gains. The higher clocks were due to TSMC's 16nm FF process, and as for the 580, that's just a respin of the 480 - not unlike Zen+ ;)
The higher clocks came only in part from 16nm; they also had much better power delivery and an updated GPU Boost to go along with it, and they refined their shaders (read: made them more focused on gaming purposes, with fewer FP32 cores per SM). And it is precisely that which AMD refused to do with GCN, and why it lags behind on performance per shader/mm2/watt.

images.nvidia.com/content/pdf/tesla/whitepaper/pascal-architecture-whitepaper.pdf
- GP100’s SM incorporates 64 single-precision (FP32) CUDA Cores. In contrast, the Maxwell and Kepler SMs had 128 and 192 FP32 CUDA Cores, respectively.

The silly thing about it all is that Turing is in fact a step backwards, making the SM less focused again, and needing more of them at somewhat lower clocks to get performance.
Posted on Reply
#70
rtwjunkie
PC Gaming Enthusiast
B-Real: You should be aware that NV is milking you. That's the fact.
They may, in fact, be milking buyers. The fact remains that, as with any for-profit, publicly traded company and not a charitable organization, they are beholden to the stockholders and will take steps to be as profitable as possible.

If they can do that at a higher price for their product then they will, as long as it sells. You are free as a consumer to vote with your wallet as many will, and not buy it. I know I won’t be buying any RTX 20 series.
Posted on Reply
#71
notb
Aquinus: So what you're saying is that nVidia just took their professional lineup and rebranded it for the consumer. Now I really want to see benchmarks because I'm feeling far more skeptical than I was before.
Quadro and Tesla are the pro GPU lineup. RTX is still the consumer stuff. No ECC, pro drivers or support.
But it seems like a very decent Titan replacement. I haven't seen a full feature comparison, but the obvious things are present - Tensor cores being the most important part.
rtwjunkie: If they can do that at a higher price for their product then they will, as long as it sells. You are free as a consumer to vote with your wallet as many will, and not buy it. I know I won't be buying any RTX 20 series.
Well... I'm pretty sure I would look for a notebook with an RTX 2050/2060 if it came out, but that seems unlikely.
Nvidia has only announced the RTX 2080 Max-Q for now.

If I keep a desktop, RTX 2070 seems like a probable choice. It's way out of my typical PC budget, but I should be able to find the extra ~$300 somewhere else.
And if I switch to a notebook, RTX 2070 seems like a good reason to finally try eGPU. :-)
Posted on Reply
#72
Aquinus
Resident Wat-man
notb: Quadro and Tesla are the pro GPU lineup. RTX is still the consumer stuff. No ECC, pro drivers or support.
I remember the days when you used to be able to trick the driver into thinking a GeForce card was a Quadro card by faking a different device ID. Features can be disabled in the driver, but my point is that the hardware itself feels like professional hardware getting re-purposed with watered-down drivers. Saying that the lower-end "GTX" cards are going to be more like the cards nVidia had before screams to me: re-purposed hardware. I wouldn't be surprised if ECC is supported but they just put non-ECC DRAM on it for the consumer.
Posted on Reply
#73
iO
Aquinus: I remember the days when you used to be able to trick the driver into thinking a GeForce card was a Quadro card by faking a different device ID. Features can be disabled in the driver, but my point is that the hardware itself feels like professional hardware getting re-purposed with watered-down drivers. Saying that the lower-end "GTX" cards are going to be more like the cards nVidia had before screams to me: re-purposed hardware. I wouldn't be surprised if ECC is supported but they just put non-ECC DRAM on it for the consumer.
This.
It looks like TU102/4 was made specifically for Quadro cards, and making an RT- and Tensor-core-less GPU just for gaming cards wasn't financially viable with 7nm just around the corner.
Posted on Reply
#74
robal
I don't give much toss about ray tracing, but this quote is an eyebrow raiser:
NVIDIA's Colette Kress put Turing's performance at a cool 2x improvement over their 10-series graphics cards, discounting any raytracing performance uplift
2x? 2x what? Price?
Posted on Reply
#75
D1RTYD1Z619
coonbro: maybe a 2080 RTX at $1500 and then a 2080 GTX at $750 [with a $20 mail-in rebate]
It would probably just be 10 bucks of Fortnite credit.
Posted on Reply