
Calling all low and mid GPU owners - shall we swap RT for more performance or lower prices?

Would you be open to sacrificing the capability to run Ray Tracing?

  • Yes, for 30% lower price.

    Votes: 31 48.4%
  • Yes, for 30% more performance.

    Votes: 21 32.8%
  • No, I love RT even with low performance.

    Votes: 12 18.8%

  • Total voters
    64
The pursuit of incorporating Ray Tracing (RT) into games, started by Nvidia, comes at the cost of allocating space on the die exclusively for RT. This valuable space could otherwise be used for additional shaders, thereby enhancing overall performance. This trade-off is becoming increasingly apparent in the escalating prices of GPUs, driven by the soaring costs of chip production and development and by slowing gains in transistor density.

Achieving higher performance in Ray Tracing (RT) entails augmenting the dedicated hardware on the die for this purpose. There's no magic involved, and as a consequence, GPUs are likely to become so expensive that what is currently considered low-end will become the equivalent of today's mid-range pricing. Nvidia isn't dismayed by this situation because its profit is directly tied to a percentage of the price; the greater the price, the larger its share.

It is evident that low and mid-tier GPUs lack the necessary power to effectively handle Ray Tracing, except when employed in an extremely limited manner. I won't even delve into mentioning Path Tracing (PT) due to its resource-intensive nature. So, taking all this into account, answer the question below.

- Would you be open to sacrificing the capability to run Ray Tracing on mid and low-end (under US$500) GPUs in exchange for either a 30% boost in performance or a 30% reduction in price, while maintaining the same level of performance as the current lineup?
 
The solution for me was to buy an AMD RX card instead.
 
I enjoy playing some games, for sure. How they look matters. But the games I've played recently that I've enjoyed the most didn't have any RT, or at least none that was required. A few this year - The Talos Principle 2, Days Gone, Dead Island 2, Resident Evil Village, Dead Space remake.

RT does not make the game. Playability, novelty, story (for some), game mechanics. Lighting is important, but it can be aped. So, yeah, personally, I'd rather see a more tolerable price point than have baked in, over-hyped hardware (that still requires huge software implementations to run).

But that's just me.
 
Ray Tracing in gaming was actually introduced by EA when Trip Hawkins was the CEO, circa 1989. One of the lies the narrative establishes is that you need Ray Tracing to enjoy gaming. I wonder, if you have a 3090, whether you feel you are missing out when you play your games at native. I always use Spider-Man and how beautiful that game is as a reference.

In some ways we are stuck. AAA developers are all in for hardware ray tracing, but they have to make games for the consoles too. GTA6 will be a console exclusive, and since the consoles are AMD, that means their version of ray tracing will get more than just money and Nvidia engineers thrown at it. We may actually get innovations that come from the group philosophy when it comes to these technologies.

Even Hardware Ray Tracing seems similar to Freesync vs Gsync with the way everything is positioned. The consoles will demand that their hardware can do acceptable Ray Tracing. Just like how this generation focused on 4K 60 as a benchmark.
 
I am on both ends when it comes to GPUs: the low end with an RTX 4060 and the high end with an RTX 4090.

With the RTX 4060 I never use RT, as it's generally too weak to run with RT on. So, speaking as a low-end gamer right now, I am actually in for both more performance and lower prices.
If Nvidia could reduce the price of GPUs that are generally too weak for RT anyway, it would be a win: a GPU that could either be made with no RT cores at all, and therefore be cheaper to make, and/or have more CUDA cores instead to increase performance.

As a high-end gamer, there are times when I don't use RT either, but that is more to reduce power consumption and, in rare situations, to get better performance, especially at 4K. There are games out there that give an RTX 4090 a hard time at 4K. The latest example is Avatar: Frontiers of Pandora: at 4K, all maxed out, at native resolution, even without RT on, the 4090 has a hard time maintaining above 60 FPS at all times. So sometimes not even a 4090 is powerful enough to handle RT with everything maxed out at 4K.
 
Wouldn't that mean two different architectures per generation?
That definitely won't be cheap, and may even hurt performance (or quality) due to the added spreading of resources.

Do low end cards even exist anymore?
 
Let's wait and see if @JHuang will join soon... :D
 
Obvious answer: RT is useless at low/mid tiers because what you need there is higher framerates, not lower.

However RT did something very important for low end GPU buyers: It required the development of DLSS to mitigate the RT performance cratering on high end cards, and later on forced FSR and XeSS development, all of which are fantastic tech for lower end cards to maintain framerates in newer games with better visuals than just reducing resolution & quality.

The effing hilarious part is it took the performance-crippling feature addition of RT to force its development.
 
I never once thought gdamn, my life is missing RT!
 
However RT did something very important for low end GPU buyers: It required the development of DLSS to mitigate the RT performance cratering on high end cards
RT made DLSS a necessity indeed, but I believe DLSS would have been developed regardless of whether RT was on the table or not. DLSS development serves Nvidia's agenda in becoming an AI/ML powerhouse.

It could be that RT serves DLSS and its ilk more than it is the other way around. Just consider how much ML stuff Nvidia keeps pushing with its RTX offerings (e.g. the upscaling tools in Remix).
 
Let's wait and see if @JHuang will join soon... :D
Who knows, maybe if @W1zzard puts this poll on the main page, one of the Nvidia employees will see it and say "Look boss, those guys over there are right. The performance/dollar of GPUs in this segment hasn't advanced almost at all because we mainly focus on shoving RT down people's throats."
 
Who knows, maybe if @W1zzard puts this poll on the main page, one of the Nvidia employees will see it and say "Look boss, those guys over there are right. The performance/dollar of GPUs in this segment hasn't advanced almost at all because we mainly focus on shoving RT down people's throats."

I don't think NVIDIA cares about a web poll when they're rolling in money from GPU sales.
 
I never owned any nVidia RTX GPU whatsoever, and the most RT-capable GPU I've had is my RX 6700 XT. At 1440p, it's almost acceptable in one of the least AMD-favouring titles regarding RT, Cyberpunk 2077, with RT partially enabled and FSR cranked up to Balanced or rather Performance mode. The RX 6800 non-XT, the best sub-500 USD AMD GPU, is just a third faster, so it's quite possible to get playable framerates at 1440p with FSR at Quality and touch the 50s with FSR at Balanced. AMD GPUs, regardless of the price segment, are suboptimal for ray tracing and should only be bought for this kind of gaming if nVidia GPUs are unavailable or extremely overpriced in your location.

Speaking of nVidia GPUs... I don't think the RTX 4070 and upwards need any calibration other than making them cheaper, but AMD don't compete hard enough for that to be possible. I would also like the 4070 Ti to be introduced as a 16 GB VRAM GPU, but even at 12 GB, its only real disadvantages are price and a proprietary power connector. This GPU does a good job at RT, all things considered.

The RTX 4060 Ti is just an all-around rubbish device which is too slow to justify anything it has that the 3060 Ti hasn't, considering their near identical prices. This thing could be much better if it had the same 192-bit 12 GB VRAM configuration as the RTX 4070, even if downclocked to, say, 19 Gbps. RT, regardless, is the least of this GPU's problems.
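To put rough numbers on that memory configuration point, here is a minimal sketch. The bus widths and memory speeds below are assumed from publicly listed specs (128-bit/18 Gbps for the 4060 Ti, 192-bit/21 Gbps for the 4070) rather than taken from this thread; the 192-bit/19 Gbps card is the hypothetical one suggested above.

```python
# Rough sketch of the memory-bandwidth comparison above.
# Assumed configurations (public spec listings, not stated in the post):
#   RTX 4060 Ti: 128-bit bus, 18 Gbps GDDR6
#   RTX 4070:    192-bit bus, 21 Gbps GDDR6X
#   Hypothetical 192-bit card downclocked to 19 Gbps, as suggested above.

def bandwidth_gb_s(bus_width_bits: int, speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_width_bits * speed_gbps / 8

print(bandwidth_gb_s(128, 18))  # ~288 GB/s, stock 4060 Ti
print(bandwidth_gb_s(192, 21))  # ~504 GB/s, stock 4070
print(bandwidth_gb_s(192, 19))  # ~456 GB/s, the hypothetical 192-bit / 19 Gbps card
```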

4060 non-Ti, however, is totally an RT-incapable GPU and I'd like it to have RT blocks halved with price being cut by a dozen or even twenty percent. It will still be able to run RT for screenshots and cinematic experience in less demanding titles/resolutions and will be cheap enough for it to be a really awesome offer for lower-budget gamers who don't mind activating DLSS for better experience (at 1440p, it's borderline mandatory for GPUs of such class) and also don't mind disabling most or even all RT in their games.

The upcoming generation is likely to bring a couple of improvements here and there, and I expect the direct 4060 Ti successor to be as RT-capable as the RTX 4070 currently is, or insignificantly slower. And this is just enough for reasonable RT gaming at lower resolutions with 60 FPS in mind. The $300 Blackwell GPU, however, is likely to be as horrible as the 4060 Ti (can't back it up, call it anxiety, or spider sense, or whatever) regarding RT performance, so yeah, trading a dozen percent of RT cores for more raster performance would make some sense.

Yet keep in mind that RT is becoming more and more of a thing lately. Some of the latest games don't include full-on raster modes, and this seems to be becoming a trend rather than those games being outliers. Of course, it's not the 2000s anymore, and you can play almost any existing game at 60+ FPS on the lowest-tier latest-gen GPUs (aka RX 7600 and RTX 4060) if you're at 1080p, or at 1440p with upscaling. My main concern here is the games themselves. Will they be worth investing quid in beefier GPUs, or are we better off changing our hobbies?
 
RT is the first thing I disable (even when I had my 4080) as it's waay too subtle.
Maybe it's my 43-year-old eyes, but I can BARELY notice RT visually.. the performance hit I can notice, though.
 
I wonder how long threads like these will continue to fight the future of GPU design. Thankfully, the people actually designing GPUs understand what that future should be.
 
Thankfully, the people actually designing GPUs understand what that future should be.
If we talk engineers then yes.
If we talk about the marketologists who forced anemic VRAM capacity into most nVidia GPUs, then they are obviously ruining the progress.
 
RT is the first thing I disable (even when I had my 4080) as it's waay too subtle.
Maybe it's my 43-year-old eyes, but I can BARELY notice RT visually.. the performance hit I can notice, though.
Subtle is exactly the point. Besides the light, basic hybrid "RT" in most games (due to AMD consoles that can't do path tracing), RT as a mechanism is supposed to make you immersed. You aren't meant to notice each individual effect; it's supposed to mimic reality and how physics-based light etc. works.

Same goes for developers, they don't want to think about lighting, they just want to build their world and have everything render the way it would look in reality. This is the point of fully ray traced games, and the reason why it's being pushed by everyone who understands graphics.
 
RT cores are not that big. The 1660 Ti, for example, is devoid of RT hardware while having 66% of the CUDA cores of the RTX 2070, and it ends up at 64% of the die area. So you saved about 2%, not 30%.
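To make the arithmetic explicit, here is a minimal sketch. The die sizes (~284 mm² for TU116/1660 Ti, ~445 mm² for TU106/2070) and shader counts (1536 vs 2304) are assumed from public spec listings rather than taken from this thread, so the exact percentage may differ slightly from the figure quoted above.

```python
# Rough sketch of the die-area arithmetic above.
# Assumed figures (public spec listings, not from this thread):
#   TU116 (GTX 1660 Ti): ~284 mm^2, 1536 CUDA cores, no RT/Tensor hardware
#   TU106 (RTX 2070):    ~445 mm^2, 2304 CUDA cores, with RT/Tensor hardware

tu116_area, tu116_cores = 284.0, 1536
tu106_area, tu106_cores = 445.0, 2304

core_ratio = tu116_cores / tu106_cores   # ~0.67 -> "66% of the CUDA cores"
area_ratio = tu116_area / tu106_area     # ~0.64 -> "64% of the die area"

# If die area scaled purely with CUDA core count, TU116 would sit at core_ratio
# of TU106's area. The extra saving attributable to dropping RT (and Tensor)
# hardware is the gap between the two ratios - only a few percent of the die.
extra_saving = core_ratio - area_ratio

print(f"core ratio: {core_ratio:.1%}")
print(f"area ratio: {area_ratio:.1%}")
print(f"area saved beyond pure core scaling: {extra_saving:.1%}")
```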
 
RT cores are not that big. The 1660 Ti, for example, is devoid of RT hardware while having 66% of the CUDA cores of the RTX 2070, and it ends up at 64% of the die area. So you saved about 2%, not 30%.
Yep, not sure if this is 100% to scale, but RT cores aren't a huge part of the 4090, they're just integral.

[attached die shot image]
 
I wonder what will happen to a GPU if you double the amount of RT cores on it.

Let's say RX 6800 non-XT has 60 said cores onboard and AMD reconfigured it to have 120 RT cores with all the clocks and the rest staying as is. How does it affect RT performance and how much more expensive is this to produce? I mean, does this make any sense?

[This is not trolling, I really don't know and want to know]
 
Why in the world would we force gamers to make more sacrifices when Nvidia's gross margins are approaching nearly double Apple's? It's illogical; what we need is sane pricing, not more pandering to Nvidia's endless greed.

Same goes for developers, they don't want to think about lighting, they just want to build their world and have everything render the way it would look in reality. This is the point of fully ray traced games, and the reason why it's being pushed by everyone who understands graphics.

I think what you meant to say is that devs want lighting to be easier. Devs who don't want to think about lighting at all likely aren't going to be good devs. Lighting is one of the biggest elements of building your game world. RT doesn't remove the requirement for a dev to deal with lighting. In outdoor scenes you can use GI, but in many cases this will leave the player with sub-optimal visibility in any shaded areas. There will for sure still be some hand tuning required. Interior scenes are a lot more complicated, where you can't rely on GI nearly as much. There is still a need to place lighting in both interior and exterior scenes, but obviously a lot more for interiors, particularly because modern graphics cards can only simulate 0 to 1 bounces, which means light doesn't really fill interior spaces as it would in real life through light bounces. There's also only 0.5 rays per pixel with current RT implementations, which means there will be a lot of edge cases where that doesn't output satisfactory results. There's also a limitation on the number of RT light sources on current hardware as well. If you look at CP2077 at the highest settings, it's why the game looks more like a dollhouse or diorama with RT enabled, because the only RT exterior light is the sun and no other light source is calculated using RT. Exterior lighting doesn't look right either because there is a lack of light bouncing, hence why many exterior objects in the game have dark shadows on parts obscured from the sun that should be lit by sunlight bouncing off the ground. Can't say I'm a fan of the game's RT mode, it's just uncanny.

Will they be worth investing quid in beefier GPUs, or are we better off changing our hobbies?

I got a lot of people into PC gaming by selling them affordable systems that could run the latest titles very well. Now that entry level GPUs are essentially the same price as what I used to charge for entire PC builds, I'd have to say that for many it simply will be outside their budget. That said I cannot say I recommend console gaming either given the $70 game price point and subscription cost to even play online. At the end of the day you have choices but they are all terrible, except for maybe forfeiting the rat race altogether and becoming a monk.
 
My laptop's RTX 3050 does a pretty decent job, considering what it is and that it has 4 GB of VRAM.

I wonder how long threads like these will continue to fight the future of GPU design. Thankfully, the people actually designing GPUs understand what that future should be.

You nailed it. Personally, I take the view that the RT vs. no RT debate is largely fueled by clubism and the whole AMD vs. NVIDIA feud. It has been painfully obvious that AMD's hardware has historically been brutally incompetent at handling this type of workload: they were years late to the party, remain a full generation behind, and even with RDNA 3 it's not quite there yet, generating resentment that there's such an insane gap between Nvidia and AMD in this regard. Nvidia has actively exploited this in their marketing material and took advantage of being the only vendor offering complete DirectX 12 Ultimate and hardware-accelerated RT to keep their ASP high during the twilight years of GCN and the bug festa that was the original RDNA.

Couple that with frame rates being relatively poor even on high-end hardware (at least until high-end Ampere came about), and budget-conscious customers who shop in the performance segment or below (which is the only market where AMD has any meaningful presence) dismiss the tech as "superfluous", and often baseless arguments are made against it, in a futile attempt to "reject" this change.

But it's the future, whether one likes it or not. Games will be heading towards using ray and path tracing, ray-traced global illumination, etc. - because that is the only way to improve graphics any further from where they currently stand. True, graphics don't make the game - but some games do need the graphics, and I will admit that I like having games with eye candy and a high degree of immersion... or I'd just have an RTX 3050 on my desktop, too.

I wonder what will happen to a GPU if you double the amount of RT cores on it.

Let's say RX 6800 non-XT has 60 said cores onboard and AMD reconfigured it to have 120 RT cores with all the clocks and the rest staying as is. How does it affect RT performance and how much more expensive is this to produce? I mean, does this make any sense?

[This is not trolling, I really don't know and want to know]

In the Navi 21 design, the RT accelerator is bound to a compute unit, with each workgroup processor (WGP) consisting of two compute units. If you simply doubled it, it'd likely be resource starved and even more inefficient. It'd not necessarily be faster; possibly even slower.

The non-XT RX 6800 is simply a lower-quality Navi 21 die that is 25% disabled. Only three of the four compute slices (each composed of ten WGPs, or 20 CUs) are enabled on this model. The only way to extract more out of this generation would be to enable the remaining units (6800 XT, 6900 XT) and then overclock further (6950 XT), in an attempt to defy scaling at the cost of power efficiency. If the 6800 achieves 60% of the RT performance of the 6950 XT, then it has already succeeded at what it had to do, but it's not enough: AMD's first-generation RT design in RDNA 2 really sucks compared even to Turing.
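As a rough illustration of the configuration described above, here is a minimal sketch; the per-CU Ray Accelerator pairing and the 10-WGP-per-shader-engine layout are assumed from AMD's public RDNA 2 material, not verified in this thread.

```python
# Minimal sketch of the Navi 21 layout described above.
# Assumptions (AMD's public RDNA 2 material, not from this thread):
#   each WGP contains 2 CUs, each shader engine contains 10 WGPs,
#   and RDNA 2 pairs one Ray Accelerator with each CU.

CUS_PER_WGP = 2
WGPS_PER_SHADER_ENGINE = 10

def counts(shader_engines: int) -> tuple[int, int]:
    """Return (compute units, ray accelerators) for the enabled shader engines."""
    cus = shader_engines * WGPS_PER_SHADER_ENGINE * CUS_PER_WGP
    return cus, cus  # one Ray Accelerator per CU

print(counts(4))  # (80, 80) - full Navi 21 (6900 XT / 6950 XT)
print(counts(3))  # (60, 60) - the cut-down RX 6800 configuration described above
```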
 
Now that entry level GPUs are essentially the same price as what I used to charge for entire PC builds
Excuse me, what? $250-ish for an RX 7600/6650 or RTX 3060/4060 is way less than the price of a PC with a GPU of similar class at any time in the past. Remember 2021, when the 6700 XT's street price was four times what it is today? Remember 2015, when you could pay $400 for an AIB GTX 970 and still get almost no 1440p60 gaming experience whatsoever? My PC comes to under 1000 USD with VAT and I'm gaming at 4K. PCs are exceptionally cheap if we talk 1080p60 to 1440p60 gaming.

The complete absence of lowest-tier current-gen/last-gen GPUs that make any sense is, however, a bit sad.

Of course, things like the RTX 4080 and 4090 cost a ridiculous amount and an average Joe can't afford them at all, but considering what you get for that money and the fact that they have no competition, it's not that insane.

My question explicitly stands this way: is it going to be worth upgrading for future games? Will they provide something besides a better RT implementation? Because if the only good thing in a game is RT, that game sucks.

it'd likely be resource starved
Meaning wattage or raw raster performance to support it? I mean, cranking wattage is a non-issue...
 
Meaning wattage or raw raster performance to support it? I mean, cranking wattage is a non-issue...

Registers, cache, and memory bandwidth, as well as the fact that just increasing the number of available processors doesn't mean that it will scale perfectly (Amdahl's law); raster performance wouldn't have much to do with it. With RDNA 2 specifically, whether we're talking RT or not, the hardware lives or dies by the "Infinity Cache" hit rate, since the memory bandwidth is wholly inadequate not only for RT but also for traditional games.
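For the Amdahl's law point, a minimal sketch: the 40% "RT-limited fraction" below is a made-up illustrative number, not a measured figure for any of the cards discussed here.

```python
# Minimal sketch of the Amdahl's law point above: even if the RT-limited part
# of a frame scaled perfectly with doubled RT hardware, the rest of the frame
# would not, so the overall speedup is far less than 2x.

def amdahl_speedup(scalable_fraction: float, hardware_multiplier: float) -> float:
    """Overall speedup when only `scalable_fraction` of the work benefits
    from `hardware_multiplier` times the hardware (Amdahl's law)."""
    return 1.0 / ((1.0 - scalable_fraction) + scalable_fraction / hardware_multiplier)

# Hypothetical example: 40% of frame time is RT-limited, RT units doubled.
print(amdahl_speedup(0.4, 2.0))  # ~1.25x overall, despite 2x RT hardware
```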
 
Registers, cache, and memory bandwidth, as well as the fact that just increasing the number of available processors doesn't mean that it will scale perfectly (Amdahl's law); raster performance wouldn't have much to do with it. With RDNA 2 specifically, whether we're talking RT or not, the hardware lives or dies by the "Infinity Cache" hit rate,
Thanks, gotcha.
since the memory bandwidth is wholly inadequate not only for RT but also for traditional games.
Totally agree.
 