
Calling all low and mid GPU owners - shall we swap RT for more performance or lower prices?

Would you be open to sacrificing the capability to run Ray Tracing?

  • Yes, for 30% lower price. (31 votes, 48.4%)
  • Yes, for 30% more performance. (21 votes, 32.8%)
  • No, I love RT even with low performance. (12 votes, 18.8%)
  • Total voters: 64
My laptop's RTX 3050 does a pretty decent job, considering what it is and that it has 4 GB of VRAM.



You nailed it. Personally, I have the view that the RT vs. no RT debate is largely fueled by clubism and the whole AMD vs. NVIDIA feud. It has been painfully obvious that AMD's hardware has historically been brutally incompetent at handling this type of workload: they were years late to the party, remain a full generation behind, and even with RDNA 3 it's not quite there yet. That generates resentment over the insane gap between Nvidia and AMD in this regard, and Nvidia has actively exploited it in their marketing material, taking advantage of being the only vendor offering complete DirectX 12 Ultimate support and hardware-accelerated RT to keep their ASPs high during the twilight years of GCN and the bug fest that was the original RDNA.

Couple that with frame rates being relatively poor even on high-end hardware (at least until high-end Ampere came about), and budget-conscious customers who shop in the performance segment or below (which is the only market where AMD has any meaningful presence) dismiss the feature as "superfluous", and often baseless arguments get made against the tech in a futile attempt to "reject" this change.

But it's the future, whether one likes it or not. Games will be heading towards ray and path tracing, ray-traced global illumination, etc., because that is the only way to improve graphics any further from where they currently stand. True, graphics don't make the game, but some games do need the graphics, and I will admit that I like having games with eye candy and a high degree of immersion... or I'd just have an RTX 3050 on my desktop, too.



In the Navi 21 design, the RT accelerator is bound to a compute unit, and each workgroup processor (WGP) consists of two compute units. If you simply doubled it, it'd likely be resource-starved and even more inefficient. It wouldn't necessarily be faster; it could even be slower.

The non-XT RX 6800 is simply a lower-quality Navi 21 die with 25% of it disabled. Only three of the four compute slices (each comprising ten WGPs, or 20 CUs) are enabled on this model. The only way to extract more out of this generation was to enable the remaining units (6800 XT, 6900 XT) and then overclock further (6950 XT) in an attempt to defy scaling, with power efficiency as the tradeoff. If the 6800 achieves 60% of the RT performance of the 6950 XT, then it has already done what it had to do, but it's not enough: AMD's first-generation RT design in RDNA 2 really sucks compared even to Turing.
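To put rough numbers on that cut-down configuration, here's a minimal sketch, assuming the commonly cited RDNA 2 layout (four shader engines of ten WGPs each, two CUs per WGP, one RT accelerator per CU):

```python
# Back-of-the-envelope sketch of the Navi 21 layout described above.
# The counts are the commonly cited RDNA 2 configuration, taken as assumptions.
SHADER_ENGINES = 4    # the "compute slices" mentioned above
WGPS_PER_ENGINE = 10  # each WGP pairs two CUs; each CU carries one RT accelerator
CUS_PER_WGP = 2

total_cus = SHADER_ENGINES * WGPS_PER_ENGINE * CUS_PER_WGP  # 80 CUs on a full die
rx6800_cus = 60                                             # RX 6800 ships with 60 CUs enabled

print(f"Full Navi 21: {total_cus} CUs / {total_cus} RT accelerators")
print(f"RX 6800: {rx6800_cus} CUs, i.e. {100 * (1 - rx6800_cus / total_cus):.0f}% of the CU array disabled")
```
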
It seems to me that you are very good at constructing long texts devoid of any argument, other than saying that everyone who thinks for themselves and questions things is a fanboy, and that absolutely everything Nvidia does is right and the future undeniable and unquestionable. A suggestion: you might as well replace the text with a photo of you kissing a Huang statue and achieve the same effect!

RT is proving inefficient, consuming die space, energy, and resources, particularly in the context specified. With manufacturing processes becoming increasingly expensive and cache no longer shrinking substantially, proportional advancements are unlikely in the foreseeable future. Given where we are this far into the century, the imminent challenges make it apparent that there's insufficient margin for genuine RT viability in average consumer GPUs. There is no counter-argument other than generic texts, because no one refutes reality.
 
Excuse me, what? $250ish for an RX 7600/6650 or RTX 3060/4060 is way less than the price of a PC with a GPU of similar class at any time in the past. Remember 2021, when the 6700 XT's street price was 4 times what it is today? Remember 2015, when you could pay $400 for an AIB GTX 970 and still get almost no 1440p60 gaming experience whatsoever? My PC comes to under 1,000 USD including VAT and I'm gaming at 4K. PCs are exceptionally cheap if we're talking 1080p60 to 1440p60 gaming.

The complete absence of lowest-tier next-gen/last-gen GPUs that make any sense is, however, a bit sad.

The irony: you can't get a 1440p card to this day for under $780 on the Nvidia side, and that's according to Nvidia's marketing itself.

The 4060 / RX 7600 are absolute trash; they are not even entry-level compared to prior generations. A perfect example is the fact that the 980 Ti provides 289% of the performance of the 950, while the 4090 provides 312% of the performance of the 4060. The 4060 provides a mere 33% of the flagship, making it worse than even the lowly 950 relative to the flagship card of each generation, and that's considering the 950 was a $130 card while the 4060 is $300. It's simply misleading to compare that to anywhere near the level of performance the 970 provided in its time, which was 76% of flagship performance. The closest current Nvidia card is the 4080, which had a whopping $1,200 MSRP that really translated to $1,300 IRL.
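For what it's worth, a quick Python sketch of that flagship-relative math, treating the ratios quoted above as given (they're approximate, not fresh benchmark data):

```python
# Flip the "flagship is X% of this card" figures around into "this card is Y% of
# the flagship". The ratios are the approximate ones quoted in the post.
cards = {
    # card: (flagship of its generation, card-to-flagship performance ratio)
    "GTX 950":  ("GTX 980 Ti", 1 / 2.89),  # 980 Ti ~ 289% of a 950
    "GTX 970":  ("GTX 980 Ti", 0.76),      # 970 ~ 76% of the flagship
    "RTX 4060": ("RTX 4090",   1 / 3.12),  # 4090 ~ 312% of a 4060
}

for card, (flagship, ratio) in cards.items():
    print(f"{card}: ~{ratio:.0%} of the {flagship}")
```
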

Putting it into perspective, you really have to spend $400 - $500 to get just a decent GPU today, and it will still be a fraction of what you would have gotten previously. FYI, the 970's MSRP was $330. I put together so many $450 - $500 systems with RX 480s / 580s / 970s. I bought 480s and 580s in bulk when they frequently dipped to $80 - $100. Mining cards, despite people's trepidations, are fantastic: not a single one failed, and almost all of them were in like-new condition. I still have two as backups sitting on the shelf. Most certainly you got way more value back then, especially if you were looking for deals. There was no mining bust for the 3000 series, thanks to AMD's and Nvidia's cartel-like supply control, and that deprived a lot of gamers of what could have been amazing deals.

Of course things like the RTX 4080 and 4090 cost ridiculously much and an average Joe can't afford them at all, but considering what you get for that money and the fact that they have no competition, it's not that insane.

My question stands exactly this way: is it gonna be worth upgrading for future games? Will they provide something besides a better RT implementation? Because if the only good thing in a game is RT, that game sucks.

The entire lineup from both companies is a complete ripoff. As the statistics I pulled up above show, it's not just the flagship cards but the entire stack. $300 is the new $100, and $1,600 is the new $700. It is insane. I don't know how anyone who's witnessed the impact of the 10.66% inflation from 2020 till now could say that 220% - 300% price increases aren't insane.
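A minimal sketch of that arithmetic, taking the post's 10.66% inflation figure and the "$300 is the new $100" / "$1,600 is the new $700" pairs as given:

```python
# Compare what inflation alone would justify against the actual price jumps.
# The 10.66% figure and the price pairs are the numbers quoted in the post.
inflation_2020_to_now = 1.1066

tiers = {
    # tier: (old price, current price)
    "entry":    (100, 300),
    "flagship": (700, 1600),
}

for tier, (old, new) in tiers.items():
    inflation_only = old * inflation_2020_to_now
    print(f"{tier}: ${old} then -> ${inflation_only:.0f} with inflation alone, "
          f"actual ${new} ({new / old:.0%} of the old price)")
```
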
 
The irony: you can't get a 1440p card to this day for under $780 on the Nvidia side, and that's according to Nvidia's marketing itself.

The 4060 / RX 7600 are absolute trash; they are not even entry-level compared to prior generations. A perfect example is the fact that the 980 Ti provides 289% of the performance of the 950, while the 4090 provides 312% of the performance of the 4060. The 4060 provides a mere 33% of the flagship, making it worse than even the lowly 950 relative to the flagship card of each generation, and that's considering the 950 was a $130 card while the 4060 is $300. It's simply misleading to compare that to anywhere near the level of performance the 970 provided in its time, which was 76% of flagship performance. The closest current Nvidia card is the 4080, which had a whopping $1,200 MSRP that really translated to $1,300 IRL.

Relative performance to the flagship is only one metric. You could also look at how many FPS the 950 was providing in then-current games at 1080p and compare to the 4060/7600 in current games. The 4060/7600 are twice as fast, delivering ~85 FPS on average, while the 950 was at about 40-45 FPS (rough estimate based on TPU reviews). These things are delivering the FPS in current games that the 970 was delivering in 2015.

The $160 950 delivered about half the FPS for more than half the cost, being about $205 after accounting for inflation. What's missing is the sub-$200, 950-class replacement, as nobody wants to make one that's worth buying. I assume the margins aren't worth it.
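Roughly the same comparison in Python, using the approximate FPS figures above and an assumed ~28% cumulative inflation factor for 2015 to today (which is what turns $160 into roughly $205):

```python
# Value comparison sketch; FPS averages and the inflation factor are rough
# assumptions based on the figures quoted above, not measured data.
gtx950_price_2015 = 160
inflation_2015_to_now = 1.28                                    # assumed cumulative factor
gtx950_price_today = gtx950_price_2015 * inflation_2015_to_now  # ~ $205

rtx4060_price = 300
gtx950_fps, rtx4060_fps = 42, 85                                # rough 1080p averages

print(f"GTX 950 in today's dollars: ~${gtx950_price_today:.0f}")
print(f"FPS per dollar: 950 ~ {gtx950_fps / gtx950_price_today:.2f}, "
      f"4060 ~ {rtx4060_fps / rtx4060_price:.2f}")
```
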
 
RT is proving inefficient, consuming die space, energy, and resources, particularly in the context specified. With manufacturing processes becoming increasingly expensive and cache no longer shrinking substantially, proportional advancements are unlikely in the foreseeable future. Given where we are this far into the century, the imminent challenges make it apparent that there's insufficient margin for genuine RT viability in average consumer GPUs. There is no counter-argument other than generic texts, because no one refutes reality.
RT cards are inefficient because they still rely heavily on rasterization as the main form of rendering.
Removing the rasterization would make them much faster; however, they wouldn't be backwards compatible with games that use rasterization, and that's the main problem.
It requires a paradigm shift in development and gaming, and even the OS side would need to support RT-only cards with no rasterization.
 
you can't get a 1440p card to this day for under $780
Depends on the definition of a 1440p GPU.

If you need this GPU to run all games at 1440p at whatever settings with at least 30 FPS, you're golden with GTX 1080 level GPUs, meaning spending under 150 USD.
If you need this GPU to run all games at 1440p at 60 FPS and you're fine with lowering settings here and there, you won't feel particularly bad if your GPU is about RTX 3060 level. That's about $250.
If you don't want anything but High+ settings at 60+ FPS, then your choice is something of RTX 4070 calibre. That's just above $500.
And a $780 GPU, the 4070 Ti, marketed as the ultimate 1440p performer, is indeed the ultimate 1440p performer: any game at Ultra with 60 or even 144 FPS. Most ray tracing titles are also playable with DLSS turned on.

The 4060 / RX 7600 are absolute trash; they are not even entry-level compared to prior generations. A perfect example is the fact that the 980 Ti provides 289% of the performance of the 950, while the 4090 provides 312% of the performance of the 4060. The 4060 provides a mere 33% of the flagship, making it worse than even the lowly 950 relative to the flagship card of each generation, and that's considering the 950 was a $130 card while the 4060 is $300. It's simply misleading to compare that to anywhere near the level of performance the 970 provided in its time, which was 76% of flagship performance. The closest current Nvidia card is the 4080, which had a whopping $1,200 MSRP that really translated to $1,300 IRL.
I know they are relatively weaker by almost an order of magnitude, but that doesn't mean we're currently in a worse spot. Back then, you had to invest in a GTX 980 Ti just to be sure your FHD 60 Hz display was fully saturated. Now, you've ditched FHD and you fully saturate 1440p60 for almost a third less. And by the way, name me at least 2 games you cannot play comfortably at 1080p native using an RX 7600/RTX 4060. "Cannot play comfortably" meaning sub-60 FPS and/or ruined textures regardless of settings.

I put together so many $450 - $500 systems with RX 480s / 580s / 970s.
Do you realise these Polaris/Maxwell GPUs are at least 2.2 times slower than the slowest next-gen GPUs? Do you realise they have never been go-to options for 1080p60 Ultra, even when released? Do you realise they are so aged it's only possible to sell them if your barter skill is over 70 points or your price is below $50? They have lost their value, and they have never been more valuable than current-gen GPUs, especially the $500ish ones (4070 and 7800 XT).
 
As has been stated, removing RT/Tensor cores saves exceptionally little die space, so I'm not sure how you get to 30% savings from having no RT. I just want lower prices in general, because Nvidia and AMD are both greedy and content with their market positions as a duopoly, or leader and follower, with prices set accordingly, and they're both making good margins on many of their products.

As to the answer "No, I love RT even with low performance": my lowest-end RT-capable card is an A2000, and I've been able to tweak half a dozen or more games to use RT and get very good experiences at 1080p. Is it for everyone at that performance level? Probably not, but by making it a raster-only GPU, the inherent manufacturing savings would be maybe 5-10% tops, so I'd rather have that hardware and be able to use it than not have it at all. Prices should be lower across both brands anyway.
 
LOL, all the best-looking games of 2023 use ray tracing, especially Avatar, in which you cannot turn ray tracing off.

If you don't like pretty visuals, why bother buying a new graphics card at all? Just keep the old card and play 2D games...
 
Hmm. Why isn't there a poll question: "Would you shoot RT to death if you could"?
 
Used to be able to buy an entire high end computer for the price of one video card now. That's just stupid.
 
Used to be able to buy an entire high end computer for the price of one video card now. That's just stupid.
This is the furthest thing from the truth. Oh... my... god...

My FX-55 was like $800 new, back in like '01. I got a 13700K new at like $450.
 
The solution for me was to buy an AMD RX card instead.
Same here. They're already cheaper than equivalent Nvidia cards, but with lower ray tracing performance, which is kind of a win, I guess (whatever you can call a win these days).
 
Relative performance to flagship is only one metric. You could also look at how many FPS the 950 was providing for then-current games at 1080p and compare to the 4060/7600 in current games. The 4060/7600 are twice as fast, delivering ~85FPS average while the 950 was at about 40-45fps (rough estimate based on TPU reviews). These things are delivering the FPS in current games that the 970 was in 2015.

The $160 950 delivered about half the FPS for more than half the cost, being $205 after accounting for inflation.

The fundamental problem with comparing things this way is that the test suite is not the same between them. Game selection, platform, software, etc. are all different. You could show a very high jump in average FPS if your test suite favors high-FPS games, or a very low average if your test suite happens to have heavy games, but it wouldn't give the end user a single bit of useful data, because you are trying to compare two different sets of games running on completely different platforms and software. It's not an apples-to-apples comparison. Comparing relative to other cards in the same generation makes far more sense, given that the test setup and test suite are the same between them. That gives you a concrete picture of what value customers got then vs. what they get now. I'd also argue that you should be getting more FPS even on budget products: we now have 240 - 540 Hz monitors, and customers expect more FPS.
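A made-up illustration of how much the suite choice alone swings an average (the games and FPS numbers below are invented purely for the example):

```python
# The same hypothetical card "benchmarked" against two different game suites.
# All titles and FPS numbers are invented for illustration only.
light_suite = {"esports title A": 240, "older game B": 160, "indie game C": 200}
heavy_suite = {"path-traced game D": 38, "open-world game E": 55, "sim game F": 62}

average = lambda suite: sum(suite.values()) / len(suite)
print(f"Average FPS, light suite: {average(light_suite):.0f}")  # looks like a monster
print(f"Average FPS, heavy suite: {average(heavy_suite):.0f}")  # looks barely adequate
```
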

Depends on the definition of a 1440p GPU.

Going by Nvidia's definition.

And a $780 GPU, the 4070 Ti, marketed as the ultimate 1440p performer, is indeed the ultimate 1440p performer: any game at Ultra with 60 or even 144 FPS. Most ray tracing titles are also playable with DLSS turned on.

I would say that in 2023, in a world with 540 Hz monitors, getting 144 FPS is not what I'd call the ultimate experience. That doesn't even max out my 4-year-old 240 Hz IPS. Having to turn on DLSS in order to use RT further detracts from that "ultimate" moniker, particularly when we are talking about only 1440p for $780 in 2023.

I know they are relatively weaker by almost an order of magnitude, but that doesn't mean we're currently in a worse spot. Back then, you had to invest in a GTX 980 Ti just to be sure your FHD 60 Hz display was fully saturated. Now, you've ditched FHD and you fully saturate 1440p60 for almost a third less.

The 980 Ti could achieve 60+ FPS in most 1440p games: https://www.techpowerup.com/review/nvidia-geforce-gtx-980-ti/22.html

In fact it still achieved that in many 2020 games: https://www.techspot.com/review/2005-geforce-gtx-980-ti-revisited/

I think you are confusing it with the 970, which could get 120 FPS in BF3 at 1080p.

Do you realise these Polaris/Maxwell GPUs are at least 2.2 times slower than the slowest next-gen GPUs? Do you realise they have never been go-to options for 1080p60 Ultra, even when released? Do you realise they are so aged it's only possible to sell them if your barter skill is over 70 points or your price is below $50? They have lost their value, and they have never been more valuable than current-gen GPUs, especially the $500ish ones (4070 and 7800 XT).

I did this years ago, lol, when the cards were selling for $250 - $320 new. I'm not talking about recently; there haven't been any good deals recently, save for some enterprise liquidations.

This is the furthest thing from the truth. Oh... my... god...

My FX-55 was like $800 new, back in like '01. I got a 13700K new at like $450.

He's not wrong: https://gamersnexus.net/pc-builds/1019-high-end-overclocking-pc-build

According to a GN article from 2013, a high-end gaming PC cost $1,150 at the time. That leaves you significant room considering the 4090's $1,600 price tag.
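As a rough sanity check (the ~30% cumulative inflation factor for 2013 to 2023 is my assumption, not a figure from the GN article), that whole 2013 build still lands below the price of the single card:

```python
# Hypothetical sketch: adjust the 2013 high-end build price for inflation and
# compare against the 4090's launch MSRP. The inflation factor is an assumption.
build_2013 = 1150
inflation_2013_to_2023 = 1.30   # assumed cumulative factor, roughly 30%
rtx_4090_msrp = 1600

adjusted = build_2013 * inflation_2013_to_2023
print(f"2013 high-end build in ~2023 dollars: ${adjusted:.0f}")  # ~ $1,495
print(f"RTX 4090 MSRP: ${rtx_4090_msrp}")
```
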
 
He's not wrong: https://gamersnexus.net/pc-builds/1019-high-end-overclocking-pc-build

According to a GN article from 2013, a high-end gaming PC cost $1,150 at the time. That leaves you significant room considering the 4090's $1,600 price tag.
Back in 2013, you'd buy 2x top cards for SLI, which is no longer a thing. Though AMD still supports CrossFire for DX12...

That and I (maybe not everyone) make considerably more money than I did in 2013.
 
I give up... I guess $3k for a 4090 is a steal. And another $1k for the rest of the computer to go with it. I get it...
 
I give up... I guess $3k for a 4090 is a steal. And another $1k for the rest of the computer to go with it. I get it...
It's only confusion on your part with the wording.

Nobody said anything about the most expensive hardware you can buy right now.
 
I will say this as an RTX 4050M owner. Though my laptop is $800 whole, the graphics card makes up a fraction of that price, so I guess I can comment on this anyway.

I think games that use HWRT play just fine on my end, so long as I use DLSS 3.1+ (I can replace old DLSS DLL files with the DLSS 3.1 DLL file from TPU), optimize other settings, & use framerate limiters for a smooth & consistent experience.
 
I would say in 2023 in a world with 540 Hz monitors, getting 144 FPS is not what I'd call the ultimate experience. That doesn't even max out my 4 y/o 240 Hz IPS.
I respectfully disagree.

I would say that in 2023, in a world with variable refresh rate monitors, you don't need to max out the top of your refresh rate range. If you're within range, you're good; potentially, even lower than the range can be fine, depending on the game and your monitor's low framerate compensation mechanism (obviously not for competitive online play, but fine for atmospheric walking simulators).
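For anyone unfamiliar with the low framerate compensation part, here's a simplified sketch of the idea; the 48-144 Hz window is just an example panel range, not any specific monitor:

```python
# Simplified model of low framerate compensation (LFC): when the game's FPS
# drops below the panel's minimum VRR refresh, each frame is repeated 2x/3x/...
# so the effective refresh stays inside the VRR window. Numbers are illustrative.
def lfc_refresh(fps: float, vrr_min: float = 48, vrr_max: float = 144) -> float:
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1
    return min(fps * multiplier, vrr_max)

for fps in (30, 40, 60, 120):
    print(f"{fps} fps -> panel refreshes at ~{lfc_refresh(fps):.0f} Hz")
```
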
 
Full VRR that (almost) competes with G-Sync Ultimate is probably the best thing to come out of 2023 and should really be a standard in the years to come! FreeSync Premium Pro FTW o_O
 
Idk, things have gotten so bad that one of the most well-received GPUs of this generation, the 7800 XT, is barely faster than its naming predecessor while only being $150 cheaper, MSRP vs. MSRP, almost 3 years later... The other exciting GPU is $2,000 USD looking at current listings. People can spin it however they want; this GPU generation has been mostly meh, and that has zero to do with RT.

I don't really expect any of that to change next generation: just more of the same, with maybe slightly more VRAM on the Nvidia side, just enough to not choke in current games.
 
And of course AI :shadedshu:
That's the main reason, imo. Do I remember right that one article mentioned that Nvidia's revenue is 80% AI and datacenter now? Even AMD has AI cores now, although I can't name a single thing that uses them. Gaming is becoming an afterthought fast, if it isn't already.

To be fair, everything has gotten meh in 2023, down to just shopping for groceries, putting gas in your car, utilities, the housing market, and I'm sure many other things.
Game releases were awesome, imo. But that's it.
 
Game releases were awesome, imo. But that's it.

I think people forget that a 3060/6700 XT can play pretty much every game at decent settings. A lot of FOMO going on with the $900+ GPUs, I guess.

Sure, hardware was cheaper a decade ago, but we were also just trying to hit 60 FPS at 1080p, lol.
 
I think people forget that a 3060/6700 XT can play pretty much every game at decent settings. A lot of FOMO going on with the $900+ GPUs, I guess.
That's the other reason why graphics card releases have been meh, imo. If most people don't have a reason to upgrade, then why should AMD/Nvidia spend hard cash to make something to upgrade to? No one's gonna buy it anyway. Releasing the same thing again with slight improvements should be a safe move until the industry picks up again (unless you do it like Disney).
 
The fundamental problem with comparing things this way is that the test suite is not the same between them. Game selection, platform, software, etc. are all different. You could show a very high jump in average FPS if your test suite favors high-FPS games, or a very low average if your test suite happens to have heavy games, but it wouldn't give the end user a single bit of useful data, because you are trying to compare two different sets of games running on completely different platforms and software.

It's 100% appropriate to compare each card's performance in the games people were playing when it came out, because that's what people are going to do: buy a current-gen GPU and play current games. Someone who buys a 7600 or 4060 will have a much better gaming experience, even in the most demanding games today, than someone with a 950 had in 2015, because these cards are a tier up. Unfortunately, they're priced like it too.

You said this earlier:

The 4060 / RX 7600 are absolute trash; they are not even entry-level compared to prior generations.

That's what I responded to and continue to disagree with, as their performance at 1080p Ultra is well above 60 FPS. By this measure, the 950 was one of the most useless releases ever with its 45 FPS, so I fail to understand how it can be used as a good example when the others so easily outperformed it in their respective times.
 