
Next-Gen GPUs: Pricing and Raster 3D Performance Matter Most to TPU Readers

When the poll first dropped, it swung wildly back and forth between the niche options that ended up losing. I think at one point it suddenly shot up to like 90% RT. Were people spamming the poll?
 
Uh oh VRAM is low priority on this TPU poll... only 12GB VRAM on 5070 confirmed.
What, where?
If it's only 12GB, that's some really big boy's sweater the 5050 is wearing.
 
I don't have a single RT game in my library. Granted, life always gets in the way of playing the games I have, but as stated, the juice is not worth the squeeze.

If a game has to rely on RT to sell, that says it all. It's like AI on everything: no one actually wants it, yet here it is....
 
A lot of armchair engineering going on here. Rasterization is a dead end. Caring about more performance there instead of RT is pointless. There are too many things rasterization simply can't push any further, and now that we have acceptable RT performance with RTX or competing cores (including all the related tech: DLSS, RR, FG, etc.), there's little reason to keep developing for it. By the next console generation, everything will have some element of RT going on for much more realistic lighting, shadows, reflections, global illumination, etc., and plain rasterization will start to look like a pile of "hacks" to simply avoid (SSR and SSAO are good examples that never looked very good).
Perhaps rasterization is a dead end, but who will drop backward compatibility with current tech for the promise of something new? Look what happened when Intel said it was going to drop support for some x86 legacy stuff. Graphics needs a generational change, but I don't see it happening in the PC ecosystem; even consoles fell into the backward-compatibility trap. Nope, it will be something new, not burdened by the old.
 
Rasterization is only a dead end once RT can work on midrange cards without a significant performance hit. RT has its own hacks, namely upscaling and fake frames, which make games look worse as well.
 
A lot of armchair engineering going on here. Rasterization is a dead end. Caring about more performance there instead of RT is pointless. […]
> Looks at a gaming landscape in which rasterization accounts for 95% of every scene
> "Rasterization is a dead end"
 
People should stop acting like RT is something the majority cares about, because it's not.
 
Upscaling and frame generation, lol, no one wants that bullshit. People are finally waking up, and those thieving companies keep pushing hard in this direction!!
Yes. But game companies are forcing TAA on their titles so they can promote useless gimmicks like DLSS.
 
I'm honestly surprised Energy Efficiency was 16%. I must be a Luddite, because I DGAF about it.

I don't worry too much about energy consumption, but what I absolutely do care about is how much heat there is in my room. I currently have a 230W card and I am completely unwilling to buy one more power hungry than that purely for ambient heat reasons. And as an added bonus, typically the lower wattage cards are able to run quieter (assuming you buy a model with a halfway competent cooling solution).

That said, the raster price to performance benefit has to be there before I even consider looking at cards to upgrade to, so I would not have voted efficiency as my priority. But it and VRAM are still important factors before a purchase. I DGAF about RT or upscaling.
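Quick sanity check on the heat point (simple arithmetic, assuming essentially every watt the GPU draws ends up as heat in the room): a 230 W card is roughly a 230 × 3.412 ≈ 785 BTU/h heater, and stepping up to a 450 W card would add another ~750 BTU/h on top, not far off a small space heater on its low setting.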
 
It's funny that someone claims this artificial feature is the future, when in fact it is only a small part of post-processing.
I humbly remind you that there are two overall stages, yes, I'm oversimplifying a bit: the rendering pipeline and post-processing.
Ray tracing is just post-processing; sorry if this hurts, but that's the reality.
Those who write that rasterization is a "dead end" clearly do not understand how a graphics card produces images from code.
 
Agreed. These new games are so poorly optimized that it's now mandatory to use upscaling tech and fake frames just to run them OK-ish, even on uber-expensive GPUs.... Ridiculous.
 
A lot of armchair engineering going on here. Rasterization is a dead end. Caring about more performance there instead of RT is pointless. […]
No, rasterization is not a dead end. The problem is that with manufacturing process evolution slowing down, scaling up compute unit counts is no longer a viable way forward for long.
Up to them. The next GPU generation (RTX 6xxx / RX 8xxx) will surely handle 4K 60+ FPS easily; there will be no need for that bullshit, especially if graphics don't make a huge leap!!
No they won't. Look at state-of-the-art games making the 4090 struggle even without RT. Unreal Engine 5 with Nanite and Lumen turned up, for example.
[…] Ray tracing is just post-processing; sorry if this hurts, but that's the reality. […]
RT is straight-up not post-processing.
DLSS/FSR/XeSS are so-so; technically they run at the post-processing stage, but they have hooks into the pipeline, if only for motion vectors and such.
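For what it's worth, the distinction being argued can be sketched in a few lines (made-up names, not any real graphics API): a true post-process only ever sees the finished image, while a ray-tracing pass has to query scene geometry.

```cpp
#include <vector>

struct Scene {};                                // geometry + acceleration structure
struct Image { std::vector<float> pixels; };    // just pixels

Image rasterize(const Scene&)                  { return {}; }  // pipeline stage
Image trace_rays(const Scene&, const Image& g) { return g; }   // pipeline stage: needs
                                                               // the Scene, not just pixels
void tone_map(Image&) {}                        // post-process: image-space only
void bloom(Image&)    {}                        // post-process: image-space only

int main() {
    Scene scene;
    Image gbuffer = rasterize(scene);
    Image lit = trace_rays(scene, gbuffer);     // cannot run from the image alone
    tone_map(lit);                              // could run on any image you hand it
    bloom(lit);
}
```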
 
Rasterization is only a dead end once RT can work on midrange cards without a significant performance hit. […]
> Looks at a gaming landscape in which rasterization accounts for 95% of every scene
> "Rasterization is a dead end"
[…] Those who write that rasterization is a "dead end" clearly do not understand how a graphics card produces images from code.
Good points. Rasterization is crucial for rendering a scene. There is still plenty of room left to get more detailed scenes with improved rasterization.

Agreed. These new games are so poorly optimized that it's now mandatory to use upscaling tech and fake frames just to run them OK-ish […]
Yes. But game companies are forcing TAA on their titles so they can promote useless gimmicks like DLSS.
DLSS-like functionality helps devs avoid properly optimizing games. In other words, it lets them get paid quicker and for less work done. Next there will be games generated by "AI" sold at the price tags of games made by humans.
 
It makes sense. For me, the price to raw performance ratio was the most important thing, and that's why I didn't buy a 4090 but a 7900 XTX from the Sapphire Nitro range instead. The Nvidia card was only ~25% faster but ~60% more expensive. Pretty simple calculation for me, given both cards offer 24GB of VRAM.
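Working that out explicitly: normalize the XTX to 1.0× performance at 1.0× price, and the 4090 lands at roughly 1.25 / 1.60 ≈ 0.78 performance per dollar relative to it, i.e. about 22% worse raster value, with VRAM a wash at 24GB each.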

The Nitro card received several tech awards for its design and quality. It can even hit 3GHz in several games at stock settings. Happy to provide screenshots after my holiday is finished. The card is fantastic for 4K/120Hz gaming on an OLED display.
 
Can someone explain to me what it is with this strange obsession with software when it comes to ray-tracing these days? :)
Another occasion for dUck-measuring, if you mean the audience: "my card does more RT than your card". For game studios, it's an easy way to shift part of their work onto the consumer's back. For Nvidia, it's the way to bigger profit margins.
 
Here's some REALITY:
1) Path-tracing:
This IS the end-goal for most 3D gaming. Rasterized graphics will eventually go away. Game DEVELOPMENT will actually be far EASIER once it's only being done for path-tracing for many reasons including issues related to light baking and conflicts between raster lighting methods. It's not even debatable if you're in the industry.
(It will take a LONG time though because game devs still need to make most games for older hardware due to the install base vs profitability.)

2) AI UPSCALING:
It's not a "BS" feature like some people seem to think. It's actually the most BENEFICIAL feature that has ever come to gaming. Has it been ABUSED? Absolutely.

Also, future games will continue to optimize around the assumption that AI UPSCALING will be used. For example, deciding on the art style, or line thickness, or how many light bounces to do etc.

Summary:
The future of HARDWARE design is primarily based around moving towards more PATH TRACING and utilizing AI UPSCALING. The Sony PS5 PRO is a good example of how this is happening. The main GPU architecture is tuned for a bit more general compute to reach higher resolutions, a stronger focus on path-tracing, and some dedicated DIE SPACE for PSSR's AI upscaling.

The PS6, next XBOX and PC gaming in general will be optimized around these TWO things.

By all means, be ANNOYED about upscaling and/or frame generation being abused. Or path-tracing seemingly not offering much for the performance cost. That doesn't change the fact that these are amazing tools in the toolbox when used correctly. And they are the future.
 
Here's some REALITY: Path-tracing IS the end-goal for most 3D gaming. […] By all means, be ANNOYED about upscaling and/or frame generation being abused. […] These are amazing tools in the toolbox when used correctly. And they are the future.
This future is still quite a bit away.

Path tracing is very, very expensive. Expanding what the RT units in GPUs do today is actually easy enough, but while those units do a lot of heavy lifting, there is a lot else in RT/PT that needs to be figured out and standardized. A simple example: the bloody structure of the scene. Whether it eventually ends up as BVH, some variation of it, some other mesh structure, or something completely new, for full path tracing to take off parts of it probably need to be hardware-assisted if not outright accelerated. And that requires some level of standardization around what everyone does with it and how. More likely, though, this will still run on either shaders or CPUs, which shifts the responsibility and performance cost but does not really solve anything. And AFAIK there are still some things that are difficult to do with path tracing.

Plus there is the elephant in the room: whoever wants to go for full path-tracing hardware needs to figure out what to do with all the existing rasterized games. Essentially something like a DX/OGL/Vulkan wrapper to whatever API ends up running path tracing. Since we do not know what the hardware approach will end up being, this may turn out to be a simple problem or an extremely difficult one :)
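To put rough numbers on "very, very expensive": a 4K frame is 3840 × 2160 ≈ 8.3M pixels, so even one sample per pixel with three bounces is on the order of 33M ray queries per frame, or ~2 billion per second at 60 fps, each of which has to walk some structure like the BVH mentioned above. A minimal sketch of that traversal (illustrative C++ with made-up types; real GPU BVHs are wide, compressed, and hardware-traversed):

```cpp
#include <algorithm>
#include <memory>
#include <vector>

struct Ray { float o[3], d[3]; };   // origin and direction

struct AABB {
    float mn[3], mx[3];
    // Classic slab test: does the ray cross this box at all?
    bool hit(const Ray& r) const {
        float t0 = 0.0f, t1 = 1e30f;
        for (int i = 0; i < 3; ++i) {
            float inv = 1.0f / r.d[i];
            float ta = (mn[i] - r.o[i]) * inv;
            float tb = (mx[i] - r.o[i]) * inv;
            if (ta > tb) std::swap(ta, tb);
            t0 = std::max(t0, ta);
            t1 = std::min(t1, tb);
        }
        return t0 <= t1;
    }
};

struct BVHNode {
    AABB bounds;
    std::unique_ptr<BVHNode> left, right;   // empty in leaves
    std::vector<int> tris;                  // triangle indices, leaves only
};

// Skip whole subtrees whose bounding box the ray misses; this pruning
// is what turns "test every triangle" into something tractable.
void traverse(const BVHNode& n, const Ray& r, std::vector<int>& candidates) {
    if (!n.bounds.hit(r)) return;
    if (!n.tris.empty()) {                  // leaf: triangles to intersect precisely
        candidates.insert(candidates.end(), n.tris.begin(), n.tris.end());
        return;
    }
    if (n.left)  traverse(*n.left, r, candidates);
    if (n.right) traverse(*n.right, r, candidates);
}
```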
 
The first game using real-time RT came out in 2018 (Battlefield V).

Looking at Tim's video about RT, the situation is not that great six years later. Games where RT is properly implemented, doesn't cause issues, and makes a meaningful visual difference can be counted on two hands at best. If this has taken six years, we can only guess how long it will take for PT to start appearing in games in large numbers. I'm guessing that's still at least a decade away, and raster will not disappear even 30 years from now.
 
Ray tracing is a gimmick only, for now.
It is OK at best for the most part, and you need to burn a lot of money to make it fluid at QHD or higher.
Within 3~5 years it may become standard for games on high-end configs, but nothing more.
 
Or path-tracing seemingly not offering much for the performance cost. That doesn't change the fact that these are amazing tools in the toolbox when used correctly. And they are the future.
This is a thread about next-gen GPUs. I don't think anyone here disputes that full path-tracing is a worthwhile goal, or even that it will eventually take over, but the question is when? "Sometime between now and the heat death of the universe" doesn't move the needle on purchasing decisions made now or indeed within the next decade.

At this rate, I honestly wouldn't bet good money that full path-tracing will become standard within my lifetime, which we'll call another 30 years, give or take. We've seen plenty of hype campaigns for supposedly paradigm-destroying tech innovations that stall out on the 5-yard line. Wake me when path-tracing becomes more than a tech demo or a marketing bullet point. In the meanwhile, as @londiste suggests, there are tens of thousands of excellent games out there, many of them much more fun than the focus-group-driven corporate products at the leading edge of graphics tech. They're all rasterized.
 
DLSS-like functionality helps devs avoid properly optimizing games. […]
DLSS is barely AI. It's temporal upscaling. If it were fully AI, performance would be bad, because every frame would need to be trained on and reconstructed in real time.
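The non-AI core being described is temporal accumulation: reproject last frame's result along the motion vectors and blend in the new samples. A bare-bones sketch (generic TAA-style accumulation with hypothetical types, not DLSS's actual, unpublished algorithm; DLSS's network essentially learns how much history to trust per pixel):

```cpp
#include <cmath>
#include <vector>

struct Frame { int w, h; std::vector<float> rgb; };   // w*h*3, row-major
struct MotionVec { float dx, dy; };                   // screen-space, in pixels

// Blend each new sample with its reprojected history. 'alpha' is how much
// of the new frame to take; DLSS-like methods effectively make this
// weighting (and history rejection) a learned, per-pixel decision.
void temporal_accumulate(const Frame& history, const Frame& current,
                         const std::vector<MotionVec>& mv,
                         Frame& out, float alpha = 0.1f) {
    out = current;  // fallback: pixels without valid history keep the new sample
    for (int y = 0; y < current.h; ++y) {
        for (int x = 0; x < current.w; ++x) {
            int i = y * current.w + x;
            // Follow the motion vector back to last frame's position.
            int px = x - (int)std::lround(mv[i].dx);
            int py = y - (int)std::lround(mv[i].dy);
            if (px < 0 || px >= history.w || py < 0 || py >= history.h)
                continue;  // disocclusion or screen edge: no history to reuse
            int j = py * history.w + px;
            for (int c = 0; c < 3; ++c)  // exponential blend of history + new
                out.rgb[3 * i + c] = (1.0f - alpha) * history.rgb[3 * j + c]
                                   + alpha * current.rgb[3 * i + c];
        }
    }
}
```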
 
Like a few people here, I don't have any interest in ray-tracing or any of Nvidia's "special sauce". I want as many frames as possible at 1440p native, a fair bit of VRAM for future-proofing, and drivers that don't cause me issues on Linux, all at a reasonable price. AMD has ticked all of those boxes for me with the 7900 XTX :)
 