
Hypothetical questions involving AMD Rumors and Nvidia's featureset

Nvidia's featureset is worth how much more than AMD assuming performance is the same?

  • Nvidia's featureset is worth 5% more

    Votes: 10 35.7%
  • Nvidia's featureset is worth 10% more

    Votes: 3 10.7%
  • Nvidia's featureset is worth 15% more

    Votes: 5 17.9%
  • Nvidia's featureset is worth 20% more

    Votes: 2 7.1%
  • Nvidia's featureset is worth 25% more

    Votes: 4 14.3%
  • Nvidia's featureset is worth greater than 25% more

    Votes: 4 14.3%

  • Total voters: 28
  • Poll closed.
I'll add one more specific but potentially widely applicable thing as a value difference between Nvidia and AMD: Minecraft with shaders. Minecraft is a rather widely played game, with a quarter billion copies sold across all platforms, so this is a reasonable consideration.

Native Minecraft, or MC with regular performance mods, plays very well on both brands' GPUs. But modded with shaders, it sucks toads on AMD cards. Man, I hate to say it, but I think all the shader devs use Nvidia cards, as shaders work very well on Nvidia GPUs, with 100% GPU utilization and good FPS. Utilization on AMD GPUs, though, is often 25-50%, to the point where my 2060 Super dukes it out with and even beats the 6800 XT.

Now, the 2060S is a sleeper awesome GPU (especially with DLSS), but the 6800 XT should soundly beat it in everything, yet not here. Note this is NOT Minecraft RTX, just regular modded-in shaders like most 3D games. I hadn't played MC on an Nvidia GPU in a couple of years before popping the 2060S in, and getting 3X the FPS I expected was.....

Pleasant.
 
I haven't played MC in a while; is the RT edition still a sidegrade, since it has no mod support?

I remember being blown away when Minecraft RT released, feels like version 2.0 of the game.

 
Minecraft RTX is based on the Bedrock codebase for Minecraft (cross-platform with Win10/11, consoles, and mobile OSes) and as such is non-moddable. Maybe someday it will be, but as I understand it, code efficiency and multi-device interoperability are the point of Bedrock, not code flexibility. So Minecraft RTX is stuck being unmoddable.

Which is fine, as I prefer MC Java for its ridiculous flexibility, which comes at the expense of old, twisted, crappy code, which itself has mods to mitigate much of that. Lol, but not fix it. There are straight-up RT mods for Java which I haven't tried yet as, frankly, a good set of shaders IMO looks better than the RT implementations I've seen so far. That will change, but shaders have had quite the head start.
 
• DLSS: meaning in games where it's available, I'm getting give or take the same image quality on Balanced as FSR gives on Quality. Often better, but not always. Contributes ~15 percent of free performance in a limited number of games, and DLAA makes gaming at "native" much more compelling on green GPUs. A ten percent premium over AMD.
• Better RT support: meaning in games like Alan Wake 2 or Cyberpunk 2077 I would get a much more well-rounded experience. At 400+ USD and at the 7900 XTX's level of performance, you expect to be able to enable at least some RT. Deserves about another five to ten percent premium.
• Better professional software support. Easily quantifies to another five to ten percent premium being reasonable.

So I would have gone NV as a no-brainer if the 4080S were no more than 125% of the price. 126 to 135% is "it's probably still worth it" territory. At 136 percent or more, NV GPUs are overpriced. FSR 3.1 is 1.5 years too late.
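To make those bands concrete, here's a minimal Python sketch; the $900 XTX figure and the sample 4080S prices are hypothetical placeholders I made up for illustration, and only the 125%/135% cut-offs come from what I wrote above.

```python
# Rough sketch of the premium bands described above, using hypothetical prices.
amd_price = 900.0  # hypothetical 7900 XTX street price in USD (placeholder)

def verdict(nv_price: float, amd_price: float) -> str:
    """Classify an Nvidia price against the premium bands from the post above."""
    ratio = nv_price / amd_price * 100  # Nvidia price as a percentage of the AMD price
    if ratio <= 125:
        return "no-brainer: go NV"
    if ratio <= 135:
        return "probably still worth it"
    return "overpriced"

for nv_price in (1050, 1200, 1300):  # hypothetical 4080S prices
    print(nv_price, verdict(nv_price, amd_price))
```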
 
At present? I'd be willing to pay up to 10% more for Nvidia's perks if I was looking for a new card.
 
Minecraft
I'd like to point out that TPU tests with Ray Tracing, which means high settings, but not Path Tracing or "Full Ray Tracing" as NVIDIA calls it.

The point I'm trying to make here is that as you increase the load on the ray tracing hardware of these cards, the NVIDIA cards become relatively faster than their AMD equivalents. This is important to understand, because some popular games include very lightweight or basic implementations of "ray tracing", such as global illumination only, for example. This skews the data slightly, because the "ray tracing" performance penalty is much smaller than if the entire game's lighting were ray or path traced, instead of a hybrid design of rasterised lighting and ray/path traced lighting.
I am very aware of that. I thought I had made that clear already in the thread, but now you know. We have actually discussed exactly that in other threads. I chose to say "The 4080s has ~20% more RT performance than the 7900xtx according to TechPowerUP reviews" rather than reference path tracing because so few games use it. Are there even more than 10 games that use path tracing so far? The handful of people who play those games should understand their performance needs, and that should be reflected in their purchasing decisions. Someone who plays a lot of path-traced games would be willing to pay a significantly higher price for Nvidia hardware because it is simply worth it for them.
Obviously 4K path-traced gaming isn't currently viable at native without using some form of upscaling, or any combination of performance/quality-improving tech such as DLSS/DLAA, Frame Generation and Ray Reconstruction, at least not when these cards are actually stressed with intensive ray/path tracing implementations.

I wanted to write this because I don't think people (especially people who don't own an RTX card, or even those who haven't tried a higher end Ada generation card) really understand the difference in performance between the two vendors, and just how far ahead NVIDIA is.
Path tracing performance, upscaling, and frame generation are all part of Nvidia's feature set and are things people should consider in their purchasing decisions.

With the release of the PS5 Pro and the eventual Xbox Series refresh, this may start to change: the PS5 Pro is rumoured to have significantly faster ray tracing performance, meaning developers will probably start using heavier RT/PT implementations. But I doubt we'll see widespread path tracing until the next generation of consoles is released, e.g. the PS6.

As developers start to actually make full use of the new lighting techniques in the latest game engines moving forward, I expect this trend will really start to show the differences in performance more completely, and game performance testing will show numbers skewing closer and closer to what's been shown here. Path-traced lighting, i.e. no rasterized lighting at all, is the obvious end game.

This is what I'm getting at: most games today don't come close to fully using the ray tracing hardware on current-generation cards, so even with "ray tracing" turned on, the FPS is still dictated by classic rasterization performance. This will change as games use heavier and heavier RT, or full RT/PT implementations.
I cannot speak for everyone. There are obviously those who are currently enjoying ray tracing games. I personally do not care about most of Nvidia's feature set.

I do not play path-traced games. I have only ever played one ray-traced game, and my meager 6750 XT gets adequate performance. As pretty as ray tracing and path tracing are, I do not value them right now. Too few games use them. Even fewer are games I am likely to play.
DLSS of any version is great but not necessary to me. Too few games use it. Even fewer are games I am likely to play.
NVENC is amazing. I enjoyed it with my GTX 1060. I don't often record gameplay, so that feature goes unused for me.
CUDA, and compute features and performance in general, are wasted on me. I haven't done anything with compute since 2014, if I ever did.
AI HDR and Reflex are the two features that interest me most, but I can easily live without them. I would rather spend less money.
I am probably forgetting other features; that is how little they matter to me right now.

Maybe one day, when an xx60-class GPU has above-5090 performance and the games I actually play make heavy use of ray tracing, I will care. That is me, though. That is why I only value Nvidia's feature set 5 to 10% higher than AMD's. Other people have different priorities and will value it differently.
I remember being blown away when Minecraft RT released, feels like version 2.0 of the game.
I think Minecraft RT is one of the best showcases of ray tracing. Minecraft does not have fancy graphics; everything is blocky and low-res by default. When ray tracing is the only thing adding visual interest, it really shows how big a difference ray tracing can make.
 
I utterly don't care about any NVIDIA features, so I voted 5%.
My brother has more of a preference for high FPS, is more into ray tracing, and is completely fine with DLSS, so he would be fine with 20~25%.

A bit of an extreme difference, I know.
 
I think my opinion on this aligns with what some others have said about where the GPUs sit in the stack.

In the lower end, the added value of a feature set is near zero; you just want maximum GPU hardware for your money. I don't really count upscaling technologies personally, because they're going to be universal in due time; the DLSS lead will evaporate sooner rather than later.

In the higher end, while I don't care much for RT in gaming yet, of course it will get better over time. It already is, but while Nvidia leads on it, it's still too much of a marketing plaything, a game I'm not playing. To me personally the feature isn't worth much, if anything, but I do appreciate the idea of the Nvidia product being able to do cutting-edge stuff better. Fun to play around with. Would I pay a premium for that... I think so. 15% would be the top end of that in 2024 / the current gen.

But looking forward, I don't know if it's entirely plausible that Nvidia will keep their lead. RDNA4 might not compete in the high end; it might just as well (and instead) place more focus on refining its feature set. Upscaling already seems to be moving that way lately.
 
A bit off topic.


It would be a huge problem for gamers if nVidia tried to become 3Dfx, pushing developers to use their SDK for RT acceleration.
 
Nah, it's not a problem for gamers at all. We survived 3Dfx too. Nvidia is taking a bigger risk here.

This is a game Nvidia can't win. If they hypothetically drove all gaming GPUs, including consoles, they would have a de facto monopoly on x86 gaming, and it would trigger a response. Or ARM would by then have made its way into the market, but I don't see that going places soon. And even then: how could a competitor enter the market if Nvidia owned the tech to play the games? There are a LOT of partners in the value chain who wouldn't like this one bit.
 
For me, AMD has to offer a combination of things that adds up to at least 20-30% more for my money.

That can be a combination of more VRAM and general performance, but I won't touch an AMD card for one of my primary systems till they massively improve RT performance in games like CP/Witcher/Alan Wake and massively improve FSR. I currently only consider them in the below-$400 range, where RT and upscaling don't really matter due to being trash at 1080p and too taxing on anything lower than a 4070.

The problem this generation is that both the 7600 and 7700 XT are meh AF, as are the competing Nvidia cards. It really doesn't start to get mildly interesting till we hit $500 with the 7800 XT/7900 GRE and 4070/4070 Super, and I'd only use that class of performance at 1080p, so it would really come down to my goals with the system.

As for a theoretical 8800 XT for $400: that seems way too optimistic, as we know AMD will just price it 10-20% below whatever replaces the 4080. There are too many variables to say for certain, but if AMD improves RT/efficiency/upscaling to at least be in the ballpark of what Nvidia offers, then maybe 15% cheaper would have me recommending it over the green option.
 
The only features I look at are actual performance and price.

Where is the “not worth a single cent more” option? Biased lol
 
It's objective analysis of data generated by TPU's own testing.

But I'm not surprised the guy with a full AMD rig might think otherwise.
He probably wanted to vote 0%, or a negative %, because of 'vibes'.

The feature set has certainly been worth something to me: the last 3.5 years of vastly superior upscaling, solid RT experiences (partly enabled by DLSS), Broadcast, and now RTX HDR, off the top of my head. It remains to be seen which, if any, of those features/characteristics will be legitimately matched in the coming years. And those features weren't even that mature when I bought; imo the card has aged like... a fine wine, partly due to the sheer muscle it packs, but largely due to the feature refinement and polish.

All this enjoyed at both the higher end with a 3080 and the lower-perf, low-power end with an A2000. Seems to be worth at the very least 15% to me for the richer features, if not 20%. I'd happily do an insta-buy of an AMD card if it was 75%+ faster than my 3080 and 25%+ cheaper than the competing Nvidia product at raster; there's a point where that's a no brainer. There'd still be a part of me that would miss those features though.
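As a rough illustration of why that threshold would be a no-brainer, here's a tiny Python sketch; the $1,000 Nvidia price and the 3080-as-baseline figure are made-up placeholders, and only the "75%+ faster, 25%+ cheaper at raster" condition comes from what I wrote above.

```python
# Back-of-the-envelope check of the "insta-buy" condition above.
# All absolute numbers are hypothetical; only the 75% / 25% deltas are from the post.

nv_price = 1000.0              # hypothetical competing Nvidia card, USD
amd_price = nv_price * 0.75    # 25% cheaper
perf_3080 = 1.0                # 3080 raster performance as the baseline
amd_perf = perf_3080 * 1.75    # 75% faster than the 3080 at raster

# Value (raster performance per dollar), assuming the Nvidia card merely
# matches that raster performance at its higher price:
amd_value = amd_perf / amd_price
nv_value = amd_perf / nv_price

print(f"AMD perf/$ advantage: {amd_value / nv_value - 1:.0%}")  # ~33% better value at raster
```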
 
there's a point where that's a no brainer.
You would think, but I have talked to a few people who would be willing to pay whatever for Nvidia's feature set. Ray tracing, upscaling, all of it is stuff they use every day, and AMD is not a viable alternative under any circumstances right now. I asked one guy: what if the 7900 XTX cost $1 for everyone and was plentiful, and anyone could buy as many as they wanted for $1? He said he would still buy a 4090.
 
Wow, what a friend; that seems like he has other motivations beyond the cards themselves, maybe political ones, as we see so often. Or he just loves the features that much, but I mean, a 7900 XTX for $1 just seems impossible to refuse.

I can see a possibility in there if you are so unreasonably, disgustingly rich that the difference between $1 and $1600 is utterly insignificant to you, but somehow I doubt that's your friend.
 
He has money and really wants the features.
 
Nah, it's not a problem for gamers at all. We survived 3Dfx too. Nvidia is taking a bigger risk here.

This is a game Nvidia can't win. If they hypothetically drove all gaming GPUs, including consoles, they would have a de facto monopoly on x86 gaming, and it would trigger a response. Or ARM would by then have made its way into the market, but I don't see that going places soon. And even then: how could a competitor enter the market if Nvidia owned the tech to play the games? There are a LOT of partners in the value chain who wouldn't like this one bit.

Without Glide support, we could play Unreal and FIFA 98 etc. back then. But they looked like shXt.
When we ran the same games on a Voodoo 3, it was like we were seeing the light for the first time.

The same could happen now. If you have nVidia RT SDK support, you can play Alan Wake 3, let's say, with all the bells and whistles on. If not, you get a 2005 level of graphics.
 
Yeah, but that low-hanging fruit is gone. Even now, RT and raster are often interchangeable... all games look fine.
 