
Spider-Man 2 Performance Benchmark

I recommend people view the comparison slider images with one side at max settings with RT and the other at max settings without RT.

I know RT is more accurate, but man, just seeing how max settings add some shadow detail here and there that isn't there with RT (which, again, is probably more realistic)... it does make the non-RT version more believable looking. With RT a lot of it is just so flat and ugly in how it meshes together (mostly referring to the 2nd image, the one outside).
I feel like non-RT shadows are actually doing too much sometimes; some shadows are so big that you get the impression there's a big gap between some objects (like the ceiling with the wood moulding strip), or that they're hovering. But you're right that RT does seem awfully flat at times... there are probably improvements to be made to their global illumination, because the shadows under objects/characters should be smaller, not non-existent :D
Edit: I just saw that they are using RT AO instead of RT GI. That might be the explanation.
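To illustrate the difference I mean, in generic shading terms (made-up numbers, not Insomniac's actual shader): RT AO only scales the flat ambient term down, while RT GI replaces it with actual bounced light:

```python
# Generic shading sketch: values are made up, not any real engine's model.
direct  = 0.8    # direct lighting at the shaded point
ambient = 0.2    # flat ambient fill (the "flat" look)
ao      = 0.6    # RT AO visibility term, 0 = fully occluded
gi      = 0.15   # RT GI: actual gathered bounce light, varies per point

color_rt_ao = direct + ambient * ao   # AO: ambient uniformly dimmed
color_rt_gi = direct + gi             # GI: contact darkening falls out naturally
print(color_rt_ao, color_rt_gi)
```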
 
RT looks as it should (i.e. realistic). Beyond that, it's all in the hands of the artist to decide whether they want to add texture detail, light sources, or mess with the light color. Just like rasterization, RT is only a tool. How you use it is a different matter.
I keep recycling this example, but back when we got pixel shader 3.0, game devs didn't know how to use that properly either. All characters looked like they were made of plastic. Devs got the hang of it eventually.
 
That's true, but I've learned that even path tracing can look noticeably different from one engine to the next. The shape and position of the shadows are the same, but the intensity is different. Offline 3D renders are rarely used "as is": you can tweak shadows/reflections/specularity in post, but I don't know how that would work for real-time applications, besides tweaking how the raw render looks.
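To show what that post tweaking looks like in practice, here's a minimal compositing sketch over render passes (AOVs). The pass names, files, and strength values are made-up assumptions, not any particular renderer's output:

```python
import numpy as np

# Hypothetical render passes (AOVs) exported from an offline renderer as
# float32 HxWx3 arrays; the file names are made up for illustration.
diffuse  = np.load("diffuse_pass.npy")
specular = np.load("specular_pass.npy")
shadow   = np.load("shadow_pass.npy")   # 1.0 = fully lit, 0.0 = fully shadowed

SHADOW_STRENGTH   = 0.7   # < 1.0 lifts shadows, > 1.0 deepens them
SPECULAR_STRENGTH = 1.2   # boost or tame highlights without re-rendering

# Rebuild the beauty image from the tweaked passes: the shape and position
# of the shadows are untouched, only their intensity changes.
lit    = 1.0 - SHADOW_STRENGTH * (1.0 - shadow)
beauty = np.clip(diffuse * lit + SPECULAR_STRENGTH * specular, 0.0, None)
```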
 
I guess it depends. For professional work, you'd think someone would have come up with some sort of reference by now. For games... that's just entertainment; I don't think being spot-on matters much. Though it's not inconceivable to put a bunch of scenes through a software/CPU renderer and say "this is what the scenes must look like". Sort of like what happened during the early AF optimization days (not sure whether you were into 3D gfx back then).
 
Great review, good to know RT does nothing but tank performance on everything. I agree, many older games look far better. The waterfall is highly realistic.

I get that it's a cartoonish game; it should have been left as just that, with requirements low enough that a potato could play it, bringing the masses together.
 
Thanks for the test!

At 1440p with no upscaling, the 4070 Ti Super is only 28 fps behind the 5080 (111 fps vs 139 fps). WTF

How bad is the 5070 going to be?

I don't know, but let's take a moment to really, really fixate on the 4070 Ti 12GB. The would-be 4080 12GB. 12288MB. The wanted-to-be-$900 card. The card that actually cost eight hundred goddamn dollars.

Really, just...take it in. Breathe it in. This is why I attempt to explain things to people. For these moments. Then think about the 5070.


What more can you say, really?

SCORN APPROACHING!
 
What I see is that the 24GB of VRAM on the 7900 XTX never saves you when RT/PT is on.
You're more likely to get better fps from a 12GB 4070 Ti than from the 7900 XTX.
And we already have games with no option to disable RT…
 
What are we looking at? The 4070 Ti, an $800 card, is faster than the 7900 XT, a $900 card, and close to the XTX, a $1,000 card.
 
Another terribly optimized game. I can barely see a difference in either shadows or textures when comparing MAX RT ULTIMATE and Very Low, but somehow on max settings the fps gets destroyed... I see no reason to play above medium settings, seriously.

Also, Crysis 3 looks better than this.
 
Crazy when you consider we had to add RT and Tensor cores to GPUs just to get that.

You have to wonder how good rasterized lighting implementations would have been had we continued to invest fully in them. Threat Interactive already demonstrated that light bounces can be simulated via surface probes using traditional rasterized techniques. I guess we'll never find out, because Nvidia threw, and continues to throw, money at ensuring RT is everywhere. That's one of the dangers of a vertical monopoly: Nvidia exerts influence over the software to integrate features it wants to push, which in turn creates an incentive for customers to buy Nvidia cards rather than competitors' products. The more influence Nvidia has over the software, the more it can influence hardware sales.
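To make the surface-probe idea concrete, here's a toy 2D sketch of precomputed bounce lighting interpolated from a probe grid. It's purely illustrative (the grid size, light model, and inverse-square falloff are all assumptions), not Threat Interactive's or any shipping engine's implementation:

```python
import numpy as np

GRID = 8                       # 8x8 probe grid over a unit-square scene
LIGHT_POS = np.array([0.5, 1.0])
ALBEDO = 0.5                   # how much light a surface re-emits

def direct_light(p):
    """Direct lighting: simple inverse-square falloff from a point light."""
    d2 = np.sum((p - LIGHT_POS) ** 2) + 1e-3
    return 1.0 / d2

# Precompute probes: each probe gathers light bounced off nearby surface
# samples (random points here, standing in for scene geometry).
rng = np.random.default_rng(0)
surface_pts = rng.random((256, 2))
probes = np.zeros((GRID, GRID))
for i in range(GRID):
    for j in range(GRID):
        probe = np.array([i, j]) / (GRID - 1)
        for s in surface_pts:
            d2 = np.sum((probe - s) ** 2) + 1e-3
            probes[i, j] += ALBEDO * direct_light(s) / d2
probes /= len(surface_pts)

def indirect_light(p):
    """Shade-time lookup: bilinear interpolation of the 4 nearest probes."""
    x, y = p * (GRID - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, GRID - 1), min(y0 + 1, GRID - 1)
    fx, fy = x - x0, y - y0
    top = probes[x0, y0] * (1 - fx) + probes[x1, y0] * fx
    bot = probes[x0, y1] * (1 - fx) + probes[x1, y1] * fx
    return top * (1 - fy) + bot * fy

p = np.array([0.3, 0.4])
print(direct_light(p) + indirect_light(p))   # total = direct + one bounce
```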

RT performance still needs massive uplifts to be viable; meanwhile Nvidia is out here selling 8% performance gains at rip-off prices. RT effects like shadows are often still rendered at low resolution and are visibly grainy. We still have 0 light bounces and 0.5 rays per pixel; Nvidia need only go to 1 ray per pixel to double the RT workload.
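For scale, a back-of-the-envelope ray budget; the 4K resolution and 60 fps target are assumptions for illustration, and only the 0.5 rays/pixel figure comes from the paragraph above:

```python
# Ray budget at two rays-per-pixel rates; resolution and fps are assumed.
WIDTH, HEIGHT = 3840, 2160     # 4K
FPS = 60

for rays_per_pixel in (0.5, 1.0):
    rays_per_frame = WIDTH * HEIGHT * rays_per_pixel
    rays_per_sec = rays_per_frame * FPS
    print(f"{rays_per_pixel} rpp: {rays_per_frame/1e6:.1f}M rays/frame, "
          f"{rays_per_sec/1e9:.2f}G rays/s")
# Going from 0.5 to 1.0 rpp doubles the ray count, before counting any bounces.
```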

At the current rate, RT will take another 4 generations to get to a decent place, and god knows how long before that trickles down to the low end.
Raytracing at home. What Path of Exile 2 uses:
 
The 4070Ti runs out of VRAM at 1440p with RT on. I wonder how the 5070 will do when it only has 12GB.
And the 7900XT has been well under $900.
 
I feel the need to point out that the 1440p results are likely to skew a little lower than what most users here would report, as the test system uses a 16:10 aspect ratio (shown in the settings images). The scaling would still likely leave the rankings unchanged, but the difference between 2560x1600 and 2560x1440 is about 400,000 pixels per frame needing to be rendered. If this wasn't accounted for, you could expect an ~11% performance difference if the scaling is linear.
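A quick sanity check of that ~11% figure, using just the two resolutions from the post:

```python
# Pixel counts at the two resolutions mentioned above.
px_1600 = 2560 * 1600          # 4,096,000 pixels (16:10)
px_1440 = 2560 * 1440          # 3,686,400 pixels (16:9)
extra = px_1600 - px_1440      # 409,600 extra pixels per frame
print(extra, extra / px_1440)  # -> 409600, ~0.111 (≈11% more work if linear)
```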
 
It has higher minimums than the more expensive 7900 XT and is close enough to the super-expensive 24GB XTX. But it runs out of VRAM... okay.
 
Again, there's a difference between "game AAA needs BBB amount of VRAM" and "game AAA uses BBB amount of VRAM". Since the 4070 Ti's performance does take a sharp dive at higher resolutions, that would suggest this title falls into the latter category. Most do.
 
I absolutely love that you added the performance with RT + Quality upscaling. Fantastic addition to the benchmark results, since it reflects what ~95% of the people with balanced setups will actually do.
 
What are you using to get the FPS for the AMD cards? There is a huge discrepancy in reported FPS between Afterburner and the AMD overlay.
 
It's obviously not a VRAM issue when the 24GB 3090 and 7900 XTX are getting the same performance. Not quite sure what point you're trying to make. This looks completely normal with the expected strength of the GPUs. 4070-Ti/4070 Super and the 3090 are all around the same strength in regular performance and ray tracing, and they get the same performance and 1% lows, with AMD being trash in RT as usual.
 
7 years since the introduction of ray tracing, and we still only have reflections on glass, costing half the total fps... embarrassing.
Thanks, Nvidia, for this legalized scam; let's wait for the PS6.
 
You have much more than that: correct shadows, proper global illumination... Of course, it doesn't help when devs go "oh, this ray tracing thing is starting to work? Let's do path tracing then...".
But yes, opening that can of worms just as we face the limits of silicon probably didn't do us many favors.
 
This game comes with several RT options, and RT reflections are the only one worth using.

Instead of testing only RT off or RT at max, why not also test RT reflections only?
 
No, the XTX's 24GB saves you in 4K60 raster, which is where and why even the 4080 gets smooshed.

The 4070 Ti can't keep 60 fps upscaling 960p to 1440p because it clearly runs out of VRAM. I mean, literally look at the VRAM usage and how far under 60 fps it falls, and do the appropriate math.
It will not get better in the future. You want at least 45TF/16GB. Weird how the 4070 Ti and 5070 are purposely below that (except it's not weird at all, because nVIDIA!).

This is all by design. If you live in denial about the RAM, you live in denial about the RAM.
I'm just reporting the facts of the situation so people don't fall into the same trap with 4K upscaling and the 5080 in the future. Because it happens; that is literally why they did it. And they will sell a 24GB Super.
Probably. Which, unless they clock it a ton higher, will make absolutely no sense, because it won't have enough resources for 24GB to make sense. They will sell a (probably 11264-12288sp) 6080 on 3nm that does.
Because the 16GB card too will run out of RAM in new titles at 1440p->4K with all this stuff enabled; likely >16GB, perhaps under 18GB. Look at the 5080, then scale to 60 fps and add proportionate RAM.
That is what a well-matched card would be, which the 5080 is not. Again, this is the job of 192-bit 3nm cards (and people need to be aware of that), and why the 5080 16GB is a ridiculous investment.

Don't buy a 4070 Ti/5070 for 1440p RT (even upscaled). That is the point. Any future competition will likely also be playable at 1080p RT (960p->1440p for the 9070 XTX?) because of 16GB of RAM.

Again, OTOH: 6800 XT for 1440p60, 7900 XTX for 4K60 native non-RT, 4090 for all the things.
Obviously the upcoming 9070 series is the most sensible wrt perf/$ longevity, but an OC'd 7800 XT ain't bad for 1440p native. The 6800 XT is OK. None of them is unplayable at 1440p.
7800/9070 for 1080p RT, 9070 XTX for (960p) 1440p upscaled RT. I'm sure there are instances where the 4070 Ti/5070 will be okay, but I'm betting that won't be the case more often than not, because <45TF/16GB.
I'm hoping most logical people can understand what's happening and understand my point.

I'm not trying to insult people who made that choice. I'm suggesting people make common-sense choices.
You don't buy a 7000 series for RT. But if you bought an XTX for 4K60 raster, you could still upscale from 960p/1080p to 1440p/4K RT (with an OC); that's my point. You can't do both with the similarly-priced 4070 Ti.

They don't have regular 1440p->4K non-RT upscaling on the graph, but that is where the 7900 XT shines and the 4070 Ti would falter. Same with the 9070 (1440p native, 1440p->4K upscaling, sometimes non-RT) vs the 5070.

Do what you want... I don't really want to have this argument until after the 5070 and 9070 series launch.

It will prove my point on a greater scale about the common sense of AMD and the constant upgrade cycle required with nVIDIA (outside the 4090).

In reality, people should again be waiting for 192-bit parts (comparable to a PS6) or 256-bit parts (comparable to a 4090). Those will be aimed at what people actually want.

I don't care which company you buy from. I'm simply saying I wouldn't buy a card for >1440p non-RT right now. And that is the point of the 9070 series (and the 6800 XT/7800 XT for that matter, granted the 9070 is a 1080p RT card).

If you want native 4K max non-RT, a 7900 XTX will get you that. So will a 5080. But a 5080 will also run out of RAM when upscaling and/or at native 4K relatively soon. A 7900 XTX will not, though it may lack enough raster.
Again, this is why you wait for a 256-bit 3nm card that will replace the 5080/7900 XTX and perform like a 4090, or buy the step down that will be faster but have 18GB of RAM.

It will all become clearer as more games and their engines start showing signs of being prepared for PS6 ports.
There is a reason the 4090 is exactly a 1440p->4K upscaled RT card. The PS6 would likely do that from 1080p, which means a card that performs like a 5080 (/9070 XTX?) but may in some cases use more RAM.

I'm sure we'll have this conversation more over the next year or so leading into whatever the PS6 and 3nm specs turn out to be... even though you can already predict them more or less.

(11264?) ~11520-12288sp @ 3.7-4GHz, 256-bit/24GB
8192-9216sp @ 192-bit/18GB

I will buy the perf/$ card at 11264sp(+), because it should scale 1440p->4K in almost all instances where a PS6 does 1080p->4K. Exactly like an (at worst overclocked) 4090. Exactly like how the 2080 Ti lasted through the PS5 era.
If you want to buy a 192-bit card, I wouldn't blame you, but it's really a toss-up between that and a PS6 (if you're just gaming). The cards will likely come first and (I would hope) be similarly priced or cheaper.
Again, nothing available right now makes sense above 16GB/1440p (1080p RT) unless you bought a 4090. 1440p (1080p RT) DOES make sense; nVIDIA is taking the over/under, because that's how they getcha.
And the 9070 will hit it on the head.
That's exactly why AMD didn't make a faster card this gen: they couldn't beat a 4090, and that card will scale for a long time at precisely what you see here. AMD needs to match it; again, that is the point of 3nm.
They will likely have to beat the price of a 6080 with similar characteristics at $1000, maybe priced AMD-style at Ti level, and then we can all be happy for a long time, whatever brand we personally choose.

That's the main takeaway I want people to understand. Look at how this game performs on a 4090. Look at Timespy GT1 scores (and imagine a card at 120fps; essentially a 4090).

Everything is in flux, but the 4090 is the centerpiece of the next generation of gaming, more or less: 1440p->4K RT upscaling, 4K60 native max w/o RT.
The next step down would be 1080p->4K RT: a 5080 (9070 XTX?) with more RAM. That would be your native 1440p card moving forward.
The 5080 and 9070 (arguably the 6800 XT/7800 XT/4080) are good limbo picks for 1440p. Some of those cards are/will be vastly cheaper than others. This is what people should hold themselves over with, IMHO.

If you don't believe me, then just wait and see. We'll see who's right. :)

I know I am, but that's not the point. The point is I want people to be happy and not foolishly waste their money. I also want nVIDIA to stop the shenanigans, and you all should too.
 
Neither can the much more expensive 7900 XTX with twice the VRAM. So what? Why aren't you complaining about the 7900 XTX not keeping 60 fps in that exact same scenario?
 
I just bought this game to see what the difference is between the AMD overlay and Afterburner.
 