
Spider-Man 2 Performance Benchmark

So, ray tracing is a dead end. 6.5 years on the market and still out of reach for the general public, unless you have $1K for a GPU alone. Also, with devs this lazy, TechPowerUp has to step up its game and, like TechSpot, start adding tests at Medium and High so people know what playable settings they can expect from their under-$1K GPUs. In the past you could simply wait with your game purchase and the next-gen GPU would improve your frame rates; now that they've come to the end of the rope with process shrinking, we no longer get to improve our PCs, unless we emigrate to the US and get better-paid jobs.
I just google "<game> <graphics card>" and there's almost never a shortage of people testing that game with that graphics card at all settings.

IMO, TechPowerUp testing at max settings perfectly highlights the problem with RT and why, despite all the promises, it's rarely worth it for most people. Even if you'd splashed out on a PS5's worth of graphics card a generation or two ago, your RTX 2070 or RTX 3070 has aged terribly at the one thing it was named for.
My son's busy playing SM2 on a 1060 3GB. He's getting surprisingly smooth gameplay (35-55 fps) on a 1080p60 TV. No crashes as yet, but some odd game bugs. He wants a new GPU, but we're waiting on the next game patch first, as there are literally hundreds of bug reports - mostly with 40-series Nvidia cards, it seems, but with some AMD cards too.

I told him he should wait a few months, but he'd played SM1 and MM (which were waaay less demanding, hardware-wise) and had been looking forward to this for so long.
That's why I linked the GTX 1060 video up above - not that long ago (mid 2022) it was the most popular GPU on the Steam hardware survey. Plenty of people are still using them!
 
This game looks really poor. Even the more-than-10-year-old Crysis 3 looks considerably better and more modern.

[Spider-Man 2 screenshots]

VS:

[Crysis 3 screenshot]



At least, it shows that 16 GB VRAM is a no-go already!

[VRAM usage screenshot]
 
That's why I linked the GTX 1060 video up above - not that long ago (mid 2022) it was the most popular GPU on the Steam hardware survey. Plenty of people are still using them!

Yeah, I bought it for Zwifting in the garage. He's using my HTPC, which until Thursday night had an 8 GB RX 580 Nitro+ in it. But that card isn't allowed to run the game because of the driver requirement (you need 25.1.1). So, given that an upgrade is probably long overdue anyway, we're looking at an AMD card* like the 7800 XT perhaps. Which is what brought me to the TPU test round-up. Very useful indeed.

*Need frame-packed 3D, plus ideally 4K60 (the HTPC is attached to a TV and a 4K projector). The 1060 isn't nearly as good at keeping 23.976 movies synced as the 580 and doesn't easily do 3D MVC playback.
 
I'm happy to see that plenty of people in the comments recognise how much work TPU put into these benchmarks, and I'll gladly join them. Thanks for this massive benchmark. This is why I support TPU.
The side-by-sides with drop-downs for various settings are really valuable. They're so much more useful than YouTube vids, which rarely zoom in enough to overcome YouTube's terrible compression.
 
I can't really tell much of a difference between medium and RT Ultimate outside of the shadows and the reflections on glass in the outdoor scene. It's really annoying to see comments in benchmark after benchmark from people complaining about optimization and claiming they need a 4090 or 5090 to play, when they could just turn the settings down.

The "Max" quality benchmarks we always see feel misleading because of that.
 
Well done on the effort, but I can't understand how almost half of the 7800 XT's frames were lost at 4K non-RT. I tried the same settings in three different scenes and the loss was always around 15%.
 
It's funny seeing the total shift in PC gaming: a decade ago, running below console settings was unthinkable, and now it's common advice. The PS5 runs this just fine with RT in every single mode; even the performance mode has RT.
 
I can't really tell much of a difference between medium and RT Ultimate outside of the shadows and the reflections on glass in the outdoor scene. It's really annoying to see comments in benchmark after benchmark from people complaining about optimization and claiming they need a 4090 or 5090 to play, when they could just turn the settings down.

The "Max" quality benchmarks we always see feel misleading because of that.
I feel like the RT lighting (ambient occlusion, mostly) is closest to raster medium or high settings.
Based on W1zzard's screenshots, I think I'd be playing this game at non-RT high settings, as the AO seems overdone at higher quality levels.
 
Think of the RT option as a wholly optional game mode that you're free to ignore, like those tick boxes for ambient lighting sync or HairWorks. It doesn't make sense to me to complain so bitterly that it exists. It's bad; leave it off.
We are complaining because Nvidia makes us pay for something we never asked for or wanted. GPUs are so expensive because of the added RT and tensor garbage Nvidia pushes on the gaming community.

Do I have the option of paying less for a GeForce product by buying a non-RT version? No! And for that reason, RT had better work and had better be worth it. But it’s not!

I don’t remember paying an extra $500-$1,000 on a GPU for HairWorks or ambient lighting sync.
 
This game has low requirements but just thrashes modern hardware lol.
 
For the first time there's such a big gap between the 4060 Ti 16 GB and the 8 GB version at 1080p (or I didn't notice it earlier).
 
This game looks really poor. Even the more-than-10-year-old Crysis 3 looks considerably better and more modern.

snip
VS:
snip

At least, it shows that 16 GB VRAM is a no-go already!
This is a joke, right?
 
There are three ways to respond to this:
  1. My comments were in dollars, not Euros, Pounds, or Yen.
  2. $200 equivalent is solidly 1440p territory at major retailers in most of the developed world. If you're not in Europe, Canada, China, or Japan, AliExpress will sell to you in US dollars anyway.
  3. Querying panel type is a straw-man argument here. Not that it matters if the game is only running at 30-35 fps, because even a terrible VA panel could handle that just fine.
No @Chrispy_, the panel type and quality is a serious argument, not only because mid-range GPUs like mine achieve decent framerates, but also (and more importantly) because a monitor is not only for gaming.
 
I've got so many games on hold right now. My 6900 XT ain't playing anything other than Overwatch and Marvel Rivals, because that's all it can handle without turning into a View Master.

 
At 1440p, no upscaling, the 4070 Ti Super is only 28 fps behind the 5080, i.e. 111 fps vs 139 fps. WTF
It's very simple: 20% difference in shaders, 20% difference in performance.
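
A quick back-of-the-envelope check of that scaling argument (a rough Python sketch; the shader counts are from public spec listings, and clock and bandwidth differences are ignored):

```python
# Naive linear scaling: relative shader count -> relative performance.
# Assumed public specs: RTX 5080 = 10752 shaders, RTX 4070 Ti Super = 8448.
shaders_5080 = 10752
shaders_4070_ti_super = 8448

paper_ratio = shaders_5080 / shaders_4070_ti_super   # ~1.27x on paper
measured_ratio = 139 / 111                           # ~1.25x in this benchmark

print(f"shader ratio {paper_ratio:.2f}x, measured fps ratio {measured_ratio:.2f}x")
```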
How bad is the 5070 going to be?
A little bit faster than a 4070 and below the 4070 Super.
 
below 4070 Super.
I'd definitely place it below the 4070 Ti but not below the 4070 Super. Blackwell clocks incredibly high - we may well see 3.1+ GHz out of the box. That compensates for the abysmal core count a little.
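
A crude shader-count × clock sketch of that claim (core counts are from public spec listings; the clocks are assumptions, with the 3.1 GHz figure taken from the post above):

```python
# Crude throughput proxy: shader count x clock frequency.
# The ~2.7 GHz Ada boost clock and 3.1 GHz Blackwell clock are assumptions.
cards = {
    "RTX 4070":       (5888, 2.7),
    "RTX 4070 Super": (7168, 2.7),
    "RTX 5070":       (6144, 3.1),  # assumed out-of-the-box clock
}

for name, (shaders, ghz) in cards.items():
    print(f"{name}: {shaders * ghz / 1000:.1f} (arbitrary units)")
# 4070 ~15.9, 5070 ~19.0, 4070 Super ~19.4 -> the 5070 lands between them
```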
 
That's what happens when devs still bother with classical lighting.

You can see the same in The Witcher 3. The only difference RT makes is darker shadows, still all in the same places, and it only costs half your FPS. Meanwhile, non-RT reflections in CP2077 reflect a blocky Minecraft world back at you. No wonder they didn't implement mirrors.

Crazy when you consider we had to add RT and Tensor cores to GPUs just to get that.

You have to wonder how good rasterized lighting implementations would have been had we continued to invest fully in them. Threat Interactive has already demonstrated that light bounces can be simulated via surface probes using traditional rasterized techniques. I guess we'll never find out, because Nvidia threw, and continues to throw, money at ensuring RT is everywhere. That's one of the dangers of a vertical monopoly: Nvidia exerts influence over the software to integrate the features it wants to push, which in turn creates an incentive for customers to buy Nvidia cards and not competitors' products. The more influence Nvidia has over the software, the more it can influence hardware sales.

RT performance still needs massive uplifts to be viable, yet Nvidia is out here selling 8% performance gains at rip-off prices. RT effects like shadows are often still rendered at low resolution and are obviously grainy. We still have 0 light bounces and 0.5 rays per pixel; just increasing to 1 ray per pixel would double the RT workload.
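
To put that rays-per-pixel point in rough numbers (a hedged sketch; the 0.5 rpp figure is the one quoted above, and real games vary ray counts per effect):

```python
# Ray budget: rays per second at a given resolution, rays-per-pixel
# (rpp) and frame rate. The workload scales linearly with rpp, so
# going from 0.5 to 1.0 rpp exactly doubles it.
def rays_per_second(width: int, height: int, rpp: float, fps: int) -> float:
    return width * height * rpp * fps

half_rpp = rays_per_second(3840, 2160, 0.5, 60)   # ~249 million rays/s
full_rpp = rays_per_second(3840, 2160, 1.0, 60)   # exactly 2x

print(f"{half_rpp / 1e6:.0f} Mrays/s -> {full_rpp / 1e6:.0f} Mrays/s "
      f"({full_rpp / half_rpp:.0f}x the workload)")
```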

At the current rate RT will take another 4 generations to get to a decent place and god knows how long before that trickles down to the low end.
 
My 4070 Ti runs at a little over 3 GHz; I would hope these would be around 3.4, lol. To me that would be a good clocker.
 
We don't know exactly which part of the game was used for the measurements - I get completely different numbers with the 7800 XT at max RT in all three resolutions.
 
At the current rate RT will take another 4 generations to get to a decent place and god knows how long before that trickles down to the low end.
It will never get to a decent place. Like many other workloads in graphics, it's something that scales linearly: if you want twice the performance, you need twice the number of shaders. If the impact to performance was -50%, it will remain -50% forever.
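
A minimal sketch of that argument, assuming both the raster and RT passes scale linearly with shader throughput: the percentage hit from enabling RT then stays constant no matter how fast the GPU gets. The 10 ms pass times are made-up illustrative numbers.

```python
# If raster and RT work both scale linearly with shader throughput,
# the relative cost of enabling RT is the same on every generation.
def fps(raster_ms: float, rt_ms: float, speedup: float) -> float:
    return 1000 / ((raster_ms + rt_ms) / speedup)

for speedup in (1, 2, 4):             # each next GPU twice as fast
    rt_off = fps(10, 0, speedup)      # assume raster pass = 10 ms
    rt_on = fps(10, 10, speedup)      # assume RT pass adds 10 ms
    print(f"{speedup}x GPU: {rt_off:.0f} fps off, {rt_on:.0f} fps on, "
          f"hit {1 - rt_on / rt_off:.0%}")   # always -50%
```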
 
Another "Starfield incident" for Intel with B580. Is there a deep dive to read somewhere re: what are they doing so different with their cards with architecture or the way the work compared to others? I feel like if you don't care about a few extra fps you can usually get away with not updating the drivers for a whole year and the games still work fine for the most part with both Nvidia and AMD, but Intel's team needs to spend time on so many releases for them to work somewhat properly or at all.
 
The comments are more interesting than the article. Good to see people are seeing what RT is and comparing the overall visual experience to much older games like Crysis 3.

This is not progress.
 
If you expect to see good RT in a console-to-PC port, well... good luck.
The game is developed with a console mindset.
That's why it runs heavy on the CPU side, since the PS5/XBX are equipped with a decent 8-core/16-thread CPU, but not on the GPU side, since their roughly 6700-equivalent GPU is good only for raster.

Console ports shouldn't even be considered a reference for checking RT differences, since they apply only the most basic effects at the lowest possible resolution and depth.
 