
Spider-Man 2 Performance Benchmark

W1zzard

Spider-Man 2 has arrived for the PC platform. Sony's newest port offers fantastic superhero gameplay, but it is plagued by instability issues and crashes. Once you get it running stably, though, performance is pretty decent, and all the major technologies are supported. In our performance review, we'll look at the game's graphics quality, VRAM consumption, and how it runs across a range of contemporary graphics cards.

 
Wow, that’s a lot of work.
 
I recommend people view the comparison slider images, with one having max RT and one just being max without RT.

I know RT is more accurate, but seeing how max adds some shadow detail here and there that isn't there in RT (which, again, is probably more realistic), it does make the non-RT version look more believable; with RT a lot of it is just so flat and ugly in how it meshes together. (Mostly referring to the 2nd image, the one outside.)
 
Awesome work; I imagine you haven't slept for a couple of weeks now... Love the comparisons, as always.
 
Sheesh, not a good game for the B580; that performance is dreadful.
AMD performance is also notably poorer than expected, especially if you're running out of VRAM, as W1zzard mentioned.

I'd expect a fix from Intel soon, but AMD I'm not so sure about.
 
Sheesh, not a good game for the B580; that performance is dreadful.
AMD performance is also notably poorer than expected, especially if you're running out of VRAM, as W1zzard mentioned.

I'd expect a fix from Intel soon, but AMD I'm not so sure about.

How come?
Seems to me that, with the crashing etc., it will require effort from the driver teams but also some patches from the devs; I don't know why you think you can already make such remarks.
 
This is what's wrong with the gaming industry: the two $500 graphics cards are delivering low-30s framerates at a $200-monitor resolution using RT, which has been nothing short of a disaster for mainstream gaming for the last six straight years.

It doesn't look bad, but it certainly isn't a huge leap forward over older games that run at 200 fps on the same hardware, and graphics do not make the gameplay design any better. The bugs are still bugs, and the bad aspects of the gameplay are still bad.

Overall, the game is supposed to be fun, but it could have been fun without the godawful RT implementation and miserable RT performance.
 
This is what's wrong with the gaming industry: the two $500 graphics cards are delivering low-30s framerates at a $200-monitor resolution using RT, which has been nothing short of a disaster for mainstream gaming for the last six straight years.

It doesn't look bad, but it certainly isn't a huge leap forward over older games that run at 200 fps on the same hardware, and graphics do not make the gameplay design any better. The bugs are still bugs, and the bad aspects of the gameplay are still bad.

Overall, the game is supposed to be fun, but it could have been fun without the godawful RT implementation and miserable RT performance.
What do you mean? The 4070 is getting above 60 fps with max RT at 1080p, not 30 fps.
 
I recommend people view the comparison slider images, with one having max RT and one just being max without RT.

I know RT is more accurate, but seeing how max adds some shadow detail here and there that isn't there in RT (which, again, is probably more realistic), it does make the non-RT version look more believable; with RT a lot of it is just so flat and ugly in how it meshes together. (Mostly referring to the 2nd image, the one outside.)
That's what happens when devs still bother with classical lighting.

You can see the same thing in The Witcher 3. The only difference RT makes is darker shadows, still all in the same places, and it only costs half your FPS. Meanwhile, non-RT reflections in CP2077 reflect a blocky Minecraft world back at you. No wonder they didn't implement mirrors.
 
I recommend people view the comparison slider images, with one having max RT and one just being max without RT.

I know RT is more accurate, but seeing how max adds some shadow detail here and there that isn't there in RT (which, again, is probably more realistic), it does make the non-RT version look more believable; with RT a lot of it is just so flat and ugly in how it meshes together. (Mostly referring to the 2nd image, the one outside.)
Actually, the max (no RT) shadows are unrealistic. Like the shadows at the bottom of the ledge in the lower left.
And if you want to talk max RT vs. max non-RT, how about those off-screen window reflections (same scene) that non-RT cannot compute and just smears the windows instead?
 
I recommend people view the comparison slider images, with one having max RT and one just being max without RT.

I know RT is more accurate, but seeing how max adds some shadow detail here and there that isn't there in RT (which, again, is probably more realistic), it does make the non-RT version look more believable; with RT a lot of it is just so flat and ugly in how it meshes together. (Mostly referring to the 2nd image, the one outside.)

The more important thing:

Look at those 1440p RT (with 960p upscaling) mins for the 4070 Ti.
If you spent a not-dissimilar amount of money on a 7900 XTX (for absurdly better raster), you could actually keep 60 fps mins (with a little OC)! Or even 4K native without RT. The 4080 isn't even close at 4K native.
Look at those 4K RT (1440p upscaling) mins with the 4090/5080. Ooo... missed it by that much. Totally not made that way on purpose so that overclocking will still keep you below 60. Time to upgrade! :(
Not good enough for native 1440p RT on the 5080 either.

The 6800 XT is good enough for 1440p native max non-RT. Cheap as nuts.

Ain't that a b. Or rather an L for the overpriced Nvidia cards.

...and it proves my point EXACTLY about what NVIDIA does.

Or, you know, exactly why you shouldn't invest in ray tracing until 3 nm (built toward next-gen consoles)... which I have said for six damn years and will continue to say until both companies launch 3 nm parts.

I will then declare it has arrived, and we can all play together on whatever cards can actually run the damn thing for more than one gen of games.

...But I can't promise it won't still be ugly and/or still won't be worth the performance trade-off. :p
 
This is what's wrong with the gaming industry: the two $500 graphics cards are delivering low-30s framerates at a $200-monitor resolution using RT, which has been nothing short of a disaster for mainstream gaming for the last six straight years.

It doesn't look bad, but it certainly isn't a huge leap forward over older games that run at 200 fps on the same hardware, and graphics do not make the gameplay design any better. The bugs are still bugs, and the bad aspects of the gameplay are still bad.

Overall, the game is supposed to be fun, but it could have been fun without the godawful RT implementation and miserable RT performance.
Think of the RT option as a wholly optional game mode that you're free to ignore, like those tick boxes for ambient lighting sync or HairWorks. It doesn't make sense to me to complain so bitterly that it exists. It's bad; leave it off.
 
Think of the RT option as a wholly optional game mode that you're free to ignore, like those tick boxes for ambient lighting sync or HairWorks. It doesn't make sense to me to complain so bitterly that it exists. It's bad; leave it off.

That's fair enough, but also remember that it is what people prop their NVIDIA buying habits upon. To sweep that under the rug is to gaslight people.

It is COMPLETELY fair game to criticize every single aspect of it because of that.
 
What do you mean? The 4070 is getting above 60 fps with max RT at 1080p, not 30 fps.
I mean a $500 GPU and a $200 monitor.

The 4070 was a $600 card, dropped to ~$550 when the 4070S launched, and is now back at $600 due to a lack of supply. $500 gets you a 4060 Ti 16GB or a 7800 XT at best, and if you walk into a US store and try to buy the cheapest-listed 7800 XT with five benjamins, you'll need to supplement that to cover the sales tax that's left off sticker prices for unjustifiable reasons only North America seems to perpetuate.
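
For what it's worth, a minimal sketch of that sales-tax point, assuming a hypothetical 7% combined rate purely for illustration (the actual rate varies by state and isn't given here):

    # US sticker prices exclude sales tax, so a "$500" card costs more at the register.
    sticker_price = 500.00
    sales_tax_rate = 0.07  # assumed example rate; varies by state/locality
    checkout_price = sticker_price * (1 + sales_tax_rate)
    print(f"${sticker_price:.0f} sticker -> ${checkout_price:.2f} at checkout")  # $535.00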

As for monitors, $200 gets you a very decent, fully-featured 1440p high-refresh display. If you think $200 is 1080p territory, perhaps you should take a serious look at the current monitor market:
  • Reasonable 1080p monitors (IPS, 100 Hz) start at $59, with most major brands offering decent high-refresh models for $99.
  • Decent 1440p 144 Hz+ gaming monitors are offered by most major brands at $149, though you can get China-direct 1440p models for under $100.
  • There are a few 3440x1440 gaming monitors for under $200, but like 4K monitors, the decent, high-refresh offerings from name brands start at about $280.

That's fair enough, but also remember that it is what people prop their NVIDIA buying habits upon. To sweep that under the rug is to gaslight people.

It is COMPLETELY fair game to criticize every single aspect of it because of that.
Exactly, double standards everywhere!

We've been paying for RT in our GPUs for six years now, without any choice in the matter - and developers have been using it in their games for six years as well. I personally think DXR/RT is a wasteful, immature technology that is a long way from the real raytracing I deal with daily at work, but that doesn't change the fact that people buy new hardware and new games for the latest features like RT and high-res textures.

Spider-Man 2 is playable on an ancient GTX 1060, worth a measly $60-70 today, but that's not why people are looking at graphics card reviews and game performance reviews in 2025.
 
if you think $200 is 1080p territory perhaps you should take a flight
Fixed that. Outside the US, it IS 1080p territory. Unless, of course, you're fine with mediocre VA/TN panels and/or gnome diagonals.

Not the point. I guess, RT or no RT, shooting webs is what people buy Spider-Man for in the first place. Although my secret clinic for spider experiments... puts its own spin on that dispute. RT in this game is awful, though. Totally not investing in a GPU for that.
 
Although my secret clinic for spider experiments...
I would picture it more as something out of Deadpool than Oscorp.

Only judging by location :laugh:
 
I would picture it more as something out of Deadpool than Oscorp.

Not really into Marvel, or DC. It's just... technically speaking, I have yet to achieve at least one living spider-man.
 
Fixed that. Outside the US, it IS 1080p territory. Unless, of course, you're fine with mediocre VA/TN panels and/or gnome diagonals.
There are three ways to respond to this:
  1. My comments were in dollars, not euros, pounds, or yen.
  2. The $200 equivalent is solidly 1440p territory at major retailers in most of the developed world. If you're not in Europe, Canada, China, or Japan, AliExpress will sell to you in US dollars anyway.
  3. Querying panel type is a straw-man argument here; not that it matters if the game is only running at 30-35 fps, because even a terrible VA panel could handle that just fine.
 
Fixed that. Outside the US, it IS 1080p territory. Unless, of course, you're fine with mediocre VA/TN panels and/or gnome diagonals.

Not the point. I guess, RT or no RT, shooting webs is what people buy Spider-Man for in the first place. Although my secret clinic for spider experiments... puts its own spin on that dispute. RT in this game is awful, though. Totally not investing in a GPU for that.
This is just not true at all. In Europe you can get quality 1440p 180 Hz IPS panels for 150 euros; that's about $129 without tax, by the way.
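
A rough sketch of how that conversion works out, assuming a typical ~21% EU VAT rate and roughly 1.04 USD per EUR (both are assumptions for illustration, not figures from the post):

    # Strip VAT from a 150 EUR sticker price, then convert to USD.
    price_eur_incl_vat = 150.0
    vat_rate = 0.21      # assumed typical EU VAT rate
    usd_per_eur = 1.04   # assumed exchange rate
    price_eur_ex_vat = price_eur_incl_vat / (1 + vat_rate)   # ~124 EUR
    price_usd_ex_vat = price_eur_ex_vat * usd_per_eur        # ~129 USD
    print(f"{price_eur_ex_vat:.0f} EUR ex-VAT is roughly ${price_usd_ex_vat:.0f}")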

How come?
Seems to me that, with the crashing etc., it will require effort from the driver teams but also some patches from the devs; I don't know why you think you can already make such remarks.
Because Intel has strong motivation and a long history of fixing modern games for its GPUs, even if it consistently misses launch day, which is what this iffy launch performance illustrates.
AMD seems intent on cruising and not really fixing performance issues unless they're downright dreadful or cause crashes; the latter is typically more on the developer than on AMD, though.
 
Actually, the max (no RT) shadows are unrealistic. Like the shadows at the bottom of the ledge in the lower left.
And if you want to talk max RT vs. max non-RT, how about those off-screen window reflections (same scene) that non-RT cannot compute and just smears the windows instead?

Yes, I know; that's why I said "that isn't there in RT (which, again, is probably more realistic)".
 
So, ray tracing is a dead end. 6.5 years on the market and still unavailable to the general public unless you have $1K for a GPU alone. Also, with devs so lazy, TechPowerUp has to step up its game and, like TechSpot, start adding tests at Medium and High settings so people know what playable settings they can expect with their under-$1K GPUs. In the past you could simply hold off on your game purchase and a next-gen GPU would improve your frame rates; now that they've hit the end of the rope with process shrinking, we no longer get to improve our PCs unless we emigrate to the US and get a better-paid job.
 
My son's busy playing SM2 on a 1060 3GB. He's getting surprisingly smooth gameplay (35-55 fps) on a 1080p60 TV. No crashes as yet, but some odd game bugs. He wants a new GPU, but we're waiting on the next game patch first, as there are literally hundreds of bug reports - mostly with 40-series Nvidia cards, it seems, but also on better AMD cards too.

I told him he should wait a few months, but he'd played SM1 and MM (which were waaay less demanding, hardware-wise) and had been looking forward to this for so long.
 
Thanks for the test!

At 1440p, no upscaling, the 4070 Ti Super is only 28 fps behind the 5080, i.e. 111 fps vs. 139 fps. WTF.

How bad is the 5070 going to be?
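
For context, a quick back-of-the-envelope check of that gap, using only the two figures quoted above:

    # Relative gap between the two averages quoted in this post (not re-measured).
    fps_4070_ti_super = 111  # 1440p, no upscaling
    fps_5080 = 139           # 1440p, no upscaling
    gap_pct = (fps_5080 - fps_4070_ti_super) / fps_4070_ti_super * 100  # ~25%
    share_pct = fps_4070_ti_super / fps_5080 * 100                      # ~80%
    print(f"5080 leads by {gap_pct:.0f}%; the 4070 Ti Super delivers {share_pct:.0f}% of it")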
 