
The Last of Us Part I Benchmark Test & Performance Analysis

W1zzard

The Last of Us is finally available for PC. This was the reason many people bought a PlayStation; it's a masterpiece that you can't miss. In our performance review, we're taking a closer look at image quality, VRAM usage, and performance on a selection of modern graphics cards.

 
This is quite an appalling thing:

On first game launch, the title starts precompiling shaders, a lot of them, totaling 12 GB. This takes forever; even on a decent CPU, 10-20 minutes is not unexpected. What makes things worse is that this shader compilation will often crash, especially on GeForce cards, and especially when you have less than 32 GB of RAM. At least you don't have to start all over again, but launching the game ten or twenty times just to get to the main menu is really unacceptable. I wonder what the QA testers were doing all this time... Once you've made it into the game, it will still crash sometimes (4x in my testing). The AMD Radeon driver also crashed several times during testing, and I've seen numerous reports of DLSS causing additional crashes when enabled. The developers have already pushed out several game updates, and it seems they are serious about fixing these problems quickly. Let's hope they can resolve them soon, so that people can enjoy this masterpiece of a game.

I'm old enough to remember inserting a tape cassette into a 'computer' to load a game, complete with the classic screeching noise. That a game takes longer to load in 2023 than a 128 KB game from the eighties is quite an achievement in backward nostalgia.

I'd like to play the game, but I'll wait for the wrinkles to be ironed out with a planet cracker.

I mean, why is this acceptable with today's gaming tech?
 
Long shader compilation times should have been identified early on.

This is unavoidable; the alternative is on-the-fly shader compilation, which means horrid non-stop stutter. Trust me, they made the correct choice.
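To make the tradeoff concrete, here's a minimal sketch of the usual mitigation (my own illustration, not the game's actual code; assumes a valid VkDevice already exists and error handling is trimmed), using Vulkan's pipeline cache API: compile every pipeline once up front, then persist the driver's cache to disk so later launches are fast. The lazy alternative is creating pipelines at draw time, which is exactly where mid-gameplay stutter comes from.

#include <vulkan/vulkan.h>
#include <stdio.h>
#include <stdlib.h>

/* Feed a previously saved cache back to the driver so it can skip
 * recompiling pipelines it has already seen. */
VkPipelineCache load_pipeline_cache(VkDevice device, const char *path)
{
    void  *blob = NULL;
    size_t size = 0;
    FILE  *f = fopen(path, "rb");
    if (f) {
        fseek(f, 0, SEEK_END);
        size = (size_t)ftell(f);
        fseek(f, 0, SEEK_SET);
        blob = malloc(size);
        if (fread(blob, 1, size, f) != size) size = 0;
        fclose(f);
    }
    VkPipelineCacheCreateInfo info = {
        .sType           = VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO,
        .initialDataSize = size,   /* 0 on first run: start with an empty cache */
        .pInitialData    = blob,
    };
    VkPipelineCache cache = VK_NULL_HANDLE;
    vkCreatePipelineCache(device, &info, NULL, &cache);
    free(blob);
    return cache;
}

/* After the up-front compile pass, write the cache out; the next launch
 * on the same GPU + driver skips the expensive work. */
void save_pipeline_cache(VkDevice device, VkPipelineCache cache, const char *path)
{
    size_t size = 0;
    vkGetPipelineCacheData(device, cache, &size, NULL);
    void *blob = malloc(size);
    vkGetPipelineCacheData(device, cache, &size, blob);
    FILE *f = fopen(path, "wb");
    if (f) {
        fwrite(blob, 1, size, f);
        fclose(f);
    }
    free(blob);
}

Every pipeline created against that cache handle during the loading screen lands in the cache instead of in the middle of a firefight.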
 
@W1zzard

Hi W1zzard,

I'm doing my usual thing. :)

"Still, his is a good thing" on the conclusion page. Should be: "Still, this is a good thing".
 
It's a bad release, but I'm convinced it will run better in the somewhat near future.
 
This is unavoidable; the alternative is on-the-fly shader compilation, which means horrid non-stop stutter. Trust me, they made the correct choice.
Have we reached a point where games are getting so sophisticated that shader compilation will not be done "quietly" anymore?
 
That 8GB 4060Ti is looking dead on arrival, no matter the price.

The list of games that hate 8GB is growing fast, and 12GB is fast becoming the minimum for 1440p.

This is unavoidable; the alternative is on-the-fly shader compilation, which means horrid non-stop stutter. Trust me, they made the correct choice.
Unavoidable? It's 12 GB of shaders that will be the same for any given graphics architecture. My Steam Deck downloads pre-compiled shaders for Deck-verified games, hosted by Valve. Why does my GeForce or Radeon need to waste half an hour creating its own, when they could be pre-compiled by the developer and downloaded as additional data during install?
 
I haven't read the word 'crash' this many times in a while, and it ain't Crash Bandicoot.
 
It's 12 GB of shaders that will be the same for any given graphics architecture.
It's not the same, that's the point. Shaders can be precompiled for your Steam Deck because that's a fixed, known architecture; it's the same reason shaders are precompiled for consoles as well.

There are many other architectures besides whatever the Steam Deck is using, and every shader needs to be compiled for each of them; otherwise it's simply incompatible. Even if you got the option to download the precompiled shaders for all existing architectures, every new GPU that comes out would be unable to run those games. The only way this shader compilation issue could ever be solved properly is if all GPU manufacturers settle on a shared instruction set; however, that's probably never going to happen.
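You can actually see this incompatibility in the data format itself. Every Vulkan pipeline cache blob starts with a header naming the exact vendor, device, and a driver-supplied UUID. Here's a sketch (mine, using the header struct from recent Vulkan headers; illustration only) of the check a game would have to do before trusting a downloaded cache:

#include <vulkan/vulkan.h>
#include <stdbool.h>
#include <string.h>

/* A downloaded cache blob is only usable if its header matches the
 * local GPU exactly; otherwise the driver will reject or ignore it. */
bool cache_blob_matches_gpu(const void *blob, size_t size, VkPhysicalDevice gpu)
{
    VkPipelineCacheHeaderVersionOne header;
    if (size < sizeof(header))
        return false;
    memcpy(&header, blob, sizeof(header));

    VkPhysicalDeviceProperties props;
    vkGetPhysicalDeviceProperties(gpu, &props);

    /* pipelineCacheUUID changes between driver builds, so even the
     * exact same GPU model invalidates old caches after an update. */
    return header.vendorID == props.vendorID &&
           header.deviceID == props.deviceID &&
           memcmp(header.pipelineCacheUUID, props.pipelineCacheUUID,
                  VK_UUID_SIZE) == 0;
}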
 
The RTX 3070 8 GB gets overtaken by the RTX 3060 12 GB; that shows that VRAM is really important in this game.
The RX 5700 XT 8 GB vs. RX 6600 XT 8 GB comparison could show that the PCI Express connection matters when VRAM isn't enough, but the 4K results make me unsure about that.
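If anyone wants to watch that VRAM pressure on their own card, here's a minimal sketch (my illustration; assumes the VK_EXT_memory_budget device extension is enabled on a valid VkPhysicalDevice) that prints how much of each VRAM heap the driver says you're using versus what you're allowed before spillover to system RAM starts:

#include <vulkan/vulkan.h>
#include <stdio.h>

/* Query per-heap usage and budget; once usage approaches budget, the
 * driver starts evicting resources to system RAM over PCIe, which is
 * where performance falls off a cliff. */
void print_vram_budget(VkPhysicalDevice gpu)
{
    VkPhysicalDeviceMemoryBudgetPropertiesEXT budget = {
        .sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_MEMORY_BUDGET_PROPERTIES_EXT,
    };
    VkPhysicalDeviceMemoryProperties2 props = {
        .sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_MEMORY_PROPERTIES_2,
        .pNext = &budget,
    };
    vkGetPhysicalDeviceMemoryProperties2(gpu, &props);

    for (uint32_t i = 0; i < props.memoryProperties.memoryHeapCount; i++) {
        if (props.memoryProperties.memoryHeaps[i].flags &
            VK_MEMORY_HEAP_DEVICE_LOCAL_BIT) {  /* VRAM-backed heaps only */
            printf("heap %u: %.2f GB used / %.2f GB budget\n", i,
                   budget.heapUsage[i]  / 1073741824.0,
                   budget.heapBudget[i] / 1073741824.0);
        }
    }
}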
 
The 3070 is getting beaten by a 3060 here. Not much more needs to be said; most people knew from the start that 8 GB was not enough for the card.
 
The RTX 4090 and 7900 XTX got trashed at 4K, lmao. Guess I'll spend my 55 bucks on the RE4 remake instead.
 
It's not the same, that's the point. Shaders can be precompiled for your Steam Deck because that's a fixed, known architecture; it's the same reason shaders are precompiled for consoles as well.

There are many other architectures besides whatever the Steam Deck is using, and every shader needs to be compiled for each of them; otherwise it's simply incompatible. Even if you got the option to download the precompiled shaders for all existing architectures, every new GPU that comes out would be unable to run those games. The only way this shader compilation issue could ever be solved properly is if all GPU manufacturers settle on a shared instruction set; however, that's probably never going to happen.
There aren't that many architectures that meet the minimum specs: RDNA1, RDNA2, Turing, Ampere, Arc. That's it; just precompile those and host them on the digital download service.

If you want to game on a GPU that isn't covered, then sure, compiling shaders yourself is the fallback, but the game was originally designed for RDNA2, and according to the Steam hardware survey over 50% of the market is running Turing or Ampere.
 
There aren't that many architectures that meet the minimum specs: RDNA1, RDNA2, Turing, Ampere, Arc. That's it; just precompile those and host them on the digital download service.

If you want to game on a GPU that isn't covered, then sure, compiling shaders yourself is the fallback, but the game was originally designed for RDNA2, and according to the Steam hardware survey over 50% of the market is running Turing or Ampere.
Valve already does this with the Steam Deck... and every other hardware/software combo in existence that has the pre-caching feature activated on Linux. If you have it enabled, you process the shaders if they aren't in the database, or you download them if they already exist. I wonder about the size of that database; it must be massive.

You also gotta keep in mind that the compiled shaders have to match the version of the drivers. So it's not only by architecture, but also by driver version and possibly OS (kernel).
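Right, and that's roughly what a lookup key into such a database has to capture. A sketch of one possible layout (hypothetical, not Valve's actual schema; the struct and field names are mine):

#include <vulkan/vulkan.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical key into a distributed shader-cache database. Every
 * field has to match for a downloaded cache to be usable, which is
 * why the server-side database balloons in size. */
typedef struct {
    uint32_t vendor_id;                /* 0x10DE NVIDIA, 0x1002 AMD, ... */
    uint32_t device_id;                /* specific GPU model */
    uint32_t driver_version;           /* caches break across driver updates */
    uint8_t  cache_uuid[VK_UUID_SIZE]; /* driver's own compatibility stamp */
    char     game_build[32];           /* shaders change with game patches too */
} ShaderCacheKey;

void fill_cache_key(ShaderCacheKey *key, VkPhysicalDevice gpu,
                    const char *game_build)
{
    VkPhysicalDeviceProperties props;
    vkGetPhysicalDeviceProperties(gpu, &props);
    key->vendor_id      = props.vendorID;
    key->device_id      = props.deviceID;
    key->driver_version = props.driverVersion;
    memcpy(key->cache_uuid, props.pipelineCacheUUID, VK_UUID_SIZE);
    snprintf(key->game_build, sizeof(key->game_build), "%s", game_build);
}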
 
Poor 3080. Why did they even make such a card?

Just like the RTX 4070 12 GB & 4070 Ti 12 GB become obsolete at 2160p in this game...

The RTX 3090 24 GB & RTX 3090 Ti 24 GB still rule, the way it's meant to be played...
 
What a beautiful game.
Played about an hour with no crashes, smooth @ 1440p.
My biggest problems were shader compilation and area loading.
 
This is quite an appalling thing:

I'm old enough to remember inserting a tape cassette into a 'computer' to load a game, complete with the classic screeching noise. That a game takes longer to load in 2023 than a 128 KB game from the eighties is quite an achievement in backward nostalgia.

I'd like to play the game, but I'll wait for the wrinkles to be ironed out with a planet cracker.

I mean, why is this acceptable with today's gaming tech?
I had a tape player and a 5 1/4" floppy drive. I feel your pain. ^_^
 
...the 2080 Ti is a true legend. I could have stayed with it for some time longer.

A 1080 Ti should be in the charts. It would show the difference between the generations when VRAM is not an issue.
 
Aside from the crashes, I would say this screams for more game optimization. This really, really should not demand that much. Sorry, just no.
 