
Halo Infinite Benchmark Test & Performance

Joined
Nov 11, 2016
Messages
3,068 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11


I think this is to show the great difference in texture compression both camps use.

Nvidia skimps on texture quality. Hence why AMD always looks better at native.

Care to record gameplay with a professional camera to back up your claim? Here's an example of how to do it
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,756 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
It's a typical open world scene, near one of the enemy bases, with a few enemies showing up towards the end of the run.
I'm at least halfway through the game and haven't really encountered "hordes" of enemies, so testing such a scenario would be cherry picking a worst-case.
Your insights as always are much appreciated! Gotta hand it to RDNA2 on this one.
 
Joined
Dec 30, 2010
Messages
2,099 (0.43/day)
Great attempt to pull something out of context. You know that Far Cry and probably a few other engines "allocate" VRAM but don't always use it, right?

The amount allocated really has no meaning beyond some amount being allocated. But given the memory compression Nvidia is using (you can't deny this), the previously provided Halo image is a perfect example of that. Now if you compress textures, you obviously have less to transport over the bus, and with that you get higher overall throughput. Nvidia has been caught cheating with this before. AMD, and formerly ATI, always came out better on image quality when you test these head to head. I'd always favor AMD for just presenting textures as intended, not tweaked.
 
Joined
Nov 11, 2016
It's a typical open world scene, near one of the enemy bases, with a few enemies showing up towards the end of the run.
I'm at least halfway through the game and haven't really encountered "hordes" of enemies, so testing such a scenario would be cherry picking a worst-case.


Indeed, I do like the art style, too, even though technically and objectively it's not as good visually as other 2021 titles.

Halo Infinite is not whitelisted for ReBar in the current 497.09 driver. Can you try forcing ReBar for Halo Infinite using Nvidia Inspector and see whether it improves performance?
 
Joined
Nov 10, 2008
Messages
1,984 (0.35/day)
Processor Intel Core i9 9900k @ 5.1GHZ all core load (8c 16t)
Motherboard MSI MEG Z390 ACE
Cooling Corsair H100i v2 240mm
Memory 32GB Corsair 3200mhz C16 (2x16GB)
Video Card(s) Powercolor RX 6900 XT Red Devil Ultimate (XTXH) @ 2.6ghz core, 2.1ghz mem
Storage 256GB WD Black NVME drive, 4TB across various SSDs/NVMEs, 4TB HDD
Display(s) Asus 32" PG32QUX (4k 144hz mini-LED backlit IPS with freesync & gsync & 1400 nit HDR)
Case Corsair 760T
Power Supply Corsair HX850i
Mouse Logitech G502 Lightspeed on powerplay mousemat
Keyboard Logitech G910
VR HMD Wireless Vive Pro & Valve knuckles
Software Windows 10 Pro
I really liked the game - it looked pretty good throughout the whole campaign, which felt like the right length (freeing all teams, taking every base etc). Strangely, on my system I averaged over 100fps at 4K (according to my Radeon dashboard), however that will include indoor scenes where things were faster. Still, I never noticed my frames drop as low as around 60, even in the open world, although I was more focused on pew pew than framerate watching.

Edit:

I will say the ending, and the lack of info on timing for the next campaign, isn't great - it leaves you wanting more, but we have no idea when it may come...
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,053 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I averaged over 100fps at 4K (according to my Radeon dashboard), however that will include indoor scenes where things were faster.
Definitely that, plus menus running at 3587459843653 FPS
 
Joined
Sep 20, 2019
Messages
478 (0.28/day)
Processor i9-9900K @ 5.1GHz (H2O Cooled)
Motherboard Gigabyte Z390 Aorus Master
Cooling CPU = EK Velocity / GPU = EK Vector
Memory 32GB - G-Skill Trident Z RGB @ 3200MHz
Video Card(s) AMD RX 6900 XT (H2O Cooled)
Storage Samsung 860 EVO - 970 EVO - 870 QVO
Display(s) Samsung QN90A 50" 4K TV & LG 20" 1600x900
Case Lian Li O11-D
Audio Device(s) Presonus Studio 192
Power Supply Seasonic Prime Ultra Titanium 850W
Mouse Logitech MX Anywhere 2S
Keyboard Matias RGB Backlit Keyboard
Software Windows 10 & macOS (Hackintosh)
In all seriousness... this is the first time I have read a review and noticed that this section shows a difference between BRANDS of GPUs. I do see the Far Cry 6 pic posted too, so I know this isn't a new practice; I guess I just never took notice of it until now.

I certainly understand that different resolutions use different amounts of VRAM, higher res = higher VRAM. I get that...

I don't get why a BRAND (Nvidia vs AMD), at the same resolution setting, would use a different amount of VRAM. I would appreciate it if someone could explain any of this because I am confused :confused: I honestly thought that selecting, for example (using random values), 1920x1080 would use 7GB of VRAM whether an Nvidia GPU or an AMD GPU is installed. But that belief looks wrong to me now!
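For the resolution side of it, here's a quick back-of-envelope sketch in Python. The numbers are illustrative only - real VRAM use is dominated by textures, geometry and driver allocations, not just the framebuffer:

```python
# Rough framebuffer memory per resolution (color buffers only, e.g. a
# triple-buffered RGBA8 swap chain). Illustrative, not real-world totals.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / 1024**2

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: ~{framebuffer_mb(w, h):.0f} MB")
```

Note that 4K is exactly four times the pixels of 1080p, so these buffers scale 4x, which is why higher resolutions eat more VRAM regardless of brand.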

 
Joined
Mar 18, 2015
Messages
178 (0.05/day)
In all seriousness... this is the first time I have read a review and noticed that this section shows a difference between BRANDS of GPUs. I do see the Far Cry 6 pic posted too, so I know this isn't a new practice; I guess I just never took notice of it until now.

I certainly understand that different resolutions use different amounts of VRAM, higher res = higher VRAM. I get that...

I don't get why a BRAND (Nvidia vs AMD), at the same resolution setting, would use a different amount of VRAM. I would appreciate it if someone could explain any of this because I am confused :confused: I honestly thought that selecting, for example (using random values), 1920x1080 would use 7GB of VRAM whether an Nvidia GPU or an AMD GPU is installed. But that belief looks wrong to me now!
Compression. This is a pretty in-depth piece on it: https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/8

A few generations old, but it's still a thing today. Nvidia generally does better in VRAM-starved situations, which is either down to their compression being better or simply more aggressive. Certainly there are cases where AMD cards appear to offer superior image quality.
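To get an intuition for why compression helps, here's a toy delta-encoding sketch in Python. Real GPU delta color compression (DCC) is lossless, done in hardware, and transparent to games; this only illustrates the general idea that neighbouring pixels tend to differ by small values, which pack into fewer bits:

```python
# Toy delta encoding of one scanline of 8-bit pixel values: store the first
# value, then only the differences between neighbours. Smooth regions
# produce tiny deltas, which is why they compress well.
def delta_encode(pixels):
    return [pixels[0]] + [b - a for a, b in zip(pixels, pixels[1:])]

def delta_decode(deltas):
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

scanline = [100, 101, 101, 102, 104, 104, 103]
encoded = delta_encode(scanline)
assert delta_decode(encoded) == scanline  # lossless round-trip
print(encoded)  # [100, 1, 0, 1, 2, 0, -1]
```

The key point: a lossless scheme like this changes bandwidth and footprint, not what ends up on screen.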
 
Joined
Sep 20, 2019
Compression. This is a pretty in-depth piece on it: https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/8

A few generations old, but it's still a thing today. Nvidia generally does better in VRAM-starved situations, which is either down to their compression being better or simply more aggressive. Certainly there are cases where AMD cards appear to offer superior image quality.

damn, a 2016 article - no kidding a couple gens old! lol

thanks, I will give this a read. I've never heard of this before. I do understand compression, but didn't know its capabilities/performance vary depending on which brand of GPU is installed. Learn something new every day!
 
Joined
Jan 27, 2015
Messages
1,649 (0.49/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
In all seriousness... this is the first time I have read a review and noticed that this section shows a difference between BRANDS of GPUs. I do see the Far Cry 6 pic posted too, so I know this isn't a new practice; I guess I just never took notice of it until now.

I certainly understand that different resolutions use different amounts of VRAM, higher res = higher VRAM. I get that...

I don't get why a BRAND (Nvidia vs AMD), at the same resolution setting, would use a different amount of VRAM. I would appreciate it if someone could explain any of this because I am confused :confused: I honestly thought that selecting, for example (using random values), 1920x1080 would use 7GB of VRAM whether an Nvidia GPU or an AMD GPU is installed. But that belief looks wrong to me now!

View attachment 229294

None of the cards are loading all textures at once; they try to be smart about which textures to load based on what is likely to be needed. There are also compression method differences like @Aretak mentioned.

To give an analogy: database servers. A database server will try to keep in memory any tables that are heavily used. Moreover, after a while a DB server will tend to fill up all its available memory as it 'learns' more about what is being asked for, keeping more and more of the less frequently used tables in memory too. I see this regularly where a DB server is restarted with a max in-memory size of, say, 24GB and starts out at 8GB. Over a few days of use this will rapidly go up to 16GB, then over a couple of weeks eventually hit 22GB or so. The pattern will vary by use case but the concept is the same.

I believe GPUs work the same way. They will cache what is used most frequently right away, and less-used textures aren't cached. However, the driver will keep trying to use all available VRAM until it is near exhaustion, leaving just a bit free for anything that has to be swapped in. If the cache misses and it has to go to disk, especially for multiple textures at once, that's when you may see something odd like an object without a texture on it.
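The caching behaviour described above can be sketched as a tiny LRU cache in Python. This is a loose analogy, not how any actual driver is written:

```python
from collections import OrderedDict

# Keep filling the VRAM budget with textures, and only evict the
# least-recently-used ones once we'd go over budget.
class TextureCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.cache = OrderedDict()  # name -> size in MB, LRU first

    def request(self, name, size_mb):
        if name in self.cache:                # hit: mark as recently used
            self.cache.move_to_end(name)
            return "hit"
        while self.used + size_mb > self.budget and self.cache:
            _, evicted_size = self.cache.popitem(last=False)  # evict LRU
            self.used -= evicted_size
        self.cache[name] = size_mb            # miss: "stream in from disk"
        self.used += size_mb
        return "miss"

vram = TextureCache(budget_mb=100)
vram.request("rock", 40)
vram.request("grass", 40)
vram.request("rock", 40)       # hit: "rock" becomes most recently used
vram.request("enemy", 40)      # over budget: "grass" (LRU) gets evicted
print(sorted(vram.cache))      # ['enemy', 'rock']
```

Like the DB server, the cache only grows until it nears the budget, which is why "VRAM used" readings creep up over a session without meaning the game needs that much.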

All that said, most of these use cases are bunk.

Using 'ultra' textures usually means you are using textures meant for 4K monitors. There are plenty of examples/comparisons you can find for individual games, but unless you actually have a 4K monitor, that's usually a waste. You can set ultra for everything else and lower textures to 'very high' (usually 1440p) or 'high' (usually 1080p) to match your monitor. It's below 'high' that you may start to see real degradation in image quality on a 1080p monitor, as lower settings usually target 720p and so on. Again, what these actually mean varies by game, but I think that holds for 80% or more of games.
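As a rough illustration of why the texture setting dominates VRAM, here's the size of a single uncompressed RGBA8 texture with a full mip chain at those base resolutions. The ultra/very high/high mapping is the assumption from the paragraph above, and real games use block compression like BC7, which cuts these numbers by 4x or more:

```python
# Bytes for a square RGBA8 texture plus its full mip chain (each mip level
# is half the width/height of the previous, down to 1x1).
def texture_mb(size, bytes_per_pixel=4):
    total = 0
    while size >= 1:
        total += size * size * bytes_per_pixel
        size //= 2
    return total / 1024**2

for label, base in [("ultra", 4096), ("very high", 2048), ("high", 1024)]:
    print(f"{label:9s} ({base}x{base}): ~{texture_mb(base):.1f} MB")
```

Each step down in base resolution cuts per-texture memory by roughly 4x, which is why dropping one texture tier frees so much VRAM.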
 
Joined
Apr 17, 2021
Messages
524 (0.47/day)
System Name Jedi Survivor Gaming PC
Processor AMD Ryzen 7800X3D
Motherboard Asus TUF B650M Plus Wifi
Cooling ThermalRight CPU Cooler
Memory G.Skill 32GB DDR5-5600 CL28
Video Card(s) MSI RTX 3080 10GB
Storage 2TB Samsung 990 Pro SSD
Display(s) MSI 32" 4K OLED 240hz Monitor
Case Asus Prime AP201
Power Supply FSP 1000W Platinum PSU
Mouse Logitech G403
Keyboard Asus Mechanical Keyboard
I use Radeon Chill to cap at 144fps in all games; that prevents that :)
I tried the Nvidia FPS cap at 300FPS, for example, and it didn't seem to work...
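For anyone curious what an FPS cap does under the hood, here's a minimal frame-limiter sketch in Python: sleep out the rest of each frame's time budget. Driver-level caps like Radeon Chill or Nvidia's Max Frame Rate do this more cleverly and with better input latency, but the core idea is the same:

```python
import time

# Cap a render loop at fps_cap by sleeping out each frame's leftover budget.
def run_capped(render_frame, fps_cap, frames):
    budget = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                        # do the actual frame's work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)      # wait out the remainder

t0 = time.perf_counter()
run_capped(lambda: None, fps_cap=144, frames=10)
print(f"10 frames took {time.perf_counter() - t0:.3f}s")  # ~10/144 s minimum
```

This is also why capped menus don't run at thousands of FPS: the loop spends the spare time sleeping instead of rendering.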
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.18/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
In all seriousness... this is the first time I have read a review and noticed that this section shows a difference between BRANDS of GPUs. I do see the Far Cry 6 pic posted too, so I know this isn't a new practice; I guess I just never took notice of it until now.

I certainly understand that different resolutions use different amounts of VRAM, higher res = higher VRAM. I get that...

I don't get why a BRAND (Nvidia vs AMD), at the same resolution setting, would use a different amount of VRAM. I would appreciate it if someone could explain any of this because I am confused :confused: I honestly thought that selecting, for example (using random values), 1920x1080 would use 7GB of VRAM whether an Nvidia GPU or an AMD GPU is installed. But that belief looks wrong to me now!

View attachment 229294
1. Allocation is done by the drivers. If Nvidia cards have less VRAM on average, they're going to optimise the drivers to work within those limits.
2. Texture compression (Nvidia has always been slightly ahead on this side of things; I recall it being a marketed feature back in the Riva/TNT days).
 
Joined
Oct 15, 2011
Messages
1,985 (0.43/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sapphire Nitro+ Radeon RX 6750 XT
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
Yeah, this game doesn't look good enough to have such low framerates.
More CP2077-like than Halo-like! Even PC MCC looks much better! With The Master Chief Collection, I can hit 60 FPS at UHD on a stock RX 5600 XT, which makes Infinite seem CP2077-like! If not worse! The only possibly-worse thing I can think of is Superposition 1080p Extreme. Honestly, it looks more like a video card stability test!

It is clear devs are giving the middle finger to people with low-end hardware.
Can someone explain why the difference between Full HD texture size and 4K is so small?
Yep, I wish the FPS were like Valorant's (which is like the Halos before this!). Note that I never ran CP2077; I'm going by test videos. Same with Valorant.

Reminds me that Take-Two did a great job with GTA V! They rock!
 
Last edited:
Joined
Dec 30, 2010
damn, a 2016 article - no kidding a couple gens old! lol

thanks, I will give this a read. I've never heard of this before. I do understand compression, but didn't know its capabilities/performance vary depending on which brand of GPU is installed. Learn something new every day!

The whole idea of large VRAM is to store textures. GPUs don't always use what they allocate; they simply store things for when they're needed. If no allocation were done, everything would have to be pushed from disk or system memory to the GPU (like in the old days), which was quite slow. Nvidia obviously uses compression here to "save" bandwidth, and the more bandwidth you have, the faster the GPU can process. While on raw FPS numbers it might be faster (or not), the trade-off is lower quality textures. I'm not saying the image looks way worse compared to AMD, but if you start pixel-peeping in HDR you can definitely see the impact of the compression Nvidia is applying.

With that in mind, I have always believed texture quality on AMD has been better. The trade-off for AMD is that their hardware on some to many occasions seems slower, while on paper it is even faster. Now, I could push a driver that strips out textures and gives you a guaranteed 2000 FPS in Halo, but that would completely nuke the image you're seeing. Nvidia has been caught cheating with drivers before, to mimic a higher score than it would actually achieve. Cheating in benchmarks happens. There are not a lot of tests that purely stress the GPU, on which you could cheat using drivers, but there is one: 3DMark Nature. That particular test doesn't count CPU speed, cache speed, or memory speed, just purely the GPU, and spits out a number at the end based purely on GPU processing power.

I've done some OC contests in the past; drivers were a serious thing. The more stuff you could cut from the actual rendering, the higher the FPS, but at the cost of visuals, obviously. At the end of the day, and especially if you use HDR, an AMD card will definitely serve you better. If you don't care about quality, Nvidia is your camp. I'd choose AMD, purely for never having applied too aggressive a compression over the decades.
 
Joined
Nov 11, 2016
I guess W1zzard should include some screenshots from both camps from now on so people can compare the IQ
 
Joined
Apr 18, 2013
Messages
1,260 (0.31/day)
Location
Artem S. Tashkinov
I think this is to show the great difference in texture compression both camps use.

Nvidia skimps on texture quality. Hence why AMD always looks better at native.

This is a blatantly false statement which has been debunked. Under certain situations NVIDIA enabled a limited color-space by default but that took literally a few mouse clicks to fix - but that resulted in a washed out image, it did not affect texture quality.

Another issue was a very short period of time when NVIDIA sort-of cheated with anisotropic filtering but that was almost two decades ago.
 
Last edited:

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
This is a blatantly false statement which has been debunked. Under certain situations NVIDIA enabled a limited color-space by default but that took literally a few mouse clicks to fix - but that resulted in a washed out image, it did not affect texture quality.
^ This.

The texture scandal was... GeForce FX series?

The HDMI output defaulting to limited colour still happens to this day, and it's only an issue with certain HDMI devices and cables where it can't detect capabilities, so it falls back to a 'safe' mode.
AMD and Intel also had issues in that era, but theirs were mostly incorrect overscan defaults.
 
Joined
Dec 30, 2010
This is a blatantly false statement which has been debunked. Under certain situations NVIDIA enabled a limited color-space by default but that took literally a few mouse clicks to fix - but that resulted in a washed out image, it did not affect texture quality.

Another issue was a very short period of time when NVIDIA sort-of cheated with anisotropic filtering but that was over a decade ago.

By Nvidia's defaults, the "high quality" preset for textures and all that isn't turned on, while with AMD/ATI it actually is. With a default driver installation it would look like Nvidia is faster, while ATI on the other hand produces consistently better quality.

Well, I'll just leave it at that.
 
Joined
Apr 18, 2013
By Nvidia's defaults, the "high quality" preset for textures and all that isn't turned on, while with AMD/ATI it actually is. With a default driver installation it would look like Nvidia is faster, while ATI on the other hand produces consistently better quality.

Well, I'll just leave it at that.

It would be great if you learned a tiny bit about video compression, and then about YouTube's video compression specifically, so as not to embarrass yourself.

Hint: either you install and run these games using identical settings in reproducible situations (not dynamic games with a ton of randomness) and then take screenshots in PNG format, or bust.

Taking screenshots of highly compressed 1080p streams from YouTube? Oh, God.
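For a fair comparison on lossless captures, even something as simple as a mean absolute per-pixel difference beats eyeballing YouTube frames. A minimal Python sketch, using plain nested lists standing in for decoded image data (a real workflow would load the PNGs with something like Pillow):

```python
# Mean absolute per-channel difference between two same-sized RGB frames,
# represented as rows of (r, g, b) tuples. 0.0 means identical images.
def mean_abs_diff(img_a, img_b):
    total = count = 0
    for row_a, row_b in zip(img_a, img_b):
        for (r1, g1, b1), (r2, g2, b2) in zip(row_a, row_b):
            total += abs(r1 - r2) + abs(g1 - g2) + abs(b1 - b2)
            count += 3
    return total / count

a = [[(100, 100, 100), (200, 200, 200)]]
b = [[(101, 100, 100), (200, 199, 200)]]
print(mean_abs_diff(a, b))  # 2/6 = 0.333...
```

Even this crude metric requires the two captures to be of the exact same frame with identical settings; otherwise the number measures the scene difference, not the GPU.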
 
Last edited:
Joined
Dec 30, 2010
It is PNG.

And even if there was compression from YouTube, the visual difference is still there.
 
Joined
Sep 8, 2020
Messages
204 (0.15/day)
System Name Home
Processor 5950x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 6700 10gb
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply seasonic prime 1000W
Mouse Razer Viper
Keyboard Logitech
Software Windows 10
Quality of textures is the very last problem for PC gamers; price is the main problem.
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Benchmark Scores Nyooom.
It is PNG.

And even if there was compression from YouTube, the visual difference is still there.
Yeah, because you can take screenshots two frames apart and have different things on screen.

You can tell super, super easily that those screenshots are not identical; the HUD elements don't align the same with on-screen objects like the cars, so they're useless to compare.
 