
Sapphire Radeon RX 7900 GRE Pulse

Joined
Jun 21, 2013
Messages
552 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
I'd put DLSS support in the same boat as CUDA, since it's vendor-locked. It would make more sense to list it as a positive when a card has it, rather than a negative when it doesn't. It's not like, say, mesh shaders, which are a hardware-agnostic feature you'd probably want your GPU to support.
But that's just my way of thinking; I'm one of the few who doesn't see DLSS as the be-all and end-all.
Yeah, might as well put "It's not nvidia" in the negatives.
 
Joined
Nov 27, 2023
Messages
1,400 (6.90/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original)
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
Good point about Reflex. Is it more important for you personally? Any thoughts on whether Radeon Anti-Lag and Anti-Lag+ are a decent alternative? If not, then Reflex should be mentioned too, because from what I hear, it definitely helps competitive gamers. Not a pro gamer myself.
Reflex is inherently superior to Anti-Lag and pretty much always will be, as long as it's an in-engine solution that developers implement with NVIDIA's tools, as opposed to a blanket in-driver solution, which is what AL is. There was a good video by GN of their talk with one of the Reflex engineers, which lays out pretty well that it's doing quite a bit more than just modifying the render queue like NULL and AL do. This is useful not only to competitive players, but for achieving the best smoothness-to-latency ratio in any game overall, especially considering how it works with VRR (G-Sync in NVIDIA's case) to improve frame delivery.
So yes, in short, I would say Reflex is absolutely a superior feature to AL. We may not like it, but just like with DLSS, NVIDIA's interconnected ecosystem, where features hook into each other and are implemented in-engine, does have its benefits compared to open standards that use "one size fits all" solutions.

Edit: This isn't even to mention that the AL+ approach of injecting DLLs into a game, instead of offering an SDK for developers like Reflex does, already caused issues (the CS2 anti-cheat incident) and is inherently a silly way to achieve what AMD wants. I have no idea why they haven't just gone with an SDK. The fact that it REQUIRES RDNA 3, while Reflex works on pretty much any reasonably modern NVIDIA card, is also a baffling decision.
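To make the render-queue point concrete, here's a toy Python model (my own simplified numbers, nothing from NVIDIA's actual SDK) of why capping how far the CPU runs ahead of the GPU reduces input-to-display latency:

```python
def input_latency_ms(frame_time_ms: float, queued_frames: int) -> float:
    """Rough input-to-photon latency when the CPU is allowed to run
    `queued_frames` frames ahead of the GPU: input sampled at the start
    of a frame sits behind every queued frame plus its own frame."""
    return (queued_frames + 1) * frame_time_ms

# Driver-level limiters (NULL, Anti-Lag) work by shrinking the queue:
print(input_latency_ms(16.7, 3))  # deep queue when GPU-bound: ~66.8 ms
print(input_latency_ms(16.7, 1))  # queue capped to one frame: ~33.4 ms
# An in-engine marker system (Reflex-style) can additionally delay input
# sampling until just before simulation/render, approaching the floor:
print(input_latency_ms(16.7, 0))  # ~16.7 ms in this toy model
```

The real pipeline has more stages (simulation, present, scanout), but the shape of the argument is the same: an in-driver limiter can only trim the queue, while in-engine markers can also move input sampling later, which is the extra work Reflex does.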
 
Last edited:
Joined
Jan 27, 2024
Messages
138 (0.97/day)
Location
TPU censorship
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
Having DLSS support will matter more for longevity than VRAM anyway.

More VRAM is always better. AMD has FSR, and there's always the in-game option to simply lower settings to get higher FPS.

I forgot one really strong advantage:
The card has the rigid, strong and proven gold PCIe power connectors; Nvidia no longer uses these :laugh: :slap:



 
Joined
Jan 18, 2020
Messages
708 (0.44/day)
Will be a good option once an unlocked bios leaks...

Lots of headroom, I think?
 
Joined
Jan 14, 2019
Messages
10,316 (5.21/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
You are absolutely right that they can't do anything about it, which is exactly why NVIDIA made it that way, and people are buying NV cards because they want this specific feature. Just like people are buying AMD cards because they want the Radeon Settings UI
The problem is, with that logic, AMD and Intel cards will always be inferior, as the lack of DLSS will always be listed as a negative in every single review for all eternity. Most games support both FSR and DLSS these days, so I see it as a non-issue.

As for Anti-Lag, as a non-pro gamer, I think it compares pretty well with Reflex. :)
 
Joined
Jan 27, 2024
Messages
138 (0.97/day)
Location
TPU censorship
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
I wonder what causes the extremely low performance in Counter-Strike 2?

 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,168 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Joined
Feb 27, 2024
Messages
35 (0.32/day)
Processor Ryzen 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Peerless Assassin 120
Memory 32GB (6000/30)
Video Card(s) 4070 Ti @ 3+ GHz
Storage Samsung 990 Pro 4TB
Display(s) Dell 1440p 360 Hz QD-OLED
The problem is, with that logic, AMD and Intel cards will always be inferior, as the lack of DLSS will always be listed as a negative in every single review for all eternity. Most games support both FSR and DLSS these days, so I see it as a non-issue.

As for Anti-Lag, as a non-pro gamer, I think it compares pretty well with Reflex. :)

Yeah, they will, but AMD and Intel are free to catch up and make it even ground. At this point, though, even XeSS is better than FSR, which is kind of sad to see.

DLSS is better and has a far higher adoption rate; 500+ games have RTX features now.

AMD users will be forced to use the inferior option, and DLSS/FSR is coming to pretty much all new games, even replacing other AA methods.

This was true even in the AMD-sponsored Starfield, where FSR2 was enabled as the default AA solution. The DLSS/DLAA mod beat FSR2 on day one in terms of visuals. TechPowerUp has a comparison.

DLSS and FSR might look close in some still photos; in motion, though, is where the true difference lies. DLSS has far fewer artifacts, less jitter and smearing, which TPU concludes in all their DLSS vs FSR testing.

Upscaling is here to stay. Consoles rely on it as well, and Sony is working hard to deliver DLSS-like upscaling for the PS5 Pro.

DLAA beats native image quality every single time, and even DLSS Quality looks better than native in most games while boosting performance big time. Built-in AA + sharpening is the reason.
 
Last edited:
Joined
Jan 14, 2019
Messages
10,316 (5.21/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Yeah, they will, but AMD and Intel are free to catch up and make it even ground. At this point, though, even XeSS is better than FSR, which is kind of sad to see.

DLSS is better and has a far higher adoption rate; 500+ games have RTX features now.

AMD users will be forced to use the inferior option, and DLSS/FSR is coming to pretty much all new games, even replacing other AA methods.

This was true even in the AMD-sponsored Starfield, where FSR2 was enabled as the default AA solution. The DLSS/DLAA mod beat FSR2 on day one in terms of visuals. TechPowerUp has a comparison.

DLSS and FSR might look close in some still photos; in motion, though, is where the true difference lies. DLSS has far fewer artifacts, less jitter and smearing, which TPU concludes in all their DLSS vs FSR testing.

Upscaling is here to stay. Consoles rely on it as well, and Sony is working hard to deliver DLSS-like upscaling for the PS5 Pro.

DLAA beats native image quality every single time, and even DLSS Quality looks better than native in most games while boosting performance big time. Built-in AA + sharpening is the reason.
I still fail to see how upscaling is more than just a crutch when performance is lacking for the desired resolution and graphics quality. It's great for low to mid range gaming, but I don't want it with a high-end GPU.

DLAA is a different matter, though. It's more of a downscaler than an upscaler, as far as I understand.
 
Joined
Jul 10, 2015
Messages
750 (0.23/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,168 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
A "bug", sure.
Considering how long ago the card launched in China, I'm thinking it's less a bug and more an artificial limitation they regret after seeing international reactions
Exactly my thoughts. Someone in some meeting must have said "we put the limiter at x", otherwise the limiter wouldn't be set in the VBIOS the way it is
 
Joined
Feb 27, 2024
Messages
35 (0.32/day)
Processor Ryzen 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Peerless Assassin 120
Memory 32GB (6000/30)
Video Card(s) 4070 Ti @ 3+ GHz
Storage Samsung 990 Pro 4TB
Display(s) Dell 1440p 360 Hz QD-OLED
I still fail to see how upscaling is more than just a crutch when performance is lacking for the desired resolution and graphics quality. It's great for low to mid range gaming, but I don't want it with a high-end GPU.

DLAA is a different matter, though. It's more of a downscaler than an upscaler, as far as I understand.
Then you should probably read up on DLSS, because it has built-in anti-aliasing and sharpening, which is why it replaces other AA methods when enabled.

Meaning, DLSS Quality at 1440p like I use will look better than 1440p native, while boosting performance by 50-75% on top. Win-win for most gamers.

DLAA is everything DLSS is, just without the upscaling: it uses native res and just improves image quality. And this is why native-res gaming doesn't matter anymore, because native res with third-party AA on top looks worse. TAA is terrible, to name one inferior solution.

DLAA is the best anti-aliasing method today and will work in all DLSS games; DLAA is part of the DLSS presets now.

It still boggles my mind that some people think DLSS is only for improving performance while sacrificing visuals; this is not true at all. Mostly it's AMD users who think this, probably because FSR mostly does exactly that and looks mediocre in comparison, which is probably why they hate DLSS and upscalers in general, even though DLSS works far better than FSR, without the massive shimmering and artifacts FSR produces, especially when you actually move instead of looking at still photos.

I hated FSR on my Radeon 6800. I love DLSS/DLAA on my 4070 Ti. AMD is years behind on upscaling: simply too many problems with it (mostly artifacts and shimmering), and it looks like AMD has hit a wall in terms of improving it.

AMD needs to vastly improve FSR going forward; maybe they should go the hardware route like Nvidia. More and more games rely on upscalers for AA now, and the upcoming PS5 Pro and next-gen Xbox will also rely on them. Upscaling is here to stay: game devs have embraced it, and every new AAA game has it built in, often enabled by default, including in the AMD-sponsored Starfield. Sadly it looked crazy bad there, and RTX users installed the DLSS/DLAA mod on day one, which beat FSR2 with ease o_O AMD users were better off enabling CAS sharpening in the game, though performance takes a hit.
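For context on where that performance headroom comes from: each DLSS preset renders internally at a fraction of the output resolution before the upscale. A quick sketch using the commonly cited per-axis scale factors (treat the exact numbers as approximate):

```python
# Commonly cited per-axis render-scale factors for the DLSS presets.
PRESETS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before upscaling to out_w x out_h."""
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

for name in PRESETS:
    w, h = internal_resolution(2560, 1440, name)
    print(f"{name}: renders at {w}x{h}")
# Quality at 1440p renders ~1707x960, i.e. about 44% of the output
# pixels, which is where the large fps gain comes from; image quality
# then depends entirely on the reconstruction pass.
```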
 
Last edited:
Joined
Jun 29, 2023
Messages
512 (1.45/day)
System Name Gungnir
Processor Ryzen 5 7600X @1.25v
Motherboard ASUS TUF B650M-PLUS WIFI
Cooling Thermalright Peerless Assasin 120 SE Black
Memory 2x16GB DDR5 CL36 5600MHz
Video Card(s) XFX RX 6800XT Merc 319 @1.1v @2600MHz clock @2140MHz vram freq. (surprisingly stable)
Storage 1TB WD SN770 | 2TB WD Blue SATA III SSD
Display(s) 1440p 165Hz VA
Case Lian Li Lancool 215
Audio Device(s) Beyerdynamic DT 770 PRO 80Ohm
Power Supply EVGA SuperNOVA 750W 80 Plus Gold
Mouse Logitech G Pro Wireless
Keyboard Keychron V6
VR HMD The bane of my existence (Oculus Quest 2)
Exactly my thoughts. Someone in some meeting must have said "we put the limiter at x", otherwise the limiter wouldn't be set in the VBIOS the way it is
Well, once said overclocking limits are removed, will there be a retest of the card's overclocking performance?
 
Joined
Feb 27, 2024
Messages
35 (0.32/day)
Processor Ryzen 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Peerless Assassin 120
Memory 32GB (6000/30)
Video Card(s) 4070 Ti @ 3+ GHz
Storage Samsung 990 Pro 4TB
Display(s) Dell 1440p 360 Hz QD-OLED
Well, once said overclocking limits are removed, will there be a retest of the card's overclocking performance?
The card is memory starved; AMD used slower VRAM on the 7900 GRE than on the 7800 XT to gimp it on purpose, so it doesn't get close to 7900 XT territory.

The 7900 GRE makes little sense to buy unless its price is closer to the 7800 XT's than the 7900 XT's.
 
Joined
May 31, 2016
Messages
4,351 (1.48/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
More VRAM is always better. AMD has FSR, and there's always the in-game option to simply lower settings to get higher FPS.

I forgot one really strong advantage:
The card has the rigid, strong and proven gold PCIe power connectors; Nvidia no longer uses these :laugh: :slap:

I wonder how a 10GB 3080 would do in these titles. With my 6900 XT I'm not worried about VRAM, but the 10GB 3080? It might get roughed up a bit in some of the most VRAM-demanding titles.
As for the GRE: let's say not bad, but I would not consider it over what I currently have, despite the price.
 
Joined
Feb 27, 2024
Messages
35 (0.32/day)
Processor Ryzen 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Peerless Assassin 120
Memory 32GB (6000/30)
Video Card(s) 4070 Ti @ 3+ GHz
Storage Samsung 990 Pro 4TB
Display(s) Dell 1440p 360 Hz QD-OLED
I wonder how a 10GB 3080 would do in these titles. With my 6900 XT I'm not worried about VRAM, but the 10GB 3080? It might get roughed up a bit in some of the most VRAM-demanding titles.
As for the GRE: let's say not bad, but I would not consider it over what I currently have, despite the price.

Just fine -> https://www.techspot.com/review/2671-geforce-3080-vs-radeon-6800-xt/

Most of the games that had VRAM issues were AMD-sponsored, rushed console ports ... and they were fixed with patches too. Many of them even ran badly on AMD's own cards in the first months. Tons of issues in general.

Let's look at one of the best-looking games right now - https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html

The 3080 beats the 6900 XT in 4K/UHD, minimum fps included.
You will need upscaling to make these last-gen GPUs run great at 4K/UHD though, and DLSS easily beats FSR too.

VRAM doesn't magically make slow GPUs run maxed-out settings in new, demanding games. The GPU itself will buckle, and you will be forced to lower settings anyway. Good upscaling will matter more for longevity than VRAM alone.

For a 4K/UHD gamer who can't afford a 4090 and wants to play all the new AAA games, good upscaling is a requirement, unless you want to play on the lowest preset in the future or live with console-like fps, meaning ~30 avg.

For the other 99% of PC gamers, who use 1440p or less, 16GB is not needed at all. The VRAM requirement is completely overblown by AMD marketing, because they have nothing else to talk about.

The PS5 and XSX have 16GB of shared RAM; most games use 4-6GB of it for graphics.

The upcoming PS5 Pro will get 16GB too: not 16GB of VRAM, but 16GB of RAM total.

VRAM demand for PC games won't rise before 2028+, when the PS6 and next-gen Xbox hit.
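Some back-of-envelope VRAM arithmetic helps frame this (simple uncompressed sizes, ignoring compression and alignment, so rough orders of magnitude only):

```python
def target_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one uncompressed render target in MiB."""
    return width * height * bytes_per_pixel / 2**20

# One 4K RGBA8 render target is only ~32 MiB:
print(f"{target_mib(3840, 2160):.1f} MiB")
# Even a fat G-buffer of half a dozen 4K targets stays under 200 MiB:
print(f"{6 * target_mib(3840, 2160):.0f} MiB")
# So resolution itself is cheap; the bulk of VRAM goes to textures,
# geometry and (with RT enabled) BVH data, which is why per-game
# requirements vary far more than per-resolution ones.
```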
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,168 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Well, once said overclocking limits are removed, will there be a retest of the card's overclocking performance?
Of course; it'll yield 2-3% additional real-life performance in the best case.
 
Joined
May 31, 2016
Messages
4,351 (1.48/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Just fine -> https://www.techspot.com/review/2671-geforce-3080-vs-radeon-6800-xt/

Most of the games that had VRAM issues were AMD-sponsored console ports ... and were fixed with patches too.
It shows OK, but there are already a few titles where minimum FPS drops, probably due to VRAM. I guess it's a matter of time before the 10GB 3080 starts showing VRAM shortages just like the 3070; the card's judgment day is merely postponed.
Considering VRAM needs nowadays, I'm pretty sure it wasn't a marketing scheme; you have very clear evidence of that. Especially if you put these cards into the RT perspective in which they were supposed to shine, and yet they don't if the VRAM capacity isn't there. It kind of makes the good-RT-performance argument for these cards moot, and yet so many people still bring it up. It makes you wonder whether all those 4070s and Tis with 12GB will end up the same way in a few years. Everything below 16GB might have a problem sooner rather than later.
Especially in the laptop department: all NVIDIA mobile GPUs except the 4090 have less than 16GB.
 
Joined
Feb 27, 2024
Messages
35 (0.32/day)
Processor Ryzen 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Peerless Assassin 120
Memory 32GB (6000/30)
Video Card(s) 4070 Ti @ 3+ GHz
Storage Samsung 990 Pro 4TB
Display(s) Dell 1440p 360 Hz QD-OLED
It shows OK, but there are already a few titles where minimum FPS drops, probably due to VRAM. I guess it's a matter of time before the 10GB 3080 starts showing VRAM shortages just like the 3070; the card's judgment day is merely postponed.
Considering VRAM needs nowadays, I'm pretty sure it wasn't a marketing scheme; you have very clear evidence of that. Especially if you put these cards into the RT perspective in which they were supposed to shine, and yet they don't if the VRAM capacity isn't there. It kind of makes the good-RT-performance argument for these cards moot, and yet so many people still bring it up. It makes you wonder whether all those 4070s and Tis with 12GB will end up the same way in a few years. Everything below 16GB might have a problem sooner rather than later.
Especially in the laptop department: all NVIDIA mobile GPUs except the 4090 have less than 16GB.
Yeah, at settings none of these cards will run well anyway, due to the slow GPU.

If you don't care about ray tracing, you need even less VRAM. Or did AMD users suddenly start to care about RT?
 
Joined
May 31, 2016
Messages
4,351 (1.48/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Yeah, at settings none of these cards will run well anyway, due to the slow GPU.

If you don't care about ray tracing, you need even less VRAM. Or did AMD users suddenly start to care about RT?
You have seen in the video that the 6800 makes the gameplay enjoyable while the 3070 doesn't. I don't care about RT, hence my graphics choice, but I remember what these 3000-series cards were advertised as, and now they can't even pull it off with RT off. AMD's 6000 series, like the RX 6800, was bashed for slow RT performance by those to whom RT mattered so much and who claimed it's the future; look at it now. The 3070, the great RT graphics choice, craps out in games with RT, losing to the RX 6800. Ironic, isn't it? Nowadays these cards are barely ready even for RT off in several games.
It shows one thing: choice matters, and for some people, that might be hard to swallow.
 
Last edited:
Joined
Jan 14, 2019
Messages
10,316 (5.21/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Then you should probably read up on DLSS, because it has built-in anti-aliasing and sharpening, which is why it replaces other AA methods when enabled.
I know, but I don't care. I've watched comparisons between Native TAA vs DLSS Quality, and I can't say that DLSS was a definite winner in any game. It's only in one game with an absolutely rubbish TAA implementation (I think it was SW: Jedi Survivor) where I could say that they're close to being equal. Everywhere else, DLSS is slightly worse.

Meaning, DLSS Quality at 1440p like I use will look better than 1440p native, while boosting performance by 50-75% on top. Win-win for most gamers.
No. Just... no. Because DLAA is not DLSS.

It still boggles my mind that some people think DLSS is only for improving performance while sacrificing visuals; this is not true at all.
And it boggles my mind that some people confuse DLAA with DLSS. DLAA improves visuals. DLSS does not. The end.
 
Joined
Jul 10, 2015
Messages
750 (0.23/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
  • Erratic fan speed behavior
OMG, AMD, get your shit together.
W1zz, what's the difference between "erratic fan speed behavior" and the "fan overshoot" you were writing about in many Radeon reviews? It's AMD's fault, not the AIBs', right?
 
Joined
Jun 29, 2023
Messages
512 (1.45/day)
System Name Gungnir
Processor Ryzen 5 7600X @1.25v
Motherboard ASUS TUF B650M-PLUS WIFI
Cooling Thermalright Peerless Assasin 120 SE Black
Memory 2x16GB DDR5 CL36 5600MHz
Video Card(s) XFX RX 6800XT Merc 319 @1.1v @2600MHz clock @2140MHz vram freq. (surprisingly stable)
Storage 1TB WD SN770 | 2TB WD Blue SATA III SSD
Display(s) 1440p 165Hz VA
Case Lian Li Lancool 215
Audio Device(s) Beyerdynamic DT 770 PRO 80Ohm
Power Supply EVGA SuperNOVA 750W 80 Plus Gold
Mouse Logitech G Pro Wireless
Keyboard Keychron V6
VR HMD The bane of my existence (Oculus Quest 2)
And it boggles my mind that some people confuse DLAA with DLSS. DLAA improves visuals. DLSS does not. The end.
And even DLAA isn't always perfect either, since it can still present ghosting and some temporal instability, but then that goes towards the topic of TAA and that's another (and irrelevant) can of worms.
 
Joined
May 11, 2020
Messages
192 (0.13/day)
Why does that matter? 12GB is enough even in those edge cases where 8GB isn't.
Because game devs can't do much about optimization. Because I don't want to spend $500+ on a 12GB GPU. Because we live in 2024. Because I don't change my GPU every year. 12GB/192-bit on the 4070 is very disrespectful behavior towards gaming consumers, while Nvidia pursues satisfying its AI customers.
 