
AMD Announced as Starfield's Exclusive Partner on PC

Joined
Sep 6, 2013
Messages
3,050 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
It only means RTX sadly, not even GTX cards from 4-5 years back. Talk about stiffing your own customers :shadedshu:
:confused::confused::confused:
Yeah, thank you. That's what I wrote. :toast:

My God, why did you cut the post there? Why did you leave the rest out? Maybe I should have put a comma after "only"? Syntax error on my part?
DLSS only means GTX, Radeon, and console owners don't have an upscaling tech AT ALL.
 
Joined
Sep 17, 2014
Messages
21,210 (5.98/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Great...so now we're starting to see titles like this and Jedi Survivor where the game doesn't have DLSS in it, when that's what the vast majority of people are using. Just one more thing to create division in the gaming arena.
Haha. It's a great reality check for those counting on that free Nvidia TLC.

Welcome to the Nvidia clusterfuck, because you're also lacking the VRAM to run native at some point.

As predicted. Fools & money will be parted.

I don't 'count' on FSR either. We all need to judge raw raster performance and nothing else. This confirms it.

Great. Now when it launches with issues, people will blame AMD instead of looking at Bethesda's long history of launching games in a bug filled state.
Doubtful. Even with DLSS20 you can't hide a shit engine and stutters.

It only means RTX sadly, not even GTX cards from 4-5 years back. Talk about stiffing your own customers :shadedshu:
Exactly.
So again: if you count on DLSS/FSR to get your game playable, you're an idiot.
 
Last edited:
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
1. SO CONFIDENT? WE ARE ON THE THIRD TRY.

And the third breaks compatibility with the first cards to use DLSS, which is arse.

2. Big leaps of bullshit right there, intermingled with conspiracy and unproven WCCF shit. In their own post on it they adequately show it's a shitshow of support that's about equal between FSR and DLSS being both supported, or just one. No drought of DLSS exists, just a dearth of entitled plebs who think that because they bought Nvidia everyone else should, and everyone has to dance to Huang's tune.

3. That WCCFTech is a trusted journo source? I read it, but it's not an instant-fact type of site, is it.

4. Do you think when Cyberpunk got bought by Nvidia it didn't get leaned towards Nvidia?! I wouldn't buy it until it was (a) fixed and (b) supported a decent FSR version. It took a while, but I survived :) :D.

Now let's see the honesty and maturity of posters come release, since Bethesda makes some shocking first-day shit (à la Fallout 76 / everything they do), so I'm not expecting much here on a new IP.

@phanbuey Bullshit. Nearly every Crysis game and remake was fu$$$$ by Nvidia co-operation money, and in many others I waited ages for FSR support to be added way after DLSS, or, like Crysis, basic DX12 ray tracing support, not RTX only! GameWorks made some games unplayable on day one on AMD. Your blinkers need to come off.

Your anger is impotent and misdirected. You are blinded by your hatred of Nvidia and not coming to rational conclusions. CP2077, like every single other game, currently runs faster on Nvidia hardware. Don't discredit the source just cause you don't agree with it, they are the messengers, after all, the quote comes from an AMD spokesperson.

Don't confuse confidence in their technology with confidence in their mindshare, and remember how little Nvidia needs the consumer GPU space.

Mindshare comes from perceived success, not necessarily a product's technical characteristics. Many inferior technologies have prevailed over others in the past (for example, VHS over Beta), and some have actually persisted and stood the test of time itself (for example, the MP3 codec). If the mindshare is sufficient to make a product not only stay afloat but take the position of market leader, then they indeed have a product that is welcomed with open arms by customers.
 
Last edited:
Joined
Mar 10, 2010
Messages
11,878 (2.29/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad + 360EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Your anger is impotent and misdirected. You are blinded by your hatred of Nvidia and not coming to rational conclusions. CP2077, like every single other game, currently runs faster on Nvidia hardware. Don't discredit the source just cause you don't agree with it, they are the messengers, after all, the quote comes from an AMD spokesperson.



Mindshare comes from perceived success, not necessarily a product's technical characteristics. Many inferior technologies have prevailed over others in the past (for example, VHS over Beta), and some have actually persisted and stood the test of time itself (for example, the MP3 codec). If the mindshare is sufficient to make a product not only stay afloat but take the position of market leader, then they indeed have a product that is welcomed with open arms by customers.
Angry? No, I just explained why you're wrong. I'm not even bothered, as a casual glance at my Steam account would show: I've owned CP2077 too long, yet not played it, so no, I couldn't care any less. However, the spreading of BS needs stopping. And the best retort you have is that? What are you on about? The source took a quote and ran seven miles round fifteen corners with it to make some schoolboy clickbait shit, with no evidence and a list actually proving them wrong, i.e. plenty of games have just DLSS, or both.

Hate? WTAF, I have many Nvidia cards; even now I'm on a 2060-equipped laptop I've exclusively gamed on all week. I hate Nvidia's recent market practices. I used to be able to buy ROG AMD cards; have you seen one since the 7970 Platinum I bought? No, no one has. Why? Nvidia's shitty business practices are not new, and it's probably time AMD got involved. I don't HATE Nvidia though, that's for tools and fools. I have witnessed many PROVEN examples of their excessive Nvidia shittiness, whereas WCCFTech has nothing but a vague statement off AMD saying what, that they back FSR. Oh wow, the bastards.
 
Joined
Apr 13, 2023
Messages
232 (0.56/day)
System Name Can it run Warhammer 3?
Processor 7800X3D @ 5Ghz
Motherboard Gigabyte B650 Aorus Elite AX
Cooling Enermax Liqmax III 360mm
Memory Corsair Vengeance @ 6000Mhz
Video Card(s) Asus Strix 3080
Storage Silicon Power XS70
Display(s) BenQ EX2710Q, BenQEX270M
Case NZXT H7 Flow
Audio Device(s) AudioTechnica M50xBT
Power Supply SuperFlower Leadex III 850W
Ok at this point I actually have to figure out how to hide the troll-posters. Apparently any ill-informed individual can get an i9 and 3090 at this point.
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
Angry? No, I just explained why you're wrong. I'm not even bothered, as a casual glance at my Steam account would show: I've owned CP2077 too long, yet not played it, so no, I couldn't care any less. However, the spreading of BS needs stopping. And the best retort you have is that? What are you on about? The source took a quote and ran seven miles round fifteen corners with it to make some schoolboy clickbait shit, with no evidence and a list actually proving them wrong, i.e. plenty of games have just DLSS, or both.

Hate? WTAF, I have many Nvidia cards; even now I'm on a 2060-equipped laptop I've exclusively gamed on all week. I hate Nvidia's recent market practices. I used to be able to buy ROG AMD cards; have you seen one since the 7970 Platinum I bought? No, no one has. Why? Nvidia's shitty business practices are not new, and it's probably time AMD got involved. I don't HATE Nvidia though, that's for tools and fools. I have witnessed many PROVEN examples of their excessive Nvidia shittiness, whereas WCCFTech has nothing but a vague statement off AMD saying what, that they back FSR. Oh wow, the bastards.

See, you were rambling; you can't have a productive argument that way. I have spread no such thing, only exposed the situation for what it is. Did you think AMD were your friends? One ill turn doesn't explain another. If ASUS no longer releases ROG AMD cards, you have to wonder why it is that they don't want their premium brand to carry Radeon GPUs. Perhaps therein lies your ultimate answer.

Ok at this point I actually have to figure out how to hide the troll-posters. Apparently any ill-informed individual can get an i9 and 3090 at this point.

I am not trolling, nor am I some random individual, I've been a regular at this forum for almost 3 years. You just seem to have a problem with what I said and no meaningful answer for it, though. By all means I am quite interested in your reply as to why mindshare is not derived from a perception of success over time. As if people decided to buy Nvidia out of the kindness of their hearts or something.
 
Joined
Apr 14, 2018
Messages
480 (0.21/day)
What's up with the doom and gloom comments in here? People are pretending as if their house will blow up if they run the game on non-AMD hardware. It's just a marketing deal, just like RE4 and Dead Island 2 had, which ran well on all hardware.

And here I thought TPU comments would have more common sense.

Considering the general Nvidia bias on these forums, I'm honestly not surprised.
 
Joined
Mar 10, 2010
Messages
11,878 (2.29/day)
Location
Manchester uk
See, you were rambling; you can't have a productive argument that way. I have spread no such thing, only exposed the situation for what it is. Did you think AMD were your friends? One ill turn doesn't explain another. If ASUS no longer releases ROG AMD cards, you have to wonder why it is that they don't want their premium brand to carry Radeon GPUs. Perhaps therein lies your ultimate answer.



I am not trolling, nor am I some random individual, I've been a regular at this forum for almost 3 years. You just seem to have a problem with what I said and no meaningful answer for it, though. By all means I am quite interested in your reply as to why mindshare is not derived from a perception of success over time. As if people decided to buy Nvidia out of the kindness of their hearts or something.
You're lacking in proof and spouting nonsense; keep it up and I'll drag back up the TPU news about Nvidia doing as I said to prove you wrong.

PS: I see no anger in this reply either, next day too?!

Direct? Absolutely. Anger? Not at all.
 
Last edited:
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
See, you were rambling; you can't have a productive argument that way. I have spread no such thing, only exposed the situation for what it is. Did you think AMD were your friends? One ill turn doesn't explain another. If ASUS no longer releases ROG AMD cards, you have to wonder why it is that they don't want their premium brand to carry Radeon GPUs. Perhaps therein lies your ultimate answer.
How long have you been a gamer? ROG was removed because Nvidia demanded it. No different than Intel for motherboards. The world is a different place today, and Nvidia have shifted to shafting their customers with $1500+ GPUs and cut-down budget GPUs selling as mid-range.
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
You're lacking in proof and spouting nonsense; keep it up and I'll drag back up the TPU news about Nvidia doing as I said to prove you wrong.

You literally started fighting me by implying that DLSS sucks because it was updated into a third generation; let me quote you:

1. SO CONFIDENT? WE ARE ON THE THIRD TRY.

Then you went on about how, supposedly, because I currently have a GeForce card I'm incredibly entitled and think everything must have DLSS, while in reality I was talking about Streamline, which Intel adopted but AMD did not.

This is no way to build a productive argument. Right now we are only pointing fingers at one another; this is no way to exchange ideas. I'm not even particularly pro-Nvidia; I simply recognized that they have done something right for once. I want all three technologies to be present in this title, and what I would personally use is not even DLSS, but native resolution - I actually insist on that.

I'll happily debate you if we can be reasonable to one another, but I'm unwilling to share in the animosity :)

How long have you been a Gamer? ROG was removed because Nvidia demanded it. No different than Intel for MBs. The world is a different place today and Nvidia have shifted their shafting to their customers with $1500+ GPUs and cut down budget GPUs selling as Mid range.

Can you back that claim up? I am well aware of the GPP stunt they tried to pull many years ago. But we have no evidence that this is still the case for the current generation, if you can do that, then I will more than agree with you. In fact I hope GN makes a video on that, they would deserve the heat and lawsuits.
 
Joined
Mar 10, 2010
Messages
11,878 (2.29/day)
Location
Manchester uk
You literally started fighting me by implying that DLSS sucks because it was updated into a third generation; let me quote you:

Then you went on about how, supposedly, because I currently have a GeForce card I'm incredibly entitled and think everything must have DLSS, while in reality I was talking about Streamline, which Intel adopted but AMD did not.

This is no way to build a productive argument. Right now we are only pointing fingers at one another; this is no way to exchange ideas. I'm not even particularly pro-Nvidia; I simply recognized that they have done something right for once. I want all three technologies to be present in this title, and what I would personally use is not even DLSS, but native resolution - I actually insist on that.

I'll happily debate you if we can be reasonable to one another, but I'm unwilling to share in the animosity :)
No, I replied to

"1. The first and obvious is that Nvidia is confident in the superiority of its technology; and that they are willing to stake on it by making it extra easy for all their competitors to be included alongside it;"

with: no, they're on their third try. Why wasn't DLSS 2 good enough, or 1? Confidence on show?! No.
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
No, I replied to

"1. The first and obvious is that Nvidia is confident in the superiority of its technology; and that they are willing to stake on it by making it extra easy for all their competitors to be included alongside it;"

with: no, they're on their third try. Why wasn't DLSS 2 good enough, or 1? Confidence on show?! No.

I understand your confusion, and indeed it is NV marketing's fault. You see, a while ago DLSS 2.x was folded into DLSS 3, with Frame Generation being considered a subset of DLSS that only works on Ada-generation cards. I'll be the first to say that FG is a gimmick and a hard pass for me. Even now they occasionally refer to DLSS's regular upscaling features as DLSS 2; it's intentionally designed to generate FOMO and cause people to itch for an upgrade they don't need. This means it is correct that Turing and Ampere can run DLSS 3.1, but they cannot run Frame Generation, because that feature is not available on this hardware class.

Like I said, your anger is misdirected. I am not shilling Nvidia here, and I share more than a few of your sentiments, particularly regarding Ada. Making that framework so that DLSS, XeSS and FSR could easily be implemented by developers is commendable, especially given their track record, and AMD boycotting it is disappointing and surprising for the same reason: their track record is that they embrace choice and openness... yet that wasn't displayed here.
 
Joined
Mar 10, 2010
Messages
11,878 (2.29/day)
Location
Manchester uk
I understand your confusion, and indeed it is NV marketing's fault. You see, a while ago DLSS 2.x was folded into DLSS 3, with Frame Generation being considered a subset of DLSS that only works on Ada-generation cards. I'll be the first to say that FG is a gimmick and a hard pass for me. Even now they occasionally refer to DLSS's regular upscaling features as DLSS 2; it's intentionally designed to generate FOMO and cause people to itch for an upgrade they don't need. This means it is correct that Turing and Ampere can run DLSS 3.1, but they cannot run Frame Generation, because that feature is not available on this hardware class.

Like I said, your anger is misdirected. I am not shilling Nvidia here, and I share more than a few of your sentiments, particularly regarding Ada. Making that framework so that DLSS, XeSS and FSR could easily be implemented by developers is commendable, especially given their track record, and AMD boycotting it is disappointing and surprising for the same reason: their track record is that they embrace choice and openness... yet that wasn't displayed here.
I'll leave you to your delusions; you're having a different conversation to me. You think arguing a tangent is viable. Bye now.
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
I'll leave you to your delusions; you're having a different conversation to me. You think arguing a tangent is viable. Bye now.

So you're resorting to gaslighting; I expected more of you, man. FSR 3.0 has been in active development for some time now; that doesn't make AMD desperate for releasing it. It's not a third try, it's a generational improvement, and one most of us have high expectations for. Maybe you can't express yourself very well - that is fine - but like I said, I don't want to share in the animosity. We'll pick this up some other time when it's convenient and clarify it all in DMs if you want. Cheers
 
Joined
Nov 13, 2007
Messages
10,272 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
You are NOT forced to do anything. You can disable it in the settings. You have a 4090 based on your system specs. Did you buy a 4090 to have an absolute need of an upscaling tech just to play at acceptable framerates? Also, it was already posted that the game will have unofficial support soon after release. If the game is a success, you can bet official support will come later.

I bought it to play at 4K and high FPS -- and yes, DLSS definitely makes that much better. FSR, as much as I want to like it, looks like crap. I'll even take XeSS. At 4K, DLSS + sharpening sometimes looks better than native TAA, and +30% FPS helps quite a bit at 4K.

Starfield is a heavy game; I doubt the 4090 can keep up with settings cranked - or even at High - the recommended is a 6800XT o_O
 
Last edited:
Joined
Mar 10, 2010
Messages
11,878 (2.29/day)
Location
Manchester uk
So you're resorting to gaslighting; I expected more of you, man. FSR 3.0 has been in active development for some time now; that doesn't make AMD desperate for releasing it. It's not a third try, it's a generational improvement, and one most of us have high expectations for. Maybe you can't express yourself very well - that is fine - but like I said, I don't want to share in the animosity. We'll pick this up some other time when it's convenient and clarify it all in DMs if you want. Cheers
It's in the name, and even then you're in denial.

Again, did they make just 3 versions of DLSS? No, many, many more.

Yawn. Bye now; you're close to ignore, because I can't stand ignorance.

I don't debate those making their own reality while steering said convo down tangential arguments meant to make me look biased. I'm not.

And I have seen your trolling style before.
 
Joined
Sep 17, 2014
Messages
21,210 (5.98/day)
Location
The Washing Machine
I am not trolling, nor am I some random individual, I've been a regular at this forum for almost 3 years. You just seem to have a problem with what I said and no meaningful answer for it, though. By all means I am quite interested in your reply as to why mindshare is not derived from a perception of success over time. As if people decided to buy Nvidia out of the kindness of their hearts or something.
I don't think you're trolling, and I totally get your mindshare / trust built over time perspective too, I've been in that place with Nvidia until the moment GTX turned into RTX. That is frankly the moment GPUs all went to shit in steady paces. We're paying a massive price for technologies that to this date have questionable purpose and eat costly die space.

What I'm seeing today is an Nvidia that is readily gearing up to create a forced upgrade path that matches their newly timed release cadence, whereas during GTX, they just had the 'best options available' most of the time and the release cadence meant the market was nicely populated. Now, it isn't and when we do get a new gen, the improvements are lackluster; 3060 > 4060 is a complete joke, and note, this is that vast midrange we're talking about. The added technologies were never as influential as they are today. The problem however then is the proprietary approach combined with Nvidia's track record.

When I place that next to an AMD that is really not changing its pace from the last, well, nearly ten years; I mean they still release new stuff slow as molasses ever since the post Hawaii XT era; and is STILL able to keep up to everything except Nvidia's overpriced top end, I know what's what. Nvidia hasn't really got a lead at all, they just create a reality where they have one. And in every place where they stop supporting that reality, you're left with a brick given the importance of a tech like DLSS3. And again, the problem is the proprietary approach, because tech like DLSS3 is used to sell GPUs. Its absolutely silly Ampere doesn't have access to it. What's next?

You've seen the first comments here now that a game doesn't release with it. Drama.
 
Joined
Nov 13, 2007
Messages
10,272 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
When I place that next to an AMD that really hasn't changed its pace in nearly ten years (they still release new stuff slow as molasses, ever since the post-Hawaii XT era) and is STILL able to keep up with everything except Nvidia's overpriced top end, I know what's what. Nvidia hasn't really got a lead at all; they just create a reality where they have one. And in every place where they stop supporting that reality, you're left with a brick, given the importance of a tech like DLSS 3. And again, the problem is the proprietary approach, because tech like DLSS 3 is used to sell GPUs. It's absolutely silly that Ampere doesn't have access to it. What's next?

In quite a few cases the 4070 Ti plays games like Hogwarts Legacy, Atomic Heart, Cyberpunk etc. better than a 7900 XTX in real life. Why? Because the Nvidia nerd just enables DLSS 2/3, sets it to Balanced, and BOOM: the game plays smoother and looks better than it does on the 7900 XTX, no matter what settings the AMD owner uses. How do I know? I just built a 4070 Ti + 5800X3D upgrade rig for a friend, and another 12600K + 7900 XTX mini-ITX build, and those were the games I happened to be testing with at the time.

The 7900 XTX is a super powerful card and a MUCH better card in raw stats and raster, but technology is a thing: you don't have to rely on brute force alone to get a good gaming experience.

That's why there's so much drama in this thread: Nvidia's software shenanigans actually work well, and when we're forced to use raster only or (god forbid) FSR, it's a big deal for people who use Nvidia, because it materially degrades the gaming experience; AMD's vaseline-smear upscaler simply can't compete.

I'm not mad at AMD for what they did; I'm just generally mad that I'm probably going to have to subject my eyeballs to FSR if I can't get the FPS. Hopefully they do a good job like in Cyberpunk so it's not too bad.
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
In quite a few cases the 4070 Ti plays games like Hogwarts Legacy, Atomic Heart, Cyberpunk etc. better than a 7900 XTX in real life. Why? Because the Nvidia nerd just enables DLSS 2/3, sets it to Balanced, and BOOM: the game plays smoother and looks better than it does on the 7900 XTX, no matter what settings the AMD owner uses. How do I know? I just built a 4070 Ti + 5800X3D upgrade rig for a friend, and another 12600K + 7900 XTX mini-ITX build, and those were the games I happened to be testing with at the time.

The 7900 XTX is a super powerful card and a MUCH better card in raw stats and raster, but technology is a thing: you don't have to rely on brute force alone to get a good gaming experience.

That's why there's so much drama in this thread: Nvidia's software shenanigans actually work well, and when we're forced to use raster only or (god forbid) FSR, it's a big deal for people who use Nvidia, because it materially degrades the gaming experience; AMD's vaseline-smear upscaler simply can't compete.

I'm not mad at AMD for what they did; I'm just generally mad that I'm probably going to have to subject my eyeballs to FSR if I can't get the FPS. Hopefully they do a good job like in Cyberpunk so it's not too bad.
How do you go about proving that?
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
I don't think you're trolling, and I totally get your mindshare / trust-built-over-time perspective too; I was in that place with Nvidia until the moment GTX turned into RTX. That is, frankly, the moment GPUs started going downhill at a steady pace. We're paying a massive price for technologies that to this date have questionable purpose and eat costly die space.

What I'm seeing today is an Nvidia that is readily gearing up to create a forced upgrade path that matches their newly timed release cadence, whereas during the GTX era they simply had the best options available most of the time, and the release cadence meant the market was nicely populated. Now it isn't, and when we do get a new gen, the improvements are lackluster; 3060 > 4060 is a complete joke, and note, this is the vast midrange we're talking about. The added technologies were never as influential as they are today. The problem, then, is the proprietary approach combined with Nvidia's track record.

When I place that next to an AMD that really hasn't changed its pace in nearly ten years (they still release new stuff slow as molasses, ever since the post-Hawaii XT era) and is STILL able to keep up with everything except Nvidia's overpriced top end, I know what's what. Nvidia hasn't really got a lead at all; they just create a reality where they have one. And in every place where they stop supporting that reality, you're left with a brick, given the importance of a tech like DLSS 3. And again, the problem is the proprietary approach, because tech like DLSS 3 is used to sell GPUs. It's absolutely silly that Ampere doesn't have access to it. What's next?

You've seen the first comments here now that a game doesn't release with it. Drama.

And we are in resounding, complete agreement about this!
 
Joined
Nov 13, 2007
Messages
10,272 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
How do you go about proving that?
View attachment 302891


Raw raster: Hogwarts Legacy.

Now look at the difference with DLSS 3 on vs. off. You can run 4K with RT, no issues; at 4K it beats the 7900 XTX by 40 FPS with DLSS 3 alone.
(9) Hogwarts Legacy - DLSS 3 test @ INNO3D RTX 4070 Ti | 160W TDP limit - YouTube

Turn on DLSS 3 and you get over 150 FPS on the 4070 Ti.

Or you can build the two rigs and see for yourself.

Or let's do Atomic Heart: native TAA vs. DLSS vs. FSR.
Atomic Heart: FSR 2.2 vs. DLSS 2 vs. DLSS 3 Comparison Review | TechPowerUp

"Speaking of FSR 2.2 image quality, the FSR 2.2 implementation comes with noticeable compromises in image quality—in favor of performance in most sequences of the game. We spotted excessive shimmering and flickering in motion on vegetation, tree leaves and thin steel objects, which might be quite distracting for some people."

"DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect almost doubled performance at 1440p and 1080p, and during our testing, overall gameplay felt very smooth and responsive, we haven't spotted any issues with input latency."

^ from TPU reviewers.

I've played on both, and I can tell you there are quite a few games where the 4070 Ti outright smashes the 7900 XTX in gaming experience, purely due to the settings DLSS allows. And in DLSS 2-only games, no frame gen, DLSS 2 Balanced still looks better than any FSR 2 Quality, so you're basically getting the same performance at better image quality.
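For anyone wondering what "Balanced" vs. "Quality" actually means in pixels: both DLSS 2 and FSR 2 render internally at a fraction of the output resolution and reconstruct the rest. A quick sketch, using the commonly documented per-axis scale factors (individual games can deviate, so treat these as illustrative defaults, not guaranteed values):

```python
# Internal render resolution for common DLSS 2 / FSR 2 quality modes.
# Scale factors are the widely documented per-axis ratios; treat them
# as approximations -- titles can ship custom values.

SCALE = {
    "quality": 0.667,           # ~66.7% of output resolution per axis
    "balanced": 0.58,           # ~58% per axis (FSR 2 uses ~59%)
    "performance": 0.50,        # 50% per axis
    "ultra_performance": 0.333, # ~33.3% per axis
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually shades before upscaling."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

def pixel_ratio(mode: str) -> float:
    """Fraction of output pixels shaded each frame (scale squared)."""
    return SCALE[mode] ** 2

if __name__ == "__main__":
    for mode in SCALE:
        w, h = internal_resolution(3840, 2160, mode)
        print(f"{mode:18s} {w}x{h}  ({pixel_ratio(mode):.0%} of native pixels)")
```

So "DLSS Balanced at 4K" shades roughly a third of the native pixel count, which is where most of the framerate headroom comes from; the image-quality argument above is entirely about how well each upscaler reconstructs the rest.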
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
View attachment 302891

Raw raster: Hogwarts Legacy.

Now look at the difference with DLSS 3 on vs. off. You can run 4K with RT, no issues; at 4K it beats the 7900 XTX by 40 FPS with DLSS 3 alone.
(9) Hogwarts Legacy - DLSS 3 test @ INNO3D RTX 4070 Ti | 160W TDP limit - YouTube

Turn on DLSS 3 and you get over 150 FPS on the 4070 Ti.

Or you can build the two rigs and see for yourself.

Or let's do Atomic Heart: native TAA vs. DLSS vs. FSR.
Atomic Heart: FSR 2.2 vs. DLSS 2 vs. DLSS 3 Comparison Review | TechPowerUp

"Speaking of FSR 2.2 image quality, the FSR 2.2 implementation comes with noticeable compromises in image quality—in favor of performance in most sequences of the game. We spotted excessive shimmering and flickering in motion on vegetation, tree leaves and thin steel objects, which might be quite distracting for some people."

"DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect almost doubled performance at 1440p and 1080p, and during our testing, overall gameplay felt very smooth and responsive, we haven't spotted any issues with input latency."

^ from TPU reviewers.

I've played on both, and I can tell you there are quite a few games where the 4070 Ti outright smashes the 7900 XTX in gaming experience, purely due to the settings DLSS allows. And in DLSS 2-only games, no frame gen, DLSS 2 Balanced still looks better than any FSR 2 Quality, so you're basically getting the same performance at better image quality.
Yep, 3 games. So let's look at a review and you tell me how you feel. I too have had both, and I know that the 7900 XT and XTX are plenty fast enough to drive my monitor. Then let's look at pricing.




I like raw performance, as much as people love to talk about DLSS. Sapphire had an upscaler in TriXX before any of these were available, but that does not matter. I love my 7900 XT and so does my FV43U. DLSS, Frame Generation and RT mean absolutely nothing to me. I do know that more VRAM is better than less VRAM in the long run, though.
 
Joined
Nov 13, 2007
Messages
10,272 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Yep, 3 games. So let's look at a review and you tell me how you feel. I too have had both, and I know that the 7900 XT and XTX are plenty fast enough to drive my monitor. Then let's look at pricing.




I like raw performance, as much as people love to talk about DLSS. Sapphire had an upscaler in TriXX before any of these were available, but that does not matter. I love my 7900 XT and so does my FV43U. DLSS, Frame Generation and RT mean absolutely nothing to me. I do know that more VRAM is better than less VRAM in the long run, though.

Number of Games Supporting NVIDIA DLSS Crosses 300: DLSS 3 Now in 33 Games | Hardware Times

Yep, 3 games that I played. Reviews purposely stay away from DLSS comparisons (especially FG), or the Radeon cards would get crushed by 40% and it's "not fair".

Here's from one of the reviews:
View attachment 302932


Now let's see an example of the Impact of DLSS in that game:
View attachment 302933


I mean, that's great, congrats: the 7900 XT is a good card and the VRAM will definitely last longer than the 12 GB on the Ti. Doesn't change the fact that the Ngreedia tech is good and usually works, and not having it enabled in games is kind of disappointing.

In reality, if you have the 4070 Ti and you're playing any of those 33 games that support DLSS 3 at 4K (or the ~290 that support 2.0), you're playing with it on. Reviews won't show that, and it can really mislead people about the overall experience.
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Number of Games Supporting NVIDIA DLSS Crosses 300: DLSS 3 Now in 33 Games | Hardware Times

Yep, 3 games that I played. Reviews purposely stay away from DLSS comparisons (especially FG), or the Radeon cards would get crushed by 40% and it's "not fair".

Here's from one of the reviews:
View attachment 302932

Now let's see an example of the Impact of DLSS in that game:
View attachment 302933

I mean, that's great, congrats: the 7900 XT is a good card and the VRAM will definitely last longer than the 12 GB on the Ti. Doesn't change the fact that the Ngreedia tech is good and usually works, and not having it enabled in games is kind of disappointing.

In reality, if you have the 4070 Ti and you're playing any of those 33 games that support DLSS 3 at 4K (or the ~290 that support 2.0), you're playing with it on. Reviews won't show that, and it can really mislead people about the overall experience.
I am so glad I watched the MSI Gaming livestream this week. They showed DLSS 3 with Frame Gen, and the person playing could not shoot anyone in an FPS and admitted to the floaty feeling and lag that those "innovations" introduced into the game. If you like them, good for you. I spent my money on VRAM, as the 127 FPS that Hitman 3 shows is perfectly smooth. Then I have an X3D chip for the 1% lows, so I am golden.
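The "floaty" complaint has a simple mechanical explanation: interpolated frames don't sample new input. A back-of-the-envelope model (the numbers are illustrative, not measurements; real implementations pair frame generation with latency-reduction tech such as Reflex, so measured latency differs):

```python
# Rough model of why frame generation raises displayed FPS without
# improving input responsiveness. Generated frames are interpolated
# between two rendered frames, so the game still samples input at the
# base render rate -- and interpolation must hold back one rendered
# frame, adding roughly one base-frame-time of latency.

def frame_gen_model(render_fps: float) -> tuple[float, float, float]:
    base_frametime_ms = 1000.0 / render_fps
    displayed_fps = 2 * render_fps        # one generated frame per rendered frame
    input_rate_hz = render_fps            # input only affects rendered frames
    added_latency_ms = base_frametime_ms  # ~one frame held back for interpolation
    return displayed_fps, input_rate_hz, added_latency_ms

if __name__ == "__main__":
    for fps in (30, 60, 120):
        shown, polled, extra = frame_gen_model(fps)
        print(f"render {fps:3d} fps -> shows {shown:.0f} fps, "
              f"input at {polled:.0f} Hz, ~{extra:.1f} ms extra latency")
```

The takeaway matches what the livestream showed: motion smoothness and input responsiveness decouple, and the lower the base framerate, the floatier generated frames feel.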
 