Tuesday, June 27th 2023

AMD Announced as Starfield's Exclusive Partner on PC

AMD and Bethesda today revealed that Starfield will be best experienced on a PC equipped with a Ryzen processor and a Radeon graphics card. Team Red has been announced as the giant open-world game's exclusive PC partner, but AMD's Xbox Series hardware also gets a couple of friendly shout-outs. Todd Howard, director and executive producer at Bethesda Game Studios, stated in the video presentation: "We have AMD engineers in our code base working on FSR (FidelityFX Super Resolution) 2.0 image processing and upscaling and it looks incredible. You're going to get the benefits of that obviously on your PC but also on Xbox. We're super excited and can't wait to show everybody more."

Jack Huynh, Senior Vice President and General Manager of the Computing and Graphics Group at AMD, added: "Making this game even more special is the close collaboration between Bethesda and AMD to unlock the full potential of Starfield. We have worked hand-in-hand with Bethesda Game Studios to optimize Starfield for both Xbox and PC with Ryzen 7000 series processors and Radeon 7000 series graphics. The optimizations both accelerate performance and enhance the quality of your gameplay using highly multi-threaded code that both Xbox and PC players will get to take advantage of."
AMD's announcement reads: "AMD is proud to announce that we are Bethesda's exclusive PC partner for the next-generation role-playing game, Starfield. Watch this special announcement video to learn how AMD and Bethesda are working together to bring the galaxy to all players this September."


About Starfield
Starfield is the first new universe in over 25 years from Bethesda Game Studios, the award-winning creators of The Elder Scrolls V: Skyrim and Fallout 4. In this next-generation role-playing game set amongst the stars, create any character you want and explore with unparalleled freedom as you embark on an epic journey to answer humanity's greatest mystery. In the year 2330, humanity has ventured beyond our solar system, settling new planets and living as a spacefaring people. You will join Constellation - the last group of space explorers seeking rare artifacts throughout the galaxy - and navigate the vast expanse of space in Bethesda Game Studios' biggest and most ambitious game.

AMD is the exclusive PC partner for Starfield, promising to deliver the most complete PC gaming experience in the galaxy. We cannot wait to explore the universe with you this September. Ready to learn more?
Source: AMD

226 Comments on AMD Announced as Starfield's Exclusive Partner on PC

#101
john_
R0H1TIt only means RTX sadly, not even GTX cards from 4-5 years back. Talk about stiffing your own customers :shadedshu:
:confused::confused::confused:
Yeah, thank you. That's what I wrote. :toast:

My God, why did you cut the post there? Why did you leave the rest out? Maybe I should have put a comma "," after the "only"? A syntax error on my part?
john_DLSS only, means GTX, Radeon and console owners not having an upscaling tech AT ALL.
Posted on Reply
#102
Vayra86
Razrback16Great... so now we're starting to see titles like this and Jedi Survivor where the game doesn't have DLSS in it, when that's what the vast majority of people are using. Just one more thing to create division in the gaming arena now.
Haha. It's a great reality check for those counting on that free Nvidia TLC.

Welcome to the Nvidia clusterfuck, because you're also lacking the VRAM to run native at some point.

As predicted: fools and their money will be parted.

I don't 'count' on FSR either. We all need to judge raw raster performance and nothing else. This confirms it.
sLowEndGreat. Now when it launches with issues, people will blame AMD instead of looking at Bethesda's long history of launching games in a bug-filled state.
Doubtful. Even with DLSS 20 you can't hide a shit engine and stutters.
R0H1TIt only means RTX sadly, not even GTX cards from 4-5 years back. Talk about stiffing your own customers :shadedshu:
Exactly.
So again: if you count on DLSS/FSR to get your game playable, you're an idiot.
Posted on Reply
#103
Dr. Dro
TheoneandonlyMrK1. SO CONFIDENT WE ARE ON THE THIRD TRY.

And the third breaks compatibility with the first card to use DLSS, which is, arse.

2. Big leaps of bullllllshit right there, intermingled with conspiracy and unproven WCCF shit; in their own post on it they adequately show it's a shitshow of support that's about equal between FSR and DLSS being both supported or just one. No drought of DLSS exists, just a dearth of entitled plebs who think that because they bought Nvidia everyone else should, and everyone has to dance to Huang's tune.

3. That WCCFTech is a trusted journo source? I read it, but it's not an instant-fact type of site, is it.

4. Do you think when Cyberpunk got bought by Nvidia it didn't get leaned towards Nvidia?! I wouldn't buy it until it was (a) fixed and (b) supporting a decent FSR version. It took a while, but I survived :) :D.

Now let's see the honesty and maturity of posters come release, since Bethesda make some shocking first-day shit, à la Fallout 76 / everything they do, so expecting much here on a new IP... well.

@phanbuey Bullshit. Nearly every Crysis game and remake, fu$$$$ by Nvidia co-operation money; many others where I waited ages for FSR support to be added way after DLSS, or, like Crysis, basic DX12 ray-tracing support, not RTX-only! Gameworks made some games unplayable on day one on AMD. Your blinkers need to come off.
Your anger is impotent and misdirected. You are blinded by your hatred of Nvidia and not coming to rational conclusions. CP2077, like every single other game, currently runs faster on Nvidia hardware. Don't discredit the source just because you don't agree with it; they are the messengers, after all, and the quote comes from an AMD spokesperson.
sethmatrix7Don't confuse confidence in their technology with confidence in their mindshare, and remember how little Nvidia needs the consumer GPU space.
Mindshare comes from perceived success, not necessarily a product's technical characteristics. Many inferior technologies have prevailed over others in the past (for example, VHS over Betamax), and some have actually persisted and surpassed the test of time itself (for example, the MP3 codec). If the mindshare is sufficient to make a product not only stay afloat but take a position of market leader, then they do indeed have a product that is welcomed with open arms by customers.
Posted on Reply
#104
TheoneandonlyMrK
Dr. DroYour anger is impotent and misdirected. You are blinded by your hatred of Nvidia and not coming to rational conclusions. CP2077, like every single other game, currently runs faster on Nvidia hardware. Don't discredit the source just because you don't agree with it; they are the messengers, after all, and the quote comes from an AMD spokesperson.



Mindshare comes from perceived success, not necessarily a product's technical characteristics. Many inferior technologies have prevailed over others in the past (for example, VHS over Betamax), and some have actually persisted and surpassed the test of time itself (for example, the MP3 codec). If the mindshare is sufficient to make a product not only stay afloat but take a position of market leader, then they do indeed have a product that is welcomed with open arms by customers.
Angry? No, I just explained why you're wrong. I'm not even bothered, as a casual glance at my Steam account would show; I've owned CP2077 too long, yet not played it, so no, I couldn't care any less. However, the spreading of BS needs stopping, and the best retort you have is... what are you on about? The source took a quote and ran seven miles round fifteen corners with it to make some schoolboy clickbait shit, with no evidence, and a list actually proving them wrong, i.e. plenty of games have just DLSS or both??

Hate? WTAF, I have many Nvidia cards; even now I'm on a 2060-equipped laptop I've exclusively gamed on all week. I hate Nvidia's recent market practices: I used to be able to buy ROG AMD cards; have you seen one since the 7970 Platinum I bought? No, no one has. Why? Nvidia's shitty business practices are not new; it's probably time AMD got involved. I don't hate Nvidia though, that's for tools and fools, but I have witnessed many PROVEN examples of their excessive shittiness, whereas WCCFTech has nothing but a vague statement from AMD saying, what, that they back FSR? Oh wow, the bastards.
Posted on Reply
#105
sethmatrix7
OK, at this point I actually have to figure out how to hide the troll posters. Apparently any ill-informed individual can get an i9 and a 3090 these days.
Posted on Reply
#106
Dr. Dro
TheoneandonlyMrKAngry? No, I just explained why you're wrong. I'm not even bothered, as a casual glance at my Steam account would show; I've owned CP2077 too long, yet not played it, so no, I couldn't care any less. However, the spreading of BS needs stopping, and the best retort you have is... what are you on about? The source took a quote and ran seven miles round fifteen corners with it to make some schoolboy clickbait shit, with no evidence, and a list actually proving them wrong, i.e. plenty of games have just DLSS or both??

Hate? WTAF, I have many Nvidia cards; even now I'm on a 2060-equipped laptop I've exclusively gamed on all week. I hate Nvidia's recent market practices: I used to be able to buy ROG AMD cards; have you seen one since the 7970 Platinum I bought? No, no one has. Why? Nvidia's shitty business practices are not new; it's probably time AMD got involved. I don't hate Nvidia though, that's for tools and fools, but I have witnessed many PROVEN examples of their excessive shittiness, whereas WCCFTech has nothing but a vague statement from AMD saying, what, that they back FSR? Oh wow, the bastards.
See, you were rambling; we can't have a productive argument that way. I have spread no such thing, only exposed the situation for what it is. Did you think AMD were your friends? One ill turn doesn't explain another. If ASUS no longer releases ROG AMD cards, you have to wonder why it is that they don't want their premium brand to carry Radeon GPUs. Perhaps therein lies your ultimate answer.
sethmatrix7OK, at this point I actually have to figure out how to hide the troll posters. Apparently any ill-informed individual can get an i9 and a 3090 these days.
I am not trolling, nor am I some random individual; I've been a regular on this forum for almost three years. You just seem to have a problem with what I said and no meaningful answer for it, though. By all means, I am quite interested in your reply as to why mindshare is not derived from a perception of success over time. As if people decided to buy Nvidia out of the kindness of their hearts or something.
Posted on Reply
#107
rv8000
thunderingroarWhat's up with the doom-and-gloom comments in here? People pretending as if their house will blow up if they run the game on non-AMD hardware. It's just a marketing deal, just like RE4 and Dead Island 2 had, and those ran well on all hardware.

And here I thought TPU comments would have more common sense.
Considering the general Nvidia bias on these forums, I’m honestly not surprised.
Posted on Reply
#108
TheoneandonlyMrK
Dr. DroSee, you were rambling; we can't have a productive argument that way. I have spread no such thing, only exposed the situation for what it is. Did you think AMD were your friends? One ill turn doesn't explain another. If ASUS no longer releases ROG AMD cards, you have to wonder why it is that they don't want their premium brand to carry Radeon GPUs. Perhaps therein lies your ultimate answer.



I am not trolling, nor am I some random individual; I've been a regular on this forum for almost three years. You just seem to have a problem with what I said and no meaningful answer for it, though. By all means, I am quite interested in your reply as to why mindshare is not derived from a perception of success over time. As if people decided to buy Nvidia out of the kindness of their hearts or something.
You're lacking in proof and spouting nonsense; keep it up and I'll drag back up the TPU news about Nvidia doing as I said, to prove you wrong.

PS: I see no anger in this reply either, and it's the next day, too?!

Direct? Absolutely. Anger? Not at all.
Posted on Reply
#109
kapone32
Dr. DroSee, you were rambling; we can't have a productive argument that way. I have spread no such thing, only exposed the situation for what it is. Did you think AMD were your friends? One ill turn doesn't explain another. If ASUS no longer releases ROG AMD cards, you have to wonder why it is that they don't want their premium brand to carry Radeon GPUs. Perhaps therein lies your ultimate answer.
How long have you been a gamer? ROG was removed because Nvidia demanded it, no different than Intel with motherboards. The world is a different place today, and Nvidia have shifted to shafting their customers with $1,500+ GPUs and cut-down budget GPUs selling as mid-range.
Posted on Reply
#110
Dr. Dro
TheoneandonlyMrKYou're lacking in proof and spouting nonsense; keep it up and I'll drag back up the TPU news about Nvidia doing as I said, to prove you wrong.
You literally started fighting me by implying that DLSS sucks because it was updated into a third generation; let me quote you:
TheoneandonlyMrK1. SO CONFIDENT WE ARE ON THE THIRD TRY.
Then you continued to go on about how, because I currently have a GeForce card, I supposedly think everything must have DLSS out of entitlement, while in reality I was talking about Streamline, which Intel adopted but AMD did not.

So, this isn't a way to have a productive argument. Right now we are only pointing fingers at one another; this is no way to exchange ideas. I'm not even particularly pro-Nvidia; I simply recognized that they have done something right for once. I want all three technologies to be present in this title. What I would personally use is not even DLSS, but native - I actually insist on that.

I'll happily debate you if we can be reasonable to one another, but I'm unwilling to share in the animosity :)
kapone32How long have you been a gamer? ROG was removed because Nvidia demanded it, no different than Intel with motherboards. The world is a different place today, and Nvidia have shifted to shafting their customers with $1,500+ GPUs and cut-down budget GPUs selling as mid-range.
Can you back that claim up? I am well aware of the GPP stunt they tried to pull many years ago, but we have no evidence that this is still the case for the current generation. If you can do that, then I will more than agree with you. In fact, I hope GN makes a video on it; they would deserve the heat and lawsuits.
Posted on Reply
#111
TheoneandonlyMrK
Dr. DroYou literally started fighting me by implying that DLSS sucks because it was updated into a third generation; let me quote you:



Then you continued to go on about how, because I currently have a GeForce card, I supposedly think everything must have DLSS out of entitlement, while in reality I was talking about Streamline, which Intel adopted but AMD did not.

So, this isn't a way to have a productive argument. Right now we are only pointing fingers at one another; this is no way to exchange ideas. I'm not even particularly pro-Nvidia; I simply recognized that they have done something right for once. I want all three technologies to be present in this title. What I would personally use is not even DLSS, but native - I actually insist on that.

I'll happily debate you if we can be reasonable to one another, but I'm unwilling to share in the animosity :)
No, I replied to:

"1. The first and obvious is that Nvidia is confident in the superiority of its technology; and that they are willing to stake on it by making it extra easy for all their competitors to be included alongside it;"

with: no, they're on their third try. Why wasn't DLSS 2 good enough, or DLSS 1? Confidence on show?! No.
Posted on Reply
#112
Dr. Dro
TheoneandonlyMrKNo, I replied to:

"1. The first and obvious is that Nvidia is confident in the superiority of its technology; and that they are willing to stake on it by making it extra easy for all their competitors to be included alongside it;"

with: no, they're on their third try. Why wasn't DLSS 2 good enough, or DLSS 1? Confidence on show?! No.
I understand your confusion, and indeed it's NV marketing's fault. You see, a while ago, DLSS 2.x was folded into DLSS 3, with Frame Generation being considered a subset of DLSS that only works on Ada-generation cards. I'll be the first to say that FG is a gimmick and a hard pass for me. Even now they occasionally refer to DLSS's regular upscaling features as DLSS 2, which is intentionally designed to generate FOMO and make people itch for an upgrade they don't need. This means it is correct that Turing and Ampere can run DLSS 3.1, but they cannot run Frame Generation, because that feature is not available on that hardware class.

Like I said, your anger is misdirected. I am not shilling Nvidia here, and I share more than a few of your sentiments, particularly regarding Ada. Making that framework (Streamline) so that DLSS, XeSS and FSR can easily be implemented by developers is commendable, especially given their track record, and AMD boycotting it is disappointing and surprising for the same reason: their track record is that they embrace choice and openness... yet that wasn't on display here.
Posted on Reply
#113
TheoneandonlyMrK
Dr. DroI understand your confusion, and indeed it's NV marketing's fault. You see, a while ago, DLSS 2.x was folded into DLSS 3, with Frame Generation being considered a subset of DLSS that only works on Ada-generation cards. I'll be the first to say that FG is a gimmick and a hard pass for me. Even now they occasionally refer to DLSS's regular upscaling features as DLSS 2, which is intentionally designed to generate FOMO and make people itch for an upgrade they don't need. This means it is correct that Turing and Ampere can run DLSS 3.1, but they cannot run Frame Generation, because that feature is not available on that hardware class.

Like I said, your anger is misdirected. I am not shilling Nvidia here, and I share more than a few of your sentiments, particularly regarding Ada. Making that framework (Streamline) so that DLSS, XeSS and FSR can easily be implemented by developers is commendable, especially given their track record, and AMD boycotting it is disappointing and surprising for the same reason: their track record is that they embrace choice and openness... yet that wasn't on display here.
I'll leave you to your delusions; you're having a different conversation to me, and you think arguing a tangent is viable. Bye now.
Posted on Reply
#114
Dr. Dro
TheoneandonlyMrKI'll leave you to your delusions; you're having a different conversation to me, and you think arguing a tangent is viable. Bye now.
So you're resorting to gaslighting; I expected more of you, man. FSR 3.0 has been in active development for some time now; that doesn't make AMD desperate for releasing it. It's not a third try, it's a generational improvement, and one most of us have high expectations of. Maybe you just can't express yourself very well - that is fine - but like I said, I don't want to share in the animosity. We'll pick this up some other time when it's convenient and clarify it all in DMs if you want. Cheers
Posted on Reply
#115
phanbuey
john_You are NOT forced to do anything. You can disable it in the settings. You have a 4090, based on your system specs. Did you buy a 4090 to have an absolute need for an upscaling tech just to play at acceptable framerates? Also, it was already posted that the game will have unofficial support soon after release. If the game is a success, you can bet official support will come later.
I bought it to play at 4K and high FPS, and yes, DLSS definitely makes that much better. FSR, as much as I want to like it, looks like crap. I'll even take XeSS. At 4K, DLSS + sharpening sometimes looks better than native TAA, and +30% FPS helps quite a bit at 4K.

Starfield is a heavy game; I doubt the 4090 can keep up with settings cranked, or even at High. The recommended GPU is a 6800 XT o.O
Posted on Reply
#116
TheoneandonlyMrK
Dr. DroSo you're resorting to gaslighting; I expected more of you, man. FSR 3.0 has been in active development for some time now; that doesn't make AMD desperate for releasing it. It's not a third try, it's a generational improvement, and one most of us have high expectations of. Maybe you just can't express yourself very well - that is fine - but like I said, I don't want to share in the animosity. We'll pick this up some other time when it's convenient and clarify it all in DMs if you want. Cheers
It's in the name, and even then you're in denial.

Again, did they make just three versions of DLSS? No - many, many more.

Yawn, bye now. You're close to ignore now, because I can't stand ignorance.

I don't debate those making their own reality while steering said convo down tangential arguments meant to make me look biased. I'm not.

And I have seen your trolling style before.
Posted on Reply
#117
Vayra86
Dr. DroI am not trolling, nor am I some random individual; I've been a regular on this forum for almost three years. You just seem to have a problem with what I said and no meaningful answer for it, though. By all means, I am quite interested in your reply as to why mindshare is not derived from a perception of success over time. As if people decided to buy Nvidia out of the kindness of their hearts or something.
I don't think you're trolling, and I totally get your mindshare / trust-built-over-time perspective too; I was in that place with Nvidia until the moment GTX turned into RTX. That is, frankly, the moment GPUs all went to shit at a steady pace. We're paying a massive price for technologies that to this date have questionable purpose and eat costly die space.

What I'm seeing today is an Nvidia that is readily gearing up to create a forced upgrade path that matches their newly timed release cadence, whereas during GTX, they just had the 'best options available' most of the time, and the release cadence meant the market was nicely populated. Now it isn't, and when we do get a new gen, the improvements are lackluster; 3060 > 4060 is a complete joke, and note, this is the vast midrange we're talking about. The added technologies were never as influential as they are today. The problem, however, is the proprietary approach combined with Nvidia's track record.

When I place that next to an AMD that is really not changing its pace from the last, well, nearly ten years - I mean they still release new stuff slow as molasses, ever since the post-Hawaii XT era - and is STILL able to keep up with everything except Nvidia's overpriced top end, I know what's what. Nvidia hasn't really got a lead at all; they just create a reality where they have one. And in every place where they stop supporting that reality, you're left with a brick, given the importance of a tech like DLSS 3. And again, the problem is the proprietary approach, because tech like DLSS 3 is used to sell GPUs. It's absolutely silly that Ampere doesn't have access to it. What's next?

You've seen the first comments here now that a game isn't releasing with it. Drama.
Posted on Reply
#118
phanbuey
Vayra86When I place that next to an AMD that is really not changing its pace from the last, well, nearly ten years - I mean they still release new stuff slow as molasses, ever since the post-Hawaii XT era - and is STILL able to keep up with everything except Nvidia's overpriced top end, I know what's what. Nvidia hasn't really got a lead at all; they just create a reality where they have one. And in every place where they stop supporting that reality, you're left with a brick, given the importance of a tech like DLSS 3. And again, the problem is the proprietary approach, because tech like DLSS 3 is used to sell GPUs. It's absolutely silly that Ampere doesn't have access to it. What's next?
In quite a few cases the 4070 Ti plays games like Hogwarts Legacy, Atomic Heart, Cyberpunk, etc. better than a 7900 XTX in real life. Why? Because the Nvidia nerd just enables DLSS 2/3, sets it to Balanced, and BOOM - the game plays smoother and looks better than it does on the 7900 XTX, no matter what settings the AMD owner uses. How do I know? I just built a 4070 Ti / 5800X3D upgrade rig for a friend, and another 12600K / 7900 XTX mini-ITX build... and those were the games I happened to be testing with at the time.

The 7900 XTX is a super powerful card and a MUCH better card in raw stats and raster, but technology is a thing - you can get to a good gaming experience without brute force alone.

That's why there's so much drama in this thread: Nvidia's software shenanigans actually work well, and when we're forced to use raster only or (god forbid) FSR, it's a big deal for people who use Nvidia, because it materially degrades the gaming experience, simply because AMD can't compete with their vaseline-smear upscaler.

I'm not mad at AMD for what they did; I'm just generally mad that I'm probably going to have to subject my eyeballs to FSR if I can't get the FPS. Hopefully they do a good job like in Cyberpunk, so it's not too bad.
Posted on Reply
#119
kapone32
phanbueyIn quite a few cases the 4070 Ti plays games like Hogwarts Legacy, Atomic Heart, Cyberpunk, etc. better than a 7900 XTX in real life. Why? Because the Nvidia nerd just enables DLSS 2/3, sets it to Balanced, and BOOM - the game plays smoother and looks better than it does on the 7900 XTX, no matter what settings the AMD owner uses. How do I know? I just built a 4070 Ti / 5800X3D upgrade rig for a friend, and another 12600K / 7900 XTX mini-ITX build... and those were the games I happened to be testing with at the time.

The 7900 XTX is a super powerful card and a MUCH better card in raw stats and raster, but technology is a thing - you can get to a good gaming experience without brute force alone.

That's why there's so much drama in this thread: Nvidia's software shenanigans actually work well, and when we're forced to use raster only or (god forbid) FSR, it's a big deal for people who use Nvidia, because it materially degrades the gaming experience, simply because AMD can't compete with their vaseline-smear upscaler.

I'm not mad at AMD for what they did; I'm just generally mad that I'm probably going to have to subject my eyeballs to FSR if I can't get the FPS. Hopefully they do a good job like in Cyberpunk, so it's not too bad.
How do you go about proving that?
Posted on Reply
#120
Dr. Dro
Vayra86I don't think you're trolling, and I totally get your mindshare / trust built over time perspective too, I've been in that place with Nvidia until the moment GTX turned into RTX. That is frankly the moment GPUs all went to shit in steady paces. We're paying a massive price for technologies that to this date have questionable purpose and eat costly die space.

What I'm seeing today is an Nvidia that is readily gearing up to create a forced upgrade path that matches their newly timed release cadence, whereas during GTX, they just had the 'best options available' most of the time and the release cadence meant the market was nicely populated. Now, it isn't and when we do get a new gen, the improvements are lackluster; 3060 > 4060 is a complete joke, and note, this is that vast midrange we're talking about. The added technologies were never as influential as they are today. The problem however then is the proprietary approach combined with Nvidia's track record.

When I place that next to an AMD that is really not changing its pace from the last, well, nearly ten years; I mean they still release new stuff slow as molasses ever since the post Hawaii XT era; and is STILL able to keep up to everything except Nvidia's overpriced top end, I know what's what. Nvidia hasn't really got a lead at all, they just create a reality where they have one. And in every place where they stop supporting that reality, you're left with a brick given the importance of a tech like DLSS3. And again, the problem is the proprietary approach, because tech like DLSS3 is used to sell GPUs. Its absolutely silly Ampere doesn't have access to it. What's next?

You've seen the first comments here now that a game doesn't release with it. Drama
And we are in resounding, complete agreement about this!
Posted on Reply
#121
phanbuey
kapone32How do you go about proving that?


Raw raster - Hogwarts Legacy...

Now look at the difference with DLSS 3 on vs. off... You can run 4K with RT, no issues. At 4K it smashes the 7900 XTX by 40 FPS with DLSS 3 alone.
Hogwarts Legacy - DLSS 3 test @ INNO3D RTX 4070 Ti | 160W TDP limit - YouTube

Now turn on DLSS 3, and you get over 150 FPS on the 4070 Ti.

Or you can build the two rigs and see for yourself.

Or let's do Atomic Heart - native TAA vs. DLSS vs. FSR:
Atomic Heart: FSR 2.2 vs. DLSS 2 vs. DLSS 3 Comparison Review | TechPowerUp

"Speaking of FSR 2.2 image quality, the FSR 2.2 implementation comes with noticeable compromises in image quality—in favor of performance in most sequences of the game. We spotted excessive shimmering and flickering in motion on vegetation, tree leaves and thin steel objects, which might be quite distracting for some people."

"DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect almost doubled performance at 1440p and 1080p, and during our testing, overall gameplay felt very smooth and responsive, we haven't spotted any issues with input latency."

^ from TPU reviewers.

I've played on both, and I can tell you there are quite a few games where the 4070 Ti outright smashes the 7900 XTX in gaming experience, thanks to the settings it allows, purely due to DLSS. And in DLSS 2-only games, with no frame gen, DLSS 2 Balanced still looks better than any FSR 2 Quality, so you're basically getting the same performance at better IQ.
Posted on Reply
#122
kapone32
phanbuey

Raw raster - Hogwarts Legacy...

Now look at the difference with DLSS 3 on vs. off... You can run 4K with RT, no issues. At 4K it smashes the 7900 XTX by 40 FPS with DLSS 3 alone.
Hogwarts Legacy - DLSS 3 test @ INNO3D RTX 4070 Ti | 160W TDP limit - YouTube

Now turn on DLSS 3, and you get over 150 FPS on the 4070 Ti.

Or you can build the two rigs and see for yourself.

Or let's do Atomic Heart - native TAA vs. DLSS vs. FSR:
Atomic Heart: FSR 2.2 vs. DLSS 2 vs. DLSS 3 Comparison Review | TechPowerUp

"Speaking of FSR 2.2 image quality, the FSR 2.2 implementation comes with noticeable compromises in image quality—in favor of performance in most sequences of the game. We spotted excessive shimmering and flickering in motion on vegetation, tree leaves and thin steel objects, which might be quite distracting for some people."

"DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect almost doubled performance at 1440p and 1080p, and during our testing, overall gameplay felt very smooth and responsive, we haven't spotted any issues with input latency."

^ from TPU reviewers.

I've played on both, and I can tell you there are quite a few games where the 4070 Ti outright smashes the 7900 XTX in gaming experience, thanks to the settings it allows, purely due to DLSS. And in DLSS 2-only games, with no frame gen, DLSS 2 Balanced still looks better than any FSR 2 Quality, so you're basically getting the same performance at better IQ.
Yep, three games. So let's look at a review, and you tell me how you feel. I too have had both, and I know that the 7900 XT and XTX are plenty fast enough to drive my monitor. Then let's look at pricing.

www.techpowerup.com/review/gigabyte-geforce-rtx-4070-ti-gaming-oc/24.html

www.newegg.ca/asus-geforce-rtx-4070-ti-tuf-rtx4070ti-12g-gaming/p/N82E16814126607?Description=4070TI&cm_re=4070TI-_-14-126-607-_-Product

www.newegg.ca/asrock-radeon-rx-7900-xt-rx7900xt-pg-20go/p/N82E16814930083?Description=7900XT&cm_re=7900XT-_-14-930-083-_-Product

I like raw performance, as much as people love to talk about DLSS. Sapphire had an upscaler in TriXX before any of these were available, but that does not matter. I love my 7900 XT, and so does my FV43U. DLSS, Frame Generation and RT mean absolutely nothing to me. I do know that more VRAM is better than less VRAM in the long run, though.
Posted on Reply
#123
phanbuey
kapone32Yep, three games. So let's look at a review, and you tell me how you feel. I too have had both, and I know that the 7900 XT and XTX are plenty fast enough to drive my monitor. Then let's look at pricing.

www.techpowerup.com/review/gigabyte-geforce-rtx-4070-ti-gaming-oc/24.html

www.newegg.ca/asus-geforce-rtx-4070-ti-tuf-rtx4070ti-12g-gaming/p/N82E16814126607?Description=4070TI&cm_re=4070TI-_-14-126-607-_-Product

www.newegg.ca/asrock-radeon-rx-7900-xt-rx7900xt-pg-20go/p/N82E16814930083?Description=7900XT&cm_re=7900XT-_-14-930-083-_-Product

I like raw performance, as much as people love to talk about DLSS. Sapphire had an upscaler in TriXX before any of these were available, but that does not matter. I love my 7900 XT, and so does my FV43U. DLSS, Frame Generation and RT mean absolutely nothing to me. I do know that more VRAM is better than less VRAM in the long run, though.
Number of Games Supporting NVIDIA DLSS Crosses 300: DLSS 3 Now in 33 Games | Hardware Times

Yep, three games that I played. Reviews purposely stay away from DLSS comparisons (especially FG), or the Radeon cards get crushed by 40% and it's "not fair".

Here's from one of the reviews:


Now let's see an example of the impact of DLSS in that game:


I mean, that's great - congrats - the 7900 XT is a good card, and the RAM will definitely last longer than the 12 GB on the Ti. That doesn't change the fact that the Ngreedia tech is good and usually works, and not having it enabled in games is kind of disappointing.

In reality, if you have the 4070 Ti and you're playing any of those 33 games that support DLSS 3 at 4K (or the 290 that support 2.0), you're playing with it on. Reviews won't show that, and it can really mislead people on the overall experience.
Posted on Reply
#124
kapone32
phanbueyNumber of Games Supporting NVIDIA DLSS Crosses 300: DLSS 3 Now in 33 Games | Hardware Times

Yep, three games that I played. Reviews purposely stay away from DLSS comparisons (especially FG), or the Radeon cards get crushed by 40% and it's "not fair".

Here's from one of the reviews:


Now let's see an example of the impact of DLSS in that game:


I mean, that's great - congrats - the 7900 XT is a good card, and the RAM will definitely last longer than the 12 GB on the Ti. That doesn't change the fact that the Ngreedia tech is good and usually works, and not having it enabled in games is kind of disappointing.

In reality, if you have the 4070 Ti and you're playing any of those 33 games that support DLSS 3 at 4K (or the 290 that support 2.0), you're playing with it on. Reviews won't show that, and it can really mislead people on the overall experience.
I am so glad I watched the MSI Gaming livestream this week. They showed DLSS 3 with Frame Gen, and the person playing could not shoot anyone in an FPS, and admitted to the floaty feeling and lag that those "innovations" introduced into the game. If you like them, good for you. I spent my money on VRAM, as the 127 FPS that Hitman 3 shows is perfectly smooth. Then I have an X3D chip for the 1% lows, so I am golden.
Posted on Reply
#125
Dr. Dro

The official stance is the "no comment" card now. GN inquired and got deflected with something that all but amounts to a yes.

I suspect that the bad PR may yet result in all techs being released for this game. I really, really, really mean it when I say I'm interested in Starfield; I'm pre-ordering the Premium Edition as soon as the first previews go live, and I've already started amassing the coins in my Steam wallet.

As I mentioned earlier, if this game is particularly strong on Radeon, I may even be willing to flip my 3090 and get an XTX, however bad a deal that could be otherwise. I already upgraded my CPU in anticipation of it, after all.
Posted on Reply