
Should I return to nVidia?

Processor: Ryzen 9 5900X
Motherboard: MSI X570S
Cooling: 240 mm AIO
Memory: 32 GB G.Skill @ 3200 MHz
Video Card(s): RX 6800 XT
Storage: Many
Display(s): Two QHD
Power Supply: SilverStone HELA 1200W
Software: Windows 11
Hi

The reason I am asking: I want to get an RX 7800 XT, but I can't find it anywhere locally and I can't buy it online right now. Only new Nvidia cards are available locally, but I hate Nvidia's drivers, while I really like Adrenalin, and features like Fluid Motion Frames gave me free FPS. I saw that in some games (Hogwarts Legacy) the graphics aren't that good because ray tracing is slow on my 6800 XT, so I'm confused: should I go with Nvidia, or wait until the new AMD cards appear, probably in several months?
 
I'd also wait for the Nvidia RTX 5xxx series. Even if you don't buy the latest Nvidia card, prices on other GPUs will come down.
 
If RT is important, you'd definitely be better off overall with Nvidia, and DLSS frame generation is superior to Fluid Motion Frames, at least in every game I've used both in.

Which Nvidia cards are available to you, and which games you want to use RT in, would be the more important deciding factors. Anything below a 4070 Super is pretty meh, even if Nvidia's cards are a bit stronger at RT and have better upscaling than their Radeon counterparts.
 
Ray tracing looks worse in Hogwarts: things like reflections are blurred out. Turn off ray tracing in Hogwarts, all of it...
More FPS, and it looks and plays better... no need for a new video card yet. Plus you can use AMD's Fluid Motion Frames, or frame generation via a mod for Hogwarts and other games that have DLSS 2.0: free frames for everyone...
If you want MORE frames, lol, you can always buy the Lossless Scaling app on Steam for a small one-time price...
I run Nvidia DLDSR (deep learning super resolution) + DLSS Quality in-game + upscaling from the Lossless Scaling app + 2x frame generation.
You could get 3x frame generation, if you wanted, with Fluid Motion added to the mix...
I get 120 stable FPS in Hogsmeade @ 1440p, no dips, just with my 8700K, 2080 Ti and 32 GB of DDR4.
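(For anyone wondering how those "free frames" add up, here is a minimal back-of-the-envelope sketch. The base frame rate and speed-up factors below are hypothetical illustrations, not benchmarks from any particular card or game.)

```python
# Rough arithmetic behind stacking an upscaler with frame generation.
# All figures are hypothetical examples, not measured results.

def upscaled_fps(native_fps: float, speedup: float) -> float:
    """Frame rate after upscaling (quality-mode DLSS/FSR often lands around 1.3-1.6x)."""
    return native_fps * speedup

def displayed_fps(rendered_fps: float, fg_multiplier: float) -> float:
    """Frames shown on screen after frame generation; input latency still
    tracks the rendered frame rate, not the displayed one."""
    return rendered_fps * fg_multiplier

native = 45.0                         # hypothetical native 1440p frame rate
rendered = upscaled_fps(native, 1.4)  # ~63 FPS rendered with quality upscaling
shown = displayed_fps(rendered, 2.0)  # ~126 FPS displayed with 2x frame generation
print(f"rendered ~{rendered:.0f} FPS, displayed ~{shown:.0f} FPS")
```

This is why a stacked setup can report 120+ FPS on older hardware: the displayed number doubles or triples, but responsiveness still follows the rendered rate, so a 3x multiplier on a low base frame rate looks smoother than it feels.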
 

Were you running low raytracing settings perhaps? It's not normal for RT to look blurred/disfigured unless the ray count is very low or there's a driver error.


If you bought the 7800 XT you'd be bitterly disappointed; it's on par with or slower than the 6800 XT you already have. You only really have two options from here on out if you're targeting RT: go NVIDIA, or buy the 7900 XTX. It seems to me that the GPU of choice for you should be the RTX 4080 SUPER, but if you can't afford that, a 4070 Ti SUPER should also be a solid choice.
 
Keep the 6800 XT, wait for the RTX 50xx.
 

Agree with this for sure. Anything below a 4080 Super will probably feel pretty meh. The 6800 XT is still super solid for rasterized performance and light RT.
 
Up to you.

I know this is an alien concept for many people here at TPU, but you aren't forced to only pick one. You can own multiple computers, even multiple GPUs from multiple manufacturers. It really comes down to what you are trying to accomplish, your budget (which you unhelpfully don't mention), and your willingness to swap graphics cards.

Remember that ALL PC hardware is a compromise, whether it's hardware features, software features, price, power consumption, heat, noise, size, weight, durability, aesthetics, et cetera.

In fact, most of life's decisions are compromises, not just PC component purchasing.
 

Graphics are good in Hogwarts Legacy without the need for ray tracing. We are not yet at the point where everyone can agree that RT is purely an improvement without downsides. In most games it's a stylistic change at best, and it always comes with a massive performance hit regardless of which brand you choose.

You have a 6800 XT, so you can enable RT and judge for yourself how significant it is to your eyes. If you like the effect, then you can decide whether it's worth upgrading to an Nvidia card for the additional RT performance.
 
Was looking for just a 4070 for QHD, to suit my budget.
Personally I'd go for the 4070 Super over the stock 4070; it's a fair bit quicker and usually only around 50 more.

I have the stock 4070 myself, but if I was buying now it'd definitely be the Super variant.
 

I reckon you won't be happy with the RTX 4070 if you already have a 6800 XT. It's about identical in performance to your card, with ever so slightly better RT. Not worth the effort. Save a little more, IMO.
 
OK then, I will wait... thanks a lot to all.

 
Generally, Nvidia drivers are considered to be more stable. I know in the Vista era they were bad, but they have generally been solid ever since. Admittedly, I have not used an AMD GPU before; I did use them when they were called ATI. AMD drivers do seem to be a good bit better than they were in the past, though.

If you're looking for ray tracing, you'll want to go Nvidia. Nvidia is better at ray tracing, and DLSS looks better than FSR. Regardless of Nvidia or AMD, you'll more than likely have to use either DLSS or FSR for good frame rates. So if you use ray tracing, plan on using DLSS or FSR.

I would look at an RTX 4070 Super if you're interested in ray tracing, or an RTX 4070 Ti Super. However, in raster, the 4070 Super isn't quite worth the upgrade over a 6800 XT, IMO. As others said, it is best to wait until the next generation. Unless you can afford a 4080 Super or better, I feel like outside of ray tracing you would not be getting a worthwhile upgrade. There is also the RX 7900 XT, which is a decent upgrade in raster but seemingly falls behind the 4070 Super in ray tracing, so even if you wanted to stay with AMD, it's not quite worth the upgrade.

Generally, the 6800 XT is still quite fast in raster. I would just turn down ray tracing; in some games it isn't really worth using. Cyberpunk 2077, Control, and Alan Wake 2 are exceptions.
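(As a rough illustration of what "plan on using DLSS or FSR" means in render cost at QHD, here is a small sketch of the approximate internal resolutions the common presets target. Both upscalers use similar per-axis scale factors; treat the exact ratios as approximate, since they vary slightly between versions.)

```python
# Approximate internal render resolution for common DLSS/FSR presets at 2560x1440 (QHD).
# Scale factors are per-axis and approximate; exact values differ slightly by version.

PRESETS = {
    "Quality":           0.667,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 0.333,
}

output_w, output_h = 2560, 1440  # QHD output resolution
for name, scale in PRESETS.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{name:<17} renders ~{w}x{h}, upscaled to {output_w}x{output_h}")
```

Rendering fewer pixels is where most of the ray tracing headroom comes from; the upscaler then reconstructs the full-resolution image.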
 

Definitely your best bet. I'm hoping Nvidia has learned from Ada that gamers won't just buy anything they spit out, and that whatever sits in the $600-ish range is an actual generational leap, not just over your 6800 XT but also over the current-generation cards.
 
I could be wrong, but depending on how they do in the AI market when it really starts taking off, I think Nvidia is going to have a smaller footprint in the consumer graphics card market, or possibly even abandon it altogether at some point in the future so they can dominate the AI market. The amount of money they could potentially gain from it vs. consumer graphics cards would be like comparing Microsoft to a small family-owned business. AI is going to become very big. The only question is who's going to be the new Bill Gates running it.
With that in mind (and yes, I could still be wrong), at some point I don't expect them to produce as many different models as they have in the past, so maybe it will be cheaper low- and mid-tier cards with more expensive high-tier/flagship cards.
 
Personally, I'd take even an RX 6300 over an RTX 4090 if I got one of those for free. I just don't want to support that unimaginably greedy company in any way anymore, and the way they treat consumers these days is just ridiculous.
 
I love these topic titles; they sound like YT videos!
 
No chance.

Consumer AI is already a thing.

Besides, gaming is pushing towards AI features, so native acceleration trumps cloud-based latency.

Since the GTX 1650 / RTX 2060 generation, all cards have been RTX, which means inherent AI hardware; I see no chance of that changing or of NVIDIA withdrawing from the consumer GPU market.

Cards may be resegmented like with RTX Ada, but performance is what matters, so I don't really care if the 4060 is made on what would typically be a 4050-class die, etc. Big dies like the 4090 make more money being sold to enterprise under Quadro, sure, but that's just reality, and it's why prices at the high end are so high these days.

As you said, things will be on smaller dies and get more expensive; that's true.
 

Both of them are unimaginably greedy and think only of their bottom line. Their allegiance lies solely with the shareholders.

Besides, AMD's the one with the $5,000 consumer-grade CPU, after all. And yeah, while it is supremely powerful, I guess you could say the same about each of Nvidia's $3,000 Titans, which have long since disappeared.
 
Were you running low raytracing settings perhaps? It's not normal for RT to look blurred/disfigured unless the ray count is very low or there's a driver error.



I watched a YouTube clip where they showed the settings next to each other; reflections on the floor and so on all looked blurry with ray tracing.
So I tested it out and, what do you know, it all looked blurry, so I turned it off. There wasn't an option that wasn't blurry.
 
Yes, but AI is still considered to be in its infancy, which is why I raise the question of whether there will be a split between graphics and AI. When I say "split", I don't mean that graphics will lose its dependency on AI, but rather that it will reach a point where the two will have to be distinct with respect to hardware (i.e. having to buy an RTX 6090 GPU + an RTX 6090 APU).
 
I doubt it, tbh. There are already AI-focused GPUs, but they're still GPUs. Dedicated AI cards exist, but they're not great because you can't really use them for anything else. CPUs are getting NPUs integrated on desktop, as they have been in mobile for a long time. I doubt the GPU will split into two products for the consumer.
 