
Veteran gamer that never owned an NVidia card needs help to choose

At this price tier I would say that the difference in price is only a major factor between GPUs if one will make it within budget while another won't. Otherwise, go with the minimum viable product for what you want.

I'm used to not considering RT and only lightly upscaling (~1080p -> 1440p) in games where it's available, and I run Linux. For that use case, the RX 9070 XT is especially well suited, and even in cases like RT or upscaling/framegen it's not terrible. Between the two, I'd be inclined to buy the 9070 XT, even if its major selling point (value) is a moot point in your market.

However, if you do local AI work, you're a fiend for RT/upscaling/framegen, or you do any sort of productivity work (modeling, CAD, video production), the 5070 Ti outclasses the 9070 XT hands down. Some would consider those strengths alone well worth the extra ~$100.
 
What CAD programs need a 5070 Ti?
 
I was specifically thinking of 3D CAD applications. There are certain compute tasks that benefit from CUDA library support, the most obvious and common one being rendering. Nvidia also tends to get the most love in terms of software support in those apps, from what I've experienced.

That said, some people might be content to use something like Fusion, which moves the compute work off the local machine, and if that's your workflow, then CUDA isn't as important. Whether one GPU or the other is better comes down to the OP's own preferences and use case.
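To make the CUDA-for-rendering point a little more concrete, here's a minimal Python sketch of the kind of device selection a GPU renderer performs: use a CUDA device if one is visible, otherwise fall back to the CPU. The nvidia-smi query flags are real; visible_cuda_gpus and pick_render_device are made-up helpers for illustration, not any particular CAD package's API.

[CODE=python]
import shutil
import subprocess

def visible_cuda_gpus():
    """List (name, total_vram_mib) for NVIDIA GPUs that nvidia-smi can see, or [] if none."""
    if shutil.which("nvidia-smi") is None:
        return []
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (subprocess.CalledProcessError, OSError):
        return []
    gpus = []
    for line in out.strip().splitlines():
        # Each line looks like "NVIDIA GeForce RTX 5070 Ti, 16303"
        name, vram_mib = (field.strip() for field in line.rsplit(",", 1))
        gpus.append((name, int(vram_mib)))
    return gpus

def pick_render_device():
    """Prefer a CUDA-capable GPU for rendering, otherwise fall back to the CPU."""
    gpus = visible_cuda_gpus()
    if gpus:
        name, vram_mib = gpus[0]
        return f"CUDA:0 ({name}, {vram_mib} MiB)"
    return "CPU"

if __name__ == "__main__":
    print("Render device:", pick_render_device())
[/CODE]

On a Radeon card the same renderer would typically have to take a HIP/OpenCL or CPU path instead, which is why CUDA support keeps coming up in these apps.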
 
Well, which ones? I didn't know anything from Autodesk benefited from a 5070 Ti for example.
 
Guys, thanks for all the info.

I've got the Palit GeForce RTX 5070 Ti Gaming Pro NE7507T019T2-GB2031A for US$ 1206,
and a 5700X3D (replacing my 5700X)... I'll leave AM4 for AM6 or maybe AM7.

I've been an ATI fan since the start. This is my first Nvidia card, and it may be the last one.

PS: I only have a Full HD monitor... :)
 
Congrats on your purchase. The only thing is, as an ATI fan, you would have loved AMD's software.
 
Don't worry, you'll be replacing that monitor sooner than you think. High-refresh 1440p OLEDs and 4K monitors are coming out for $500 and $300 respectively this year, and prices are only going to go down from there!
 
Congrats, man. How does this 5070 Ti feel?
 
Why not a 7800 XT with 16 GB of VRAM? And even a 6750 XT plays the Stellar Blade demo with everything on high at 60 fps (with Denuvo, on the demo I guess). The 6750 has 12 GB of VRAM, just a bit too little, because at 1440p with everything maxed and 4K textures it requires 12.25 GB, but it seems to run OK.

The 7800 XT is around 680-700 Canadian… yes, a 9070 XT is better because it will play at 60 fps at 4K… but not in the budget, I presume… the 5060 Ti is "the same, a bit better, a bit worse" but more expensive for the 16 GB version…

Instead of a new GPU, I decided to go with the AOC Q27G3XMN 27" HDR monitor, which reviewed well at rtings.com…
 
Why not a 7800 XT?
• Pisspoor RT performance.
• Non-existent PT performance.
• No CUDA support.
• Obsolete upscalers.
• Only wins by a couple percent in raster compared to nVidia pricemates.
• Questionable energy efficiency.

RDNA2 and RDNA3 are essentially DOA, unless it's a 7900 XTX for much cheaper than a 9070.
 

I only question the logic here: why on God's green earth would someone who 1. can afford and 2. wants a card on the level of a 5070 Ti want a 7800 XT of all cards? You could argue for an RTX 3090 Ti: power-guzzling, but the same performance level, and it works on Windows 7 if that's a must. Or a 4070 Ti Super, if found for cheap. Maybe even a 7900 XT or XTX, depending on the price. But a 7800 XT? :confused: Why would anyone looking to buy in the 5070 Ti/9070 XT segment want a 7800 XT?
 
It's a really bad time to be buying a GPU now, honestly.

If you can wait... wait...
We've been practically in limbo since Ada, price/perf-wise, but the market will get saturated at some point. Availability on cards was so-so shortly after launch; it's only just now getting better.
 
I agree with Dro - it doesn't make sense to recommend something inferior when the OP clearly has the dosh to blow on something better, newer, faster and built to last longer. The only counter-argument would be that the intended workload barely tickles 50%+ utilisation, therefore avoiding unnecessary overkill, which would also suggest the OP is a noob and needs tech counselling...

I came from a long path starting with a Voodoo, ATI Rage Fury Pro, X600 XT, X800 GTO, X800 XT, X1950 XT, 3070, and an R9 something I don't even remember. My last acquisition was 6x Vega 64 and 3x RX 580 (yes, I was a miner) in 2017.

... judging by that, he's been around the block more than most of us - no counselling required, just quality choices that hit the budget sweet spot.
 
It's a really bad time to be buying a GPU now, honestly.
It'll only be marginally better, if at all, this decade. AMD made it crystal clear that stagnation is what they actually want by releasing what's essentially less than a couple percent of the non-fake-MSRP GPUs it would take to be enough.
NV made it crystal clear they won't do anything about it by admitting to working on AI rather than anything else. Intel made it transparent they can't do anything about it by being Intel.

So, from that perspective, it doesn't matter when you buy. It only matters how long you can afford to boycott them for.
 

24 hours late... https://www.techpowerup.com/forums/...card-needs-help-to-choose.337515/post-5529086

From what I'm seeing, Brazilians get shafted with premium price tags whether the dust settles or not. Reminds me of my holiday in Turkey, well before the pandemic/crypto/AI craze... insane prices!
 
AMD guys are eating pretty good; the past two gens of their cards have been great. They can definitely wait if they're just gaming. Can't say the same for NVIDIA, though: the 30 series is VRAM-starved and the 40 series was overpriced.
 
We've been practically in limbo since Ada, price/perf-wise, but the market will get saturated at some point. Availability on cards was so-so shortly after launch; it's only just now getting better.
Will the market get saturated, though? It doesn't feel like that'll happen anytime soon, especially considering that in order to saturate the market the vendors would need to produce more GPUs and, consequently, the AIBs more cards. And when faced with the choice between producing more AI accelerators or consumer cards, NV and, to a lesser extent, AMD obviously keep choosing the former. There's only so much fab allocation to go around, after all. I've pretty much given up on modern graphics-intensive gaming at this point - it's a fool's game: you pay top dollar for the hardware required to run piss-poor software. Miss me with that shit, honestly; might as well dig into the backlog and/or replay old classics. Smaller games run fine too.
 
Those prices are normal in Brazil. We have a flat 50% federal import tax. After that, my state adds a nice 28%. I think the stores are not paying all the taxes...
The cheapest 5070 will be US$875, several at 875-900, and they'll go way up to US$1100.

Considering Brazilian wages compared to most Western countries, that is absolutely nuts. How do Brazilians afford these cards?
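For what it's worth, those two tax rates roughly account for the US$ 1206 sticker further up the thread. A quick sketch, assuming the 28% state tax compounds on top of the already import-taxed price (the stacking order is my guess):

[CODE=python]
# Rough landed-price check for Brazil. Assumption: the 28% state tax is applied
# on top of the price that already includes the 50% federal import tax.
FEDERAL_IMPORT_TAX = 0.50
STATE_TAX = 0.28

def landed_price(base_usd: float) -> float:
    """Pre-tax price -> price after the federal import tax, then the state tax."""
    return base_usd * (1 + FEDERAL_IMPORT_TAX) * (1 + STATE_TAX)

if __name__ == "__main__":
    for base in (630, 749, 875):
        print(f"US$ {base} pre-tax -> about US$ {landed_price(base):,.0f} landed")
[/CODE]

Working backwards, US$ 1206 / 1.92 is about US$ 628, under the 5070 Ti's nominal US$ 749 MSRP, which fits the guess above that the stores aren't passing every tax through to the sticker.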
 
the past two gens of their cards have been great
I use a 6700 XT. It's been... underwhelming. RDNA3 is essentially the same stuff on steroids, and RDNA4 is "if you're very lucky, you might even get value similar to that of Nvidia GPUs."

I have absolutely no idea what is great about red GPUs. They don't achieve anything worthwhile.

VRAM? I run out of compute five gigabytes before VRAM becomes a concern, FWIW.
 
DLSS 4 Performance makes the 3080 fly, but AMD owners have been eating good while stuck with FSR 3. Parallel universes and all that.
 
Please no, make it stop.
 
@Raffles
Hey, the market is so bad that I say let everyone cope whichever way they want. If that's by creating amazingly optimistic fanfic scenarios - so be it. I myself like to pretend that the 5080 is actually 500 dollars and the 5070 is a barely cut-down 5080 for 350. I call it the "Pascal forevermore" timeline.
 