
GPU for 4K

If they are old-school dedicated G-Sync modules, then FreeSync won't work, no. FreeSync (and newer G-Sync) use the proper VESA VRR standard. Original G-Sync was some kind of bastard FPGA module that worked only with Nvidia cards.

What exact model of monitors do you have?
Alienware AW3418DW
 
No, that uses a native, proprietary G-Sync module.
You'll need to get an Nvidia card if you want to use VRR/G-Sync at all:

[attached screenshot]

A 3070 is roughly the same performance as your dead 2080 Ti, and they regularly sell for under $400 on eBay. I don't know what country you're in; I'm in the UK and picked one up for £350 in October last year.
 
Goodbye to the possibility of buying a Radeon, then.
Shit, thank goodness I remembered this detail, otherwise I would have bought a card that would have been stuck at 50 Hz.
 
If you want 100 fps in all of today's games at 4K ultra settings, nothing exists that will do that yet. The 4090 gets close, but there are a handful of games, like BL3, CP2077, Dying Light 2 and Warhammer III, in which even a 4090 cannot achieve 100 fps averages.


You were right that now is a bad time to buy a new GPU; we're right in the middle of a generational transition. The midrange cards that usually shift the price/performance curve for the whole market aren't out yet, and the halo cards are still shockingly expensive, carrying an early-adopter tax.

If I were in your situation, I'd pick up a used 3070 on eBay as a stopgap and reassess the new-GPU situation in 6-12 months. You're unlikely to lose much money on a 3070; if you buy one now for under $400, you should be able to get the majority of your money back by selling it on again in 6+ months.
Unfortunately, waiting won't help. Given the prices of the SKUs released so far, there's very little hope for the SKUs to follow. There's a good chance they'll be priced at $500 and above, with performance still a wildcard.
 
Now I would have to choose between the 4070 Ti and the 4080.
The x90 is way too expensive.
 
I've always felt the x90 is more of a joke or a dare, not something to actually consider buying.
If I had to, I'd probably go for the 4070 Ti, but that depends on what prices you can get locally and how long you want to hold on to the card. Remember, either way you get DLSS3, which will boost frame rates significantly. You can use that at least for the occasional poorly coded game.
 
Well, the 2080 Ti lasted me three and a half years.
The last Cold War I played in 4K with ultra details at 85-90 fps, obviously RT off. I played the Cyberpunk campaign at 3440×1440, everything ultra, always RT off, at over 100 fps. That result is what I'm hoping and waiting for from the next GPU, so I sincerely hope that one of the x70 or x80 cards will give it to me.
 
Well, TPU reviews say the 4070 Ti does an average of 99 fps at 4K. DLSS3 would certainly push most titles above that, but the caveat is it's all measured with RT off. I'd be really curious to see RT in action. I know it's still more of a tech demo, but down the road I expect titles will start making use of more rays and yield more stunning results (then again, I'm kind of a graphics freak like that).
 
Compared to the 4080, is the price/performance ratio better?
 
It is, according to this:
[Chart: performance per dollar, 3840×2160]


But I guess it depends on your local prices (and availability), really.
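If it helps, the arithmetic behind a chart like that is simple; here's a minimal sketch (the performance and price figures below are illustrative placeholders, not TPU's measured data):

[CODE]
# Minimal sketch of how a performance-per-dollar comparison is derived:
# relative 4K performance divided by price, normalized to one card.
# The numbers below are illustrative placeholders, not TPU's data.

cards = {
    # name: (relative 4K performance, price in USD) -- assumed values
    "RTX 4070 Ti": (100, 800),
    "RTX 4080": (123, 1200),
}

baseline = cards["RTX 4070 Ti"][0] / cards["RTX 4070 Ti"][1]
for name, (perf, price) in cards.items():
    ppd = perf / price
    print(f"{name}: {ppd / baseline * 100:.0f}% performance per dollar")

# With these placeholders: 4070 Ti = 100%, 4080 = ~82%, i.e. the 4080's
# performance lead is smaller than its price premium.
[/CODE]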
 
I live in Brazil, so prices are very high, with import taxes over 30%. Don't make the mistake of comparing prices with the United States or Europe.

The available GPUs are:
4070 Ti Gigabyte OC: $1,490
4070 Ti MSI Ventus: $1,300
4070 Ti TUF: $1,620

4080 Gigabyte OC: $2,013
4080 TUF: $2,100
4080 PNY: $1,800

This PNY, I really don't know what the fuck kind of brand it is.
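For a sense of where those numbers come from, here's a rough landed-cost sketch; the 30% duty and 15% retailer margin are assumptions for illustration, not Brazil's actual tax schedule:

[CODE]
# Hypothetical landed-cost estimate: US MSRP inflated by import duty
# and retailer margin. Both rates are illustrative assumptions.

def landed_price(msrp_usd, import_tax=0.30, margin=0.15):
    return msrp_usd * (1 + import_tax) * (1 + margin)

for name, msrp in [("RTX 4070 Ti", 799), ("RTX 4080", 1199)]:
    print(f"{name}: ${landed_price(msrp):,.0f}")

# ~$1,190 and ~$1,790 respectively -- already in the ballpark of the
# listings above, before currency swings and local markup.
[/CODE]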
 
I know, I'm not in the US, so I know the situation with prices (if MSRP sucks, we get additional suckage on top of that) :(
PNY has been around for ages. Nothing too spectacular, but not something I'd avoid either.
 
There are also Zotac, Palit and Galax, but even in this case I don't have the slightest idea of the quality.
 
Zotac is also average, AFAIK. Palit can range from very good to poor. I have no experience with Galax, but they're owned by Palit, so probably the same thing.
 
There is nothing to object to; for all intents and purposes, the 4070 Ti is a 3090 Ti on a 192-bit bus with 48 MB of cache that has to saturate the same 40 TFLOPS, therefore in 4K it just craps up.
You are just spouting whatever theory is in your head without demonstrable evidence. Post a link or review analysis showing that is happening instead of just making things up.
 
Demonstrate what? That where the 3090 Ti has 40 TFLOPS of processing power, the 4070 Ti also has 40 TFLOPS, so it has to feed the same(ish) processing power with half the memory bandwidth? That doesn't need a demonstration, it just needs reading the specs.
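To put numbers on that, here's a quick back-of-the-envelope check using the publicly listed spec-sheet figures (shader counts, boost clocks, bus widths and memory data rates; all approximate):

[CODE]
# Rough FLOPS-vs-bandwidth comparison from public spec-sheet numbers.
# FP32 throughput = shaders * 2 (an FMA counts as two ops) * boost clock.
# Memory bandwidth = (bus width / 8) * effective data rate.

def fp32_tflops(shaders, boost_ghz):
    return shaders * 2 * boost_ghz / 1000

def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

cards = {
    # name: (shaders, boost GHz, bus bits, memory Gbps) -- approximate
    "RTX 3090 Ti": (10752, 1.86, 384, 21),
    "RTX 4070 Ti": (7680, 2.61, 192, 21),
}

for name, (sh, clk, bus, rate) in cards.items():
    tf = fp32_tflops(sh, clk)
    bw = bandwidth_gbs(bus, rate)
    print(f"{name}: ~{tf:.0f} TFLOPS, {bw:.0f} GB/s, "
          f"{bw / tf:.1f} GB/s per TFLOP")

# RTX 3090 Ti: ~40 TFLOPS, 1008 GB/s (~25 GB/s per TFLOP)
# RTX 4070 Ti: ~40 TFLOPS,  504 GB/s (~13 GB/s per TFLOP)
# Caveat: the 4070 Ti's much larger L2 cache absorbs part of that
# deficit, which this raw ratio doesn't capture.
[/CODE]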
 
You seem out of the loop and unaware of what to reference in your reply.

Demonstrate this, which was part of the quote that seemed to fly over your head...: "therefore in 4K it just craps up."

and his earlier statement: "4070 Ti is a 1440p card that sometimes suffers 10-20% penalty in 4K depending on the title because of its 192 bit bus, RX 6800 is the minimum requirement ~$500"

Reference actual case scenarios reported in reviews, not spout unsubstantiated theory without evidence backing it up.
 
No need for that tone. As shown in TPU's reviews, the 4070 Ti performs the same as the 3090 Ti at FHD and QHD, and it falls behind at UHD. Do you need more proof on top of that?
 
But if I'm not mistaken, it loses in temperatures and power consumption.
 
Yes, it seems to be doing all that while drawing half the power.
 
You keep missing vital stuff in the quotes...: "RX 6800 is the minimum requirement" ... suggesting it performs better at 4K than the 4070 Ti? o_O

But apparently he edited his original post to include this:
"Or you could wait the cutdown 4070 sitting between 3070 Ti and 3080 that is rumored to soon enter production." ????
 
Of course it doesn't, even if you don't factor in DLSS3 and RT. Then again, if you can get the 6800XT for about half the price...
 
That has nothing to do with the original requirements of the OP, who has a $1k budget and wants better 4K performance than what the 6800 XT can offer.
 
The whole thing started with my preference for the 6800 over the 6750; it wasn't about "192-bit bad" or "AMD better".

The worst case is Civilization, but you can see that even the 4080 is failing there, so it's unclear why.

[attached benchmark chart]
 