
NVIDIA GeForce RTX 4080 Founders Edition

  • I'm not going to budge from honouring my predetermined $800 GPU upgrade budget... not a dollar or cent more! Even $800 is a HUGE charitable sum of cash to throw at these famished beneficiaries, and not long ago this sort of price tag was touching flagship territory
Well said. Hopefully more consumers collectively say 'no' to Nvidia's ludicrous pricing. As long as some are willing to pay it, though, expect prices to just keep on rising. I enjoyed seeing how quickly they dropped the price of the 3090 Ti when consumers were smart enough not to buy it. We need to see a lot more of that, where people recognize a bad buy and opt out.
I haven't purchased a 'new' video card since 2017. I've just been picking up used gear on eBay, and that will continue for the foreseeable future.
 
That was on my mind too: the 7900 XT seems more like a 6800 XT replacement in terms of performance (estimated, we'll see). I don't know why AMD opted for the XTX branding, since it won't even be on par with the 4090. The real naming should have been 7900 XT and 7800 XT.
Because now the 7800 XT will sit behind a 7900 XT that will most probably lose to the 4080. Then this gen's x80 AMD GPU will be competing with Nvidia's x70 GPU.

Looks like AMD has accepted defeat right at the top... especially with an intimidating 4090 Ti/SUPER/whatever looming. So an RDNA3 top-of-the-range XTX from the get-go, and then smaller performance differentials per model as we go down the ladder, to keep up with the stronger challenge in the mid-segment competition. A bit of a fail with the XT sitting at $900... but I'm hoping the 7800-series will trail closely at a lower price point to compensate.

Well said. Hopefully more consumers collectively say 'no' to Nvidia's ludicrous pricing. As long as some are willing to pay it, though, expect prices to just keep on rising.
I haven't purchased a 'new' video card since 2017. I've just been picking up used gear on eBay, and that will continue for the foreseeable future.

SNAP! lol. I too purchased a 1080 Ti in 2017 and then waited a lifetime... not long ago I settled on a "used" liquid-modded 2080 Ti from a trusted source as a placeholder for the bigger and better. TBH, I'm not in frantic need of an upgrade; the current setup, without the on-screen FPS counter, does the job well. Only for a couple of GPU-heavy titles would I have fancied something a little stronger to comfortably hit 1440p's (144 Hz) 90-120+ fps performance target. With the 40 series/RDNA3 I'm expecting way more than that initial perf target... they say good things come to those who wait... they forgot the bit "only this time around it will cost you an arm and a leg, and if you haven't strapped your balls they'll take those too"... mine are strapped with 800 strings of manhood
 
As much as I don't like the prices of the 4080/4090, they would still be the cards I'd be looking at if I needed an upgrade right now.
Maybe you would happily swallow a +$500 price increase; most people, even if they want RT, won't.
 
Is there going to be a new GeForce driver for this release?
 
Nothing in this world can justify the callous prices of the 40x0 series right now. Nothing.
Also, there are currently ZERO quality games that require you to buy this generation to play them. Until good games on the latest UE5 engine are released, for example, there's no need for anything newer than the last generation of cards. Even then, UE5 is well optimized, so it doesn't require the latest and most expensive GPUs, even at 4K.
So, really, what is the purpose of these ridiculously overpriced cards in the end??
 
Very much appreciate the thorough review as always, W1z, and I agree with essentially all of your conclusion.

Price - yuck
Product - excellent and feature-rich
DLSS 3 - loads of potential
Efficiency - record-setting

Sure shows that Samsung 8nm held back Ampere, IMO. If it had been TSMC 7nm at the time, there would have been a bigger leap from Turing, but then a smaller leap from Ampere to Ada.

My biggest gripe is the physical size of these cards. Far out, they do not need to be that big; they're pretty much the same as all the 4090s, which means I'll have trouble fitting any in my case... ergh

Keen to see the 7900 series launch, allow a few months for the dust to settle, and re-evaluate whether I want to jump into this gen at all or wait for the RTX 50 / Radeon 8000 series.
 
I don't love the price, but I'm pleased to see they're giving it more than 10GB VRAM at >$1000, at least.

Man, I really hope there's some fierce competition between the AMD and Nvidia $400 GPUs. If not, these stupid goddamn prices are going to kill the recovering PC gaming market and drive us back a decade, to the days of console dominance when Microsoft and Sony had their exclusives, and if any of them EVER made it to the PC, they were shoddy afterthoughts of ports that ran like crap.
 
Eh, I can understand why the 5800X bottleneck was largely unknown at the time of the 4090 review. Comparing that review to other reviews with faster CPUs yielded interesting numbers that showed how the CPU was becoming far more important on higher-end rigs, which was a good learning from that review for those who cross-compared.

But this time? That word in the sentence above, 'learning', is supposed to mean something.

The usefulness of the data presented here as a "GPU" review becomes very limited due to running on a 5800X. You could just as well have done the review on a 10900K or a 12400 or some such.

This becomes a "if you buy a $1200 GPU and pair it with a $250 CPU, this is what happens" kind of scenario. It's one of those scenarios that barely exists, sort of like when Tom's buys an Asus Maximus and sticks DDR4-4800 in it at JEDEC speeds.

This is just not useful info.
 
Eh, I can understand why the 5800X bottleneck was largely unknown at the time of the 4090 review. Comparing that review to other reviews with faster CPUs yielded interesting numbers that showed how the CPU was becoming far more important on higher-end rigs, which was a good learning from that review for those who cross-compared.

But this time? That word in the sentence above, 'learning', is supposed to mean something.

The usefulness of the data presented here as a "GPU" review becomes very limited due to running on a 5800X. You could just as well have done the review on a 10900K or a 12400 or some such.

This becomes a "if you buy a $1200 GPU and pair it with a $250 CPU, this is what happens" kind of scenario. It's one of those scenarios that barely exists, sort of like when Tom's buys an Asus Maximus and sticks DDR4-4800 in it at JEDEC speeds.

This is just not useful info.
W1zz already stated he's building a new test rig with a 13900K after the 7900 XTX review in December. The 4K numbers are accurate, and so are the 1440p ones for the most part.
 
W1zz already stated he's building a new test rig with a 13900K after the 7900 XTX review in December. The 4K numbers are accurate, and so are the 1440p ones for the most part.

Yep, I saw that.

And then I posted anyway, because the data is still pretty close to useless.

Example: +38% vs +50%, 5800X vs 12900K:

[attachment: comparison chart]
 
Eh, I can understand why the 5800X bottleneck was largely unknown at the time of the 4090 review. Comparing that review to other reviews with faster CPUs yielded interesting numbers that showed how the CPU was becoming far more important on higher-end rigs, which was a good learning from that review for those who cross-compared.

But this time? That word in the sentence above, 'learning', is supposed to mean something.

The usefulness of the data presented here as a "GPU" review becomes very limited due to running on a 5800X. You could just as well have done the review on a 10900K or a 12400 or some such.

This becomes a "if you buy a $1200 GPU and pair it with a $250 CPU, this is what happens" kind of scenario. It's one of those scenarios that barely exists, sort of like when Tom's buys an Asus Maximus and sticks DDR4-4800 in it at JEDEC speeds.

This is just not useful info.
The amount by which the 4080 is faster than the 3090 Ti is fairly consistent at all resolutions: a small uptick of 3% from 1080p to 1440p and 2% from 1440p to 4K. It shows that the 5800X is barely bottlenecking the 4080 at 1440p and 4K.

Those numbers indicate that the 4080 would still be a decent upgrade for a system with something like a 5800X, 10700-10900K, or 12600-series CPU if you're playing at 1440p or 4K; a rough sketch of that check is below.
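For anyone who wants to run the same smell test on other pairings: if the newer card's percentage lead over an older one holds roughly steady as resolution rises, the CPU isn't the limiter. A minimal sketch in Python, where the lead values are illustrative placeholders shaped like the deltas quoted above, not the review's exact figures:

```python
# CPU-bottleneck smell test: track how much faster the new card is at
# each resolution. A lead that grows sharply with resolution means the
# lower resolutions were CPU-limited; a roughly flat lead means the CPU
# is barely a factor. Placeholder numbers, NOT the review's measurements.
leads = {"1080p": 20.0, "1440p": 23.0, "4K": 25.0}  # % faster than the old card

resolutions = list(leads)
for lo, hi in zip(resolutions, resolutions[1:]):
    delta = leads[hi] - leads[lo]
    verdict = "CPU-limited at the lower res" if delta > 10 else "barely bottlenecked"
    print(f"{lo} -> {hi}: lead changes by {delta:+.1f} points ({verdict})")
```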
 
Great card with an insane price tag; its sole purpose is to upsell the 4090.
 
Nothing in this world can justify the callous prices of the 40x0 series right now. Nothing.
Not even demand?
:mad::mad::mad::mad::mad::mad: at the 4090's MSRP, but buyers consider themselves lucky if they find the product for $300 over it. I can't believe that Nvidia limited production. They have almost two months in which they can sell a lot, and at high prices, until AMD comes in with a counteroffer.
 
  • PERFORMANCE: better than expected

  • EFFICIENCY: I guess it's fantastic by "current standards" and beats last gen!

  • RT: Looks great. Usually I don't like going below 90 fps in any title, hence I haven't bothered with RT at all. The 4080's RT performance is definitely a key factor adding to its appeal

  • NOISE/TEMPS: Looks great for an FE. I wouldn't have it any other way

  • PRICE: What a joke! It puts all these nice perks to shame! Wouldn't touch it with a 10-foot telescopic pole extended to ~25 ft. Not even with rubber-insulated gloves, palms and fingers coated in GPU-pandemic hand sanitizer

Maybe I'm just an uninformed/ignorant optimist... prior to actual availability I was hoping NVIDIA would come to its senses and slash the 4080 16GB's MSRP to fill the scrapped 4080 12GB's initial ask of $900. Yep, an unsound expectation, but it's hardly completely irrational. Not that I'd buy it - I'm not going to budge from honouring my predetermined $800 GPU upgrade budget... not a dollar or cent more! Even $800 is a HUGE charitable sum of cash to throw at these famished beneficiaries, and not long ago this sort of price tag was touching flagship territory. Now it seems we'd be lucky to see a mid-range XX7X card for this sort of budget ceiling. Even AMD's second-from-the-top 7900 XT @ $900 is a middle finger to the price-to-performance-conscious buyer... what a terrible year for upgrades, and the 2022 upgrade hype/fervour remains shot and crippled at the knees.
They cancelled the name "4080 12GB", not the price point.
You will still have your ~$900 NV GPU sooner rather than later.

Looks like AMD has accepted defeat right at the top... especially with an intimidating 4090 Ti/SUPER/whatever looming. So an RDNA3 top-of-the-range XTX from the get-go, and then smaller performance differentials per model as we go down the ladder, to keep up with the stronger challenge in the mid-segment competition. A bit of a fail with the XT sitting at $900... but I'm hoping the 7800-series will trail closely at a lower price point to compensate.



SNAP! lol. I too purchased a 1080 Ti in 2017 and then waited a lifetime... not long ago I settled on a "used" liquid-modded 2080 Ti from a trusted source as a placeholder for the bigger and better. TBH, I'm not in frantic need of an upgrade; the current setup, without the on-screen FPS counter, does the job well. Only for a couple of GPU-heavy titles would I have fancied something a little stronger to comfortably hit 1440p's (144 Hz) 90-120+ fps performance target. With the 40 series/RDNA3 I'm expecting way more than that initial perf target... they say good things come to those who wait... they forgot the bit "only this time around it will cost you an arm and a leg, and if you haven't strapped your balls they'll take those too"... mine are strapped with 800 strings of manhood
Still plenty of room for AMD to make a 500 W RX 7950 to match or beat the 4090/Ti. Put 32 GB of the fastest GDDR6 on it and call it a day.
 
"HUGE"
I have pointed out twice that it seems the test setup is running 2x8GB sticks when testing with the 5800X. This config does not benefit from dual rank the way a 2x16GB kit does.
In some games, up to ~8% higher min and max FPS can be seen with a dual-rank memory setup on Zen 3 versus a single-rank setup.

Also, notice that the 4x8GB 3800 CL18 kit mostly performs worse than the 4x8GB 3200 CL14 kit, except in RDR2. So 4000 CL20?! A 2x16GB 3600 14-14-14 kit would have been a better choice here.
 
oh wow .. I had to read that part 3 times trying to find the mistake, and I've read the conclusion several times; our proofreader and most of our team members have, too. fixing ..
It seems I have an eye for the odd ones out... ;)
 
Wait for it, the 4060 Ti looks like

What's the point if the 4060 Ti costs $699 and performs like a 3080?

...please don't say DLSS 3...

Since the new gen doesn't replace the cards below $1000, we are stagnating, and that's why the 3080's price hasn't moved two years after release.
 
49% uplift in 4K games over the RTX 3080 (from 81 to 121 average FPS), price uplift over 70%.

That's all you need to know.

Even a 50% price increase for a 50% performance uplift would have been a disappointment. This is just sad.

And it will affect the whole GPU scene. AMD is a minor player with limited ability to supply large quantities of cards, so even if they're competitive you can only expect similar prices, give or take a few percent. If not officially, then through scalping (by online scalpers, or even by the AIBs themselves).

And what's with the comments that the performance increase is something groundbreaking and justifies the price increase? GTX 980 to GTX 1080 was greater; RTX 2080 to RTX 3080 was greater - you can see it even in this review: the RTX 2080's average FPS at 4K is 50.2, the RTX 3080's is 81, a 61.4% uplift.

In fact, the only release with a worse generational performance increase in the last 10 years was the RTX 2080 - and that showed in Nvidia's revenue for the whole duration of the Turing generation.
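Those percentages are simple ratios, so anyone can recheck them. A quick sketch using the average FPS figures quoted above and the US launch MSRPs ($699 for the 3080, $1199 for the 4080):

```python
# Percentage uplift from old to new: (new / old - 1) * 100.
def uplift(old: float, new: float) -> float:
    return (new / old - 1) * 100

# 4K average FPS figures from this review, as quoted above.
print(f"RTX 3080 -> RTX 4080 perf:  {uplift(81, 121):.1f}%")    # ~49.4%
print(f"RTX 2080 -> RTX 3080 perf:  {uplift(50.2, 81):.1f}%")   # ~61.4%

# US launch MSRPs: $699 (RTX 3080) vs $1199 (RTX 4080).
print(f"RTX 3080 -> RTX 4080 price: {uplift(699, 1199):.1f}%")  # ~71.5%
```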
 
Then that would be a pro for every video card release ever. Cross-generational comparisons to determine efficiency don't make sense; you compare with the other cards in the gen. The GTX 1070 was 60-70 percent faster than the GTX 970, and I didn't see people saying "most efficient ever".
What? This is literally how these comparisons always work. What are you supposed to make efficiency comparisons against, except for other GPUs that exist? And you can't have paid much attention to those 1070 reviews. For the most part, they were saying things along the lines of "the performance improvement is great, but the efficiency improvement is nowhere near what Maxwell brought to the table".
 
I wonder how many leather jackets I can buy for $1000? I surely can't buy a decent card from Nvidia for that price. Might as well buy a decent leather jacket. Ain't that right, Jensen?
Ridiculous pricing for the card. Disgust and disappointment, turning into mocking laughter.
 
2.1 kg, a 3-slot brick, a shaky adapter, 1200+ lolcoins... and all it really is is shrunk Ampere with flashy new pictures of the die and new DLSS.

Hard pass. I don't understand this product.

Great review though, as usual!
As for the card, it's stupidly overpriced just like the 4090. And the new RT cores - what are they doing? The card loses performance similarly to Ampere; only the 4090 is a step up in this regard.
Again, wasted silicon.
As predicted, the low-hanging fruit is already gone. RT is brute force in the end, so there is only so much they can do without again sacrificing the IQ they were supposed to be improving with the technology, or sacrificing raster performance - and nobody in their right mind will cripple cards when there's competition in the high end, which is clearly happening even with RDNA2 already. The only way they get more RT perf now is by increasing the latency hit. Well, yay: console-latency gaming on your $1200+ gaming PC, enjoy the hidden peasantry. PCMR is becoming the PC Meme Race, a parody of itself by now. All you can really boast about is chasing the cutting edge of your wallet.

The gap in RT performance is negligible, especially at that high end; other factors are going to be more important for the most sensible upgrade path.

Early adoption, as always, has a price, and offers no real advantages.
 
49% uplift in 4K games over the RTX 3080 (from 81 to 121 average FPS), price uplift over 70%.

That's all you need to know.

Even a 50% price increase for a 50% performance uplift would have been a disappointment. This is just sad.
You can't measure it by the meter. At 4K, even 10 more fps can make you happy. You won't notice differences at 1080p or 1440p, where the counter exceeds 100 fps, but at higher resolutions it makes all the difference.
Of course, everyone has their own choice: play like crap at 4K, or invest in a video card that will give at least a decent experience.

2.1 kg, a 3-slot brick, a shaky adapter, 1200+ lolcoins... and all it really is is shrunk Ampere with flashy new pictures of the die and new DLSS.

Hard pass. I don't understand this product.
Except for the adapter, I can't understand you. Did you pay attention to the temperatures? They could have offered you a 0.5 kg flea running at 85-95 degrees, for the feel of it.
The heavy weight comes from the best-performing cooler a Founders Edition has ever had. Getting SSD-like temperatures at 300-400 W is priceless.
 
Then that would be a pro for every video card release ever. Cross-generational comparisons to determine efficiency don't make sense; you compare with the other cards in the gen. The GTX 1070 was 60-70 percent faster than the GTX 970, and I didn't see people saying "most efficient ever".
Not true - take a look at Turing vs Ampere below.

This is watts per frame. Again: Samsung 8nm. Ampere had efficiency upgrades in the architecture, but they served only to offset the node's characteristics while improving performance.
In the age of questionable node shrinks and the marketing around them, these things are changing.

[attachment: watt-per-frame chart, Turing vs Ampere]
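The metric itself is easy to reproduce: measured board power divided by average FPS. A minimal sketch, with made-up placeholder numbers rather than figures from any review:

```python
# Watts per frame = board power / average FPS (lower is better).
# Its inverse, FPS per watt, is the same comparison the other way up.
# Placeholder values below, NOT measurements from this review.
cards = {
    "older-gen card": (81.0, 320.0),   # (avg FPS, power draw in W)
    "newer-gen card": (121.0, 300.0),
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {watts / fps:.2f} W/frame, {fps / watts:.3f} FPS/W")
```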


Yep, I saw that.

And then I posted anyway, because the data is still pretty close to useless.

Example: +38% vs +50%, 5800X vs 12900K:

[attachment: comparison chart]
Relatively speaking, there's barely a difference; a faster CPU elevates all the numbers. Per-game gaps may be bigger, but they vary too much.
 