
NVIDIA GeForce RTX 3080 Founders Edition

New consoles are 10 GB for CPU+VRAM, your PC has 16 GB for CPU + 10 GB for VRAM, should be fine. Obviously no way to predict the future

Thing is though, consoles will use dynamic resolutions, checkerboarding and all sorts of compromises to get there. For someone wanting native 4K with highest settings, it is concerning.

A game requiring < 3.5GB of system memory and >10GB of VRAM sounds unlikely.

I don't think anyone is wondering whether or not a game is going to need more than 10 GB any time soon, because they won't. Running them at 4K with everything maxed out will most definitely cause >10 GB usage quite soon.

I fully expect not a single complaint from AMD fans

Dude, you are the only guy who keeps talking about AMD here, give it up. You are as cringeworthy as ever with your intentional flaming.
 
it is concerning
Not concerning imo, that's also why I didn't comment on it in the review. Not enough data anyway, let's talk again after Xmas. But history suggests memory is not an issue. Remember 3 GB and 4 GB cards? These are still fine for 1080p, data is on the summary pages
 
3080 not so powerful vs 2080 Ti as advertised...
typical ngreedia
 
Yep, this is the reason Nvidia held reviews until the day before release. Underwhelming compared to what was sold in their marketing show. Very underwhelming for a node shrink that has a much higher power draw.

Which 4K graph has made you think so? The RTX 3080's minimum fps is above the RTX 2080 Ti's average. I don't remember the last time we've had such a huge leap in performance. How is it "underwhelming"?

Maybe you could share some graphs to prove your PoV?

3080 not so powerful vs 2080 Ti as advertised...
typical ngreedia

It's exactly as powerful as advertised, if not better, at 4K, the resolution it was created for. Never seen so many AMD trolls in one discussion. Also, it costs $700, so not sure about greed. Also, no one forces you to buy it.

Speaking of greed:

[image attachment]
 
It's amazing how much flak NVIDIA receives for the RTX 3080, despite the fact that quite a lot of the games being reviewed are severely CPU limited even at 4K! You do not draw conclusions about a card's efficiency and performance based on poorly optimized games. I fully expect not a single complaint from AMD fans about the upcoming RX 6900, which will have the exact same issue, because no matter how fast you make your GPU, if it's constrained by the CPU, all your improvements are worth nothing.

I'd be extremely glad if @W1zzard created separate charts for performance and performance/power consumption only for the games whose performance scales linearly or near-linearly with resolution.
I can't see a single game in the TPU test suite that doesn't show significant scaling at 4k. Are you saying that these are still CPU limited? Based on what? Is there hidden data somewhere showing CPU usage for each GPU tested? Or some other review you've seen showing this that you haven't mentioned?
 
So, it's offering a 31.7% performance improvement over the 2080 Ti and a 68% improvement over the 2080 at the same price.
I would've liked to see more of a performance-per-watt improvement, but it is what it is, I guess.
Overall, the performance increase is fine and I'm happy with that.
The most impressive piece of this card is probably the cooler; a 2-slot card handling this 320 W beast with relative ease is certainly impressive.

And lastly it doesn't matter what Nvidia offers, be it a 70% improvement or a 3x improvement, crybabies will always complain.
 
Remember 3 GB and 4 GB cards? These are still fine for 1080p, data is on the summary pages

Maybe, but 1080p is long in the tooth by now so I don't think it's a good example.
 
I don't think anyone is wondering whether or not a game is going to need more than 10 GB any time soon, because they won't. Running them at 4K with everything maxed out will most definitely cause >10 GB usage quite soon.
Consoles generally don't run at Ultra settings though, as Ultra settings typically mean turning everything up to 11 despite the perceptible quality differences dropping off a good deal below that. I mean, if you insist on never ever lowering settings below Ultra, I get that this is a concern, but ... who does that? Why? 10 GB is not going to be an actual, real-life, relevant limitation for VRAM in the foreseeable future.
 
I am now thoroughly convinced the RTX 3080 is absolute overkill for my 1440p monitor.

At 60Hz, yes.

My 1440p G-Sync monitor goes up to 165 Hz; I wouldn't say it's overkill for that, it depends on the game.
 
Good and clear review as usual from TPU. I particularly love the 1080p/1440p/4K comparison because it's everything we need.

For the Ti consumers: wait till they release a Ti version of the 3070 or 3080 and see if they can match the +40% performance they claimed...

Because as a 2080 Ti owner with a 1440p 165 Hz monitor, the RTX 3080 didn't impress me as much as I expected.

Plus, temperatures are high; we are getting GPUs that output more and more heat, so no improvement there.
 
Rather the only solution I can think of is to record power during all testing in all games, to get the proper power average. Not something I'm able to do at the moment.
HWiNFO does a great job of this for NV cards. :)
 
Bloody awesome performance, great job on nVidia's part. RTX hit is still a bit insane without DLSS, but hey...

Is it true, though, that the $699 price is only for the limited number of Founders Edition cards, and partner models will cost a good deal more? Hope that's not the case.
 
Maybe, but 1080p is long in the tooth by now so I don't think it's a good example.
Isn't it being long in the tooth precisely the point? It's not like the requirements for gaming at 4k are somehow going to change differently than the requirements for 1080p have historically, after all.
 
HWiNFO does a great job of this for NV cards. :)
Not at all, you have to do physical measurements, with proper equipment, to actually get usable power results. Also running HWiNFO while you're benching for performance will skew your performance results, especially on cards that approach the CPU limit
 
Consoles generally don't run at Ultra settings though

Exactly. Current-gen consoles make do with about 5 GB, and yet you don't even need to go for Ultra settings or 4K to go over that on VRAM alone on a PC running the exact same game.
 
Not at all, you have to do physical measurements, with proper equipment, to actually get usable power results
If all you're after is a relative difference, it works just fine. Aren't these the same sensors used for throttling and clocks? I'd test the difference with equipment before simply dismissing the values in HWiNFO as 'not usable'. ;)


Also running HWiNFO while you're benching for performance will skew your performance results, especially on cards that approach the CPU limit
EDIT to your edit after I replied: you don't run HWiNFO while you're benchmarking and gathering results... you run it when testing for power, not FPS... you know this.

Aren't software power readings notoriously unreliable?
See above. If he's going for literal accuracy versus a relative difference, I wouldn't do it. That said, everything can be off... some things more than others.

I'll give you a hint... I've tested this with a meter... they are usable results. ;)
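
For anyone who wants to try this kind of logging themselves, here's a minimal sketch of polling the driver-reported board power during a dedicated power pass, assuming nvidia-smi is on the PATH (the field names, duration, and interval are just illustrative; this is the same software sensor HWiNFO exposes, so it's useful for relative comparisons rather than calibrated absolute numbers):

```python
import statistics
import subprocess
import time

def log_power(duration_s=60.0, interval_s=1.0):
    """Poll driver-reported board power and return (average, peak) in watts.

    Note: this is the same software sensor tools like HWiNFO read, so it
    shows relative differences, not a physical measurement at the connectors.
    """
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        samples.append(float(out.stdout.strip().splitlines()[0]))  # first GPU only
        time.sleep(interval_s)
    return statistics.mean(samples), max(samples)

if __name__ == "__main__":
    avg_w, peak_w = log_power(duration_s=30.0)
    print(f"average: {avg_w:.1f} W, peak: {peak_w:.1f} W")
```

As noted above, you'd run something like this during a separate power pass, not while gathering FPS results.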
 
Exactly. Current-gen consoles make do with about 5 GB, and yet you don't even need to go for Ultra settings or 4K to go over that on VRAM alone on a PC running the exact same game.
... are there any current gen 4k console games that are VRAM limited? I would think they are rather limited by their terrible CPUs and less terrible but still not that impressive GPUs ...

Again: the question isn't whether it's possible to exceed 10GB of VRAM usage in games, the question is whether 10GB is going to be a noticeable, real-world limitation that significantly impacts performance for this GPU. And I find that extremely unlikely.
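
One way to get a feel for the real-world side of this is to log what the driver reports while a game runs; a rough sketch below, assuming nvidia-smi is available. Keep in mind it reports allocated VRAM, which engines happily over-commit as a streaming cache, so treat it as an upper bound rather than what the game strictly needs:

```python
import subprocess
import time

def peak_vram_mib(duration_s=120.0, interval_s=2.0):
    """Poll driver-reported VRAM usage and return the peak value in MiB.

    Caveat: this is memory *allocated* on the card, which many engines
    over-commit as a cache, not the working set a game actually requires.
    """
    peak = 0
    end = time.time() + duration_s
    while time.time() < end:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        peak = max(peak, int(out.stdout.strip().splitlines()[0]))
        time.sleep(interval_s)
    return peak

if __name__ == "__main__":
    print(f"peak VRAM allocated: {peak_vram_mib(duration_s=60.0)} MiB")
```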
 
@W1zzard

First, let me take a minute and say DAMN NICE PLOTTING SKILLS for the frame time and IQR page. Your plotting and analysis skills are better than most biomedical publications in some top-tier journals. What did you use? Customized ggplot2?


On topic. Amazing GPU all around. With other board partners charging over $700, I would just buy the FE and call it a day!
 
I can't see a single game in the TPU test suite that doesn't show significant scaling at 4k. Are you saying that these are still CPU limited? Based on what? Is there hidden data somewhere showing CPU usage for each GPU tested? Or some other review you've seen showing this that you haven't mentioned?

Just check the relative performance of the 2080 Ti compared to the 3080 across the three resolutions and you will see a change (quick arithmetic on this below):
1080p: 87%
1440p: 81%
4K: 75%
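
Flipping those ratios around, the 3080's lead over the 2080 Ti shrinks from roughly 33% at 4K to roughly 15% at 1080p, which is the pattern you'd expect when something other than the GPU (typically the CPU) caps the lower resolutions. A trivial sketch of that arithmetic, using the percentages above:

```python
# Relative performance of the 2080 Ti versus the 3080, from the figures above.
rel_2080ti = {"1080p": 0.87, "1440p": 0.81, "4K": 0.75}

for res, ratio in rel_2080ti.items():
    # If the 2080 Ti reaches a fraction `ratio` of the 3080, the 3080 leads by (1/ratio - 1).
    lead_pct = (1.0 / ratio - 1.0) * 100.0
    print(f"{res}: 3080 leads by {lead_pct:.1f}%")

# Prints roughly: 1080p 14.9%, 1440p 23.5%, 4K 33.3%.
# The shrinking lead at lower resolutions is consistent with a CPU limit there.
```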
Might as well use DSR to do some 8K testing on this bad boy :D
People with 1080p and 1440p screens might as well use DSR if they have the 3080; otherwise you are just wasting all the performance prowess of the 3080.
 
Again: the question isn't whether it's possible to exceed 10GB of VRAM usage in games, the question is whether 10GB is going to be a noticeable, real-world limitation that significantly impacts performance for this GPU.

That's a strange way to think about it; if it is possible to exceed that amount, then it inevitably becomes a limitation.
 
Which 4K graph has made you think so? The RTX 3080's minimum fps is above the RTX 2080 Ti's average. I don't remember the last time we've had such a huge leap in performance. How is it "underwhelming"?

Maybe you could share some graphs to prove your PoV?



It's exactly as powerful as advertised, if not better, at 4K, the resolution it was created for. Never seen so many AMD trolls in one discussion. Also, it costs $700, so not sure about greed. Also, no one forces you to buy it.

Speaking of greed:

[image attachment]
Who's trolling?? I'm on a 1080 Ti, you .....
 
Maybe, but 1080p is long in the tooth by now so I don't think it's a good example.
Isn't it exactly a good example for that reason? Assuming similar progress, we'll be on 10K resolution in the future, yet the current cards will still be fine for 4K.

the question isn't whether it's possible to exceed 10GB of VRAM usage in game
Actually... if we assume games are 100 GB, how much texture/level data do you actually need at a given time?

What did you use? Customized ggplot2?
Excel + VBA :D
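
For anyone wanting to do the same kind of frame-time / IQR summary outside Excel, here's a rough Python sketch; the metric definitions and the sample data are purely illustrative and not the review's actual method:

```python
import statistics

def frametime_summary(frametimes_ms):
    """Summarize a list of per-frame render times (milliseconds)."""
    fts = sorted(frametimes_ms)
    q1, median, q3 = statistics.quantiles(fts, n=4)  # quartile cut points
    return {
        "avg_fps": 1000.0 / statistics.mean(fts),
        "median_ms": median,
        "iqr_ms": q3 - q1,                            # spread of the middle 50% of frames
        "p99_ms": fts[int(0.99 * (len(fts) - 1))],    # rough 99th-percentile frame time
    }

# Purely illustrative data: a mostly smooth run with a few spikes.
sample = [8.3] * 500 + [9.1] * 300 + [14.0] * 20 + [25.0] * 5
print(frametime_summary(sample))
```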
 