
So is the 10 gb vram on rtx 3080 still enough?

Status
Not open for further replies.
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
No matter how much I despise the 10GB on the 3080, I can't find one to buy.
 
Joined
Jul 5, 2013
Messages
25,559 (6.49/day)
It's possible I missed something too ... can you clarify what you're so passionately dismissive about? I'm not speaking for nguyen, but I've got an open mind...
How about the fact that Nguyen is blatantly twisting facts to suit the very flawed narrative they are attempting (and failing) to express.
 
Last edited:

95Viper

Super Moderator
Staff member
Joined
Oct 12, 2008
Messages
12,667 (2.24/day)
Stay on topic.
Stop any insults and personal attacks.
Keep it technical.

Read the guidelines if you need a refresher on how/what to post.

Be polite and constructive; if you have nothing nice to say, then don't say anything at all.
This includes trolling, continuous use of bad language (i.e. cussing), flaming, baiting, retaliatory comments, system feature abuse, and insulting others.
Do not get involved in any off-topic banter or arguments. Please report it and move on instead.
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
How about the fact that Nguyen is blatantly twisting facts to suit the very flawed narrative they are attempting (and failing) to express.
Can you be more specific? What facts is he twisting? He posted up charts that support what he is saying, no? What are you contesting, exactly? Be clear, and please support YOUR assertion as well so we're able to figure this out for ourselves. :)

Apologies if I'm being particularly dense here.
 
Joined
May 8, 2018
Messages
1,495 (0.69/day)
Location
London, UK
Yeah, 10GB is enough for gaming today and, I'd guess, for at least 2 more years. If you want deep learning and other workloads, go for the 20GB Ti version or the 3090.

There is a new article about it.

 
Joined
Jul 5, 2013
Messages
25,559 (6.49/day)
Apologies if I'm being particularly dense here.
If you're serious, then I'll explain a bit. For a moment it seemed like you were being antagonistic.

Ok, if you look at post #314 (one page back), you'll see a comment made about how HUB and TS, in Nguyen's view, were putting "spin" in their reviews.

Naturally I looked through the review cited and found no such problem and responded.

In post #316, he made comparisons to TPU and Guru3D reviews and stated a few things that were simply inaccurate, if not willfully deceptive.

In the next post I pointed out the flaws in his statements and posted the correct page in the subject review to be referring to in the context of the discussion at hand.

In Nguyen's next statement, post #318, they made even more inaccurate statements, and not only misquoted the cited data but seemed to be deliberately twisting facts out of context to fit their flawed, factless narrative.

As a general rule, if I feel like people are discussing a subject in earnest, I'll go out of my way to help them see the real deal or get more facts that can help them understand more about the subject being discussed. But when it seems like people are just being deceptive or worse, willfully ignorant, I lose the will to be helpful or continue the discussion. That's when you see me say, "Google it yourself" or "whatever". I've got no time for people who are going to waste it.
 
Last edited:

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.19/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Well, my warranty came back with a 3090 instead of a 3080 so i'll let you guys know if i ever run out of VRAM.
 
Joined
Nov 11, 2016
Messages
3,062 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Well, my warranty came back with a 3090 instead of a 3080 so i'll let you guys know if i ever run out of VRAM.

Your 3080 was defective and they sent you a 3090? Is this a blessing in disguise?
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.19/day)
Your 3080 was defective and they sent you a 3090? Is this a blessing in disguise?

Discounted upgrade due to no stock. It's a 'mere' Galax model, but it's still quiet, fast, and RGB blingy.
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
If you're serious, then I'll explain a bit. For a moment it seemed like you were being antagonistic.
Apologies you feel this way. I do get frustrated at (what feels like) your consistent lack of supporting information without being prodded. Maybe it shows, even when I ask nicely (not antagonizing, just being honest with you). :)

So... the point of #314 is this......
Enabling DLSS in Cyberpunk 2077 will instantly lower VRAM allocation by 1GB, yet they claim 3070 is slower than 2080 Ti with RTX/DLSS on is because of VRAM limitation ? what kinda editorial logic is this ?
... to which the charts he posted support the assertion, correct?

The first chart shows 1440p ultra with the 3070 and 2080 Ti both hitting 56 FPS. The second chart shows 1440p ultra + DLSS, where the 2080 Ti is ~9% (~8 FPS) faster at the same res.

Now, look at the chart in #316... when you go up to 4K ultra (which uses more vRAM than 1440p, right?), the difference between a 2080 Ti and 3070 is 3% (~1 FPS), as it should be. TPU and Gulu3D (if anyone gets that joke, LMK... :p) support the claim that they are the same speed at 4K..... while using MORE vRAM than 1440p + DLSS. How is that possible? What are we BOTH missing here?



In the next post I pointed out the flaws in his statements and posted the correct page in the subject review to be referring to in the context of the discussion at hand.
Did you though? In #317 it feels like you missed his point, which #318 attempts to bring back on track... to which you promptly blew him off in #319. The fact that they are the same speed at 4K at 3 sites isn't the point. The point is that at 4K, which uses MORE vRAM than 1440p + DLSS, they are still tied, yet when using DLSS alone and LESS vRAM, there is a significant gap.

The charts in #318 show that when RT and DLSS are used (where more vRAM is allocated than without them), the 3070 closes that gap to 1%. If vRAM were a problem, how does the gap shrink when more of it is in use?

In Nguyen's next statement, post318, they made even more inaccurate statements and not only misquoted the cited data but seemed to be deliberately twisting facts out of context to fit their flawed, factless narrative.
What was misquoted? What are those twisted facts, exactly? This is where the details matter, Lex. :)

I guess the question, at least to me, is: how is it possible that 4K UHD shows the two cards neck and neck with each other, but at a lower resolution with DLSS the gap is nearly 10% (even with more tensor cores)? After cutting through all this (thanks, random insomnia, lol), I think I know the answer... but it isn't anyone twisting a narrative intentionally (that I can see). I don't think HUB/Techspot have an agenda, but I can see why he feels this way given the conclusion he quoted. That said........ I think I found a curiosity/hole in Nguyen's point... but it isn't what I think you are trying to describe, Lex.

As a general rule, if I feel like people are discussing a subject in earnest, I'll go out of my way to help them see the real deal or get more facts that can help them understand more about the subject being discussed. But when it seems like people are just being deceptive or worse, willfully ignorant, I lose the will to be helpful or continue the discussion. That's when you see me say, "Google it yourself" or "whatever". I've got no time for people who are going to waste it.
I felt he was discussing this in earnest. He posted his opinion and supported it with charts and multiple references. You come in shooting him down and seemingly miss the point. He clarifies the point with more charts and you blow him off thinking the above. I don't believe he was being deceptive, nor willfully ignorant. That said, it doesn't mean he (we both) didn't miss something. Now, I think I see what the issue is............

What I think he missed that may shed some light on things............... is the fact that RT on Ampere is a lot faster than on Turing b/c of more RT bits. So the faster RT overcomes the 8GB vRAM 'issue' and closes the gap regardless(?). Now, that seems a bit counterintuitive, since with RT enabled you use more vRAM... but it's the only thing I can think of. In the end, it doesn't seem like there is an issue with RAM when using RT/DLSS. If there were, it should have manifested itself in their results, correct? We don't see that, yet we see what the conclusion says. It doesn't seem to match.

Let's be clear, all of his charts support what he is saying.... but the reasoning behind his conclusion may be flawed is all. So, you may be right in some respect, Lex, but more so by accident/different reasons than actually/accurately pointing out what he was missing. :)

So in the end, about the vRAM situation (3070/3080... it doesn't matter)...... is it the RT that is overcoming the so-called problem?
 
Last edited:
Joined
Nov 11, 2016
Messages
3,062 (1.13/day)
What I think he missed that may shed some light on things............... is the fact that RT on Ampere is a lot faster than on Turing b/c of more RT bits. So the faster RT overcomes the 8GB vRAM 'issue' and closes the gap regardless(?). Now, that seems a bit counterintuitive, since with RT enabled you use more vRAM... but it's the only thing I can think of. In the end, it doesn't seem like there is an issue with RAM when using RT/DLSS. If there were, it should have manifested itself in their results, correct? We don't see that, yet we see what the conclusion says. It doesn't seem to match.

Let's be clear, all of his charts support what he is saying.... but the reasoning behind his conclusion may be flawed is all. So, you may be right, Lex, but more so by accident/different reasons than accurately pointing out what he was missing. :)

So in the end, about the vRAM situation (3070/3080... it doesn't matter)...... is it the RT that is overcoming the so-called problem?

Very well thought out discussion, sir, but I have to add something: the results for the 3070 at 1440p + RT Ultra + DLSS do not indicate any performance degradation caused by lack of VRAM when you compare the 3070 to both the 2080 Ti and the 3080.

1440p Ultra + DLSS Quality (baseline)
3080 - 107.3 fps
2080 Ti - 90.1 fps
3070 - 82.6 fps
These are the numbers HUB used as baseline performance; they said so themselves.

1440p + RT reflections + DLSS Quality
3080 - 76.3 fps (29% drop from baseline)
2080 Ti - 62.9 fps (30% drop from baseline)
3070 - 57.8 fps (30% drop from baseline)

1440p + RT Ultra + DLSS Quality
3080 - 60.6 fps (44% drop from baseline)
2080 Ti - 47.8 fps (47% drop from baseline)
3070 - 46.7 fps (44% drop from baseline)
Note that HUB also uses these calculations in their article.

The 3070's performance scaling is consistent with the 3080's; what I found inconsistent is the baseline performance of the 2080 Ti. Perhaps the particular test scene HUB used favors Turing in terms of rasterization performance; however, HUB should not draw conclusions about the 3070's lack of VRAM when their own data do not support that idea.
If the 3070 were indeed VRAM-limited at 1440p + RT Ultra + DLSS Quality, then so would the 3080 be :laugh: .
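For anyone who wants to double-check those percentages, here's a quick sketch (plain Python, with the FPS figures copied from the charts above; the HUB numbers themselves are approximate chart readings):

```python
# Percent FPS drop from each card's own 1440p Ultra + DLSS baseline,
# using the figures quoted above.
baseline = {"3080": 107.3, "2080 Ti": 90.1, "3070": 82.6}
rt_reflections = {"3080": 76.3, "2080 Ti": 62.9, "3070": 57.8}
rt_ultra = {"3080": 60.6, "2080 Ti": 47.8, "3070": 46.7}

def drop(card, fps):
    """Percent drop relative to that card's own baseline."""
    return round((1 - fps / baseline[card]) * 100, 1)

for name, scenario in [("RT reflections", rt_reflections), ("RT Ultra", rt_ultra)]:
    for card, fps in scenario.items():
        print(f"{card}: {drop(card, fps)}% drop with {name}")
```

The drops come out within a point of the rounded figures quoted (29/30/30% and 44/47/44%), and crucially they are nearly identical across the three cards, which is exactly the consistency argument being made.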
 
Last edited:
Joined
Oct 22, 2014
Messages
13,210 (3.81/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
discounted upgrade due to no stock, its a 'mere' galax model, but it's still quiet, fast, and RGB blingy
Selling for $2,500 minimum here. :eek:
Just a slight price bonus from the 3080.
 
Joined
Apr 6, 2015
Messages
246 (0.07/day)
Location
Japan
System Name ChronicleScienceWorkStation
Processor AMD Threadripper 1950X
Motherboard Asrock X399 Taichi
Cooling Noctua U14S-TR4
Memory G.Skill DDR4 3200 C14 16GB*4
Video Card(s) AMD Radeon VII
Storage Samsung 970 Pro*1, Kingston A2000 1TB*2 RAID 0, HGST 8TB*5 RAID 6
Case Lian Li PC-A75X
Power Supply Corsair AX1600i
Software Proxmox 6.2
Given the current numbers, 10 GB of VRAM isn't exactly a limit for most games.
Plus, with AMD actually catching up via Big Navi (ray tracing aside), more heated competition should be ahead;
I wouldn't expect the 3080 to age as well as the 1080 Ti even if it had more RAM.

I would have gotten one if it was anywhere close to MSRP :laugh:
 
Joined
Jun 7, 2020
Messages
72 (0.05/day)
Processor AMD Ryzen 7 5800X3D
Motherboard MSI MAG X570 TOMAHAWK WIFI
Cooling Thermalright Frost Commander 140 BLACK
Memory 2x16 GB G.Skill Ripjaws V DDR4-3600 CL14 (F4-3600C14D-32GVK)
Video Card(s) XFX SPEEDSTER MERC 310 RX 7900 XTX Black Edition
Storage 1x1TB Samsung 970 EVO, 1x1TB ADATA XPG SX8200 Pro, 2x 2TB Seagate Barracuda
Display(s) LG 27GP850P-B
Case White Phanteks Eclipse P400A
Power Supply Thermaltake Toughpower GF3 1000 W Gold
Mouse Logitech G603
Keyboard ROCCAT Vulcan 121 AIMO
Software Windows 11 Pro
My 5 cents, and forgive me if I am stating the obvious:
If you are the type of person who upgrades with every new GPU generation, then 10GB of VRAM is enough.
If you are the type of person who keeps their card for 3 years or longer, then:
1) If you game at 1440p and plan to stay there, 10GB should still be enough.
2) If you game at 4K, or plan to, then 10GB of VRAM most likely won't be enough in the long term.
 
Joined
Apr 6, 2015
Messages
246 (0.07/day)
Just to give my experience after playing Minecraft RTX on my new 3070 today:
In just one demo (the Neon something), I could see GPU RAM use was > 8 GB (so capped there), and in the game some textures weren't fully rendered (some transparent blocks).
This is an edge case, but obviously in some games RAM use could go beyond 8 GB just by having more textures, so I would say it depends; if you are concerned, you might want something with more RAM.

In general, I believe games releasing in the coming 2 years should have taken into account that 8-10 GB of VRAM is still the mainstream.
 
Joined
Feb 11, 2009
Messages
5,393 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
If it's enough, it's only because game devs adjust for the "lack of" VRAM.

If all video cards had 20GB of VRAM, I'm sure we would see some amazing high-resolution texture packs.
 
Joined
Apr 15, 2020
Messages
109 (0.07/day)
System Name Old friend
Processor 3550 Ivy Bridge x 39.0 Multiplier
Memory 2x8GB 2400 RipjawsX
Video Card(s) 970 Maxwell STRIX-GTX970-DC2OC-4GD5
Recent leaks suggest they're gonna offer a 12 GB 3060 to compete with the 6700 XT from AMD. So if 10 GB were really enough, they wouldn't have done so; they would have released the card with 6 GB instead of 12 GB, saying that 6 GB of VRAM is enough for the 3060.

The 2070 Super exists because of the 5700 XT. The same story happened with Polaris and Pascal (though the R9 390 series offered 8 GB before Polaris). It's like AMD keeps forcing them to up their game; otherwise we would still be getting 4C/8T i7s, and that 3080 would have been like 6-8 GB (more than likely 6 GB, God!!!).

When I think about it, we really owe it all to them. So a huge thank you to AMD for pushing the industry further ahead in the right (sane) direction. Feeling so blessed!
 
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
Horses for courses... more and more gamers are adopting 144Hz+ displays, which are better suited to resolutions below 4K. With the limited supply and high cost of GDDR6/X, NVIDIA perfected performance cards for 1080p and 1440p. If you want to performance-proof 4K in all games, you need to step up to the 3090 with its 24GB of VRAM.

Is anyone really contending that the 16GB in AMD's Radeons is the sole determinant of 4K performance? Given the memory configurations possible on a 256-bit die, AMD's only other option was 8GB, which they would have had to sell cheaper, and it would have looked far worse against the RTX 3080. People, this was a marketing decision; it does not imply functional gains.
 
Joined
Nov 13, 2007
Messages
10,228 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Horses for courses... more and more gamers are adopting 144Hz+ displays, which are better suited to resolutions below 4K. With the limited supply and high cost of GDDR6/X, NVIDIA perfected performance cards for 1080p and 1440p. If you want to performance-proof 4K in all games, you need to step up to the 3090 with its 24GB of VRAM.

Is anyone really contending that the 16GB in AMD's Radeons is the sole determinant of 4K performance? Given the memory configurations possible on a 256-bit die, AMD's only other option was 8GB, which they would have had to sell cheaper, and it would have looked far worse against the RTX 3080. People, this was a marketing decision; it does not imply functional gains.

And performance-proof in 4K means 4K ULTRA... which the 3080 will struggle with today. 4K High (which looks identical to Ultra) and 4K Medium, which still looks awesome in most games at much higher FPS, will be viable for a long time to come.

10GB is well within the performance bracket of the card. The 3080 Ti and 3090 will be replaced by the 40 series before they use all that RAM anyway.
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
which the 3080 will struggle with today..
No... it does not. Please take the time to read some of W1z's game reviews. So far, not ONE has used 10GB (one is very close). The 3080 doesn't struggle today because of vRAM limitations on any title running canned ultra settings at 4K UHD, outside of the ONE title that has 9.8GB allocated (allocated, not used; this is an important distinction that many seem to gloss over or not understand). The rest of the titles he reviewed (again, using canned Ultra settings) over the past year averaged around 7GB of use. This card has plenty of time before its vRAM becomes a problem.

That said, if you plan to mod games and add texture packs (which few actually do) and play at 4K/Ultra/60+, this may be a concern in some titles.

Can this thread be closed? I think everyone and their mom has posted an opinion already. This serves no purpose (but to come back in 3 years and laugh at some posts).
 
Joined
Nov 13, 2007
Messages
10,228 (1.70/day)
No... it does not. Please take the time to read some of W1z's game reviews. So far, not ONE has used 10GB (one is very close). The 3080 doesn't struggle today because of vRAM limitations on any title running canned ultra settings at 4K UHD, outside of the ONE title that has 9.8GB allocated (allocated, not used; this is an important distinction that many seem to gloss over or not understand). The rest of the titles he reviewed (again, using canned Ultra settings) over the past year averaged around 7GB of use. This card has plenty of time before its vRAM becomes a problem.

That said, if you plan to mod games and add texture packs (which few actually do) and play at 4K/Ultra/60+, this may be a concern in some titles.

Can this thread be closed? I think everyone and their mom has posted an opinion already. This serves no purpose (but to come back in 3 years and laugh at some posts).
That's not what I'm saying...

Basically, FPS will tank at ultra settings due to GPU horsepower before any RAM limitation.

Examples:
Borderlands 3 - ultra settings: the GPU manages high 60s with dips into the mid/high 50s, which feels awful for a shooter; at tweaked high/medium settings it runs great at 80-110 FPS (and looks identical).
Outer Worlds - same story, except with even lower dips.
Cyberpunk (LOL - rip) - needs tweaked medium/high settings and DLSS to even play at 4K; tweaked settings run between 70-90 FPS.

IMO newer games at ultra settings will beat the GPU into a slideshow way before the RAM quantity becomes an issue.
 
Last edited:
Joined
Apr 15, 2020
Messages
109 (0.07/day)
If 16GB of VRAM becomes common, then they'll design levels that utilise 16GB of art. It's really not any extra work for them; they just need to move the slider a little further to the right when using the compress-O-tron to reduce art assets to whatever fits into common VRAM sizes (2GB, 4GB, or 8GB, for example).

So some devs, at least, have been ready for 16GB and 32GB cards for half a decade or more. The tools to do it effortlessly are mainstream industry standards.

Interesting.
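To put rough numbers on that slider idea: one common way texture-quality settings work is to drop the top mip level(s) of each texture, and each dropped level cuts that texture's memory by roughly 4x. A toy sketch of the arithmetic (purely illustrative, not any engine's actual tooling):

```python
def texture_bytes(width, height, bytes_per_pixel=4, dropped_mips=0):
    """Approximate VRAM for one texture with a full mip chain.

    A full mip chain costs ~4/3 of the base level; dropping the top
    `dropped_mips` levels shrinks the base level by 4x per level.
    """
    w, h = width >> dropped_mips, height >> dropped_mips
    return int(w * h * bytes_per_pixel * 4 / 3)

# A 4096x4096 RGBA texture at successive quality steps:
for dropped in range(3):
    mib = texture_bytes(4096, 4096, dropped_mips=dropped) / 2**20
    print(f"drop {dropped} mip level(s): ~{mib:.0f} MiB")
```

So a "slider one notch to the right" that ships one extra mip level roughly quadruples the per-texture budget, which is why targeting 8GB vs 16GB cards with the same source art is largely a packaging decision.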
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,747 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I had the pleasure of gaming at a mate's place, plugging my 3080 box via HDMI 2.1 into an LG CX 65 and playing various games (DOOM Eternal, Cyberpunk, Jedi: Fallen Order, to name a few), and holy sh*t was the 4K VRR OLED experience amazing.

I am totally sold on OLED, and the 3080 absolutely bloody excels at 4K gaming in the here and now. It really makes me want the CX 48 as my gaming monitor; I can't believe a TV does such an awe-inspiring job of it. The way it handles VRR was buttery smooth to a level I have not experienced in person before. Absolutely blew me away.
 
Joined
Apr 15, 2020
Messages
109 (0.07/day)
Yeah playing on big screen is something else entirely.
 