
So is the 10 GB of VRAM on the RTX 3080 still enough?

Status
Not open for further replies.
Joined
Feb 1, 2013
Messages
849 (0.29/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5GHz 1.232v Raystorm Pro
Motherboard EVGA Z370 Micro
Cooling GTX360 + MCR120-XP, EK-XRES
Memory 2x16GB TridentZ 3900-C15-2T 1.45v
Video Card(s) RTX 3080 Bykski Eagle
Storage Samsung 970 EVO 500GB, 860 QVO 2TB
Display(s) CU34G2X 3440x1440 144Hz
Case FT03-T mATX
Audio Device(s) SBz
Power Supply SS-850KM3
Mouse G502
Keyboard G710+
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
No matter how much I despise the 10GB on the 3080, I can't find one to buy.
 
Joined
Jul 5, 2013
Messages
12,220 (4.38/day)
Location
USA
System Name GPD-Q9
Processor Rockchip RK-3288 1.8ghz quad core
Motherboard GPD Q9_V6_150528
Cooling Passive
Memory 2GB DDR3
Video Card(s) Mali T764
Storage 16GB Samsung NAND
Display(s) IPS 1024x600
It's possible I missed something too ... can you clarify what you're so passionately dismissive about? I'm not speaking for nguyen, but I've got an open mind...
How about the fact that Nguyen is blatantly twisting facts to suit the very flawed narrative they are attempting (and failing) to express.
 
Last edited:

95Viper

Moderator
Staff member
Joined
Oct 12, 2008
Messages
8,719 (1.93/day)
Stay on topic.
Stop any insults and personal attacks.
Keep it technical.

Read the guidelines if you need a refresher on how/what to post.

Be polite and constructive; if you have nothing nice to say, then don't say anything at all.
This includes trolling, continuous use of bad language (i.e. cussing), flaming, baiting, retaliatory comments, system feature abuse, and insulting others.
Do not get involved in any off-topic banter or arguments. Please report them and stay out of them instead.
 
Joined
Dec 31, 2009
Messages
19,301 (4.74/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
How about the fact that Nguyen is blatantly twisting facts to suit the very flawed narrative they are attempting (and failing) to express.
Can you be more specific? What facts is he twisting? He posted charts that support what he is saying, no? What are you contesting, exactly? Be clear, and please support YOUR assertion as well so we're able to figure this out for ourselves. :)

Apologies if I'm being particularly dense here.
 
Joined
May 8, 2018
Messages
897 (0.88/day)
Location
London, UK
Yeah, 10GB is enough for gaming today, and I'd guess for at least 2 more years. If you want deep learning and other workloads, go for the 20GB Ti version or the 3090.

There is a new article about it.

 
Joined
Jul 5, 2013
Messages
12,220 (4.38/day)
Apologies if I'm being particularly dense here.
If you're serious, then I'll explain a bit. For a moment it seemed like you were being antagonistic.

Ok, if you look at post #314 (one page back), you'll see a comment about how HUB and TS, in Nguyen's view, were putting "spin" in their reviews.

Naturally I looked through the review cited and found no such problem and responded.

In post #316, he made comparisons to TPU and Guru3D reviews and stated a few things that were simply inaccurate, if not willfully deceptive.

In the next post I pointed out the flaws in his statements and posted the correct page of the review in question for the context of the discussion at hand.

In Nguyen's next statement, post #318, they made even more inaccurate statements, and not only misquoted the cited data but seemed to be deliberately twisting facts out of context to fit their flawed, factless narrative.

As a general rule, if I feel like people are discussing a subject in earnest, I'll go out of my way to help them see the real deal or get more facts that can help them understand more about the subject being discussed. But when it seems like people are just being deceptive or worse, willfully ignorant, I lose the will to be helpful or continue the discussion. That's when you see me say, "Google it yourself" or "whatever". I've got no time for people who are going to waste it.
 
Last edited:

Mussels

Moderprator
Staff member
Joined
Oct 6, 2004
Messages
48,972 (8.18/day)
Location
Australalalalalaia.
System Name Rainbow Sparkles
Processor Ryzen R7 5800X
Motherboard Asus x570 Gaming-F
Cooling EK 240mm RGB AIO
Memory 64GB DDR4 3600 Corsair Vengeance RGB
Video Card(s) Galax RTX 3090 SG 24GB (0.8v 1.8GHz)
Storage 1TB Samsung 970 Pro NVMe + 500GB 850 Evo
Display(s) Gigabyte G32QC + Philips 328M6FJRMB (32" 1440p 165Hz/144Hz curved)
Case Fractal Design R6
Audio Device(s) Razer Leviathan + Corsair Void pro RGB, Blue Yeti mic
Power Supply Corsair HX 750i (Platinum, fan off til 300W)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE
Software Windows 10 pro x64 (all systems)
Benchmark Scores Lots of RGB, so you know it's fast.
Well, my warranty claim came back with a 3090 instead of a 3080, so I'll let you guys know if I ever run out of VRAM.
 
Joined
Nov 11, 2016
Messages
1,055 (0.67/day)
System Name The de-ploughminator
Processor I7 9900K @ 5.1Ghz
Motherboard Gigabyte Z370 Gaming 5
Cooling Custom Watercooling
Memory 4x8GB G.Skill Trident Neo 3600mhz 15-15-15-30
Video Card(s) RTX 3090 + Bitspower WB
Storage Plextor 512GB nvme SSD
Display(s) LG OLED CX48"
Case Lian Li 011D Dynamic
Audio Device(s) Creative AE-5
Power Supply Corsair RM1000
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software Win10
Well, my warranty claim came back with a 3090 instead of a 3080, so I'll let you guys know if I ever run out of VRAM.

Your 3080 was defective and they sent you a 3090? Is this a blessing in disguise?
 

Mussels

Moderprator
Staff member
Joined
Oct 6, 2004
Messages
48,972 (8.18/day)
Your 3080 was defective and they sent you a 3090? Is this a blessing in disguise?

Discounted upgrade due to no stock. It's a 'mere' Galax model, but it's still quiet, fast, and RGB-blingy.
 
Joined
Dec 31, 2009
Messages
19,301 (4.74/day)
If you're serious, then I'll explain a bit. For a moment it seemed like you were being antagonistic.
Apologies that you feel this way. I do get frustrated at what feels like a consistent lack of supporting information from you unless prodded. Maybe it shows even when I ask nicely (not antagonizing, just being honest with you). :)

So... the point of #314 is this:
Enabling DLSS in Cyberpunk 2077 will instantly lower VRAM allocation by 1GB, yet they claim the 3070 is slower than the 2080 Ti with RTX/DLSS on because of a VRAM limitation? What kind of editorial logic is this?
... to which the charts he posted support the assertion, correct?

The first chart shows 1440p Ultra with the 3070 and 2080 Ti both hitting 56 FPS. The second chart shows 1440p Ultra + DLSS, and the 2080 Ti is ~9% (~8 FPS) faster at the same res.

Now, look at the chart in #316: when you go up to 4K Ultra (which uses more vRAM than 1440p, right?), the difference between a 2080 Ti and 3070 is 3% (~1 FPS), as it should be. TPU and Gulu3D (if anyone gets that joke, LMK... :p) support the claim that they are the same speed at 4K... while using MORE vRAM than 1440p + DLSS. How is that possible? What are we BOTH missing here?



In the next post I pointed out the flaws in his statements and posted the correct page of the review in question for the context of the discussion at hand.
Did you, though? In #317 it feels like you missed his point, which #318 attempts to bring back on track... to which you promptly blew him off in #319. The fact that they are the same speed at 4K at three sites isn't the point. The point is that at 4K, which uses MORE vRAM than 1440p + DLSS, they are still tied, yet when using DLSS alone and LESS vRAM, there is a significant gap.

The charts in #318 show that when RT and DLSS are used (where more vRAM is allocated than without RT and DLSS), the 3070 closes that gap to 1%. If vRAM were a problem, how does the gap shrink when more of it is in use at the higher res?

In Nguyen's next statement, post #318, they made even more inaccurate statements, and not only misquoted the cited data but seemed to be deliberately twisting facts out of context to fit their flawed, factless narrative.
What was misquoted? What are those twisted facts, exactly? This is where the details matter, Lex. :)

I guess the question, at least to me, is: how is it possible that 4K UHD shows the two cards neck and neck, but at a lower resolution with DLSS the gap is nearly 10% (even with more tensor cores)? After cutting through all this (thanks, random insomnia, lol), I think I know the answer... but it isn't anyone intentionally twisting a narrative (that I can see). I don't think HUB/Techspot have an agenda, but I can see why he feels this way given the conclusion he quoted. That said... I think I found a curiosity/hole in Nguyen's point... but it isn't what I think you are trying to describe, Lex.

As a general rule, if I feel like people are discussing a subject in earnest, I'll go out of my way to help them see the real deal or get more facts that can help them understand more about the subject being discussed. But when it seems like people are just being deceptive or worse, willfully ignorant, I lose the will to be helpful or continue the discussion. That's when you see me say, "Google it yourself" or "whatever". I've got no time for people who are going to waste it.
I felt he was discussing this in earnest. He posted his opinion and supported it with charts and multiple references. You came in shooting him down and seemingly missed the point. He clarified the point with more charts, and you blew him off thinking the above. I don't believe he was being deceptive, nor willfully ignorant. That said, it doesn't mean he (or we both) didn't miss something. Now, I think I see what the issue is...

What I think he missed, and what may shed some light on things, is the fact that RT on Ampere is a lot faster than on Turing because of the beefier RT hardware. So the faster RT overcomes the 8GB vRAM 'issue' and closes the gap regardless(?). Now, that seems a bit counterintuitive, since with RT enabled you use more vRAM... but it's the only thing I can think of. In the end, it doesn't seem like there is an issue with RAM when using RT/DLSS. If there were, it should have manifested itself in their results, correct? We don't see that, yet we see what the conclusion says. It doesn't match.

Let's be clear: all of his charts support what he is saying... but the reasoning behind his conclusion may be flawed, is all. So you may be right in some respect, Lex, but more by accident/for different reasons than by actually/accurately pointing out what he was missing. :)

So, in the end, about the vRAM situation (3070/3080, it doesn't matter)... is it the RT that is overcoming the so-called problem?
 
Last edited:
Joined
Nov 11, 2016
Messages
1,055 (0.67/day)
What I think he missed, and what may shed some light on things, is the fact that RT on Ampere is a lot faster than on Turing because of the beefier RT hardware. So the faster RT overcomes the 8GB vRAM 'issue' and closes the gap regardless(?). Now, that seems a bit counterintuitive, since with RT enabled you use more vRAM... but it's the only thing I can think of. In the end, it doesn't seem like there is an issue with RAM when using RT/DLSS. If there were, it should have manifested itself in their results, correct? We don't see that, yet we see what the conclusion says. It doesn't match.

Let's be clear: all of his charts support what he is saying... but the reasoning behind his conclusion may be flawed, is all. So you may be right, Lex, but more by accident/for different reasons than by accurately pointing out what he was missing. :)

So, in the end, about the vRAM situation (3070/3080, it doesn't matter)... is it the RT that is overcoming the so-called problem?

Very well-thought-out discussion, sir, but I have to add something: the results for the 3070 at 1440p + RT Ultra + DLSS do not indicate any performance degradation caused by a lack of VRAM when you compare the 3070 to both the 2080 Ti and the 3080.

1440p Ultra + DLSS Quality
3080 - 107.3 fps
2080 Ti - 90.1 fps
3070 - 82.6 fps
These are the numbers HUB used as baseline performance; they said so themselves.

1440p + RT Reflections + DLSS Quality
3080 - 76.3 fps = 29% drop
2080 Ti - 62.9 fps = 30% drop
3070 - 57.8 fps = 30% drop

1440p + RT Ultra + DLSS Quality
3080 - 60.6 fps = 44% drop from baseline
2080 Ti - 47.8 fps = 47% drop from baseline
3070 - 46.7 fps = 44% drop from baseline
Note that HUB also uses these calculations in their article.

The 3070's scaling is consistent with the 3080's; what I found inconsistent is the baseline performance of the 2080 Ti. Perhaps the particular test scene HUB used favors Turing in terms of rasterization performance, but HUB should not draw conclusions about the 3070's lack of VRAM when their own data do not support that idea.
If the 3070 were indeed VRAM-limited at 1440p + RT Ultra + DLSS Quality, then so would the 3080 be :laugh: .
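The drop percentages above are easy to sanity-check with a few lines of Python, using the fps figures quoted in this post (the results match the quoted drops to within normal rounding):

```python
# Sanity-check the percentage drops quoted above (HUB's Cyberpunk figures).
baseline = {"3080": 107.3, "2080 Ti": 90.1, "3070": 82.6}  # 1440p Ultra + DLSS Quality
rt_ultra = {"3080": 60.6, "2080 Ti": 47.8, "3070": 46.7}   # 1440p + RT Ultra + DLSS Quality

def pct_drop(base: float, fps: float) -> float:
    """Percentage lost relative to baseline, rounded to one decimal."""
    return round((1 - fps / base) * 100, 1)

for card in baseline:
    print(card, pct_drop(baseline[card], rt_ultra[card]))
# The 3080 and 3070 both drop ~43.5%, the 2080 Ti ~46.9%: the 3070 tracks
# the 3080, which is exactly the consistency argument made above.
```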
 
Last edited:
Joined
Jul 5, 2013
Messages
12,220 (4.38/day)
What was misquoted? What are those twisted facts, exactly? This is where the details matter, Lex.
That said... I think I found a curiosity/hole in Nguyen's point... but it isn't what I think you are trying to describe, Lex.
Yup, whatever, earthdog.
 
Joined
Oct 22, 2014
Messages
11,091 (4.78/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel i5-9600KF
Motherboard NZXT N7 Z370 Black
Cooling Cooler Master 240 RGB AIO / Stock
Memory Thermaltake Toughram 16GB 4400MHz DDR4 or Gigabyte 16GB 3600MHz DDR4 or Adata 8GB 2133Mhz DDR4
Video Card(s) Asus Dual 1060 6GB
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
discounted upgrade due to no stock, its a 'mere' galax model, but it's still quiet, fast, and RGB blingy
Selling for $2,500 minimum here. :eek:
Just a slight price premium over the 3080.
 
Joined
Apr 6, 2015
Messages
184 (0.09/day)
Location
Japan
System Name ChronicleScienceWorkStation
Processor AMD Threadripper 1950X
Motherboard Asrock X399 Taichi
Cooling Noctua U14S-TR4
Memory G.Skill DDR4 3200 C14 16GB*4
Video Card(s) AMD Radeon VII
Storage Samsung 970 Pro*1, Kingston A2000 1TB*2 RAID 0, HGST 8TB*5 RAID 6
Case Lian Li PC-A75X
Power Supply Corsair AX1600i
Software Proxmox 6.2
Given the current numbers, 10 GB of VRAM isn't exactly a limit for most games.
Plus, with AMD actually able to catch up with Big Navi (ray tracing aside), more heated competition should be ahead;
I wouldn't expect the 3080 to age as well as the 1080 Ti even if it had more VRAM.

I would have gotten one if it were anywhere close to MSRP :laugh:
 

Antonis_35

New Member
Joined
Jun 7, 2020
Messages
17 (0.06/day)
My 5 cents, and forgive me if I am stating the obvious:
If you are the type of person who upgrades with every new GPU generation, then 10GB of VRAM is enough.
If you are the type of person who keeps their card for 3 years or longer, then:
1) If you game at 1440p and plan to stay there, 10GB should still be enough.
2) If you game at, or plan to move to, 4K, then 10GB of VRAM will most likely not be enough in the long term.
 
Joined
Apr 6, 2015
Messages
184 (0.09/day)
Just to give my experience after playing Minecraft RTX on my new 3070 today:
In just one demo, the Neon something, I could see GPU RAM use was > 8 GB (so capped there), and in the game some textures weren't fully rendered (some transparent blocks).
This is an edge case, but clearly in some games VRAM use can go beyond 8 GB just by having more textures, so I would say it depends, and if you are concerned you might want something with more RAM.

In general, I believe games releasing in the coming 2 years should take into account that 8-10 GB of VRAM is still the mainstream.
 
Joined
Feb 11, 2009
Messages
3,097 (0.70/day)
System Name Cyberline
Processor Intel Core i7 2600k
Motherboard Asus P8P67 LE Rev 3.0
Cooling Tuniq Tower 120
Memory Corsair (4x2) 8gb 1600mhz
Video Card(s) AMD RX480
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb
Display(s) Philips 32inch LPF5605H (television)
Case antec 600
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
If it's enough, it's only because game devs adjust for the 'lack of' VRAM.

If all video cards had 20GB of VRAM, I'm sure we would see some amazing high-resolution texture packs.
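To put a rough number on why texture packs eat VRAM: texture cost grows with the square of resolution. A back-of-the-envelope sketch, assuming uncompressed RGBA8 textures plus the usual ~1/3 mipmap overhead (real games use block compression, which shrinks the totals but scales the same way):

```python
# Approximate VRAM footprint of a single texture, assuming uncompressed
# RGBA8 (4 bytes per texel) plus ~1/3 extra for the full mipmap chain.
def texture_mib(width: int, height: int, bytes_per_texel: int = 4,
                mipmaps: bool = True) -> float:
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base  # mip chain sums to ~4/3 of the base level
    return round(total / 2**20, 1)  # bytes -> MiB

print(texture_mib(2048, 2048))  # 21.3 MiB
print(texture_mib(4096, 4096))  # 85.3 MiB
# Each doubling of texture resolution quadruples the cost, which is why
# "HD texture pack" options would happily fill a 20GB card.
```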
 
Joined
Apr 15, 2020
Messages
96 (0.30/day)
Location
Planet Earth
System Name Old friend
Processor 3550 Ivy Bridge
Memory 2x8GB Ripjaws
Video Card(s) 970 Maxwell
Recent leaks suggest they're going to offer a 12 GB 3060 to be able to compete with the 6700 XT from AMD. If 10 GB were really enough, they wouldn't have done so; they would have released the card with 6 GB instead of 12 GB, saying that 6 GB is enough VRAM for the 3060.

The 2070 Super exists because of the 5700 XT. The same story happened with Polaris and Pascal (though the R9 390 series offered 8 GB before Polaris). AMD is making them up their game; otherwise we would still be getting 4C/8T i7s, and that 3080 would have come with 6-8 GB (more than likely 6 GB, God!!!).

When I think of it, we really owe it all to them. So a huge thank you to AMD for pushing the industry further ahead in the right (sane) direction. Feeling so blessed!
 
Joined
Feb 1, 2013
Messages
849 (0.29/day)
Horses for courses... more and more gamers are adopting 144Hz+ displays, which are better suited to resolutions below 4K. With the limited supply and high cost of GDDR6/X, NVIDIA perfected performance cards for 1080p and 1440p. If you want to performance-proof 4K in all games, you need to step up to the 3090 with its 24GB of VRAM.

Is anyone really contending that the 16GB in AMD's Radeons is the sole determinant of 4K performance? Given the memory configurations a 256-bit bus allows, AMD's only other option was 8GB, which they would have had to sell cheaper, and which would have looked far worse against the RTX 3080. People, this was a marketing decision; it does not imply functional gains.
 
Joined
Nov 13, 2007
Messages
8,322 (1.71/day)
Location
Austin Texas
System Name Chernobyl
Processor 10850K @ 5.1
Motherboard MSI 490-A PRO
Cooling 360mm AIO
Memory 32 GB 4100 Mhz DDR4 18-18-18-38-440 trfc - 2T
Video Card(s) MSI Ventus RTX 3080
Storage 3x1TB SSDs, 2TB SSD
Display(s) LG 49NANO85 - 49" 4K 120HZ NanoIPS
Case Zalman X3
Audio Device(s) Bose Solo
Power Supply Corsair SF750
Mouse Logitech GPro Wired
Keyboard tecware phantom
Software Windows 10 64 Bit
Horses for courses... more and more gamers are adopting 144Hz+ displays, which are better suited to resolutions below 4K. With the limited supply and high cost of GDDR6/X, NVIDIA perfected performance cards for 1080p and 1440p. If you want to performance-proof 4K in all games, you need to step up to the 3090 with its 24GB of VRAM.

Is anyone really contending that the 16GB in AMD's Radeons is the sole determinant of 4K performance? Given the memory configurations a 256-bit bus allows, AMD's only other option was 8GB, which they would have had to sell cheaper, and which would have looked far worse against the RTX 3080. People, this was a marketing decision; it does not imply functional gains.

And performance-proofing 4K means 4K Ultra... which the 3080 will struggle with today. 4K High (which looks identical to Ultra) and 4K Medium, which still looks awesome in most games at much higher FPS, will be viable for a long time to come.

10GB is well within the performance bracket of the card. The 3080 Ti and 3090 will be replaced by the 40 series before they ever use all that RAM anyway.
 
Joined
Dec 31, 2009
Messages
19,301 (4.74/day)
which the 3080 will struggle with today...
No... it does not. Please take the time to read some of W1z's game reviews. So far, not ONE has used 10GB (one comes very close). The 3080 doesn't struggle today because of vRAM limitations on any title running canned Ultra settings at 4K UHD, outside of the ONE title that has 9.8GB allocated (not used; this is an important distinction that many seem to gloss over or not understand). The rest of the titles he reviewed (again, at canned Ultra settings) over the past year averaged around 7GB of use. This card has plenty of time before its vRAM becomes a problem.
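The allocated-vs-used distinction can be made concrete with a toy check. The game names and figures below are made-up placeholders for illustration, not W1z's actual review data:

```python
# Illustration only: high *allocation* is often just the engine caching;
# a card is only genuinely VRAM-limited when *used* memory nears the
# physical limit. Titles and numbers below are hypothetical.
CARD_VRAM_GB = 10.0

titles = {
    "Game A": {"allocated_gb": 9.8, "used_gb": 7.1},
    "Game B": {"allocated_gb": 7.2, "used_gb": 6.4},
}

def vram_limited(used_gb: float, capacity_gb: float = CARD_VRAM_GB) -> bool:
    # Treat >=95% of physical VRAM actually in use as "limited".
    return used_gb >= 0.95 * capacity_gb

for name, mem in titles.items():
    verdict = "VRAM-limited" if vram_limited(mem["used_gb"]) else "fine"
    print(f"{name}: allocated {mem['allocated_gb']} GB, "
          f"used {mem['used_gb']} GB -> {verdict}")
```

By this reading, a title allocating 9.8GB while using ~7GB is not actually pressing the card.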

That said, if you plan to mod games and add texture packs (which few actually do) and play at 4K/Ultra/60+, this may be a concern in some titles.

Can this thread be closed? I think everyone and their mom has posted an opinion already. It serves no purpose (except to come back in 3 years and laugh at some posts).
 
Joined
Nov 13, 2007
Messages
8,322 (1.71/day)
No... it does not. Please take the time to read some of W1z's game reviews. So far, not ONE has used 10GB (one comes very close). The 3080 doesn't struggle today because of vRAM limitations on any title running canned Ultra settings at 4K UHD, outside of the ONE title that has 9.8GB allocated (not used; this is an important distinction that many seem to gloss over or not understand). The rest of the titles he reviewed (again, at canned Ultra settings) over the past year averaged around 7GB of use. This card has plenty of time before its vRAM becomes a problem.

That said, if you plan to mod games and add texture packs (which few actually do) and play at 4K/Ultra/60+, this may be a concern in some titles.

Can this thread be closed? I think everyone and their mom has posted an opinion already. It serves no purpose (except to come back in 3 years and laugh at some posts).
That's not what I'm saying...

Basically, FPS will tank at Ultra settings due to GPU horsepower limits before any RAM limitation.

Examples:
Borderlands 3 - Ultra settings: the GPU manages high 60s with dips into the mid/high 50s, which feels awful for a shooter; at tweaked High/Medium settings it runs great at 80-110 FPS (and looks identical).
Outer Worlds - same story, except with even lower dips.
Cyberpunk (LOL - rip) - needs tweaked Medium/High settings and DLSS to even play at 4K; tweaked settings run between 70-90 FPS.

IMO, newer games at Ultra settings will beat the GPU into a slideshow way before the RAM quantity becomes an issue.
 
Last edited:
Joined
Apr 15, 2020
Messages
96 (0.30/day)
If 16GB of VRAM becomes common, then they'll design levels that utilise 16GB of art. It's really not any extra work for them; they just need to move the slider a little further to the right when using the compress-O-tron to reduce art assets to what fits into common VRAM sizes (2GB, 4GB, or 8GB, for example).

So some devs, at least, have been automatically ready for 16GB and 32GB cards for half a decade or more. The tools to do it effortlessly are mainstream industry standards.

Interesting.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
5,836 (1.16/day)
System Name MightyX
Processor Ryzen 7 3700X 1usmus
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Noctua NH-L12
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 OC/UV + duct
Storage Samsung 970 Evo m.2 NVME
Display(s) 21:9 3440x1440 144hz 1500R VA
Case Coolermaster NR200P
Power Supply Corsair SF600 Gold
Mouse Zowie EC1-A
Keyboard Razer Blackwidow X Chroma
Software case populated with Artic P12's
I had the pleasure of gaming at a mate's place, plugging my 3080 box via HDMI 2.1 into an LG CX 65 and playing various games (DOOM Eternal, Cyberpunk, and Jedi: Fallen Order, to name a few), and holy sh*t, was the 4K VRR OLED experience amazing.

I am totally sold on OLED, and the 3080 absolutely bloody excels at 4K gaming in the here and now. It really makes me want the CX 48 as my gaming monitor; I can't believe a TV does such an awe-inspiring job of it. The way it handles VRR was buttery to a level I have not experienced in person before. It absolutely blew me away.
 
Joined
Apr 15, 2020
Messages
96 (0.30/day)
Yeah, playing on a big screen is something else entirely.
 