• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

So is the 10 GB VRAM on the RTX 3080 still enough?

Status
Not open for further replies.
Joined
Sep 17, 2014
Messages
21,214 (5.98/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Yeah, I find it odd that there are people who think like this:
- Turn off RT for better performance: perfectly acceptable
- Reduce Ultra detail to High for better performance: totally unacceptable

So some people are blind to RT, yet very picky about details; that doesn't make any sense.

OK, I'll try one more time :) But I'll start by saying YES, you are correct. It is a personal consideration - we all try to crystal-ball ourselves out of this, and it's never going to be conclusive until it's too late ;)

- RT performance is early-adopter territory. The next gen may turn things on their head altogether and make current-day performance obsolete straight away - check Turing > Ampere RT performance for proof. Remember, AMD is shipping a lite version of RT in RDNA2. It can go either way: the industry goes full steam on it and RDNA3 or beyond pushes it far more heavily, or it really doesn't, and focus goes back towards better raster performance while RT takes a place similar to, say, tessellation - just another effect to use. The supposed 'RT advantage' of Nvidia can also dwindle faster than you might blink if devs start optimizing for consoles first. The additional die space Nvidia devotes to it won't be used properly unless Nvidia keeps throwing bags of money at RTX implementations like it has so far.

It's far too early to determine that RT is 'here to stay' in the projected way, as a 'major part' of the graphics pipeline. If the market doesn't eat it, it'll die. It's a very expensive effect. Look at the price surges and demand issues... they are related.

RT is also not efficient at this time. It's the same thing as enabling overly costly AA that barely shows an advantage. Yes, you *can*... but why? In a large number of situations it really doesn't add much. You can still count the examples where it does on one hand - and you'll have fingers left.

- 10GB VRAM is not resale-worthy. It's just not. It's yesterday's capacity; the past two gens already had more. The fact we're already discussing it at launch speaks volumes. You buy this to use it for a few years, and then it gets knocked down the product tiers very fast. I haven't seen anyone disagree with that, by the way, even in this topic. We ALL draw the conclusion that 10GB will impose limitations pretty soon. The idea that this somehow 'scales with the core power' has absolutely no basis in the past - historically, we've always seen capacity increase with, or at least match, increasing core power. You can't ignore that imbalance. It's there and it'll show.

- 16GB VRAM is very resale-worthy, especially given that there is lots of core power on tap and the balance between VRAM and core power relative to past gens is kept intact. Well-balanced GPUs last longest. It's just that simple. When they run out of oomph, they run out of all things at the same time, and that tends to take a long while. Until they do... you can resell them. A GPU without such balance doesn't resell like that - you can only resell it in 'conditional' situations, i.e. specific use cases. 'The 3080's a great card for 1440p now' is probably the punchline. You'll insta-lose all potential buyers with a 4K panel or even ultrawides - your niche got that much smaller.

- VRAM is used everywhere. If you're short, you'll be tweaking your settings every time, not just in the games that may or may not have RT worth looking at. So going forward in time - say you buy a 4K monitor three years from now - with a 3080 you might also feel the urge to upgrade the GPU. With a 12-16GB card, you most certainly won't have to.
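The resolution point can be made concrete with a quick back-of-envelope sketch (my own illustration, not from this thread): even before textures, the fixed-size render targets a game allocates grow with resolution, and the texture budget - typically the biggest VRAM consumer - comes on top of that. The buffer count and bytes-per-pixel below are assumed, game-dependent numbers, not measurements.

```python
# Rough, illustrative VRAM cost of per-resolution render targets,
# to show why a 4K upgrade eats into a fixed 10 GB budget.
# 'targets' and 'bytes_per_pixel' are assumptions, not real figures.

def render_target_mb(width, height, targets=6, bytes_per_pixel=8):
    """Rough size of full-resolution buffers (G-buffer layers, depth,
    HDR color, etc.) for one frame, in MiB. All parameters assumed."""
    return width * height * targets * bytes_per_pixel / 1024**2

resolutions = {"1440p": (2560, 1440),
               "UW 1440p": (3440, 1440),
               "4K": (3840, 2160)}

for name, (w, h) in resolutions.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")
```

Under these assumptions, going from 1440p (~169 MB) to 4K (~380 MB) more than doubles the render-target footprint alone; texture pools scale on top of that, which is why the same card feels tighter the moment the monitor upgrades.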

As for a hundred DXR titles... yeah. In a similar vein we also have 'hundreds' of DX12 titles... that we still prefer to run in DX11, because it's the same thing but better.

TL;DR: what it REALLY comes down to... is how keen you are to early-adopt RT. Except now it's not the Turing days, where the competition had nothing to place against that consideration - the competition has a technically more durable product, and it even does RT too! That's a pretty steep price tag to keep going green, if you ask me.
 
Joined
Jan 8, 2017
Messages
9,113 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Yeah, I find it odd that there are people who think like this:
- Turn off RT for better performance: perfectly acceptable
- Reduce Ultra detail to High for better performance: totally unacceptable

So some people are blind to RT, yet very picky about details; that doesn't make any sense.

Of course it doesn't make sense, since you appear to be blind to everything except RT.

Turning RT on can sometimes halve the performance, but the visual impact is minimal.
Going from Low to Ultra usually halves the performance as well, or even worse, but the visual impact is massive.

You wouldn't play on Low but with RT on, would you? The priorities people have are clear, and they make perfect sense.
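To put "halves the performance" in concrete terms, here is a tiny sketch (the fps figures are hypothetical, not benchmarks): the same percentage drop costs far more frame time when the base framerate is already low, which is why a 50% hit feels so different at 90 fps versus 40 fps.

```python
# Hypothetical numbers: express an fps drop as added frame time,
# which is what you actually feel while playing.

def frame_time_ms(fps):
    """Milliseconds spent on one frame at a given framerate."""
    return 1000.0 / fps

for base in (90, 40):              # assumed fps with the effect off
    halved = base / 2              # the "halves the performance" case
    added = frame_time_ms(halved) - frame_time_ms(base)
    print(f"{base} -> {halved:.0f} fps adds {added:.1f} ms per frame")
```

At a 90 fps baseline, halving costs about 11 ms extra per frame; at a 40 fps baseline the same ratio costs 25 ms, pushing well past a 60 Hz frame budget.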
 
Joined
Feb 3, 2017
Messages
3,542 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Going from Low to Ultra usually halves the performance as well, or even worse, but the visual impact is massive.
Who said anything about going from Ultra to Low? The impact on image quality from changing the texture setting from Ultra to Very High is either minimal or nonexistent.
 
Who said anything about going from Ultra to Low? The impact on image quality from changing the texture setting from Ultra to Very High is either minimal or nonexistent.

I am talking about settings in general not just textures.
 
Joined
Jul 26, 2019
Messages
418 (0.24/day)
Processor R5 5600X
Motherboard Asus TUF Gaming X570-Plus
Memory 32 GB 3600 MT/s CL16
Video Card(s) Sapphire Vega 64
Storage 2x 500 GB SSD, 2x 3 TB HDD
Case Phanteks P300A
Software Manjaro Linux, W10 if I have to
TL;DR: what it REALLY comes down to... is how keen you are to early-adopt RT. Except now it's not the Turing days, where the competition had nothing to place against that consideration - the competition has a technically more durable product, and it even does RT too! That's a pretty steep price tag to keep going green, if you ask me.

Exactly. I am not interested in early-adopting RTRT in the least. If AMD were to offer an RT-disabled option, I would buy that... This seems to deeply offend some people. I can't understand it lol
 
Joined
Nov 26, 2020
Messages
106 (0.08/day)
Location
Germany
System Name Meeeh
Processor 8700K at 5.2 GHz
Memory 32 GB 3600/CL15
Video Card(s) Asus RTX 3080 TUF OC @ +175 MHz
Storage 1TB Samsung 970 Evo Plus
Display(s) 1440p, 165 Hz, IPS
I know for sure that I will be enabling ray tracing in Cyberpunk 2077 - massive difference in image quality, unless you are blind; there is plenty of in-game footage showcasing it. AMD's 6000 series will get RT support in this game sometime next year, but the performance hit on AMD is massive, with the 3070 beating the 6800 XT. At least the 3070 has the option to turn on DLSS 2.0 to increase fps by up to 100% and mitigate the hit - or simply turn RT off and enjoy a massive fps gain. The choice is yours.

RT is here to stay and will only get more and more common going forward. AMD needs to improve their RT performance A LOT for the 7000 series.

AMD users downplay the importance of RT, DLSS and tons of other RTX features because, you know why.. :laugh:

Those High and Ultra presets are going to include RT in a few years
 
Exactly. I am not interested in early-adopting RTRT in the least. If AMD were to offer an RT-disabled option, I would buy that... This seems to deeply offend some people. I can't understand it lol
The reverse is also true - some people seem to be deeply offended by others wanting to try or early adopt RTRT.
The cost of hardware RT acceleration is not high to begin with.
I am talking about settings in general not just textures.
The context of the thread is VRAM and future-proofing based on VRAM amounts. The main setting that affects this is texture quality.
 
I know for sure that I will be enabling ray tracing in Cyberpunk 2077 - massive difference in image quality, unless you are blind; there is plenty of in-game footage showcasing it. AMD's 6000 series will get RT support in this game sometime next year, but the performance hit on AMD is massive, with the 3070 beating the 6800 XT. At least the 3070 has the option to turn on DLSS 2.0 to increase fps by up to 100% and mitigate the hit

It's just a late-to-the-party add-on in CP2077, and most of the lighting is intact whether RTX is on or off.

I'm not sure what you've seen, though - got some examples? I'm open to that killer app... really, I am.

Exactly. I am not interested in early adopting RTRT in the least. If AMD were to offer a RT disabled option I would buy that... This seems to deeply offend some people. I can't understand it lol

Part of the defense is that people feel like they have to change camps. When you've been running a certain tint for a long time, it grows on you. I notice the same thing: a strange reluctance to switch, because I realistically don't have hands-on experience with RDNA2 yet, and I do have it with Nvidia.

Still, though, I'm thoroughly unimpressed with the product line Nvidia is producing right now, especially after the years (!) of Turing, which were also extremely weak. They didn't push much forward; all we really got was paying a big fat RT tax. They pre-empted RT and failed miserably from a consumer standpoint. What have we really got now? A 2080 Ti that got eclipsed in a year, and a SUPER line-up that was too late to matter. And now they follow up with these measly VRAM amounts? GTFO. Not worth my cash.

I see a lot of corporate push because Nvidia had bad numbers post-mining and post-Pascal. They used RT to have 'the next best thing'. Good for shareholders, but I'm not seeing my benefit. AMD said it right at the time: until the midrange (the consoles) starts moving, it's dead anyway.
 
The reverse is also true - some people seem to be deeply offended by others wanting to try or early adopt RTRT.
The cost of hardware RT acceleration is not high to begin with.
The context of the thread is VRAM and future-proofing based on VRAM amounts. The main setting that affects this is texture quality.
Maybe. I haven't seen much of the reverse, though.
 
It's just a late-to-the-party add-on in CP2077, and most of the lighting is intact whether RTX is on or off.

I'm not sure what you've seen, though - got some examples? I'm open to that killer app... really, I am.



Part of the defense is that people feel like they have to change camps. When you've been running a certain tint for a long time, it grows on you. I notice the same thing: a strange reluctance to switch, because I realistically don't have hands-on experience with RDNA2 yet, and I do have it with Nvidia.

Still, though, I'm thoroughly unimpressed with the product line Nvidia is producing right now, especially after the years of Turing, which were also extremely weak. They didn't push much forward; all we really got was paying a big fat RT tax. They pre-empted RT and failed miserably from a consumer standpoint. What have we really got now? A 2080 Ti that got eclipsed in a year, and a SUPER line-up that was too late to matter. And now they follow up with these measly VRAM amounts? GTFO. Not worth my cash.

I see a lot of corporate push because Nvidia had bad numbers post-mining and post-Pascal. They used RT to have 'the next best thing'. Good for shareholders, but I'm not seeing my benefit. AMD said it right at the time: until the midrange (the consoles) starts moving, it's dead anyway.

Yeah, that's why Nvidia has been working closely with CDPR for years now, haha!

Why so mad? Because Nvidia can afford to back a huge title like Cyberpunk? The biggest game release in years?

Weeell, AMD has Godfall - you know, the game with insanely bad review ratings :laugh: :laugh:
Most AMD users can't wait for that 12GB texture pack! LMAO! Would rather watch paint dry than play Godfall
 
The context of the thread is VRAM and future-proofing based on VRAM amounts.

This is the comment to which I was replying :

-Reduce Ultra detail to High for better performance: totally unacceptable


The main setting that affects this is texture quality.

That's not really true; the bulk of the memory used is not taken up by textures.
 
By the time 16GB is required for 4K, the 6800 and 6800 XT will have laughable performance anyway; even the 6900 XT will look like a low-to-midrange solution.

I guess you are new to high-resolution gaming if you think VRAM is going to save a weak GPU.
 
I'm not sure what you've seen, though - got some examples? I'm open to that killer app... really, I am.
That killer app, when it comes to visuals, was Control.
They didn't push much forward; all we really got was paying a big fat RT tax. They pre-empted RT and failed miserably from a consumer standpoint. What have we really got now? A 2080 Ti that got eclipsed in a year, and a SUPER line-up that was too late to matter.
The 2080 Ti was released in September 2018; the 3000 series came in September 2020. Two years.
The big fat RT tax seems questionable at best - RT cores take up about 3% of the die, if not less.
 
That killer app was Control when it comes to visuals.

That "killer app" came out one year after the 2000 series launched, and it came with an unbelievable performance hit.
 
Low quality post by Sovsefanden
That "killer app" came out one year after the 2000 series launched, and it came with an unbelievable performance hit.

You have no experience with RTX or Control, it seems, LMAO... as I expected.

You own a GTX card and are considering AMD's 6000 series, so you are in denial.

Wait till you see Cyberpunk screenshots using RTX, and then the dull-looking OFF image shortly after - that's the AMD experience.
 
That "killer app" came out one year after the 2000 series launched, and it came with an unbelievable performance hit.
There really haven't been that many major changes in graphics for a long while. The last one was the lower-level APIs, DX12/Vulkan, and that effectively took even longer (and still affects DXR adoption). The low-level APIs' purpose was a performance boost, not visuals, and that is also still in progress, with varying results. Back when new stuff was introduced every couple of years, major performance hits with the latest and greatest were the norm. Remember the performance hit from tessellation, or from AA (not even accounting for the initial supersampling)? There are things that need to be figured out with every new technique, and 2 years is not a long time for that. A lot of groundwork has been laid, though - the APIs are there, the big engines have support by now, etc.

The expectation that a new effect will come at no performance cost, or a minor one, is naive - especially with something like RTRT, which is well known as a technology and has very clear performance implications. Glad to see AMD can now start contributing to improving RTRT.

Edit:
The biggest short-term improvements in RTRT performance are not likely to come from increasing the performance of the current RT acceleration hardware. That is very easy to scale up if needed, but even Nvidia is clearly holding back from adding more RT cores. Optimizing the ray projection is also pretty well-known territory. There is a big base hit on performance that comes from setting up the scene and data; this is seeing slow but constant improvement. I would speculate that once a "standard" enough solution is reached, manufacturers will start to look at hardware acceleration around that as well.
 
Yeah, that's why Nvidia has been working closely with CDPR for years now, haha!

Why so mad? Because Nvidia can afford to back a huge title like Cyberpunk? The biggest game release in years?

Weeell, AMD has Godfall - you know, the game with insanely bad review ratings :laugh: :laugh:
Most AMD users can't wait for that 12GB texture pack! LMAO! Would rather watch paint dry than play Godfall

Look, if you're going to honor your post count per day and actually come here to troll the usual AMD-Nvidia rage debate, go elsewhere. So far you did quite well, but this is taking things into the gutter. Stahp. It won't work, and you won't be happier for going there. We have experience ;)

I've asked you for examples SHOWING the marked difference in Cyberpunk 2077 with RTX on and off. I ask this specifically because you seem to have seen major differences, but I never did. It's an honest question. Either answer it with honesty, or GTFO - it's a pointless debate if we're not backing up anything we say with good arguments or evidence.
 
It's funny that people think the 16GB will make the 6800 series relevant in 4-5 years, though - it will be considered absolute garbage at that point.

Look, if you're going to honor your post count per day and actually come here to troll the usual AMD-Nvidia rage debate, go elsewhere. So far you did quite well, but this is taking things into the gutter. Stahp. It won't work, and you won't be happier for going there. We have experience ;)

I've asked you for examples SHOWING the marked difference in Cyberpunk 2077 with RTX on and off. I ask this specifically because you seem to have seen major differences, but I never did. It's an honest question. Either answer it with honesty, or GTFO

Why should I show you? Can't you find it yourself? There are plenty of videos and articles about it, lmao

I know it's hard to accept as a non-RTX owner tho
 
It's funny that people think the 16GB will make the 6800 series relevant in 4-5 years, though - it will be considered absolute garbage at that point.

Why should I show you? Can't you find it yourself? There are plenty of videos and articles about it, lmao

I know it's hard to accept as a non-RTX owner tho

Thank you - the ignore button is one click away, you know. This is just confirming my assumption above about your agenda. Your post count is giving it away.

Why should I be able to find something I did not find yet? What alternate reality is this?
 
Remember the performance hit from Tessellation or AA (not even accounting for the initial supersampling)?

And that massive performance hit still exists a decade later, so much so that tessellation is used sparingly, if ever, in favor of higher-performance alternatives like POM.

Look, if you're going to honor your post count per day and actually come here to troll the usual AMD-Nvidia rage debate, go elsewhere. So far you did quite well, but this is taking things into the gutter. Stahp. It won't work, and you won't be happier for going there. We have experience ;)

I've asked you for examples SHOWING the marked difference in Cyberpunk 2077 with RTX on and off. I ask this specifically because you seem to have seen major differences, but I never did. It's an honest question. Either answer it with honesty, or GTFO - it's a pointless debate if we're not backing up anything we say with good arguments or evidence.

Stop responding to paid trolls (at least I hope he's being paid). Report and move on.
 
Low quality post by Sovsefanden
And that massive performance hit still exists a decade later, so much so that tessellation is used sparingly, if ever, in favor of higher-performance alternatives like POM.



Stop responding to paid trolls (at least I hope he's being paid). Report and move on.

You are calling me a troll only because you are a fanboy and can't look past your own nose. Funny :laugh:

Oh well, 9 more days until I'm playing Cyberpunk maxed out in full RTX glory.
 
And that massive performance hit still exists a decade later, so much so that tessellation is used sparingly, if ever, in favor of higher-performance alternatives like POM.



Stop responding to paid trolls (at least I hope he's being paid). Report and move on.

He's not getting paid, I hope - he's pretty bad at the game. :roll: Lasted less than a week!
 
He's not getting paid I hope, he's pretty bad at the game. :roll:

No, I really hope he is; I cringe at the idea that someone's time could be worth so little that they'd post that kind of garbage for free.
 
And that massive performance hit still exists a decade after, so much so that tessellation is sparingly used if ever in the detriment of higher performance alternatives like POM.
Tessellation has a far, far smaller performance hit than it had in the beginning, and there have been improvements even relatively recently. It is used sparingly, but it does see constant use where needed - it is all over the place.
 