| Component | Spec |
|---|---|
| Processor | i7 8700k, 4.6 GHz @ 1.24 V |
| Motherboard | AsRock Fatal1ty K6 Z370 |
| Cooling | beQuiet! Dark Rock Pro 3 |
| Memory | 16GB Corsair Vengeance LPX 3200/C16 |
| Video Card(s) | ASRock RX7900XT Phantom Gaming |
| Storage | Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD |
| Display(s) | Gigabyte G34QWC (3440x1440) |
| Case | Fractal Design Define R5 |
| Audio Device(s) | Harman Kardon AVR137 + 2.1 |
| Power Supply | EVGA Supernova G2 750W |
| Mouse | XTRFY M42 |
| Keyboard | Lenovo Thinkpad Trackpoint II |
| Software | W10 x64 |
Yeah, I find it odd that there are people who think like this:
- Turn off RT for better performance: perfectly acceptable
- Reduce Ultra detail to High for better performance: totally unacceptable

So some people are blind to RT, yet very picky about details; it doesn't make any sense.
OK, I'll try one more time. But I'll start by saying YES, you are correct: it is a personal consideration. We all try to crystal-ball ourselves out of this, and it's never going to be conclusive until it's too late.
- RT performance is early-adopter territory. The next gen may turn things on their head and make current-day performance obsolete straight away - check the Turing-to-Ampere RT performance jump for proof. Remember, AMD is shipping a lite version of RT in RDNA2. It can go one of two ways: either the industry goes full steam ahead and RDNA3 or beyond pushes RT far harder, or it doesn't, focus shifts back towards better raster performance, and RT takes a place similar to, say, Tessellation - just another effect to use. Nvidia's supposed 'RT advantage' can also dwindle faster than you can blink if devs start optimizing for consoles first. The additional die space Nvidia devotes to it won't be used properly unless Nvidia keeps throwing bags of money around, like it has so far, to get RTX implemented.
It's far too early to declare RT 'here to stay' in the projected way, as a 'major part' of the graphics pipeline. If the market doesn't eat it, it'll die. It's a very expensive effect - look at the price surges and demand issues... they are related.
RT is also not efficient at this time. It's the same thing as enabling an overly costly AA mode that barely shows a benefit. Yes, you *can*... but why? In a large number of situations it really doesn't add much. You can still count the examples where it does on one hand - and you'll have fingers left.
- 10GB of VRAM is not resale-worthy. It's just not; it's yesterday's capacity. The past two gens already had more (the 1080 Ti and 2080 Ti both shipped with 11GB). The fact that we're already discussing it at launch speaks volumes. You buy this card to use it for a few years, and then it gets knocked down the product tiers very fast. I haven't seen anyone disagree with that, by the way, even in this topic; we ALL draw the conclusion that 10GB will impose limitations pretty soon. The idea that this somehow 'scales with the core power' has no basis in the past - historically, we've always seen capacity increase, or at least stay equal, alongside increasing core power. You can't ignore that imbalance. It's there, and it'll show.
- 16GB of VRAM is very resale-worthy, especially given that there is lots of core power on tap and the balance between VRAM and core power, relative to past gens, is kept intact. Well-balanced GPUs last longest; it's just that simple. When they run out of oomph, they run out of everything at the same time, and that tends to take a long while. Until they do... you can resell them. A GPU without that balance doesn't resell like that - you can only resell it for 'conditional' situations, i.e. specific use cases. 'The 3080's a great card for 1440p now' is probably the punchline. You'll instantly lose every potential buyer with a 4K panel or even an ultrawide - your niche just got that much smaller.
- VRAM is used everywhere. If you're short, you'll be tweaking your settings in every game, not just in the ones that may or may not have RT worth looking at. So, going forward: say you buy a 4K monitor three years from now... with a 3080 you might also feel the urge to upgrade the GPU; with a 12-16GB card, you most certainly won't have to. (See the back-of-envelope sketch below for why 4K chews through 10GB.)
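To make that 4K point concrete, here's a rough back-of-envelope budget. Every byte-per-pixel figure and the asset-pool size below are assumptions picked for illustration, not measurements from any real game or engine:

```python
# Rough, illustrative VRAM budget at 4K. Every number below is an
# assumption chosen for this sketch, not data from a real title.

WIDTH, HEIGHT = 3840, 2160
PIXELS = WIDTH * HEIGHT
MiB = 1024 ** 2
GiB = 1024 ** 3

# Per-pixel render targets; bytes per pixel are plausible guesses for a
# generic deferred renderer, not any specific engine's layout.
targets = {
    "G-buffer (4x RGBA16F)": 4 * 8,  # 32 B/px
    "depth/stencil": 4,
    "HDR lighting buffer": 8,
    "TAA / post history": 8,
}
rt_bytes = sum(bpp * PIXELS for bpp in targets.values())
print(f"Render targets: {rt_bytes / MiB:.0f} MiB")  # ~411 MiB

# The big consumer is the streaming pool: textures, geometry and, with RT
# enabled, acceleration structures. Assume an 'Ultra' pool of 8 GiB.
ASSET_POOL = 8 * GiB

total = rt_bytes + ASSET_POOL
print(f"Total: {total / GiB:.2f} GiB of a 10 GiB card")  # ~8.40 GiB
```

The point of the sketch: the fixed render-target cost at 4K is small change; it's the asset pool that dominates, and that pool grows with settings and asset quality regardless of how fast the core is. That's why a fast core with 10GB can hit the memory wall before it runs out of oomph.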
As for a hundred DXR titles... yeah. In a similar vein, we also have 'hundreds' of DX12 titles... that we still prefer to run in DX11, because it's the same thing but better.
TL;DR: what it REALLY comes down to... is how keen you are to early-adopt RT. Except now it's not the Turing days, when the competition had nothing to put up against that consideration - the competition now has a technically more durable product, and it even does RT too! That's a pretty steep price tag to keep going green, if you ask me.