Processor | i7 8700K 4.6 GHz @ 1.24 V |
---|---|
Motherboard | ASRock Fatal1ty K6 Z370 |
Cooling | be quiet! Dark Rock Pro 3 |
Memory | 16GB Corsair Vengeance LPX 3200/C16 |
Video Card(s) | ASRock RX 7900 XT Phantom Gaming |
Storage | Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Fractal Design Define R5 |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | XTRFY M42 |
Keyboard | Lenovo ThinkPad TrackPoint II |
Software | W10 x64 |
Stop watching LinuxTechTips, you'll get brain cancer.
I completely agree with you. I don't need influencers to form an opinion, though, unlike what you seem to be implying.
I said RTX was shit *during* the GDC keynote, before it dawned on most people here, or even on Linus himself. No, I don't need a cookie for that; I mention it only to illustrate my point. Long before the Turing launch I also said this generation would likely be a tiny jump in performance, and when RTX was first announced I predicted they would sell Turing on this feature alone. All of it on this very forum, so you can scan my post history to verify everything.
Crystal ball or common sense, you decide.