
NVIDIA Extends DirectX Raytracing (DXR) Support to Many GeForce GTX GPUs

Joined
Aug 20, 2007
If AMD could come up with an efficient de-noise method without any dedicated hardware

Yeah, if you have any technical knowledge at all you don't have to guess about this. It simply can't happen with present hardware. Unless we are talking about things like Quake 3...
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Could you elaborate?
There's one RT core attached to each SM. An ideal raytracing ASIC would have one pool of shared memory with all raytracing compute units attached to it. They all have access to the same mesh information and light source information and they illuminate the meshes based on collisions. As I said before, the way Turing is set up, it's meant to complement existing rendering techniques, not replace them. Correct me if I'm wrong, but I don't think there's even a 100% raytraced demo running on RTX. Yeah, NVIDIA compiled a few videos of exclusively raytraced demos, but AMD has done the same with Radeon Rays too; nothing exceptional, and any GPU can do it given enough time.

It makes more sense to me that AMD would develop a raytracing ASIC which operates as a co-processor on the GPU, not unlike the Video Coding Engine and Unified Video Decoder. It would pull meshes and illumination information from the main memory, bounce its rays, and then copy back the updated meshes to be anti-aliased.

All that information is generally in the VRAM. A raytracing ASIC would be more cache than anything else.
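For anyone wondering what "illuminate the meshes based on collisions" boils down to, here is a minimal sketch of the inner loop such a unit would run, assuming a plain list of triangles and a single light. All structures and names below are illustrative assumptions, not any vendor's actual implementation: a Möller–Trumbore ray/triangle test plus a simple Lambert term at the hit point.

```cpp
#include <cmath>
#include <optional>

struct Vec3 {
    float x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 cross(const Vec3& o) const {
        return {y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x};
    }
    float dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
};

// Moller-Trumbore ray/triangle intersection: returns the hit distance along the ray,
// or nothing on a miss. This is the "collision" half of the job.
std::optional<float> intersect(const Vec3& orig, const Vec3& dir,
                               const Vec3& v0, const Vec3& v1, const Vec3& v2) {
    const float eps = 1e-7f;
    Vec3 e1 = v1 - v0, e2 = v2 - v0;
    Vec3 p = dir.cross(e2);
    float det = e1.dot(p);
    if (std::fabs(det) < eps) return std::nullopt;   // ray parallel to the triangle plane
    float inv = 1.0f / det;
    Vec3 s = orig - v0;
    float u = s.dot(p) * inv;
    if (u < 0.0f || u > 1.0f) return std::nullopt;   // outside the triangle
    Vec3 q = s.cross(e1);
    float v = dir.dot(q) * inv;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    float t = e2.dot(q) * inv;
    if (t <= eps) return std::nullopt;               // hit is behind the ray origin
    return t;
}

// The "illuminate" half: a Lambert term at the hit point (n and toLight assumed normalized).
float lambert(const Vec3& n, const Vec3& toLight) {
    return std::fmax(0.0f, n.dot(toLight));
}
```

In practice the triangles sit behind an acceleration structure (a BVH), so most of these tests never run; walking that structure is exactly what dedicated ray-tracing hardware is built to speed up.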
 
Joined
Feb 15, 2019
Yeah, if you have any technical knowledge at all you don't have to guess about this. It simply can't happen with present hardware. Unless we are talking about things like Quake 3...

The same debate happened when AMD first announced FreeSync.
And we knew the outcome.
 
Joined
Aug 20, 2007
The same debate happened when AMD first announced FreeSync.
And we knew the outcome.

This is different. Adaptive sync was already in the queue for the VESA standard. It utilized technology that tech people already knew existed (even Blur Busters noted this in their G-Sync analysis).

You are asking AMD to do something their chips literally do not have the facilities to do. At best, they can achieve something akin to emulation.

It simply won't happen, because short of new hardware, it can't happen.

There's one RT core attached to each SM. An ideal raytracing ASIC would have one pool of shared memory with all raytracing compute units attached to it. They all have access to the same mesh information and light source information and they illuminate the meshes based on collisions.

That's because all the RT cores do is the magic denoising.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Then why do they call them RT cores? Denoising really has nothing to do with raytracing. It is just averaging pixels to cover up the lousy job their limited number of rays do.
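To make the "averaging pixels" point concrete, here is a deliberately crude sketch of the simplest possible denoiser, a 3x3 box filter over a noisy luminance buffer (the layout and names are assumptions for illustration only). Real denoisers are spatio-temporal or AI-driven, but the underlying idea is the same: blend neighbouring samples to hide the variance left by a low ray count.

```cpp
#include <vector>

// Toy 3x3 box-filter "denoiser": every output pixel becomes the average of its
// neighbourhood, trading high-frequency noise for blur.
std::vector<float> denoise(const std::vector<float>& img, int w, int h) {
    std::vector<float> out(img.size());
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            float sum = 0.0f;
            int n = 0;
            for (int dy = -1; dy <= 1; ++dy) {
                for (int dx = -1; dx <= 1; ++dx) {
                    int nx = x + dx, ny = y + dy;
                    if (nx >= 0 && nx < w && ny >= 0 && ny < h) {
                        sum += img[ny * w + nx];
                        ++n;
                    }
                }
            }
            out[y * w + x] = sum / n;
        }
    }
    return out;
}
```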
 
Joined
Feb 15, 2019
This is different.

It is the same.
Back when AMD first announced FreeSync,
people speculated about whether it was possible without the expensive FPGA module embedded inside the monitor.
Turns out it is.

Nvidia themselves have shown you RTRT is possible without ASIC acceleration.
The Pascal architecture wasn't built with RTRT in mind, yet it can run RTRT with decent results.
It is quite impressive that in a ray-traced game a 1080 Ti could do 70% of the fps of a 2060, with the very first driver that enabled Pascal to do this (poor optimization and all).

RT cores are dead weight in conventional rasterization, but CUDA cores can do both.
The same goes for AMD.
ASICs for RTRT will soon be obsolete, just like the ASIC for PhysX.
 
Joined
Apr 30, 2012
That's because all the RT cores do is the magic denoising.

Just saying

Nvidia said:
The RT Core includes two specialized units. The first unit does bounding box tests, and the second unit does ray-triangle intersection tests.

"Specialized" = Fixed function

To my knowledge, Nvidia has never said, outside of the OptiX AI denoiser, that the denoiser is hardware-accelerated. It only ever references them as "fast".

Denoisers are Filters and will vary as such.

CG said:
OptiX may not be ideal for animation. For animation we suggest using V-Ray’s denoiser with cross-frame denoising.
 
Joined
Aug 2, 2011
If this is the case, GCN using async compute should be able to do very well at DXR without modification.

I guess we'll find out when AMD debuts DXR support.
the GPU could properly async (like GCN can)

You do know that Turing is fully async compute capable, right? It can execute INT and FP instructions concurrently.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
That's within a warp. Async is about filling the gaps between warps: the scheduler finds idle resources, then applies async compute workloads to them. What NVIDIA does is switch a warp from graphics to compute, then back again. It isn't capable of filling in idle resources the way GCN does.

Turing in particular needs FP32 and FP16 instructions in flight simultaneously in each warp, or large swaths of transistors end up idling:
https://www.anandtech.com/show/13973/nvidia-gtx-1660-ti-review-feat-evga-xc-gaming/2
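As a purely illustrative toy model of the two policies being contrasted here (all numbers invented, nothing measured), "gap filling" versus "finish, switch, come back":

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Pretend each cycle has 4 execution slots and the graphics warps only fill some of them.
    const int slotsPerCycle = 4;
    std::vector<int> gfxSlotsUsed = {3, 2, 4, 1, 3, 2};
    int computeWork = 10;  // slot-sized chunks of async compute work to finish

    // Policy A (gap filling): compute work slots into whatever the graphics warps leave idle.
    int remaining = computeWork, cyclesA = 0;
    for (int used : gfxSlotsUsed) {
        remaining -= std::min(remaining, slotsPerCycle - used);
        ++cyclesA;
    }
    while (remaining > 0) { remaining -= slotsPerCycle; ++cyclesA; }

    // Policy B (switch-based): run all graphics cycles first, then dedicate whole cycles to compute.
    int cyclesB = static_cast<int>(gfxSlotsUsed.size())
                + (computeWork + slotsPerCycle - 1) / slotsPerCycle;

    std::printf("gap filling: %d cycles, switch-based: %d cycles\n", cyclesA, cyclesB);
    return 0;
}
```

With these made-up numbers the gap-filling policy finishes in 7 cycles versus 9, which is the core of the argument: idle slots are free throughput if the scheduler can actually use them.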
 
Joined
Feb 3, 2017
That's because all the RT cores do is the magic denoising.
No, they do not. Tensor cores (!= RT cores) can do denoising.
Back when AMD first announced FreeSync,
people speculated about whether it was possible without the expensive FPGA module embedded inside the monitor.
Turns out it is.
Without FPGA, sure. Without specialized hardware - nope. Adaptive sync support is in the scaler. Monitor manufacturers use an ASIC for it and it is probably just a question of scale where they are able to order bigger batches of chips. That is also the reason why Adaptive sync support took as long as it did to become a common feature.
It is quite impressive that in a ray-traced game a 1080 Ti could do 70% of the fps of a 2060, with the very first driver that enabled Pascal to do this (poor optimization and all).
I seriously doubt this. Nvidia has had OptiX in the professional space for years. The DXR part specifically is probably as well optimized as it can be at this point.
 
Joined
Mar 8, 2013
That's within a warp. Async is about filling the gaps between warps: the scheduler finds idle resources, then applies async compute workloads to them. What NVIDIA does is switch a warp from graphics to compute, then back again. It isn't capable of filling in idle resources the way GCN does.

Turing in particular needs FP32 and FP16 instructions in flight simultaneously in each warp, or large swaths of transistors end up idling:
https://www.anandtech.com/show/13973/nvidia-gtx-1660-ti-review-feat-evga-xc-gaming/2
Turing seems to be far better than Pascal when it comes to async compute. Are you sure the deficiencies of Pascal in this area apply to Turing overall?
 
Joined
Jul 9, 2015
people speculated about whether it was possible without the expensive FPGA module embedded inside the monitor.
I recall in anandtech comments a guy who claimed to work at AMD stating that GPU driving screen refreshing has been there in the notebook world for a while. I forgot the standard's name, but there was nothing to speculate about, it was literally there.
 
Joined
Sep 17, 2014
i'm not sure yet, i got lots of other things to test and nvidia has provided numbers anyway. it'll also be a ton of work to test all this.

i'm thinking about adding a rtx section with just one game (metro?) to future gpu reviews

I honestly wouldn't go down this rabbit hole. Metro is in no way representative of RTX capability. It just shows us what a global illumination pass does and costs. Any other game may implement more or fewer features and work out entirely differently. It won't be that informative, really. I think focused feature articles like you've done up until now are a much better way to do it.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
I honestly wouldn't go down this rabbit hole. Metro is in no way representative of RTX capability. It just shows us what a global illumination pass does and costs. Any other game may implement more or fewer features and work out entirely differently. It won't be that informative, really. I think focused feature articles like you've done up until now are a much better way to do it.
Not gonna stop with the per-game launch articles

Which game would you choose instead of Metro? and why?
 
Joined
Sep 17, 2014
Not gonna stop with the per-game launch articles

Which game would you choose instead of Metro? and why?

I think it's too early to choose a game as a 'benchmark' for RT performance yet. We've seen two meagre attempts that entered the dev cycle late in the development of both games; I don't think they're going to be representative of anything that comes next.

The main problem is that it's so very abstract. Nvidia uses performance levels without a clear distinction for RTX On. Everything is fluid here; they will always balance the amount of RT around the end-user performance requirements; the BF V patch was a good example of that. What really matters is the actual quality improvement and then the performance hit required to get there. This is not like saying 'MSAA x4 versus MSAA x8' or something like that. There is no linearity, no predictability, and the feature set can be used to any extent.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
I think it's too early to choose a game as a 'benchmark' for RT performance yet
But isn't "Metro RTX" in reviews better than nothing? Once more titles come out I'm definitely open to changing the title when rebenching
 

bug

Joined
May 22, 2015
It is relevant.
This "RTX being sort of supported on Pascal cards" move has one purpose only: to convince people to "upgrade / side-grade" to RTX cards.
Then what is the point of this "upgrade / side-grade" when there are not even a handful of ray-tracing games out there?

RTX is based on DXR, which is just one part of the DX12 API; RTX itself is not comparable to the fully featured DX10 / DX11 / DX12.
Compare it to Nvidia's last hardware-accelerated marketing gimmick, a.k.a. PhysX, though.
It was the same thing again: hardware PhysX got 4 games back in 2008 and 7 in 2009, then hardware PhysX simply faded out and is now open sourced.
Now we've got 3 ray-traced games in 7 months. Sounds familiar?
Again, apples and oranges. PhysX was built as a closed solution, whereas RTX is DXR (part of a pretty "standard" API) with sugar on top.
Please try to word your replies better. Or in case your only point was "Nvidia is trying to scam people out of their money" or "RTX will die because I don't like it as it is now", we've already had hundreds of posts about that.
 
Joined
Sep 22, 2012
NVIDIA literally makes fools of people. Are you aware how many quality parts for music, guns, etc. you could buy for the price of a single RTX 2080 Ti, a card that becomes outdated within 3 years?
After 5 years it's almost garbage. It's very rare in any industry to find something comparable that costs so much.
Extreme-quality audio equipment such as amplifiers, turntables and speakers can be bought for 1K and will last decades.
Compare, for example, a studio/club turntable from Technics that is reaching its 50-year anniversary, an indestructible 15 kg monster with every single part replaceable, which can be bought for the price of an RTX 2080, or extreme-quality speakers. It's pure insanity where the gaming industry has led us.
Turing was a turning point for me, the moment I figured out that from now on I will play games 2-3 years after launch and pay for only one platform per generation, not two with the same memory types and similar core architectures, because that's literally throwing money away.
You can buy 85 vinyls or 150 CDs, on average, for the price of a single custom-built RTX 2080 Ti. That's abnormal.
You can buy a Benelli M4 12ga tactical shotgun for the price of an RTX 2080 Ti... When you look at a little piece of plastic with a chip that will be outdated within 5 years, it's funny.

I had 700 euros prepared and was waiting for the RTX 2080 Ti, with the option to add 100-150 euros more, even 200 for some premium models; I was hoping for maybe even a K|NGP|N version.
I was not even close to buying a card 30% stronger than the GTX 1080 Ti. In the end I didn't even have enough for an RTX 2080 and would have needed to add significant money.
One option was to buy a second-hand GTX 1080 Ti and spend the rest on a 1TB M.2, but I will remember this GeForce architecture the way many of you will remember the next generations, and I have given up on brand-new cards. You can play just fine on the previous high-end model, bought second hand with 1-2 years of warranty for under $500.

I knew that the moment people become aware of how much they pay for Intel processors and NVIDIA graphics cards, they will change their approach, and then even a significant price drop will not help.
If NVIDIA dropped prices by $150 now, it would not change anything significantly; they would sell more GPUs, but not as many as they expect.

PC gaming makes sense if you can afford a GTX 1080, GTX 1080 Ti, RTX 2080/2080 Ti, or eventually an RTX 2070.
But investing a few hundred in a lower class of GPU... better to get a PS4.
 
Joined
Sep 17, 2014
But isn't "Metro RTX" in reviews better than nothing? Once more titles come out I'm definitely open to changing the title when rebenching

Better than nothing, yes, but it's also riding the Nvidia marketing wave, which I think is out of place for neutral reviewers.

We didn't test 'Hairworks games' or 'PhysX games' back when Nvidia pushed games with those technologies either. As it stands today, there is no practical difference apart from it being called 'DXR'. There was never a feature article for other proprietary tech either, such as 'Turf Effects' or HBAO+, so in that sense the articles we get are already doing the new technology (and its potential) justice. In reviews you want to highlight a setup that applies to all hardware.

It's a pretty fundamental question, really. If this turns out to be a dud, you will be seen as biased if you consistently push the RTX button. And if it becomes widely supported technology, the performance we see today has about zero bearing on it. The story changes when AMD or another competitor also brings a hardware solution for it.

Something to think about. This is why I feel focused articles that really lay out the visual and performance differences in detail are actually worth something, while another RTX bar in the bar chart most certainly is not.

ASICs for RTRT will soon be obsolete, just like the ASIC for PhysX.

The only caveat there in my opinion is perf/watt. The RT perf/watt of dedicated hardware is much stronger, and if you consider the TDP of the 2080ti, there isn't really enough headroom to do everything on CUDA.
 

bug

Joined
May 22, 2015
NVIDIA literally makes fools of people. Are you aware how many quality parts for music, guns, etc. you could buy for the price of a single RTX 2080 Ti, a card that becomes outdated within 3 years?
After 5 years it's almost garbage. It's very rare in any industry to find something comparable that costs so much.
Extreme-quality audio equipment such as amplifiers, turntables and speakers can be bought for 1K and will last decades.
Compare, for example, a studio/club turntable from Technics that is reaching its 50-year anniversary, an indestructible 15 kg monster with every single part replaceable, which can be bought for the price of an RTX 2080, or extreme-quality speakers. It's pure insanity where the gaming industry has led us.
Turing was a turning point for me, the moment I figured out that from now on I will play games 2-3 years after launch and pay for only one platform per generation, not two with the same memory types and similar core architectures, because that's literally throwing money away.
You can buy 85 vinyls or 150 CDs, on average, for the price of a single custom-built RTX 2080 Ti. That's abnormal.
You can buy a Benelli M4 12ga tactical shotgun for the price of an RTX 2080 Ti... When you look at a little piece of plastic with a chip that will be outdated within 5 years, it's funny.
Using the same logic, I assume you'd never buy a BMW because of the 7-series or i8 pricing. And you won't buy a Mercedes either, because Maybach.
 
Joined
Feb 15, 2019
we've already had hundreds of posts about that.

It is because people agreed on that.
Listen to the people.

Yes, "Nvidia is trying to scam people out of their money."
And yes, Nvidia RTX will surely die because of how badly they ruined it, but DXR lives on.
 
Joined
Jul 9, 2015
Monitor manufacturers use an ASIC for it and it is probably just a question of scale where they are able to order bigger batches of chips. That is also the reason why Adaptive sync support took as long as it did to become a common feature.
BULLSHIT.

It was a cheap add-on feature that manufacturers producing upscaler chips included free of charge.
It took customer reluctance to pay a 200+ premium per monitor for the nVidia-branded version of it.
G-Sync-branded notebooks didn't use it either; see above.

But once G-Sync failed and FreeSync won, let's just call the latter "Adaptive Sync", shall we? It downplays what AMD has accomplished and all.

I assume you'd never buy a BMW because of the 7-series or i8 pricing. And you won't buy a Mercedes either, because Maybach.
Bringing the ultra-competitive, low-margin automotive business into this is beyond ridiculous.
 
Joined
Feb 3, 2017
I recall in anandtech comments a guy who claimed to work at AMD stating that GPU driving screen refreshing has been there in the notebook world for a while. I forgot the standard's name, but there was nothing to speculate about, it was literally there.
eDP, Embedded DisplayPort.
It was a cheap add-on feature that manufacturers producing upscaler chips included free of charge.
It took customer reluctance to pay a 200+ premium per monitor for the nVidia-branded version of it.
G-Sync-branded notebooks didn't use it either; see above.
Yes, manufacturers producing upscaler chips did include the feature. The correct question is: when?
"It took customer reluctance to pay a 200+ premium per monitor" sounds like "free" really wasn't the reason.
The G-Sync module would not physically fit in a notebook. Besides, isn't using established standards exactly what is encouraged? :)

Wrong thread for this though.
 