
Assetto Corsa Competizione Dumps NVIDIA RTX

Joined
Sep 17, 2014
Messages
10,419 (5.46/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
Great help. I only have 1h 45min of video to watch, looking for a phrase :rockout:

Edit: And while I haven't found the statement just yet, it seems it means what I thought it meant: https://www.reddit.com/r/nvidia/comments/9aopb1
The irony of this story: if you put the ground rules and assets in place for a raster scene, it behaves exactly the same way. Many engines run simulations, just as RT is a simulation. And it does handle the code that's there much more efficiently, because it doesn't calculate all sorts of things it won't use (no probing, no culling).

It's potato-potahto material, and it all takes work, while Nvidia has provided zero proof that workflows magically require fewer man-hours for similar results. Just guesstimates induced by a healthy dose of marketing for the next best thing.

Nothing 'just works': all those things RT does on its own are useless because we lack the horsepower to push them anyway. So you end up spending an equal amount of time fixing all of that.

The only thing you need less of with RT is talented devs and designers. Raster takes more skill to get right, not more time. RT is just a lazy package brute-forcing it for you and passing the bill to end users.

I've seen it too often. New ways of working, new algorithms... and yet every half-serious dev squad has a backlog to keep them going for years...
 
Last edited:
Joined
Mar 7, 2010
Messages
527 (0.15/day)
Location
Michigan
System Name Daves
Processor AMD Ryzen 1700 @ 4.00
Motherboard AsRock X370 Killer SLI/ac
Cooling Corsair H110i
Memory 16 GIG GSKILL Ripjaw @ 2400
Video Card(s) Gigabyte GTX 1070 G1
Storage Crucial M.2 250 Samsung 840 EVO 250-Samsung 850 Pro-WD 1 TB
Display(s) LG 27
Case NZXT
Audio Device(s) N/A
Power Supply EVGA 750
Mouse EVGA
Keyboard Corsair Strafe
Software Windows 10 Home
Ray Tracing will become a thing, when it is a thing.
It's not a thing.... Yet!
 
Joined
Jan 8, 2017
Messages
4,597 (4.32/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Scythe Katana 4 - 3x 120mm case fans
Memory 16GB - Corsair Vengeance LPX
Video Card(s) OEM Dell GTX 1080
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Zalman R1
Power Supply 500W
The gist of all this is that developers are starting to realize that "hey, this is a lot of work and we have zero incentive to do it". That's all.
 
Joined
Nov 6, 2016
Messages
229 (0.20/day)
Location
NJ, USA
If and when competitors (AMD primarily) come out with an RT solution, it will benefit immensely from the work developers and Nvidia are doing right now.
I could be wrong, but I feel like that'd be the first time in a very long time (if ever) that Nvidia will have done something to benefit the entire community and advance the industry as a whole. While loyal fanboys will seek to construe the following statement as taking sides, I feel as though it's been AMD that has continually made such advances to everyone's benefit (and obviously to their own; the two are not mutually exclusive), e.g. HBM, Mantle/DX12, open-source initiatives, etc.
 
Joined
Jan 13, 2011
Messages
189 (0.06/day)
It's strange to see people actively argue against ray tracing in general.


EDIT:
The performance hit when enabling RTX is just too great to be worth it with current HW. Maybe when it's down to a 5-10% hit, it may be appealing.
I wonder how much of that is the hardware vs. how much of it is bad implementation. Like in the days of Crysis 2, where you got jersey barriers with thousands of polygons in their meshes and tessellated water placed over the entire level whether you could see it or not.
 
Joined
Nov 6, 2016
Messages
229 (0.20/day)
Location
NJ, USA
@btarunr

Why the sensationalist title?

Why not use "drop" instead of "dump"? As a journalist you must remain neutral.
How a given individual takes a word's implicit suggestions is completely subjective; some, like myself, didn't see any implied suggestion in the use of the word "dump". Not an accusation, but arguably the very fact that you perceived such "bias" is a bigger indicator of your own bias and preferred allegiances/loyalties than anything else, i.e. it implies you're acting as a self-appointed defender and apologist for Nvidia (again, not an accusation I am making, but an opinion and analysis I could imagine many commenters making).
 
Joined
Jun 16, 2016
Messages
344 (0.27/day)
System Name Baxter
Processor Intel i7-5775C @ 4.2 GHz 1.35 V
Motherboard ASRock Z97-E ITX/AC
Cooling Scythe Big Shuriken 3 with Noctua NF-A12 fan
Memory 16 GB 2400 MHz CL11 HyperX Savage DDR3
Video Card(s) EVGA RTX 2070 Super Black @ 1950 MHz
Storage 256 GB Lexar NM520 NVMe SSD (boot), 500 GB Samsung 850 EVO, and 4TB Toshiba X300 7200 RPM HDD
Display(s) Vizio P65-F1 4KTV (4k60 with HDR or 1080p120)
Case Raijintek Ophion
Audio Device(s) HDMI PCM 5.1, Vizio 5.1 surround sound
Power Supply Corsair SF600 Platinum 600 W SFX PSU
Mouse Logitech MX Master 2S
Keyboard Logitech G613 and Microsoft Media Keyboard
All the multiplatform games will have some sort of DXR path for lighting and/or shadows IF the new consoles come out with support for raytracing. Otherwise, the situation with the Assetto Corsa devs will probably keep happening for any game that's not a blockbuster. The feature will be a selling point IF the consoles provide a wide install base and the usual suspects compete for best looking game by using the new capabilities.

I bought a 2070 Super knowing I appreciated the 1080 Ti performance at rasterization. I did not buy it for RTX. But I am hoping that in the next three years, which is probably how long I'll have this card, enough games will have options for it that I can at least try them out. If it never happens, I'll still have a good graphics card that just happens to do more than I need. This is exactly why the RTX series was so disappointing in the beginning, because the price per performance stayed flat and we got a feature no one cares about. The SUPER line does provide the needed bump in raster performance and we get something extra.
 
Joined
Sep 17, 2014
Messages
10,419 (5.46/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
I could be wrong, but I feel like that'd be the first time in a very long time (if ever) that Nvidia will have done something to benefit the entire community and advance the industry as a whole. While loyal fanboys will seek to construe the following statement as taking sides, I feel as though it's been AMD that has continually made such advances to everyone's benefit (and obviously to their own; the two are not mutually exclusive), e.g. HBM, Mantle/DX12, open-source initiatives, etc.
I disagree on that one.

Many people have complained about GameWorks and they still do. But let's get one thing straight: GameWorks technologies have trickled down into the 'used everywhere' space, and they still do. PhysX is the only notable one that really didn't - and yet CPU PhysX did, and now the SDK is open to all, I believe. But think about AA tech for a minute; Nvidia has a major share in this. You can also complain about G-Sync, but they were still first to offer it and to create a market for it. I was never a fan of the model they used, and vastly prefer the FreeSync approach, but it's still there. In some ways your enemy is your friend in this business, because all progress is progress. Also, think about the development of GPU Boost, or improved delta compression. Many of those were later copied by AMD, with their own sauce no less, but still (what's really different is how they implement it on their own arch). While AMD was suffering HBM shortages, Nvidia simply produced Maxwell GPUs with 256-bit buses doing just fine - and still doing that today - for high-end GPUs. That cost advantage echoed in the price of, say, a GTX 970, which took the market by storm with very good perf/dollar.

And what about all those open AMD initiatives? Only a few of them really stick, such as Vulkan, in which AMD is merely one of the players. Many things AMD portrays as 'open initiatives' are industry-wide movements they also have a flavor of. Mantle/DX12/low-level access, for example, is much bigger than AMD and is not new either, even though some like to attribute all of that to the Mantle initiative (as if BF is of any major relevance in the market - see the lukewarm BFV RTX reception). I think if AMD contributed to the industry, their biggest achievements are in the CPU area instead.

In addition, the performance leader does push the industry forward. It's that simple: more performance means developers can optimize for a higher-performance mainstream. For PC gaming, I would definitely say Nvidia has injected more money into developers and overall graphics fidelity than AMD over the last few decades. There is a LOT of support being provided under the hood through dev channels. That 65-80% market share means there is a lot of margin to spend that money and even give it away. There is even a form of responsibility, because if Nvidia doesn't take care of the content creators, they won't create nicer games that need higher-performance GPUs - the two feed each other. I'm not really complaining about that, to be honest, because dev money leads to in-game results and those favor everyone.

Another point could be made for AMD, though. The console performance level dictates what most big-budget games get optimized around, and better AMD console GPUs mean better PC games. Unfortunately, consoles are not the cutting edge, so while this is progress, it won't usually 'feel' like it - from a PC gamer's perspective it actually feels like being slowed down.

On topic - this RTX push. There is an equally large group explaining it as just that: pushing the industry forward, or Nvidia mostly serving itself. Both are true, really.

A matter of perspective ;)
 
Last edited:
Joined
Apr 30, 2012
Messages
3,473 (1.25/day)
Imagination tried it in 2014; it didn't get them very far, and they targeted devs. They didn't throw money at developers to include the features, though.

Nvidia pours money into these dev houses as a partnership ($$$) to include RTX. Nvidia could just dump more money into Kunos for them to find "the maturity of the technology" worth their while again.
 
Last edited:
Joined
Dec 22, 2011
Messages
2,985 (1.03/day)
System Name Zimmer Frame Rates
Processor Intel i7 920 @ Stock speeds baby
Motherboard EVGA X58 3X SLI
Cooling True 120
Memory Corsair Vengeance 12GB
Video Card(s) Palit GTX 980 Ti Super JetStream
Storage Of course
Display(s) Crossover 27Q 27" 2560x1440
Case Antec 1200
Audio Device(s) Don't be silly
Power Supply XFX 650W Core
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 10
Benchmark Scores Epic
This isn't too surprising in the grand scheme of things; we are all still waiting for DX12 and Vulkan to rule the world, after all.
 
Joined
Jun 28, 2016
Messages
2,860 (2.28/day)
TPU runs almost no articles about RTRT development or new games that start implementing it, but - inevitably - we get an article about a game studio that delays RTRT implementation.

Normally I'd say @btarunr is pro-AMD (like many on this forum). However, AMD has already said RTRT hardware is coming soon, so what's going on? Are you guys just anti-RTRT? Why?

As for ACC itself: it is made by a tiny, highly specialized studio, Kunos Simulazioni. It's under 50 people who basically make a single game. Switching to RTRT will always be a huge cost for such small companies.
For similar reasons ACC still uses UE4 (today used mostly in indie games). They can't afford an in-house engine, and it seems they can't afford to switch to something more modern. It's not that surprising, since graphics have always had lower priority in their games.

For large corporations, shifting to RTRT is going to be fairly easy. Yes, games have to be written differently. Yes, you have to employ different people. But it's really universal (just like RT rendering engines).
With only a few mainstream games released yearly, the cost of the transition will be absorbed easily.
 
Joined
Feb 18, 2005
Messages
2,254 (0.42/day)
Location
United Kingdom
Seriously, who gives a damn what the reasons are? The fact remains it looks and plays like an afterthought, and last I checked we didn't get a discount for beta testing. We are talking about very costly GPUs here that sell with a worse perf/dollar ratio than in 2016. Wake up already.
And yet AMD GPUs, which don't include ray-tracing hardware and are significantly smaller and less complex, sell for almost the same amount of money as NVIDIA's. Remind me who's ripping off customers again, hmmm?
 
Joined
Sep 17, 2014
Messages
10,419 (5.46/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
So that makes the price increase OK? :kookoo: AMD's price is completely unrelated here. And they are also selling costly GPUs with low margins - think of the HBM-equipped Fury, Vega, and RVII.
 
Joined
Jun 28, 2016
Messages
2,860 (2.28/day)
So that makes the price increase OK? :kookoo: AMD's price is completely unrelated here. And they are also selling costly GPUs with low margins - think of the HBM-equipped Fury, Vega, and RVII.
So... we have 2 dGPU makers and you say both are too expensive.
How exactly does that work? :-D
 
Joined
Feb 8, 2012
Messages
2,934 (1.03/day)
Location
Zagreb, Croatia
System Name Windows 7 64-bit Core i5 3570K
Processor Intel Core i5 3570K @ 4.2 GHz, 1.26 V
Motherboard Gigabyte GA-Z77MX-D3H
Cooling Scythe Katana 4
Memory 4 x 4 GB G-Skill Sniper DDR3 @ 1600 MHz
Video Card(s) Gainward NVIDIA GeForce GTX 970 Phantom
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case CoolerMaster Silencio 550
Audio Device(s) VIA HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 7 Enterprise 64-bit SP1
As for tweaking lighting, I don't think DXR really simplifies much... with RT you probably need to adjust any and all surfaces for light-bounce properties, otherwise you'll get pretty derpy lighting artifacts.
That's why most games in the last year use a physically based shader with an emission texture map, which makes this a non-issue for digital assets created by hand... it can look off with photogrammetry if it is not done right.
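Roughly (an illustrative C++-style sketch with made-up names - Vec3, PbrMaterial, shade - not any particular engine's code): the emission map is just another material channel that gets added during shading, so under RT any surface with a non-zero emissive value simply acts as a light source, without hand-placed lights to tune.

```cpp
// Toy RGB triple and PBR material; names are illustrative only.
struct Vec3 { float r = 0, g = 0, b = 0; };

struct PbrMaterial {
    Vec3 albedo;    // sampled from the base-color texture
    Vec3 emissive;  // sampled from the emission texture map
};

// Minimal shading step: light reflected off the surface plus the
// surface's own emission. The emissive term is what lets hand-authored
// assets act as light sources on their own under ray tracing.
Vec3 shade(const PbrMaterial& m, const Vec3& incoming) {
    return { m.albedo.r * incoming.r + m.emissive.r,
             m.albedo.g * incoming.g + m.emissive.g,
             m.albedo.b * incoming.b + m.emissive.b };
}
```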
Um dude, ATI TruForm is tessellation...
Exactly the point... how many years from the ATI Radeon 8000 until tessellation was actually used in games? I'd say about 10 years... even 15 if you look at widespread adoption.
Why that long? Fixed-function tessellation is useless; it only makes sense if it's programmable via geometry shaders.
Similarly, RTX as such, in its first incarnation, is evidently almost useless... but not as useless as tessellation on DX8.1 hardware... I'd say just as useful as tessellation on Radeon 5000 series DX11 GPUs - it's there, but performance tanks if you use it.
 

bug

Joined
May 22, 2015
Messages
6,788 (4.09/day)
Processor Intel i5-6600k (AMD Ryzen5 3600 in a box, waiting for a mobo)
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V (@3200)
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
The irony of this story: if you put the ground rules and assets in place for a raster scene, it behaves exactly the same way. Many engines run simulations, just as RT is a simulation. And it does handle the code that's there much more efficiently, because it doesn't calculate all sorts of things it won't use (no probing, no culling).

It's potato-potahto material, and it all takes work, while Nvidia has provided zero proof that workflows magically require fewer man-hours for similar results. Just guesstimates induced by a healthy dose of marketing for the next best thing.

Nothing 'just works': all those things RT does on its own are useless because we lack the horsepower to push them anyway. So you end up spending an equal amount of time fixing all of that.

The only thing you need less of with RT is talented devs and designers. Raster takes more skill to get right, not more time. RT is just a lazy package brute-forcing it for you and passing the bill to end users.

I've seen it too often. New ways of working, new algorithms... and yet every half-serious dev squad has a backlog to keep them going for years...
And yet we haven't had rasterization in movies since the first Jurassic Park. I wonder why that is ;)
 
Joined
Oct 10, 2009
Messages
388 (0.10/day)
Location
Madrid, Spain
System Name Cubito
Processor Core i7-8700K
Motherboard Asus TUF Z390M
Cooling Noctua NH-D15
Memory 32 GB DDR4 3000mhz Corsair Vengeance LPX
Video Card(s) Gigabyte GTX 1070Ti
Storage 2 x Samsung 850 EVO 250GB + Toshiba MQ01ABD100 1 TB x 2
Display(s) Asus ROG Swift PG258Q + Asus ROG Strix XG258
Case Thermaltake Core V21
Audio Device(s) Asus Xonar Xense
Power Supply Corsair RMX750
Mouse Roccat Nyth
Keyboard Corsair K70 MK.2
Software Windows 10 Home x64
It is a chicken and egg problem.
It's not. It's a problem of trying to parcel out new tech for the sake of squeezing the cow while the rest do their thing more slowly, because you don't want to collaborate. We already had PhysX and Havok as a case, or even G-Sync and FreeSync. Eventually we will get there, and everyone is working towards that, but Nvidia ALWAYS applies the walled garden, only to tear it down years later, slowing everyone else in the process. Walled gardens don't work on PC.

Look, I'm hyped over RT like no one else, and Cyberpunk 2077 got me really excited, but it would be better if Nvidia were more open about this.
 

bug

Joined
May 22, 2015
Messages
6,788 (4.09/day)
Processor Intel i5-6600k (AMD Ryzen5 3600 in a box, waiting for a mobo)
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V (@3200)
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
It's not. It's a problem of trying to parcel out new tech for the sake of squeezing the cow while the rest do their thing more slowly, because you don't want to collaborate. We already had PhysX and Havok as a case, or even G-Sync and FreeSync. Eventually we will get there, and everyone is working towards that, but Nvidia ALWAYS applies the walled garden, only to tear it down years later, slowing everyone else in the process. Walled gardens don't work on PC.

Look, I'm hyped over RT like no one else, and Cyberpunk 2077 got me really excited, but it would be better if Nvidia were more open about this.
DXR is not a walled garden. And neither is its Vulkan counterpart (which I'm not sure is finished yet).
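For what it's worth, checking for DXR support goes through the standard Direct3D 12 feature query, not through any vendor-specific API. A minimal sketch (assuming the Windows 10 SDK / d3d12.h headers; SupportsDXR is just a made-up helper name):

```cpp
#include <windows.h>
#include <d3d12.h>

// Returns true if the device exposes DXR (ray tracing tier 1.0 or higher).
// Any vendor implementing the D3D12 spec can report support here.
bool SupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5)))) {
        return false;
    }
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

Whether a given GPU reports tier 1.0 is up to the vendor's driver - which is the point: the API itself is Microsoft's, not Nvidia's.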
 
Joined
Feb 11, 2009
Messages
2,289 (0.58/day)
System Name Cyberline
Processor Intel Core i7 2600k
Motherboard Asus P8P67 LE Rev 3.0
Cooling Tuniq Tower 120
Memory Corsair (4x2) 8gb 1600mhz
Video Card(s) AMD RX480
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb
Display(s) Philips 32inch LPF5605H (television)
Case antec 600
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
@btarunr

Why the sensationalist title?

Why not use "drop" instead of "dump"? As a journalist you must remain neutral.
It's the same thing; you're splitting hairs, either because you yourself are too emotionally invested in this or because you see that 99% of the forum users are.
People are weird, man.

I just don't get it. Haven't we all wanted real-time ray tracing for, like... two decades now?
And it's finally getting worked on, with a good shot at becoming a thing... and people laugh when a company drops support?

I mean, I get it: it's RTX, it's Nvidia, the company that pushes tech... when it exclusively benefits them (cough PhysX cough cough G-Sync cough).
But we all want real-time global illumination to become a thing.

Idk about you guys, but I get sad thinking about, for example, AMD's Mantle - how it was meant to introduce TrueAudio, which would finally give audio quality a much-needed kick in the rear, and that never panned out.
Or how DX12 was supposed to natively support us hooking up whatever GPUs we wanted and having them work together.
Or that Lucid Hydra chip before it.
 
Last edited:
Joined
Nov 4, 2005
Messages
10,442 (2.03/day)
System Name MoFo 2
Processor AMD PhenomII 1100T @ 4.2Ghz
Motherboard Asus Crosshair IV
Cooling Swiftec 655 pump, Apogee GT,, MCR360mm Rad, 1/2 loop.
Memory 8GB DDR3-2133 @ 1900 8.9.9.24 1T
Video Card(s) HD7970 1250/1750
Storage Agility 3 SSD 6TB RAID 0 on RAID Card
Display(s) 46" 1080P Toshiba LCD
Case Rosewill R6A34-BK modded (thanks to MKmods)
Audio Device(s) ATI HDMI
Power Supply 750W PC Power & Cooling modded (thanks to MKmods)
Software A lot.
Benchmark Scores Its fast. Enough.
Another example of a technology with closed, walled-off support in limited fashion that is hard to implement correctly. There are other forms of open-source ray tracing that give superior results with lower computational overhead.

PhysX anyone?
 
Joined
Sep 17, 2014
Messages
10,419 (5.46/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
And yet we haven't had rasterization in movies since the first Jurassic Park. I wonder why that is ;)
Because movies are not an interactive audiovisual experience but a passive one, which can be rendered ahead of time rather than in real time?

(obtuse on purpose, or just a momentary lapse?)

So... we have 2 dGPU makers and you say both are too expensive.
How exactly does that work? :-D
Common sense, a look at the recent past, and an idea of how perf/dollar is supposed to curve - not suddenly peak because both companies decided to slow down on performance gains per generation.

How this works: sales are lower than expected, and companies work harder to sell their next version at a better price or with better features; inventory won't sell, and old stock gets sold at a discount. Or we buy into the marketing nonsense/empty promises and confirm it's a price level we agree with. Commerce does indeed work like that, yes. These two dGPU makers have a history, and we have their GPUs already. We're upgrading, and we can choose not to; it's as simple as that. The only share Nvidia is really capturing by storm with RTX is through the RTX 2060, for all those who didn't feel like paying around 500 bucks for a GTX 1080 three years ago. They got one instead for 350. The rest is just not interesting at all for anyone who already has a midrange or better GPU. Navi is more of the same, really, by the way; 5-10% perf gaps are too silly to even talk about. The upper midrange is simply massively overcrowded right now, and it's been more of the same ever since Pascal was released. So yes. Shit's too expensive, disregarding two or three exceptions and the odd lucky deal.

Seriously, the above two responses from both of you are a clear sign of tunnel vision. You know these answers just as well as I do, but it's a reality you don't like to hear, so you choose to contest it. You can rest assured, it's futile. Whichever way it flies, the reality won't change, and that is that you're paying way too much for a baby step forward. If that is what you want to support or defend, be my guest, but I'm just going to exercise some more patience until products arrive that dó offer what I want at a reasonable price point. And they will appear, no worries.

Same goes for RT-enabled cards and games. When it's truly ready the way 'Crysis' was ready - and we still use the familiar quote today, even though it was unplayable at launch ;) - these fruitless discussions won't even be on anyone's mind. Everyone will be amazed when a product offers something truly groundbreaking, and the momentum it gets will silence any naysayers; you can go back in history for proof of that. The current state of affairs is not that - and that is the point I've been making since RTX was announced. It's good if you can make the distinction between hit and miss, and for any new tech or innovation, timing is everything. AMD can write a book about that, and so can Nvidia.

As time passes and RTRT 'momentum' is still non-existent, perhaps the timing here was completely wrong, and you'd be a complete idiot to pay a premium for that.

DXR is not a walled garden. And neither is its Vulkan counterpart (which I'm not sure is finished yet).
You can build walls in many ways. One of them is having a high admission fee. Another is using custom hardware ahead of everyone else. It means you put most of the effort into your own flowers and step on the ones the others are growing. Nvidia smelled an opportunity for cash and for satisfying shareholders after mining dipped, so it got ahead of the music. They even rushed it so badly that there was barely anything to show us. And there still really isn't much.
 
Last edited:

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
28,280 (6.23/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
RTX is just like SLI and PhysX: a DEAD END.

Another example of a technology with closed, walled-off support in limited fashion that is hard to implement correctly. There are other forms of open-source ray tracing that give superior results with lower computational overhead.

PhysX anyone?
RadeonRays
 

bug

Joined
May 22, 2015
Messages
6,788 (4.09/day)
Processor Intel i5-6600k (AMD Ryzen5 3600 in a box, waiting for a mobo)
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V (@3200)
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Because movies are not an interactive audiovisual experience but a passive one, which can be rendered ahead of time rather than in real time?

(obtuse on purpose, or just a momentary lapse?)
No, I was just pointing out that, everything else being equal, ray tracing still yields better visuals. That's the primary reason RTRT is worth pursuing ;)
 
Joined
Jan 8, 2017
Messages
4,597 (4.32/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Scythe Katana 4 - 3x 120mm case fans
Memory 16GB - Corsair Vengeance LPX
Video Card(s) OEM Dell GTX 1080
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Zalman R1
Power Supply 500W
Open, brand-neutral standards are the way forward.
DXR is brand-neutral; at the end of the day, that's what makes RT work, not RTX.
 