
3DMark's Time Spy With Ray Tracing to Be Launched by the End of September

Consoles would still suck if Nvidia controlled them. It's not like AMD hasn't always held that position themselves... and yet consoles have always sucked.
 
Wait, whatever happened to 3DMark Serra? I thought I heard about it around the Vega launch, but I never saw it.
 
This form of ray tracing only works well on shiny objects, so a complete switch to ray tracing is impossible until we start with a new texture format that includes diffusion, refraction, reflection, and diffraction. That will increase the calculation overhead to staggering levels, meaning a game that does 60 FPS at 1080p may run at 15 FPS with real-world-accurate ray tracing.
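To illustrate the point, here's a minimal C++ sketch of my own (the names and layout are purely illustrative, not any real engine's format) of the extra per-surface data such a format would have to carry. Every extra term can spawn more rays per bounce, which is where the overhead piles up:

struct Vec3 { float x, y, z; };

// Hypothetical material record: everything beyond a flat colour
// texture that a physically complete ray tracer has to evaluate.
struct SurfaceMaterial {
    Vec3  albedo;        // diffuse base colour
    Vec3  specular;      // mirror/glossy reflectance
    float roughness;     // how widely reflected rays scatter
    float ior;           // index of refraction, drives refraction
    float transmission;  // fraction of light passing through the surface
};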


Slow your roll there pony.

" NVIDIA also announced the GameWorks SDK will add a ray tracing denoiser module. The updated GameWorks SDK, coming soon, includes ray-traced area shadows and ray-traced glossy reflections."

Glossy reflections only, plus better shadows, which I welcome, and that is all.
Why am I a pony and what was I rolling? I didn't say anything about "real world accurate ray tracing". That was you, and only you.

That BF V RTX demo showed the wood on the gun giving ray-traced reflections, and that wasn't a very shiny object.
Open water in combo with horrid SSR is my biggest visual distraction in games.
 
Consoles are such a small income stream for AMD that it would be more trouble than it's worth.
(I assume you meant to say Nvidia.) Then why is Nvidia so salty? They wanted it, undoubtedly. If that were true, why bother with all of the previous generations? Why bother continuing the failed Tegra for Nintendo?

The truth is they want all of it and are sore losers. No one wants them, because they can't produce a worthwhile SoC.

The longer this continues, especially since AMD is getting Zen money, the more likely it becomes they get usurped and have nothing to fall back on.
 
(I assume you meant to say Nvidia)
Oops, you're right, good catch! :oops: Yeah, the second one was supposed to be Nvidia.

Terms we apply to people, like "sore losers", don't apply to corporations. It really weakens your argument when you portray them as having human personalities. Just a little tip. :)
 
Oops, you're right, good catch! :oops: Yeah, the second one was supposed to be Nvidia.

Terms we apply to people, like "sore losers", don't apply to corporations. It really weakens your argument when you portray them as having human personalities. Just a little tip. :)

It kind of does, depending on how much power the CEO has. I know nothing of Nvidia, though. They've got the dude with the leather jacket, but it's not quite the cult of personality that Oracle/Ellison or Jobs' Apple was, I'm sure.
 
It kind of does, depending on how much power the CEO has. I know nothing of Nvidia, though. They've got the dude with the leather jacket, but it's not quite the cult of personality that Oracle/Ellison or Jobs' Apple was, I'm sure.
No CEO operates in a vacuum. Everything is dictated by profit. Personal feelings don't matter. I've held a good position in a Fortune 500 company, and that's simply how things are viewed.
 
No CEO operates in a vacuum. Everything is dictated by profit. Personal feelings don't matter. I've held a good position in a Fortune 500 company, and that's simply how things are viewed.

That's more common, sure. But I'm just saying there are exceptions. Apple definitely spun on Steve Jobs' axis, especially when he came back... because by that point there was a myth that firing him the first time had destroyed the company. So they embraced him with full gusto the second time.
 
Why am I a pony and what was I rolling? I didn't say anything about "real world accurate ray tracing". That was you, and only you.

That BF V RTX demo showed the wood on the gun giving ray-traced reflections, and that wasn't a very shiny object.
Open water in combo with horrid SSR is my biggest visual distraction in games.


I guess I mean that in my real-world experience very little is glossy or shiny, and that is ALL Nvidia can perform real-time ray tracing on. Perhaps what you saw was pre-cooked footage. That was, and still is, quite common with GameWorks.

"For convexes and trimeshes, it is indeed possible (and recommended) to share the cooked data between all instances."

A hundred boxes you can destroy with PhysX are all precooked data to speed up the process. It's just like every other engine, and it's why games have gotten bigger: a lot of precomputed assets, where you just have to make sure the right one is loaded.
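A rough C++ sketch of what that quote describes, based on the PhysX 3.x-era API as I remember it (exact signatures may differ between SDK versions): the hull is cooked once, and every instance shares the resulting mesh.

#include <PxPhysicsAPI.h>
using namespace physx;

// Cook a convex hull once; the returned mesh can back all 100 boxes.
PxConvexMesh* cookSharedConvex(PxCooking& cooking, PxPhysics& physics,
                               const PxVec3* verts, PxU32 count)
{
    PxConvexMeshDesc desc;
    desc.points.count  = count;
    desc.points.stride = sizeof(PxVec3);
    desc.points.data   = verts;
    desc.flags         = PxConvexFlag::eCOMPUTE_CONVEX;

    PxDefaultMemoryOutputStream buf;           // cooked data lands here
    if (!cooking.cookConvexMesh(desc, buf))    // the slow, offline step
        return nullptr;

    PxDefaultMemoryInputData input(buf.getData(), buf.getSize());
    return physics.createConvexMesh(input);    // cheap to instance from now on
}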
 
Who knows, maybe you'll still be able to run it just to see... I'd love to see how few frames my card can pump out trying to calculate ray traces. :laugh:

This is my hope; it will also be very interesting to see how Vega fares versus the 1xxx series too. AMD pretty much needs to put out an RT driver for it, as they appear to have nothing else to offer for quite some time, so even if it's slow, it will show they can do it too. If it turns out to be faster than the 1xxx series, that will also be good for them, even if it just means you can get some better shadows in SOTTR (no pun intended!).

If it performs OK against the "RTX" cards with RT on, it will of course be even more interesting.

As has been mentioned elsewhere, Vega has a lot of horsepower, but it is not well utilised a lot of the time. If RT can be done on the, say, 20-25% that goes unused, it may work.

Additionally, the NV presentation showed a "kind of async" in the new pipeline, where integer ops are done in parallel. I wonder if that means DXR etc. are implemented so these ops run fully in parallel. If that's the case (and the apparent new push for two cards hints at it), then maybe with a second card (e.g. a second Vega) the software could utilise it better, for example by putting the needed RT/integer calcs onto the second card, or by optimally spreading the various calculations over the total cores to get the right balance for the best performance. (A toy sketch of the idea is below.)
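Purely a thought experiment in C++; Device and its methods are hypothetical stand-ins, not any real GPU API:

#include <future>

struct Frame {};

// Hypothetical stand-in for a per-GPU command interface.
struct Device {
    void submitRaster(Frame&)   { /* enqueue conventional raster work */ }
    void submitRayTrace(Frame&) { /* enqueue RT/integer-heavy work */ }
    void wait()                 { /* block until the queue drains */ }
};

void renderFrame(Device& gpu0, Device& gpu1, Frame& f)
{
    // Kick the RT/integer work off to the second card...
    auto rt = std::async(std::launch::async, [&] {
        gpu1.submitRayTrace(f);
        gpu1.wait();
    });

    gpu0.submitRaster(f);   // ...while the first card rasterises
    gpu0.wait();
    rt.get();               // join: composite once both halves are done
}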

It would seem that's the way things are going.

If this does turn out to be the case, for those wanting more performance: how would a TR box with 4 x V56/64s perform, and how much would it cost, if the 2080 Ti is 2x the cost of a V64? Obviously either option is expensive, and the power bill/cooling could be a factor, but that matters less if you can afford that kind of system.

Wouldn't it be funny if cards could be given a calculation to do with very little data transfer in, and results that didn't need much transfer out either, so you could run your own supercomputer with 10+ cards on one of those mining boards with 1x slots, or some board made for it with a load of 4x slots? (This last bit is a bit of fun, BTW!)

/ramble
 
AMD's unique feature set lately tends to be things that are cheap and a little more neutral (like Chill and FreeSync). I hope they just stick with that. OTOH, they need to step up raw performance.
 
Definitely a specific version of Time Spy with ray tracing. Releasing this at the end of September is a good thing, considering it gives Nvidia more than enough time to publish a proper driver for their RTX cards, rather than using the current release or pre-alpha drivers like we saw back at Gamescom. Hopefully this will put everyone's rather skeptical thoughts about the cards' questionable performance to bed for good. This is getting even more interesting in my eyes.
 
LOL. CUDA?

Different market... I'm just talking about games. There isn't widespread adoption of the game-specific stuff they put out (good though it may be... I don't know if it's customers driving it or developers, but no one follows their lead on that. Yet at the same time, everyone buys Nvidia for performance. It's kind of weird. It's like a monopoly without any of the benefits).
 
We literally use Tesla's alternating current and not Edison's direct current...

Nvidia is the Edison to the rest of the market, which wants the Tesla approach: open source, or a DX implementation.

Thanks for pointing me in the right direction. Did some reading. Galileo Ferraris and Tesla both had patents on AC, with Ferraris first with his AC motor. Westinghouse, one of Edison's biggest competitors, bought or licensed the patents from both men. Edison bugged out of his own company after pushing DC, as it was starting to move to AC. He came back and acquired/merged with another AC promoter.
I like it when someone corrects you and you then have to go and brush up on history.

But in the analogy, would Tesla be the Ageia PhysX cards?
 
Consoles are such a small income stream for AMD that it would be more trouble than it's worth.

You bet Nvidia is salty that they don't get to charge a crap ton of money for a custom GPU. If Sony/MS had gone with Nvidia, I am certain it wouldn't have been a small income for them, but fortunately they knew better.
 
Edison was more amazing than just about anyone around today... I really don't like these analogies taken too far :D (I also don't like alternate histories that slam him too much, BTW), but that was one thing he screwed up on.

edit: Not that anyone's doing this here. It's just a sidenote... about all of the weird conspiracies surrounding Tesla and people promoting magical Tesla crap. "Woo" I think is the term on RationalWiki.
 
Edison was more amazing than just about anyone around today... I really don't like these analogies taken too far :D (I also don't like alternate histories that slam him too much, BTW), but that was one thing he screwed up on.

edit: Not that anyone's doing this here. It's just a sidenote... about all of the weird conspiracies surrounding Tesla and people promoting magical Tesla crap. "Woo" I think is the term on RationalWiki.

I agree. Tesla has been mythologised somewhat. All those pioneers of the time deserve credit.
 
I always knew AMD would hold the industry back, but then I also knew losing the console deals meant nothing and Nvidia was telling the truth... you pay peanuts... you get monkeys.
Who, even now, doesn't properly support all DX12 features (the big proof is that the new WoW expansion doesn't allow Nvidia GPUs to use DX12, and most games with it enabled make their GPUs lose performance vs DX11)? Who sabotaged DX10.1, which, when enabled, gave a performance advantage to the competitor, and made the game's dev (Batman Arkham Asylum) release a patch to deactivate the setting? Who promotes G-Sync and doesn't support FreeSync, which is much better for customers in value for money? Who made the GPP and backtracked on it (without any proof that it's cancelled, of course)? You know the answers.

It is a totally different thing for a company not to make the best all-round product, even for 2-3 generations (AMD); it is another story what Nvidia has done to the market, constantly sabotaging tech progress wherever they aren't gaining money from it or are losing in comparison. Linux support is another big proof of their anti-customer agenda. On the other side, AMD with Mantle helped bring about Vulkan, which lets many games run on Linux and makes devs' and gamers' lives much easier. AMD has made big mistakes in marketing, creating hype, but they usually didn't charge more than they should, or make so many tools that work properly only with their own products, like GameWorks, PhysX, G-Sync, etc.

I cannot disagree with any customer who wants or needs the highest performance possible at the time and pays more than he should to have it (free choice exists), but only fanboys can support such a malicious and anti-customer company so vocally, and this is bad for the rest of us customers too, when this company is encouraged to make even worse moves all the time.
 
New version of Nvidia Mark. How exciting!!!

and made the game's dev (Batman Arkham Asylum)
It was Assassin's Creed, I believe.
 
Different market...
But that's the market that's actually built around standardized features, because there the customers are the ones who code. :)
I'm just talking about games. There isn't widespread adoption of the game-specific stuff they put out (good though it may be... I don't know if it's customers driving it or developers,
Not many gamers think that much about hardware and features. They just want the game to look good and to get acceptable fps. It's not like 10-20 years ago, when everyone knew which DirectX version he had, because there were so many quirks and problems.

DX12 is a tech for developers: it changes a lot about how they work, and it generally makes writing games much harder and costlier. But the actual impact on how games look isn't that great. So it didn't get traction in the business.
Ray tracing is a huge improvement for customers. If it makes it to mainstream titles, we'll be in a new era of gaming. Well... at least those of us who care more about visuals than about fps. :)
 
Well... at least those of us who care more about visuals than about fps. :)

Hold on a minute... these are pretty much the same thing.

I mean, any game can "look" great if you don't care about fps. :D
 
Nvidia RTX's enormous benefit is that it saves developers plenty of man-hours in making games look good, as opposed to DX12, which just makes developers' lives miserable. Let's hope we get settings that can tune the ray tracing down so as to minimize the performance loss in RTX-supported games.
 
Nvidia RTX's enormous benefit is that it saves developers plenty of man-hours in making games look good, as opposed to DX12, which just makes developers' lives miserable. Let's hope we get settings that can tune the ray tracing down so as to minimize the performance loss in RTX-supported games.

I didn't know DX12 was that bad. What about Vulkan?
 