Friday, August 24th 2018

3DMark's Time Spy With Raytracing to be Launched by the End of September

(Update: UL has come forward to clarify the way they're integrating Raytracing into their benchmarking suite. You can read the follow-up article here.)

UL (which acquired 3DMark developer Futuremark and is in the process of rebranding the suite under its own name) has revealed that the new, raytracing-capable version of its Time Spy high-performance, high-image-quality benchmark will arrive by the end of September.

The new version of the benchmark will be released around the launch of Microsoft's next update to its Windows 10 operating system, codenamed Redstone 5, and thus will arrive some time after NVIDIA's RTX 20-series launch on September 20th. Here's hoping it will be available in time for comparative reviews of NVIDIA's new family of products, and that some light can be shed on the new series' framerate delivery, and not just its GigaRays/sec capabilities.
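
For rough context on why those are separate questions, here's a back-of-the-envelope calculation. The 10 GigaRays/s figure is NVIDIA's quoted number for the RTX 2080 Ti; the per-pixel budgets are simple arithmetic, not benchmark results:

```python
# Back-of-the-envelope: how many rays per pixel per frame a quoted
# GigaRays/sec figure leaves at a given resolution and framerate.
# Real per-ray shading costs vary wildly, so treat this as an upper bound.

def rays_per_pixel(gigarays_per_sec: float, width: int, height: int, fps: int) -> float:
    rays_per_frame = gigarays_per_sec * 1e9 / fps
    return rays_per_frame / (width * height)

print(rays_per_pixel(10, 1920, 1080, 60))  # ~80 rays/pixel at 1080p60
print(rays_per_pixel(10, 3840, 2160, 60))  # ~20 rays/pixel at 4K60
```
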
Source: TechSpot

70 Comments on 3DMark's Time Spy With Raytracing to be Launched by the End of September

#26
Apocalypsee
Wait, whatever happened to 3DMark Serra? I thought I heard about it during the Vega launch, but I never saw it.
#27
Upgrayedd
Steevo: This form of ray tracing only works well on shiny objects, so a complete switch to ray tracing is impossible until we start with a new texture format that includes diffusion, refraction, reflection, and diffraction, which will increase the calculation overhead to staggering levels, meaning 60 FPS at 1080p may run at 15 FPS with real world accurate ray tracing only.


Slow your roll there pony.

" NVIDIA also announced the GameWorks SDK will add a ray tracing denoiser module. The updated GameWorks SDK, coming soon, includes ray-traced area shadows and ray-traced glossy reflections."

Glossy reflections only, better shadows which I welcome, and that is all.
Why am I a pony and what was I rolling? I didn't say anything about "real world accurate ray tracing". That was you, and only you.

That BF V RTX demo showed the wood on the gun giving ray-traced reflections, and that wasn't a very shiny object...
Open water in combo with horrid SSR is my biggest visual distraction in games.
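
As a rough illustration of why "shiny" is the easy case being argued about here: a perfect mirror needs a single reflected ray per hit, while rougher surfaces need many stochastic samples, or a denoiser, to converge. A minimal sketch with made-up sample counts, not anyone's actual renderer:

```python
import math

# Illustration only: mirror-like surfaces are cheap to ray trace because one
# deterministic reflection ray suffices; rough/diffuse surfaces scatter light
# over a wide lobe and need many Monte Carlo samples (noise ~ 1/sqrt(N)).
# The numbers below are invented, not from any real engine.

def reflection_rays_needed(roughness: float, target_noise: float = 0.05) -> int:
    if roughness == 0.0:
        return 1  # perfect mirror: a single reflected ray
    return max(1, math.ceil((roughness / target_noise) ** 2))

for r in (0.0, 0.1, 0.3, 0.8):  # polished metal ... rough wood
    print(f"roughness {r}: ~{reflection_rays_needed(r)} rays per hit")
```
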
#28
TheGuruStud
rtwjunkie: Consoles are such a small income for AMD that it would be more trouble than it's worth for AMD.
(I assume you meant to say Nvidia.) Then why is Nvidia so salty? They wanted it, undoubtedly. If that were true, then why bother with all of the previous generations? Why bother continuing the failed Tegra for Nintendo?

The truth is they want all of it and are sore losers. No one wants them, b/c they can't produce a worthwhile SoC.

The longer this continues, especially since AMD is getting Zen money, the more likely it becomes they get usurped and have nothing to fall back on.
#29
rtwjunkie
PC Gaming Enthusiast
TheGuruStud: (I assume you meant to say Nvidia.)
Oops, you're right, good catch! :oops: Yeah, the second one was supposed to be Nvidia.

Terms we apply to people, like "sore losers", don't apply to corporations. It really weakens your argument when you try to portray them as having human personalities. Just a little tip. :)
#30
StrayKAT
rtwjunkie: Oops, you're right, good catch! :oops: Yeah, the second one was supposed to be Nvidia.

Terms we apply to people, like "sore losers", don't apply to corporations. It really weakens your argument when you try to portray them as having human personalities. Just a little tip. :)
It kind of does, depending on how much power the CEO has. I know nothing of Nvidia, though. They've got the dude with the leather jacket, but it's not quite the cult of personality Oracle/Ellison or Jobs' Apple was, I'm sure.
#31
rtwjunkie
PC Gaming Enthusiast
StrayKAT: It kind of does, depending on how much power the CEO has. I know nothing of Nvidia, though. They've got the dude with the leather jacket, but it's not quite the cult of personality Oracle/Ellison or Jobs' Apple was, I'm sure.
No CEO operates in a vacuum. Everything is dictated by profit. Personal feelings don’t matter. I’ve held a good position in a Fortune 500, and that’s simply how things are viewed.
#32
StrayKAT
rtwjunkie: No CEO operates in a vacuum. Everything is dictated by profit. Personal feelings don’t matter. I’ve held a good position in a Fortune 500, and that’s simply how things are viewed.
That's more common, sure. But I'm just saying there are exceptions. Apple definitely spun on Steve Jobs' axis, especially when he came back... because at that point there was a myth that firing him the first time had destroyed the company. So they embraced him with full gusto the second time.
#33
Steevo
Upgrayedd: Why am I a pony and what was I rolling? I didn't say anything about "real world accurate ray tracing". That was you, and only you.

That BF V RTX demo showed the wood on the gun giving ray-traced reflections, and that wasn't a very shiny object...
Open water in combo with horrid SSR is my biggest visual distraction in games.
I guess I mean that in my real-world experience very little is glossy or shiny, and that is ALL Nvidia can perform ray tracing on in real time. Perhaps what you saw was pre-cooked footage. This was, and still is, quite common with GameWorks.

"For convexes and trimeshes, it is indeed possible (and recommended) to share the cooked data between all instances."

100 boxes you can destroy with PhysX, all precooked data to speed up the process. Just like every other engine, and it's why games have gotten bigger: a lot of prerendered items, you just have to make sure the right color is installed.
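
That "share the cooked data between all instances" advice boils down to a cache keyed by mesh: run the expensive cooking step once and hand every instance the same result. A sketch of the pattern; the function names are hypothetical, not the actual PhysX API:

```python
# "Cook once, share between instances": the pattern the PhysX docs describe.
# cook_convex_mesh() stands in for the real, expensive preprocessing step;
# these names are hypothetical, not the actual PhysX API.

_cooked_cache: dict[str, bytes] = {}

def cook_convex_mesh(mesh_id: str) -> bytes:
    # Placeholder for building hulls/acceleration data; runs once per mesh.
    return f"cooked:{mesh_id}".encode()

def get_cooked_mesh(mesh_id: str) -> bytes:
    if mesh_id not in _cooked_cache:
        _cooked_cache[mesh_id] = cook_convex_mesh(mesh_id)
    return _cooked_cache[mesh_id]

# 100 destructible boxes share one cooked shape instead of cooking 100 times.
boxes = [get_cooked_mesh("crate_01") for _ in range(100)]
assert len(_cooked_cache) == 1
```
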
#34
nemesis.ie
rtwjunkie: Who knows, maybe you'll still be able to run it just to see... I'd love to see how few frames my card can pump out trying to calculate ray traces. :laugh:
This is my hope; it will also be very interesting to see how Vega fares versus the 1xxx series too. AMD pretty much needs to put out an RT driver for it, as they appear to have nothing else to offer for quite some time, so even if it's slow, it will show they can do it too. If it turns out to be faster than the 1xxx series, that will also be good for them, even if it just means you can get some better shadows in SOTTR (no pun intended!).

If it performs OK against the "RTX" cards with RT on, it will of course be even more interesting.

As has been mentioned elsewhere, Vega has a lot of horsepower, but it is not well utilised a lot of the time. If RT can be done on, e.g., the 20-25% that goes unused, it may work.

Additionally, I wonder if the "kind of async" the NV presentation showed (the new pipeline), where integer ops are done in parallel, means that DXR etc. are implemented with these ops fully parallel. If that's the case (and the apparent new push for two cards hints at it), then maybe if you have a 2nd card (e.g. a 2nd Vega), the s/w can utilise it better, e.g. by putting the needed RT/integer calcs onto the 2nd card, or by optimally spreading the various calculations over the total cores to get the right balance for the best performance.
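
As a toy model of that speculation: the win would come from frame time moving from the sum of the raster and RT workloads toward the max of them. All numbers below are invented, and this is not how DXR actually schedules work across adapters:

```python
# Toy arithmetic for the two-card speculation above: if RT/integer work could
# run on a second card (or on a GPU's idle 20-25%), frame time would move
# from the sum of the two workloads toward the max of them, plus sync cost.
# All millisecond figures are invented for illustration.

def frame_time_ms(raster_ms: float, rt_ms: float, second_gpu: bool,
                  sync_overhead_ms: float = 1.0) -> float:
    if second_gpu:
        return max(raster_ms, rt_ms) + sync_overhead_ms
    return raster_ms + rt_ms

print(frame_time_ms(10.0, 8.0, second_gpu=False))  # 18.0 ms -> ~56 fps
print(frame_time_ms(10.0, 8.0, second_gpu=True))   # 11.0 ms -> ~91 fps
```
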

It would seem that's the way things are going.

If this does turn out to be the case, for those wanting more performance, how would a TR system with 4x V56/64s perform, and how much would it cost if a 2080 Ti is 2x the cost of a V64? Obviously either option is expensive, and the power bill/cooling could be a factor, but less likely if you can afford that kind of system.

Wouldn't it be funny if cards could be given a calculation to do with very little data xfer, where the results also don't need much, and you could run your own supercomputer with 10+ cards on one of those mining boards with 1x slots, or some board made for it with a load of 4x slots. (This last bit is a bit of fun BTW!)

/ramble
#35
StrayKAT
AMD's unique feature set lately tends to be things that are cheap and a little more neutral (like Chill and Freesync). I hope they just stick with that. OTOH, they need to step up raw performance.
#36
notb
StrayKAT: For being market leaders, it's strange that they don't have one thing that's become a standard. Why would this be?
LOL. CUDA?
#37
Tsukiyomi91
Definitely a specific version of Time Spy w/ray tracing. Releasing this at the end of September is a good thing, considering it gives Nvidia more than enough time to publish a proper driver for their RTX cards, rather than using the current release or pre-alpha drivers like we saw back at Gamescom. Hopefully this will put everyone's rather skeptical thoughts about the cards' questionable performance to bed for good. This is getting even more interesting in my eyes.
#38
StrayKAT
notb: LOL. CUDA?
Different market... I'm just talking about games. There isn't widespread adoption of the game-specific stuff they put out, good though it may be. I don't know if it's customers driving it or developers, but no one follows their lead on that. Yet at the same time, everyone buys Nvidia for performance. It's kind of weird. It's like a monopoly without any of the benefits.
#39
the54thvoid
Intoxicated Moderator
Steevo: We literally use Tesla's Alternating Current and not Edison's Direct Current...

Nvidia is the Edison to the rest of the market wanting a Tesla: open source or a DX implementation.
Thanks for pointing me in the right direction. Did some reading. Ferraris and Tesla both had patents on AC, with Ferraris first with his AC motor. Westinghouse, one of Edison's biggest competitors, bought or licensed the patents from both men. Edison bugged out of his own company after pushing DC, as it was starting to move to AC. He came back and acquired/merged with another AC promoter.
I like it when someone corrects you and you then have to go and brush up on history.

But in the analogy, Tesla would be the Ageia PhysX cards?
#40
Vya Domus
rtwjunkie: Consoles are such a small income for AMD that it would be more trouble than it's worth for AMD.
You bet Nvidia is salty that they don't get to charge a crap ton of money for a custom GPU. If Sony/MS had gone with Nvidia, I am certain it wouldn't have been a small income for them, but fortunately they knew better.
#41
StrayKAT
Edison was more amazing than anyone currently... I really don't like these analogies taken too far :D (I also don't like alternate histories that slam him too much, btw), but that was one thing he screwed up on.

edit: Not that anyone's doing this here. It's just a sidenote... about all of the weird conspiracies surrounding Tesla and people promoting magical Tesla crap. "Woo" I think is the term on RationalWiki.
#42
the54thvoid
Intoxicated Moderator
StrayKAT: Edison was more amazing than anyone currently... I really don't like these analogies taken too far :D (I also don't like alternate histories that slam him too much, btw), but that was one thing he screwed up on.

edit: Not that anyone's doing this here. It's just a sidenote... about all of the weird conspiracies surrounding Tesla and people promoting magical Tesla crap. "Woo" I think is the term on RationalWiki.
I agree. Tesla has been 'mystified' somewhat. All those pioneers of the time deserve credit.
#43
Fluffmeister
Apocalypsee: Wait, whatever happened to 3DMark Serra? I thought I heard about it during the Vega launch, but I never saw it.
I remember that too; it included AMD logos and everything!

www.geeks3d.com/forums/index.php?topic=5077.0

I guess Vega's market penetration was so appalling they chose not to bother.
#44
HD64G
Fluffmeister: I always knew AMD would hold the industry back, but then I also knew losing console deals meant nothing and Nvidia were telling the truth... you pay peanuts... you get monkeys.
Who even now doesn't properly support all DX12 features (the big proof being that the new WoW expansion doesn't allow nVidia GPUs to use DX12, and most of the games with it enabled make their GPUs lose performance vs DX11)? Who sabotaged DX10.1, which, when enabled, gave a performance advantage to the competitor, and made the game's dev (Batman Arkham Asylum) make a patch to deactivate this setting? Who promotes G-Sync and doesn't support FreeSync, which is much better for customers in value for money? Who made the GPP and then backtracked on it (without any proof that it's cancelled, of course)? You know the answers.

It is one thing for a company not to make the best all-round product even for 2-3 gens (AMD); it is another story what nVidia has done to the market, constantly sabotaging tech progress that they aren't gaining money from or that makes them lose in comparison. Linux support is another big proof of their anti-customer agenda. On the other side, AMD with Mantle helped bring about Vulkan, which allows many games to run on Linux and makes devs' and gamers' lives much easier. AMD has made big mistakes in marketing by creating hype, but they usually didn't charge more than they should, and didn't make so many tools that work properly only with their own products, like GameWorks, PhysX, G-Sync, etc.

I cannot disagree with any customer who wants or needs the highest performance possible at the time and pays more than he should to have it (free choice exists), but only fanboys can support such a malicious and anti-customer company so vocally, and this is bad for the rest of us customers too, as this company is encouraged to make even worse moves all the time.
#45
john_
New version of Nvidia Mark. How exciting!!!
HD64G: and made the game's dev (Batman Arkham Asylum)
It was Assassin's Creed I believe.
#46
notb
StrayKAT: Different market...
But that's the market that's actually built around standardized features, because it's the customers who code. :)
StrayKAT: I'm just talking about games. There isn't widespread adoption of the game-specific stuff they put out, good though it may be. I don't know if it's customers driving it or developers...
Not many gamers think that much about hardware and features. They just want the game to look good and get acceptable fps. It's not like 10-20 years ago, when everyone knew which DirectX version he had, because there were so many quirks and problems.

DX12 is a tech for developers: it changes a lot about how they work, and it generally makes writing games much harder and costlier. But the actual impact on how games look isn't that great. So it didn't get traction in the business.
Ray tracing is a huge improvement for customers. If it makes it to mainstream titles, we'll be in a new era of gaming. Well... at least those of us who care more about visuals than about fps. :)
#47
StrayKAT
notb: Well... at least those of us who care more about visuals than about fps. :)
Hold on a minute... these are pretty much the same thing.

I mean, any game can "look" great if you don't care about fps. :D
#48
nguyen
Nvidia RTX's enormous benefit is that it saves developers plenty of man-hours making games look good, as opposed to DX12, which just makes developers' lives miserable. Let's hope we have settings that can tune down the ray tracing so as to minimize performance loss in RTX-supported games.
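
The kind of knob being hoped for there would presumably just scale rays per pixel and bounce counts. A hypothetical preset sketch; the values are illustrative and not from any actual game or the RTX SDK:

```python
# Hypothetical "RT quality" presets of the kind hoped for above: lower
# settings cast fewer rays and lean harder on denoising, trading image
# quality for framerate. All values are illustrative.

RT_PRESETS = {
    "low":    {"rays_per_pixel": 1, "max_bounces": 1},
    "medium": {"rays_per_pixel": 2, "max_bounces": 2},
    "high":   {"rays_per_pixel": 4, "max_bounces": 3},
}

def rt_frame_cost_ms(preset: str, base_ms: float = 8.0, ms_per_unit: float = 2.5) -> float:
    cfg = RT_PRESETS[preset]
    return base_ms + cfg["rays_per_pixel"] * cfg["max_bounces"] * ms_per_unit

for name in RT_PRESETS:
    print(f"{name}: ~{rt_frame_cost_ms(name):.1f} ms/frame")
```
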
#49
StrayKAT
nguyen: Nvidia RTX's enormous benefit is that it saves developers plenty of man-hours making games look good, as opposed to DX12, which just makes developers' lives miserable. Let's hope we have settings that can tune down the ray tracing so as to minimize performance loss in RTX-supported games.
I didn't know DX12 was that bad. What about Vulkan?
#50
notb
StrayKAT: Hold on a minute... these are pretty much the same thing.

I mean, any game can "look" great if you don't care about fps. :D
But most gamers don't care about anything above "good enough". This is the more casual, fun-oriented style - like on consoles.
60fps is great, but let's be honest: 30fps is still fine.
fps below 30 takes some fun away, but man... I remember years of Intel HD gaming, when I played games like Oblivion on low settings - with frames dropping to maybe 10 or 15 during flashy fights. And it was *so much fun* nevertheless.

Looking at the RTX samples, it's a huge change. And we know it's the next big thing in gaming, because - let's be honest - 4K@60fps is here and 99% of gamers don't need more.
StrayKAT: I didn't know DX12 was that bad. What about Vulkan?
Also bad.
It's not even about the API itself being badly written. It's the concept itself. You have much more direct control over what's going on in the GPU. So if you spend enough time coding, it works better. But it also means way more complicated code, more differences between platforms, worse porting, and so on. It's more dependent on GPU firmware as well...

Older DirectX versions were pretty simple - even for programmers who only needed to go 3D for an occasional project or to have fun.
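
For a feel of what that "more direct control" means in practice: older APIs let the driver track resource hazards, while DX12/Vulkan have the application record command lists and insert barriers itself. A toy model of the difference; nothing here is a real graphics call, it only mirrors where the bookkeeping lives:

```python
# Toy contrast between an implicit, "old DirectX"-style API and an explicit,
# DX12/Vulkan-style one. No real graphics calls here; the point is how much
# bookkeeping moves from the driver into application code.

def draw_implicit(texture: str) -> None:
    # Old style: bind and draw; the driver resolves hazards and sync.
    print(f"bind {texture}; draw (driver handles hazards)")

def draw_explicit(texture: str) -> None:
    # New style: the app records commands, transitions resource states,
    # and fences by hand. Forget a barrier and you get corruption.
    commands = [
        f"barrier: {texture} COPY_DEST -> SHADER_READ",
        f"bind {texture}",
        "draw",
        "signal fence; wait before reusing the texture",
    ]
    for cmd in commands:
        print(cmd)

draw_implicit("brick_wall")
draw_explicit("brick_wall")
```
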