Monday, March 26th 2012

NVIDIA Kepler Tech-Demo Called "New Dawn"

NVIDIA stunned reporters at its GeForce Kepler press event by smoothly running Epic Games' Unreal Engine 3 "Samaritan" tech-demo on a single GeForce Kepler GPU, a demo that previously needed up to three previous-generation GPUs. However, Samaritan isn't Kepler's official tech-demo. That demo is reportedly called "New Dawn", and is a retake on the "Dawn" tech-demo that baffled the industry nearly a decade ago. "Dawn" displayed its central character, a fairy of the same name, in stunning detail (for the time).

While Dawn was incredibly detailed, its environment was little more than a textured sky-box. "New Dawn" could bring Dawn back into action, focusing on environmental elements such as realistic physics simulation, improved hair animation, and greater detail. NVIDIA has a wealth of new elements to play with, such as a level of tessellation that could be impossible to render smoothly on the competitor's GPUs. NVIDIA could distribute the demo on its websites (NVIDIA.com, GeForce.com) soon. NVIDIA and rival AMD release tech-demos with each new GPU architecture to demonstrate the capabilities of their new flagship GPUs. Pictured below is a frame from the 2003 demo.



A "sneak-peek" video of the demo follows.

Source: Expreview

55 Comments on NVIDIA Kepler Tech-Demo Called "New Dawn"

#2
ViperXTR
Time of the GeForce FX eh? I remember back in the day when the FX cards were getting owned by the 9000 series
Posted on Reply
#4
INSTG8R
by: ViperXTR
Time of the GeForce FX eh? I remember back in the day when the FX cards were getting owned by the 9000 series
Yep Ruby is still the best :rockout:

[media=youtube]JiWNQbSDuvY[/media]
Posted on Reply
#7
NC37
by: ViperXTR
Time of the GeForce FX eh? I remember back in the day when the FX cards were getting owned by the 9000 series
Heh, I still have an old FX5600 non ultra somewhere. Has come in handy a few times when troubleshooting some older AGP rigs. Mostly since my only other AGP card is a 7800GS which I flashed into my Quicksilver.

Nice to see Dawn back. I do hope this means AMD might bring back Ruby too. It was a shame after I saw them put so much effort into her, then just let her kinda drift off into memory.
Posted on Reply
#8
15th Warlock
Looks really cool, very close to the uncanny valley, but to be honest I want to download the Samaritan demo, not some fairy crap, why is Samaritan not available to the public, it's not like the necessary hardware hasn't been released yet :(

Come on Epic, let us try it at home instead of teasing us with videos :p
Posted on Reply
#9
ViperXTR
I owned a GeForce FX 5900 XT that time, it was a good card, until the SM2 games started poppin out (Far Cry, HL2)

Dawn, Dusk, Nalu, Luna, Mad Mod Mike, Adriane...good times :D
Posted on Reply
#11
the54thvoid
Legit Reviews questioned Nvidia about its tessellation claims of superiority. This is what Nvidia said.
We should have a pretty comfortable lead in Heaven 2.5 and 3.0. I’m surprised we lost in your moderate tessellation testing at 25x16. Running 8xMSAA we’re likely memory bandwidth bound as Heaven isn’t a pure tessellation benchmark. It also stresses other parts of the GPU besides just tessellation.

If you run a tessellation test like tessmark or Microsoft’s SubD11 tessellation test (which both AMD and NVIDIA use when quoting tessellation perf) you’ll see that 4x difference in tessellation horsepower we’re referring to. - NVIDIA PR
http://www.legitreviews.com/article/1881/14/
Posted on Reply
#12
Badelhas
Exactly, why isn't the "Samaritan" demo already available to the public? And why the hell aren't any games like Crysis released? I mean, BF3 looks good, but it is not groundbreaking like Crysis was when it was released. I still think it's the best looking game ever, if you apply some mods, and the game was released 5 years ago!

Cheers,
Andre
Posted on Reply
#13
grammaton_feather
Nvidia demos used to be great. Cascades, Last chance gas station, geoforms, Dawn...

More recently they have given us some real boring crap like Stone Giant, aliens v triangles, endless city.
Posted on Reply
#15
alwayssts
Let's stop. Reflect.

In a benchmark stressing purely tessellation and another benchmark testing mostly tessellation, a gpu that clocks itself dynamically according to total gpu load to fill a 225w tdp performs better in a given unbalanced scenario than one that does not use 225w or do the same dynamic clocking.

Seriously, this boost thing is awesome tech which no one can deny...but it kinda makes certain benchmarks completely bs. Remember PhysX in 3Dmark? Different but similar...your gpu also has to do graphics and is not just using all flops on physx. This is one of those times where it has to be said people play games, not benchmarks.

Let me be clear...nvidia is seemingly ahead in tessellation, but not nearly at the level those benchmarks imply. Also, I question the realistic practicality of that level of tessellation when used in concert with a realistic gaming scenario at any given resolution. I would rather my tdp (if that is the new performance metric/bottleneck) be used on higher flops or pixel/texture fillrates, wouldn't you?

edit: nvidia basically said the exact same thing to legit reviews apparently. Thanks for posting that quote 54thvoid!
Posted on Reply
#16
BigMack70
I think I remember Epic saying something about not having the Samaritan demo optimized well for different hardware configs, and that they weren't planning on doing so - so that's why they didn't release it to the public. I don't think it was ever intended to be a benchmark - more like an advertisement for their engine.

I agree, though, I'd love to download that and am very bummed we can't. I'm curious to see how my 7970 @ 1200/1700 will handle nvidia's upcoming demo (probably not very well at all, if I know nVidia).
Posted on Reply
#17
phanbuey
by: alwayssts
Let's stop. Reflect.

In a benchmark stressing purely tessellation and another benchmark testing mostly tessellation, a gpu that clocks itself dynamically according to total gpu load to fill a 225w tdp performs better in a given unbalanced scenario than one that does not use 225w or do the same dynamic clocking.

Seriously, this boost thing is awesome tech which no one can deny...but it kinda makes certain benchmarks completely bs. Remember PhysX in 3Dmark? Different but similar...your gpu also has to do graphics and is not just using all flops on physx. This is one of those times where it has to be said people play games, not benchmarks.

Let me be clear...nvidia is seemingly ahead in tessellation, but not nearly at the level those benchmarks imply. Also, I question the realistic practicality of that level of tessellation when used in concert with a realistic gaming scenario at any given resolution. I would rather my tdp (if that is the new performance metric/bottleneck) be used on higher shader-based fidelity and fps, wouldn't you?

edit: nvidia basically said the exact same thing to legit reviews apparently. Thanks for posting that quote 54thvoid!
This is a tech demo, not a benchmark... I'm not quite sure what you are trying to say.
Posted on Reply
#18
radrok
I have a feeling that on some reviews the CCC is set to optimized tessellation for the HD 7970 hence the higher scores....

also ddddddouble post!
Posted on Reply
#19
W1zzard
by: Badelhas
why the hell aren't any games like Crysis released?
because they make 100x more money programming for consoles
Posted on Reply
#20
grammaton_feather
by: Badelhas
Exactly, why isn't the "Samaritan" demo already available to the public? And why the hell aren't any games like Crysis released? I mean, BF3 looks good, but it is not groundbreaking like Crysis was when it was released. I still think it's the best looking game ever, if you apply some mods, and the game was released 5 years ago!

Cheers,
Andre
Assuming you haven't played BF3 on single player? It is ground-breaking in almost every way. New levels of visual realism, most realistic character motion using ANT, written for PC first and foremost and you don't think it's ground-breaking? One GIANT LOL!!!

Something is groundbreaking when it raises the bar, when it gives us new levels of visual and gaming realism. BF3 gives us that on pretty much all levels.

Crysis also was ground-breaking in its day. You have to mod Crysis to make it look good now, and I have read comments before from people saying it's still the best looking game when you mod it; they showed screenshots, and actually the screenshots showed it still looked dated compared to BF3 or Crysis 2 in DX11 with the HD texture pack.
Posted on Reply
#21
BigMack70
by: grammaton_feather
Assuming you haven't played BF3 on single player? It is ground-breaking in almost every way. New levels of visual realism, most realistic character motion using ANT, written for PC first and foremost and you don't think it's ground-breaking? One GIANT LOL!!!

Something is groundbreaking when it raises the bar, when it gives us new levels of visual and gaming realism. BF3 gives us that on pretty much all levels.

Crysis also was ground-breaking in its day. You have to mod Crysis to make it look good now, and I have read comments before from people saying it's still the best looking game when you mod it; they showed screenshots, and actually the screenshots showed it still looked dated compared to BF3 or Crysis 2 in DX11 with the HD texture pack.
BF3 today is nowhere near what Crysis was in 2007. Not even close. It's better than what's out there, but not by far. It's also arguably not much better than Crysis 2. It also runs fine maxed out on today's hardware - it took years (or crazy multi-GPU setups) to run Crysis maxed out.

To compare BF3 in 2011/12 to Crysis in 2007 is complete ignorance. The answer is very simple - we don't have better looking games because all the $$$ is in console development, so even for a PC-first game, the graphics are limited because they have to make sure the game will run on 7 year old consoles.
Posted on Reply
#22
kenkickr
Soon to be followed by a nude patch :laugh::shadedshu if anyone remembers the Dawn/Dusk and Nalu perv patches
Posted on Reply
#23
mrw1986
by: BigMack70
BF3 today is nowhere near what Crysis was in 2007. Not even close. It's better than what's out there, but not by far. It's also arguably not much better than Crysis 2. It also runs fine maxed out on today's hardware - it took years (or crazy multi-GPU setups) to run Crysis maxed out.

To compare BF3 in 2011/12 to Crysis in 2007 is complete ignorance. The answer is very simple - we don't have better looking games because all the $$$ is in console development, so even for a PC-first game, the graphics are limited because they have to make sure the game will run on 7 year old consoles.
Just because it took insane hardware setups to run Crysis doesn't mean it had better graphics. It simply means that the code is horribly optimized.
Posted on Reply
#24
Steevo
Is Nvidia really saying the benchmark that they were so much better at just a year and some ago is now not considered a valid test, and it is due to their failing at it?

It is the shit like this that really bothers users, but here, have a fairy, it is all better.

How long till someone injects the vendor ID for a 680 onto their 7970 and runs the test? It happened with Batman, and lo, it ran just as well on red as it did on green.
Posted on Reply
#25
BigMack70
by: mrw1986
Just because it took insane hardware setups to run Crysis doesn't mean it had better graphics. It simply means that the code is horribly optimized.
Obviously Crysis is poorly optimized... I think that is obvious to all (especially anyone who has played the last level and experienced the joys of a memory leak that kills your framerate into the single digits regardless of your hardware...)

However, my point stands that Crysis compared to other 2007 games shows Crysis in a FAR more impressive light than BF3 compared to 2012 games. There are several games that are on the same tier graphically as BF3, even if BF3 might be a bit better (and guess what - the original Crysis is on that list).

Nothing was anywhere close to Crysis when it was released; not by a long shot. The reason we haven't had anything like that happen since is because of the console cycle - 99% of games are either developed with the consoles in mind first, or even though they are designed for PC they have to be able to be ported to console, limiting what can be done graphically on the PC.
Posted on Reply