
NVIDIA Kepler Tech-Demo Called "New Dawn"

btarunr

Editor & Senior Moderator
NVIDIA stunned reporters at its GeForce Kepler press event by smoothly running Epic Games' Unreal Engine 3 "Samaritan" tech demo on a single GeForce Kepler GPU, when the demo previously needed up to three previous-generation GPUs. However, Samaritan isn't Kepler's official tech demo. That is reportedly called "New Dawn", and is a retake on the "Dawn" tech demo that baffled the industry nearly a decade ago. "Dawn" displayed its central character, a fairy of that name, in stunning detail (for the time).

While Dawn was incredibly detailed, its environment was pretty much just a textured sky-box. "New Dawn" could bring Dawn back into action, focusing on environmental elements such as realistic physics simulation, improved hair animation, and greater detail. NVIDIA has a wealth of new elements to play with, such as a level of tessellation that could be impossible to render smoothly on the competitor's GPUs (even if one could run it). NVIDIA could distribute the demo on its websites (NVIDIA.com, GeForce.com) soon. NVIDIA and rival AMD release tech demos with each new GPU architecture to demonstrate the capabilities of their new flagship GPUs. Pictured below is a frame from the 2003 demo.



A "sneak-peek" video of the demo follows.


View at TechPowerUp Main Site
 
time of the GeForce FX eh? i remember back in the day where the FX cards are getting owned by the 9000 series
 
time of the GeForce FX eh? i remember back in the day where the FX cards are getting owned by the 9000 series


Yep Ruby is still the best :rockout:

 
No love for Wanda? :D
 
time of the GeForce FX eh? i remember back in the day where the FX cards are getting owned by the 9000 series

Heh, I still have an old FX5600 non ultra somewhere. Has come in handy a few times when troubleshooting some older AGP rigs. Mostly since my only other AGP card is a 7800GS which I flashed into my Quicksilver.

Nice to see Dawn back. I do hope this means AMD might bring back Ruby too. It was a shame after I saw them put so much effort into her, then just let her kinda drift off into memory.
 
Looks really cool, very close to the uncanny valley. But to be honest, I want to download the Samaritan demo, not some fairy crap. Why is Samaritan not available to the public? It's not like the necessary hardware hasn't been released yet :(

Come on Epic, let us try it at home instead of teasing us with videos :p
 
I owned a GeForce FX 5900 XT at the time. It was a good card, until the SM2 games started popping out (Far Cry, HL2).

Dawn, Dusk, Nalu, Luna, Mad Mod Mike, Adriane...good times :D
 
Legit Reviews questioned Nvidia about its tessellation claims of superiority. This is what Nvidia said:

We should have a pretty comfortable lead in Heaven 2.5 and 3.0. I’m surprised we lost in your moderate tessellation testing at 25x16. Running 8xMSAA we’re likely memory bandwidth bound as Heaven isn’t a pure tessellation benchmark. It also stresses other parts of the GPU besides just tessellation.

If you run a tessellation test like tessmark or Microsoft’s SubD11 tessellation test (which both AMD and NVIDIA use when quoting tessellation perf) you’ll see that 4x difference in tessellation horsepower we’re referring to. - NVIDIA PR

http://www.legitreviews.com/article/1881/14/
 
Exactly, why isn't the "Samaritan" demo already available to the public? And why the hell aren't any games like Crysis released? I mean, BF3 looks good, but it is not groundbreaking like Crysis was when it was released. I still think it's the best-looking game ever if you apply some mods, and the game was released 5 years ago!

Cheers,
Andre
 
Nvidia demos used to be great. Cascades, Last chance gas station, geoforms, Dawn...

More recently they have given us some real boring crap like Stone Giant, aliens v triangles, endless city.
 
Let's stop. Reflect.

In a benchmark stressing purely tessellation and another benchmark testing mostly tessellation, a gpu that clocks itself dynamically according to total gpu load to fill a 225w tdp performs better in a given unbalanced scenario than one that does not use 225w or do the same dynamic clocking.

Seriously, this boost thing is awesome tech which no one can deny...but it kinda makes certain benchmarks completely bs. Remember PhysX in 3Dmark? Different but similar...your gpu also has to do graphics and is not just using all flops on physx. This is one of those times where it has to be said people play games, not benchmarks.

Let me be clear...nvidia is seemingly ahead in tessellation, but not nearly at the level those benchmarks imply. Also, I question the realistic practicality of that level of tessellation when used in concert with a realistic gaming scenario at any given resolution. I would rather my TDP (if that is the new performance metric/bottleneck) be used on higher flops or pixel/texture fillrates, wouldn't you?

edit: nvidia basically said the exact same thing to legit reviews apparently. Thanks for posting that quote 54thvoid!
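The boost behaviour described above can be sketched as a toy model: when an unbalanced workload (like a pure tessellation test) leaves much of the chip idle, power draw sits well under the TDP budget, so the boost logic raises the clock. The names, numbers, and linear power-scaling rule here are illustrative assumptions, not NVIDIA's actual algorithm.

```python
# Toy model of TDP-driven boost clocking (illustrative assumptions only).

BASE_CLOCK_MHZ = 1006   # assumed base clock
MAX_BOOST_MHZ = 1110    # assumed boost cap
TDP_WATTS = 225         # power budget from the post above

def boost_clock(current_draw_watts: float) -> float:
    """Raise the clock until estimated draw would fill the TDP budget.

    Assumes power scales linearly with clock, a simplification;
    real silicon scales worse than linearly.
    """
    if current_draw_watts >= TDP_WATTS:
        return BASE_CLOCK_MHZ
    headroom = TDP_WATTS / current_draw_watts
    return min(MAX_BOOST_MHZ, BASE_CLOCK_MHZ * headroom)

# Unbalanced load draws little power, so the clock rides the boost cap;
# a fully loaded chip has no headroom and stays at base clock.
print(boost_clock(150))  # prints 1110
print(boost_clock(225))  # prints 1006
```

This is why a TDP-filling design can look disproportionately strong in a narrow benchmark: the lighter the rest of the chip's load, the higher the clock during the tested operation.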
 
I think I remember Epic saying something about not having the Samaritan demo optimized well for different hardware configs, and that they weren't planning on doing so - so that's why they didn't release it to the public. I don't think it was ever intended to be a benchmark - more like an advertisement for their engine.

I agree, though, I'd love to download that and am very bummed we can't. I'm curious to see how my 7970 @ 1200/1700 will handle nvidia's upcoming demo (probably not very well at all, if I know nVidia).
 
Let's stop. Reflect.

In a benchmark stressing purely tessellation and another benchmark testing mostly tessellation, a gpu that clocks itself dynamically according to total gpu load to fill a 225w tdp performs better in a given unbalanced scenario than one that does not use 225w or do the same dynamic clocking.

Seriously, this boost thing is awesome tech which no one can deny...but it kinda makes certain benchmarks completely bs. Remember PhysX in 3Dmark? Different but similar...your gpu also has to do graphics and is not just using all flops on physx. This is one of those times where it has to be said people play games, not benchmarks.

Let me be clear...nvidia is seemingly ahead in tessellation, but not nearly at the level those benchmarks imply. Also, I question the realistic practicality of that level of tessellation when used in concert with a realistic gaming scenario at any given resolution. I would rather my TDP (if that is the new performance metric/bottleneck) be used on higher shader-based fidelity and fps, wouldn't you?

edit: nvidia basically said the exact same thing to legit reviews apparently. Thanks for posting that quote 54thvoid!

This is a tech demo, not a benchmark... I'm not quite sure what you are trying to say.
 
I have a feeling that on some reviews the CCC is set to optimized tessellation for the HD 7970, hence the higher scores....

also ddddddouble post!
 
Exactly, why isn't the "Samaritan" demo already available to the public? And why the hell aren't any games like Crysis released? I mean, BF3 looks good, but it is not groundbreaking like Crysis was when it was released. I still think it's the best-looking game ever if you apply some mods, and the game was released 5 years ago!

Cheers,
Andre

I'm assuming you haven't played BF3's single player? It is ground-breaking in almost every way. New levels of visual realism, the most realistic character motion using ANT, written for PC first and foremost, and you don't think it's ground-breaking? One GIANT LOL!!!

Something is groundbreaking when it raises the bar, when it gives us new levels of visual and gaming realism. BF3 gives us that on pretty much all levels.

Crysis also was ground-breaking in its day. You have to mod Crysis to make it look good now. I have read comments before from people saying it's still the best-looking game when you mod it, and they showed screenshots, but the screenshots actually showed it still looked dated compared to BF3 or Crysis 2 in DX11 with the HD texture pack.
 
I'm assuming you haven't played BF3's single player? It is ground-breaking in almost every way. New levels of visual realism, the most realistic character motion using ANT, written for PC first and foremost, and you don't think it's ground-breaking? One GIANT LOL!!!

Something is groundbreaking when it raises the bar, when it gives us new levels of visual and gaming realism. BF3 gives us that on pretty much all levels.

Crysis also was ground-breaking in its day. You have to mod Crysis to make it look good now. I have read comments before from people saying it's still the best-looking game when you mod it, and they showed screenshots, but the screenshots actually showed it still looked dated compared to BF3 or Crysis 2 in DX11 with the HD texture pack.

BF3 today is nowhere near what Crysis was in 2007. Not even close. It's better than what's out there, but not by far. It's also arguably not much better than Crysis 2. It also runs fine maxed out on today's hardware - it took years (or crazy multi-GPU setups) to run Crysis maxed out.

To compare BF3 in 2011/12 to Crysis in 2007 is complete ignorance. The answer is very simple - we don't have better looking games because all the $$$ is in console development, so even for a PC-first game, the graphics are limited because they have to make sure the game will run on 7 year old consoles.
 
Soon to be followed by a nude patch :laugh: :shadedshu if anyone remembers the Dawn/Dusk and Nalu perv patches.
 
BF3 today is nowhere near what Crysis was in 2007. Not even close. It's better than what's out there, but not by far. It's also arguably not much better than Crysis 2. It also runs fine maxed out on today's hardware - it took years (or crazy multi-GPU setups) to run Crysis maxed out.

To compare BF3 in 2011/12 to Crysis in 2007 is complete ignorance. The answer is very simple - we don't have better looking games because all the $$$ is in console development, so even for a PC-first game, the graphics are limited because they have to make sure the game will run on 7 year old consoles.

Just because it took insane hardware setups to run Crysis doesn't mean it had better graphics. It simply means that the code is horribly optimized.
 
Is Nvidia really saying the benchmark that they were so much better at just a year and some ago is now not considered a valid test, and that it is due to their failing at it?

It is shit like this that really bothers users, but here, have a fairy, it's all better.

How long till someone injects the vendor ID for a 680 onto their 7970 and runs the test? It happened with Batman, and lo, it ran just as well on red as it did on green.
 