Friday, March 19th 2010

NVIDIA Claims Upper Hand in Tessellation Performance

A set of company slides leaked to the press reveals that NVIDIA is claiming the upper hand in tessellation performance. With this advantage, NVIDIA is looking to encourage leaps in geometric detail, presumably in future games that make use of tessellation. NVIDIA's confidence stems from the design of its GF100 GPU (explained further here). Each GF100 GPU physically has 16 PolyMorph Engines, one per streaming multiprocessor (SM), which enables distributed, parallel geometry processing. Each PolyMorph Engine has its own tessellation unit. With 15 SMs enabled on the GeForce GTX 480 and 14 on the GeForce GTX 470, the cards have that many independent tessellation units, respectively.
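
The scale of the detail increase at play is easy to underestimate. As a rough illustration (a simplified model added here for clarity, not NVIDIA's numbers, and the function name is made up): uniform integer tessellation splits each triangle edge into N segments, so each input triangle becomes roughly N² sub-triangles, which is why dedicated, parallel tessellation hardware matters.

```python
def triangles_after_tessellation(base_triangles: int, tess_factor: int) -> int:
    """Simplified model of uniform integer tessellation: splitting every
    edge of a triangle into `tess_factor` segments yields tess_factor**2
    sub-triangles per input triangle. Real D3D11 tessellation also allows
    fractional and per-edge factors, so treat this as an estimate."""
    return base_triangles * tess_factor ** 2

# A modest 10,000-triangle base mesh at tessellation factor 16:
print(triangles_after_tessellation(10_000, 16))  # 2560000 triangles
```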

NVIDIA demonstrated its claims in the presentation using the Unigine Heaven benchmark, pitting the GeForce GTX 480 against a Radeon HD 5870. In scenes with lighter tessellation, the two GPUs performed neck-and-neck, with the GTX 480 coming out ahead more often. But in scenes with heavy tessellation (particularly the "dragon" scene, where a highly detailed dragon model is rendered with densely tessellated meshes), the GTX 480 posts nearly a 100% performance lead over the HD 5870. NVIDIA has been confident about its tessellation performance since January, when it detailed the GF100 architecture. The GeForce GTX 400 series graphics cards will be unveiled on the 26th of March.

Images Courtesy: Techno-Labs

145 Comments on NVIDIA Claims Upper Hand in Tessellation Performance

#51
Zubasa
CyberCTLOL! :laugh: I'm not talking out of my arse. I know I can't use DX10 or DX11 in XP. Until the next gen of consoles, DX10 and DX11 will take off very slowly because, obviously, the X360 & PS3 run on DX9.

Tell me what I'm majorly missing in DX10 & DX11 over DX9. My buddy has Windows 7 and an ATI 5850 card. Far Cry 2 looks "barely" better in DX10. Halo 2 looks like a high-res X360 version in DX10. It's another game that could have been done in DX9 (and others have made it work on DX9 on the net), but it was a ploy by MS to push Vista, just like Crysis "Very High" mode, which is easily done on DX9 with a few changes to the files. We both have BC2 also, and I do see a little bit of a difference in DX11, but not enough to make me excited. Actually, I'll bet DICE could have made it look just as good under DX9, but they didn't try, perhaps because of MS. Heck, FSAA isn't possible in BC2 in DX9, and that's just BS in my book, because it runs better on my PC than my friend's without FSAA on either system. FSAA in DX9 is possible in the rest of the games I own.

I have no interest in Stalker: CoP, Metro 2033, or the new AvP game. Maybe DiRT 2, but I'll wait until later this year / next year when I upgrade, and I'm sure it will be bundled with the DX11 card I buy. Look at people's opinions around the web on DX11 games thus far. People aren't that enthused about how much "better" games look compared to the DX9 versions, nor about how much slower they run.

All my DX9 games I run at max settings with at least 2X FSAA (except Crysis), and they run at a very fluid frame rate without slowdown. I don't have any DX8-or-below games installed on my system. Only DX9.

Until we see the Xbox 720 & PS4, we are stuck at a stalemate for mainstream technology progression.
So you buy a DX10 graphics card to run games in DX9; you might as well get a console and be done with it :laugh:
You want to know why BF:BC2 does not support FSAA in DX9? Because they simply ported that from the console, which couldn't afford FSAA. :slap:

DX11 is actually taking off much better than DX10; at the very least there are games that support DX11 within the first 6 months of the hardware being released.
DX10 fell flat on its face mainly because of how people hated Windows Vista, and the fact that nVidia released underpowered mid-range 8600 GTs that couldn't even run DX9 games maxed out.

The point about Far Cry 2 not looking much better in DX10?
That game is based on the Dunia engine, a modified CryEngine; it is a DX9 engine with DX10 support added.
The Halo 2 engine was designed mainly for the Xbox, which makes it an even worse example.
By the way, Halo 2 was released in 2004, way before Vista even existed. These are DX9-native games.

Most importantly, this is a thread about the new DX11 GPU from nVidia, not some console-vs-PC thread.
Mussels' earlier post was to point out that nVidia's 3D Vision has too many limitations, and that includes FPS/refresh-rate issues in DX10/11 that have nothing to do with Vsync.
Posted on Reply
#52
tkpenalty
CyberCTHaha!! Whatever. Enjoy your 6 monitor setup. That will look funnier on a desk than any type of Buddy Holly glasses on your face. To each their own. You don't know what you're missing out on. I guess ignorance is bliss??
So what does 3D actually help with? Just eye candy, nothing more; it's not like you need to judge distance (and you can't, really). For it to even work well you'd have to keep your head still.

However, a 6-monitor setup (or 3) can be wrapped around your field of vision and lets you see more of what's going on in games. It's especially useful in racing games: at a normal 16:9 ratio you rely on a button to check who's beside you, whereas with multiple monitors you can just flick your eyes to the side without being distracted by the button.

The difference is that multiple monitors actually facilitate gaming, whilst 3D doesn't do anything but create an illusion.

You may find your 23-inch screen big enough, but others who want more than just eye candy would object.
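
For what it's worth, the "wrapped around your field of vision" point can be put in numbers. A quick back-of-the-envelope sketch (the panel width and viewing distance below are assumptions for illustration, not anyone's measurements):

```python
import math

def horizontal_fov_deg(screen_width_m: float, view_dist_m: float) -> float:
    """Horizontal angle a flat screen subtends when viewed head-on."""
    return math.degrees(2 * math.atan(screen_width_m / (2 * view_dist_m)))

# One 23" 16:9 panel is about 0.51 m wide; assume a 0.6 m viewing distance.
single = horizontal_fov_deg(0.51, 0.6)           # roughly 46 degrees
# Three panels angled to face the viewer each subtend the same angle,
# so a wrapped triple-monitor setup covers roughly three times as much.
triple_wrapped = 3 * single                      # roughly 138 degrees
```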
Posted on Reply
#53
CyberCT
ZubasaSo you buy a DX10 graphics card to run games in DX9; you might as well get a console and be done with it :laugh:
You want to know why BF:BC2 does not support FSAA in DX9? Because they simply ported that from the console, which couldn't afford FSAA. :slap:

DX11 is actually taking off much better than DX10; at the very least there are games that support DX11 within the first 6 months of the hardware being released.
DX10 fell flat on its face mainly because of how people hated Windows Vista, and the fact that nVidia released underpowered mid-range 8600 GTs that couldn't even run DX9 games maxed out.

The point about Far Cry 2 not looking much better in DX10?
That game is based on the Dunia engine, a modified CryEngine; it is a DX9 engine with DX10 support added.
The Halo 2 engine was designed mainly for the Xbox, which makes it an even worse example.
By the way, Halo 2 was released in 2004, way before Vista even existed. These are DX9-native games.

Most importantly, this is a thread about the new DX11 GPU from nVidia, not some console-vs-PC thread.
Mussels' earlier post was to point out that nVidia's 3D Vision has too many limitations, and that includes FPS/refresh-rate issues in DX10/11 that have nothing to do with Vsync.
Did you read that I have XP? I only have DX9 on my PC, which is perfectly fine. Even though I have an X360, I'd much rather have my games running at 60fps at 1080p than 30fps at 720p. To each their own.
tkpenaltySo what does 3D actually help with? Just eye candy, nothing more; it's not like you need to judge distance (and you can't, really). For it to even work well you'd have to keep your head still.

However, a 6-monitor setup (or 3) can be wrapped around your field of vision and lets you see more of what's going on in games. It's especially useful in racing games: at a normal 16:9 ratio you rely on a button to check who's beside you, whereas with multiple monitors you can just flick your eyes to the side without being distracted by the button.

The difference is that multiple monitors actually facilitate gaming, whilst 3D doesn't do anything but create an illusion.

You may find your 23-inch screen big enough, but others who want more than just eye candy would object.
Well, like you also said, to each their own. I see absolutely no reason for a 3- (let alone 6-) monitor setup for gaming. Maybe for day-trading stocks, but nothing else. I couldn't care less whether I turn my eyes left or right to see a car beside me, or quickly press a button to see the same thing on my single monitor. It would be more than a grand to get a 6-monitor setup (worthless for anything other than PC eye candy) with a very thin bezel; how much power would they consume altogether, and would it run at 60fps on the same setup as a single monitor? I find it strange that you say you have a hard time doing 3D in DX10/11, because Nvidia has native support for stereo 3D in Vista or Windows 7 only (DX9/DX10/11, not XP), and the reviews say it works great. I have to use D3D drivers or IZ3D drivers for stereoscopic 3D, both of which work fine for me. You can call 3D a gimmick, but I call 6 monitors a gimmick, especially since there are thick bezels in between the monitors. If the bezels were .01" thick, then maybe I could agree with you. I've seen YouTube videos (and even a console setup at Best Buy) of a 6-monitor setup and it looks ridiculous. Like looking through a paned window:

www.youtube.com/watch?v=X6jYycRmWz4

3D, however, looks amazing, as everything jumps out at you. I don't play like that daily, just once in a while as a treat or to show it off. An ultra-high-res monitor shrunk to 23" that supports 2560 x 1600 would be awesome because there would be no need for FSAA. Until then, a 56" DLP would probably take up no more real estate than a 6-monitor setup, consume less power, and be cheaper. It could also be used as a regular TV or monitor, and have no thick bezels in between. It could also do 3D if I please.

But whatever, to each their own.
Posted on Reply
#54
tkpenalty
CyberCTYou can call 3D a gimmick, but I call 6 monitors a gimmick, especially since there are thick bezels in between the monitors. If the bezels were .01" thick, then maybe I could agree with you. I've seen YouTube videos (and even a console setup at Best Buy) of a 6-monitor setup and it looks ridiculous. Like looking through a paned window:

www.youtube.com/watch?v=X6jYycRmWz4

3D, however, looks amazing, as everything jumps out at you.

But whatever, to each their own.
Bezels aren't really a problem, because in the end you see like literally 4000% more? It's like saying that the windows on the side of a car are useless because there are roof support structures on either side...

IMO it's useless in FPS games though.

Anyway, back on topic.
Posted on Reply
#55
Unregistered
Nice title..
I hope it's proven correct when these cards are released, and at a competitive price..

ok gonna go ride my unicorn across the river of chocolate now...
#56
the_wolf88
I don't believe a benchmark from Nvidia!!

We need to see reviews from other sources, not from Nvidia!!

Of course Nvidia will show itself winning in its own benchmarks, even if it is losing to ATI..

Anyway, 6 days to go and we'll see who is the KING OF HELL!!
Posted on Reply
#57
rizla1
simlariveryup, not only is the performance behind, but the feature set as well. No multi-monitor gaming (you can game on max 3 monitors with an SLI setup, fail), 3D gaming uses proprietary, expensive glasses (not that anyone cares about 3D gaming), and still no word on 7.1 sound over the HDMI output. Probably non-existent.
Who in the world uses 6 monitors for gaming other than the few super-rich? :wtf: And for desktop, 2 is enough, is it not?
Posted on Reply
#58
Zubasa
CyberCTDid you read that I have XP? I only have DX9 on my PC, which is perfectly fine. Even though I have an X360, I'd much rather have my games running at 60fps at 1080p than 30fps at 720p. To each their own.
Yes, I did read that you have XP, but just because you are running XP does not mean the issue can be ignored.

Newer games are developed with DX10 and 11 in mind; just because you don't care about quality doesn't mean quality does not matter.
There is an increase in quality from DX9 to DX10 and again with DX11 (with tessellation, which is the topic), but whether it is significant is your opinion.
The whole point of nVidia's "3D Vision" is also, in a way, to increase quality "to the eye"; if you don't care about that, why bother posting?

Eyefinity (the "stupid multi-monitor gaming", as you put it), on the other hand, is pretty much a fail-safe feature.
It does not require a high refresh-rate monitor, ultra high-end hardware, or expensive shutter glasses.

I think I have gone off-topic far enough; I will leave it here.
Posted on Reply
#59
temp02
Just one question: did they use different versions of the software to compare different hardware, like the benchmark result "released" the other day (where nVidia used a custom, not-yet-public build of the Heaven benchmark, v1.1), or is this an "OK" result where they ran both cards, on the same day, using the same version of the software?
Posted on Reply
#60
Zubasa
temp02Just one question: did they use different versions of the software to compare different hardware, like the benchmark result "released" the other day (where nVidia used a custom, not-yet-public build of the Heaven benchmark, v1.1), or is this an "OK" result where they ran both cards, on the same day, using the same version of the software?
From the way it looks, it is still the same "yesteryear" graph.
Posted on Reply
#61
Mussels
Freshwater Moderator
honestly, i shouldn't have said the "talking out your arse" part. "typing out your arse" would have made more sense - apologies for being rude, regardless.
Posted on Reply
#62
Black Panther
If this card runs only slightly better than the 5870 but turns out to be much more expensive (as has always been the norm for Nvidia), then one would be better off buying a 5970, I guess.
Posted on Reply
#63
simlariver
rizla1Who in the world uses 6 monitors for gaming other than the few super rich? :wtf: and for desktop 2 is enough is it not?
it's not only for gaming; having multiple monitors for visualization, remote management, CAD design, movie editing, security, etc. is definitely a plus.
Posted on Reply
#64
Mussels
Freshwater Moderator
simlariverit's not only for gaming; having multiple monitors for visualization, remote management, CAD design, movie editing, security, etc. is definitely a plus.
my brother works as a security guard where they use some ridiculously expensive hardware to get multi-monitor output for the banks of security cameras - a single Eyefinity 5/6 card would solve all their issues.
Posted on Reply
#65
locoty
3D, I enjoy it only in movies, not games.

Multi-monitor? Eyefinity? If it's not good, why did nVidia copy it? Why did they bother to release Nvidia Surround?

Someday, Samsung, LG, Dell or another LCD maker will release a monitor whose bezel can be taken off for the purpose of multi-monitor setups.

Fermi will be out next week, and I think ATI is preparing a 5890 to counter it; that's why ATI prohibited its partners from releasing high-clocked 5870s, because those are reserved for the 5890.
Posted on Reply
#66
Mussels
Freshwater Moderator
locotySomeday, Samsung, LG, Dell or another LCD maker will release a monitor whose bezel can be taken off for the purpose of multi-monitor setups.
i'm expecting that within a year or two actually, or at the very least a very thin bezel on the sides. Until Eyefinity, there was no real market to bother doing it for.
Posted on Reply
#67
runnin17
SasquiWhy compare one selective synthetic benchmark against a 5870, then turn around and provide a whole performance chart relative to a GTX 285?

Confused marketing.
Welcome to the wonderful world of nVidia marketing. They know that their card won't offer any true performance gains for the price and power draw, so they are relying on the nVidia fanboys to do all the buying and recommending of the card for them.

nVidia = fail.

I do hope the card is not a complete bust though, b/c I want some price wars.

PRICE WARS = Crossfire or even Tri-Fire 5870s for me :rockout::rockout::rockout::rockout:
Posted on Reply
#68
Fourstaff
the_wolf88Anyway 6 days to go and we'll see who is the KING OF HELL !!
It's obviously Kratos.
Posted on Reply
#69
newtekie1
Semi-Retired Folder
[I.R.A]_FBiBut if the aim is X amount of performance and not a certain amount of money spent or a certain part why should it matter?
I've never bought a part based on the aim of a certain amount of performance, and I don't think many others do either. I always buy the best part the budget can afford.

Comparing overclocked results to stock results in discussions like this always makes the person look silly.
Posted on Reply
#70
Unregistered
newtekie1I've never bought a part based on the aim of a certain amount of performance, and I don't think many others do either. I always buy the best part the budget can afford.

Comparing overclocked results to stock results in discussions like this always makes the person look silly.
And on that note, every benchmark I see using ANY OCs gets instantly dismissed by me,
which just so happens to be about 90% of benchmarks on the web.. that is, unless it's showing the difference between stock and an OC.
Posted on Edit | Reply
#71
Imsochobo
CyberCTDid you read that I have XP? I only have DX9 on my PC, which is perfectly fine. Even though I have an X360, I'd much rather have my games running at 60fps at 1080p than 30fps at 720p. To each their own.

Well, like you also said, to each their own. I see absolutely no reason for a 3- (let alone 6-) monitor setup for gaming. Maybe for day-trading stocks, but nothing else. I couldn't care less whether I turn my eyes left or right to see a car beside me, or quickly press a button to see the same thing on my single monitor. It would be more than a grand to get a 6-monitor setup (worthless for anything other than PC eye candy) with a very thin bezel; how much power would they consume altogether, and would it run at 60fps on the same setup as a single monitor? I find it strange that you say you have a hard time doing 3D in DX10/11, because Nvidia has native support for stereo 3D in Vista or Windows 7 only (DX9/DX10/11, not XP), and the reviews say it works great. I have to use D3D drivers or IZ3D drivers for stereoscopic 3D, both of which work fine for me. You can call 3D a gimmick, but I call 6 monitors a gimmick, especially since there are thick bezels in between the monitors. If the bezels were .01" thick, then maybe I could agree with you. I've seen YouTube videos (and even a console setup at Best Buy) of a 6-monitor setup and it looks ridiculous. Like looking through a paned window:

www.youtube.com/watch?v=X6jYycRmWz4

3D, however, looks amazing, as everything jumps out at you. I don't play like that daily, just once in a while as a treat or to show it off. An ultra-high-res monitor shrunk to 23" that supports 2560 x 1600 would be awesome because there would be no need for FSAA. Until then, a 56" DLP would probably take up no more real estate than a 6-monitor setup, consume less power, and be cheaper. It could also be used as a regular TV or monitor, and have no thick bezels in between. It could also do 3D if I please.

But whatever, to each their own.
Update to Win7 and see that you get DX11 at 75fps 1080p instead of DX9 at 60fps 1080p.

It's true ;)
Posted on Reply
#72
phanbuey
ImsochoboUpdate to Win7 and see that you get DX11 at 75fps 1080p instead of DX9 at 60fps 1080p.

It's true ;)
No it's not, because he gets 60fps whether he renders 60fps or 150,000fps; he picked that number because he knows what he's talking about. A lot of consoles render at 30 because they can't render the whole 60.
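
The 60fps figure here is the vsync cap at work; a minimal sketch of the idea (simplified, with a made-up function name):

```python
def displayed_fps(rendered_fps: float, refresh_hz: float = 60) -> float:
    """With vsync enabled the display shows at most one new frame per
    refresh, so anything the GPU renders above the refresh rate is wasted.
    (Simplified: with double buffering, a GPU that misses a refresh
    actually drops to refresh_hz / 2, refresh_hz / 3, and so on.)"""
    return min(rendered_fps, refresh_hz)

print(displayed_fps(150000))  # 60 - same as rendering exactly 60
print(displayed_fps(45))      # 45
```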
Posted on Reply
#74
Mussels
Freshwater Moderator
far cry 2 shows up an awful lot there... it's not even DX11 (and it's an nvidia-sponsored game, regardless)

the performance gains just aren't very high, considering how much more power this card uses at load.
Posted on Reply