Friday, March 19th 2010

NVIDIA Claims Upper Hand in Tessellation Performance

A set of company slides leaked to the press reveals that NVIDIA is claiming the upper hand in tessellation performance. With this achievement, NVIDIA is looking to encourage leaps in geometric detail, probably in future games that make use of tessellation. NVIDIA's confidence comes from the way its GF100 GPU is designed (explained further here). Each GF100 GPU physically has 16 PolyMorph Engines, one per streaming multiprocessor (SM), which enables distributed, parallel geometry processing. Each PolyMorph Engine has its own tessellation unit. With 15 SMs enabled on the GeForce GTX 480 and 14 on the GeForce GTX 470, each card has that many independent tessellation units.

NVIDIA demonstrated its claims in the presentation using the Unigine Heaven benchmark, where the GeForce GTX 480 was pitted against a Radeon HD 5870. In scenes with lighter tessellation, the GPUs performed neck-and-neck, with the GTX 480 coming out ahead more often. But in scenes with heavy tessellation (particularly the "dragon" scene, where a highly detailed model of a dragon is rendered with densely tessellated meshes), the GTX 480 clocks nearly a 100% performance advantage over the HD 5870. NVIDIA has been confident about its tessellation performance since January, when it detailed the GF100 architecture. The GeForce GTX 400 series graphics cards will be unveiled on the 26th of March.
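As a rough illustration of the scaling argument behind those unit counts (a hypothetical back-of-the-envelope model, not NVIDIA's own math): if geometry work distributes evenly across the engines, tessellation throughput would scale roughly linearly with the number of enabled units.

```python
# Hypothetical back-of-the-envelope model: assumes tessellation throughput
# scales linearly with the number of enabled PolyMorph Engines. The unit
# counts come from the article; the linear-scaling assumption is illustrative.

def relative_throughput(tessellation_units, per_unit_rate=1.0):
    """Idealized aggregate rate across independent tessellation units."""
    return tessellation_units * per_unit_rate

gtx_480 = relative_throughput(15)  # 15 SMs enabled -> 15 tessellation units
gtx_470 = relative_throughput(14)  # 14 SMs enabled -> 14 tessellation units

print(f"GTX 480 vs GTX 470, idealized: {gtx_480 / gtx_470:.2f}x")  # 1.07x
```

In practice the gap NVIDIA shows against the HD 5870 would come from the per-SM design itself, not unit count alone; this sketch only shows why the 480 and 470 differ.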

Images Courtesy: Techno-Labs

145 Comments on NVIDIA Claims Upper Hand in Tessellation Performance

#1
SetsunaFZero
rizla1 said:
wait for the gtx 460 . that will be a gud card. for middleclass
yes, like the gtx260 series but only if the price is good :)
#2
Mussels
Moderprator
yay, nvidia is faster in a single feature no one cares about.

If they were faster in games, they'd be advertising that - and they arent.

Fermi looks like its still going to be a massive fail.
#3
_33
Now if there could be other games than Dirt 2 that actually use Tessellation... ;)
#4
[I.R.A]_FBi
fermi is 1.5 times better at making em sweat, it is a market leader!
#5
simlariver
Mussels said:
yay, nvidia is faster in a single feature no one cares about.

If they were faster in games, they'd be advertising that - and they arent.

Fermi looks like its still going to be a massive fail.
yup, not only is the performance behind, but the feature-set as well. No multimonitor gaming (you can game on max 3 monitors with an SLI setup, fail), 3D gaming uses proprietary expensive glasses (not that anyone cares about 3D gaming), and still no word on 7.1 sound over the HDMI output; probably non-existent.
#6
CyberCT
simlariver said:
yup, not only is the performance behind, but the feature-set as well. No multimonitor gaming (you can game on max 3 monitors with an SLI setup, fail), 3D gaming uses proprietary expensive glasses (not that anyone cares about 3D gaming), and still no word on 7.1 sound over the HDMI output; probably non-existent.
Wrong. Multimonitor gaming is stupid. Who in their right mind is going to pay for 6 monitors to play a game, with a monitor bezel between each screen interrupting the total image? That defeats the whole purpose.

3D gaming is awesome, and until you experience it first hand, you have no idea what you're talking about or smoking. The glasses are not that expensive and I have yet to hear one of my friends (both male and female) say 3D gaming is not amazing on my 56" DLP.

I agree that Nvidia dropped the ball this round. But your other statements make absolutely no sense and you're a fanboy. Owned. :slap:
#7
Mussels
Moderprator
i've used 3D glasses for gaming recently, and back when they first came out (Geforce 3 era)

They sucked then, and they suck now.

Multi monitor gaming is not as bad as you think it is, LCD's have fairly thin bezels, and thats the point of running 3 monitors - you dont have a bezel in front of you, you just have extra peripheral vision if you turn your head.

Am i interested in 3 monitors for FPS games? no, not at all. But at the same time i know just how flawed 3D gaming is and how problematic it is. Its doomed to failure all over again.

Heres a few facts: it requires 120Hz screens, and your in game FPS to be 120 (or in some cases, just 60FPS doubled) - thats a CONSTANT 60FPS, not a max, not an average. you drop your FPS to 30, and you're getting a nauseating slideshow.

it requires powered 3D glasses to use - that means batteries or corded. they're expensive, and a pain to replace should they break.

DirectX 10 has no way to force refresh rates, i assume DX11 is the same. That means in most games you play these days, you cant force 120Hz - so 3D wont work (this ofc may get fixed over time... but its taken several years now, so i dont see it happening any time soon)
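The refresh-rate arithmetic in the post above can be sketched out. This assumes active-shutter glasses that alternate frames between eyes, so each eye sees half the rendered frame rate:

```python
# Sketch of the stereo frame budget described above (assumption: active-shutter
# 3D alternates rendered frames between eyes, halving the per-eye rate).

def per_eye_fps(rendered_fps):
    """Each rendered frame goes to one eye, alternating left/right."""
    return rendered_fps / 2

print(per_eye_fps(120))  # 60.0 per eye: smooth, the target the post describes
print(per_eye_fps(30))   # 15.0 per eye: the "nauseating slideshow" territory
```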
#8
CyberCT
Mussels said:
i've used 3D glasses for gaming recently, and back when they first came out (Geforce 3 era)

They sucked then, and they suck now.

Multi monitor gaming is not as bad as you think it is, LCD's have fairly thin bezels, and thats the point of running 3 monitors - you dont have a bezel in front of you, you just have extra peripheral vision if you turn your head.

Am i interested in 3 monitors for FPS games? no, not at all. But at the same time i know just how flawed 3D gaming is and how problematic it is. Its doomed to failure all over again.

Heres a few facts: it requires 120Hz screens, and your in game FPS to be 120 (or in some cases, just 60FPS doubled) - thats a CONSTANT 60FPS, not a max, not an average. you drop your FPS to 30, and you're getting a nauseating slideshow.

it requires powered 3D glasses to use - that means batteries or corded. they're expensive, and a pain to replace should they break.

DirectX 10 has no way to force refresh rates, i assume DX11 is the same. That means in most games you play these days, you cant force 120Hz - so 3D wont work (this ofc may get fixed over time... but its taken several years now, so i dont see it happening any time soon)
I don't follow what you're saying. I use the NVIDIA control panel to force vsync and it works. I don't understand your problem with 3D gaming. It works perfectly for me in all games except Crysis. Well, actually, Crysis works fine in 3D if I set everything to low. But it still looks incredible on low. You could get a large DLP from Mitsubishi that supports 3D much cheaper than you would get 6 LCD monitors. Not to mention the power consumption benefit too. The 3D glasses were $100 for two BTW. Not that expensive at all considering the final result. I have the Samsung HL56A650. Awesome TV for the price almost 2 years ago.

Again. Perhaps you should try it now. I have a GTX 285 FTW from EVGA and my E8500 OC'd to 3.8 GHz. Works great with 3D gaming like I mentioned above. Sadly, I think I'll sit out this round until the next gen of cards that are worth the price. My GFX card is awesome and runs great with everything except Crysis.
#9
newtekie1
Semi-Retired Folder
Steevo said:
Even at stock I never went below 20FPS during the Dragon scene. At my current overclock I stay around 25FPS at the dragon scene, and that is before the new drivers (10.3a). Plus, the benchmark Nvidia was using was version 1.1 according to many; the newer version includes up to a 30% performance increase during heavy tessellation thanks to aggressive culling.


So their effective 100% boost can be cut down 30% by the use of the new Heaven benchmark, plus the 10-20% from the new drivers ATI is releasing, and the ability to overclock most 5870s by at least 20%.


So this is Nvidia's fancy way of saying that with selective math, we can bend numbers our way, all day. Just like I could say:

Available DX11 video card manufacturers:

ATI

So ATI has the fastest DX11 video card on the market in any and all configurations.
I love it every time someone uses the "but the slower part can be overclocked to match the faster, ignore the fact that the faster part can be overclocked also, that isn't important..."

I'm waiting for some real reviews before I make any judgement. Personally, if they both perform equally in DX10 games, I don't care, as they will both destroy DX10 games. I'm more interested in DX11 performance, mainly tessellation, so if the GTX 480/470 is better at that, then that IMO is the important thing.
#10
Super XP
[I.R.A]_FBi said:
fermi is 1.5 times better at making em sweat, it is a market leader!
If NVIDIA's Marketing Team wanted, they could easily sell a bowl of soup, name it GTX 450, put a nice sticker on it with a heatsink & fan, post some benchmarks, and most likely you would have people go nuts for the darn thing :laugh:
CyberCT said:
Wrong. Multimonitor gaming is stupid. Who in their right mind is going to pay for 6 monitors to play a game, with a monitor bezel between each screen interrupting the total image? That defeats the whole purpose.

3D gaming is awesome, and until you experience it first hand, you have no idea what you're talking about or smoking. The glasses are not that expensive and I have yet to hear one of my friends (both male and female) say 3D gaming is not amazing on my 56" DLP.

I agree that Nvidia dropped the ball this round. But your other statements make absolutely no sense and you're a fanboy. Owned. :slap:
Multi-Monitor for gaming is awesome if you think about it. Though I don't think its for everybody, its an option that some will buy into providing you got the funds. You don't have to buy 3 monitors right away, you can buy one then wait a bit save up and buy another. They also have some nice thin bezels out there.
I am not a fan of 3D glasses; I think it's stupid that you have to wear glasses to play games IMO. Some may like it and others may not. What would be interesting is if they could do it without glasses.

Just like the 3D HDTVs companies are trying to sell us: it stinks, and I hope it fails unless they stop ripping us off by forcing us to pay $300+ for one pair of 3D glasses. Most if not all HDTVs out today can already do 3D "WITHOUT" 3D glasses. That is what they should be pushing for, not this nonsense of 3D glasses and so-called 3D HDTVs.
#11
[I.R.A]_FBi
newtekie1 said:
I love it every time someone uses the "but the slower part can be overclocked to match the faster, ignore the fact that the faster part can be overclocked also, that isn't important..."
But if the aim is X amount of performance and not a certain amount of money spent or a certain part why should it matter?
#12
afw
This is taking forever ... I want to see some reviews ... :pimp:
#13
eidairaman1
The Exiled Airman
If they were so confident in the performance, why wasn't the board released in late January or mid-February? TBH Nvidia has made so many promises they can't keep, it's just like the current President of the United States (One Big Ass Mistake America)!!! And no, you can't blame the previous administration, as this one has done nothing but lie to our faces and has tried saying the others are idiots when the others have way better plans and ideas than the current losers do.

3D glasses tech is such a gimmick; it was just a money-stealing technique, when you can make your own 3D glasses for about 25 cents.

Mussels said:
i've used 3D glasses for gaming recently, and back when they first came out (Geforce 3 era)

They sucked then, and they suck now.

Multi monitor gaming is not as bad as you think it is, LCD's have fairly thin bezels, and thats the point of running 3 monitors - you dont have a bezel in front of you, you just have extra peripheral vision if you turn your head.

Am i interested in 3 monitors for FPS games? no, not at all. But at the same time i know just how flawed 3D gaming is and how problematic it is. Its doomed to failure all over again.

Heres a few facts: it requires 120Hz screens, and your in game FPS to be 120 (or in some cases, just 60FPS doubled) - thats a CONSTANT 60FPS, not a max, not an average. you drop your FPS to 30, and you're getting a nauseating slideshow.

it requires powered 3D glasses to use - that means batteries or corded. they're expensive, and a pain to replace should they break.

DirectX 10 has no way to force refresh rates, i assume DX11 is the same. That means in most games you play these days, you cant force 120Hz - so 3D wont work (this ofc may get fixed over time... but its taken several years now, so i dont see it happening any time soon)
you can include no Vista and 7 Support for NF2 and 3 motherboards from them either (greedy Bastards)

MarcusTaz said:
Did not Nvidia just release a driver that caused video cards to overheat and fry? ;)

I will stick with ATI as far as the eye can see. Nvidia will not get my business, not after their 7900GTX debacle, the Vista driver and 2D memory sync issues, and never fessing up to them. They took my money then and will not take it again. Plus everyone knows they cheat with benchmarks. :D

I am very happy with my HD5850 thank you that is green and runs cool...:)
#14
Mussels
Moderprator
CyberCT said:
I don't follow what you're saying. I use the NVIDIA control panel to force vsync and it works. I don't understand your problem with 3D gaming. It works perfectly for me in all games except Crysis. Well, actually, Crysis works fine in 3D if I set everything to low. But it still looks incredible on low. You could get a large DLP from Mitsubishi that supports 3D much cheaper than you would get 6 LCD monitors. Not to mention the power consumption benefit too. The 3D glasses were $100 for two BTW. Not that expensive at all considering the final result. I have the Samsung HL56A650. Awesome TV for the price almost 2 years ago.

Again. Perhaps you should try it now. I have a GTX 285 FTW from EVGA and my E8500 OC'd to 3.8 GHz. Works great with 3D gaming like I mentioned above. Sadly, I think I'll sit out this round until the next gen of cards that are worth the price. My GFX card is awesome and runs great with everything except Crysis.
forcing Vsync has nothing to do with games not supporting the higher refresh rate - when i first used it, if your game couldnt be forced to 120Hz, 3D didnt work. Perhaps thats different now, and it only needs to run at 60 and the drivers do the rest.
#15
CyberCT
Super XP said:
Multi-Monitor for gaming is awesome if you think about it. Though I don't think its for everybody, its an option that some will buy into providing you got the funds. You don't have to buy 3 monitors right away, you can buy one then wait a bit save up and buy another. They also have some nice thin bezels out there.
I am not a fan of 3D glasses; I think it's stupid that you have to wear glasses to play games IMO. Some may like it and others may not. What would be interesting is if they could do it without glasses.

Just like the 3D HDTVs companies are trying to sell us: it stinks, and I hope it fails unless they stop ripping us off by forcing us to pay $300+ for one pair of 3D glasses. Most if not all HDTVs out today can already do 3D "WITHOUT" 3D glasses. That is what they should be pushing for, not this nonsense of 3D glasses and so-called 3D HDTVs.
I never understood what was so cool about multimonitor gaming. My monitor is 23" and I can't imagine putting more monitors on my desk (they won't fit anyway) for a "wow" factor. I feel this is a countermeasure by ATI against Nvidia's native support for 3D gaming. Heck, I can play nearly all my games at 1080p at 60fps on my 56" HDTV and they look stunning. The TV isn't sharp enough to show jaggies beyond 2x FSAA, so the games look flawless. And that's about the same if not more real estate than 6 monitors next to each other (plus you have to add the hardware to mount all of them).

If you ever tried 3D gaming you would be absolutely amazed. Games like Bioshock (wow), Left 4 Dead, Mirror's Edge, Crysis (low), etc ... the list goes on. They all run at 120 fps so they run very smooth in 3D. Like I said, every guy and girl I've had over to my place to witness it was absolutely stunned, like me. You absolutely must try it.

I can't understand why you're saying you hope it fails. I hope it catches ground if anything.

There is a possibility of having 3D without glasses. I forget which company showcased it at some trade show (maybe Samsung) but it was preliminary and still a ways off from mainstream production. Without glasses would be much better, I agree with you there.

I have no idea where you're getting $300 for glasses. Mine cost me $50 for one pair for the tridef setup. Shop around and you'd be amazed what you could find.
#16
Mussels
Moderprator
supreme commander worked well for multi monitor, one monitor was the game, the other a giant version of the minimap (with full zoom functionality - same as first screen, but no HUD)

multimonitor never took off in games because until now, very very few (matrox only, really) video cards allowed THREE monitors at once. HD5K is the first gaming-grade card to offer that, so you dont have a gap right smack bang where your crosshair would be in an FPS game.


so you're saying you can run all those games at 120FPS constant, at what resolution and settings, on what hardware?
#17
eidairaman1
The Exiled Airman
Sorry, $50 for a gimmick that doesn't look good worn out in the world and provides no functionality away from the monitor is lamer than buying the world's best pair of Oakleys or Ray-Bans.


CyberCT said:
I never understood what was so cool about multimonitor gaming. My monitor is 23" and I can't imagine putting more monitors on my desk (they won't fit anyway) to a "wow" factor. I feel this is a countermeasure by ATI against Nvidia's native support for 3D gaming. Heck, I can play nearly all my games at 1080p at 60fps on my 56" HDTV and they look stunning. The TV isn't sharp enough to show jaggies beyond 2x FSAA so the games look flawless. And that's about the same if not more real estate than 6 monitors next to each other (also you have to add the hardware to mount all of them).

If you ever tried 3D gaming you would be absolutely amazed. Games like Bioshock (wow), Left 4 Dead, Mirror's Edge, Crysis (low), etc ... the list goes on. They all run at 120 fps so they run very smooth in 3D. Like I said, every guy and girl I've had over to my place to witness it was absolutely stunned, like me. You absolutely must try it.

I have no idea where you're getting $300 for glasses. Mine cost me $50 for one pair for the tridef setup. Shop around and you'd be amazed what you could find.
#18
CyberCT
Mussels said:
supreme commander worked well for multi monitor, one monitor was the game, the other a giant version of the minimap (with full zoom functionality - same as first screen, but no HUD)

multimonitor never took off in games because until now, very very few (matrox only, really) video cards allowed THREE monitors at once. HD5K is the first gaming-grade card to offer that, so you dont have a gap right smack bang where your crosshair would be in an FPS game.


so you're saying you can run all those games at 120FPS constant, at what resolution and settings, on what hardware?
I have an E8500 OC'd to 3.8 GHz and an EVGA GTX285 FTW edition running on XP at 1080p on my HDTV for 3D gaming. Like I said, I'll probably pass on this round of cards because I'm a bit disappointed with their performance (only taking NVIDIA's preliminary review numbers though). My card runs everything fine. I can OC my CPU and GFX card more too, just in case. They work great!
#19
eidairaman1
The Exiled Airman
CyberCT said:
I have an E8500 OC'd to 3.8 GHz and an EVGA GTX285 FTW edition running on XP at 1080p on my HDTV for 3D gaming. Like I said, I'll probably pass on this round of cards because I'm a bit disappointed with their performance (only taking NVIDIA's preliminary review numbers though). My card runs everything fine. I can OC my CPU and GFX card more too, just in case. They work great!
Here's the other problem: games are starting to transition to Vista and 7 only, basically meaning DX10/DX11 support only, no DX9 support any longer.
#20
CyberCT
eidairaman1 said:
Sorry 50 USD for a Gimmick that doesn't look good wearing them out in the world or provide any functionality other than the Monitor is lamer than buying the Worlds Best Pair of Oakley's or Ray Ban.
Haha!! Whatever. Enjoy your 6 monitor setup. That will look funnier on a desk than any Buddy Holly glasses on your face. To each their own. You don't know what you're missing out on. I guess ignorance is bliss??
#21
CyberCT
eidairaman1 said:
Here's the other problem: games are starting to transition to Vista and 7 only, basically meaning DX10/DX11 support only, no DX9 support any longer.
Well I have a HUGE library of games that supports DX9 and runs at 120 FPS to match the 120 Hz on my HDTV, so no worries here.
#22
eidairaman1
The Exiled Airman
CyberCT said:
Well I have a HUGE library of games that supports DX9 and runs at 120 FPS to match the 120 Hz on my HDTV, so no worries here.
Ensure you use the multi-quote button at the bottom of each post you want in a single post, instead of posting one right after the other. Vista and 7 do include DX9 support, so you can play your games from around 2002/2003 to 2009.
#23
Mussels
Moderprator
so in other words cyberCT, you're talking out your arse.

I specifically mentioned DX10 and 11 for both performance and refresh rate reasons and you told me that yours works fine - but if you're in XP, you cant use DX10 or 11.

Please dont try and tell us everything is awesome and great, when you arent running at even medium graphics (DX10) in modern games.

(stalker CoP, Metro 2033 and bad company 2 all offer DX9, 10 and 11 modes - and DX11 is the best looking and most demanding in all of them)
#24
CyberCT
Mussels said:
so in other words cyberCT, you're talking out your arse.

I specifically mentioned DX10 and 11 for both performance and refresh rate reasons and you told me that yours works fine - but if you're in XP, you cant use DX10 or 11.

Please dont try and tell us everything is awesome and great, when you arent running at even medium graphics (DX10) in modern games.

(stalker CoP, Metro 2033 and bad company 2 all offer DX9, 10 and 11 modes - and DX11 is the best looking and most demanding in all of them)
LOL! :laugh: I'm not talking out of my arse. I know I can't use DX10 or DX11 in XP. Until the next gen of consoles, DX10 or DX11 will take off very slowly because obviously the X360 & PS3 run on DX9.

Tell me what I'm majorly missing in DX10 & DX11 compared to DX9. My buddy has Windows 7 and an ATI 5850 card. Far Cry 2 looks "barely" better in DX10. Halo 2 looks like a high-res X360 version in DX10. It's another game that could have been done in DX9 (and others have made it work on DX9 on the net), but it was a ploy by MS to push Vista, just like Crysis "Very High" mode, which is easily done on DX9 with a few changes to the files. We both have BC2 also, and I do see a little bit of a difference in DX11. But not enough to make me excited. Actually, I'll bet DICE could have made it look just as good under DX9, but they didn't try, perhaps because of MS. Heck, FSAA isn't possible in BC2 in DX9, and that's just BS in my book, because it runs better on my PC than my friend's without FSAA on either system. FSAA in DX9 is possible in the rest of the games I own.

I have no interest in Stalker CoP, Metro 2033, or the new AVP game. Maybe DIRT 2, but I'll wait until later this year / next year when I upgrade, and I'm sure it will be packaged with the DX11 card I buy. Look at people's opinions around the web on DX11 games thus far. People aren't that enthused about how much "better" games look compared to the DX9 versions, nor how much slower they run.

All my games that are DX9 I run at max settings with at least 2XFSAA (except Crysis) and they run at a very fluid frame rate without slowdown. I don't have any games that are DX8 or below installed on my system. Only DX9.

Until we see the Xbox 720 & PS4 we are stuck at a stalemate for mainstream technology progression.
#25
AsRock
TPU addict
If true, they beat a 6-month-old card, lol. I'm sure W1z will prove what is true and what is not :).