
Pointless tessellation.

Actually the article isn't really a FAIL.

Here is what it says:


If you don't want to read all that:

Tessellation is good when you can see the difference, not on a flat object or in invisible water that runs beneath the ground where you can't see it.

Possible explanations for this are lack of time, incompetence, or lack of money.


Another explanation is that Nvidia is behind all this, including the update to DX11.

In this case tessellation slows down Nvidia's cards by 17-21% and ATI's by 31-38%.

It can mean that ATI's cards suck at Tessellation too.
 

Nvidia is known to have better tessellation. Almost all other benchmarks prove this. Also, Nvidia worked with Crytek to optimize it better on THEIR hardware. They didn't cripple ATI. They just made it work better on Nvidia. Fuck, we have this conversation EVERY DAMN TIME Nvidia touches a game.

Anyway the game sucks so who cares.

It can mean that ATI's cards suck at Tessellation too.

The 6xxx series doesn't. Not by a long shot.
 
And the water is NOT being rendered, just like nothing that is behind anything else is being rendered either.

Actually, it gives the GPU a lot of work:

From the same basic vantage point, we can whirl around to take a look at the terra firma of Manhattan. In this frame, there's no water at all, only some federally mandated crates (this is an FPS game), a park, trees, and buildings. Yet when we analyze this frame in the debugger, we see a relatively large GPU usage spike for a certain draw call, just as we saw for the coastline scene above.

The tessellated water mesh remains in the scene, apparently ebbing and flowing beneath the land throughout, even though it's not visible. The GPU is doing the work of creating the mesh, despite the fact that the water will be completely occluded by other objects in the final, rendered frame. That's true here, and we've found that it's also the case in other outdoor areas of the game with a coastline nearby.

Obviously, that's quite a bit of needless GPU geometry processing load. We'd have expected the game engine to include a simple optimization that would set a boundary for the water at or near the coastline, so the GPU isn't doing this tessellation work unnecessarily.
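The kind of optimization the article is asking for can be sketched in a few lines. This is a purely hypothetical illustration, not CryENGINE code: it assumes the terrain is a height-map grid with a fixed sea level, and simply skips any water tile that is buried entirely under terrain above sea level.

```python
# Hypothetical sketch: skip tessellating water tiles that are fully
# buried under terrain above sea level. Grid layout, names and the
# SEA_LEVEL constant are all made up for illustration.

SEA_LEVEL = 0.0

def visible_water_tiles(heightmap, tile_size):
    """Yield (row, col) of water tiles worth tessellating.

    heightmap: 2D list of terrain heights, one sample per grid cell.
    A tile is skipped when every terrain sample covering it sits above
    sea level, i.e. the water there can never be seen.
    """
    rows, cols = len(heightmap), len(heightmap[0])
    for r in range(0, rows, tile_size):
        for c in range(0, cols, tile_size):
            tile = [heightmap[i][j]
                    for i in range(r, min(r + tile_size, rows))
                    for j in range(c, min(c + tile_size, cols))]
            if min(tile) <= SEA_LEVEL:  # some water may be exposed here
                yield (r, c)

# Toy terrain: left half is coastline (below sea level), right half inland.
terrain = [[-1.0, -1.0, 5.0, 5.0],
           [-1.0, -1.0, 5.0, 5.0],
           [-1.0, -1.0, 5.0, 5.0],
           [-1.0, -1.0, 5.0, 5.0]]
print(list(visible_water_tiles(terrain, 2)))  # → [(0, 0), (2, 0)]
```

With a coarse check like this, the inland tiles never reach the tessellator at all, which is roughly the boundary-at-the-coastline idea the article describes.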
 
It can mean that ATI's cards suck at Tessellation too.

ATI has less tessellation power, that is correct, but why have tessellation where you don't need it?

Also, Nvidia worked with Crytek to optimize it better on THEIR hardware. They didn't cripple ATI. They just made it work better on Nvidia. Fuck, we have this conversation EVERY DAMN TIME Nvidia touches a game.

Is it a stretch to think that one company paid another to screw a third one?

Nvidia contacts Crytek and says, "I will give you money to make the update to DX11, but you have to arrange something to cripple ATI in this game"...
 
Actually, it gives the GPU a lot of work:

I understand the article perfectly, thank you very much. I also happen to know how to interpret it (unlike most commenting, apparently), so I know why the author is completely wrong. The key in that statement is "relatively". Relative to what? The author then goes and shows the barrier, the wooden planks, and some bricks. All of them are far more detailed than the water, as can be seen, and they represent a much, much bigger load than the water. There are literally dozens of those barriers, wooden planks, and bricks, countless similar objects in many scenes, and dozens of trees with thousands of leaves (at least 2 polys each) and a lot of other geometry-consuming scenery. Like the water, probably 80%+ of those objects, along with all of their polygons, are occluded (behind another object) and hence not rendered. Also, rendering is a lot more than loading up the triangles; in fact, loading up the triangles is only a very, very small portion of rendering. So again: the water is relatively big compared to what? Compared to the millions of other polys on screen? Certainly not. Compared to the millions and millions of pixel shader invocations running for every single frame? Again, no.
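The "relative to what" argument above can be put in rough back-of-envelope numbers. Every figure here is hypothetical and chosen only to show the scale of the comparison, not measured from Crysis 2:

```python
# Back-of-envelope comparison (all figures hypothetical, for scale only):
# geometry work on the hidden water mesh versus per-frame pixel shading
# work at 1920x1080.

water_triangles = 100_000      # assumed tessellated water mesh size
other_triangles = 2_000_000    # assumed rest of the scene geometry
pixels          = 1920 * 1080  # pixels shaded at least once per frame
overdraw        = 2.5          # assumed average shading passes per pixel

# Fraction of the frame's triangle load that the water represents.
geometry_share = water_triangles / (water_triangles + other_triangles)

# Pixel shader invocations per frame, counting overdraw.
pixel_invocations = int(pixels * overdraw)

print(f"water share of geometry work: {geometry_share:.1%}")
print(f"pixel shader invocations/frame: {pixel_invocations:,}")
```

Under these made-up assumptions the water is a few percent of the triangle load, dwarfed by millions of per-pixel shader invocations, which is the shape of the claim being made; whether the real numbers support it is exactly what the debugger captures in the article try to settle.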

Anyone who has used the editor knows that when you create a new scene, water is there by default. And creating an island surrounded by a jaw-dropping ocean is as easy as using a height map to represent the island. Crytek chose to have a very small compromise on performance by "leaving the water there", in exchange for having a very easy SDK for creating the beautiful islands of Crysis. Period.
 
This has nothing to do with Nvidia or ATI.

This is Crytek being lazy and not putting in a proper implementation of tessellation. They promised it and didn't deliver it; they only had so much time and probably no budget.

@bene
I don't know about the mesh and how much of a load it creates, but I do know it makes absolutely no sense to have such a dense mesh on a slab of cement. It seems they just converted it without optimizing it.
 
Too bad I uninstalled the game before the DX11 patch. I finished it and didn't enjoy it enough to let it keep taking up disk space.

Was waiting for the DX11 patch, but now I don't care.
 
Is it a stretch to think that one company paid another to screw a third one?

Nvidia contacts Crytek and says, "I will give you money to make the update to DX11, but you have to arrange something to cripple ATI in this game"...

Just look up Batman:AA on this very forum and you will see how dead that horse is.

 
Anyone who has used the editor knows that when you create a new scene, water is there by default. And creating an island surrounded by a jaw-dropping ocean is as easy as using a height map to represent the island. Crytek chose to have a very small compromise on performance by "leaving the water there", in exchange for having a very easy SDK for creating the beautiful islands of Crysis. Period.
Lazy, but I can understand.

@bene
I don't know about the mesh and how much of a load it creates, but I do know it makes absolutely no sense to have such a dense mesh on a slab of cement. It seems they just converted it without optimizing it.
That's what I was trying to point out: no point in having a gazillion polygons there when it is not needed.
 
I don't know about the mesh and how much of a load it creates, but I do know it makes absolutely no sense to have such a dense mesh on a slab of cement. It seems they just converted it without optimizing it.

It does make sense if you want to use it for displacement mapping, and they are using it for displacement mapping. So wherever there's a "crack" or a "scratch" in the cement you need that much density or it will look like crap, and IMHO even more will be needed in the future; I'm not completely satisfied with the end result, because it's still "blocky".

Is it completely optimized? No. They could have used variable densities, depending on where detail is most needed, but besides the obvious need for man-hours while completely time-constrained, that would also put a lot more strain on the hull and domain shaders (the configurable part of the tessellation pipeline), as well as widening the rendering pipeline even further (also taking up more CPU time) and increasing input lag, among other considerations. The added strain on the hull and domain shaders alone, which are calculated on the SPs, in an engine that is already heavy on shaders as it is, may actually degrade performance more than "brute forcing" the triangle density up to what is required to satisfactorily render the cracks and scratches. That's something we don't know.
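The "variable density" idea can be illustrated with a toy example. This is a hypothetical sketch (the `slab` displacement function and all thresholds are made up, and a real tessellator works on patches in a hull/domain shader, not recursively on the CPU): it subdivides a 1D strip only where a straight edge would miss the displacement detail.

```python
# Hypothetical sketch of variable tessellation density: subdivide a 1D
# surface strip finely where the displacement changes fast (cracks,
# scratches) and coarsely where it is nearly flat.

def adaptive_samples(displace, x0, x1, max_error=0.05, depth=0, max_depth=8):
    """Return sample positions whose straight-line interpolation stays
    within max_error of the displacement function on [x0, x1]."""
    mid = (x0 + x1) / 2
    # How badly would a single straight edge approximate the midpoint?
    linear_mid = (displace(x0) + displace(x1)) / 2
    if depth >= max_depth or abs(displace(mid) - linear_mid) <= max_error:
        return [x0, x1]
    left = adaptive_samples(displace, x0, mid, max_error, depth + 1, max_depth)
    right = adaptive_samples(displace, mid, x1, max_error, depth + 1, max_depth)
    return left + right[1:]  # drop the duplicated midpoint

# A mostly flat slab with one sharp "crack" near x = 0.5.
def slab(x):
    return 0.3 if 0.48 < x < 0.52 else 0.0

samples = adaptive_samples(slab, 0.0, 1.0)
# Sampling clusters around the crack; the flat regions stay coarse.
```

The trade-off described above shows up even in this toy: the error test runs for every candidate edge, which stands in for the extra hull/domain shader work, so "brute forcing" a uniform density can genuinely be cheaper than deciding adaptively.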

That's what I was trying to point out: no point in having a gazillion polygons there when it is not needed.

And who says when it's needed and when not? That's the question I want answered, because IMO that much geometry was needed for what is being rendered, and I would not use any fewer triangles than they are using. And yes, there are areas which are flatter (they are never truly flat), and since they are almost flat they could be made flat without a "great loss", like they've been doing for years, but then we are discussing a Blu-ray movie versus a 700 MB DivX movie.
 
Also, the statement "They simply underestimated the emphasis that NVIDIA would place on tessellation" is simply wrong. First of all, it's not Nvidia using it, it's the developer using it*. I'm not Nvidia, and I was expecting tessellation to be used the way it's being used in Crysis 2, because that's how tessellation has been used in "offline" renderers for the past 20 years. Expecting anything different is stupid, so I don't think that's what happened. I mean, it's not like tessellation is new and has never been used before... if AMD failed to understand what tessellation would be used for, and what it's desirable to use it for, then it's their own fault. As a gamer, all I know is that I can get a 130-euro card right now and play with those nice effects, albeit at low res, or pay $300 and play it maxed out. And it's not alien tech selling for $5,000 we are talking about; anyone can buy those things, at the same perf/price as the competition, if they feel tessellation is important, see a benefit, and don't care about brand.

* I don't remember anyone complaining or calling developers out when FEAR and Oblivion were launched and ran much faster on the Ati X1900, because they extensively used pixel shader effects at levels and quality mostly never seen before, and Ati had the good initiative of putting 4 pixel shaders per pipeline (4:1 ratio) in their design, as opposed to the traditional 1:1 designs used until then. Five years later the ratio is a lot bigger, and for a good reason, and games use pixel shaders far more extensively, also for a good reason. It's no different with tessellation, except that nowadays it seems everyone wants to believe that everything that helps Nvidia is because of some dodgy thing and not design.

Firstly... The statement is not wrong. NVIDIA did put a massive emphasis on the power of their Fermi architecture when it comes to tessellation.

Secondly... yes, the tessellation engine has been in ATI / AMD hardware since the 2900. I know that. They updated it to full DX11 compatibility for the 5000 series, as I am sure you are aware. A similar argument was brought up back when NVIDIA had the FX series, which were "fully compatible" with DX9 but sucked AT CERTAIN ASPECTS OF IT. NVIDIA fixed the issues they had with the release of the GeForce 6 series, and AMD have done a similar thing in that they have tried to solve their tessellation shortcomings with the 6xxx series, and as I mentioned, I am sure the 7xxx series will be even better.

Let us remember that NVIDIA had major shortcomings when it came to geometry rendering with the 8 and 200 series GeForce cards, which they addressed with the Fermi architecture... I am not complaining about NVIDIA here at all. Their cards are good. So are AMD's. At the end of the day, what the article is trying to get across is that there are issues with the engine that should have been addressed.

I also have a theory that, as they were testing on Radeon cards, there is perhaps a bug in the AMD drivers causing the "underground ocean".

Everyone complains about manufacturers when their hardware doesn't live up to their expectations, and sometimes SOME people go overboard and accuse the competing manufacturer of foul play. I have not once said that NVIDIA is guilty of foul play, so please don't make it look that way when you quote me :) You clearly know at least some of what you are talking about, and I do agree with some of your statements. I wouldn't mind being afforded the same courtesy.
 
Firstly... The statement is not wrong. NVIDIA did put a massive emphasis on the power of their Fermi architecture when it comes to tessellation.

I meant that it's wrong in that this should never be about besting the competitor; it should be about what's best for us, shouldn't it? I mean, that's what they say anyway (I know what the reality is :laugh:) and that's what the article claims too. And apparently less tessellation, even when it's possible to do a lot of it right here, right now, is what's best for us. Less detail is suddenly best for us... oh, funny.

EDIT: Because everybody knows that what we need is for benchmarks to run exactly the same on both brands; that's our goal. We do not want the best experience our money can buy, we want equality, even if that means lowering detail so the battlefield is evened out. Who wants to force the adoption of technologies by AMD/Nvidia, to force them to catch up, when we can just tone down anything that is higher than the norm? That's exactly what we want, yay, technology communism for all!

The bottom line is that AMD should not have tried to match Nvidia; they should have made what was best for us, and given DX11, that means packing as much tessellation power as the silicon permits, and they failed at that. They were not thinking about offering the best experience; they were thinking about besting Nvidia, and that's a mistake.


Oh, I'm sure they will "fix" (not that it really has to be fixed, but increase the numbers) tessellation performance on their next cards, but that's also admitting it needs to be "fixed"; it's an admission of defeat already. They fell short, way too short, and Nvidia demonstrated how it could be done, right here, right now, and on a 40nm process (tessellation is not what makes Fermi big; it's not the main reason). And that's simply the reality. Like I said before, and like you said, both have been ahead at some point with some technologies, and there's always been a game developer using that tech to stand above the rest, and more often than not that has put the balance in favor of either Ati or Nvidia. So I can't understand the bitching.
 
Crytek provided a FREE UPGRADE and everyone is crying and butthurt their rig can't run it. That's all this is about. If the writer of this article could run this game maxed out and get a massive e-peen boner, we wouldn't even know about the underground water. But he went all emo... over a free upgrade.
 

Wow. Just wow.

Not a free upgrade. A DELAYED part of the game, as advertised, that doesn't work very well.

So you would rather live in ignorance of the fact that your GPU is doing pointless processing?

Wow.
 

They didn't have to provide $%!t. They could have scrapped it at any time and it would have made no difference in the gameplay. Also, do you know EVERY little aspect of how ALL your games render? Do you even care as long as it runs smooth? I don't. It could have a f$#king hamster smoking crack to power a steam engine behind a 50-foot dildo as long as it runs smooth.

You wanted to play Crysis 2? They sold you Crysis 2. They gave you a FREE upgrade with it. You have no grounds to complain.
 

I think you're going a bit far, and being a little dramatic here man :P

I like the hamster/steam engine/dildo idea (very creative), but he did pay for the game (maybe), so he does deserve to criticize it or complain about something they said would have been included originally. There's no need to stomp all over people expressing their opinions and tell them to stop complaining. That's what the Nazis did...
 

When he bought the game it was "as is", which is stated in the EULA. Anything more they give you without cost is free. A gift. Be grateful they gave you anything. Instead, all I hear is "I wanted a red bicycle, not a blue one", like a bunch of 5-year-olds on Xmas day.

I mean, some are even blaming the super-villain Nvidia for all their woes. It's stupidity and immaturity at its finest.
 
Complain Complain Complain... :D
 
Mailman, don't you ever get tired of telling people to stop complaining and shut up? lol...
 

I get tired of constantly telling people "I told ya so". That, and it's tough being right most of the time. People get mad and call you a troll and such. The life of a winner isn't easy.
 
No, Mailman likes to simplify things into something they are not. He is wrong on so many levels it has to be on purpose, and that's why he is known as a "troll".
 

Show me how I am wrong.
 