
Next-gen NVIDIA GeForce Specs Unveiled, Part 2

That is too much for me to pay! Why would you spend that much money?
 
I'm almost afraid to say it, but it's strange that the ATI cards do so much better in benchmarks. (Once again, I may regret saying this.) It does seem as if a lot of games are specifically designed with NVIDIA cards in mind.
 
The problem with this opinion is that back then the performance increase WAS NEEDED. As it stands now, the added performance is NOT NEEDED at such a high premium. A lot of high-end card owners aren't overclocking like they did in days past, which is a clear indicator that we've plateaued above average frame rates in most games. Sure, e-peen dictates that higher FPS is better than what you get now. However, anyone with a card that can already play games at above-average frame rates without even overclocking (in most cases) would be hard pressed to justify a card that costs $600+.


Exactly. No matter the GPU brand, over the last year the midrange cards ($150-250) have proved to be enough to play all titles at a decent FPS. When VIA, S3, or Intel has a GPU capable of achieving a decent FPS, people will buy it because of the price/performance ratio. The best-buy card will be the one with the best $/fps ratio, a tendency which already rules the market.
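To make the $/fps idea concrete, here is a tiny sketch with made-up prices and frame rates (every number below is hypothetical, just to show how the metric works):

```python
# Hypothetical cards: (name, price in USD, average FPS in some game).
cards = [
    ("High-end card", 600, 70),
    ("Midrange card", 220, 50),
]

# Dollars paid per frame per second; the lower the number, the better the buy.
for name, price, fps in cards:
    print(f"{name}: {price / fps:.2f} $/fps")
```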

I don't really understand why people must be fanboys of NVIDIA or ATI. The point is to buy the card that suits you best and use it for a few years without upgrading. I don't like it when mud is thrown from both sides just to prove that "I have a card from the first and best GPU manufacturer in the world" (whether that's NVIDIA or ATI). Who cares?
 
I don't know how accurate this information is, especially since we haven't seen any other reports of it from reputable sources. I think I'll wait to believe the specs until the cards are actually out, but the shader speeds on these cards seem a little low to me. I know there are more of them, but dropping the clocks that much seems insane.
 
I'm almost afraid to say it, but it's strange that the ATI cards do so much better in benchmarks. (Once again, I may regret saying this.) It does seem as if a lot of games are specifically designed with NVIDIA cards in mind.

Of course, because it's much easier to get ALMOST ALL developers to make games run faster on your card than to make 2-3 benchmarks run faster. Funny how corrupt all game developers are, yet benchmark developers are so incorruptible...
 
No doubt NVIDIA will end up releasing a GT 280 that becomes the next-gen 8800 GT.

So most probably it's just a waiting game for the price-vs-performance-minded people after the GTX 280 is released.
 
I'm almost afraid to say it, but it's strange that the ATI cards do so much better in benchmarks. (Once again, I may regret saying this.) It does seem as if a lot of games are specifically designed with NVIDIA cards in mind.
Or that benchmarks are optimized for ATi.
*shrug*

The reason is in the architecture. For a game to run well on ATI R6xx-generation GPUs, some heavy optimizations are required, as the R6xx architecture is so different from earlier designs. And then of course there are the obvious flaws in R600/RV670, like the absolutely horrible texture filtering capabilities and the innate inefficiency of the superscalar shaders. RV770 will partially fix the texture filtering shortcomings, but unfortunately the TMUs are only doubled, so RV770's texturing muscle will still clearly trail that of even G92.
 
Six hundred dollars!!! :eek: awww..

Specs look good, but man.. NVidia is hunting for fat wallets again :twitch:

They are on crack at that price. Get real.
 
Of course, because it's much easier to get ALMOST ALL developers to make games run faster on your card than to make 2-3 benchmarks run faster. Funny how corrupt all game developers are, yet benchmark developers are so incorruptible...

This is not about corrupt people; it's about the fact that NVIDIA and ATI cards are very different.
When you make a new game, for instance, you have to optimize the code so that it takes advantage of the GPU's hardware. This process takes time, and you don't generally have time to cover all aspects and explore all the resources of two cards that work in different ways. So developers have to choose. NVIDIA gives them all the help they need in understanding and using the nooks and crannies of its cards. So games run better on NVIDIA cards.

I'm trying to stay neutral here; the only reason I like ATI cards right now is because the competition is overcharging.
 
When the Ultra came out and the price was a lifetime of slavery, some people still tried to buy three of them.
 
This is not about corrupt people; it's about the fact that NVIDIA and ATI cards are very different.
When you make a new game, for instance, you have to optimize the code so that it takes advantage of the GPU's hardware. This process takes time, and you don't generally have time to cover all aspects and explore all the resources of two cards that work in different ways. So developers have to choose. NVIDIA gives them all the help they need in understanding and using the nooks and crannies of its cards. So games run better on NVIDIA cards.

I'm trying to stay neutral here; the only reason I like ATI cards right now is because the competition is overcharging.

Have you ever read any game developer's blog? They write specific code not only for each brand, but for almost every card architecture. They may then optimize the NVIDIA-specific code better under TWIMTBP, because NVIDIA gives them extensive support. In that respect the code for NVIDIA hardware may be better optimized, but each card has its own code path.

Long before TWIMTBP, many first-tier game developers said the way 3DMark did things WAS NOT how they were going to do things in the future. AFAIK this never changed, and it's the same with 3DMark 06 too. Saying that the ATI architecture is any better based on these kinds of benchmarks implies that those benchmarks are doing things right, which they aren't.

This is something that has always bugged me. People say that game developers are writing code "NVIDIA's way", but they never stop to think that MAYBE NVIDIA is designing its hardware the "game developers' way", while ATI may not be. TWIMTBP is a two-way relationship. Over time ATI has become better (comparatively) in benchmarks and worse in games. Everybody blames TWIMTBP for this, without taking into account each company's design decisions. For a simple example, ATI's superscalar, VLIW and SIMD shader processors are A LOT better suited to the HOMOGENEOUS CODE involved in a static benchmark than to the ever-changing code involved in a game. Also, in the case of R600 and one of its biggest flaws, its TMUs, benchmarks are a lot more favorable than games, since you can "guess" which texture comes next and you don't have to care about textures that have already been used. In a game you don't know where the camera will head next, so you don't know which textures you can discard. In reality neither architecture can "guess" the next texture, but G80/G92, with its greater texturing power, can react better to texture changes. In benchmarks R600/RV670 can mitigate this effect with streaming.
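To illustrate the VLIW point with a toy model (not real shader code; the 5-slot width and the operation counts are just assumptions for the sake of the example): a 5-wide VLIW unit only stays full when the compiler can find enough independent operations to pack into each instruction word, which steady, predictable benchmark code tends to allow far more often than branchy game code.

```python
# Toy model of a 5-wide VLIW shader unit (illustrative numbers only).
SLOTS = 5

def cycles_needed(independent_ops_per_bundle: int, total_ops: int) -> int:
    """Cycles to issue total_ops when only `independent_ops_per_bundle`
    operations can be packed into each 5-wide instruction word."""
    packed = min(independent_ops_per_bundle, SLOTS)
    return -(-total_ops // packed)  # ceiling division

# Homogeneous, predictable code: the compiler fills all 5 slots per cycle.
print(cycles_needed(5, 1000))  # 200 cycles, no idle slots
# Dependent, branchy game code: only ~2 independent ops per bundle.
print(cycles_needed(2, 1000))  # 500 cycles, 3 of 5 slots idle each cycle
```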
 
Well, with 1 billion transistors does it even matter how many ROPs, shaders, etc it has? LOL *sarcasm*

It has way more switching power, which means ownage (as long as they don't screw it up like the FX series). Clocks are irrelevant (overall).
 
Latest upcoming NVIDIA offerings

All I want to know is: can it play Crysis on Very High at 1920x1200?
 
Wow, $600 is a bit steep. I wonder how that'll translate into pounds over here. I reckon £400 :(

Here's an interesting comparison chart:
[comparison chart image]


If the 4870 is nearly as fast as the GTX 280, it'll be a much better buy, I think.

Even if the NVIDIA options turn out to be better (by, say, 10% to 15%), if this power consumption is real, I would buy an ATI every time.

Only if the difference turns out to be more substantial would I consider buying an NVIDIA card, but NOT at that price!
 
Even if the NVIDIA options turn out to be better (by, say, 10% to 15%), if this power consumption is real, I would buy an ATI every time.

Only if the difference turns out to be more substantial would I consider buying an NVIDIA card, but NOT at that price!

Well, if it's real, then I can say that ATI is missing transistors b/c they're the ones always sucking down the juice :p
 
Even if the NVIDIA options turn out to be better (by, say, 10% to 15%), if this power consumption is real, I would buy an ATI every time.

Only if the difference turns out to be more substantial would I consider buying an NVIDIA card, but NOT at that price!

The thing is that, from the looks of the specs, RV770 will have 2x the power of RV670 and GT200 will have 2x that of G92. That would make the GTX 280 40-50% faster than the HD 4870, and the GTX 260 could end up being 25% faster.
It's early to say anything, but most probably each card will have its market segment and ALL of them will have similar price/performance, as has almost always been the case. The GTX cards also won't lag far behind in performance-per-watt, always based on these specs, which I don't know whether to trust. But then again, we are always talking about these specs, so, simple math:

157 watts + 50% = 157 + 79 = 236 W. Isn't that funny?
 
The thing is that, from the looks of the specs, RV770 will have 2x the power of RV670 and GT200 will have 2x that of G92. That would make the GTX 280 40-50% faster than the HD 4870, and the GTX 260 could end up being 25% faster.
It's early to say anything, but most probably each card will have its market segment and ALL of them will have similar price/performance, as has almost always been the case. The GTX cards also won't lag far behind in performance-per-watt, always based on these specs, which I don't know whether to trust. But then again, we are always talking about these specs, so, simple math:

160 watts + 50% = 160 + 80 = 240 W

If you're correct, I would buy an NVIDIA card, but only once the price dropped to a more realistic value.

NVIDIA's GTX 280 could be 10 times faster than ATI's 4870 X2 and I still wouldn't buy it for this amount of money: NO WAY!!!!
 
If you're correct, I would buy an NVIDIA card, but only once the price dropped to a more realistic value.

NVIDIA's GTX 280 could be 10 times faster than ATI's 4870 X2 and I still wouldn't buy it for this amount of money: NO WAY!!!!

I will never buy any card above $300 either, but the VALUE of the cards is undeniable.

Also, I've been looking around and other sources say $400 and $500 for the GTX 260 and 280 respectively, and NVIDIA may still have an ace up its sleeve called the GTX 260 448 MB, which would pwn just like the GTS 320 did in the past. So who knows...
 
If you're correct, I would buy an NVIDIA card, but only once the price dropped to a more realistic value.

NVIDIA's GTX 280 could be 10 times faster than ATI's 4870 X2 and I still wouldn't buy it for this amount of money: NO WAY!!!!

That's the full gray zone. On paper the 280 & 260 will be 25-50% faster than the 4870. That alone warrants the higher prices to a point. Then you drop the 4870x2 on them & it creates a simple paradox. On paper the 4870x2 is twice as fast as the 4870. Plus at $500 the winner is clearer than glass.
 
That's the full gray zone. On paper the 280 & 260 will be 25-50% faster than the 4870. That alone warrants the higher prices to a point. Then you drop the 4870x2 on them & it creates a simple paradox. On paper the 4870x2 is twice as fast as the 4870. Plus at $500 the winner is clearer than glass.

Dunno how you figure that, but it doesn't matter!

To me, there are only 2 important aspects for any card: power usage and price (in that order).
 
That's the full gray zone. On paper the 280 & 260 will be 25-50% faster than the 4870. That alone warrants the higher prices to a point. Then you drop the 4870x2 on them & it creates a simple paradox. On paper the 4870x2 is twice as fast as the 4870. Plus at $500 the winner is clearer than glass.

That logic has its flaws:

1- The HD 3870 X2 is not twice as fast as the HD 3870; we can guess the HD 4870 X2 won't be either.

2- The HD 3870 X2 costs more than twice as much as the HD 3870. Will this be different? That would put the HD 4870 X2 well above $600. Probably it will be cheaper, but it won't launch until August, and we don't know what prices will be then...
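As a rough illustration of that second point (all figures below are assumptions, not real prices or scaling numbers): if a dual-GPU card only scales to about 1.7x a single card but costs more than twice as much, its $/fps actually comes out worse.

```python
# Hypothetical numbers, purely to show the scaling-vs-price argument.
single_price, single_fps = 300, 50
x2_scaling = 1.7                   # assumed dual-GPU scaling, not a clean 2.0
x2_price = 2.2 * single_price      # assumed: costs more than two singles

x2_fps = single_fps * x2_scaling
print(f"Single card: {single_price / single_fps:.2f} $/fps")  # 6.00
print(f"X2 card:     {x2_price / x2_fps:.2f} $/fps")          # about 7.76
```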
 
Have you ever read any game developer's blog? They write specific code not only for each brand, but for almost every card architecture. They may then optimize the NVIDIA-specific code better under TWIMTBP, because NVIDIA gives them extensive support. In that respect the code for NVIDIA hardware may be better optimized, but each card has its own code path.

Long before TWIMTBP, many first-tier game developers said the way 3DMark did things WAS NOT how they were going to do things in the future. AFAIK this never changed, and it's the same with 3DMark 06 too. Saying that the ATI architecture is any better based on these kinds of benchmarks implies that those benchmarks are doing things right, which they aren't.

This is something that has always bugged me. People say that game developers are writing code "NVIDIA's way", but they never stop to think that MAYBE NVIDIA is designing its hardware the "game developers' way", while ATI may not be. TWIMTBP is a two-way relationship. Over time ATI has become better (comparatively) in benchmarks and worse in games. Everybody blames TWIMTBP for this, without taking into account each company's design decisions. For a simple example, ATI's superscalar, VLIW and SIMD shader processors are A LOT better suited to the HOMOGENEOUS CODE involved in a static benchmark than to the ever-changing code involved in a game. Also, in the case of R600 and one of its biggest flaws, its TMUs, benchmarks are a lot more favorable than games, since you can "guess" which texture comes next and you don't have to care about textures that have already been used. In a game you don't know where the camera will head next, so you don't know which textures you can discard. In reality neither architecture can "guess" the next texture, but G80/G92, with its greater texturing power, can react better to texture changes. In benchmarks R600/RV670 can mitigate this effect with streaming.



I agree to an extent.


Although I think there is a major difference in GPU architecture that has been holding ATI back across the board, the few games where ATI has worked closely with the developers show a fairly level playing field, or even better ATI performance at release.

Sadly, the only two games I can think of where I know for certain that ATI worked closely with the developers are Call of Juarez, where we see ATI cards tending to outperform NVIDIA's, and F.E.A.R., where we see ATI cards continuing to keep pace with NVIDIA's.


I'm sure a certain amount of collaboration does tend to help NVIDIA overall, but yes, GPU architecture does come into play as well; ATI's GPUs just haven't been suited to the more complex games we've seen over the last two years or so.

But all in all, the new ATI R700 series is a brand-new design, not a re-hash of an older GPU the way R600 was of R500. I think we might be in for a surprise with the new ATI GPUs. Probably just wishful thinking on my part, though :ohwell:





Anyhow, someone correct me if I'm wrong, but I thought I remembered hearing that NVIDIA's new G200 is another re-hash of G92/G80? :confused:
 
LOL, you know what, for that price you could buy 2 of ATI's single-GPU offerings and beat out the NV counterpart ^^

Or just go X2 and win that way, and the X2 will no doubt be far cheaper than NV's GX2 alternative, so again, price-wise and performance-wise it's win-win for ATI at the moment.

Think about it: NV may have one killer card, but if you can buy almost 2 of ATI's own killer cards (it doesn't matter if each is less powerful than NV's offering) for around or just over the price of one NV card, do the math; ATI and the customer win every time.
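A back-of-the-envelope version of that math, with assumed prices, relative performance, and CrossFire scaling (none of these figures are confirmed):

```python
# Assumed launch prices and relative performance; purely illustrative.
gtx280_price, gtx280_perf = 600, 1.00        # baseline single card
hd4870_price, hd4870_perf = 300, 0.75        # assumed ~75% of the baseline
crossfire_scaling = 1.7                      # assumed dual-card scaling

pair_price = 2 * hd4870_price
pair_perf = hd4870_perf * crossfire_scaling  # roughly 1.28x the baseline

print(f"GTX 280:    {gtx280_price / gtx280_perf:.0f} $ per unit of performance")
print(f"2x HD 4870: {pair_price / pair_perf:.0f} $ per unit of performance")
```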
 
Save your money for the DX11 cards:)
 