Monday, December 14th 2009

NVIDIA Pitches GeForce GTX 300 Series to Clinch Performance Crown

NVIDIA's latest DirectX 11 compliant GPU architecture, codenamed "Fermi," is getting its first consumer (desktop) graphics implementation in the form of the GeForce GTX 300 series. The nomenclature has gone from assumed to confirmed, with a set of company slides leaked to the media carrying GeForce GTX 300 series names for the two products expected to come out first: the GeForce GTX 380 and GeForce GTX 360. The three slides in the public domain as of now cover three specific game benchmarks from the company's internal tests, in which the two graphics cards are pitted against AMD's Radeon HD 5870 and Radeon HD 5970.

Tests include Resident Evil 5 (HQ settings, 1920x1200, 8x AA, DX10), STALKER Clear Sky (Extreme quality, No AA, 1920x1200, DX10), and Far Cry 2 (Ultra High Quality, 1920x1200, 8x AA, DX10). The GeForce GTX 295 and GTX 285 are included for reference, to show how NVIDIA is positioning the two new cards against the Radeon HD 5000 series GPUs, whose figures are already out. In all three tests the GTX 380 emerged on top, with the GTX 360 performing close to the HD 5970. A point to note, however, is that the tests were run at 1920x1200, and tests have shown that the higher-end HD 5000 series GPUs, particularly the HD 5970, are made for resolutions higher than 1920x1200. AA was also disabled in STALKER Clear Sky. NVIDIA's GeForce GTX 300 series will be out in Q1 2010.

Update (12/15): NVIDIA's Director of Public Relations EMEAI told us that these slides are fake, but also that "when it's ready it's going to be awesome".
Source: Guru3D

189 Comments on NVIDIA Pitches GeForce GTX 300 Series to Clinch Performance Crown

#126
Binge
Overclocking Surrealism
MK4512: If it costs anything like a GTX 295, you might need to take out a mortgage :roll:
I don't get it :wtf: Do you live in a cardboard box on a section of a golf course or something?
Posted on Reply
#127
MK4512
Binge: I don't get it :wtf: Do you live in a cardboard box on a section of a golf course or something?
It's called "humble living"! And I'll thank you not to call it a box! It is the highest quality paper pulp shipping container this side of the dump!
Posted on Reply
#130
punani
But... can it run crysis?

:D
Posted on Reply
#131
RoutedScripter
I won't start baiting here, but by the looks of the state NVIDIA is in now, with its faked Fermi presentation, there is very little time in which they can develop a GPU that's 10% faster than the 5970. Not to mention the screenshot supposedly shows a single-GPU GTX 380 beating a dual-GPU card, and by a whole 10% at that; that's pretty much too good to be true.

I would have to agree that the credibility of those posted "benchmark" screenshots is very slim.

Looks like NVIDIA spent too much money on adverts and bribing...
Posted on Reply
#132
Tatty_Two
Gone Fishing
RuskiSnajper: I won't start baiting here, but by the looks of the state NVIDIA is in now, with its faked Fermi presentation, there is very little time in which they can develop a GPU that's 10% faster than the 5970. Not to mention the screenshot supposedly shows a single-GPU GTX 380 beating a dual-GPU card, and by a whole 10% at that; that's pretty much too good to be true.

I would have to agree that the credibility of those posted "benchmark" screenshots is very slim.
I agree that the truth in this thread is very slim, but truly, expect GT300 to be faster than its competition. I have been observing and contributing to these damn wars since..... well, longer than I care to remember, and more often than not the green side tends to come out on top, however at a price..... a price that some are willing to pay. Whichever way you care to look at it though, it's gotta be good for the consumer; if the green side weren't competitive at least, we would not see price decreases from anyone.
Posted on Reply
#133
Lionheart
SNIFF SNIFF!!! I smell bullshit:laugh::laugh::laugh:
Posted on Reply
#134
imperialreign
Tatty_One: I agree that the truth in this thread is very slim, but truly, expect GT300 to be faster than its competition. I have been observing and contributing to these damn wars since..... well, longer than I care to remember, and more often than not the green side tends to come out on top, however at a price..... a price that some are willing to pay. Whichever way you care to look at it though, it's gotta be good for the consumer; if the green side weren't competitive at least, we would not see price decreases from anyone.
I agree . . . except that back around the time of the 1900/7900 series, ATI were leading the performance market, and for a few series prior as well. ATi really fell off with the 2000 series, though . . .

Typically happens, though, one leads for a few series, then the other overtakes them for a few series . . .


Now, if GTX380 is a dual-GPU solution (like nVidia have been claiming it will be), then I could see it topping out over the 5970 . . . but, then again, if it scales that poorly from a GTX360, that doesn't bode well for the 300 series as a whole . . . at least compared to the overall gains of 2 GPU setups from the 5000 series.

I guess we'll just have to see.
Posted on Reply
#135
phanbuey
I never thought nv claimed the gtx 380 to be dual gpu :confused:
Posted on Reply
#136
imperialreign
phanbuey: I never thought nv claimed the gtx 380 to be dual gpu :confused:
Well, they haven't yet (as far as I know) . . .

but they've been throwing out rumors of releasing a dual-GPU board at the same time as the single-GPU board's initial release.

If that is the case, then I'd have to fathom the 380 as being a dual-GPU board.

But . . . this is simply (un)founded speculation . . . until it's on the shelves, we can't know for sure what nVidia is up to. Either way, they're lagging seriously behind with their new series.
Posted on Reply
#137
RoutedScripter
That's true, and it's usually the green camp that's always a little faster, but that doesn't mean anything; people need to realize those extra few fps aren't anything, they come down to better game support for NVIDIA, optimizations, benchmarks... I mean, I hardly care about fps.


I don't even rely on artificial benchmarks like Futuremark's products... I run my own games at my own settings on my machine, and I probably wouldn't see bad results on either side. The reason NVIDIA has the top fps is that they just compete on fps. Look, for example, at ATI cards having their own sound device, and they are maybe the better option when it comes to TV and multimedia because of AVIVO (I never actually used it myself). Most importantly, over the years the red side hasn't caused too many crashes or failures and has had better drivers; at least that's what my friends said, who have owned many GPUs from both sides going back as far as 1998 (though I agree the Catalyst drivers from 9.1 through 9.11, all those in the middle including 9.1, were really crappy).

Not to mention image quality has been praised in the red camp. I agree on this one because I can see it myself; ATI's shadows really stand out and you can clearly see the difference. That's something I'd gladly trade a few fps for.

Now that I realize the source is Guru3D, I can safely say the credibility of those pics went from little to zero.
imperialreign: Well, they haven't yet (as far as I know) . . .

but they've been throwing out rumors of releasing a dual-GPU board at the same time as the single-GPU board's initial release.

If that is the case, then I'd have to fathom the 380 as being a dual-GPU board.

But . . . this is simply (un)founded speculation . . . until it's on the shelves, we can't know for sure what nVidia is up to. Either way, they're lagging seriously behind with their new series.
Indeed, but if those pics hold any truth at all, the GTX 380 has to be dual-GPU. That won't tie up with the 360, though, because then the 360 would have to be dual-GPU too for the numbers to fit.

On the other hand, what if these new GPUs really do have a truly new design, some new super-optimizing code (CUDA, for example) that's the key to such fast games... Again, I don't see speed as the main decider anymore, because both camps have GPUs at that price point that can run almost every game fast enough, Crysis being an exception (Crysis is more a showcase of the engine than a game, but it actually does have some of the spirit I like in a game).
Posted on Reply
#138
imperialreign
RuskiSnajper: That's true, and it's usually the green camp that's always a little faster, but that doesn't mean anything; people need to realize those extra few fps aren't anything, they come down to better game support for NVIDIA, optimizations, benchmarks... I mean, I hardly care about fps.
Well, IMHO, the last few series have been running so close together that I really don't think the differences boil down to hardware anymore . . . but rather developer optimization . . . and we all know the majority of games are better optimized for nVidia's hardware, thanks in large part to their TWIMTBP program. I'd love for ATI to step it up a notch and start pushing their ATI Game! program a bit more, but alas . . . fundage is rather tight . . .

Even still . . . when your card ousts the competition by an average of 5 FPS, you can claim the performance title . . . and with that comes all the raging hard-ons for owning a card that's "performance king," no matter what the cost. That's a big reason why nVidia have been able to push such insane pricing for their hardware the last 3-5 years. Personally, I can afford it, but I won't buy nVidia products (for numerous reasons) . . . the average user can't, but if they want "the best of the best of the best," they're willing to fork out the dough.

My biggest wish, for the gaming/hardware market as a whole, would be for ATI to finally get back to a financially sound position and start pushing their ATI Game! program a lot more (which they've rather neglected the last few years). There needs to be competition in the gaming market against TWIMTBP, and ATI just can't afford to do so, ATM.
RuskiSnajper: Indeed, but if those pics hold any truth at all, the GTX 380 has to be dual-GPU. That won't tie up with the 360, though, because then the 360 would have to be dual-GPU too for the numbers to fit.

On the other hand, what if these new GPUs really do have a truly new design, some new super-optimizing code (CUDA, for example) that's the key to such fast games... Again, I don't see speed as the main decider anymore, because both camps have GPUs at that price point that can run almost every game fast enough, Crysis being an exception (Crysis is more a showcase of the engine than a game, but it actually does have some of the spirit I like in a game).
I can't really fathom nVidia doing anything "truly new," they've been milking the same designs for the last few years . . . enough so that both ATI and Intel have made numerous comments on their architecture.

I've been starting to wonder if we're at a point where nVidia are simply "tapped out," and can't take that architecture any further . . . forcing them to go back to R&D . . . if that's the case, then who knows what the results would be? They could be faster, or slower . . . and it would take much longer to get the product to market than was originally thought (although, Fermi is starting to fit this bill quite nicely).

ATI can tell you first hand, though, that redesigning a GPU from scratch, or making major changes to existing architecture, will lead you into a lot of pitfalls.
Posted on Reply
#139
Makaveli
lol, the best part of this thread is those slides. Can you guys keep it up? I wanna see a PowerVR chip in there; I'd also love to see the speed of a Trident x4 STI Turbo, "aka AMDNV Killer".
Posted on Reply
#140
eidairaman1
The Exiled Airman
DrPepper: Performance matters to us more than idle power consumption because our GPUs aren't usually idle.
That's if you don't have a job, or you work from home.

Myself, I go from A to B and back daily, which leaves the machine idle about 12-14 hours a day.
Posted on Reply
#141
TheMailMan78
Big Member
eidairaman1: That's if you don't have a job, or you work from home.

Myself, I go from A to B and back daily, which leaves the machine idle about 12-14 hours a day.
S3 sleep or just turn the damn thing off. Fixed.
Posted on Reply
#142
eidairaman1
The Exiled Airman
Even still, suppose you're not doing any graphically demanding tasks while away, just things such as defrag, torrents, etc.
Posted on Reply
#143
TheMailMan78
Big Member
eidairaman1: Even still, suppose you're not doing any graphically demanding tasks while away, just things such as defrag, torrents, etc.
Not enough consumption to make any real difference in your power bill. It's all gimmicks, man. If you have 50 computers running on one bill, THEN you worry about such things. A dirty AC filter will cost you more money.
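For what it's worth, a rough back-of-the-envelope sketch of that point; the idle-draw difference, idle hours, and electricity price below are assumed figures for illustration, not numbers from this thread:

```python
# Rough, illustrative estimate of what extra idle GPU draw adds to a power bill.
# All inputs are assumptions made for the sake of the example.
idle_draw_watts = 30       # assumed extra idle draw of one card vs. a more frugal one
idle_hours_per_day = 13    # machine left idling while at work, per the post above
price_per_kwh = 0.12       # assumed electricity price in USD per kWh

kwh_per_month = idle_draw_watts / 1000 * idle_hours_per_day * 30
cost_per_month = kwh_per_month * price_per_kwh
print(f"~{kwh_per_month:.1f} kWh/month, roughly ${cost_per_month:.2f}/month")
# -> ~11.7 kWh/month, roughly $1.40/month
```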
Posted on Reply
#144
Mussels
Freshwater Moderator
stasdm: And your brain would not differentiate between 50 and 60 FPS (only the effect of luminescent lamps makes some difference).
On my old CRT I could easily see the difference from 60 Hz to 120 Hz, same with FPS. I was disappointed when I went LCD because of that (but other things made up for it).
TheMailMan78: Maybe yours can't, but mine can.
same
Nailezs: Someone dig up that thread we had going a couple of months ago about how many fps the eye and brain can see.
Somewhere in there was proof that the human eye/brain combo (or whatever you want to call it) could register 200+ fps, using some USAF pilot testing as evidence.
I've posted that so many times I hate posting it. The human eye can spot one odd frame out of 200+; it all depends on how good your brain is (aka how well trained you are; look at how fast CS:S kiddies are vs. a 40-year-old noob).


As to the comment someone made about "60 FPS is the best an LCD can do, who cares":

Did it occur to you that people buy these and play games on them *drumroll* in the future, when games run SLOWER? 10 more FPS now = 5 more FPS then, and that could be a deal breaker.
Posted on Reply
#145
ShadowFold
KainXS: coulda fooled me ^^
CUDA fooled me :laugh:...
Posted on Reply
#146
SummerDays
I guess Nvidia was trying to show that their GTX 295 was faster than a 5870 while being slightly more expensive.
Posted on Reply
#147
AddSub
Impressive stuff. I can't wait to tri-SLI those monster GPUs.

"The way it's meant to be played" ....indeed! :D
Posted on Reply
#148
PP Mguire
RuskiSnajper: That's true, and it's usually the green camp that's always a little faster, but that doesn't mean anything; people need to realize those extra few fps aren't anything, they come down to better game support for NVIDIA, optimizations, benchmarks... I mean, I hardly care about fps.


I don't even rely on artificial benchmarks like Futuremark's products... I run my own games at my own settings on my machine, and I probably wouldn't see bad results on either side. The reason NVIDIA has the top fps is that they just compete on fps. Look, for example, at ATI cards having their own sound device, and they are maybe the better option when it comes to TV and multimedia because of AVIVO (I never actually used it myself). Most importantly, over the years the red side hasn't caused too many crashes or failures and has had better drivers; at least that's what my friends said, who have owned many GPUs from both sides going back as far as 1998 (though I agree the Catalyst drivers from 9.1 through 9.11, all those in the middle including 9.1, were really crappy).

Not to mention image quality has been praised in the red camp. I agree on this one because I can see it myself; ATI's shadows really stand out and you can clearly see the difference. That's something I'd gladly trade a few fps for.

Now that I realize the source is Guru3D, I can safely say the credibility of those pics went from little to zero.
Then why not just get a 9500 GT or a 5670 or something really cheap, try running games on high, and see whether you care about fps?
imperialreign: Well, IMHO, the last few series have been running so close together that I really don't think the differences boil down to hardware anymore . . . but rather developer optimization . . . and we all know the majority of games are better optimized for nVidia's hardware, thanks in large part to their TWIMTBP program. I'd love for ATI to step it up a notch and start pushing their ATI Game! program a bit more, but alas . . . fundage is rather tight . . .

Even still . . . when your card ousts the competition by an average of 5 FPS, you can claim the performance title . . . and with that comes all the raging hard-ons for owning a card that's "performance king," no matter what the cost. That's a big reason why nVidia have been able to push such insane pricing for their hardware the last 3-5 years. Personally, I can afford it, but I won't buy nVidia products (for numerous reasons) . . . the average user can't, but if they want "the best of the best of the best," they're willing to fork out the dough.

My biggest wish, for the gaming/hardware market as a whole, would be for ATI to finally get back to a financially sound position and start pushing their ATI Game! program a lot more (which they've rather neglected the last few years). There needs to be competition in the gaming market against TWIMTBP, and ATI just can't afford to do so, ATM.




I can't really fathom nVidia doing anything "truly new," they've been milking the same designs for the last few years . . . enough so that both ATI and Intel have made numerous comments on their architecture.

I've been starting to wonder if we're at a point where nVidia are simply "tapped out," and can't take that architecture any further . . . forcing them to go back to R&D . . . if that's the case, then who knows what the results would be? They could be faster, or slower . . . and it would take much longer to get the product to market than was originally thought (although, Fermi is starting to fit this bill quite nicely).

ATI can tell you first hand, though, that redesigning a GPU from scratch, or making major changes to existing architecture, will lead you into a lot of pitfalls.
Fermi is new, and it's not a milked 285. The specs alone say that.
Posted on Reply
#149
Imsochobo
TheMailMan78: S3 sleep or just turn the damn thing off. Fixed.
I download 24/7.

Idle power consumption is important; load I couldn't care less about.
Posted on Reply
#150
Mussels
Freshwater Moderator
Imsochobo: I download 24/7.

Idle power consumption is important; load I couldn't care less about.
Then perhaps you should build a dedicated low-power download system like I have.
Posted on Reply