Monday, December 14th 2009

NVIDIA Pitches GeForce GTX 300 Series to Clinch Performance Crown

NVIDIA's latest DirectX 11 compliant GPU architecture, codenamed "Fermi," is getting its first consumer (desktop) graphics implementation in the form of the GeForce GTX 300 series. The nomenclature has gone from presumed to confirmed, with a set of company slides leaked to the media carrying GeForce GTX 300 series names for the two products expected to come out first: the GeForce GTX 380 and GeForce GTX 360. The three slides in the public domain as of now cover three specific game benchmarks from the company's internal tests, in which the two graphics cards are pitted against AMD's Radeon HD 5870 and Radeon HD 5970.

Tests include Resident Evil 5 (HQ settings, 1920x1200, 8x AA, DX10), STALKER Clear Sky (Extreme quality, No AA, 1920x1200, DX10), and Far Cry 2 (Ultra High Quality, 1920x1200, 8x AA, DX10). The GeForce GTX 295 and GTX 285 are included for reference, showing how NVIDIA is positioning the two new cards against the Radeon HD 5000 GPUs, whose figures are already out. In all three tests the GTX 380 emerged on top, with the GTX 360 performing close to the HD 5970. A point to note, however, is that the tests were run at 1920x1200, and tests have shown that the higher-end HD 5000 series GPUs, particularly the HD 5970, are geared toward resolutions higher than 1920x1200. AA was also disabled in STALKER Clear Sky. NVIDIA's GeForce GTX 300 will be out in Q1 2010.

Update (12/15): NVIDIA's Director of Public Relations EMEAI told us that these slides are fake, but added that "when it's ready it's going to be awesome".

Source: Guru3D
Add your own comment

189 Comments on NVIDIA Pitches GeForce GTX 300 Series to Clinch Performance Crown

#1
Mussels
Moderprator
aren't those three games nvidia "The Way It's Meant To Be Played" titles?
Posted on Reply
#2
crow1001
by: t77snapshot
hey show some respect to our news Editors ok.:shadedshu
Get a clue, they are fake. The editor needs to acknowledge this in the first post or lock it here. The original thread where these fake results came from has been locked until there is proof to support them; I suggest this goes the same way, but seeing as they ARE fake, that will never happen.
Posted on Reply
#3
Amok
by: laszlo
i zoomed it to max... nothing... take the posted picture, not the uploaded one :)
Yeah, the jpg version had bad compression, but as you can see, i did another version, would you believe what i did if some guy posted it on the web or in the news section of some big site?
Posted on Reply
#4
shevanel
just another ploy to keep people talking and that is all good.

meanwhile, I'm about to play some Dirt 2 w/ DX11.. anyone down?
Posted on Reply
#5
WarEagleAU
Bird of Prey
I really do not expect Fermi to be better than Cypress to the extent shown in the graphs. I think Fermi will be a great nvidia product, but AMD is in the lead with their ATI cards now and will continue to be when they drop their prices the day Fermi launches.
Posted on Reply
#6
laszlo
by: Amok
Yeah, the jpg version had bad compression, but as you can see, i did another version, would you believe what i did if some guy posted it on the web or in the news section of some big site?
see what i mean: http://i49.tinypic.com/23mk11d.jpg
Posted on Reply
#7
Imsochobo
by: Flyordie
You guys do realize ATI isn't just gonna stand around and watch this unfold...

There will be an HD5980 or equivalent. ;-p

Probably a 512bit GDDR5 HD5890... As we all know the HD58xx series is very bandwidth starved... giving it that extra bandwidth should pull it ahead of almost everything Nvidia can throw out...
They aren't; I gained less from a memory clock bump from 1000 MHz to 1400 MHz.
A core clock from 850 to 950 = way more fps gain, meaning they aren't memory bottlenecked as MANY believe.

G300 will have a 384 bit memory bus running GDDR5 memory modules.
That will have an estimated memory bandwidth of 185-200 gb/sec.

It's funny how a 128-bit memory bus difference on the 4870 could mean a mere 5-8%, eh? They aren't very memory bottlenecked. Stop arguing about the damn bus; ATI knows better than you, and I believe they've done EXTENSIVE testing to find out what gives the most bang for the buck.
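As a sanity check on the 185-200 GB/sec figure quoted above, peak bandwidth is just bus width (in bytes) times the effective data rate. The sketch below assumes typical 2009-era GDDR5 effective rates of roughly 3.8-4.2 GT/s; the exact rate for the rumored part is not known.

```python
def gddr5_bandwidth_gbps(bus_width_bits: int, effective_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * (transfers per second)."""
    return (bus_width_bits / 8) * effective_rate_gtps

# Assumed GDDR5 effective data rates, typical for late 2009:
for rate in (3.8, 4.0, 4.2):
    print(f"{rate} GT/s on a 384-bit bus -> {gddr5_bandwidth_gbps(384, rate):.0f} GB/s")
```

At those assumed rates a 384-bit bus lands at roughly 182-202 GB/s, consistent with the 185-200 GB/sec estimate.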
Posted on Reply
#8
shevanel
yeah 512mb of 384bit memory probably.

I'll probably buy one when they release. I'd like to be able to kick paper in Batman AA while simultaneously having DX11 features for my other games.
Posted on Reply
#9
Amok
Look at what i found... neat huh? :laugh:

Posted on Reply
#10
KainXS
Never trust any internal or pre-release reviews; most of the time they will be fake, and from the looks of it this one could have been done by a little kid. I could make a review saying the HD8870 will be faster than the GTX600. Would anyone like to see that? Of course not, because it would be fake, just like this.

I can't even believe the mods posted this crap, lol. You don't even know what the source is.
Posted on Reply
#11
Mussels
Moderprator
by: KainXS
Never trust any internal or pre-release reviews; most of the time they will be fake, and from the looks of it this one could have been done by a little kid. I could make a review saying the HD8870 will be faster than the GTX600. Would anyone like to see that? Of course not, because it would be fake, just like this.

I can't even believe the mods posted this crap,
newsposters and mods arent the same people


besides, if you'd actually clicked the link for source...


Posted on Reply
#14
Mussels
Moderprator
by: crow1001
Well, do you not think it right that techpowerup should display the same disclaimer in its post?
It's up to the discretion of the news poster. They DO provide links to the source, under the assumption that anyone interested in the source of the post would click it for further information.
Posted on Reply
#15
TooFast
by the time this thing comes out, the 5890 and 5990 will be ready.
Posted on Reply
#16
EastCoasthandle
Was this ever confirmed by nvidia? From what I was told, this is fake. It started in a Tom's Hardware thread. As you can see, it looks like an official TM benchmark result, but look at the user name (Successful_Troll). Then it appeared the next day at the OCUK forum, made to look like an official nvidia slide. In this thread, the poster links back to the TM forum, grinning about the thread created there.

This is why I ask if this has been officially confirmed by nvidia via press release on their homepage (for example). Because as it stands now it doesn't look real based on how this came about.
Posted on Reply
#17
Mussels
Moderprator
by: EastCoasthandle
Was this ever confirmed by nvidia? From what I was told, this is fake. It started in a Tom's Hardware thread. As you can see, it looks like an official TM benchmark result, but look at the user name (Successful_Troll). Then it was photoshopped again at the OCUK forum to look like an official nvidia slide. In this thread, the poster links back to the TM forum, grinning about the thread created there.

This is why I ask if this has been officially confirmed by nvidia via press release on their homepage (for example). Because as it stands now it doesn't look real based on how this came about.
it looks to be completely fake.
Posted on Reply
#18
crow1001
I think you will agree when I say techpowerup is a very popular site, so is it right to have something really dodgy pasted on the front page for all to view, with no disclaimer unless you go to the source? I think not.
Posted on Reply
#19
Amok
Don't you understand, both posters on THG and overclockers.co.uk are actually Jen-Hsun Huang in disguise... trying to ruin AMD's Christmas sales... :laugh:
Posted on Reply
#20
KainXS
A monkey came back from the future and posted this on those forums. That's the source, lol.
Posted on Reply
#21
Mussels
Moderprator
by: crow1001
I think you will agree when I say techpowerup is a very popular site, so is it right to have something really dodgy pasted on the front page for all to view, with no disclaimer unless you go to the source? I think not.
indeed.

The case has been made, so hopefully BTA will read these points and edit his post.
Posted on Reply
#23
Mistral
Amok, the graph you posted is out of date. Here's the latest, straight from nV's official internal testing. You can see why it hasn't been made public before.



As you can see, it includes data from the upcoming mainstream offering of the third major superpower in the graphics market.
Posted on Reply
#25
Amok
Ahhh, my sources kept me in the dark on this one... But still, Larrabee is the king to be...

One GPU to rule them all, One GPU to find them,
One GPU to bring them all and in the darkness bind them all
Posted on Reply