Wednesday, September 30th 2009

NVIDIA GT300 "Fermi" Detailed

NVIDIA's upcoming flagship graphics processor goes by several codenames. While some call it the GF100 and others the GT300 (based on the present nomenclature), what is certain is that NVIDIA has given the architecture the internal name "Fermi", after the Italian physicist Enrico Fermi, who built the first nuclear reactor. It comes as no surprise, then, that according to some sources the board itself carries the codename "reactor".

Based on information gathered so far about GT300/Fermi, here's what's packed into it:
  • Transistor count of over 3 billion
  • Built on the 40 nm TSMC process
  • 512 shader processors (which NVIDIA may refer to as "CUDA cores")
  • 32 cores per core cluster
  • 384-bit GDDR5 memory interface
  • 1 MB of L1 cache, 768 KB of unified L2 cache
  • Up to 6 GB of total memory; 1.5 GB can be expected for the consumer graphics variant
  • Half-speed IEEE 754 double-precision floating point
  • Native support for execution of C (CUDA), C++, and Fortran; support for DirectCompute 11, DirectX 11, OpenGL 3.1, and OpenCL


Update: Here's an image added from the ongoing public webcast of the GPU Technology Conference, of a graphics card based on the Fermi architecture.

Source: Bright Side of News

205 Comments on NVIDIA GT300 "Fermi" Detailed

#2
Bjorn_Of_Iceland
El Fiendo said:
And I bet my basement would sound like a bunch of harpies getting gang banged by a roving group of banshees with 6 GT300s added to my setups.
You may also want to charge people for a sauna bath in your basement as well.
#3
DaedalusHelios
Steevo said:
100 draw distance and all high settings? You are delusional, mistaken, or full of shit.

My 1Gb video card can't run it, it isn't the processor power required, it is the vmem, plain and simple.
You are using a 4850 right?

A lightly OC'ed gtx 280 beats it even if the 4850 has a 1ghz core OC.
#5
El Fiendo
Well that's lackluster. I mean after the weird new design concept of the 5870, I half expected to see something with a built in rainbow gun that can actually fire rainbows or something as retaliation. This one looks like they said 'We love the G80 so much, we're doing it again!' Thankfully I don't buy my GFX cards based on looks.
#6
Fitseries3
Eleet Hardware Junkie
watch the vid and you can get a better look.

its actually a bit different than you would think.

i like it myself.

these cards will have some serious balls to them from what i've heard in the video so far.

the card is similar to gtx2XX card but slightly smaller
#8
El Fiendo
^^^

CUDA cores = new term for shader cores for anyone who didn't catch that right away.
#9
newtekie1
Semi-Retired Folder
El Fiendo said:
^^^

CUDA cores = shader cores for anyone who didn't catch that right away.
*Waits for people to bitch about nVidia renaming them...and equating it to them renaming video cards...:laugh:
#10
Fitseries3
Eleet Hardware Junkie
if anyone argues that these gpus wont be all balls they will be teabagged once the numbers show up.
#11
btarunr
Editor & Senior Moderator
newtekie1 said:
*Waits for people to bitch about nVidia renaming them...and equating it to them renaming video cards...:laugh:
They can call them "little girls with crayons" if they want to. It becomes all the more humiliating when 512 of them beat 1600 "super dooper pooper troopers".
#12
El Fiendo
Bta, you're now in charge of naming all hardware tech. If you don't like the name that comes up in the news, replace it. I would love to read stories along the lines of what you posted.

Also, that HP netbook doing the HD streaming was pretty sweet (for people watching the webcast).
#13
Fitseries3
Eleet Hardware Junkie
haha.... ludwig fuchs

great name.

that ferrari is pretty nice
#14
El Fiendo
Wow. Ludwig brought out some awesome tech. See those lighting effects?
#16
Benetanegia
Fitseries3 said:
watch the vid and you can get a better look.

its actually a bit different than you would think.

i like it myself.

these cards will have some serious balls to them from what i've heard in the video so far.

the card is similar to gtx2XX card but slightly smaller
It has some serious balls definitely, and some brains too. :rockout:

And I love the look, and the fact that it's shorter than the GTX2xx cards. Especially the latter.

El Fiendo said:
^^^

CUDA cores = new term for shader cores for anyone who didn't catch that right away.
They had started to call them just cores in the last few months. Anyway, CUDA is (and has always been) the name of the architecture itself. Like x86.

What I want to know is if they have shown or will show performance numbers. I know they are not going to be real (like HD5870 being 90% faster than GTX295 lol), but if they say 200% faster you know they have something. :laugh:
#17
SteelSix
hat said:
L1 and L2 cache for graphics cards? I've never seen that before...
Indeed. Can't wait to see what it can do. Damn I wish this thing was only 30 days out. :banghead:
#18
PVTCaboose1337
Graphical Hacker
This sounds like it will destroy the ATI 5xxx series. L2 cache for a graphics card = awesome.
#19
PP Mguire
Steevo said:
100 draw distance and all high settings? You are delusional, mistaken, or full of shit.

My 1Gb video card can't run it, it isn't the processor power required, it is the vmem, plain and simple.
GTX280 > 4850. Nuff said. My single GTX280 overclocked ran circles around my crossfire 4870 setup.

Btw 280 = 1gb. :rolleyes:

Also there is a patch out there that will allow you to run a higher resolution without having mass amounts of vmem. Whether it lags or not.
#20
Easy Rhino
Linux Advocate
looks like i was right about nvidia's new line of cards. now i just have to wait for them to release a midrange series...
#21
Benetanegia
Easy Rhino said:
looks like i was right about nvidia's new line of cards. now i just have to wait for them to release a midrange series...
I've not been watching the webcast since the beginning, but according to Fudzilla, who are there and posting at the same time, Jensen has said that they will do a top-to-bottom release, including a dual GPU card.

http://www.fudzilla.com/content/view/15758/1/

PS: I've been watching the last part, and the augmented reality on the Tegra has really, really impressed me.
#22
imperialreign
This new DX11 war is looming to be rather interesting . . .

It defi appears that nVidia might have an ace up their sleeve against the HD5000 series . . . but, all things considered, ATI have yet to throw out any potential date for the release of the 70x2 - leading me to believe they're holding it in the reins until the 300 is out, then slap nVidia with their dual-GPU setup, further driving nVidia's price down . . .

I'm getting the feeling we're going to see a repeat of the HD4000/GT200 release shenanigans - either way, I guess we'll have to see how it goes.
#23
Benetanegia
imperialreign said:
leading me to believe they're holding it in the reins until the 300 is out, then slap nVidia with their dual-GPU setup, further driving nVidia's price down . . .
Read my post above yours. Nvidia will release Fermi from top to bottom which includes the dual GPU card.
#24
lism
imperialreign said:
This new DX11 war is looming to be rather interesting . . .

It defi appears that nVidia might have an ace up their sleeve against the HD5000 series . . . but, all things considered, ATI have yet to throw out any potential date for the release of the 70x2 - leading me to believe they're holding it in the reins until the 300 is out, then slap nVidia with their dual-GPU setup, further driving nVidia's price down . . .

I'm getting the feeling we're going to see a repeat of the HD4000/GT200 release shenanigans - either way, I guess we'll have to see how it goes.
The numbers look good, but it's still a paper launch, and if it really takes that much effort to bake these wafers without errors, it's going to be a hell of a period in this new war between ATI and Nvidia.

I'd prefer ATI, but I have my own reasons for that. Also a shame that even an i7 can't keep up with a HD5870 in Crossfire :confused: I think the ball is in either Intel's or AMD's court to produce a much stronger number-crunching CPU.
#25
imperialreign
Benetanegia said:
Read my post above yours. Nvidia will release Fermi from top to bottom which includes the dual GPU card.
all well and good . . . except I don't really consider Fud to be a fully reliable source . . . as well, it seems rather odd for nVidia (or ATI for that matter) to release their cards in that order.

Also, there's been no confirmation, nor even a rumor, from nVidia regarding a dual-GPU setup . . . actually, most rumors have been a little cautious in that they don't really expect nVidia to have a dual-GPU offering for this series . . . again, though, it's all speculation - nVidia have really yet to offer up much detail straight from their mouth.

Besides, it'd be extremely shtoopid of nVidia to release a dual-GPU card before releasing anything to stack up against the 5870, especially knowing that ATI still have their dual-GPU monstrosities waiting in the wings . . . it's the same tactic that ATI is currently using by not releasing the x2 ATM.