Wednesday, September 30th 2009

NVIDIA GT300 "Fermi" Detailed

NVIDIA's upcoming flagship graphics processor goes by a lot of codenames. While some call it the GF100 and others the GT300 (based on the present nomenclature), what is certain is that NVIDIA has given the architecture the internal name "Fermi", after the Italian physicist Enrico Fermi, builder of the first nuclear reactor. It comes as no surprise, then, that according to some sources the board itself is codenamed "reactor".

Based on information gathered so far about GT300/Fermi, here's what's packed into it:
  • Transistor count of over 3 billion
  • Built on the 40 nm TSMC process
  • 512 shader processors (which NVIDIA may refer to as "CUDA cores")
  • 32 cores per core cluster
  • 384-bit GDDR5 memory interface
  • 1 MB L1 cache memory, 768 KB L2 unified cache memory
  • Up to 6 GB of total memory, 1.5 GB can be expected for the consumer graphics variant
  • Half-speed IEEE 754 double-precision floating point
  • Native support for execution of C (CUDA), C++, Fortran, support for DirectCompute 11, DirectX 11, OpenGL 3.1, and OpenCL
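As a rough sanity check, the listed figures hang together arithmetically. A minimal sketch follows; the 512 MB-per-channel memory figure is an assumption for illustration, not something stated in the article:

```python
# Quick arithmetic check of the rumored Fermi figures listed above.

TOTAL_CORES = 512        # "512 shader processors"
CORES_PER_CLUSTER = 32   # "32 cores per core cluster"
clusters = TOTAL_CORES // CORES_PER_CLUSTER
print(f"core clusters: {clusters}")  # 16

# A 384-bit GDDR5 interface amounts to twelve 32-bit channels; with an
# assumed 512 MB hanging off each channel, capacity works out to 6 GB.
BUS_WIDTH_BITS = 384
channels = BUS_WIDTH_BITS // 32
max_memory_gb = channels * 512 // 1024
print(f"max memory: {max_memory_gb} GB")  # 6

# "Half-speed double precision" means DP throughput is half of single
# precision: 512 cores doing one SP FMA per clock gives 256 DP FMAs.
dp_fma_per_clock = TOTAL_CORES // 2
print(f"DP FMAs per clock: {dp_fma_per_clock}")  # 256
```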
Update: Here's an image added from the ongoing public webcast of the GPU Technology Conference, of a graphics card based on the Fermi architecture.
Source: Bright Side of News

205 Comments on NVIDIA GT300 "Fermi" Detailed

#151
Easy Rhino
Linux Advocate
if i see a dx11 nvidia mid range card by nov i will poop myself
#152
Benetanegia
imperialreign: all and good . . . except I don't really consider Fud to be a fully reliable source . . . as well, it seems rather odd for nVidia (or ATI for that matter) to release their cards in that order.

Also, there's been no confirmation, nor even rumor, from nVidia regarding a dual-GPU setup . . . actually, most rumors have been a little cautious in that they don't really expect nVidia to have a dual-GPU offering for this series . . . again, though, it's all speculation - nVidia have really yet to offer up much detail straight from their mouth.

Besides, it'd be extremely shtoopid of nVidia to release a dual-GPU card before releasing anything to stack up against the 5870, especially knowing that ATI still have their dual-GPU monstrosities waiting in the wings . . . it's the same tactic that ATI is currently using, by not releasing the x2 ATM.
Erm, I'll wait until someone that has been watching the webcast can confirm that Jensen has said that at GTC, but they (FUD) have supposedly posted that from the GTC... I wouldn't call that a rumor. This is not a "We've been told by sources close...", this is "Jensen Huang has confirmed..." Pretty different.
#153
wolf
Performance Enthusiast
I love all the ATi heavy posts in this thread :)

you had your hype, guys; now it's our turn, let the green machine steamroll once more!

We should get some specifics here pretty soon.
#154
Kursah
imperialreign: I'm getting the feeling we're going to see a repeat of the HD4000/GT200 release shenanigans - either way, I guess we'll have to see how it goes.
I kinda hope we do; give us some more competition and drive prices down drastically. Give us cheaper products that aren't much slower and can be OC'd to more than make up for it, give us decent stock cooling and easy adaptation for aftermarket cooling solutions. I really hope the DX11 Gen1 "wars" set up to be ultra competitive; I kinda hope that NV doesn't pull too far ahead performance-wise, and I don't think ATI would let them, so to say...but in the same instant, if NV has too good of an ace up their sleeve in theory, ATI might have to get the 2nd gen rolled out sooner to compete...though ATI will have affordability on their side, I'm sure. Things have definitely gotten more interesting; hard saying which way I'll go when the products are all out there and my 260 isn't cutting it anymore...which isn't yet, thankfully!

:toast:
#155
imperialreign
Benetanegia: Erm, I'll wait until someone that has been watching the webcast can confirm that Jensen has said that at GTC, but they (FUD) have supposedly posted that from the GTC... I wouldn't call that a rumor. This is not a "We've been told by sources close...", this is "Jensen Huang has confirmed..." Pretty different.
Still, word from "an insider" is not the same as an "official" statement . . . unless the statement is "official," it's really just a rumor.

I'm not trying to say your statement is wrong, simply that we've seen those kinds of "news topics" posted at sites such as Fud, Inquirer, MaxPC, Tom's, Nordic, and countless other sites - some more reliable than others. It usually all boils down to the premise of "believe it when we see it" kinda thing, as the tech industry changes so often that nothing is really finalized until it's in the hands of the consumer . . . just look at how often the preliminary specs of the HD5000 series (and HD4000 / GT200, for that matter) changed before they were finally released.
Kursah: I kinda hope we do; give us some more competition and drive prices down drastically. Give us cheaper products that aren't much slower and can be OC'd to more than make up for it, give us decent stock cooling and easy adaptation for aftermarket cooling solutions. I really hope the DX11 Gen1 "wars" set up to be ultra competitive; I kinda hope that NV doesn't pull too far ahead performance-wise, and I don't think ATI would let them, so to say...but in the same instant, if NV has too good of an ace up their sleeve in theory, ATI might have to get the 2nd gen rolled out sooner to compete...though ATI will have affordability on their side, I'm sure. Things have definitely gotten more interesting; hard saying which way I'll go when the products are all out there and my 260 isn't cutting it anymore...which isn't yet, thankfully!

:toast:
Defi looking forward to it too - we haven't seen competition this heavy or heated since the good ol' days! :rockout:
#156
Benetanegia
imperialreign: Still, word from "an insider" is not the same as an "official" statement . . . unless the statement is "official," it's really just a rumor.

I'm not trying to say your statement is wrong, simply that we've seen those kinds of "news topics" posted at sites such as Fud, Inquirer, MaxPC, Tom's, Nordic, and countless other sites - some more reliable than others. It usually all boils down to the premise of "believe it when we see it" kinda thing, as the tech industry changes so often that nothing is really finalized until it's in the hands of the consumer . . . just look at how often the preliminary specs of the HD5000 series (and HD4000 / GT200, for that matter) changed before they were finally released.

Defi looking forward to it too - we haven't seen competition this heavy or heated since the good ol' days! :rockout:
Well, definitely you don't get it, man.

www.nvidia.com/object/gpu_technology_conference.html

Jensen Huang CEO of Nvidia has said that in a conference. If you can find any more "official" source let me know man.

"an insider"????
"official"???? :roll::roll::roll::roll:

Sorry but I have to laugh. I have to laugh sooo hard...
#157
imperialreign
Benetanegia: Well, definitely you don't get it, man.

www.nvidia.com/object/gpu_technology_conference.html

Jensen Huang CEO of Nvidia has said that in a conference. If you can find any more "official" source let me know man.

"an insider"????
"official"???? :roll::roll::roll::roll:

Sorry but I have to laugh. I have to laugh sooo hard...
I love how presumptuous some people get . . . seriously, I do.

Simply because a CEO of a company makes a "statement" does not make that statement official - nor will that be the "be-all, end-all" strategy that the company will follow . . . such has happened numerous times over the last 3 decades.

Besides - the link you posted to originally stated:
While he didn't talk about it during the keynote presentation, this release strategy also includes a high end dual-GPU configuration that should ship around the same time as the high end single-GPU model.
So, then, if the CEO didn't mention it at GTC - where did that info about such a release strategy come from? Or do you have some "inside scoop" you'd like to share, eh?
#158
Benetanegia
imperialreign: I love how presumptuous some people get . . . seriously, I do.

Simply because a CEO of a company makes a "statement" does not make that statement official - nor will that be the "be-all, end-all" strategy that the company will follow . . . such has happened numerous times over the last 3 decades.

Besides - the link you posted to originally stated:

So, then, if the CEO didn't mention it at GTC - where did that info about such a release strategy come from? Or do you have some "inside scoop" you'd like to share, eh?
You can twist that around as much as you want; what that sentence means is that he didn't say anything about the dual-GPU in the keynote. He did talk about the top-to-bottom release.
Additionally, sometimes reporters get the chance to talk to them behind the scenes, so to say, and he told them about the dual card then. Putting false words in a company CEO's mouth is not the best way of doing business when you are a news site and there's still one more day of conference left, so you want to keep him happy.
#159
KainXS
It will be different this time around; Nvidia did not refer to the GeForce sector, they only referred to the Teslas, i.e. cards for businesses.
#160
imperialreign
Benetanegia: You can twist that around as much as you want; what that sentence means is that he didn't say anything about the dual-GPU in the keynote. He did talk about the top-to-bottom release.
Oh . . . so, then, although Fud stated that Jensen "didn't talk about [a top-to-bottom release strategy] during the keynote presentation," and it's not even mentioned in the keynote highlights at nVidia's blog site . . . you can affirm that the company's CEO has stated the release strategy for the GT300?

So, then, you mean to say that the statements that Fud made were wrong, because you know, for sure, otherwise? That's quite different from your original assessment that Fud is a reliable source . . . even though Fud didn't claim that Jensen even mentioned the GT300 release strategy.

I'm not wrapping anything around any-which-way . . . simply citing the text and context as it's written.
Benetanegia: Additionally, sometimes reporters get the chance to talk to them behind the scenes, so to say, and he told them about the dual card then. Putting false words in a company CEO's mouth is not the best way of doing business when you are a news site and there's still one more day of conference left.
Such actions have never stopped sites before - besides, both these tech sites and the companies thrive on the hype. It's no different than browsing through the tabloids at the supermarket . . . sometimes they're right, sometimes they're wrong . . . other times they're so far out in left field that there's no way they'd be right.

Such is how it goes with just about every tech report site out there. Simply because a reporter might have an opportunity to speak with a company representative "behind the scenes" does not make that an official statement . . . nor does it mean that said reporter hasn't added their own "embellishment."
#161
Benetanegia
imperialreign: Oh . . . so, then, although Fud stated that Jensen "didn't talk about [a dual card release at the same time as the rest of the lineup] during the keynote presentation," and it's not even mentioned in the keynote highlights at nVidia's blog site . . . you can affirm that the company's CEO has stated the release strategy for the GT300?

So, then, you mean to say that the statements that Fud made were wrong, because you know, for sure, otherwise? That's quite different from your original assessment that Fud is a reliable source . . . even though Fud didn't claim that Jensen even mentioned the GT300 release strategy.

I'm not wrapping anything around any-which-way . . . simply citing the text and context as it's written.
Such actions have never stopped sites before - besides, both these tech sites and the companies thrive on the hype. It's no different than browsing through the tabloids at the supermarket . . . sometimes they're right, sometimes they're wrong . . . other times they're so far out in left field that there's no way they'd be right.

Such is how it goes with just about every tech report site out there. Simply because a reporter might have an opportunity to speak with a company representative "behind the scenes" does not make that an official statement . . . nor does it mean that said reporter hasn't added their own "embellishment."
Fixed. He did talk about the top-to-bottom release. What I linked is the second of 6 consecutive posts made on FUD during the hours the keynote took place; the first one says:
3 billion transistors


Nvidia's CEO, Jensen Huang just posted the first official picture of Fermi rendering a Bugatti car and the slide confirms that the new GPU has 3 billion transistors. The car looks realistic and it is probably rendered at real time, but at press time we've only seen a picture of it.

The focus is definitely on GPU computing power of this chip. The Bugatti was done with raytracing and 2.5 million rays and it looks seriously realistic to the point that it's astonishing.

Since we're attending the keynote, we will update with more details in real time, so stay tuned.
www.fudzilla.com/content/view/15757/1/

Since you read the keynote highlights, you did read about the car, right?

The second post is the one I posted earlier.

The third: www.fudzilla.com/content/view/15759/1/
The fourth: www.fudzilla.com/content/view/15760/1/
I was already watching the GTC webcast, so I can confirm he did say that.
The fifth: www.fudzilla.com/content/view/15761/1/
I too was watching that.
The sixth is a photo of Huang holding the card: www.fudzilla.com/content/view/15762/1/
I don't know when that has happened, I wasn't watching.

So does the fact that I can confirm 2 of them make a difference for you?
#162
@RaXxaa@
I wonder how many $xxxx you gotta pay for it; seems like it'll burn a $1000-deep hole in your pocket.
#163
PP Mguire
Who the hell cares how they will release or what they will release. The only thing that matters is hard benchmarks AFTER it's released. The rest is fuddy duddy.

From past releases it's safe to say they WILL have a dual-GPU card, because they have since the 7950GX2. Also, people argued over heat that there wouldn't be a dual-GPU GT200, but in fact there was. Arguing over what they will or will not do seems childish when it wouldn't matter either way. You will buy what card you want or can afford and call it a day, no matter if they release all at once or release a dual-GPU card.
#164
wolf
Performance Enthusiast
Anyone in Australia know what Aussie time it will be when the specs are completely, officially revealed?
#165
Nkd
soldier242: if that's all true, then the single-core GTX 380? will even beat the crap out of the 5870 X2 and still won't be tired after doing so ... damn
Hmm, I would say that is a bit of a stretch. So what is the difference? I doubt the games are going to be using CUDA, so if they do, then yea; the main purpose of this graphics card is general-purpose computing. I will reserve my judgement until the card is actually tested. I am sure it will beat my HD 5870, looking at the specs, but who knows, it might not do as well in games as it would in folding.
#166
Hayder_Master
Very nice, this beats the 5870, but when does it release and how much will it cost?
#167
Binge
Overclocking Surrealism
hayder.master: very nice, this beats the 5870, but when does it release and how much will it cost?
all in due time?
#168
DaedalusHelios
Yeah, I don't really trust fudzilla enough to believe it 100%. I still think it's a bit up in the air what release strategy they will have. It's not that I don't believe you, Benetanegia; it's that I don't believe fudzilla is truthful all the time.
#169
Zubasa
PP Mguire: Who the hell cares how they will release or what they will release. The only thing that matters is hard benchmarks AFTER it's released. The rest is fuddy duddy.

From past releases it's safe to say they WILL have a dual-GPU card, because they have since the 7950GX2. Also, people argued over heat that there wouldn't be a dual-GPU GT200, but in fact there was. Arguing over what they will or will not do seems childish when it wouldn't matter either way. You will buy what card you want or can afford and call it a day, no matter if they release all at once or release a dual-GPU card.
There never was, and never will be, a dual GT200 card. :nutkick:
You need to get your facts right before you argue.
The GTX 295 is a dual GT200b card; the 55nm die shrink allowed this to happen.
If they tried to get dual GT200s on a single cooler, you'd get another Asus Mars furnace.
I am sure it would have made you a cup of coffee. :roll:
#171
Animalpak
Uhmm, Tesla is for programming or what? Not sure, but I think it can look like this.
#172
mR Yellow
Looks sexy...even though I'm not a huge nVidia fan.
#173
El Fiendo
Wow. Fits, you were right about the looks on this. Very nice.
#174
amschip
I still wonder whether those 3 billion transistors will really make a difference, taking into account all the added CPU-like functionality. The GT200 was much bigger than the RV770, and yet that difference didn't really scale well.
Just my two cents...:)
#175
mR Yellow
amschip: I still wonder whether those 3 billion transistors will really make a difference, taking into account all the added CPU-like functionality. The GT200 was much bigger than the RV770, and yet that difference didn't really scale well.
Just my two cents...:)
Time will tell.