Sunday, May 17th 2009

NVIDIA GT300 Already Taped Out

NVIDIA's upcoming next-generation graphics processor, codenamed GT300, is on course for launch later this year. Its development has crossed an important milestone, with news emerging that the company has already taped out the chip, with the first engineering samples belonging to the A1 batch. The GPU is significant in that it is the first high-end GPU designed on the 40 nm silicon process. Both NVIDIA and AMD, however, are facing issues with the 40 nm manufacturing node at TSMC, the principal foundry partner for the two. For this reason, the chip might be built by another, as-yet-unnamed foundry partner the two are reaching out to. UMC could be a possibility, as it has recently announced that its 40 nm node is ready for "real, high-performance" designs.

The GT300 comes in three basic forms, perhaps differentiated by bin quality: G300 (chips that make it to consumer graphics, the GeForce series), GT300 (chips that make it to high-performance computing products, the Tesla series), and G200GL (chips that make it to professional/enterprise graphics, the Quadro series). From what we know so far, the core features 512 shader processors, a revamped data processing model in the form of MIMD, and a 512-bit wide GDDR5 memory interface churning out around 256 GB/s of memory bandwidth. The GPU is compliant with DirectX 11, which makes its entry with Microsoft Windows 7 later this year and can already be found in release candidate builds of the OS.
Source: Bright Side of News
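As a sanity check, the quoted ~256 GB/s figure follows from the 512-bit bus width if the GDDR5 runs at an effective 4 Gbps per pin; the per-pin data rate is our assumption, since final memory clocks have not been confirmed:

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * effective per-pin data rate.
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 512-bit GDDR5 at an assumed 4 Gbps effective per pin:
print(memory_bandwidth_gbps(512, 4.0))  # 256.0 GB/s
```

Any change to the final memory clock scales the result linearly, so the 256 GB/s number should be read as a round-figure estimate.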

96 Comments on NVIDIA GT300 Already Taped Out

#51
KainXS
lol, I messed up, I got my info from the wrong source :confused: oh well, it looks like it will be 1200 SPs and 32 ROPs. My mistake; that's what happens when you look at an old source and try to defend it :shadedshu

I was looking for this one
www.hexus.net/content/item.php?item=18240

but I have been hearing that ATI redesigned their dual-GPU PCBs for better bandwidth utilization


so . . . NVIDIA keeps their crown . . . nice, I guess; good for price wars
#52
Blacksniper87
DarrenWould you pay 21% more for a car that runs 5% faster? NO

So why would anyone with a brain do it for a GPU or a CPU?
I'm not saying you should buy it. If you read the post properly you would realise I am saying that NVIDIA is at an unfair disadvantage, due to the fact that there is an overclocked 4890 in there and no overclocked NVIDIA cards. You would also notice that your analogy is severely flawed, because lots of people do buy the car that is 5% faster, and in some cases it costs more than double the slower one. The high-end stuff is what makes most computer companies their profit, or earns back the money spent on R&D, so stop bitching. It would also probably be a good idea for the fanbois above me not to post fanboi crap on an NVIDIA news story, because all it says is that you are worried your precious AMD/ATI will not win the next round as they have this one.

DISCLOSURE: I am not a fanboi; I will buy whatever the best card is in my price range, so please, I really don't need the flaming.

EDIT: Just in case, on goes the fireproof suit ;)
#53
a_ump
KainXSlol, I messed up, I got my info from the wrong source :confused: oh well, it looks like it will be 1200 SPs and 32 ROPs. My mistake; that's what happens when you look at an old source and try to defend it :shadedshu

I was looking for this one
www.hexus.net/content/item.php?item=18240

but I have been hearing that ATI redesigned their dual-GPU PCBs for better bandwidth utilization


so . . . NVIDIA keeps their crown . . . nice, I guess; good for price wars
yep, I'm assuming the better bandwidth utilization is the SidePort that is disabled on the HD 4870 X2 being enabled on the HD 5870 X2: direct communication between the cores, instead of having to go through the onboard CrossFire chip.
#54
Darren
Blacksniper87I'm not saying you should buy it. If you read the post properly you would realise I am saying that NVIDIA is at an unfair disadvantage, due to the fact that there is an overclocked 4890 in there and no overclocked NVIDIA cards. You would also notice that your analogy is severely flawed, because lots of people do buy the car that is 5% faster, and in some cases it costs more than double the slower one. The high-end stuff is what makes most computer companies their profit, or earns back the money spent on R&D, so stop bitching. It would also probably be a good idea for the fanbois above me not to post fanboi crap on an NVIDIA news story, because all it says is that you are worried your precious AMD/ATI will not win the next round as they have this one.

DISCLOSURE: I am not a fanboi; I will buy whatever the best card is in my price range, so please, I really don't need the flaming.

EDIT: Just in case, on goes the fireproof suit ;)
This is where you are going wrong: it's not just an overclocked product; there were modifications to the physical layout of the 4890.

NVIDIA is not at an unfair disadvantage at all. With that logic you could say that ATI is at an unfair advantage in the midrange, due to NVIDIA's GTS 250 series being an overclocked 9800 GTX+. So it's OK for NVIDIA to sell a GTS 250, which is an overclocked 9800 GTX+, but it's not OK for ATI to overclock their 4870 and call it a 4890?

One rule for Nvidia and a different rule for ATI huh?


In fact, in both cases the 4890 and the GTS 250 are more than mere overclocks, and hence it's very fair: the GTS 250 uses a 55 nm core as opposed to 65 nm, and the 4890 has an added ring of about 3 million more transistors than the 4870. Hence it's not JUST an overclock, and therefore there is no disadvantage.
#55
DrPepper
The Doctor is in the house
KainXSlol, I messed up, I got my info from the wrong source :confused: oh well, it looks like it will be 1200 SPs and 32 ROPs. My mistake; that's what happens when you look at an old source and try to defend it :shadedshu

I was looking for this one
www.hexus.net/content/item.php?item=18240

but I have been hearing that ATI redesigned their dual-GPU PCBs for better bandwidth utilization


so . . . NVIDIA keeps their crown . . . nice, I guess; good for price wars
Not a big deal anyway; I expect sources to change all the time. I think 1200 SPs is more than enough. It adds performance without the need to completely rework the design of the GPU, and it keeps costs lower.
#56
ownage
Come on guys. GT300 is taped out when a reliable source says so. Therefore there is only a little chance it's taped out.
#57
Blacksniper87
DarrenThis is where you are going wrong: it's not just an overclocked product; there were modifications to the physical layout of the 4890.

NVIDIA is not at an unfair disadvantage at all. With that logic you could say that ATI is at an unfair advantage in the midrange, due to NVIDIA's GTS 250 series being an overclocked 9800 GTX+. So it's OK for NVIDIA to sell a GTS 250, which is an overclocked 9800 GTX+, but it's not OK for ATI to overclock their 4870 and call it a 4890?

One rule for NVIDIA and a different rule for ATI, huh?


In fact, in both cases the 4890 and the GTS 250 are more than mere overclocks, and hence it's very fair: the GTS 250 uses a 55 nm core as opposed to 65 nm, and the 4890 has an added ring of about 3 million more transistors than the 4870. Hence it's not JUST an overclock, and therefore there is no disadvantage.
What I am saying, if you can read, is that there is an overclocked version in the graph. So bloody read the graph properly and realise you are exhibiting a brilliant example of fanboyism. As I said before, your analogy was crap.

NOTE: I did not at any stage say that the 4890 was either an overclock of the 4870, a rebadge, or anything else; I simply observed the graphs and responded.
#60
Valdez
I hope it's clear that ATI doesn't want to make a big GPU to compete with NVIDIA. This is where ATI's and NVIDIA's strategies differ: NVIDIA wants a single, very powerful GPU, while ATI wants to go multi-core. So they make a small but very efficient GPU, and release 1-, 2- or 4-GPU cards. One RV870 can't compete with the GT300, but it doesn't have to; that's the X2's task.
And we haven't even spoken about the X4 variant (if the rumors of an MCM are true) :D.
#61
newtekie1
Semi-Retired Folder
@Darren:

1.) The overclocked HD 4890 can beat the stock GTX 285 in a few games, but overall the GTX 285 is still the better performer. The original comment, and this thread, has little to do with price or performance per dollar. If you haven't realized that price does not increase at the same rate as performance, and that higher-performing cards come at a price premium, you really have no business in a discussion about performance or price.

2.) You seem to really like to worry about performance per dollar. And while you were trying to sound all high and mighty and insult the other members, you seem to have completely failed to notice that the GTX 275 offers better performance per dollar than the stock HD 4890, and equals the performance per dollar of the overclocked card in the graphs you posted... Oddly enough, it also straight up outperformed the stock card, and matched the overclocked one...
a_umpyea nvidia does have the most powerful single core graphic card but either way it still took them 2 gpu's on a card to get it back
I always find this argument odd...

So your argument is that nVidia needed a card with 2 GPUs to take the performance crown from ATi, and this is somehow a negative for nVidia...

But ATi needed a card with 2 GPUs to take the performance crown from nVidia's single GPU. How is that better? Why is it bad for nVidia to need two GPUs to beat ATi's two GPUs, but not bad for ATi to need 2 GPUs to beat nVidia's 1?
#62
iStink
Steevoi6.techpowerup.com/reviews/Powercolor/HD_4890_PCS/images/perfdollar_1680.gif


i6.techpowerup.com/reviews/Powercolor/HD_4890_PCS/images/perfdollar_1920.gif

I know words are hard, so here are pictures. The ones above are performance per dollar.

Do you understand this?


Below are the performance at STOCK clocks for the models.

i2.techpowerup.com/reviews/Powercolor/HD_4890_PCS/images/perfrel_1680.gif

i3.techpowerup.com/reviews/Powercolor/HD_4890_PCS/images/perfrel_1920.gif


So let me see, spend more money for less performance............ GTX285 FTW!!!!!! ;)
I know not being a raging douchebag is hard, so let me avoid turning this into an ATI vs NVIDIA flame war and remind you that the poster I was replying to never mentioned price, and was focused on performance only.

Actually, my initial response reminded him to always mention price when comparing the two cards, but I felt that if I did, he'd think I was somehow making fun of him, since only a moron would forget such a thing.

Have a nice day, dick.
#63
enaher
newtekie1I always find this argument odd...

So your argument is that nVidia needed a card with 2 GPUs to take the performance crown from ATi, and this is somehow a negative for nVidia...

But ATi needed a card with 2 GPUs to take the performance crown from nVidia's single GPU. How is that better? Why is it bad for nVidia to need two GPUs to beat ATi's two GPUs, but not bad for ATi to need 2 GPUs to beat nVidia's 1?
It's bad because of the monolithic strategy used by NVIDIA: big, expensive, hot and power-hungry. They needed a shrink to make the GTX 295; I think it's bad for their pockets, and that's why we are getting a single-PCB GTX 295, to increase their profit. Let's remember they also went cheap on the cooler for the 55 nm GTX 260, which actually runs a bit hotter; they got rid of the backplate of the card, and they even made a simpler version of the PCB that in theory should OC a little worse...
#64
Steevo
Blacksniper87What I am saying, if you can read, is that there is an overclocked version in the graph. So bloody read the graph properly and realise you are exhibiting a brilliant example of fanboyism. As I said before, your analogy was crap.

NOTE: I did not at any stage say that the 4890 was either an overclock of the 4870, a rebadge, or anything else; I simply observed the graphs and responded.
How is a 1 GHz or 950 MHz product from an AIB partner any different from an AIB partner for NVIDIA releasing an overclocked product?


For the record, 1 GHz is within spec for these chips, and not an "overclocked" part.


What I care about is performance at the end of the day. How many cried when GTA4 came out and they couldn't play it at high settings, when it just required more memory? So in all actuality my lowly 4850 beat a 768 MB card from the green camp. I overclocked my card, spent less, and played more. Same for a 4890, and hell, even the 4770 in my parents' build: $99 that kicks the shit out of a GTS 250.


The difference here is we are all talking performance per dollar, maximum performance, availability, and drivers.


In almost every segment of the market ATI has NVIDIA beat on price and/or performance. For an actual mid-range card the 4770/4850 rapes NVIDIA; the 4890 rapes everything but the 295, but for the money of a 295 you could CrossFire two 4890s and still rape it; hell, a 4870 X2 is 14% more efficient per dollar.
#65
newtekie1
Semi-Retired Folder
enaherIt's bad because of the monolithic strategy used by NVIDIA: big, expensive, hot and power-hungry. They needed a shrink to make the GTX 295; I think it's bad for their pockets, and that's why we are getting a single-PCB GTX 295, to increase their profit. Let's remember they also went cheap on the cooler for the 55 nm GTX 260, which actually runs a bit hotter; they got rid of the backplate of the card, and they even made a simpler version of the PCB that in theory should OC a little worse...
The requirement of a die shrink has nothing to do with it. The argument is that it is somehow acceptable for ATi to use 2 GPUs to best 1 GPU, but somehow not as acceptable for nVidia to use 2 GPUs to best 2 GPUs. I just find that very odd.

And as for the 55nm GT200 cards, they got rid of the backplate because the memory was all moved to the front of the card, so the backplate was not needed to cool the RAM anymore.
#66
Steevo
iStinkI know not being a raging douchebag is hard, so let me avoid turning this into an ATI vs NVIDIA flame war and remind you that the poster I was replying to never mentioned price, and was focused on performance only.

Actually, my initial response reminded him to always mention price when comparing the two cards, but I felt that if I did, he'd think I was somehow making fun of him, since only a moron would forget such a thing.

Have a nice day, dick.
I love you sweetie.


I wasn't aiming this at you, but at the haters; I have NVIDIA cards in use at work. I use what gets me from (figurative) point A to B the cheapest, within reason. If NVIDIA ever releases a product, when I am ready to buy, that does better than the red counterpart, I will move to that. I will admit the biggest fuckup I had was my X1800XT: $599, and I sold it a year or so later for $70.

I don't hate NVIDIA or ATI; I hate the fanbois who believe their company of choice can do no wrong. I was also in the wrong when I bought a Prescott, thinking something was being made better. I was wrong and moved to AMD. I only have one AMD at work; most are C2Ds or other Intels.


Anyway, enough of the name calling.
#67
DrPepper
The Doctor is in the house
SteevoHow is a 1 GHz or 950 MHz product from an AIB partner any different from an AIB partner for NVIDIA releasing an overclocked product?


For the record, 1 GHz is within spec for these chips, and not an "overclocked" part.


What I care about is performance at the end of the day. How many cried when GTA4 came out and they couldn't play it at high settings, when it just required more memory? So in all actuality my lowly 4850 beat a 768 MB card from the green camp. I overclocked my card, spent less, and played more. Same for a 4890, and hell, even the 4770 in my parents' build: $99 that kicks the shit out of a GTS 250.


The difference here is we are all talking performance per dollar, maximum performance, availability, and drivers.


In almost every segment of the market ATI has NVIDIA beat on price and/or performance. For an actual mid-range card the 4770/4850 rapes NVIDIA; the 4890 rapes everything but the 295, but for the money of a 295 you could CrossFire two 4890s and still rape it; hell, a 4870 X2 is 14% more efficient per dollar.
Erm, in the £130 range the GTX 260 is a clear winner. ATI's equivalent is the 4850, which can't beat a 260 in anything.
#68
enaher
newtekie1The requirement of a die shrink has nothing to do with it. The argument is that it is somehow acceptable for ATi to use 2 GPUs to best 1 GPU, but somehow not as acceptable for nVidia to use 2 GPUs to best 2 GPUs. I just find that very odd.

And as for the 55nm GT200 cards, they got rid of the backplate because the memory was all moved to the front of the card, so the backplate was not needed to cool the RAM anymore.
OK, simply put: quality. Quality-wise, NVIDIA is usually better most of the time (cooler, warranty from EVGA and BFG), while price/performance most of the time goes to ATI. Remember, I'm not talking about the quality of the GPU; this is about the reference cooler and the partners' warranty and upgrade plans. But, and this is a big BUT, NVIDIA usually develops a BIG BADASS core that is very expensive to manufacture; then, to recover the crown (taken by ATI's dual GPU), they make a two-core solution that is very expensive, just to recover market share, and they drop the ball on other things. An example is my friend: he had an EVGA GTX 260 SC 192, and EVGA replaced it with a 55 nm one, which ran hotter, OC'd a little less, and even had some cheap-looking capacitors; the 192 version had all solid-state capacitors. I'm not saying NVIDIA can't do a two-core GPU and ATI can; I'M SAYING they usually don't plan to, and when they do build one they start to lose cash. Remember the couple of dollars they lost recently. I like ATI and NVIDIA; I want them to keep the fight as close as they can, both with profits and quality.

EDIT: Of course they could have done a 65 nm version of the 295; it had nothing to do with thermals and power draw. And of course they removed the backplate because they moved the memory; and gee, me thinking here that they moved the memory to get rid of the backplate and bring costs down.
#69
a_ump
newtekie1The requirement of a die shrink has nothing to do with it. The argument is that it is somehow acceptable for ATi to use 2 GPUs to best 1 GPU, but somehow not as acceptable for nVidia to use 2 GPUs to best 2 GPUs. I just find that very odd.

And as for the 55nm GT200 cards, they got rid of the backplate because the memory was all moved to the front of the card, so the backplate was not needed to cool the RAM anymore.
You misunderstood me; I only brought up the single-GPU part because whenever someone mentions the HD 4870 X2 besting NVIDIA, someone chimes in that NVIDIA has the best single-GPU product, so I figured I'd save myself from reading the responses by stating it myself lol, though apparently it spawned a new argument.
#70
enaher
a_umpYou misunderstood me; I only brought up the single-GPU part because whenever someone mentions the HD 4870 X2 besting NVIDIA, someone chimes in that NVIDIA has the best single-GPU product, so I figured I'd save myself from reading the responses by stating it myself lol, though apparently it spawned a new argument.
hehe :) it seems everyone misunderstood; I took it as asking why it's negative for NVIDIA to make a dual GPU. It's OK for anyone to make dual GPUs to get the performance crown; it usually fits ATI's business model better though :toast:
#71
iStink
SteevoI love you sweetie.


I wasn't aiming this at you, but at the haters; I have NVIDIA cards in use at work. I use what gets me from (figurative) point A to B the cheapest, within reason. If NVIDIA ever releases a product, when I am ready to buy, that does better than the red counterpart, I will move to that. I will admit the biggest fuckup I had was my X1800XT: $599, and I sold it a year or so later for $70.

I don't hate NVIDIA or ATI; I hate the fanbois who believe their company of choice can do no wrong. I was also in the wrong when I bought a Prescott, thinking something was being made better. I was wrong and moved to AMD. I only have one AMD at work; most are C2Ds or other Intels.


Anyway, enough of the name calling.
Sorry, the whole "words are hard" comment just irritated me lol. Sorry for the name calling.

As far as bang for the buck goes, I'm the same way when it comes to what I buy. I've loved the hell out of my 8800 GT, but I think it's time to upgrade soon. I've been seriously eyeballing these 4890s as a practical purchase. I honestly have no preference between NVIDIA and ATI, so long as the product makes me happy. I don't understand arguments that are fueled by illogical brand loyalty.

Oh, and don't feel bad about spending too much on a video card. I bought an X800 XL from CompUSA when I probably could have gotten it for much, MUCH cheaper.
#72
a_ump
iStinkSorry, the whole "words are hard" comment just irritated me lol. Sorry for the name calling.

As far as bang for the buck goes, I'm the same way when it comes to what I buy. I've loved the hell out of my 8800 GT, but I think it's time to upgrade soon. I've been seriously eyeballing these 4890s as a practical purchase. I honestly have no preference between NVIDIA and ATI, so long as the product makes me happy. I don't understand arguments that are fueled by illogical brand loyalty.

Oh, and don't feel bad about spending too much on a video card. I bought an X800 XL from CompUSA when I probably could have gotten it for much, MUCH cheaper.
Exactly, that's the way I shop when it comes to computer hardware, and it's the logical way imo. Though whenever someone is building a rig and asks for help, if there's an Intel rig and an AMD rig of the same performance for around the same price, I'll point them towards AMD, just because I want AMD to do better, to improve competition and keep themselves alive lol
#73
iStink
a_umpExactly, that's the way I shop when it comes to computer hardware, and it's the logical way imo. Though whenever someone is building a rig and asks for help, if there's an Intel rig and an AMD rig of the same performance for around the same price, I'll point them towards AMD, just because I want AMD to do better, to improve competition and keep themselves alive lol
It's the same damn thing in a Mac vs PC discussion. I mean, I popped in just now to check AppleInsider, and there was an article talking about Microsoft's new laptop ads. There are 256 comments posted in response to this article in just two days. You know what they are discussing? "APPLE COSTS TOO MUCH", "PCS ARE CRAP", "MICROSOFT IS EVIL", "APPLE ARE A BUNCH OF THIEVES", "PCS BREAK ALL THE TIME", "MAC USERS ARE ELITISTS"

I literally just posted a simple question "Are you happy with your computer? If you are, then what does it matter what anyone else tells you?"

I see the same thing here (maybe not as heavy) about Intel vs AMD or NVIDIA vs ATI. People get so anal about mindless crap that has no effect on anything.
#74
Kursah
I hope the tight competition keeps up; I think it's good both ATI and NV took a different route to the same solution. I wish 3dfx would return, and I hope Intel does well too: the more competition, the better off we are with prices, and the better the chances of everyone getting a solid performer no matter what they choose. I couldn't care less about HD 5xxx this or GT300 that until they're actually out and rendering games (and benches) for the masses. I'll read some reviews, see what forum members get for OCs, see how driver support, temps and failures go, and focus more on what the upper mid-range has to bring. I might not even replace my 260 if all my games still play smooth; I've been so happy with this card the last 10 months. I just hope the price/performance competition keeps up in the future, and maybe more contenders step into the ring.

:toast:
#75
icon1
I always use NVIDIA cards, but nevertheless I still want ATI to come up with something strong in their next-gen graphics cards. Healthy competition on both sides is always nice to keep the price of the next-gen cards more affordable; if ATI doesn't come up with anything strong to compete with this beast, then the price of NVIDIA's GT300 will be sky-high.

With the next-gen cards just around the corner, this is getting more exciting, lol.