Thursday, July 10th 2008

NVIDIA Preparing GT300 Graphics Processor?

Hardspell reports that NVIDIA may have cancelled work on the G200b (the 55 nm version of GT200). Details on this new DirectX 10.1 graphics processor have trickled in: it has 384 shader units, uses a 45 nm fabrication process, and incorporates 1 GB of GDDR5 memory at 4.00 GHz (effective), while the core could be clocked at 800 MHz with a 2.00 GHz shader domain. NVIDIA hopes to take on the R700 and its successor, which unreliable sources claim to be based on the "Super-RV770".
Source: Hardspell

68 Comments on NVIDIA Preparing GT300 Graphics Processor?

#26
flashstar
Right now ATI definitely has the most efficient GPU. Nvidia has to make a GPU that's twice as big to compete lol.

Nvidia really has to redesign their graphics chips. ATI did so with R600 and now it's paying off.
Posted on Reply
#27
candle_86
You call 800 SPs efficient? It's sheer shader power that's why they are winning: complex and simple shaders total 320 effective shaders for the RV770, while the GT200 has 240 shaders. Its brute force is why they won.
Posted on Reply
#28
DarkMatter
flashstar: Right now ATI definitely has the most efficient GPU. Nvidia has to make a GPU that's twice as big to compete lol.

Nvidia really has to redesign their graphics chips. ATI did so with R600 and now it's paying off.
In order to say which one is more efficient, you have to first define what efficient is and at what level you are speaking. For instance:

1- The GTX 280 does not compete with the RV770; it's well above it. And it's in no way twice as big: 1.4 vs. 1 billion transistors, a 40% difference.

2- The card that directly competes, the GTX 260, has 25% fewer resources than the GTX 280. Had it been a "native" chip, it would have about 1.1 billion transistors. The RV770 has 1 billion. We could reverse your sentence and say ATI required a 1-billion-transistor chip to compete with Nvidia's 700-million-transistor chip... a difference of over 40%.

3- Nvidia's cards have more ROPs, which contributes to die size and, as of now, not to performance, but it shows in the cases where it's needed, very few cases, granted. No released game uses that power, because before GT200 there was no card capable of that much (G80 doesn't count because it was too early: the 7900, X1900, and lower cards had to be taken into account). Games are optimized for 16 ROPs. Whether games will start using that power is another question, but it does depend on the definition of efficiency you use.
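For what it's worth, the transistor arithmetic above checks out. Here's a quick sketch using the counts quoted in this thread (approximate figures, not official die specs):

```python
# Transistor counts as quoted in this thread (approximate).
GT200 = 1.4e9  # GTX 280
RV770 = 1.0e9  # HD 4870 / HD 4850
G92 = 0.7e9    # 9800 GTX

# GT200 carries about 40% more transistors than RV770, not double.
gt200_vs_rv770 = (GT200 - RV770) / RV770
print(f"GT200 vs RV770: {gt200_vs_rv770:.0%}")  # 40%

# A "native" GTX 260 chip, with ~25% of GT200's resources removed,
# would need roughly 1.05-1.1 billion transistors.
native_gtx260 = GT200 * 0.75
print(f"native GTX 260: ~{native_gtx260 / 1e9:.2f} billion")

# Reversing the comparison: RV770 is over 40% larger than G92.
rv770_vs_g92 = (RV770 - G92) / G92
print(f"RV770 vs G92: {rv770_vs_g92:.0%}")  # 43%
```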
Posted on Reply
#29
vojc
candle_86: you call 800SP's efficient, its mere shader power is why they are winning complex and simple shaders total 320 effective shaders for the R770 while the GT200 is 240shaders, its brute force is why they won
The speed of the shaders is an issue here: NVIDIA runs at 1500 MHz+, ATI at only a 750 MHz shader clock.
The second problem for NVIDIA is a 45nm fab: who will make 45nm chips? Intel? I think not. NVIDIA is transitioning chips from 65nm to 55nm, while ATI has already been on 55nm for a while; the next step is 40nm at TSMC.
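Assuming the unit counts and clocks quoted in this thread, the raw per-clock math looks like this (a rough sketch only; real throughput depends on what each unit can issue per clock and on utilization):

```python
# Shader counts and clocks as quoted above (approximate).
ati_sps, ati_clk_hz = 800, 750e6   # RV770: 800 SPs at 750 MHz
nv_sps, nv_clk_hz = 240, 1500e6    # GT200: 240 SPs at ~1500 MHz

# Raw shader operations per second, assuming one op per SP per clock.
ati_ops = ati_sps * ati_clk_hz  # 6.0e11
nv_ops = nv_sps * nv_clk_hz     # 3.6e11

# NVIDIA's higher shader clock narrows, but does not close, the raw-unit gap.
print(f"RV770: {ati_ops:.2e} ops/s, GT200: {nv_ops:.2e} ops/s")
```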
Posted on Reply
#30
flashstar
@ Darkmatter, are you saying that the 260 is the equivalent of 700 million transistors? That doesn't make much sense.

I'm looking at size of chip vs. performance. I don't care about chip potential. Remember that the FX 5200- 5900 were complicated but unbalanced. Even though they had much potential power, the series never saw that potential.

ATI's chip is far smaller than Nvidia's and it produces performance competitive to the GTX 260. There is no way that you can say that the 260 is like a 280 with half the transistors.
Posted on Reply
#31
DarkMatter
flashstar: @ Darkmatter, are you saying that the 260 is the equivalent of 700 million transistors? That doesn't make much sense.

I'm looking at size of chip vs. performance. I don't care about chip potential. Remember that the FX 5200- 5900 were complicated but unbalanced. Even though they had much potential power, the series never saw that potential.

ATI's chip is far smaller than Nvidia's and it produces performance competitive to the GTX 260. There is no way that you can say that the 260 is like a 280 with half the transistors.
No.

G92 = 700 million transistors.

RV770 = 1 billion transistors.

GT200 = 1.4 billion transistors.

Now HD4850 competes with higher-end G92 as GTX 260 competes with higher-end RV770. High-end G92 can't "touch" high-end RV770 just as high-end RV770 can't "touch" high-end GT200. If your sentence is valid, mine is too.

EDIT: the other thing I said is that if the GTX 260 were the high-end Nvidia part, the native chip, the physical chip, it would only require 1.1 billion transistors, as that is the number of them in use in the GTX 260.
Posted on Reply
#32
vojc
DarkMatter: No.

G92 = 700 million transistors.

RV770 = 1 billion transistors.

GT200 = 1.4 billion transistors.

Now HD4850 competes with higher-end G92 as GTX 260 competes with higher-end RV770. High-end G92 can't "touch" high-end RV770 just as high-end RV770 can't "touch" high-end GT200. If your sentence is valid, mine is too.

EDIT: the other thing I said is that if GTX260 was the high-end Nvidia, the native chip, the physical chip, it would only require 1.1 billion transistors as those are the number of them in use in GTX260.
The GTX 260 IS G200, so it has 1.4 billion transistors in general... then again, some of them are disabled, so the final number of working transistors is ~1.2 billion.
Posted on Reply
#33
DarkMatter
vojc: The GTX 260 IS G200, so it has 1.4 billion transistors in general... then again, some of them are disabled, so the final number of working transistors is ~1.2 billion.
Personal note: learn how to express yourself better. :o

That's what I said. I was pointing out that it's pointless to compare the GTX 260 to the HD 4870 (same performance) and say Nvidia needs 1.4 billion transistors to compete with ATI's 1-billion-transistor chip. Not all 1.4 billion transistors are used. When they are (GTX 280), the result is a card up to 30% faster in some cases.
Posted on Reply
#34
vojc
TRUE :)
and sorry for my bad english :P
Posted on Reply
#35
eidairaman1
The Exiled Airman
OK, from now on, no more picking fights with fans of Nvidia, and the same goes for ATI, because technically we are all right and not wrong, OK? If I recall, we should help those who are having problems with their products, not try to push a bias on one another, as it doesn't work.
Posted on Reply
#36
zithe
Weer: Yes, precisely. nVidia has no reason to launch anything else to compete with the R700. If they lower the price of the GTX 260, or launch a Dual-GPU card, they will beat it, and still maintain a huge lead in the high to ultimate end markets with GTX 280 SLi and Triple SLi.

This seems fabricated. 384 SPs? Impossible.
The GTX-series cards cost more to make than the 280's starting price (or around there).
Each card nvidia sells is a huge gouge out of their pocket. I doubt nvidia could afford to drop prices much more. The amount they're losing seems to be more than the price of a PS3!

Imagine the price of a GTX 200 GX2 card if it were released right now. Sales would be horrible, even worse than the GTX 280's right now, unless they sold it for not much more than a 4870X2 (I doubt they could afford that, REALLY doubt it).
vojc: TRUE :)
and sorry for my bad english :P
Your English isn't bad. :)
Posted on Reply
#37
Darkrealms
eidairaman1: ok for now on, no more picking fights with fans of Nvidia, same goes for ATI, because technically we are all right and not wrong ok, if i recall we should help those who are having problems with their products, not try to push a bias on one another as it doesnt work.
Awww . . . . but . . but . . ; p
I have a hammer I can help users with troubled products : )
Posted on Reply
#38
cdawall
where the hell are my stars
lol you know what the GTX series reminds me of?

nvidia FX :nutkick: oh noes, ATi has passed them again and their cards can't even keep up with midrange ATi cards haha
Posted on Reply
#39
btarunr
Editor & Senior Moderator
The GTX series is just the cream-of-the-crop range; it's not meant to have a classification within it. GTX 200 = elite, GeForce 9 series = mainstream.
Posted on Reply
#40
overclocker!
I wonder how much they're gonna charge? $2000 :laugh::laugh:

ATI rocks!!! never Nvidia again!!!!!!!!!
Posted on Reply
#41
WarEagleAU
Bird of Prey
This just seems eerily strange to me to jump down to 45nm and go with GDDR5 so quickly...
Posted on Reply
#42
candle_86
cdawall: lol you know what the GTX series reminds me of?

nvidia FX :nutkick: oh noes ATi has passed them again and there cards cant even keep up with midrange ATi cards haha
Explain that. Right now the 9800 GTX keeps up with the HD 4850, the GTX 260 ties the HD 4870, and the GTX 280 pwns them both. Now all Nvidia needs to do to steal some thunder is release the 9800 GX2+ at 9800 GTX speeds but dual-GPU, and it will once more steal thunder. GTX 280 or 260 GX2-class cards could be done for $600 for the 260 and $700 for the 280, and it wouldn't be that hard, actually. I quite expect an answer from Nvidia to crush ATI; they are not ones to sit back. Remember what they say: one powerful core is better than two lesser cores, because you can always take two powerful cores and put them together.
Posted on Reply
#43
cdawall
where the hell are my stars
candle_86: explain that, right now the 9800GTX keeps up with HD4850, and the GTX260 ties the HD4870 and GTX280 pwns them both. Now all Nvidia needs to do to steal some thunder is release the 9800GX2+ @ 9800GTX speeds but dual GPU and it will once more steal thunder and the GTX280 or 260GX2 could be done for 600 for the 260 and 700 for the 280 GX2 class cards and it wouldnt be that hard actully. I quite expect an anwser from Nvidia to crush ATI, they are not one to sit back. Remember what they say one powerful core is better than 2 lesser cores, why because you can always take 2 powerful cores and put them together
I was referring to the FX series, with the midrange beating the top of the line. But who knows, what if drivers do the same for the 48x0 series?
Posted on Reply
#44
Megasty
Hopefully, NV learned something with this go-around and won't charge $650 for their next mega GPU. Charging so much for something that will end up $150+ cheaper a month later is just stupid. I usually wouldn't care so much about the price, but the value certainly isn't there whatsoever.
Posted on Reply
#45
candle_86
Megasty: Hopefully, NV learned something with this go-around & won't charge $650 for their next mega gpu. Charging so much for something that will end up $150+ less a month later is just stupid. I usually wouldn't care so much about the price but the value certainly isn't there whatsoever.
The only reason it's not is that AMD decided to undercut the market in a desperate bid to gain market share. How much money does AMD make off the 48xx cards, honestly?

I'd guess their chip costs about $130-150 to tape out, the PCB costs about $40-50, and the RAM costs about $60-70. So at worst case we're looking at $220 to produce the whole card; most likely about $180 for the 4850, and probably around $250 for the 4870 because of GDDR5. So do they really make a lot of money with their cards? This was a bid to gain market share, not money.
Posted on Reply
#46
Megasty
candle_86: the only reason its not is AMD decided to undercut the market in a desprate bid to gain marketshare. How much money does AMD make off the 48xx cards honestly?

Id guess there chip costs about 130-150 to tap out, PCB costs about 40-50, ram costs about 60-70. So at worst case where looking @ 220 to produce the whole card, most likly about 180 to produce it for the 4850 and prolly around 250 because of GDDR5 for the 4870 so do they really make alot of money with there cards, this was a bid to gain marketshare not money
It seemed to have worked, didn't it? NV refuses to give their customers a break when it comes to pricing, while ATi hardly makes anything on their cards. NV charges, at the least, twice what it costs to make and market their cards. It finally caught up to them, and they'd better learn from it. On the other hand, they may have actually thought the value of those cards was set in stone. Too bad a little competition crushes that mentality to dust.
Posted on Reply
#47
MoeDaKilla
Who cares whether or not AMD or NVIDIA makes money? It's all good news for the consumer. Who doesn't like price cuts and the arrival of new technologies? Don't fight, guys; let's rejoice at the fact that everyone can buy an awesome graphics card regardless of how much (or how little) they spend :toast:
Posted on Reply
#48
imperialreign
WarEagleAU: This just seems eerily strange to me to jump down to 45nm and go with GDDR5 so quickly...
Posted on Reply
#49
warhammer
MoeDaKilla: Who cares whether or not AMD or NVIDIA makes money. It's all good news for the consumer. Who doesn't like price cuts and the arrival of new technologies. Don't fight guys, let's rejoice at the fact that everyone can buy an awesome graphics card regardless how much (or how little) they spend :toast:
F1 to that..

Supply and demand: the more popular the item, the more the price sometimes goes up and up..
Posted on Reply
#50
NinkobEi
I'm sure ATI planned out their prices beforehand. I mean, DDR3 in vid cards isn't exactly expensive. I'm willing to bet the cards aren't all that expensive to make, due to their fabrication size (nm).
Posted on Reply