Monday, January 5th 2009

More GT212 Information Forthcoming

NVIDIA's G200b, its current flagship GPU, will be succeeded later this year by the GT212, and as Hardware-Infos reports, some interesting specifications of the GT212 have surfaced. To begin with, the GPU holds more than 1.8 billion transistors and is built on TSMC's 40 nm manufacturing node. The shader domain gets a boost to 384 shader units (a 60% increase over the G200(b)). The GPU carries 96 texture memory units and a 256-bit wide GDDR5 memory bus with a clock speed of 1250 MHz (5000 MT/s).

The transition to GDDR5 seemed inevitable, with NVIDIA having skipped GDDR4 entirely: GDDR5's higher per-pin data rate allows a narrower memory bus, giving a genuine incentive to cut down the number of memory chips. With the die size expected to be around 300 mm², these GPUs will also be cheaper to manufacture. The GT212 is slated for Q2 2009.
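A quick back-of-the-envelope check on the memory figure (a sketch in Python; the 5000 MT/s effective rate and 256-bit width are the rumored specs, not confirmed numbers):

```python
# Peak memory bandwidth = effective transfer rate x bytes moved per transfer.
def peak_bandwidth_gbps(transfer_mts: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s from transfer rate (MT/s) and bus width (bits)."""
    return transfer_mts * 1e6 * (bus_bits / 8) / 1e9

# GDDR5 at a 1250 MHz base clock moves data at 4x the clock: 5000 MT/s.
gt212 = peak_bandwidth_gbps(5000, 256)
print(f"GT212 (rumored): {gt212:.0f} GB/s")  # 160 GB/s
```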

Source: Hardware-Infos

27 Comments on More GT212 Information Forthcoming

#1
Tatty_One
Looks nice, a good reason to skip G200b.......makes you wonder really why NVidia would release this info so early, surely it will hit their 200b revision sales?
#2
btarunr
Editor & Senior Moderator
by: Tatty_One
Looks nice, a good reason to skip G200b.......makes you wonder really why NVidia would release this info so early, surely it will hit their 200b revision sales?
G200b wasn't part of their long-term plans; NV was forced to bring out the G200b. Q2 2008 was when G200 came out, Q2 2009 is when GT212 is supposed to come out. G200b is an intermediate that was required to cut manufacturing costs and raise the clock speeds (or overclockability) to compete with the RV770 on both price and performance fronts.
#3
AltecV1
that chip looks yumie:D
#4
FreedomEclipse
Crazy Dogmatic Bullsh!t!
DDR5 with 256Mb mem interface??? or is that how much ram the card has onboard???? id like to see benches of the 212 compared to the other cards.
#5
AltecV1
where do you see 256Mb???????????????????????????????????????????????????
#6
Tatty_One
by: FreedomEclipse
DDR5 with 256Mb mem interface??? or is that how much ram the card has onboard???? id like to see benches of the 212 compared to the other cards.
Yup, a 256-bit bus. Seems that GDDR5 delivers more bandwidth per pin, so it needs a narrower bus, hence the RV770 series also having just the 256.
#7
Tatty_One
by: AltecV1
where do you see 256Mb???????????????????????????????????????????????????
Here.......................
#8
zithe
by: Tatty_One
Here.......................
That says bit, not mb. XD
#9
Tatty_One
by: zithe
That says bit, not mb. XD
Yeah I know....freedom Eclipses terminology :p
#10
zithe
by: Tatty_One
Yeah I know....freedom Eclipses terminology :p
A 512 megabit interface = uber bandwidth. @.@

1TB of XDR 24,000! XD

Sorry. Side-tracking.
#11
kid41212003
by: btarunr

The transition to GDDR5 seemed inevitable, with there being a genuine incentive of cutting down the number of memory chips (due to the efficient memory bus), with NVIDIA having completely avoided GDDR4. With the die-size being expected to be around 300 sq. mm, these GPUs will be cheaper to manufacture. The GT212 is slated for Q2 2009.
I think I will skip the G200b, and wait for these :eek:.
#12
btarunr
Editor & Senior Moderator
by: kid41212003
I think I will skip the G200b, and wait for these :eek:.
Again, it's not known if NVIDIA will pass over this benefit to the consumer. :)
#13
iamverysmart
by: kid41212003
I think I will skip the G200b, and wait for these :eek:.
The G200b will last for....................one quarter, 3 months unless delayed.
#14
mdm-adph
I betcha that'll be extremely late Q2 2009, too. Maybe with that small die-size Nvidia will finally come out with a dual-chip card on one board. :D
#15
szulmizan
why not use a 512-bit memory bus with GDDR5? The memory bandwidth would be more than 240 GB/s..
#16
AltecV1
because the pcb will be expensive and it will be overkill!
#17
FreedomEclipse
Crazy Dogmatic Bullsh!t!
by: AltecV1
where do you see 256Mb???????????????????????????????????????????????????
by: Tatty_One
Here.......................
by: zithe
That says bit, not mb. XD
by: Tatty_One
Yeah I know....freedom Eclipses terminology :p
by: zithe
A 512 megabit interface = uber bandwidth. @.@

1TB of XDR 24,000! XD

Sorry. Side-tracking.
my bad - I did mean 'bit' - got into the bad habit of looking in another direction as I type (LOL) my touch typing skills ain't up to scratch just yet.
#18
Zehnsucht
by: btarunr
Again, it's not known if NVIDIA will pass over this benefit to the consumer. :)
I agree, it's highly unlikely. Depending on ATI's offerings at the time, I expect the pricing to be on par with or more expensive than the equivalent ATI card. BUT if ATI does not have anything to compete with it, I wouldn't be surprised to see GTX 280 prices (as when it was released).
#19
newtekie1
Semi-Retired Folder
by: kid41212003
I think I will skip the G200b, and wait for these :eek:.
Essentially, nVidia isn't expecting G200b to be a huge hit. They are just using it to cut costs on the GTX 200 cards. There will still be plenty of people that will be looking to upgrade before G212 comes. Also, many people are like me, they don't sit around and wait to buy simply because the next best thing is right around the corner, because the next best thing is always right around the corner. G212 is coming out Q2 2009 and G300 is coming Q4 2009, so when G212 is on its way people are going to be saying "I think I will skip the G212, and wait for G300".

by: szulmizan
why not using 512bit memory bus and GDDR5.. The memory bandwidth will be more than 240GB/s..
A 512-bit memory bus is very expensive to include. Which is really the reason behind GDDR5. GDDR5 allows the same bandwidth as a 512-bit bus on a 256-bit bus.

by: btarunr
Again, it's not known if NVIDIA will pass over this benefit to the consumer. :)
Well, they will likely try to make back some of the money they have lost over the past few months. With having to sell the G200 cards at essentially 0 profit, they have had to rely on saved money to keep the company moving.

My guess is that they will release G212, and in the first few months ATi will have nothing competitive, so nVidia will charge an arm and a leg for it to make up as much money as possible. Then ATi will release something that is competitive but priced lower to try and take market share from nVidia, and nVidia will lower the prices to compete...and the cycle goes on...
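The 256-bit GDDR5 vs. 512-bit equivalence above can be sketched numerically (a rough comparison in Python; the GDDR3 reference point uses the GTX 280's 512-bit bus at roughly 2214 MT/s effective, and the GT212 figures are the rumored ones):

```python
def bandwidth_gbps(transfer_mts: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s = effective transfer rate x bytes per transfer."""
    return transfer_mts * 1e6 * (bus_bits / 8) / 1e9

# GTX 280: 512-bit GDDR3 at ~2214 MT/s effective
gddr3_512 = bandwidth_gbps(2214, 512)
# Rumored GT212: 256-bit GDDR5 at 5000 MT/s effective
gddr5_256 = bandwidth_gbps(5000, 256)
print(f"512-bit GDDR3: {gddr3_512:.1f} GB/s")  # ~141.7 GB/s
print(f"256-bit GDDR5: {gddr5_256:.1f} GB/s")  # 160.0 GB/s
```

So the narrower GDDR5 bus actually edges out the wide GDDR3 one, while needing half as many memory chips and a simpler PCB.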
#20
Zehnsucht
Do we have any indication of what ATI's response is? I mean, the 4850/4870 came in June, and the 4870 X2 in August.
#21
AltecV1
yea we do! it is called "lil´dragon":D
#22
$ReaPeR$
with the specs of this chip i might turn Nvidia again! it looks very promising, lets hope they wont charge a fortune for it.
#23
kid41212003
by: newtekie1
Essentially, nVidia isn't expecting G200b to be a huge hit. They are just using it to cut costs on the GTX 200 cards. There will still be plenty of people that will be looking to upgrade before G212 comes. Also, many people are like me, they don't sit around and wait to buy simply because the next best thing is right around the corner, because the next best thing is always right around the corner. G212 is coming out Q2 2009 and G300 is coming Q4 2009, so when G212 is on its way people are going to be saying "I think I will skip the G212, and wait for G300".
Yeah, but as you can see, I still have my dual 8800GTs, and I'm quite happy with them. I don't do "small" upgrades, and looking at the performance gap between my dual 8800GTs and the GTX 200 series, I see no reason to upgrade, yet.

I always try to wait as long as I can and jump a big step to the next one. ;)
#24
newtekie1
Semi-Retired Folder
Up until about a month ago I had dual 9800GTX's and did a very minor upgrade to a GTX260. But that upgrade was more about getting away from a dual-card setup, as I get tired of dealing with SLi and all the drawbacks.

I tend to do one major upgrade every few years or so, then do a few small ones very quickly after the major upgrade, then I am content. In the past year I went from dual 7900GTs to an 8800GTS 512MB to a 9800GTX to dual 9800GTXs and now this GTX260. I plan to use the step-up on the GTX260 to go to a GTX285 and then I think I will stick with that for a while.

The upgrade from the 7900GTs to the 8800GTS only cost me about $100 after selling the 7900GTs, the upgrade to the 9800GTX cost me nothing as it was a step-up from my 8800GTS. The second 9800GTX cost me $150. And after selling the 9800GTX's the upgrade to the GTX260 costs about $15.
#25
Tatty_One
by: kid41212003
Yeah, but as you can see, I still have my dual 8800GTs, and I'm quite happy with them. I don't do "small" upgrades, and looking at the performance gap between my dual 8800GTs and the GTX 200 series, I see no reason to upgrade, yet.

I always try to wait as long as I can and jump a big step to the next one. ;)
I went from an 8800GTS 512MB to a GTX260 and believe me.....it was a BIG step, to me, anything 25% or better is BIG in this game, however I appreciate that BIG is very subjective based on our own perceptions.