Thursday, April 23rd 2009

GT300 to Pack 512 Shader Processors

A real monster seems to be taking shape at NVIDIA. The company's next big graphics processor looks like a leap ahead of anything in the current generation, much as G80 was when it was released. It has already been learned that the GPU will use a new MIMD (multiple instruction, multiple data) mechanism for its highly parallel computing, which will be handled physically by not 384, but 512 shader processors. That count is roughly a 113% increase over the existing GT200, which has 240.
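The SIMD-versus-MIMD distinction the rumor hinges on can be sketched in a few lines of Python. This is a conceptual illustration only, not a model of NVIDIA's actual hardware: under SIMD, all processors execute the same instruction in lockstep over different data; under MIMD, each processor can run its own instruction stream.

```python
# Conceptual sketch only: SIMD vs. MIMD execution, illustrated with plain
# Python lists standing in for parallel processors.

def simd_step(data, op):
    """SIMD: one instruction (op) applied to every data element in lockstep."""
    return [op(x) for x in data]

def mimd_step(data, ops):
    """MIMD: each processor runs its own instruction on its own data."""
    return [op(x) for op, x in zip(ops, data)]

data = [1, 2, 3, 4]
print(simd_step(data, lambda x: x * 2))   # every element gets the same op
print(mimd_step(data, [lambda x: x + 1,   # each element gets a different op
                       lambda x: x * 2,
                       lambda x: x ** 2,
                       lambda x: -x]))
```

The claimed efficiency gain comes from the second case: divergent work no longer forces idle lanes the way it does when every lane must execute the same instruction.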

NVIDIA has reportedly upped the SP count per cluster to 32, against 24 for the current architecture, with a cluster count of 16 (16 x 32 = 512). Also in place will be 8 texture mapping units (TMUs) per cluster, for 128 in all. What makes the GT300 a leap is not only the serious increase in parallelism, but also a fundamental change in the way a shader processor handles data and instructions; in theory, MIMD is a more efficient way of doing it. The new GPU will be DirectX 11 compliant and built on the 40 nm manufacturing process. We are yet to learn more about its memory subsystem. The GPU is expected to be released in Q4 2009.

Source: Hardware-Infos
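The cluster arithmetic in the report can be checked with a quick back-of-envelope calculation, using only the figures quoted above (the variable names are illustrative, not official NVIDIA terminology):

```python
# Back-of-envelope check of the rumored GT300 configuration against GT200,
# using only the figures quoted in the article.

GT200_SPS = 240          # shader processors in the existing GT200
CLUSTERS = 16            # rumored cluster count
SPS_PER_CLUSTER = 32     # up from 24 in the current architecture
TMUS_PER_CLUSTER = 8     # rumored TMUs per cluster

gt300_sps = CLUSTERS * SPS_PER_CLUSTER      # 16 x 32 = 512
gt300_tmus = CLUSTERS * TMUS_PER_CLUSTER    # 16 x 8 = 128
increase_pct = (gt300_sps - GT200_SPS) / GT200_SPS * 100

print(f"SPs: {gt300_sps}, TMUs: {gt300_tmus}, increase: {increase_pct:.1f}%")
```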

86 Comments on GT300 to Pack 512 Shader Processors

#1
gumpty
Gosh that looks tasty.

Here's hoping that ATI will continue their good run and will have some hardware to compete with it. I don't think anyone (outside of nvidia) wants a repeat of the 18 months of G80 dominance.
Posted on Reply
#2
Assassin48
That soon already? They just brought out the 295, and now this. Good thing I didn't go for the 295.

I'll just stick with my 4870X2 XOC till that comes out, then we'll see what ATI has.
Posted on Reply
#3
DaC
Will it run Crysis ? :laugh:
Posted on Reply
#4
suraswami
by: DaC
Will it run Crysis ? :laugh:
my X800GTO on 8x PCIE played Crysis fine.:slap:
Posted on Reply
#5
Weer
God damn you, nVidia! Q4 is too late.. why didn't you make it a summer release like last year?
Posted on Reply
#6
Mussels
Moderprator
by: Weer
God damn you, nVidia! Q4 is too late.. why didn't you make it a summer release like last year?
Q4 will be a summer release.

(hint: other hemisphere)
Posted on Reply
#7
Weer
by: suraswami
my X800GTO on 8x PCIE played Crysis fine.:slap:
That simply borders on the definition of "runs."
For some people 1024x768 @ Medium with 30 FPS = "runs."
For me, it's 1920x1200 @ High with 40+ FPS.
Posted on Reply
#8
Weer
by: Mussels
Q4 will be a summer release.

(hint: other hemisphere)
Other what? Are they using the fiscal year? Explain.
Posted on Reply
#9
Mussels
Moderprator
by: Weer
That simply borders on the definition of "runs."
For some people 1024x768 @ Medium with 30 FPS = "runs."
For me, it's 1920x1200 @ High with 40+ FPS.
for me its 1920x1080 with 8x aa at 100 FPS :D

stupid crysis, i struggle for 60 FPS on very high with no AA.
Posted on Reply
#10
btarunr
Editor & Senior Moderator
by: Weer
Other what? Are they using the fiscal year? Explain.
Isn't it spring~summer in the southern hemisphere when its autumn~winter in the north?
Posted on Reply
#11
Weer
by: Mussels
for me its 1920x1080 with 8x aa at 100 FPS :D

stupid crysis, i struggle for 60 FPS on very high with no AA.
Actually, now it's 2560x1600 @ Very High with 4xAA, for me. But, the only thing that will run that is the new card released in December (Q4).
Posted on Reply
#12
Weer
by: btarunr
Isn't it summer in the southern hemisphere when its winter in the north?
I don't live in Australia! It's the time between now and the release (end of 2009), not the temperature outside that I care about. Last year, GT200 was released in Q2.
Posted on Reply
#13
Tatty_One
Super Moderator
I think I will be sticking with my 4870x2 and GTX275 until these babies come out, I just have to have one! I am sure though that ATi will pull something decent out of the bag as well though........ wonders how much these might cost.
Posted on Reply
#14
renozi
Oh my poor 295 :( I'm hoping it costs so much that I won't get it until 2010!
Posted on Reply
#15
btarunr
Editor & Senior Moderator
by: Weer
I don't live in Australia! It's the time between now and the release (end of 2009), not the temperature outside that I care about. Last year, GT200 was released in Q2.
The Australian thinks that's winter.
Posted on Reply
#16
Animalpak
Hope the temps are fine of this new nvidia beast.
Posted on Reply
#17
ZoneDymo
by: Weer
That simply borders on the definition of "runs."
For some people 1024x768 @ Medium with 30 FPS = "runs."
For me, it's 1920x1200 @ High with 40+ FPS.
You got it wrong

Running a game
Running a game at playable levels (for me)
Maxing a game

are 3 different things.
Posted on Reply
#18
Disruptor4
Australia:
Summer - Autumn - Winter - Spring

switches to

America:
Winter - Spring - Summer - Autumn

Settled?


If this is to be true, what a beast... I just hope that ATi has something competitive to keep prices down!
Posted on Reply
#19
Bjorn_Of_Iceland
Hurray... time for some 60 fps in Crysis at my native 16x10 res with AA on full blast.. hmm.. wait.. I've finished that game.

Devs are going multiplatform anyway, and it'll be held back by console technology.. not unless another dev team gambles on making another PC-exclusive title that would profit 1/3 as much as a multiplatform title, and would probably just get torrented.. We won't be seeing any title that utilizes this much power, not until the Xbox 720 or PS4 arrives at least.
Posted on Reply
#20
Imsochobo
Well, who cares about temps on video cards? How many of you complained about 4850 temps? 90 C idle and stuff like that. Well, they got a very low RMA rate, so they are good.

Complain about TDP instead! Not the stock cooling solution, if it's quiet and efficient.

This card looks mighty. Dunno what they've done, though obviously this is indeed 25 nm smaller than GT2xx.

I wonder if ATI has something up their sleeve as well =)
Posted on Reply
#21
lemonadesoda
I DREAD to think about the power requirements for that guzzling monster. It will probably need a rocket propelled cooling system, and jump leads to a running car engine for power.

If they had made significant progress on power consumption, that would surely have been touted as one of the GT300 features.
Posted on Reply
#22
InnocentCriminal
Resident Grammar Amender
Ooooooh parallelism. Interested to see how nVIDIA are going to do this.

XD
Posted on Reply
#23
Hayder_Master
"SP count per cluster to 32" and the "cluster count of 16 (16 x 32 = 512)" - isn't this the same as ATI's technology?
Posted on Reply
#24
HellasVagabond
One thing is for sure: even if ATI releases their DX11 solution 2-3 months before the GT300, I do not think people will choose it over this monster.
Posted on Reply
#25
OzzmanFloyd120
:) hopefully I'll have the cash this Christmas season.
Posted on Reply