
GT300 to Pack 512 Shader Processors

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
A real monster seems to be taking shape at NVIDIA. The company's next big graphics processor looks like a leap ahead of anything in the current generation, the way G80 was when it was released. It has already been learned that the GPU will use a new MIMD (multiple instructions, multiple data) mechanism for its highly parallel computing, which will be physically handled by not 384, but 512 shader processors. That count is a roughly 113% increase over the 240 of the existing GT200.
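To make the MIMD claim concrete, here is a purely illustrative sketch of the difference between the two execution models. The lane counts and "programs" below are hypothetical toy examples, not GT300 specifics: under SIMD, every lane executes the same instruction on its own data element, while under MIMD each processing element can run its own instruction stream.

```python
# Illustrative sketch only: SIMD vs. MIMD execution models.
# Lane counts and operations are hypothetical, not GT300 details.

def simd_step(lanes, op, data):
    # SIMD: every lane executes the SAME instruction on its own data element.
    return [op(x) for x in data[:lanes]]

def mimd_step(programs, data):
    # MIMD: each processing element runs its OWN instruction stream.
    return [op(x) for op, x in zip(programs, data)]

square = lambda x: x * x
double = lambda x: 2 * x

print(simd_step(4, square, [1, 2, 3, 4]))                         # [1, 4, 9, 16]
print(mimd_step([square, double, square, double], [1, 2, 3, 4]))  # [1, 4, 9, 8]
```

In theory this flexibility lets independent shader programs proceed without waiting on each other's instruction stream, which is the efficiency gain the article alludes to.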

NVIDIA has reportedly upped the SP count per cluster to 32, against 24 for the current architecture, with a cluster count of 16 (16 × 32 = 512). Also in place will be 8 texture mapping units (TMUs) per cluster, so 128 in all. What makes the GT300 a leap is not only the serious increase in parallelism, but also an elementary change in the way a shader processor handles data and instructions; MIMD is, in theory, a more efficient way of doing it. The new GPU will be DirectX 11 compliant and built on the 40 nm manufacturing process. We are yet to learn more about its memory subsystem. The GPU is expected to be released in Q4 2009.

 
Gosh that looks tasty.

Here's hoping that ATI will continue their good run and will have some hardware to compete with it. I don't think anyone (outside of nvidia) wants a repeat of the 18 months of G80 dominance.
 
That soon already? They just brought out the 295. Good thing I didn't go for the 295.

I'll just stick with my 4870X2 XOC till that comes out, then we will see what ATI has.
 
Will it run Crysis ? :laugh:
 
God damn you, nVidia! Q4 is too late.. why didn't you make it a summer release like last year?
 
God damn you, nVidia! Q4 is too late.. why didn't you make it a summer release like last year?

Q4 will be a summer release.

(hint: other hemisphere)
 
My X800 GTO on 8x PCIe played Crysis fine. :slap:

That simply borders on the definition of "runs."
For some people 1024x768 @ Medium with 30 FPS = "runs."
For me, it's 1920x1200 @ High with 40+ FPS.
 
That simply borders on the definition of "runs."
For some people 1024x768 @ Medium with 30 FPS = "runs."
For me, it's 1920x1200 @ High with 40+ FPS.

For me it's 1920x1080 with 8x AA at 100 FPS :D

Stupid Crysis, I struggle for 60 FPS on Very High with no AA.
 
Other what? Are they using the fiscal year? Explain.

Isn't it spring~summer in the southern hemisphere when its autumn~winter in the north?
 
For me it's 1920x1080 with 8x AA at 100 FPS :D

Stupid Crysis, I struggle for 60 FPS on Very High with no AA.

Actually, now it's 2560x1600 @ Very High with 4xAA, for me. But, the only thing that will run that is the new card released in December (Q4).
 
Isn't it summer in the southern hemisphere when its winter in the north?

I don't live in Australia! It's the time between now and the release (end of 2009), not the temperature outside that I care about. Last year, GT200 was released in Q2.
 
I think I will be sticking with my 4870X2 and GTX 275 until these babies come out; I just have to have one! I am sure that ATi will pull something decent out of the bag as well. I wonder how much these might cost.
 
Oh, my poor 295 :( I'm hoping it costs so much that I won't get it until 2010!
 
I don't live in Australia! It's the time between now and the release (end of 2009), not the temperature outside that I care about. Last year, GT200 was released in Q2.

The Australian thinks that's winter.
 
Hope the temps are fine on this new NVIDIA beast.
 
That simply borders on the definition of "runs."
For some people 1024x768 @ Medium with 30 FPS = "runs."
For me, it's 1920x1200 @ High with 40+ FPS.

You've got it wrong.

Running a game
Running a game at playable levels (for me)
Maxing a game

are 3 different things.
 
Australia:
Summer - Autumn - Winter - Spring

switches to

America:
Winter - Spring - Summer - Autumn

Settled?


If this is to be true, what a beast... I just hope that ATi has something competitive to keep prices down!
 
Hurray... time for some 60 FPS in Crysis at my native 16:10 res with AA on full blast.. hmm.. wait.. I've finished that game.

Devs are going multiplatform anyway, and it'll be held back by console technology, unless another dev team gambles on making another PC-exclusive title that would profit a third of what a multiplatform title does and would probably just be torrented. We won't be seeing any title that utilizes this much power, not until the Xbox 720 or PS4 arrives at least.
 
Well, who cares about temps on video cards? How many of you complained about 4850 temps? 90 °C idle and stuff like that; well, they got a very low RMA rate, so they are good.

Complain about TDP instead, not the stock cooling solution, if it's quiet and efficient.

This card looks mighty. Dunno what they've done, but obviously this is indeed built on a process 25 nm smaller than GT2xx.

I wonder if ATI has something up their sleeve as well =)
 
I DREAD to think about the power requirements for that guzzling monster. It will probably need a rocket propelled cooling system, and jump leads to a running car engine for power.

If they had made significant progress on power consumption, that would surely have been touted as one of the GT300 features.
 
Ooooooh parallelism. Interested to see how nVIDIA are going to do this.

XD
 
"SP count per cluster to 32" and the "cluster count of 16 (16 x 32 = 512)" , isn't this same ATI technology
 
One thing is for sure: even if ATI releases their DX11 solution 2-3 months before the GT300, I do not think people will choose it over this monster.
 