
NVIDIA GT300 Already Taped Out

btarunr

Editor & Senior Moderator
NVIDIA's upcoming next-generation graphics processor, codenamed GT300, is on course for launch later this year. Its development has crossed an important milestone, with news emerging that the company has already taped out some of the first engineering samples of the GPU, under the A1 batch. The development is significant since this is the first high-end GPU designed on the 40 nm silicon process. Both NVIDIA and AMD, however, are facing issues with the 40 nm manufacturing node of TSMC, the principal foundry partner for the two. For this reason, the chip might be built by another, as yet unnamed, foundry partner the two are reaching out to. UMC could be a possibility, as it has recently announced that its 40 nm node is ready for "real, high-performance" designs.

The GT300 comes in three basic forms, perhaps differentiated by binning: G300 (for consumer graphics, the GeForce series), GT300 (for high-performance computing products, the Tesla series), and G200GL (for professional/enterprise graphics, the Quadro series). From what we know so far, the core features 512 shader processors, a revamped data-processing model in the form of MIMD, and a 512-bit wide GDDR5 memory interface churning out around 256 GB/s of memory bandwidth. The GPU is compliant with DirectX 11, which makes its entry with Microsoft Windows 7 later this year and can already be found in release candidate versions of the OS.
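As a sanity check on the quoted 256 GB/s figure, peak bandwidth follows from bus width times per-pin data rate. The 4.0 Gbps effective GDDR5 rate below is an assumption to make the arithmetic work out, not a confirmed GT300 spec:

```python
# Peak theoretical memory bandwidth: bus width in bytes times effective per-pin data rate.
# The 4.0 Gbps per-pin rate is an illustrative assumption, not a confirmed GT300 spec.
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps_per_pin

print(memory_bandwidth_gbps(512, 4.0))  # 256.0 GB/s, matching the article's figure
```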

View at TechPowerUp Main Site
 
One disadvantage of being a news poster and keeping up with the news is that I want stuff months before it even comes out :banghead:

I'm guessing these will be out around Christmas, along with the i5 and Windows 7.
 
I wonder what ATI's comeback will be like. I also wonder if NVIDIA will bring out a dual-GPU card.
 
ATI has RV870, although nothing spectacular from its earliest specs.

One disadvantage of being a news poster and keeping up with the news is that I want stuff months before it even comes out :banghead:

Another is that sometimes we're like the "Page 3" reporter that hangs out in the city's elite social circle, but only to report on who spilled his drink, or hung out with whom. :)
 
It looks like ATI might take NVIDIA's performance crown based on those specs.
 
Whatever high end model this becomes, I want one, I want it made by Zotac and in my PC :D
 
Oh, poor ATI :ohwell:!! I don't think the RV870 can beat that monster!
 
32 ROPs
around 2100 shaders
GDDR5

That's what the rumors say, but based on ATI's previous releases it's more than likely real.

But if I had to guess, I'd say at least 2400 shaders.
 
32 ROPs
around 2100 shaders
GDDR5

That's what the rumors say, but based on ATI's previous releases it's more than likely real.

But if I had to guess, I'd say at least 2400 shaders.

That is probably the 5870 X2, not the 5870! :(
 
Based on current architecture comparisons, ATI needs about 3.5-4 times as many shaders to match NVIDIA's performance. So around 2,000 SPs on the 5 series would seem plausible, assuming a similar architecture is used.
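Taking that 3.5-4x rule of thumb at face value against GT300's rumored 512 shaders gives the range the posts above are guessing at. The ratio is the poster's estimate, not a measured figure:

```python
# Hypothetical parity estimate: if ATI needs ~3.5-4x NVIDIA's shader count for similar
# performance (a forum rule of thumb, not a benchmark), match GT300's rumored 512 shaders.
gt300_shaders = 512
low, high = 3.5, 4.0
print(f"{gt300_shaders * low:.0f}-{gt300_shaders * high:.0f} ATI shaders for rough parity")
# 1792-2048, in line with the ~2,000 SP guess
```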
 
Stop posting news again, just release the product!
My future rig is waiting for this beast!
eVGA X58 759 + Dominator GT 2000 MHz + this beast = invincible!!! (at least for 1-2 years) lol
 
That is probably the 5870 X2, not the 5870! :(

I was gonna say the same, but I really do hope they rework the 5 series from the ground up. Haven't they had the same architecture since the 2K series? For example, 3K is just a die shrink among other things, and 4K is another speed increase, with no major changes except the 4890 being ridiculously high-binned at those 1 GHz speeds.

But, NVIDIA seems to have another 8800-like series of video cards out for the next round... if that's the case, I'm not missing out this time, I may switch.
 
Despite being an ATI user all along, I have to say that 256 GB/s plus 512 stream processors will give ATI a HAAAAAAAAAARD time!!!
 
3K wasn't just a die shrink. An OC'd HD 3850 or a stock HD 3870 can destroy an HD 2900 XT (the 2K-series flagship).

512-bit GDDR5 is sexy.
 
Possibly my next card, with the i7 system I'm getting soon :D
 
But, NVIDIA seems to have another 8800-like series of video cards out for the next round... if that's the case, I'm not missing out this time, I may switch.


I concur. I am tired of looking at the number two spot the vast majority of the time. If ATI can't beat the green team this time, and by a serious margin, then I am going to stuff NV/Intel into my Spider case.
 
It is getting increasingly difficult to understand how performance will scale by counting only the "number of shaders". As the architecture moves from SISD to SIMD to MIMD, predicting how this will impact CAD, or DX9, DX10, or DX11 rendering is very hard, especially as you layer on shader effects like FSAA.

It may be that there is a bigger win in resolution (e.g. at 2560x1600), or a bigger win in shader effects (e.g. 16xFSAA). Only benchmarking will tell.

I wonder, with 3 versions of the GPU, whether this will impact CUDA abilities. If it does, i.e. different CUDA capabilities on each, then this will spell disaster for standardising CUDA (and PhysX-on-CUDA) enhancements.

Looking forward to more news...
 
Let's hope ATI/AMD will be up to the task, or these cards will be $500+ (if not more, and that's without even talking about the Quadro/Tesla versions).
 
nV has got to find some way to claw back all the money they lost in the last 18 months due to failed laptop GPUs... and problems with their insurance providers.

To get that money clawed back, expect the G300 to be WHOOPASS, but also expect a very high premium, with the card at least as pricey as a GTX 295.
 
Oh, this card will give ATI some headaches...
 
As long as the midrange is around the hundred-quid mark I don't mind; the high end will always be horrifically priced.
 
I guess NVIDIA decided to step it up this time and kill ATI. They were surprised by the HD 4xxx series, so the way I see it, they're doing all they can to beat ATI back down instead of having ATI on NV's heels. I wonder, though: is it possible to have too many shaders, so that not all of them can be utilized, kind of like how a quad core is barely utilized in games?
 
No way for ATI to beat NVIDIA in performance, forget it!!

NVIDIA has THE DRIVERS...

ATI has nothing compared to NVIDIA in terms of driver optimization.
 
No way for ATI to beat NVIDIA in performance, forget it!!

NVIDIA has THE DRIVERS...

ATI has nothing compared to NVIDIA in terms of driver optimization.

Dude, where have you been? ATI has been all over NVIDIA for the past year; they held the performance crown for a while with the HD 4870 X2. Yes, NVIDIA has the most powerful single-GPU graphics card, but either way it still took them two GPUs on one card to take the crown back. And yes, NVIDIA tends to have better drivers at launch than ATI, but they also have a lot more access to games (TWIMTBP).

Drivers are important, but with this release I think we'll see the past repeat itself. Not like the 2900 XT vs the 8800 GTX, but more like the 9800 GTX vs the HD 3870. The GTX 285 is one hell of a card with 240 SPs and 512-bit GDDR3. The GT300 is going to have more than twice as many SPs, and twice the bandwidth by moving to GDDR5. I just can't see the chip's performance not destroying ATI's RV870, which I estimate from the specs will only perform 25-35% faster than the HD 4870.
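The "twice the bandwidth" claim rests on GDDR5 effectively transferring four bits per pin per clock versus GDDR3's two, so at an equal memory clock and bus width the throughput doubles. The 1000 MHz clock below is purely illustrative, not an actual GTX 285 or GT300 spec:

```python
# GDDR3 is double-data-rate (2 transfers per clock); GDDR5 is effectively quad-pumped
# (4 transfers per clock). At an equal memory clock on a 512-bit bus, GDDR5 doubles bandwidth.
def bandwidth_gbs(bus_bits: int, mem_clock_mhz: float, transfers_per_clock: int) -> float:
    """Peak theoretical bandwidth in GB/s for a given bus width, clock, and pumping."""
    return bus_bits / 8 * mem_clock_mhz * transfers_per_clock / 1000

clock = 1000.0  # illustrative memory clock in MHz, not a confirmed spec
gddr3 = bandwidth_gbs(512, clock, 2)
gddr5 = bandwidth_gbs(512, clock, 4)
print(gddr3, gddr5)  # 128.0 256.0 -- GDDR5 doubles throughput at the same clock
```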
 