
GT300 to Boast Around 256 GB/s Memory Bandwidth

Especially since this might be on 55nm or even 40nm, which would make yields even lower, but the same thing happened with the GTX280 and 260.

LOL, 8800GTX/Ultra were 80nm, the G92s (8800GT/9800GTX) were 65nm, then the 9800GTX+/GTS250s and the later GTX260/275/280/285 and 295 were all 55nm. They shrank to make yields bigger, so when the GT300 cards come out at 40nm the yields will be even bigger and they will produce less heat.
 
sounds fishy
well if you remember the R600 days, when they put in a 512-bit ringbus controller and it flopped :(
GT300 will cost an arm, a leg and a head for us mortals. i bet NVIDIA will go the same route as their current GT200, and ATI will really beat them with their sleek, fast and CHEAP cards

I'm just waiting for the first ATI 40nm quad-GPU card to come out, so I can buy one and put it on my shelf next to my museum-quality Voodoo 5 6000.

And then the cycle will be complete. Muhuhahaha
 
LOL, 8800GTX/Ultra were 80nm, the G92s (8800GT/9800GTX) were 65nm, then the 9800GTX+/GTS250s and the later GTX260/275/280/285 and 295 were all 55nm. They shrank to make yields bigger, so when the GT300 cards come out at 40nm the yields will be even bigger and they will produce less heat.

8800GTX was 90nm
GTX280 was 65nm

Indeed, yields improve as a process matures, but when a new fabrication process first comes out it usually has low yields until it is perfected. LOL
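The back-and-forth about shrinks and yields can be made concrete with the standard Poisson yield model. A quick sketch; the defect densities below are illustrative round numbers, not actual foundry figures:

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Fraction of dies expected to have zero defects (Poisson yield model)."""
    defects_per_mm2 = defects_per_cm2 / 100.0
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Illustrative only: a large ~500 mm^2 die on an immature process
# (0.5 defects/cm^2) vs. the same process once it has matured (0.1 defects/cm^2).
print(poisson_yield(500, 0.5))  # ~0.08 -> most dies are defective
print(poisson_yield(500, 0.1))  # ~0.61 -> yield recovers as defect density falls
```

This is why both posters are partly right: a shrink eventually improves yields per wafer, but a brand-new node starts out with a much higher defect density, so big dies on a fresh process are the worst case.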
 
I'm just waiting for the first ATI 40nm quad-GPU card to come out, so I can buy one and put it on my shelf next to my museum-quality Voodoo 5 6000.

And then the cycle will be complete. Muhuhahaha

Did you know that some devs still put out fresh drivers for the Voodoo 2 and higher, I believe, as of 2008/2009?
 
lol it cracks me up, every time ATI does something decent nvidia just thinks, "NEED BIGGER CORE, MORE PROCESSORS GRRR" lol. though i really doubt those specifications, doubling the SPUs from one gen to the next, i mean it's possible but i would think it'd take longer than a year to fit twice as many SPUs in the same die size even with a smaller fab process, not including the other mumbo jumbo they'll be adding. And you can't say 512-bit will be a flop, they already proved its worth with the GTX 280/285. I figured they'd have gone with something like 384 SPUs or so, and then do a die shrink and have a smaller die size this time around, but if these rumors are true i guess nvidia wants to stomp AMD instead of having competition. i don't see the speculated 1200 SPUs on the RV870 coming close to matching the performance of the GT300's 256 SPUs, if those speculations turn out true.
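Whether twice the shader count can fit in a similar die area comes down to how much density a node shrink buys. A quick sketch of the ideal case (real designs scale less than this, since analog, I/O and SRAM shrink worse than logic):

```python
def ideal_density_scale(old_nm, new_nm):
    """Ideal area scaling factor: feature area shrinks with the square of the node."""
    return (old_nm / new_nm) ** 2

# GT200b (55nm) -> rumored GT300 (40nm): ideal density gain of roughly 1.89x,
# so ~2x the shader count in a similar die area is at least plausible on paper.
print(round(ideal_density_scale(55, 40), 2))  # 1.89
```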
 
I dont belive that guy , 90% of what he types is made up IMO.
Now I dont think we will see the GT300 any time soon, but even if it is mid 2010 it will be a sweet part no dought.
 
Must be a spell check drout... ;)
 
Well I hope the GT300 lives up to the hype. I've been very, very, very content with my GTX260 since July 2008, and it just seems to get better with age. It folds like a champ, runs all my games maxed out, runs at decent temps, and I will probably grab a similar version of the next-gen card. It depends on what I feel is the best bang for the buck at the time; I was initially torn between an HD4870 and a GTX260 last summer, and I still feel I made the right decision to this day for many reasons that suited my needs and preferences. I do hope they keep affordability in mind though, the GTX series started at a hefty price tag that wasn't worth the performance. I jumped on at a good time with a good deal + MIR for a GTX260 at about $230 shipped back then, when the average was $300+ shipped.

The GT300 GPU could be very promising, and I hope it succeeds. I also hope ATI brings some seriously good competition this next round too, gotta have it! :D
 
Been waiting for this card for some time, bring it on!!! :rockout:
 
Wooooooo, another bloodthirsty powerhouse. When will the computer industry move to lower-power, similar-performance components?
 
Will someone tell me if that says "PWN" on the top of Wiz's little blue box? The fate of the universe depends on the answer!
 
it says PWN!
 
sounds fishy
well if you remember the R600 days, when they put in a 512-bit ringbus controller and it flopped :(
GT300 will cost an arm, a leg and a head for us mortals. i bet NVIDIA will go the same route as their current GT200, and ATI will really beat them with their sleek, fast and CHEAP cards

NVIDIA already has a decent 448-bit / 512-bit GDDR3 architecture that performs well. Without competition, yes, it will cost an arm and a leg. The GTX 280 started at $650 and crash-landed at $300 in less than a year, while the 8800 GTX remained above the $500 mark for over a year.
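The 256 GB/s headline figure is simple arithmetic: bus width in bytes times the effective data rate. A quick sketch; the GT300 numbers are the rumored ones, not confirmed specs:

```python
def memory_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth in GB/s: (bus width in bytes) x (effective data rate in GHz)."""
    return (bus_width_bits / 8) * (effective_clock_mhz / 1000)

# Shipping part for comparison: GTX 280, 512-bit GDDR3 at 2214 MHz effective.
print(memory_bandwidth_gb_s(512, 2214))  # ~141.7 GB/s
# Rumored GT300: a 512-bit bus would need ~4000 MHz effective to hit 256 GB/s.
print(memory_bandwidth_gb_s(512, 4000))  # 256.0 GB/s
```

The same 256 GB/s could instead come from a narrower bus at a higher data rate, which is why the memory type matters as much as the bus width in these rumors.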
 
That's awesome. I wish I had a computer component that said "PWN".

it's photochopped. the box is semi-transparent, i wanted to add some dramatization
 
it is hard to believe all these specifications in one card, i'm waiting for a GPU-Z read
 
at this point in time gpuz can't read anything off the GT300, so any screenshots are fake until you see in the gpuz changelogs that GT300 support was added
 
Back before AMD bought ATI, ATI was working on a massive processor that should be able to do all sorts of things; they had their own plans for a 'CUDA land' etc. Today they are more and more task-optimized, which gives them the competitive edge. They support the standards, like DirectX, OpenCL, Havok etc., and no 'CUDA land'. The professional cards like FirePro, -GL, -Stream, whatever, are for the professionals, optimized for that, and they pay a premium for that. With the first rumors about GT300, another castle twice the die size of the RV870, they are asking for trouble. IMHO CUDA is BS in the mainstream, good as an e-penis and nothing more. I have seen videos of what it can do, and I'm impressed, but it could become NVIDIA's Titanic. I don't think ATI is going to lay down or raise their hands in submission, now that they have got them by the balls. There is a rumor that the last card this year is going to be made at GloFo on 32nm, as an MCM, and that it's going to raise the bar, but it's probably just wishful thinking.:laugh:
 
eh, i'd love to know where your information comes from and what video you've seen of the GT300
 