
NVIDIA GT300 "Fermi" Detailed

L1 and L2 cache for graphics cards? I've never seen that before...
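If you're wondering what an L1/L2 hierarchy actually buys a GPU: Fermi is supposed to give each SM 64 KB that can be split between L1 and shared memory, and CUDA exposes a hint for which way to split it. A minimal sketch, assuming a generic stencil kernel (the kernel and sizes are illustrative, not from the article):

```cuda
#include <cuda_runtime.h>

// Neighbouring threads read overlapping elements -- the kind of reuse an
// L1 cache soaks up without hand-managed shared memory.
__global__ void blur1d(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i > 0 && i < n - 1)
        out[i] = (in[i - 1] + in[i] + in[i + 1]) / 3.0f;
}

int main()
{
    const int n = 1 << 20;
    float *in = nullptr, *out = nullptr;
    cudaMalloc(&in, n * sizeof(float));
    cudaMalloc(&out, n * sizeof(float));

    // Fermi's 64 KB per SM can go 48 KB L1 / 16 KB shared or the reverse;
    // this is a hint to prefer the larger L1 for this kernel.
    cudaFuncSetCacheConfig(blur1d, cudaFuncCachePreferL1);

    blur1d<<<(n + 255) / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();

    cudaFree(in);
    cudaFree(out);
    return 0;
}
```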
 
And I bet my basement would sound like a bunch of harpies getting gang banged by a roving group of banshees with 6 GT300s added to my setups.
You may want to charge people for a sauna bath in your basement as well.
 
100 draw distance and all high settings? You are delusional, mistaken, or full of shit.

My 1 GB video card can't run it; it isn't the processing power required, it is the vmem, plain and simple.

You are using a 4850, right?

A lightly OC'd GTX 280 beats it even if the 4850 has a 1 GHz core OC.
 
 
Well, that's lackluster. I mean, after the weird new design concept of the 5870, I half expected to see something with a built-in rainbow gun that can actually fire rainbows or something as retaliation. This one looks like they said 'We love the G80 so much, we're doing it again!' Thankfully I don't buy my GFX cards based on looks.
 
watch the vid and you can get a better look.

it's actually a bit different than you would think.

i like it myself.

these cards will have some serious balls to them from what i've heard in the video so far.

the card is similar to the GTX 2xx cards but slightly smaller.
 
^^^

CUDA cores = new term for shader cores for anyone who didn't catch that right away.
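To make "CUDA core" concrete: each one just crunches one thread's arithmetic per clock, so a plain kernel like the toy SAXPY below gets scheduled across all of them in 32-thread warps. A generic CUDA sketch for illustration, not anything shown in the presentation:

```cuda
#include <cuda_runtime.h>

// Toy SAXPY: y = a*x + y. The grid is carved into 32-thread warps that
// the scheduler issues across the chip's CUDA (shader) cores.
__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    float *x = nullptr, *y = nullptr;
    cudaMalloc(&x, n * sizeof(float));
    cudaMalloc(&y, n * sizeof(float));

    // 256 threads per block; enough blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    cudaFree(x);
    cudaFree(y);
    return 0;
}
```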
 

*Waits for people to bitch about nVidia renaming them...and equating it to them renaming video cards...:laugh:
 
if anyone argues that these GPUs won't be all balls, they will be teabagged once the numbers show up.
 

They can call them "little girls with crayons" if they want to. It becomes all the more humiliating when 512 of them beat 1600 "super dooper pooper troopers".
 
Bta, you're now in charge of naming all hardware tech. If you don't like the name that comes up in the news, replace it. I would love to read stories along the lines of what you posted.

Also, that HP netbook doing the HD streaming was pretty sweet (for people watching the webcast).
 
haha... Ludwig Fuchs

great name.

that Ferrari is pretty nice
 
Wow. Ludwig brought out some awesome tech. See those lighting effects?
 

It has some serious balls definitely, and some brains too. :rockout:

And I love the look, and the fact that it's shorter than the GTX 2xx cards. Especially the latter.


They had started to call them just "cores" in the last months. Anyway, CUDA is (and has always been) the name of the architecture itself. Like x86.

What I want to know is if they have shown or will show performance numbers. I know they are not going to be real (like the HD 5870 being 90% faster than the GTX 295, lol), but if they say 200% faster you know they have something. :laugh:
 
This sounds like it will destroy the ATI 5xxx series. L2 cache for a graphics card = awesome.
 

GTX 280 > 4850. 'Nuff said. My single GTX 280 overclocked ran circles around my CrossFire 4870 setup.

Btw, the 280 has 1 GB. :rolleyes:

Also, there is a patch out there that will let you run a higher resolution without massive amounts of vmem. Whether it lags or not is another matter.
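Rough back-of-the-envelope on the vmem point: the full-screen surfaces alone grow with resolution, and streamed textures plus long-draw-distance geometry pile on top. A sketch, where the surface count is an assumed ballpark rather than a measured figure:

```cuda
#include <cstdio>

// Rough framebuffer cost: width x height x 4 bytes per pixel for each
// full-screen surface (colour, depth, post-processing targets, ...).
// The surface count is an assumption for illustration, not measured.
static double framebufferMB(int w, int h, int surfaces)
{
    return (double)w * h * 4 * surfaces / (1024.0 * 1024.0);
}

int main()
{
    const int surfaces = 6; // assumed: colour + depth + 4 render targets
    printf("1680x1050: %.0f MB\n", framebufferMB(1680, 1050, surfaces)); // ~40 MB
    printf("2560x1600: %.0f MB\n", framebufferMB(2560, 1600, surfaces)); // ~94 MB
    return 0;
}
```

Even at 2560x1600 those buffers stay under ~100 MB; it's the streamed textures and geometry that actually fill a 1 GB card, which is why a resolution patch like that trades vmem for possible streaming hitches.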
 
looks like i was right about nvidia's new line of cards. now i just have to wait for them to release a midrange series...
 

I've not been watching the webcast since the beginning, but according to Fudzilla, who are there posting at the same time, Jensen has said that they will do a top-to-bottom release, including a dual-GPU card.

http://www.fudzilla.com/content/view/15758/1/

PS: I've been watching the last part, and the augmented reality on Tegra has really, really impressed me.
 
This new DX11 war is looming to be rather interesting . . .

It definitely appears that nVidia might have an ace up their sleeve against the HD5000 series . . . but, all things considered, ATI have yet to throw out any potential date for the release of the 70x2 - leading me to believe they're keeping it reined in until the 300 is out, then slapping nVidia with their dual-GPU setup, further driving nVidia's prices down . . .

I'm getting the feeling we're going to see a repeat of the HD4000/GT200 release shenanigans - either way, I guess we'll have to see how it goes.
 

Read my post above yours. Nvidia will release Fermi from top to bottom which includes the dual GPU card.
 

The numbers look good, but it's still a paper launch, and if it takes a real effort to bake these wafers without defects, it's going to be a hell of a period with this new war between ATI and Nvidia.

I'd prefer ATI, but I have my own reasons for that. Also a shame that even an i7 can't keep up with HD 5870s in CrossFire :confused: I think the ball is in either Intel's or AMD's court to produce a much stronger crunching CPU.
 