Discussion in 'News' started by btarunr, Sep 30, 2009.
L1 and L2 cache for graphics cards? I've never seen that before...
You may also want to charge people for a sauna bath in your basement as well.
You are using a 4850 right?
A lightly OC'ed GTX 280 beats it even if the 4850 has a 1GHz core OC.
Well that's lackluster. I mean after the weird new design concept of the 5870, I half expected to see something with a built in rainbow gun that can actually fire rainbows or something as retaliation. This one looks like they said 'We love the G80 so much, we're doing it again!' Thankfully I don't buy my GFX cards based on looks.
watch the vid and you can get a better look.
its actually a bit different than you would think.
i like it myself.
these cards will have some serious balls to them from what i've heard in the video so far.
the card is similar to the GTX 2xx cards, but slightly smaller
CUDA cores = new term for shader cores for anyone who didn't catch that right away.
*Waits for people to bitch about nVidia renaming them...and equating it to them renaming video cards...
if anyone argues that these GPUs won't be all balls, they will be teabagged once the numbers show up.
They can call them "little girls with crayons" if they want to. It becomes all the more humiliating when 512 of them beat 1600 "super dooper pooper troopers".
Bta, you're now in charge of naming all hardware tech. If you don't like the name that comes up in the news, replace it. I would love to read stories along the lines of what you posted.
Also, that HP netbook doing the HD streaming was pretty sweet (for people watching the webcast).
haha.... ludwig fuchs
that ferrari is pretty nice
Wow. Ludwig brought out some awesome tech. See those lighting effects?
No word about TDP?
It has some serious balls definitely, and some brains too.
And I love the look and the fact that it's shorter than GTX 2xx cards. Especially the latter.
They had started to call them just cores in recent months. Anyway, CUDA is (and has always been) the name of the architecture itself. Like x86.
What I want to know is whether they have shown or will show performance numbers. I know they are not going to be real (like the HD5870 being 90% faster than the GTX295 lol), but if they say 200% faster, you know they have something.
Indeed. Can't wait to see what it can do. Damn I wish this thing was only 30 days out.
This sounds like it will destroy the ATI 5xxx series. L2 cache for a graphics card = awesome.
GTX280 > 4850. Nuff said. My single GTX280 overclocked ran circles around my crossfire 4870 setup.
Btw, the 280 = 1GB.
Also, there is a patch out there that will allow you to run a higher resolution without having massive amounts of vmem. Whether it lags or not is another matter.
looks like i was right about nvidia's new line of cards. now i just have to wait for them to release a midrange series...
I've not been watching the webcast from the beginning, but according to Fudzilla, who are there and posting live, Jensen has said that they will do a top-to-bottom release, including a dual-GPU card.
PS: I've been watching the last part, and the augmented reality demo on Tegra has really, really impressed me.
This new DX11 war is looming to be rather interesting . . .
It definitely appears that nVidia might have an ace up their sleeve against the HD5000 series . . . but, all things considered, ATI have yet to throw out any potential date for the release of the 70x2 - leading me to believe they're holding it back until the 300 is out, then slapping nVidia with their dual-GPU setup, further driving nVidia's prices down . . .
I'm getting the feeling we're going to see a repeat of the HD4000/GT200 release shenanigans - either way, I guess we'll have to see how it goes.
Read my post above yours. Nvidia will release Fermi from top to bottom which includes the dual GPU card.
The numbers look good, but it's still a paper launch, and if it really takes a lot of effort to bake these wafers without errors, it's going to be a hell of a period in this new war between ATI and Nvidia.
I'd prefer ATI, but I have my own reasons for that. Also a shame that even an i7 can't keep up with an HD5870 in Crossfire. I think the ball is in either Intel's or AMD's court to produce a much stronger crunching CPU.