Discussion in 'News' started by btarunr, Nov 19, 2010.
Wow, that seems great. This will be war of the worlds. 5990 vs. GTX 595
I would really love to see a dual-GF110 card, but given the power requirements and thermal footprint, unfortunately it won't be possible. What is more realistic, and most probably the case here, is a card with dual fully-enabled GF104 chips (aka GF114). As usual, time will tell.
What's the amorphous gray blob on the back there?
First image: after cleaning.
Oh that's proof of what it is. Fuzzy pictures ftl.
I love all the fanaticism coming out of the woodwork again. People saying the 6990 is dead and Nvidia has won. Jeez.
Won't people learn.
Given that the GTX 580 is a vapor-chamber-cooled part that can still reach 90+ degrees and, without power throttling, guzzle 300+ watts (links below), do you really think they can get two GF110 cores on one card? Just like the GTX 295 wasn't 2x 285's (it was 2x 275's), and just like the 5970 is 2x 5870's downclocked to 5850 speed, it seems very unlikely this will be two fully operational 580 cores.
Likewise, the 6970 rumours suggest a high power guzzler, as it is a huge die (which is why it should come close to the 580). Given the leaks that point to a Cayman XT Antilles card, the 6990 will probably be a monster also.
Both cards will surely be made in very limited numbers as they'll both be ultra high end parts.
Despite how good they might be, I wouldn't buy one simply because it's dual GPU; having had a GTX 295 and 2x 5850's, I want a single card now. A GTX 580 Super Overclock from Gigabyte would suffice.
GTX 580 power draw:
GTX 580 temp:
Actually, they look completely different: power supply solution, mounting holes, power connectors (power draw), etc. It is a completely different device.
I don't want to sound like an a$$, but if a GTX 580 consumes 10% more than a 5970, imagine how much this beast will eat for breakfast!!!
I hope it consumes 5000W/s. Power to the people. lol
It doesn't look different enough, given that the Asus pic was a one-off proof-of-concept prototype with different cores. I wouldn't rule out it being a new Mars card, as it may simply be a refinement and evolution of that concept.
The power problem would be solved, in theory, if they reduced the manufacturing process down to 22nm or 18nm, or even lower. With current technology it is not physically possible to increase the performance of a GPU without increasing its power consumption. The smaller the transistors, the less power each one draws, but the denser you pack them, the hotter the chip gets... I know, it's a vicious circle...
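A back-of-envelope sketch of that vicious circle: per-transistor dynamic power is roughly C * V^2 * f, so a die shrink lowers each transistor's power draw, but packing in more transistors can still raise total power and heat density. All the numbers below are illustrative assumptions, not real process data.

```python
def dynamic_power(capacitance_f, voltage_v, freq_hz):
    """Approximate switching power of one transistor: C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * freq_hz

# Hypothetical "old" node: baseline capacitance and voltage.
old = dynamic_power(1.0e-15, 1.1, 800e6)
# Hypothetical shrink: ~30% less capacitance, slightly lower voltage.
new = dynamic_power(0.7e-15, 1.0, 800e6)

print(new < old)      # each shrunk transistor draws less power...
print(2 * new > old)  # ...but doubling the transistor count more
                      # than cancels the per-transistor saving
```

Both prints come out True with these made-up numbers, which is the whole point of the "shrink alone doesn't save you" argument above.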
actually ... power from the ppl
It reminds me of processor development: it started with a single core, then they pushed the clock, and when the max clock was reached they switched to multi-core. Looks like GPUs will go the same way: they started with single, then dual, then multiple cores on one package.
You mean 6990 vs GTX595...
Then we could match the 9800 GX2 with the GTX 295... the power circuit uses different ICs with a lower phase count. Also, Nvidia's board has only DVI port connections... It isn't an evolution, it is just a cheaper design using different components. It could be that the Mars card has more layers too...
The layout rules are the same for every card concerning wire length for the memory bus, power lines, etc., so the components are placed in more or less the same spots.
The power consumption cannot be more than the connectors allow: 2x 150 W from the 8-pin plugs plus 75 W from the PCIe slot, 375 W total.
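A quick check of that power-budget arithmetic: the PCI Express spec allows 75 W from the slot, 75 W per 6-pin connector, and 150 W per 8-pin connector. The two-8-pin loadout below is an assumption about this unreleased card, not a confirmed spec.

```python
SLOT_W = 75       # PCIe x16 slot budget
SIX_PIN_W = 75    # per 6-pin PEG connector
EIGHT_PIN_W = 150 # per 8-pin PEG connector

def board_power_limit(six_pins=0, eight_pins=0):
    """Maximum in-spec board power for a given connector loadout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_power_limit(eight_pins=2))  # 375 W, matching the post above
```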
Thank you, I thought everyone here (barring some) were insolent and insensitive fools
Meh, meh and meh. This ain't about carbon-neutral foolery, it's about an awesome graphics card. We pay our power bill, and we WILL use it the way we see fit, and what a hell of a way to use a load of it. What would be awesome is a dual-GPU card with both on the same die, like CPUs (pardon me if they exist already).
So in theory this would be about 5 fps faster than a GTX 580 SLI setup, right?
I'll believe this when it's released, though I'm def getting the feeling it's going to have a weaker core config than the GTX 580, perhaps two 570 chips, which would be kewl bc it'd show us what the 570's hardware specs are. I feel the 570 is gonna be a hella great deal, kinda like the 5850 was.
As for Nvidia taking back the single and dual GPU crown, I highly doubt it. I'm not gonna be surprised if AMD purposely helped along the leaked rumors saying their 6970 has 1500-some shaders. Sorta give 'em the benefit of the doubt, or at least assume they learned from Nvidia's mistake of dilly-dallying.
Watch it release with 1920 shaders or something and a 900 MHz core clock. That'd be tight, 'n of course ROPs 'n whatnot would get an increase. It's been said the 6970/50 chip is a lil off course from AMD's usual small-but-extremely-efficient-per-mm2 approach, so yeah, even with the bumps in specs they spoke of, this chip would be smaller than the 580.
Finally a nVidia card worth being in my rig!
I hope they call this monster, if it's ever produced, GTX 580 X2 / GTX 570 X2 / GTX 560 X2, so it's very clear to the average gamer with deep pockets which kind of GPU core is powering it.
NOT A CHANCE!!!!! :shadedshu
maybe 10% slower, my guess....
And if you ask me, there will be two 580 GPUs with 460 (560??) freqs.
X2 is (was...) an ATI trademark bro!!!:shadedshu
To me these are the same card, but with different power section layouts.
The sleeping lion woke up
But it's not true with GPUs.
AMD / ATI has chosen to develop really tight GPUs with lower power consumption and lower development costs. Nvidia makes GPUs which are larger and tougher to make, and it has had problems with TSMC turning out good wafers of chips lol.
I'd prefer a chip with low power consumption that's as efficient as possible, instead of a chip that generates more than 400 watts of heat in games.
Anyway, this is more of an answer to AMD's upcoming 6990... Nvidia knows it's going to get its ass kicked.