Friday, December 23rd 2011

AMD Dual-GPU Radeon HD 7990 to Launch in Q1 2012, Packs 6 GB Memory

Even 12 months ago, an Intel Nehalem-powered gaming PC with 6 GB of system memory was considered high-end. Now there is already talk of a graphics card with that much memory. On Thursday this week, AMD launched its Radeon HD 7970 graphics card, which features its newest 28 nm "Tahiti" GPU and 3 GB of GDDR5 memory across a 384-bit wide memory interface. All along, AMD had plans for a dual-GPU graphics card that uses two of these GPUs to deliver a CrossFire-on-a-stick solution, codenamed "New Zealand". We are now learning that "New Zealand" will carry the intuitive-sounding market name Radeon HD 7990, and that it is headed for a Q1 2012 launch.

This means the Radeon HD 7990 should arrive before April 2012. Tests show that "Tahiti" has superior energy efficiency compared to the previous-generation "Cayman" GPU, even as it increases performance. From a technical standpoint, a graphics card featuring two of these Tahiti GPUs, running at specifications matching those of the single-GPU HD 7970, looks workable. Hence the talk of 6 GB of total graphics memory (3 GB per GPU).

One can also expect the fruition of AMD's new ZeroCore technology. This technology powers the GPU down to zero draw when the monitor is blanked (idling), and in CrossFire setups it completely powers down every GPU other than the primary one when the system is not running graphics-heavy applications. This means that the idle, desktop, and Blu-ray playback power draw of the HD 7990 should be nearly equal to that of the HD 7970, which is already impressive.


Source: Softpedia

74 Comments on AMD Dual-GPU Radeon HD 7990 to Launch in Q1 2012, Packs 6 GB Memory

#1
dieterd
Eh, the $1,000 mark cracked like nothing, and I don't think NVIDIA will be the one to compete on pricing. But I see none of you really care about prices, so Merry X-mas. I hope we all get only cash under the X-mas tree, because the 28 nm generation has started with a lot of appetite for it!
#2
the54thvoid
by: TheMailMan78
That thing will cost more than a supercharger on my 4.6 Mustang. I'll pass.
by: Trackr

Besides, they will have to make the price competitive to stay above the HD 6990, so the extra RAM likely won't mean a higher price.
Yeah, price. If the 7970 is going to be £450, the dual card will be prohibitively expensive. I simply cannot justify buying a card that costs £700 plus (which surely it will be?) even though I've bought a £480 CPU.

by: claylomax
Metro 2033, Stalker CoP, Crysis 2, Cryostasis, Battlefield 3, etc ... they max out 1.5GB playing at 1900 x 1200 with 4xAA on my system.
Is that a Vista thing? I run BF3 at max AA at that res and it doesn't go over that.
#3
TheMailMan78
Banstick Dummy
by: the54thvoid
Yeah, price. If the 7970 is going to be £450, the dual card will be prohibitively expensive. I simply cannot justify buying a card that costs £700 plus (which surely it will be?) even though I've bought a £480 CPU.

by: the54thvoid
Is that a Vista thing? I run BF3 at max AA at that res and it doesn't go over that.
It doesn't go over that because that's all you have.
#4
the54thvoid
by: TheMailMan78
It doesn't go over that because that's all you have.
I use Afterburner and see the mem usage on my G19 in game. It doesn't come close to that. Most I've seen in game (Metro 2033, I think) was 1400 or something. I'm aware my gfx card is 1.5GB.

In fact, I'll go check now :)
#5
TheMailMan78
Banstick Dummy
by: the54thvoid
I use Afterburner and see the mem usage on my G19 in game. It doesn't come close to that. Most I've seen in game (Metro 2033, I think) was 1400 or something. I'm aware my gfx card is 1.5GB.

In fact, I'll go check now :)
How many displays?
#6
Crap Daddy
BF3, and I think all these games that need lots of VRAM, use all that's available. On my card, BF3 all maxed out shows 1260-something MB used, so if I had more it would use more.
#7
dorsetknob
It's an incestuous relationship between hardware manufacturers and the software houses:
manufacturers bring equipment to the market,
software houses write software that pushes the hardware to the limit,
so manufacturers revise and improve their hardware,
we all upgrade to the new and latest-spec hardware, and so the software houses produce more demanding software to take advantage, etc. On and on the cycle goes.
#8
the54thvoid
Just one. Just checked. I'm getting 1463MB of usage with ultra settings, 4xMSAA and 16x AF. Motion Blur and Ambient Occlusion off. And that was in a firefight.

Would it climb to 1536MB if fully used, or is there some left as a buffer, i.e. is that my card being maxed?
#9
Mistral
I can just imagine some people trying to play with that card on Windows XP :banghead:

Though Photoshop will surely love the obscene amount of memory...
#10
Dj-ElectriC
I don't like the whole dual-GPU memory amount marketing...
Because it's simply half true. They can market it just because the chips are there,
but people might think they can use up to 6GB of VRAM.
#11
TheMailMan78
Banstick Dummy
by: the54thvoid
Just one. Just checked. I'm getting 1463MB of usage with ultra settings, 4xMSAA and 16x AF. Motion Blur and Ambient Occlusion off. And that was in a firefight.

Would it climb to 1536MB if fully used, or is there some left as a buffer, i.e. is that my card being maxed?
Run three monitors with that 1.5 GB of memory. ;) It's hardly pushing one now, as you just saw.
by: Mistral
I can just imagine some people trying to play with that card on Windows XP :banghead:

Though Photoshop will surely love the obscene amount of memory...
It will make no difference in Photoshop. Photoshop uses OpenGL. Different ballgame altogether. Most IGPs are all Photoshop needs for its draw.
#12
the54thvoid
by: TheMailMan78
Run three monitors with that 1.5 GB of memory. ;) Its hardly pushing one now as you just saw.
Nvidia, three monitors? Maybe they'll sort that out by Kepler? :laugh:

(I know some AIBs have done it...)
#13
NdMk2o1o
Am I right in saying the recent benches showed the 7970 between 20-30% faster than the GTX 580?
I would expect performance to increase by at least 10% across the board with mature drivers, meaning 30-40% faster on all counts than a GTX 580.
Slap two of these chips together without lowering core clocks and shaders etc., à la GTX 590, and this dual-GPU card should be a monster and stomp the 590 by a LOT.
#14
radrok
Should also run very, very cool compared to the HD 6990; hell, sometimes the top GPU runs at 92°C -.-
#15
Super XP
by: ensabrenoir
90% of our rigs aren't fully utilized/overkill.... This will fit right in. We all need to push for better software though
Software is greatly holding back hardware. Something needs to change, something in terms of a nice kick in the arse for those software developers to step it up a notch.

This card is nuts :laugh:
#16
radrok
That's not entirely true; it's because the resolution standard is stuck at 1080p. Once you go over, you have the need for such graphics cards.
#17
Dent1
by: dorsetknob
It's an incestuous relationship between hardware manufacturers and the software houses:
manufacturers bring equipment to the market,
software houses write software that pushes the hardware to the limit,
so manufacturers revise and improve their hardware,
we all upgrade to the new and latest-spec hardware, and so the software houses produce more demanding software to take advantage, etc. On and on the cycle goes.
I disagree, it's the opposite: it's because software isn't optimised for the hardware, so the hardware is inefficient and hence runs the software with pure brute force.

If what you said was correct, why can a sub-$100 CPU and sub-$150 GPU run any game today at max settings?
#18
pantherx12
by: Velvet Wafer
Finally, 8192x shadows in Skyrim are possible without major lag! :D
Ha ha not whilst the shadows are run by the cpu! :laugh:
#19
NdMk2o1o
by: Dent1

If what you said was correct, why can a sub $100 CPU and sub $150 GPU run any game today at max settings?
BF3, Metro 2033 and Skyrim can all be run on max with a sub-$100 CPU and sub-$150 GPU? I think not. There are a few more such games, though not many, so we are at a transition period whereby the game devs need to start making more games like the aforementioned to take advantage of the current hardware. Your statement is untrue; the mid-range can't max the top games, at least not the few I mentioned and others. There will be another Far Cry/Crysis/Metro 2033 out soon enough, and so it continues.
#20
Super XP
by: radrok
That's not entirely true; it's because the resolution standard is stuck at 1080p. Once you go over, you have the need for such graphics cards.
Absolutely not. I've got an 8-core CPU running at 4.40GHz and an HD 6970, and I still cannot run games like Skyrim and Metro 2033 at MAX PQ.
Like somebody already said, the software needs to catch up to the hardware. Anybody claiming otherwise is gaming too much in their sleep.
#21
Dent1
by: NdMk2o1o
BF3, Metro 2033 and Skyrim can all be run on max with a sub-$100 CPU and sub-$150 GPU? I think not. There are a few more such games, though not many, so we are at a transition period whereby the game devs need to start making more games like the aforementioned to take advantage of the current hardware. Your statement is untrue; the mid-range can't max the top games, at least not the few I mentioned and others. There will be another Far Cry/Crysis/Metro 2033 out soon enough, and so it continues.
Well, yes. Bought my CPU over 2 years ago for £75. Bought my GPU for £79.99; actually bought two for CF. Runs all those games at max settings, no problem. A friend has the same setup with a larger monitor and still runs maxed out.
#22
pantherx12
by: Super XP
Absolutely not. I've got an 8-core CPU running at 4.40GHz and an HD 6970, and I still cannot run games like Skyrim and Metro 2033 at MAX PQ.
Like somebody already said, the software needs to catch up to the hardware. Anybody claiming otherwise is gaming too much in their sleep.
Should get the new SKSE performance plugin, dude; you'll be able to max out Skyrim easily :toast:

(I can max it out and then some, and I'm at 4.3 and using a 6870)
#23
mechtech
by: Esse
There's my little country :)
You're telling me; the province of Ontario is larger than that land mass.

And I am more interested to see the 7850!!!
#24
Super XP
by: pantherx12
Should get the new SKSE performance plugin, dude; you'll be able to max out Skyrim easily :toast:

(I can max it out and then some, and I'm at 4.3 and using a 6870)
Where do we get this, bro? :confused:
#25
radrok
by: Super XP
Absolutely not, I got an 8-Core CPU running at 4.40GHz and a HD 6970 and I still cannot run games like Skyrim and Metro 2033 at MAX PQ.
Like somebody already said, the software needs to catch up to the hardware. Anybody claiming otherwise is gaming too much in their sleep.
Until consoles get a generation refresh, the only way to increase performance is to buy overkill hardware. While I agree with you on optimization, the need for faster hardware is still there, as I said previously.

3DMark 2011, although not a game, is a fine example of how heavy a full DirectX 11 game could be on GPUs; if all developers implemented all the DX11 libraries, we'd need more than dual Tahiti to cope with the demand for hardware.
Metro 2033 itself is a GPU hog; sure, some games are CPU-bound, but that's down to development flaws.

And a Bulldozer is not an 8-core CPU, because that's like saying mine is a 12-core CPU... it doesn't work that way.