Wednesday, April 17th 2013

AMD Radeon HD 7990 Launch Date Revealed

Market launch of AMD's Radeon HD 7990 "Malta" dual-GPU graphics card is less than a week away, according to an OCaholic report. Sources told the publication that AMD plans to launch the card on the 24th of April, 2013, and that reviews should already be underway. The Radeon HD 7990 is the company's flagship graphics card, featuring a pair of 28 nm "Tahiti" GPUs. According to specifications derived from older reports, it packs a total of 4,096 stream processors and 6 GB of GDDR5 memory across two 384-bit wide memory interfaces. What sets this card apart from the HD 7990 "New Zealand" boards launched last year by AMD's partners is the power optimization AMD put into it, letting the card draw power from "just" two 8-pin PCIe power connectors and make do with a dual-slot cooling solution.

Source: OCaholic.ch

96 Comments on AMD Radeon HD 7990 Launch Date Revealed

#1
blibba
by: EarthDog
Second card for crossfire does NOTHING
It certainly isn't true that a second card does nothing. I can understand people's strong reaction when they see things like this, though.

They're not universal issues, but they are alarming for such expensive and popular hardware. (Please don't take me as suggesting that SLI is issue-free, btw - the 690 doesn't exactly flatter itself in these same graphs.)
Posted on Reply
#2
Eagleye
by: adulaamin
yeah I'm brainwashing readers to believe AMD drivers are less than spectacular when it comes to crossfire. I guess it's not working since it's only pissing some people off. :p
Should we also stop recommending ALL single Nvidia cards because AMD is superior to Nvidia in single cards?

Edit: Crossfire and SLI are game-dependent anyway. Some games run better with Crossfire and some with SLI.
Posted on Reply
#3
blibba
by: Eagleye
Should we also stop recommending ALL single Nvidia cards because AMD is superior to Nvidia in single cards?
Well, if you pop over into Nvidia news threads you'll find that there are plenty of people already doing that, and it's similarly poorly justified. Such is the internet.
Posted on Reply
#4
EarthDog
I hear ya blibba... and it is a REAL issue. It's just that not everyone is affected. Some games don't show those severe frame time problems. Yes, they are AAA titles and the problem should be fixed, no doubt. The kicker is this... can you SEE the problem even though it's there? It depends on the title really.
Posted on Reply
#5
blibba
by: EarthDog
The kicker is this... can you SEE the problem even though it's there? It depends on the title really.
And on the person!

Personally, as someone who plays very few AAA titles, these issues in the presumably best-tested games really put me off multi-GPU altogether.
Posted on Reply
#6
EarthDog
And I don't blame you for being concerned. What I take exception to is the tone/wording of the thread here and several articles on it just blowing it out of proportion.
This is my message to you guys: frametime measurements have been blown out of proportion. What you see in these charts sometimes looks alarming, with massive spikes, whilst your average end-user will never ever even see it on screen.
http://www.guru3d.com/articles_pages/geforce_gtx_650_ti_boost_review,8.html

Oh well. :)
Posted on Reply
#7
the54thvoid
I really hope AMD nails this card and has release drivers to match it. That would spur Nvidia to more action - they have in the past reacted to AMD driver leaps with improvements of their own.

I'd be hard pressed, though, to go back to dual GPU for gaming. Even with the latency issues being addressed, some titles don't work in Crossfire, and that can only be fixed at the developer level. Maybe with AMD's GPUs in the next-gen consoles, though, this will be the start of a software revolution for them. They have the hardware, and if the power optimisations pay off then they'll have done an awesome job.

But just to add, I stopped using 7970 Crossfire because of the things being 'debated' above. As pretty as this card might be, it needs to match Nvidia in smoothness - not just beat it in fps.
Posted on Reply
#8
Mathragh
Perhaps AMD's focus with Crossfire this time around hasn't been on improving the frame times for each game individually, but rather on the mechanism itself. They could've implemented something Nvidia is also using; namely frame buffering. This would probably be a lot easier to implement, hence the fairly quick release, and would work for all games/apps currently suffering from spiky frametimes.

Just a thought!
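For what it's worth, the frame-metering idea being floated here can be sketched in a few lines. This is purely a toy illustration, not AMD's or Nvidia's actual driver logic; the function name, the target interval, and the millisecond timestamps are all invented for the example.

```python
# Toy sketch of frame metering: instead of presenting each frame the
# instant it finishes rendering, the driver briefly holds frames so the
# on-screen intervals even out. Timestamps are in milliseconds.

def meter_frames(render_done, target_interval):
    """Return presentation times that never come sooner than
    target_interval after the previously presented frame."""
    presented = []
    last = None
    for t in render_done:
        if last is None or t - last >= target_interval:
            present = t                       # frame is late enough: show it now
        else:
            present = last + target_interval  # hold it back to smooth pacing
        presented.append(present)
        last = present
    return presented

# Uneven AFR-style render completions: 0, 5, 33, 38, 66 ms...
times = meter_frames([0, 5, 33, 38, 66], target_interval=16)
print(times)  # -> [0, 16, 33, 49, 66]: intervals of 16-17 ms instead of 5-28 ms
```

Even the toy shows the trade-off: held-back frames spend extra time in a buffer, which is exactly the input-lag cost raised later in this thread.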
Posted on Reply
#9
adulaamin
by: Eagleye
AMD is Superior in single cards
I'd bet the few people who own a Titan would say otherwise. :p

by: Eagleye
Should we also stop recommending ALL single nvidia cards?
We all have different ideas on what we think IS better. Price/performance, performance/watt, pure performance, games bundled. Recommend what you honestly think is better. :) If I have a friend who has deep pockets and doesn't care about price, I'd recommend he get a Titan and be done with it. If I had the money I would too.
Posted on Reply
#10
cadaveca
My name is Dave
by: Mathragh
They could've implemented something Nvidia is also using; namely frame buffering. This would probably be a lot easier to implement, hence the fairly quick release, and would work for all games/apps currently suffering from spiky frametimes.
Maybe patents prevent this. That would make a ton of sense, actually. If NVidia patented the tech they used (and it is very likely they did), AMD would have to develop a whole new method of doing their frame-time balancing, and that would easily, in my books, explain why it has taken so long for them to produce proper Crossfire support.
Posted on Reply
#11
Xzibit
by: cadaveca
Maybe patents prevent this. That would make a ton of sense, actually. If NVidia patented the tech they used (and it is very likely they did), AMD would have to develop a whole new method of doing their frame-time balancing, and that would easily, in my books, explain why it has taken so long for them to produce proper Crossfire support.
That would be odd, since the data coming out of the game engine never lines up properly between the two vendors; then you introduce mischief from both hardware and software implementations to rectify an issue in the pipeline that might not even be your problem to begin with.

Both should stick to synchronizing additional cards more efficiently, but as for imposing solutions to mask an inherent problem in the PC gaming pipeline, I'd say no. If anything, the FCAT tests show that method scales poorly at higher resolutions.
Posted on Reply
#12
cadaveca
My name is Dave
by: Xzibit
That would be odd, since the data coming out of the game engine never lines up properly between the two vendors; then you introduce mischief from both hardware and software implementations to rectify an issue in the pipeline that might not even be your problem to begin with.
Blah, blah, blah. You haven't read much of the info that has come out recently on the subject, have you? There is no such situation. Game devs do not make drivers. It's all the hardware's fault. Really. DirectX is "supposed" to offer an open rendering environment that makes this not an issue. That's why they got rid of "Cap Bits". Oh, what's that you say? CUDA? Nah, it doesn't limit that in any way, at all. Why did you ask?

Either way, there must have been a specific reason that AMD has these issues, and NVidia does not. I don't accept the answer they have given of "we are too stupid to look at that". I've been bitching to them directly about their issues for years. They made a choice to ignore it, and all issues now are 1000% the fault of hardware vendors. There is no denying that. The past 10 years have proven it.

But I guess AMD now seems to think that the "7990" is a viable enough option to want to release this card. I'm very interested myself, but I remember the 5970 launch, and 5870 Crossfire not getting proper Eyefinity Crossfire drivers like the 5970 had, for months. Not surprisingly, back then AMD had cursor corruption issues, and they do now too. AMD has already said July for that fix, and that's about the same time frame as with the 5870 Crossfire issues.

I really care about products like this, because I'm an Eyefinity user, and if this card will fix the issues I have, I'll sell my current cards and get one.
Posted on Reply
#13
Casecutter
by: adulaamin
I hope they have working crossfire drivers by then. If not, there will be a lot of complaints. :)
by: jigar2speed
spinning every single AMD product's news into AMD driver suck news, honestly, i am getting sick of this, trust me a lot of regulars are as well.
I think adulaamin provided a pertinent statement in the case of this dual-chip product, because it essentially needs Crossfire working well to be outstanding. If you've followed recent reviews, Crossfire on the GCN architecture hasn't been a strong suit for AMD lately, and latency has been brought up recently. I'm still not sure such FRAPS measurements early in the pipeline are a true indicator of the actual smoothness you eyeball on the screen. Hopefully we'll see very soon.
Posted on Reply
#14
Xzibit
I guess what I was saying wasn't simple enough.

I was saying that if Nvidia introduced a hardware solution like you were implying, the FCAT tests at PCPer have shown that scaling above 1080p is horrid. So I wouldn't call that a solution.

AMD could provide a similar solution, but then you'd have both AMD and Nvidia multi-GPU configurations being equally as awful beyond 1080p.
I'm pretty sure it's a niche of a niche that buys the higher-end GPUs and goes multi-GPU just to stick to 1080p.

AMD would be wise to look at how badly Nvidia is scaling in that department when the implication is that they're doing it "properly", so to speak.
They should take note, as should Nvidia themselves, and improve rather than match a less than preferable performance outcome for those using multi-GPU solutions or thinking about going multi-GPU in the future.
Posted on Reply
#15
Mathragh
by: Casecutter
I think adulaamin provided a pertinent statement in the case of this dual-chip product, because it essentially needs Crossfire working well to be outstanding. If you've followed recent reviews, Crossfire on the GCN architecture hasn't been a strong suit for AMD lately, and latency has been brought up recently. I'm still not sure such FRAPS measurements early in the pipeline are a true indicator of the actual smoothness you eyeball on the screen. Hopefully we'll see very soon.
The discrepancy between FRAPS measurements and the frame delivery to the screen itself was indeed there, shown by both AnandTech and TechReport. They also showed that Nvidia used some "secret sauce" (read: frame buffering) that caused frames delivered unevenly by the game engine to be smoothed out by the driver/hardware layer in order to deliver smooth frames on screen. This isn't THE solution to the problem, as it causes other things, like input lag, to increase, because frames will sit in the buffer for a longer time, and it will probably not remove all of the juddery movement on screen, but at least the frame delivery will be a bit smoother.

Some kind of buffering would, in my opinion, be the easiest kind of "cheap fix" they could do that would improve things, especially for Crossfire. However, as cadaveca also said, they might have to find a solution that is not exactly the same as Nvidia's, as a result of the patent issues that would otherwise arise (I don't have a lot of knowledge about patents myself, so I won't argue about that).

In any case, I do believe that they at least made some progress with the drivers for this card, for a lot of reasons, some stated in a previous post of mine.
Posted on Reply
#16
cadaveca
My name is Dave
by: Xzibit
I guess what I was saying wasn't simple enough.

I was saying that if Nvidia introduced a hardware solution like you were implying, the FCAT tests at PCPer have shown that scaling above 1080p is horrid. So I wouldn't call that a solution.

AMD could provide a similar solution, but then you'd have both AMD and Nvidia multi-GPU configurations being equally as awful beyond 1080p.
I'm pretty sure it's a niche of a niche that buys the higher-end GPUs and goes multi-GPU just to stick to 1080p.

AMD would be wise to look at how badly Nvidia is scaling in that department when the implication is that they're doing it "properly", so to speak.
They should take note, as should Nvidia themselves, and improve rather than match a less than preferable performance outcome for those using multi-GPU solutions or thinking about going multi-GPU in the future.
Blah blah blah...

A quick Patent search will bring up any number of patents that support what I've posted. That's all. The rest is just conjecture. ;) I really don't care about what NVidia is doing, nor is it on topic.


I just want properly working Crossfire and Eyefinity. Taking pictures right now of my cards to list so I can buy a 7990. Need a 7950? :roll:
Posted on Reply
#18
arbiter
I wouldn't say it's AMD cheating so much as everyone relying on FRAPS's fps numbers without looking at the video output. FRAPS pulls its fps numbers before the frame gets to the GPU, so if a frame is slow to render or gets dropped for whatever reason, both cards still get it counted. Nvidia has hardware in their cards that prevents it. Which card is faster? It's pretty much a dead heat when it's one card vs. one card, not SLI/CF or a dual-GPU card. It's pretty much up in the air.
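Arbiter's point about where FRAPS hooks in can be illustrated with a toy calculation. Nothing below is real measurement data; the timestamps and the `fps` helper are invented for the example, and real tools like FRAPS and FCAT are far more involved.

```python
# FRAPS-style counting happens where the game submits frames, before the
# GPU, so frames dropped later in the pipeline still count. A tool that
# watches the actual video output (like FCAT) only sees displayed frames.

def fps(frame_timestamps_ms):
    """Average fps over the span covered by the timestamps."""
    span = frame_timestamps_ms[-1] - frame_timestamps_ms[0]
    return (len(frame_timestamps_ms) - 1) / span * 1000.0

# The game submits a frame every 10 ms over one second -> 100 fps on paper...
submitted = list(range(0, 1001, 10))

# ...but suppose every other frame is dropped before reaching the display:
displayed = submitted[::2]   # only every second frame is actually shown

print(round(fps(submitted)))  # -> 100  (what submit-side counting reports)
print(round(fps(displayed)))  # -> 50   (what actually reaches the screen)
```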
Posted on Reply
#19
EarthDog
Here is the thing. If you test with FCAT, a lot of the frame times and spikes you see, you won't notice as a user. It's when the spikes hit above 50 ms or so that a user can notice them. FRAPS can catch this...
Posted on Reply
#21
EarthDog
by: Slizzo
Soft launch?
Is the NDA up yet...? Awfully hard to judge availability when the NDA isn't up... :toast:
Posted on Reply
#22
arbiter
by: EarthDog
Here is the thing. If you test with FCAT, a lot of the frame times and spikes you see, you won't notice as a user. It's when the spikes hit above 50 ms or so that a user can notice them. FRAPS can catch this...
You will notice it as low as 30 ms; a 33 ms frame is only 30 fps. You might not notice it if it happens, say, once every 10+ seconds, but if it happens every other second you will notice a slight bit of stutter.
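For anyone checking arbiter's arithmetic: frame time and instantaneous fps are just reciprocals of each other, and a "spike" is simply a frame that takes much longer than its neighbours. The trace and the 33 ms threshold below are illustrative only, not measured data.

```python
# Frame time (ms) <-> instantaneous fps, plus a simple spike detector.

def ms_to_fps(ms):
    """Instantaneous fps implied by a single frame time in milliseconds."""
    return 1000.0 / ms

def spikes(frame_times_ms, threshold_ms=33.3):
    """Indices of frames slower than the threshold (~30 fps)."""
    return [i for i, t in enumerate(frame_times_ms) if t > threshold_ms]

print(round(ms_to_fps(33.3)))   # -> 30  (a 33 ms frame is roughly 30 fps)
print(round(ms_to_fps(16.7)))   # -> 60

# A mostly smooth ~60 fps run with two stutter frames at indices 2 and 5:
trace = [16.7, 16.9, 50.0, 16.5, 17.1, 41.0, 16.8]
print(spikes(trace))            # -> [2, 5]
```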
Posted on Reply
#23
EarthDog
That isn't what other articles say...

Articles seem to vary... LOL!
Posted on Reply
#24
march10k
Meh... a minor but important reason that I shelled out for a 7970 when it was brand new is future-proofing. I'm skipping this generation... and the next three! :toast:
Posted on Reply
#25
theoneandonlymrk
by: march10k
Meh... a minor but important reason that I shelled out for a 7970 when it was brand new is future-proofing. I'm skipping this generation... and the next three! :toast:
Three might be optimistic, dude. I am supposed to be waiting on the next generation, but these damn game promos and the 2-3 GB memory upgrade (over my 1 GB cards) have me twitching.

Ahh, I hear the urge to upgrade screaming at me now... no... can't.
Posted on Reply