Thursday, January 19th 2012

Sapphire HD 7950 3 GB Pictured

Pictures of the Sapphire HD 7950 3 GB graphics card have been leaked on the Guru3D forum by user asder00. No further details or specifications were given other than "Product name: Sapphire HD7950 3G GDDR5 PCI-E HDMI/DVI-I/DUAL MINI DP OC VERSION".

As the cooler makes clear, this is a non-reference design, at least as far as cooling is concerned. The cooler is coloured black, has a modern, sharp, angular and somewhat futuristic look to it, and sports two large Sapphire-branded fans. It also has heatpipes, five of which are visible. The design suggests it should cool efficiently and run quietly. Connectivity comprises one DVI port, one full-size HDMI port and two mini-DisplayPorts, like its bigger brother the HD 7970. Moving on to the box, one can see that the card has a 384-bit bus and 3 GB of GDDR5, again like the HD 7970. The box also shows a logo with "OverClock" written boldly in white. The inevitable attractive lady on the box art in this instance wears a grey top, a partial face mask and a helmet from what looks like the wars of yesteryear, and she carries a rifle over her shoulder.
Add your own comment

29 Comments on Sapphire HD 7950 3 GB Pictured

#1
qubit
Overclocked quantum bit
Thanks to radrok for the tip. :toast:
Posted on Reply
#2
Dj-ElectriC
OC capabilities are strong with this one!!! :rockout:
Posted on Reply
#3
ViperXTR
inb4 powercolor 7950 lol
Posted on Reply
#4
Wrigleyvillain
PTFO or GTFO
Yum. Loving the high VRAM on this series.
More info:

Output: 1 x Dual-Link DVI
1 x HDMI (with 3D)
2 x Mini-DisplayPort (DisplayPort 1.2)

GPU: 900 MHz Core Clock
28 nm Chip
1792 Stream Processors

Memory: 3072 MB Size
384-bit GDDR5
5000 MHz Effective

Dimensions: 275 (L) x 115 (W) x 42 (H) mm

Software: Driver CD
SAPPHIRE TriXX Utility

Accessories: CrossFire™ Bridge Interconnect Cable
DVI to VGA Adapter
Mini-DP to DP Cable
6-pin to 4-pin Power Cable x 2
HDMI to SL-DVI Adapter
HDMI 1.4a High Speed 1.8 m Cable (Full Retail SKU only)
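From those leaked numbers, the peak memory bandwidth works out via the standard GDDR5 formula (bus width in bytes times effective data rate); this is my own arithmetic, not a figure from the leak:

```python
# Peak memory bandwidth from the leaked specs:
# bus width (bits) / 8 gives bytes per transfer, times the effective data rate.
bus_width_bits = 384
effective_clock_hz = 5000 * 10**6   # 5000 MHz effective (GDDR5)

bandwidth_bytes = (bus_width_bits // 8) * effective_clock_hz
print(bandwidth_bytes / 10**9)  # 240.0 GB/s, same ballpark as the HD 7970's 264 GB/s
```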
Posted on Reply
#5
shiny_red_cobra
If this one really has a 384-bit memory bus, then I see no reason at all to get a 7970. This is just as good imo.
Posted on Reply
#6
FierceRed
I love Sapphire.

Remember the Dawn versus Ruby tech demo virtual girl battles nVidia and ATI competed in during the early 2000s? Sapphire's box art is the only visible legacy I can count on regularly seeing these days and my inner nostalgia addicted pubescent teenager-child is grateful to them.

Surely her bust size and horribly ineffective camouflage scheme will improve my gaming enjoyment. Obviously!

T-minus 13 days until the reviews and Kepler announcement speculation...
Posted on Reply
#9
zomg
2 x 6-pin power?
Posted on Reply
#10
puma99dk|
Anyone know the price of her?

I hope that Nvidia will maybe push the price down a little x:
Posted on Reply
#11
Completely Bonkers
Now let's say you have 32bit x86 windows. 3GB is going to be a problem due to shadowed memory allocated address space for the frame buffers etc. Do the drivers let you set a max memory usage to, say, 1GB or 2GB? Or are there other workarounds for using such a great new gaming GPU in a (common) 32bit system? A 3GB GPU is at risk of completely consuming all system memory address space on 32bit leaving only a few MB left for the game!
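A rough worst-case sketch of that arithmetic (assuming the driver maps the entire 3 GB frame buffer into the 32-bit address space, plus roughly 512 MB for other MMIO regions; both are assumptions for illustration, not measured figures):

```python
# Worst-case 32-bit address-space budget (all sizes in MiB).
# Assumes the whole 3 GB frame buffer is shadowed into the 4 GiB
# address space, which is the pessimistic case described above.
ADDRESS_SPACE = 4 * 1024   # total 32-bit address space
VRAM_MAPPED   = 3 * 1024   # 3 GB card, fully mapped (assumption)
OTHER_MMIO    = 512        # rough allowance for chipset/PCI/BIOS regions

left_for_ram = ADDRESS_SPACE - VRAM_MAPPED - OTHER_MMIO
print(left_for_ram)  # 512 MiB left for system RAM in this scenario
```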
Posted on Reply
#12
Dj-ElectriC
Is there any reason for someone with an HD 7950 to use a 32-bit OS at all?
Posted on Reply
#13
dir_d
by: Completely Bonkers
Now let's say you have 32bit x86 windows. 3GB is going to be a problem due to shadowed memory allocated address space for the frame buffers etc. Do the drivers let you set a max memory usage to, say, 1GB or 2GB? Or are there other workarounds for using such a great new gaming GPU in a (common) 32bit system? A 3GB GPU is at risk of completely consuming all system memory address space on 32bit leaving only a few MB left for the game!
Shouldn't be a problem, except maybe for XP, but why would you be on XP with this card anyway?
Posted on Reply
#14
Wrigleyvillain
PTFO or GTFO
Exactly. That's ridiculous. Why don't you pair it with a Prescott P4 while you're at it?
Posted on Reply
#16
Completely Bonkers
by: overclocking101
lol old hardware fail
lol, "lol" fail.

It's a software (x86 OS) not a hardware problem!

So, basically, you guys don't know but can recommend an OS upgrade. Wow, thanks! Genius! Employees at geeksquad by any chance?
Posted on Reply
#17
Casecutter
by: FierceRed
I love Sapphire.

Remember the Dawn versus Ruby tech demo virtual girl battles nVidia and ATI competed in during the early 2000s? Sapphire's box art is the only visible legacy I can count on regularly seeing these days and my inner nostalgia addicted pubescent teenager-child is grateful to them.

Surely her bust size and horribly ineffective camouflage scheme will improve my gaming enjoyment. Obviously!

T-minus 13 days until the reviews and Kepler announcement speculation...
Ruby - the first female cyber celebrity, and she just keeps getting better/hotter with age! Who in the real world can do that? Maybe… Christie Brinkley ;)
Posted on Reply
#18
dir_d
by: Completely Bonkers
lol, "lol" fail.

It's a software (x86 OS) not a hardware problem!

So, basically, you guys don't know but can recommend an OS upgrade. Wow, thanks! Genius! Employees at geeksquad by any chance?
You won't have this problem with Windows 7 or Vista 32-bit. Those issues have been addressed. The problem you stated only applies to XP and below.
Posted on Reply
#20
FierceRed
by: dir_d
You won't have this problem with Windows 7 or Vista 32-bit. Those issues have been addressed. The problem you stated only applies to XP and below.
My understanding is that they have been addressed in DirectX 10+ only, am I not correct? (I read the sticky, yes)

Perhaps I'm crossing the streams here between game software demands on an OS software backbone, but if someone plays a DirectX 9 game on Win7/Vista 32bit, wouldn't the rendering of the DirectX 9 path necessitate the GPU memory mirroring problem that the 7900 series is being feared to perhaps experience?

Or is the GPU memory mirroring not occurring at all, even under DirectX 9 games, due to the Win7/Vista 32bit OS backbone being based on a DirectX 10+ foundation?
Posted on Reply
#21
mediasorcerer
by: Wrigleyvillain
Exactly. That's ridiculous. Why don't you pair it with a Prescott P4 while you're at it?
That's a very witty comment ^ :laugh:
Damn, if only I didn't have another big expense atm. Ah well, they're not going anywhere.
Posted on Reply
#22
Horrux
by: Completely Bonkers
Now let's say you have 32bit x86 windows. 3GB is going to be a problem due to shadowed memory allocated address space for the frame buffers etc. Do the drivers let you set a max memory usage to, say, 1GB or 2GB? Or are there other workarounds for using such a great new gaming GPU in a (common) 32bit system? A 3GB GPU is at risk of completely consuming all system memory address space on 32bit leaving only a few MB left for the game!
You don't load 3 GB of textures on a 3 GB card either. You need the room for a huge frame buffer and all the intermediate computations that lead to the final, displayed image. So in any case, you will toy around with the settings in your game until you get what works. And in any case, you will benefit from the monstrous power of this GPU, albeit with perhaps 'strange' (i.e., non-maxed-out) settings due to system RAM limitations.

Until you upgrade your OS.
Posted on Reply
#23
Prima.Vera
by: Completely Bonkers
Now let's say you have 32bit x86 windows. 3GB is going to be a problem due to shadowed memory allocated address space for the frame buffers etc. Do the drivers let you set a max memory usage to, say, 1GB or 2GB? Or are there other workarounds for using such a great new gaming GPU in a (common) 32bit system? A 3GB GPU is at risk of completely consuming all system memory address space on 32bit leaving only a few MB left for the game!
Who is using Win x32 nowadays??? I mean seriously?? I am talking about gamers not office users or such....:twitch:
Posted on Reply
#24
FierceRed
by: Prima.Vera
Who is using Win x32 nowadays??? I mean seriously?? I am talking about gamers not office users or such....:twitch:
You would be surprised... or perhaps "surprised" isn't the right word :laugh:... at the level of ignorance about x64 OSes that still exists out there. Believe it or not, many of the gaming masses believe that going to a 64-bit OS will break their ability to run 32-bit programs, or at the very least introduce issues that are non-existent on a 32-bit OS.

Rarely do they understand that the continuing evolution of technology (which seems at an ever quicker pace these days) is actually creating issues in the opposite direction; that staying on a 32bit OS where things like address space limitations and GPU frame buffer mirroring still exist are going to create much more jarring problems in the near future.

But then, the gaming masses aren't buying $700 3 GB GPUs, so I guess it balances out. It doesn't change the fact that nVidia and AMD are going to have to put "64-bit OS compatible only" stickers on their retail boxes in a few years.
Posted on Reply
#25
MxPhenom 216
Corsair Fanboy
I want this card now!:D :D :D :D :D :D
Posted on Reply