Wednesday, September 10th 2014

Galaxy GeForce GTX 970 Pictured, Specs Confirmed, Early Benchmarks Surface

Here are some of the first pictures of an AIC partner-branded NVIDIA GeForce GTX 970 graphics card, the Galaxy GTX 970 GC. Spotted across Chinese PC enthusiast forums and social networks, the latest set of leaks covers not just pictures of what the GTX 970 looks like, but also what's under its hood. To begin with, Galaxy's card appears to be built for the high-end market segment. A meaty twin-fan aluminium fin-stack heatsink, coupled with a spacious backplate, covers a signature Galaxy blue PCB, holding NVIDIA's new GTX 970 GPU and 4 GB of GDDR5 memory. The card appears to feature a high-grade VRM that draws power from a combination of 8-pin and 6-pin PCIe power connectors.
There's also a selection of pictures of a purported reference-design GeForce GTX 970 graphics card. It may look drab, but that's because NVIDIA will not ship reference-design cards. The GTX 970 will likely be an AIC-exclusive, meaning that you'll only find custom-design cards based on the chip. We wonder if the same holds for the GTX 980. Straightaway, you'll notice that the GTX 970 reference PCB bears an uncanny resemblance to the one NVIDIA used for the GTX 670, GTX 660 Ti, and GTX 760. That's probably because the GK104 and the new GM204 are pin-identical. Such a thing isn't new. The "Pitcairn" silicon (Radeon HD 7870, HD 7850) and its predecessor, "Barts" (HD 6870 and HD 6850), are similarly pin-identical, differing only in the die. The similarity in PCB design, if nothing else, suggests that the GTX 970 will be about as energy-efficient as the GTX 670.
Moving on to the actual specs: some users with access to a GeForce GTX 970 managed to pull these specs off a TechPowerUp GPU-Z screenshot. Some parts of the screenshot look blurry, probably due to a failed attempt at blurring out the BIOS string. GPU-Z has had preliminary support for GM204 since version 0.7.9. This is what it could make out:
  • GPU identified as "13C2"
  • 1,664 CUDA cores
  • 138 TMUs
  • 32 ROPs
  • 256-bit wide GDDR5 memory interface
  • 4 GB standard memory amount
  • 1051 MHz core, 1178 MHz GPU Boost, and 7012 MHz (GDDR5-effective) memory clocks
  • 224 GB/s memory bandwidth (a quick sanity check of this figure follows the list)
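The last figure is easy to verify: GDDR5 bandwidth is simply the effective memory clock multiplied by the bus width. A minimal sketch in Python, using only the numbers from the screenshot:

```python
# Bandwidth sanity check for the leaked GTX 970 figures.
# GDDR5 bandwidth = effective clock (Hz) x bus width (bits) / 8 bits per byte.
effective_clock_mhz = 7012  # GDDR5-effective clock from the GPU-Z shot
bus_width_bits = 256

bandwidth_gb_s = effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # 224.4 GB/s, matching the reported 224 GB/s
```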
The sample was put through a quick run of 3DMark 11's Extreme preset. It scored X3963 points, which, if you factor in the dual-core Core i3-4130 used in the bench, puts the GTX 970 somewhere between the GTX 780 and GTX 780 Ti in terms of performance.
Source: VideoCardz

69 Comments on Galaxy GeForce GTX 970 Pictured, Specs Confirmed, Early Benchmarks Surface

#27
HumanSmoke
Dj-ElectriC: Kepler and Maxwell are not the same, not nearly the same in performance/watt ratio.
If a 1664 part can beat a GTX 780, then yes, a 1920 part could match something that is 22% faster than a GTX 780.
Well, I wouldn't get your expectations too high. The 3963 3DMark 11 Extreme score supposedly puts the card (which carries a 5% overclock) between the 780 and 780 Ti when using a Core i3-4130? Yet the same Core i3-4130 and a bog-standard 780 at reference clocks scores 4228. So it seems like the only "proof" (the score) doesn't actually back up WCCF's assertion of the card's ability. Am I surprised? No. Am I surprised that people just accept a claim without delving past the hyperbole? Somewhat.
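To put a number on that gap, a rough sketch taking both scores at face value:

```python
# Compare the leaked score against a stock GTX 780 on the same CPU.
leaked_gtx970_score = 3963   # 3DMark 11 Extreme, card reportedly ~5% overclocked
stock_gtx780_score = 4228    # same Core i3-4130, reference-clocked GTX 780

deficit = 1 - leaked_gtx970_score / stock_gtx780_score
print(f"Leaked score trails the stock 780 by {deficit:.1%}")  # about 6.3%
```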
james888: I was hoping for more compute performance. The 750 Ti has incredible compute performance per watt. In F@H my overclocked 750 Ti gets about 70k points per day, similar to that of a 7870 or 660 Ti. Gaming-wise, it is nowhere near them, but in compute it is really close.
Are you sure Maxwell won't have good compute performance?
Ah, I was referring to the double precision rather than general compute performance - my bad - I should have been more specific. GM107's FP64 rate is 1:32, which is a decrease from low-end Kepler's 1:24. I pretty much expect GM204 to follow that trend in comparison to GK104.
Double precision is an overrated metric in general (although of seemingly variable importance to some people), but if GM204 is also intended for Quadro cards - as seems likely - it may be a bullet point for future SKUs.
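For a rough sense of what those ratios mean, a back-of-the-envelope sketch; the shader counts and clocks below are typical reference figures for the GTX 750 Ti (GM107) and GT 640 (GK107), assumed for illustration rather than taken from the leak:

```python
# Theoretical throughput from shader count, clock, and FP64 ratio.
# FP32 GFLOPS = cores x 2 ops/clock (FMA) x clock in GHz; FP64 = FP32 / ratio.
def fp64_gflops(cores, clock_ghz, fp64_ratio):
    fp32 = cores * 2 * clock_ghz
    return fp32 / fp64_ratio

# Assumed typical boost clocks; ratios as discussed above.
print(f"GTX 750 Ti (GM107, 1:32): {fp64_gflops(640, 1.085, 32):.0f} GFLOPS")  # ~43
print(f"GT 640     (GK107, 1:24): {fp64_gflops(384, 0.900, 24):.0f} GFLOPS")  # ~29
```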
#28
Diverge
When is Nvidia going to get rid of the DVI ports and make the vent go from end to end?

There's not much reason for bulky DVI ports anymore, when we have HDMI, which easily converts to DVI with a passive adapter.
#29
Nordic
Diverge: When is Nvidia going to get rid of the DVI ports and make the vent go from end to end?

There's not much reason for bulky DVI ports anymore, when we have HDMI, which easily converts to DVI with a passive adapter.
But I don't want to purchase a new cable to go with a new card, hypothetically that is. Really though, is the vent going from end to end going to help cooling that much compared to the practicality of a DVI port? Just get a card where the card maker doesn't put a DVI port. Remember, there is no reference design.
#30
Delta6326
I've been away for some time now, but did I miss GTX 8** or something?
#31
HumanSmoke
Diverge: There's not much reason for bulky DVI ports anymore, when we have HDMI, which easily converts to DVI with a passive adapter.
Except that HDMI 1.x doesn't natively support 21:9 aspect ratios - so all those people who buy 21:9 monitors are basically screwed for HDMI anyway - it's DVI or DP, and DP isn't found on all monitors (although most if not all ultrawides should have it).
Delta6326: I've been away for some time now, but did I miss GTX 8** or something?
Mobile.
#32
silapakorn
Delta6326: I've been away for some time now, but did I miss GTX 8** or something?
It's just a naming scam. Kinda like when some buildings skip the 13th floor entirely or name it "12a".
#34
15th Warlock
I hope I don't get too much flak for this, as it's my own personal opinion of the tech landscape over the last 12 months:

First the whole Mantle delay fiasco, then the Devil's Canyon disappointment, then Haswell-E's sort-of-meh release afterwards, including some ridiculous markup on DDR4 modules, and now this...

I guess the days of 50~75% performance gains between enthusiast CPU and GPU generation jumps are far behind us, and now we just have to accept a paltry 10~15% jump, or the same performance as last generation but at a discounted price...

Mobile and notebooks are where the current technology revolution is taking place: Core-M, Tegra K1, AMD's APUs...

I mean, you can still invest thousands of dollars in a monster build with a watercooled i7 5960X, 32 GB of DDR4, an ROG RVE X99 board, and SLI GTX 980s, but chances are, gaming-performance-wise, you'll probably just be 15~20% ahead of a 2~3 yr old rig...

Oh well, I look forward to the new breed of x86 tablets with enough performance to run most current games and battery life to last you a whole day, but I sure miss those quantum leaps in performance of days long gone by...
#35
GhostRyder
Delta6326: I've been away for some time now, but did I miss GTX 8** or something?
They used the 8XX series names for the mobile GPUs of the last Kepler generation (and a few Maxwell variants). On the desktop, they decided to skip ahead for a little numbering boost, in anticipation that these cards go way beyond the previous generation.
Diverge: When is Nvidia going to get rid of the DVI ports and make the vent go from end to end?

There's not much reason for bulky DVI ports anymore, when we have HDMI, which easily converts to DVI with a passive adapter.
DVI is still a basic standard when it comes to monitors for gamers, hence why they keep them. Most people I run into at LAN parties still actually use them, while the rest of course now use HDMI, but it's still something in use. If I were choosing, I would love to just have DisplayPort and use adapters for everything, but that's a personal opinion.
#36
wolf
Performance Enthusiast
Looks quite promising from where I stand, with only preliminary specs and one benchmark.

Interested in power consumption figures. Keep in mind that driver optimization will always boost performance beyond what we are seeing now, and ~1600 shaders performing between 2300 and 2900 of the older type bodes very well for the Maxwell architecture.

Question is, is the fully unlocked GM204 1792 or 2048 shaders? And what about GM200?

Oh, and being pin-compatible with GK104 should do well for pricing, after the initial few months of sales when prices settle, naturally.

Interesting times ahead.
#37
The Von Matrices
At the moment, is there any GDDR5 faster than 7 GHz, or will the GTX 980 have the same memory bandwidth as the GTX 970? I remember a few scenarios where the GTX 680 performed nearly the same as the GTX 670 due to their identical memory bandwidths, and this new generation may have the same issue.
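For context, a quick back-of-the-envelope comparison, assuming the rumored 7 GHz/256-bit configuration carries over to the GTX 980 and using the GTX 780's reference memory specs:

```python
# Memory bandwidth in GB/s = effective clock (MHz) x bus width (bits) / 8 / 1000.
def bandwidth_gb_s(clock_mhz, bus_bits):
    return clock_mhz * bus_bits / 8 / 1000

print(f"GTX 970/980 (rumored): {bandwidth_gb_s(7012, 256):.0f} GB/s")  # ~224
print(f"GTX 780 (reference):   {bandwidth_gb_s(6008, 384):.0f} GB/s")  # ~288
```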
#38
GhostRyder
The Von Matrices: At the moment, is there any GDDR5 faster than 7 GHz, or will the GTX 980 have the same memory bandwidth as the GTX 970? I remember a few scenarios where the GTX 680 performed nearly the same as the GTX 670 due to their identical memory bandwidths, and this new generation may have the same issue.
Most likely the memory will be the same, based on the early "leaks". That makes it a good thing, because now both the lower and higher cards will be ready for all high-resolution needs, even with the 256-bit bus.

I mean, if we go off that leak of course... Plus those were overclocks, so it may just end up being exactly the same, or we could find out things were hidden from us.
#39
v12dock
Block Caption of Rainey Street
Anyone else notice ATItool in the background?
#40
HumanSmoke
The Von Matrices: At the moment, is there any GDDR5 faster than 7 GHz, or will the GTX 980 have the same memory bandwidth as the GTX 970?
Likely the same. 7 Gbps is the fastest GDDR5 at the moment:
Elpida (Micron)
SK Hynix
Samsung
v12dock: Anyone else notice ATItool in the background?
Software photobomb!
#41
Lionheart
Seems decent... I would love to pick up a GTX 980 if it surpasses the GTX 780 Ti, which it should!!! & also if the price is right :toast:
#42
Axaion
Diverge: When is Nvidia going to get rid of the DVI ports and make the vent go from end to end?

There's not much reason for bulky DVI ports anymore, when we have HDMI, which easily converts to DVI with a passive adapter.
>wanting HDMI
>any year
>not displayport

Jesus christ just get a mac
#43
johnspack
Here For Good!
Still waiting for my 770......
#44
BiggieShady
Well, the GTX 770 was less of an improvement over the GTX 670 than the GTX 970 is relative to the GTX 770. Skipping the 800 series made it even more obvious :D
#45
buildzoid
HumanSmoke: Except that HDMI 1.x doesn't natively support 21:9 aspect ratios - so all those people who buy 21:9 monitors are basically screwed for HDMI anyway - it's DVI or DP, and DP isn't found on all monitors (although most if not all ultrawides should have it).

Mobile.
I use HDMI to drive my 21:9 ASUS monitor at 83 Hz.
#46
HumanSmoke
buildzoid: I use HDMI to drive my 21:9 ASUS monitor at 83 Hz.
Well, that's interesting - so I guess the issues people have with 21:9 aspect ratio monitors and TVs are firmware-related - scaler issues?
Seems a little odd that the HDMI 2.0 spec highlights native 21:9 aspect ratio ("Support for the wide angle theatrical 21:9 video aspect ratio") and the HDMI 1.4 spec doesn't mention it. And maybe the Wiki page needs an update.
Nice to know that the issue isn't within the specification, although HDMI.org really needs to make the specification and compatibility clearer.
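One back-of-the-envelope way to see why buildzoid's mode can fit through HDMI 1.4's 340 MHz TMDS ceiling; the ~15% blanking overhead here is an assumption (reduced-blanking timings would need even less):

```python
# Rough pixel clock estimate for a 2560x1080 @ 83 Hz mode.
width, height, refresh_hz = 2560, 1080, 83
blanking_overhead = 1.15  # assumed ~15% blanking; CVT-RB would be lower

pixel_clock_mhz = width * height * refresh_hz * blanking_overhead / 1e6
print(f"~{pixel_clock_mhz:.0f} MHz pixel clock")  # ~264 MHz, under HDMI 1.4's 340 MHz
```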
#47
z1tu
HumanSmoke: Well, I wouldn't get your expectations too high. The 3963 3DMark 11 Extreme score supposedly puts the card (which carries a 5% overclock) between the 780 and 780 Ti when using a Core i3-4130? Yet the same Core i3-4130 and a bog-standard 780 at reference clocks scores 4228. So it seems like the only "proof" (the score) doesn't actually back up WCCF's assertion of the card's ability. Am I surprised? No. Am I surprised that people just accept a claim without delving past the hyperbole? Somewhat.

Ah, I was referring to the double precision rather than general compute performance - my bad - I should have been more specific. GM107's FP64 rate is 1:32, which is a decrease from low-end Kepler's 1:24. I pretty much expect GM204 to follow that trend in comparison to GK104.
Double precision is an overrated metric in general (although of seemingly variable importance to some people), but if GM204 is also intended for Quadro cards - as seems likely - it may be a bullet point for future SKUs.
This is what I was thinking. I mean, maybe I'm not very good at interpreting video card specs, but core and memory clocks aside, the 780 has more memory bandwidth, more CUDA cores, TMUs, and ROPs. I was under the impression that at least the memory bandwidth and CUDA cores/shading units should count towards more performance, and not just raw MHz. :confused:
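Clock speed closes much of that on-paper gap, at least in theory. A rough sketch, assuming the leaked 1178 MHz boost for the GTX 970 and the GTX 780's ~900 MHz typical boost (both clock figures are approximations, not from the leak):

```python
# Theoretical FP32 throughput: cores x 2 ops/clock (FMA) x clock in GHz.
def fp32_tflops(cores, clock_ghz):
    return cores * 2 * clock_ghz / 1000

print(f"GTX 780: {fp32_tflops(2304, 0.900):.2f} TFLOPS")  # ~4.15
print(f"GTX 970: {fp32_tflops(1664, 1.178):.2f} TFLOPS")  # ~3.92
```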
#48
hardcore_gamer
It all depends on the price. If I can get a couple of 980 Tis for <$900, and these perform well at 4K, I'm in. It's either 4K or GTFO.
#49
Roel
thebluebumblebee: The sad truth is that there are no games pushing the hardware anymore. Remember how long it took for hardware to catch up to Crysis?
Games are actually pushing hardware pretty hard and GPUs can't catch up - that is, if you're using a newer monitor. 1080p 60 Hz is really old by now; it's like saying that Crysis didn't push hardware when you were gaming at 720p or even 540p. 1440p 60 Hz and 1080p 120+ Hz monitors have been around for quite a while already, and we're now making the transition to 4K 60 Hz and 1440p 144 Hz. Try to use the full potential of such monitors in games like Battlefield 4 and you will see how hard it pushes your hardware. And then we're not even talking about surround gaming, where even the fastest 4-way SLI builds can't keep up.
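A quick look at the raw pixel throughput each of those display targets demands makes the point:

```python
# Pixels per second each display target asks the GPU to render.
targets = {
    "1080p @ 60 Hz": (1920, 1080, 60),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K @ 60 Hz": (3840, 2160, 60),
}
for name, (w, h, hz) in targets.items():
    print(f"{name}: {w * h * hz / 1e6:.0f} Mpx/s")
# ~124, ~531, and ~498 Mpx/s: the newer targets demand 4x and up over 1080p60
```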
#50
CoolZone
Will this card feature HDMI 2.0?