Discussion in 'Graphics Cards' started by Boneface, Apr 6, 2012.
They must be doing something to get it to work like that
Not sure. It's perplexing but consistent...
The 7970 has been one of the worst cards I've ever owned from AMD (ATI). My ASUS DCUII has horrible VRAM temps and also had poor GPU temps before a lap and an application of MX-4, and the DP ports don't seem to like HDMI adaptors, because anything I connect has a lovely flicker fit (could be driver issues).
I also had a Gigabyte Windforce 7970 (blue non-stock PCB), the loudest card on the face of the earth if you have ambient temps around/over 30°C. It sadly also had the loudest coil whine, easily audible while playing Skyrim/Saints/anything 3D from the other side of my house.
Drivers have been a bit of a joke, with each new version slowly fixing people's issues while destroying FPS in less popular games and creating incompatibilities with OC tools (fixable with a .dll swap). Lastly, I noticed a horrible texture flicker every 10-15 minutes on each of these cards; hopefully that's also a driver issue that can be fixed (I'll see if I can get some screens of it, because it's jarring but very brief).
So your mileage may vary, but going by the mountains of feedback I've been reading while researching my 680 purchase, the 680 is the clear winner for ease of use. So my vote goes to the 680; I would have one right now if Pccasegear had got their EVGA stock in a day sooner (darn Easter long weekend). I just hope you can find one at a good price, because a 680 costing more than a 7970 is daylight robbery.
Well, since we're offering our own anecdotal evidence, I purchased the Sapphire 7970 "Dual-X" OC card for the following reasons:
NVIDIA's "boost clock" is the method for overclocking, but you cannot directly control when the card does or doesn't "boost". Thus, you can configure the boost to extend all the way up to, say, 1500MHz, but if the card doesn't feel like going there (the internal boost mechanism doesn't like the heat, or the power supply voltage, or whatever) then it simply will not run at that speed. Thus, you are NOT in control of your overclock like you were on prior parts (AMD and NV).
Boost clock cannot be disabled, meaning if you're overclocking, you're not guaranteed to stay at a specific speed. Maybe that's cool from a stability standpoint, but it's concerning to me from an enthusiast overclocking standpoint. It means a strong potential for inconsistent performance when you aren't expecting it...
2GB framebuffer < 3GB framebuffer. With the kind of horsepower available on a 680 or 7970, I want to throw a fistful of AA at pretty much every game. But at 2560x1440, I'm going to run out of framebuffer far sooner on the 680 than I am on the 7970 while playing the multitudes of games that have "high res" textures from the modding community. Skyrim at 4xMSAA + adaptive multisampling + high-res textures + shadow maps set at 8192 results in ~2.4GB of framebuffer usage. I'd be paging a 680 to death at these settings; the 7970 shrugs and keeps moving.
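Out of curiosity, the render-target part of that usage is easy to sanity-check with a quick back-of-envelope calc (a rough sketch only; the driver's real allocation differs, and textures dominate the total):

```python
# Rough estimate of render-target memory at 2560x1440 with 4xMSAA.
# Assumes 32-bit color and 32-bit depth/stencil, both multisampled;
# a sketch only, not how the driver actually allocates memory.

def render_target_mb(width, height, msaa_samples, bytes_per_sample=4):
    """Approximate MB for one MSAA color buffer plus a depth/stencil buffer."""
    color = width * height * msaa_samples * bytes_per_sample
    depth = width * height * msaa_samples * 4  # e.g. 24-bit depth + 8-bit stencil
    return (color + depth) / (1024 ** 2)

mb = render_target_mb(2560, 1440, msaa_samples=4)
print(f"~{mb:.0f} MB for color + depth at 4xMSAA")
```

So the multisampled buffers themselves are only on the order of 100MB; the rest of that ~2.4GB in the Skyrim example is textures and shadow maps, which is exactly why the mod scene chews through VRAM.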
Given equal clocks, they perform equally. A 680 is clocked around 10% faster than a 7970 at default speeds, which is why it generally beats the 7970 by a similar percentage. As you overclock, the 7970 actually gets more benefit per MHz than the 680 does. They also appear to have very similar max overclocks all things considered, but see also my prior concerns regarding overclocking on a 680.
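For what it's worth, that ~10% figure roughly matches the reference base clocks (1006MHz for the 680 vs 925MHz for the 7970; the 680's boost pushes it a bit further still):

```python
# Sanity-check the "clocked around 10% faster at default speeds" claim
# using the reference base clocks of both cards.
gtx680_mhz = 1006   # GTX 680 reference base clock
hd7970_mhz = 925    # HD 7970 reference core clock

clock_advantage = (gtx680_mhz / hd7970_mhz - 1) * 100
print(f"GTX 680 stock clock advantage: {clock_advantage:.1f}%")  # ~8.8%
```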
Now, there is a list of CONs with the AMD board too. Drivers are an obvious one; I've been sticking to the 12.3 betas (the REAL betas, not the 12.2s that were mistakenly labeled as 12.3 BETAs) and have not had issues. Nevertheless, I have indeed heard complaints about flashing textures in some games with newer drivers. Also, the 7000 series of video cards is a whole new architecture for AMD (GCN), which means game performance can be a bit varied for the time being; things should settle a bit with a few more driver revs.
Power consumption as a con? I dunno, people have loved NV cards for years and they've always been MONSTER power consumers. Now we're within 20W or so at peak load with the nod going to NV, and suddenly it's oh-so-important that we MUST have the lowest power consumption? Meh, maybe, but for me? An extra 20W stacked onto my overclocked 3930K, overclocked 32GB of RAM, and a 6-disk SSD RAID array is pretty much a rounding error for a rig that pulls more than 500W from the wall. And I'm just going to overclock my video card anyway, so is the slight increase in power really a large concern to me? Not really.
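To put numbers on the "rounding error" point (the $0.15/kWh rate and 4 hours/day of gaming are just assumptions for illustration):

```python
# How much does an extra 20W matter against a ~500W system?
extra_watts = 20
system_watts = 500

share = extra_watts / system_watts * 100
print(f"{share:.0f}% of total wall draw")  # 4%

# Hypothetical yearly cost: 4 hours/day of gaming at an assumed $0.15/kWh.
hours_per_year = 4 * 365
kwh_per_year = extra_watts / 1000 * hours_per_year
cost = kwh_per_year * 0.15
print(f"~{kwh_per_year:.0f} kWh/year, roughly ${cost:.2f}")
```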
As for fan or coil noise? Those aren't AMD's issues; they're issues for the manufacturer of your card. My Sapphire OC card has zero coil noise, and the fans are barely audible at their highest speed (and they never actually get that fast unless you force it via Afterburner or something). I've also had no problems with TriXX, AMD's PowerTune, or MSI Afterburner for my overclocking needs. I'm unaware of any .dll swaps needed to make overclocking 'work', but perhaps that's also a card manufacturer issue (some cards may use 'non-reference' clock generators or something).
All in all, I'm incredibly happy with my purchase. To be quite fair, I'd probably be similarly happy with a 680, albeit for slightly differing reasons. I do agree that NV has the upper hand on driver quality and initial compatibility with the newest games. I also agree that NV has better in-game compute options on older models, but keep in mind that Kepler is NOT a compute card like Fermi was. You shouldn't get into the 680 expecting CUDA performance on par with its standard gaming performance.
In fact, had the 680 not had this quirky Boost Clock business, I'd probably be rocking one right now. I did, in fact, wait until the 680 had been fully unveiled before making my purchase decision, and boost clock was the one thing that finally swung my vote to AMD. Maybe after Boost Clock has another full generation to really get nailed down and fully understood by the masses, I'll be more interested in working with it. But I'm skeptical of any new power-limiting technology on the first go-round; the same went for ATI when they introduced PowerTune, for Intel when they introduced Turbo, and for AMD when they first introduced TurboCore. All three had (at minimum) moderate issues that were resolved on the second (and subsequent) generations.
There are some excellent threads over at Overclock.net about boost clocks on the 680; it's pretty much 100% figured out already, and simple as hell for an average person to just get in there and pull some sliders. You're pretty much guaranteed your boosted clocks if you can keep the card below 70°C.
The overclocking utility problem does seem to be related to certain cards and their manufacturers, so to my knowledge, as long as you don't go with an ASUS DCUII you won't have that problem.
Coil whine is like the lottery, and going by feedback it doesn't seem to be in the 7900 series' favor; in fact, if you check out Linus's mini review of your card, his emits a loud coil whine.
It is a shame the 680 didn't get released with 4GB of RAM; that's what's going to hold it back. Thankfully I'm one of those horrid people who actually likes the standard 1920x1080 resolution.
Overall both cards are fantastic; I was just voicing my opinion on the 7970, which sadly was not a good experience. I've always gone Sapphire with my ATI cards; it's a shame it took so long for yours to come to Aus. My old 5870 Vapor-X was just the best.
On the plus side, if the 7970's drivers and problems all get ironed out, I'll be able to bust out the DCUII again.
I had 2200MB of VRAM used in Shogun 2: Total War at 1080p one time, so the 2GB cap still holds you back to some degree even at 1080p. In reality, not really. But still.
The thing holding back the 680 is the GK104 core! Stick the full Kepler in there and it would be a whole different story.
What's wrong with 1080p? Looks fine on my 46".
The game engine will scale texture quality automatically. I get over 2048MB of VRAM usage in BF3 with my 7970. This worked well with 6950 1GB CrossFire cards for me. Image quality took somewhat of a hit, but performance was very good with two GPUs. Really though, I doubt the IQ difference is noticeable at all with a 2GB card vs. a 2GB+ card.
I own an HD 7970 and it's one hell of a card and an overclocker as well. If the GTX 680 didn't cost $60 extra I would have chosen that card, but I would just go for the HD 7970 unless you could be bothered waiting for Newegg to restock the GTX 680s.
Well, I decided to go with an EVGA GTX 680; picking it up today or tomorrow.
Thanks all for the help!
Hey lionheart, why not just OC in CCC to 1125/1575? 99% of all 7970s can reach that OC with no voltage increase or temp increase. I currently have mine at 1150/1650 and 1.93 volts. It never gets above 55°C with my Arctic heatsink.
I've got to see this in side-by-sides... Again, it's a logical explanation, but I need some proof this is happening.
Here's the new toy!
Sweet choice, made the same decision myself
It's a beast of a card in BF3.
Where did you get that BF3 needs more than 2GB of VRAM on ultra settings? I played BF3 on ultra settings on a 1080p screen with CrossFire 6970s, which gave me about 140 FPS.
OS: Windows 7 64-Bit
Processor: Quad-Core CPU
Memory: 4 GB
Hard Drive: 20 GB
Graphics Card: DirectX 11 compatible with 1024 MB RAM (NVIDIA GeForce GTX 560 or ATI Radeon 6950)
Sound Card: DirectX compatible
Keyboard and Mouse
DVD ROM Drive
HardOCP found VRAM issues...
Maybe we have people here who aren't tracking VRAM usage while playing online?
Well, I knew that being an NVIDIA card it wouldn't be able to play games in 3D on this monitor, as it's more for ATI, but I thought I would at least be able to watch movies in 3D using PowerDVD 12, but I can't. So as of right now I have a $700 120Hz 3D monitor that I can only use for 120Hz, lol. Tomorrow I will be picking up a 7970... http://www.gigabyte.us/products/product-page.aspx?pid=4102#ov I'm trading with a guy I know for his 2 GTX 580s and $50; my computer store will take the 580s and $50, and I get the 7970, lol.
For me, with ultra settings I was getting about 1500MB for memory usage.
Gah, I read it on a website somewhere... Maybe it was [H]ardforum? Crap, I'll try to find it.
The numbers I mentioned in another thread are ALL from multiplayer. I only played SP to get to the part we use for benchmarking the game.
That said, I'm wondering how the hell they came up with 5GB... How do they know where the usage goes after your VRAM runs out?
@Erocker... I started another thread here to investigate the issue.
Well, after playing with the OC a bit, the card runs pretty well for BF3... Here's the beast and the OC so far.
So let me make sure I've got this right... for now you get a better experience playing BF3 on the 680 than on a 7970, right?
Please do get more details, as I am planning to get the same monitor and am also facing the same 680-or-7970 problem.
Go for the GTX 680. I've had both cards, and the GTX 680 is better.