Wednesday, July 1st 2015

EVGA Unveils the GeForce GTX 980 Ti Classified ACX 2.0+

EVGA unveiled the GeForce GTX 980 Ti Classified ACX 2.0+ graphics card (model: 06G-P4-4998). Positioned a notch below the GTX 980 Ti Classified Kingpin Edition and a notch above the GTX 980 Ti Hybrid, the card will be EVGA's fastest GTX 980 Ti until the Classified Kingpin Edition goes on sale. It offers factory-overclocked speeds of 1190 MHz core and 1291 MHz GPU Boost, with memory left untouched at 7.00 GHz, compared to reference clocks of 1000 MHz core and 1076 MHz GPU Boost.
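Worked out against those reference clocks, the factory overclock amounts to roughly the following (simple percentages derived from the figures above):

\[
\frac{1190}{1000} - 1 = 19\%\ \text{(base clock)}, \qquad \frac{1291}{1076} - 1 \approx 20\%\ \text{(GPU Boost)}
\]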

The card features EVGA's biggest and tallest ACX 2.0+ air-cooling solution, which is two slots thick but a good inch and a half taller than the reference card. A large dual fin-stack heatsink is ventilated by a pair of large fans. The PCB features a 14+3 phase VRM that draws power from a pair of 8-pin PCIe power connectors, and the card comes with a pre-installed backplate. Display outputs include three DisplayPort 1.2 connectors and one each of HDMI 2.0 and dual-link DVI. Other features include dual BIOS and EVGA EV-Bot module support. Available now, the EVGA GeForce GTX 980 Ti Classified ACX 2.0+ is priced at US $700.
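As a spec-level sanity check (assuming the standard PCI Express connector ratings of 150 W per 8-pin plug plus 75 W from the slot; actual draw is governed by the card's BIOS power limit), the nominal power-input headroom works out to:

\[
2 \times 150\,\mathrm{W} + 75\,\mathrm{W} = 375\,\mathrm{W}
\]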

32 Comments on EVGA Unveils the GeForce GTX 980 Ti Classified ACX 2.0+

#1
haswrong
Wonder if it can surpass 1600 MHz on air, considering the very basic ACX model clocks to 1500 without trouble.
#2
btarunr
Editor & Senior Moderator
Let's all just ignore the elephant in the room, which is just a $50 premium over reference for a ton of cool stuff.
#3
GhostRyder
btarunr: Let's all just ignore the elephant in the room, which is just a $50 premium over reference for a ton of cool stuff.
That is an excellent deal! I would love to see how high this thing can clock, because that would be a game-winning combination for me. $50 more than reference would be worth it for this card if it can hit 1600+ MHz!
#4
the54thvoid
Intoxicated Moderator
The Galaxy (Galax, KFA, whatever) HOF is probably better. I'm sure it has a larger power-phase count. Unfortunately, the HOF doesn't ship with dual BIOS as standard like the Classified does.

Also - water blocks are really required here, and usually a custom BIOS for super speeds, unless the TDP limiter is relaxed...
And just for shits 'n' giggles, the LN2 HOF has a toggle switch to disable all Nvidia protections. OcUK said they will stock it, but it has NO guarantee!! lol.
#5
GhostRyder
the54thvoid: The Galaxy (Galax, KFA, whatever) HOF is probably better. I'm sure it has a larger power-phase count. Unfortunately, the HOF doesn't ship with dual BIOS as standard like the Classified does.

Also - water blocks are really required here, and usually a custom BIOS for super speeds, unless the TDP limiter is relaxed...
And just for shits 'n' giggles, the LN2 HOF has a toggle switch to disable all Nvidia protections. OcUK said they will stock it, but it has NO guarantee!! lol.
As long as it has better power phases, it should be fine with the Nvidia limits removed. That is (IMHO) why they unfortunately hold back the overclocking potential of their cards with limiters like that. If this has the limiter removed on at least one of the BIOS settings (or I can flash a BIOS like that, which I am sure exists), I would not mind buying some of these cards with waterblocks. I would push each one well beyond the 1600 MHz area just for fun!
#6
emissary42
GhostRyder: As long as it has better power phases, it should be fine with the Nvidia limits removed.
16 phases of 60 A-rated IR3555s for the GPU should be more than enough (HOF LN2).
#7
ZoneDymo
Sweet-looking card, but as with all cards of the current gen, $700 for something basically outdated out of the gate... yeah, I'll hold off on upgrading.
#8
Jack1n
*Heavy Breathing*
#9
buildzoid
emissary42: 16 phases of 60 A-rated IR3555s for the GPU should be more than enough (HOF LN2).
The K|NGP|N is rocking the typical EVGA 14+3 phase VRM. The 14-phase Vcore section uses IR 6894s: 70 A at 125°C and 120 A at 93°C. This card probably has the same VRM as the K|NGP|N, so power-output-wise this and the HOF LN2 are basically the same; however, the HOF should have less Vripple.
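For a rough sense of scale, taking the per-phase current ratings quoted in this thread at face value (nominal figures only, ignoring derating, thermals and transient limits):

\[
16 \times 60\,\mathrm{A} = 960\,\mathrm{A}\ \text{(HOF LN2 Vcore)} \qquad\text{vs.}\qquad 14 \times 70\,\mathrm{A} = 980\,\mathrm{A}\ \text{(Classified/K|NGP|N Vcore at }125\,^{\circ}\mathrm{C}\text{)}
\]

Either budget is far beyond what a GM200 draws on air or water, which is consistent with the two designs being effectively interchangeable outside of extreme overclocking.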
#10
haswrong
the54thvoid: The Galaxy (Galax, KFA, whatever) HOF is probably better. I'm sure it has a larger power-phase count. Unfortunately, the HOF doesn't ship with dual BIOS as standard like the Classified does.

Also - water blocks are really required here, and usually a custom BIOS for super speeds, unless the TDP limiter is relaxed...
And just for shits 'n' giggles, the LN2 HOF has a toggle switch to disable all Nvidia protections. OcUK said they will stock it, but it has NO guarantee!! lol.
sounds like a KILL SWITCH :nutkick: lol :peace:.
#11
Fx
I have to say that the Fury X looks way sexier than this. I think the clock is ticking on these behemoth architectures.
#12
ShockG
This is a great card - as good as the HOF LN2 or KPE, even, for air and water cooling.
With all GM2xx GPUs, the limit is cooling. It is not a linear relationship where adding power phases, voltage, etc. helps scaling; those will do nothing at all.

No magic BIOS will increase overclocks, and two cards with same-week GPUs, perhaps even from the same wafer, will perform and scale near identically. The limit for all Tis is around 1500 MHz, and that has everything to do with the GPU layout/design and the target node. For instance, needing a VID of 1.26 V to pass 1555 MHz where 1.19 V passes 1526 MHz speaks directly to this. Want to clean up the signal? Then you need lower temps - not 50, 40 or 30°C, but around 10 to 15°C at max load. Then you will see the GPU go to 1600-1620 MHz game stable.

Remember that from air cooling to LN2 is only 600 MHz at most (at least at present), which is a very narrow window. That last 200 MHz needs you to go from -80°C down to -130°C, for instance, and hold it there even under load. That is not a linear relationship, and the cap for these GPUs is around 2100 MHz.

So be it a 6-, 8- or 14-phase PWM, it makes no difference at all for water and air cooling.
All we can do is hope for a high-bin GPU, if you care about that sort of thing - perhaps even a card with Samsung memory, so you can comfortably go over the 2 GHz mark.
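To put rough numbers on those diminishing returns, using the VID/clock figures quoted above and the common dynamic-power approximation $P \propto V^{2} f$ (a simplification that ignores leakage):

\[
\frac{\Delta f}{f} = \frac{1555 - 1526}{1526} \approx 1.9\%, \qquad \frac{\Delta V}{V} = \frac{1.26 - 1.19}{1.19} \approx 5.9\%, \qquad \frac{P_{1.26\,\mathrm{V}}}{P_{1.19\,\mathrm{V}}} \approx \left(\frac{1.26}{1.19}\right)^{2} \cdot \frac{1555}{1526} \approx 1.14
\]

Roughly 14% more power and heat for under 2% more clock, which is why lower temperatures, rather than more VRM phases, are what actually move the needle.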
#13
the54thvoid
Intoxicated Moderator
Fx: I have to say that the Fury X looks way sexier than this. I think the clock is ticking on these behemoth architectures.
By all means, feel free to purchase the sexier-looking card - that's the consumer's prerogative. Problem is, it won't perform anywhere near as well...

Fiji is also a behemoth. The chip is huge. The PCB is smaller because of the HBM, but don't confuse the HBM technology with a small die.
#15
Petey Plane
ZoneDymo: Sweet-looking card, but as with all cards of the current gen, $700 for something basically outdated out of the gate... yeah, I'll hold off on upgrading.
To be fair, Pascal is still at least 6-8 months out, but you're right about the ridiculous, overpriced "special edition" cards that start coming out at the end of an architecture's lifespan. Although the upgrade bug is biting me hard (even though my 670 still gets 50 fps in BF4), I'm not upgrading until at least Pascal, and until 1440p G-Sync monitors cost less than $400.
#16
Fx
the54thvoid: Problem is, it won't perform anywhere near as well...
I stopped reading after this. No time for smoke.
#17
rtwjunkie
PC Gaming Enthusiast
Fx: I stopped reading after this. No time for smoke.
Did you perhaps miss the reviews? The Fury X came nowhere near beating the 980 Ti, effectively losing in over half the games. Sure, both are overkill, but not acknowledging the 980 Ti as the winner of the two is denial, especially after all the promises AMD made. This model takes that performance beyond the reviews.
#18
RejZoR
Who cares if it beats it by 2 fps or not? It performs the same and is an interesting piece of technology. It may not be reasonable for me, but for someone who easily shells out 800€, I see no problem with it. It's a good card.
#19
rtwjunkie
PC Gaming Enthusiast
RejZoR: Who cares if it beats it by 2 fps or not? It performs the same and is an interesting piece of technology. It may not be reasonable for me, but for someone who easily shells out 800€, I see no problem with it. It's a good card.
Idc personally. I was just responding to the dude who had his hand up saying he didn't want to hear anything about its performance vs. the Fury X. Personally, I think both cards are great... and overkill.
#20
Fx
rtwjunkie: Did you perhaps miss the reviews? The Fury X came nowhere near beating the 980 Ti, effectively losing in over half the games. Sure, both are overkill, but not acknowledging the 980 Ti as the winner of the two is denial, especially after all the promises AMD made. This model takes that performance beyond the reviews.
The problem that many fanboys have when comparing two things is how they reach their conclusion. They are only able to see black/white, all/nothing or winner/loser. If you compared the two cards objectively, you would see that the Fury trades many blows with the 980 Ti. Many of the losses are marginal, and the same can be said about its wins.

Another point of note is that the Fury X probably has more potential to improve with driver updates.
#21
the54thvoid
Intoxicated Moderator
Fx: I stopped reading after this. No time for smoke.
If reading is too hard, try some pictures.

Both overclocked, no added voltage. TPU review.

Or one where the Fury X beats the stock 980 Ti (Guru 3D). Notice the out-of-the-box performance of the custom card. As far as the reviews go - AMD won't allow custom Fury X cards.

Sure, the Fury X wins at some titles, but the reviews (or is it mass conspiracy?) all point to the 980 Ti being the better card because it's allowed to be 'pimped' by the AIBs. Whether the Fury X wins, draws or loses really depends on the review, but it's close with a stock 980 Ti. Add the 20% boost a lot of cards get (1400+ MHz) and it's clear what the better gaming choice is.

But like I said, feel free to buy the Fury X.

As for the driver argument - sure, it'll get better, but we know Nvidia can up their game too. Please don't label me a fanboy - too old for that, and I'll buy what suits my needs. I'm unhappy with the Fury X's limited hardware potential, i.e. no custom versions.
#22
rtwjunkie
PC Gaming Enthusiast
Fx: The problem that many fanboys have when comparing two things is how they reach their conclusion. They are only able to see black/white, all/nothing or winner/loser. If you compared the two cards objectively, you would see that the Fury trades many blows with the 980 Ti. Many of the losses are marginal, and the same can be said about its wins.

Another point of note is that the Fury X probably has more potential to improve with driver updates.
Please look for my other posts on the cards. I have even used the term "trading blows". However, it's also true that the Fury X did not come anywhere close to blowing everything else out of the water, as their PR team basically promised.

As to being a fanboy, I actually planned on buying a Fury X for my upgrade. Like @the54thvoid, I too am way too old for that immature BS. Then, when I saw who the winner was (it doesn't matter if the 980 Ti is only marginally the winner; it's still the winner), I was going to get the 980 Ti. I have now decided to sit back and wait for the next generation from each camp.

This allows me to neutrally observe which card is better. It might offend some people, but that's the fact. However, does it matter? Nope, not at all, because they are both good cards and both more than most people need.
#23
Xzibit
the54thvoid: [quoted chart]
That chart makes the 960 4 GB a much better buy than the 970.
#24
the54thvoid
Intoxicated Moderator
Xzibit: That chart makes the 960 4 GB a much better buy than the 970.
It's actually a very good signpost as to why that memory issue becomes relevant at 4K. I also wanted to use a 4K chart, as that is where the Fury X picks its skirt up and runs better.
#25
Fx
the54thvoid: If reading is too hard, try some pictures.

Both overclocked, no added voltage. TPU review.

Or one where the Fury X beats the stock 980 Ti (Guru 3D). Notice the out-of-the-box performance of the custom card. As far as the reviews go - AMD won't allow custom Fury X cards.

Sure, the Fury X wins at some titles, but the reviews (or is it mass conspiracy?) all point to the 980 Ti being the better card because it's allowed to be 'pimped' by the AIBs. Whether the Fury X wins, draws or loses really depends on the review, but it's close with a stock 980 Ti. Add the 20% boost a lot of cards get (1400+ MHz) and it's clear what the better gaming choice is.

But like I said, feel free to buy the Fury X.

As for the driver argument - sure, it'll get better, but we know Nvidia can up their game too. Please don't label me a fanboy - too old for that, and I'll buy what suits my needs. I'm unhappy with the Fury X's limited hardware potential, i.e. no custom versions.
Your change of tune to "being the better card" is more acceptable for argument's sake than "it won't perform anywhere near as well". If you hadn't used such exaggeration, I wouldn't have labelled you a fanboy.

One point that you might not be considering is that many people don't overclock their cards. I, for one, am through with overclocking and run all-stock rigs these days, so your 20% boost doesn't apply to me, or to many others for that matter.