Wednesday, October 3rd 2012

NVIDIA Forces EVGA to Pull EVBot Support from GTX 680 Classified

According to an Overclockers.com report, NVIDIA forced EVGA to remove voltage control, more specifically support for its EVBot accessory, from its GeForce GTX 680 Classified graphics card. EVBot, apart from real-time monitoring, gives users the ability to fine-tune voltages, a feature NVIDIA doesn't want users to have access to. The design change was confirmed by EVGA's Jacob Freeman in response to a forum question from a user who found his new GTX 680 Classified card to lack the EVBot header.

"Unfortunately newer 680 Classified cards will not come with the EVBot feature. If any questions or concerns please contact us directly so we can offer a solution," said Freeman. Hinting that NVIDIA is behind the design change, he said "Unfortunately we are not permitted to include this feature any longer," later adding "It was removed in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products, no voltage control is allowed, even via external device." To make matters worse, Freeman said that EVGA has no immediate plans to cut prices of the GTX 680 Classified.
Source: Overclockers.com

99 Comments on NVIDIA Forces EVGA to Pull EVBot Support from GTX 680 Classified

#26
the54thvoid
Intoxicated Moderator
The Kepler design is pretty well tied to its strict clock/voltage manipulation to achieve the clock boosts it gets. As the dynamic clocks are hardware-controlled, it would be very hard to safely allow voltage adjustments.
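Roughly, you can picture GPU Boost behaving like this (a simplified Python sketch; the clock/voltage bins, TDP and temperature numbers are invented for illustration, this is not Nvidia's actual algorithm):

    # Each boost bin pairs a clock with a fixed, fused voltage.
    VF_TABLE = [(1110, 1.175), (1058, 1.150), (1006, 1.125)]  # (MHz, V), highest first
    TDP_LIMIT_W = 195.0
    TEMP_LIMIT_C = 70

    def boost_bin(board_power_w, temp_c):
        """Return the highest (clock, voltage) bin with power/thermal headroom."""
        for clock, volts in VF_TABLE:
            if board_power_w < TDP_LIMIT_W and temp_c < TEMP_LIMIT_C:
                return clock, volts
            board_power_w *= 0.95  # crude model: each lower bin saves ~5% power
        return VF_TABLE[-1]        # no headroom anywhere: sit at the base bin

    print(boost_bin(180.0, 65))  # headroom -> (1110, 1.175)
    print(boost_bin(210.0, 65))  # power-limited -> falls to a lower bin

Because every bin carries a fixed voltage, an external device like EVBot pushing voltage independently invalidates the assumptions the whole table was qualified against.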

I think Nvidia have the card as fast as it can go on the setup it is on. More volts may well fry the thing and they don't want that bad PR (and I don't blame them).

Also, as Ben says, the 690 decimated any partner's chance of making a viable dual-680 card.

It's not like Apple though. This Kepler voltage situation is being controlled to stop damage being done to hardware that's already pushing the boat out.

I say Nvidia are looking after us this time (well not me, I'm with red).
Posted on Reply
#27
20mmrain
Alright, this sucks... Time to switch back to AMD/ATI next round... there goes another $2000 purchase from me that you won't get, Nvidia. (Especially if this rule carries over to the next gen.)

Also, for those who say it is not as dangerous anymore... well, maybe you are not trying to overclock as high as you can to make it dangerous. Just because cards can reach higher clocks before it is considered dangerous doesn't mean it is not dangerous anymore.
As for the fail-safe features that protect a product from a bad overclock... I think this is a good feature... and like I said, there comes a point anyway when you push a product so high that even fail-safe features won't protect you any longer.
Posted on Reply
#29
hardcore_gamer
This is not "protecting". This is dumbing down. Overclockers were here for a long time. Why didn't they introduce this "feature" when people started to damage 6600GTs by overclocking in the past? I see this as a move to take away tweakability from the hardware.
Posted on Reply
#31
Bjorn_Of_Iceland
remixedcat: the hell nvidia? why? makes me worry about other things....
because nvidia loves money.
Posted on Reply
#32
hardcore_gamer
Recus: Looks like fanboys will never learn.
Yes, fanboys will never learn. They always try to justify a company even when it takes away an important feature from the users.

Enthusiasts and most people who value their money overclock their GPU to get the maximum performance out of it. It is very important in high-end cards like the 680.

Intel took overclocking away from non-K chips, and people still pay the premium for the K series. But things are different here in the GPU arena, where both players are very close in performance.
Posted on Reply
#33
BigMack70
The GTX 6xx series cards have always been a giant "F U" to consumers...

They couldn't deliver on GK110 and instead sold a $300 chip at a $500 price point, they complicated overclocking with their "GPU Boost" nonsense, and they screwed enthusiasts by not allowing voltage tuning on any of their cards.

I feel bad for uninformed folks who bought something like the Classified or the MSI Lightning 680.

I generally like Nvidia's cards, but the GTX 6xx series is a load of crap and a huge step backwards from the GTX 5xx series IMO.
Posted on Reply
#34
Solaris17
Super Dainty Moderator
Knowing EVGA, though, in future releases they will probably do something like MSI and provide voltage read pads and pads you can attach trim pots to.

sneaky sneaky.
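For anyone curious how those pads work: a hardware volt-mod just shifts the VRM controller's feedback divider. A rough sketch of the math in Python (the Vref and resistor values are illustrative, not from any specific card):

    def vout(vref, r_top, r_bottom):
        # The controller drives Vout until the divider tap equals Vref:
        # Vout = Vref * (1 + r_top / r_bottom)
        return vref * (1 + r_top / r_bottom)

    def parallel(r1, r2):
        return r1 * r2 / (r1 + r2)

    VREF, R_TOP, R_BOTTOM = 0.8, 1000.0, 2000.0             # volts, ohms, ohms
    print(vout(VREF, R_TOP, R_BOTTOM))                      # stock: 1.20 V
    # A trim pot soldered across the bottom resistor lowers its effective
    # value, so the controller regulates to a higher output:
    print(vout(VREF, R_TOP, parallel(R_BOTTOM, 20000.0)))   # ~1.24 V

No software, no BIOS, nothing for a driver to veto.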
Posted on Reply
#35
TheLostSwede
News Editor
Benetanegia: Do you have a proof link or something? It seems to me that the only reason they didn't release it is that a GK104-based MARS is simply redundant, as the GTX 690 is already based on the full chip and clocked like the GTX 680. Afaik Asus told TPU that the card was never meant to release:



www.techpowerup.com/171202/ASUS-ROG-MARS-III-Dual-GTX-680-PCB-Pictured.html
Not as such, but I spoke to a friend of mine at Asus at MOA of all things who told me this. I'm pretty sure he knows what he's talking about, but of course the official excuse would be something that doesn't make Nvidia look bad...
Posted on Reply
#36
3870x2
There might be a reason other than money and control that NV is doing this.
Posted on Reply
#37
[H]@RD5TUFF
This really pisses me off, but it's not like AMD is a real alternative with the shittiest drivers in history.
Posted on Reply
#38
Easy Rhino
Linux Advocate
TheLostSwede: In related news, Asus wasn't allowed to make their dual-GPU MARS card either, as Nvidia wouldn't let them...
So apparently the chip makers are now controlling what their "partners" are allowed to do, or not to do with the chips they buy from them.
this has always been the case.
Posted on Reply
#39
hv43082
Wow, the pro-AMD people are all over this thread like flies over poop!
Posted on Reply
#40
dj-electric
Because of Nvidia my Lightning 680s can't overvolt past 1.175 V.

This is bullshit.
Posted on Reply
#41
alwayssts
You want unbiased? I'll flame them all. Ready?

Go.
Benetanegia: What's the point? Use the tools and you'll achieve the same results as the rest of the world (with slight differences based on luck), if it all comes down to that.
You, sir, hit a raw nerve and said a lot of important things that people really need to understand. As a guy who grew up with the ABIT boards that started the initial craze, it really does make me sad. The reality, as you know and as many people don't seem to understand, is that overclocking is for all intents and purposes dead or dying... killed by the desire to spur upgrades. Every major player is guilty.

Remember when AMD and Nvidia, over the course of a couple of generations, started limiting TDP per product for segmentation and sold it as a feature? Remember when they started doing set voltages and turbo modes (which in the case of the 7900 series, for example, actually makes it a WORSE overall product they sell for more money)? Remember when Intel essentially made a product that doesn't really work worth a damn above the typical best perf/clock ratio of ~1.175 volts (Ivy Bridge)?

As you infer by stating "everyone gets the same results", the diluted form of overclocking we have today is more of a product feature: it allows for lower and looser binning, lets them skimp on the BOM of lower-end SKUs, and is built into an inflated price on upper-end ones... exactly the opposite of the initial market and purpose. I was fine with the advent of software voltage control (which took advantage of the clock/voltage potential of a process with realistic cooling) versus soldering to change the properties of a resistor... less release of the magic smoke that way. What pisses me off are things like strict conformance to the PCI-E (etc.) power specs or, even more bullshit, limiting TDPs below them. Not doing the latter, for instance, saved GF104/110 from being a colossal failure and made those products quite appealing. If you want proof, look at the stock TDP and power consumption of those products, how many PCI-E connectors they had, how well they clocked... and their power consumption after overclocking. Now that Kepler doesn't suck... gone.
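For reference, the PCI-E spec ceilings being conformed to are 75 W from the slot, 75 W per 6-pin and 150 W per 8-pin connector, so the tally per board layout looks like this (quick Python sketch):

    CONNECTOR_W = {"6-pin": 75, "8-pin": 150}  # spec ceiling per connector, watts
    SLOT_W = 75                                # spec ceiling for the slot itself

    def spec_ceiling(connectors):
        return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

    print(spec_ceiling(["6-pin", "6-pin"]))  # reference GTX 680: 225 W ceiling
    print(spec_ceiling(["8-pin", "8-pin"]))  # GTX 680 Classified: 375 W ceiling

A card TDP-capped well below its connector ceiling is leaving headroom on the table by design, which is exactly the complaint.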

Another example,

AMD knows that if you buy a 7870, you are likely going to run it at ~1200+ MHz... the max the chip will do, limited by process tech, not TDP.
They also know you'd like to buy a 7850 and get 7870 performance, so they institute TDP restrictions (130 W) and BIOS locks (1050 MHz) to keep its average performance just under the former's stock ability (7850 @ 1075 MHz ~ 7870 @ 1 GHz). That way the bottom-line price difference is preserved while the upper SKU still appears worthy at stock.
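Putting rough numbers on that (real shader counts, but a naive shaders x clock x 2 FLOPs-per-clock throughput model; game performance scales less than linearly with shader count, so the capped 7850 lands even closer in practice):

    # HD 7850: 1024 stream processors; HD 7870: 1280 stream processors
    def gflops(shaders, mhz):
        return 2 * shaders * mhz / 1000.0  # 2 FLOPs per SP per clock (FMA)

    print(gflops(1024, 1050))  # BIOS-capped 7850:         ~2150 GFLOPS
    print(gflops(1280, 1000))  # stock 7870:               ~2560 GFLOPS
    print(gflops(1024, 1200))  # uncapped 7850 @ 1200 MHz: ~2458 GFLOPS

The 1050 MHz lock is what keeps the cheaper card from landing on top of the dearer one.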

Even worse is when you can smell WHY they do these things on the grand scale. In this case it's obvious the 8000 series will have 150 W / 225 W max TDP parts (the 7850 is 130 W, the 7950 200 W... 10% less enabled logic and ~20% less TDP than the 70 SKUs...).

Why would anyone buy a 150 W part that outperforms a stock 7870 by 10% if a 7850 already could a year earlier for the same price? They wouldn't. Now they can sell it as a huge efficiency improvement (more shaders than a 7870 at a lower initial clock, with the overclocking potential granting a net gain on paper but not in actual power consumption beyond the scant extra transistors for the CU difference), which again... is bullshit. While GK104 is obviously well designed, no doubt its TDP, just like the 7950's and 8950's, is set so the GTX 770 will look justifiably better at 225 W, again with more units at a lower sweet clock/voltage spot.

In short...it's all a big fucking ruse. Any average Joe that thinks they are an overclocker anymore is either lying to themselves or extremely delusional.
Posted on Reply
#42
cadaveca
My name is Dave
alwayssts: In short... it's all a big fucking ruse. Any average Joe that thinks they are an overclocker anymore is either lying to themselves or extremely delusional.
Really, I think the complaining about it is just as stupid, though, since "overclockers" wanted OCing to go mainstream, pushed the marketing reps, and then it happened. And now that it's happened, everyone wants things back the way they were, since companies have taken steps to ensure that offering these features doesn't bankrupt them.


So, ya got what ya wanted, but didn't consider the consequences....



Now, I am not directing this at anyone specifically, this is just my general feeling. But then, when I try to tell people how to OC to get a bit extra, they're doing something COMPLETELY different.


OC is not dead. It's just more cleverly hidden. If you don't own a soldering iron and you think you're an overclocker, you're sorely mistaken. You still need that iron.

Perfect example: all the claims of IVB running hot... no, actually it doesn't. You just failed to give it proper cooling. :laugh:
Posted on Reply
#43
newtekie1
Semi-Retired Folder
Well, after the last generation, with idiots pushing 1.2 V+ through 4-phase cards and then bashing nVidia when the VRMs popped, I can see why nVidia wants to limit voltage control now...

Obviously the strategy of allowing people free voltage control, even if maxing the slider out will likely kill the card, didn't work, because idiots will just max the slider out and then bitch when the card pops. So now we have nVidia limiting voltages again. Thank the idiots that overvolted their cards too high and then bitched.
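The arithmetic behind those popped VRMs is simple enough (illustrative loads, not measurements from any particular card):

    def amps_per_phase(watts, vcore, phases):
        # Core current is roughly power / voltage, split across the phases
        return watts / vcore / phases

    print(amps_per_phase(225, 1.05, 4))  # stock-ish load:           ~54 A per phase
    print(amps_per_phase(350, 1.21, 4))  # overvolted + overclocked: ~72 A per phase

If each power stage is only rated for, say, 60 A continuous, the second case cooks it.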
Posted on Reply
#44
linoliveira
The cards are made to run at stock voltage and frequencies, so if you push 0.001 V and 1 MHz more, you should void the product's warranty and not get a replacement straight away. Am I wrong?
I never had to return a card for OCing because I never overvolt badly, but this is my understanding of it.
If I am correct, why bother blocking voltage tweaking? Just let people fry their cards and put themselves back in the market for more! :laugh:
Posted on Reply
#45
MxPhenom 216
ASIC Engineer
newtekie1: Well, after the last generation, with idiots pushing 1.2 V+ through 4-phase cards and then bashing nVidia when the VRMs popped, I can see why nVidia wants to limit voltage control now...
Obviously the strategy of allowing people free voltage control, even if maxing the slider out will likely kill the card, didn't work, because idiots will just max the slider out and then bitch when the card pops. So now we have nVidia limiting voltages again. Thank the idiots that overvolted their cards too high and then bitched.
I was just about to say that....
Posted on Reply
#46
Jstn7477
newtekie1: Well, after the last generation, with idiots pushing 1.2 V+ through 4-phase cards and then bashing nVidia when the VRMs popped, I can see why nVidia wants to limit voltage control now...

Obviously the strategy of allowing people free voltage control, even if maxing the slider out will likely kill the card, didn't work, because idiots will just max the slider out and then bitch when the card pops. So now we have nVidia limiting voltages again. Thank the idiots that overvolted their cards too high and then bitched.
I must say the same about all the people still wanting to unlock HD 6950s long after their release, as if the cards now are exactly the same as the ones that originally unlocked. I'm tired of people bricking their cards without even trying to back up the original BIOS, then coming on here and crying because they think upgrading a video card's BIOS is always necessary and always yields hidden performance improvements. :banghead:
Posted on Reply
#47
TheMailMan78
Big Member
3870x2: There might be a reason other than money and control that NV is doing this.
Everyone is always quick to bash without looking at the reasons why someone did something. I would like to know the reasoning behind the removal instead of assuming it was out of "greed"... and even if it was greed, NVIDIA is well within its rights. They designed the chip. They make the drivers. It's their property. EVGA is just a manufacturing partner making money off NVIDIA's coattails.

But again we don't know WHY NVIDIA asked EVGA to remove it. Might be for safety reasons. Who knows.
Posted on Reply
#48
Easy Rhino
Linux Advocate
TheMailMan78: Everyone is always quick to bash without looking at the reasons why someone did something. I would like to know the reasoning behind the removal instead of assuming it was out of "greed"... and even if it was greed, NVIDIA is well within its rights. They designed the chip. They make the drivers. It's their property. EVGA is just a manufacturing partner making money off NVIDIA's coattails.

But again we don't know WHY NVIDIA asked EVGA to remove it. Might be for safety reasons. Who knows.
i see a lot of people (not just on TPU) criticize business decisions by amd, intel, nvidia, whomever.

most of these people sit around and contribute NOTHING to society. what gives them the right to be critical of anybody?? boggles my mind...
Posted on Reply
#49
TheMailMan78
Big Member
Easy Rhino: i see a lot of people (not just on TPU) criticize business decisions by amd, intel, nvidia, whomever.

most of these people sit around and contribute NOTHING to society. what gives them the right to be critical of anybody?? boggles my mind...
Common sense is not so common anymore. :toast:
Posted on Reply
#50
Easy Rhino
Linux Advocate
TheMailMan78: Common sense is not so common anymore. :toast:
i don't even think it has to do with common sense. these people just consume. they constantly consume and judge other people's work without contributing anything themselves.
Posted on Reply