
Intel IGPs Use Murky Optimisations for 3DMark Vantage

btarunr

Editor & Senior Moderator
Apart from being the industry's leading 3D graphics benchmark application, 3DMark has a long history of hardware manufacturers cheating it with application-specific optimisations that go against Futuremark's guidelines in order to boost scores. Often, this is done by drivers detecting the 3DMark executable and downgrading image quality, so the graphics processor handles a smaller processing load from the application and ends up with a higher performance score. Time and again, such application-specific optimisations have tarnished 3DMark's credibility as an industry-wide benchmark.

This time around, it's neither of the two graphics giants in the news for the wrong reasons; it's Intel. Although the company has a wide consumer base for its integrated graphics, the discerning media user or very casual gamer may find it best to opt for an integrated graphics (IGP) solution from NVIDIA or AMD instead. Such choices rely upon reviews evaluating an IGP's performance at accelerating video (it's common knowledge that Intel's IGPs lean heavily on the CPU for smooth video playback, while competing IGPs fare better at hardware acceleration), and at synthetic and real-world 3D benchmarks, among other application-specific tests.

Here's the shady trick Intel is using to inflate its 3DMark Vantage score: according to an investigation by Tech Report, the drivers, upon detecting the 3DMark Vantage executable, change the way they normally function and ask the CPU to pitch in with its processing power, gaining significant performance. While the application's image quality isn't affected, the load on the IGP is effectively reduced, deviating from the driver's usual working model. This violates Futuremark's 3DMark Vantage Driver Approval Policy (read here), which says:
With the exception of configuring the correct rendering mode on multi-GPU systems, it is prohibited for the driver to detect the launch of 3DMark Vantage executable and to alter, replace or override any quality parameters or parts of the benchmark workload based on the detection. Optimizations in the driver that utilize empirical data of 3DMark Vantage workloads are prohibited.
There's scope for ambiguity there. To prove that Intel's drivers indeed don't play fair with 3DMark Vantage, Tech Report put an Intel G41 Express chipset-based motherboard, running Intel's latest 15.15.4.1872 Graphics Media Accelerator drivers, through 3DMark Vantage 1.0.1. The reviewer simply renamed the 3DMark executable, in this case from "3DMarkVantage.exe" to "3DMarkVintage.exe", and there you have it: a substantial performance difference.
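The same rename-and-rerun check is easy to script. Below is a minimal sketch in Python; the install path and file names are assumptions for illustration (adjust them for your own system), and the 3DMark scores themselves are still read off the application:

import shutil
import subprocess
from pathlib import Path

# Assumed default install path -- not taken from the article.
INSTALL_DIR = Path(r"C:\Program Files\Futuremark\3DMark Vantage")
ORIGINAL = INSTALL_DIR / "3DMarkVantage.exe"
RENAMED = INSTALL_DIR / "3DMarkVintage.exe"

# Same binary, different name: any driver optimisation keyed on the file
# name should apply to the first run but not the second.
shutil.copy2(ORIGINAL, RENAMED)

for exe in (ORIGINAL, RENAMED):
    print(f"Running {exe.name} -- note the 3DMark score when it finishes")
    subprocess.run([str(exe)], check=True)  # blocks until the benchmark exits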



A perfmon (Performance Monitor) log of the benchmark as it progressed shows stark irregularities in the CPU load graphs between the two runs during the GPU tests, although the two remained largely the same during the CPU tests. An example of one such graph is below:



When asked to comment on these findings, Intel replied that its drivers are designed to utilise the CPU for some parts of 3D rendering, such as geometry rendering, when pixel and vertex processing saturates the IGP. Call of Juarez, Crysis, Lost Planet: Extreme Conditions, and Company of Heroes are among the other applications the driver detects before changing the way the entire graphics subsystem works. A similar test run on Crysis Warhead yields a similar result:



Currently, Intel's 15.15.4.1872 drivers for Windows 7 aren't on Futuremark's list of approved drivers; in fact, none of Intel's Windows 7 drivers are. For a complete set of graphs, refer to the source article.
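For readers who want to reproduce the CPU-load comparison without perfmon, a rough stand-in is to sample system-wide CPU utilisation while the benchmark runs. A minimal sketch, assuming Python with the psutil package and a default install path (both are assumptions, not part of Tech Report's setup):

import csv
import subprocess
import time

import psutil  # third-party package, installed separately

BENCHMARK = r"C:\Program Files\Futuremark\3DMark Vantage\3DMarkVantage.exe"  # assumed
SAMPLE_INTERVAL = 1.0  # seconds between samples

proc = subprocess.Popen([BENCHMARK])
with open("cpu_load.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "cpu_percent"])
    start = time.time()
    while proc.poll() is None:
        # cpu_percent(interval=...) blocks for the interval and returns the
        # system-wide utilisation over that window, much like a perfmon counter.
        load = psutil.cpu_percent(interval=SAMPLE_INTERVAL)
        writer.writerow([round(time.time() - start, 1), load])

print("CPU load log written to cpu_load.csv")

Running it once against the original executable and once against a renamed copy yields two logs that can be plotted side by side, mirroring the comparison in the source article.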

View at TechPowerUp Main Site
 
Thanks for the news. That's real dirty. Class action, anyone?
 
Bla, bla, bla. But do they change image quality?
 
Bla, bla, bla. But do they change image quality?

It detects the application and changes vertex processing settings. Such a thing is prohibited under Futuremark's policy too. The GPU (IGP) itself isn't getting the graphics processing load it should be getting for the test to be fair and valid.
 
Bla, bla, bla. But do they change image quality?

it says it right there

this is done by drivers detecting the 3DMark executable and downgrading image quality, so the graphics processor handles a smaller processing load from the application and ends up with a higher performance score.



:laugh:
 
Why? Isn't the driver's job to provide the best possible experience, while utilizing all possible resources? If so, then this is exactly what Intel has done.

Now, on the other hand, if it does indeed lower the quality of the final product, and this wasn't stated anywhere, then by all means they should get sued.

The thing that really bugs me here is how depressingly and catastrophically bad Intel's IGPs are in reality.
 
No, that's not the way it should work. NVIDIA's and AMD's drivers don't leave the CPU to do the parts of the graphics processing that Intel's drivers are making the CPU do. The IGP itself is weaker than it appears to be. Offloading work to the CPU isn't even a standard model for Intel's drivers, as proven by running the applications renamed.
 
Why? Isn't the driver's job to provide the best possible experience, while utilizing all possible resources? If so, then this is exactly what Intel has done.

Now, on the other hand, if it does indeed lower the quality of the final product, and this wasn't stated anywhere, then by all means they should get sued.

The thing that really bugs me here is how depressingly and catastrophically bad Intel's IGPs are in reality.

It's not providing the best experience. It's downgrading the image to get a better score, and that's only in Futuremark apps. So you think "wow, this IGP is amazing", then you hit some games and get slapped in the face, as the Futuremark product's score led you to believe it was far more powerful.
 
the bastards!
 
So... Intel's IGP's are so astoundingly shitty that their drivers actually offload processing to the CPU to help out?

I'm... almost willing to let them have that, solely out of pity.
 
I will always hate Intel IGPs. Drivers suck, and they are outdated by the time they are released.
 
Intel IGP's arent even good nuff for nettops
 
Yet when you see computers or notebooks on those home shopping channels, you got some really retarded nerd touting the 128mb of share ram being dedicated to the intel G(I suck gP) graphics. Really retarded.
 
FutureMark just needs to have their installer randomize the executable's name.
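For what it's worth, that would take very little code on Futuremark's side. A hypothetical launcher sketch, assuming the benchmark engine is a single executable named 3DMarkVantage.exe (an assumption for illustration, not how 3DMark is actually packaged):

import shutil
import subprocess
import uuid
from pathlib import Path

ENGINE = Path("3DMarkVantage.exe")  # assumed original binary

def launch_with_random_name() -> None:
    # Copy the engine to a randomised name so file-name-based detection fails.
    temp_name = ENGINE.with_name(f"bench_{uuid.uuid4().hex}.exe")
    shutil.copy2(ENGINE, temp_name)
    try:
        subprocess.run([str(temp_name)], check=True)
    finally:
        temp_name.unlink(missing_ok=True)  # clean up the temporary copy

launch_with_random_name()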
 
Nvidia and AMD have been doing the same thing for years now - just with better techniques. If they are allowed, I don't see any reason why Intel shouldn't be allowed too. It's either everyone or nobody.
 
Intel IGPs have always been called integrated crapstics; even 2-generation-old AMD IGPs outclass Intel's finest :nutkick:
 
Yes, it detects the application and changes vertex processing settings. Such a thing is prohibited under Futuremark's policy too. The GPU (IGP) itself isn't getting the graphics processing load it should be getting for the test to be fair and valid.

Yes, it is against Futuremark's rules, but is it morally wrong to do it this way? I've always said, fuck benchmarks, all I care about is game performance, and it seems Intel was more worried about improving performance in games and simply applied the same optimizations to Futuremark's tests.

it says it right there

:laugh:

Funny how when you leave off that one word "often" it changes the whole sentence's meaning. If only the original sentence was the one you edited it to...

Why? Isn't the driver's job to provide the best possible experience, while utilizing all possible resources? If so, then this is exactly what Intel has done.

Now, on the other hand, if it does indeed lower the quality of the final product, and this wasn't stated anywhere, then by all means they should get sued.

The thing that really bugs me here is how depressingly and catastrophically bad Intel's IGPs are in reality.

Agreed, as far as I've read, the article mentions nothing about Intel actually lowering image quality. It seems their mistake was offloading the work to the CPU, which is also against the rules. This has nothing to do with lowering the quality of the final product.

If it helps the shitty performance of Intel's IGPs, I say let them do it, but they should remove the optimization from the Futuremark exes, just to adhere to the rules.

It's not providing the best experience. It's downgrading the image to get a better score, and that's only in Futuremark apps. So you think "wow, this IGP is amazing", then you hit some games and get slapped in the face, as the Futuremark product's score led you to believe it was far more powerful.

Maybe I missed something in the article, where does it say that it is downgrading the image to get a better score?
 
Intel IGPs have always been called integrated crapstics; even 2-generation-old AMD IGPs outclass Intel's finest :nutkick:

I always get a laugh when Intel gpus are talked about in reference to gaming.
 
doesn't really matter guys, it's not like you're gonna play COD with an IGP
 
This would be understandable if Intel IGPs were any good, but they aren't, so all this does is make Intel look stupid. It would, however, go a long way to explaining why Intel's IGP drivers are consistently a pile of suck.

And for heaven's sake, they use the EXE filename to detect when to make their IGP look better... a half-decent first-year university student wouldn't use such a simplistic approach...
 
And for heaven's sake, they use the EXE filename to detect when to make their IGP look better... a half-decent first-year university student wouldn't use such a simplistic approach...

EXE-name-based optimization is a pretty common practice for ATi and nVidia... How many times do we see suggestions to rename the EXE to get better performance or Crossfire/SLi support when a new game comes out? You want to know why that works? Because the drivers detect the EXE name and apply optimizations.

It just happens to be against Futuremark rules. Though logically, I have to wonder how much that rule makes sense. I mean, they are allowed to do it in real games, and 3DMark is supposed to be a benchmark to measure game performance....so why can't they apply the same optimizations to 3DMark as they do to real games?
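To make the mechanism concrete, per-application profiles essentially amount to a lookup keyed on the executable's file name. The sketch below is purely illustrative, not real driver code from any vendor, and the profile contents are invented:

# Conceptual illustration of exe-name-keyed application profiles.
APP_PROFILES = {
    "crysis.exe":        {"multi_gpu_mode": "afr", "shader_replacement": True},
    "3dmarkvantage.exe": {"offload_vertex_processing_to_cpu": True},
}

DEFAULT_PROFILE = {"multi_gpu_mode": "single", "shader_replacement": False}

def profile_for(exe_name: str) -> dict:
    """Return the settings the 'driver' would apply for this executable."""
    return APP_PROFILES.get(exe_name.lower(), DEFAULT_PROFILE)

# Renaming the binary is enough to fall back to the default path:
print(profile_for("3DMarkVantage.exe"))  # detected -> special-cased settings
print(profile_for("3DMarkVintage.exe"))  # not detected -> defaults

This is exactly why renaming an EXE can change performance or SLI/Crossfire behaviour: the lookup simply misses.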
 
so why can't they apply the same optimizations to 3DMark as they do to real games?

Because the idea is to benchmark the GPU only. If you're benchmarking the GPU + CPU, how are users supposed to know which is doing more of the work? The idea of the synthetic benchmark is to take all other elements out of the equation and analyze the GPU's raw power, which is why we still run them instead of only game benchmarks. What if Intel doesn't optimize for a game you play? Well, you're out in the cold, because you bought an IGP which you thought performed 25% better than it really does.
 
Intel... How am I not surprised... Trying to one-up every other company with these little tweaks to make themselves look better...
 
Because the idea is to benchmark the GPU only. If you're benchmarking the GPU + CPU, how are users supposed to know which is doing more of the work? The idea of the synthetic benchmark is to take all other elements out of the equation and analyze the GPU's raw power, which is why we still run them instead of only game benchmarks. What if Intel doesn't optimize for a game you play? Well, you're out in the cold, because you bought an IGP which you thought performed 25% better than it really does.

Well...that isn't really the idea behind benchmarking. Yes, that is what they have become thanks to Futuremark turning it into more of a competition than a true benchmark. However, benchmarking is supposed to give you an idea of game performance, that is what 3DMark started out as. If the benchmark was truly all about raw GPU power, then there wouldn't be CPU tests included in it.

And we all know the various cards perform differently in various games. So the argument that you bought something because you thought it performed better based on one benchmark doesn't work. Just an example of the flaws in your logic: what if I went and bought an HD4890 because it outscores a GTX260 216 in 3DMark06... but when I fired up Far Cry, the GTX260 performed better... It is all about optimizations these days, and no one should be buying a card based on 3DMark scores... to do so is silly.

Now, if they applied this optimization to just 3DMark, I would say it is wrong, however it has been applied to most games as well. So, IMO, it isn't really that wrong.
 
you would be surprised what else, other than exe name, you could use for app detection - that pretty much nobody on the planet is ever gonna figure out
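One plausible example, purely hypothetical: fingerprinting the binary itself rather than its name, which survives a simple rename. A sketch (the hash value below is a placeholder, not a real 3DMark hash):

import hashlib
from pathlib import Path

def exe_fingerprint(path: str) -> str:
    """SHA-256 of the binary -- unaffected by renaming the file."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

KNOWN_BENCHMARKS = {
    "placeholder-sha256-of-3dmarkvantage": "3DMark Vantage 1.0.1",  # not a real hash
}

def identify(path: str) -> str:
    return KNOWN_BENCHMARKS.get(exe_fingerprint(path), "unknown application")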
 