
Intel EIST disabled/enabled

Turning off EIST will do exactly nothing for your performance. It just scales CPU clock and voltage depending on load, which in games basically means 100% of the available clock at all times anyway. What can affect your performance even at full utilization are C-states. You see, even under full load the CPU can independently shut off parts of itself to save power; it can push unused or underutilized cores into sleep states even mid-workload. Sometimes it behaves oddly and prefers power saving over performance, and waking cores and other parts of the CPU back up causes delays, which in games quickly turn into frame time issues. In most cases people don't even notice it, but in certain games it can cause problems. Forcing the CPU to go no deeper than C1 should help, but you can also turn C-states off entirely. EIST is what gives you the most power saving; C-states are just extra on top.
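If you want to see EIST doing its thing, you can watch the reported clock while idle and under load. A minimal sketch using the third-party psutil package (the load loop and sample timings are arbitrary choices, not anything from this post):

```python
import threading
import time

import psutil  # third-party: pip install psutil

def sample_freq(seconds=3.0, interval=0.5):
    """Sample the CPU frequency (MHz) the OS reports for a few seconds."""
    readings = []
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        readings.append(psutil.cpu_freq().current)
        time.sleep(interval)
    return readings

print("idle clocks:", sample_freq())

# Spin some threads so EIST ramps the clock back up (with Python's GIL
# only one core is truly busy, but that is enough to raise the clock).
stop = time.monotonic() + 4.0
def burn():
    while time.monotonic() < stop:
        pass

workers = [threading.Thread(target=burn) for _ in range(psutil.cpu_count())]
for w in workers:
    w.start()
print("load clocks:", sample_freq(seconds=3.0))
for w in workers:
    w.join()
```

With EIST enabled the idle readings should sit well below the load readings; with it disabled, both stay pinned near the maximum clock.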

I have found this to be true, mostly with older setups (Ivy Bridge), but I think it may also apply to newer chips. Disabling C-states used to give me a boost in FPS and also dropped my memory latency on my LGA 1150 setup.
 

Sounds to me like there is always a tradeoff: performance or green. I guess that is why I turned off all power saving functions on my desktop when I overclocked the CPU and RAM...
 
For everyone who advises me to turn off the FPS counter: the drops are very visible (and rather annoying with an expensive PC) without it as well. The point isn't a number in the corner.
A friend of mine had FPS drops with an i5 4670K at 4.1 GHz using a 980 Ti as well; swapping the CPU for an i7 4790K solved his problems and also increased his maximum FPS. Simply do it, it's not a purchase you can regret at all. i5s, especially the low-clocked ones, aren't the best gaming CPUs anymore. I'd say a high-clocked i5 is the absolute minimum for a high-end gaming PC, and it's still worse than an i7 of the same generation, because Hyper-Threading (or having more than just 4 threads) now pays off. The best gaming CPUs are 6 to 10 cores now anyway. That all said, BF1 uses all 12 threads my i7 3960X has at 4.5 GHz at almost 100% utilization on one particular map, and usage is still high on all the other maps, around 60-80%.
 
This thread is insane; I don't want to bother quoting everyone.

Vsync being double buffered in an AAA game like Battlefield? The same Battlefield that has had triple buffering since BC2? Come on... how many modern games DON'T have triple buffering? 50 is obviously not 30 either, so what kind of useless "turn off vsync" reply is that?

BF1 uses more than 4 threads; this is confirmed by several sites, and I have witnessed it in person on a stock 2500K + GTX 970... the GPU was only at 60% usage and the game was full of stutters in multiplayer! (all four CPU cores at 90-100% each)

Why didn't the OP monitor CPU + GPU + VRAM + RAM usage? You don't think about very specific tweaks until you actually know where to tweak in the first place (but of course, the easiest and most effective answer is to overclock if you don't want to turn down in-game settings).
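For the record, a rough sketch of that kind of logging (psutil for CPU/RAM, nvidia-smi for the GPU; the one-second interval is an arbitrary pick):

```python
import subprocess
import time

import psutil  # third-party: pip install psutil

# Log per-core CPU load, RAM usage, and GPU/VRAM utilization once per
# second while the game runs, to see which resource is the bottleneck.
while True:
    cores = psutil.cpu_percent(percpu=True)   # % load per logical core
    ram = psutil.virtual_memory().percent     # % of RAM in use
    gpu = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    ).stdout.strip()                           # "<gpu %>, <vram MiB>"
    print(f"cores={cores} ram={ram}% gpu/vram={gpu}")
    time.sleep(1)
```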

Now, since it's a simple BIOS setting, he could have tried it to see if there's an effect before asking.

Edit: actually, suggesting max performance in NVCP or in the BIOS is also wrong; you don't do that until you've proven the clock speeds have been dipping.
 
A dip from 60 to 50 fps is NOT visible. I don't care who you are. You would never know it was happening without a frame counter telling you it was.
 
@kn00tcn
I thought triple buffering is OpenGL-only, or are you suggesting D3DOverrider?

Or what is your conclusion? If I talked bullshit I am open to suggestions, but from your post all I see is "you have to overclock"?
 
A dip from 60 to 50 fps is NOT visible. I don't care who you are. You would never know it was happening without a frame counter telling you it was.
I couldn't tell the difference between 60 and 50 fps just by looking at the monitor, but I could easily feel it, in some games at least. In Arma 3 such a drop makes gameplay quite choppy and mouse movement becomes less smooth. Though I've got over 2000 hours in that game and I know it better than my own bedroom :)
 
A dip from 60 to 50 fps is NOT visible. I don't care who you are. You would never know it was happening without a frame counter telling you it was.

OK.
 
Tibor Hazafi, if you truly want to find the cause of the FPS stuttering, run sequential tests in no fewer than 5 different games. In each of them, run a benchmark of a fixed length and repeat it no fewer than five times.

Report whether the stuttering repeats in every game, in just a single game, or in a few games. Report whether the stuttering disappears after the second run of the same sequence.

Enable virtual memory, set up manual SSD caching with no less than 8 GB of space, and run those tests again.
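Something like this could drive the repeat runs; the game commands here are placeholders, not real launch options:

```python
import statistics
import subprocess
import time

# Hypothetical benchmark commands: substitute each game's actual
# benchmark launcher / command line here.
BENCHMARKS = {
    "game_a": ["game_a.exe", "-benchmark"],
    "game_b": ["game_b.exe", "-benchmark"],
}
RUNS = 5  # repeat each fixed-length benchmark five times

for name, cmd in BENCHMARKS.items():
    times = []
    for run in range(1, RUNS + 1):
        start = time.monotonic()
        subprocess.run(cmd, check=True)  # each pass is a fixed-length benchmark
        times.append(time.monotonic() - start)
        print(f"{name} run {run}: {times[-1]:.1f}s")
    # A big gap between run 1 and the rest points at caching/warm-up effects.
    print(f"{name}: mean {statistics.mean(times):.1f}s, "
          f"stdev {statistics.stdev(times):.1f}s")
```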
 
I have set RenderDevice.RenderAheadLimit to 2 in the console, and everything is smooth as butter now.
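For anyone who wants that to stick between sessions: earlier Frostbite titles read console commands from a user.cfg in the game's install folder, and assuming BF1 behaves the same way, a one-line file should do it:

```
RenderDevice.RenderAheadLimit 2
```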
 
A dip from 60 to 50 fps is NOT visible. I don't care who you are. You would never know it was happening without a frame counter telling you it was.
you are... making up BS, seriously. No personal offense, but offense to this concept (did you forget all the multi-GPU stuttering issues of the past? the frame counters did not actually show any problem, but the human brain/eye did, therefore you know it's happening, and any counter you add is merely confirmation)

ANYTHING inconsistent is noticeable, which means on a 60 Hz monitor, not having exactly 60 fps will stutter or tear; this includes 59 or 61 fps, any deviation up or down not aligned to the refresh.

On a FreeSync or G-Sync monitor, such stutters are eliminated, and a variable framerate becomes less noticeable, but it's still technically there.

If you're always dipping between 50 and 60, then you framecap to 50, especially on FreeSync or G-Sync; now everything is good and consistent.
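The idea behind capping at the dip: pad every frame out to the same 20 ms slot so pacing stays constant. A toy loop sketching the concept (the 12 ms fake render time is an assumption, not from this thread):

```python
import time

TARGET = 1 / 50  # cap at 50 fps: a fixed 20 ms budget per frame

def render_frame():
    """Stand-in for real rendering work; pretend it takes ~12 ms."""
    time.sleep(0.012)

deadline = time.perf_counter()
for frame in range(100):
    render_frame()
    deadline += TARGET
    # Sleep out whatever is left of the 20 ms slot, so every frame is
    # presented on the same cadence instead of wobbling between 50 and 60.
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)
```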

Obviously what's visually happening in the game will have a different effect, so the smoothly side-scrolling backgrounds in Sine Mora will make any minor stutter painfully obvious, while Crysis will be less obvious, since mouse aim is jerky by nature along with all the foliage moving around on screen.

@kn00tcn
I thought triple buffering is OpenGL-only, or are you suggesting D3DOverrider?

Or what is your conclusion? If I talked bullshit I am open to suggestions, but from your post all I see is "you have to overclock"?
Triple buffering is a concept that developers can choose to enable; it's a standard part of most engines and graphics APIs.

As you know, the user can also force it on for OpenGL in the driver control panels, or with D3DOverrider as an additional option as needed, but many games support TB already (there was a string of Unreal Engine 3 ones a few years ago in particular that didn't; I've also noticed it happen more often on NVIDIA, so eventually some people assumed vsync sucks).

Double buffering is the problem concept, which results in that mathematically divisible drop to 30, but nobody should accept this, not game devs, not the user, unless input latency is a great concern and the hardware is capable of outputting fps well over the refresh rate (at which point, just use adaptive vsync, so that you tear below refresh instead of dropping all the way to 30).
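A toy model of that arithmetic: under double-buffered vsync a finished frame has to wait for the next vblank, so any render time over 16.7 ms occupies two refresh intervals and fps halves to 30, while triple buffering degrades gracefully. The constant render times below are assumed numbers, just to show the effect:

```python
import math

REFRESH = 1 / 60  # 60 Hz: one scanout every ~16.7 ms

def vsync_fps(render_time, triple):
    """Effective fps under vsync for a constant per-frame render time."""
    if not triple:
        # Double buffering: the GPU stalls until the next vblank, so each
        # frame occupies a whole number of refresh intervals.
        return 1 / (math.ceil(render_time / REFRESH) * REFRESH)
    # Triple buffering: the GPU keeps working in the spare buffer, so
    # throughput tracks the render time (capped at the refresh rate).
    return min(1 / render_time, 1 / REFRESH)

for ms in (15, 18, 25):
    rt = ms / 1000
    print(f"{ms} ms/frame -> double: {vsync_fps(rt, False):.0f} fps, "
          f"triple: {vsync_fps(rt, True):.1f} fps")
```

At 18 ms per frame this prints 30 fps for double buffering but about 55.6 fps for triple, which is exactly why a 50-something reading rules out a double-buffered vsync collapse.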
 