
Very poor performance with Intel Core i5-11600KF at stock 4.9 GHz

I was forced to overclock this CPU to 5.6 GHz because at stock it wasn't even able to play 4K 60 FPS YouTube videos! And CPU usage was over 70% in games like Minecraft (about to bottleneck!) even with a GT 610!
Is this kind of performance expected from this CPU? Also, the CPU uses around 75 W at 3% usage when idling (at stock 4.9 GHz, absolutely NO overclocking); it's really power hungry. Is that normal too?
That usage sounds very...weird.

I'm not doing anything overly demanding, just playing music on Spotify, and have Discord, Chrome, Epic, Spotify, Steam, Calculator, Task Manager, and HWiNFO open.

Usage is around 5%, power draw is 37 W, on an i9-13900KF.
 
Is the "Use hardware acceleration when available" toggle enabled in your Edge settings?
Yes, it's turned on.

That usage sounds very...weird.

I'm not doing anything overly demanding, just playing music on Spotify, and have Discord, Chrome, Epic, Spotify, Steam, Calculator, Task Manager, and HWiNFO open.

Usage is around 5%, power draw is 37 W, on an i9-13900KF.
Yes, the power draw is way too high! At full load while running Cinebench R23 multi-core, the CPU hits 220 W! It could go even further, but my PSU is too weak, so the PC just turns off. Again, I tested it with absolutely NO overclocking, and the TDP of the CPU is only 125 W.
 
Yes, the power draw is way too high! At full load while running Cinebench R23 multi-core, the CPU hits 220 W! It could go even further, but my PSU is too weak, so the PC just turns off. Again, I tested it with absolutely NO overclocking, and the TDP of the CPU is only 125 W.
Any suggestions on this?
 
My 11900K gets to ~240 W when hit with R23 at 5.1 GHz @ ~1.33 V (AVX2 offset = 0).
Decrease Vcore under load (by offset or a "fixed" value) to lower power consumption; however, it will probably require dropping the max frequency to maintain stability.
You can't have your cake and eat it too, though.
Either top performance that risks switching off under AVX2-heavy workloads, or a more stable system with somewhat lower peak performance.

PS. CPU TDP stopped being relevant once AVX came about.
Especially on [Auto], where the whole "optimization" can be whatever the motherboard maker wants (increased power limits, voltages, and multipliers are all "fair game" with MCE).
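
To put rough numbers on that advice: dynamic CPU power scales roughly with frequency times voltage squared (P ≈ C·V²·f). Here's a back-of-the-envelope sketch using the ~240 W / 1.33 V / 5.1 GHz figures above as the baseline; the quadratic rule and the zero-leakage assumption are simplifications, so treat the output as a ballpark only:

```ts
// Rule of thumb: dynamic power scales with f * V^2.
// Leakage/static power is ignored, so this is only an estimate.
function estimatePowerW(
  baseW: number, baseV: number, baseGHz: number,
  newV: number, newGHz: number,
): number {
  return baseW * (newV / baseV) ** 2 * (newGHz / baseGHz);
}

// 11900K baseline: ~240 W at 1.33 V / 5.1 GHz.
// Undervolt to 1.25 V and drop to 4.9 GHz for stability:
console.log(estimatePowerW(240, 1.33, 5.1, 1.25, 4.9).toFixed(0)); // ~204 W
```

So shaving less than a tenth of a volt (plus 200 MHz) already saves on the order of 35 W under R23.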
 
Any suggestions on this?
There's nothing out of line. 11th gen was a HUGE regression from Intel regarding power consumption, and Cinebench doesn't even use AVX-512.
 
My 11900K gets to ~240 W when hit with R23 at 5.1 GHz @ ~1.33 V (AVX2 offset = 0).
Decrease Vcore under load (by offset or a "fixed" value) to lower power consumption; however, it will probably require dropping the max frequency to maintain stability.
You can't have your cake and eat it too, though.
Either top performance that risks switching off under AVX2-heavy workloads, or a more stable system with somewhat lower peak performance.

PS. CPU TDP stopped being relevant once AVX came about.
Especially on [Auto], where the whole "optimization" can be whatever the motherboard maker wants (increased power limits, voltages, and multipliers are all "fair game" with MCE).
There's nothing out of line. 11th gen was a HUGE regression from Intel regarding power consumption, and Cinebench doesn't even use AVX-512.
So it's expected from an 11600K(F). I also ALWAYS want the maximum performance; it's a priority for me. I consider enabling "power saving" features and undervolting a waste of money. I also had to raise the power limits in the BIOS from 125 W, otherwise it was power-limit throttling. Thanks anyway.
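
For anyone wondering what raising that 125 W limit actually changes: on Intel desktop CPUs, the advertised TDP corresponds to PL1 (the long-term power limit), while PL2 permits a much higher short-term draw until a rolling power budget (governed by Tau) runs out; many boards raise or remove both limits out of the box. Below is a toy model of that interaction; the constants are illustrative, and real firmware uses an exponentially weighted average with more detail than this:

```ts
// Toy model of Intel's PL1/PL2/Tau power limits. The CPU may draw up
// to PL2 while a rolling average of package power stays under PL1;
// once the average reaches PL1, the draw is clamped to PL1.
// Constants are illustrative, not the 11600KF's exact firmware values.
const PL1 = 125; // long-term limit in W (the advertised "TDP")
const PL2 = 251; // short-term limit in W
const TAU = 56;  // averaging time constant in seconds

let avg = 0; // rolling average of package power
for (let t = 0; t <= 120; t++) {
  const draw = avg < PL1 ? PL2 : PL1; // boost until the budget is spent
  avg += (draw - avg) / TAU;          // exponential moving average
  if (t % 30 === 0) console.log(`t=${t}s draw=${draw}W avg=${avg.toFixed(0)}W`);
}
// With the limits raised or removed ("MCE"), draw just stays at PL2
// levels, which is how a 125 W TDP part ends up pulling 220 W+ in R23.
```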
 
I was forced to overclock this CPU to 5.6 GHz because at stock it wasn't even able to play 4K 60 FPS YouTube videos! And CPU usage was over 70% in games like Minecraft (about to bottleneck!) even with a GT 610!
Is this kind of performance expected from this CPU? Also, the CPU uses around 75 W at 3% usage when idling (at stock 4.9 GHz, absolutely NO overclocking); it's really power hungry. Is that normal too?
You are being duped by marketing companies. There are no 4K CPUs or even GPUs yet; the fastest 4090 GPU is still a 2K graphics card. 4104 is a sum of 2 cubes, which means all GPUs before this are still 2K. However, the next batch of Blackwell GPUs will be the first real 4K hardware, but don't expect it this year or even next year. You need a 16-core CPU @ 5 GHz to play real 4K content, so save yourself the hassle and use 1080p.
 
You are being duped by marketing companies. There are no 4K CPUs or even GPUs yet; the fastest 4090 GPU is still a 2K graphics card. 4104 is a sum of 2 cubes, which means all GPUs before this are still 2K. However, the next batch of Blackwell GPUs will be the first real 4K hardware, but don't expect it this year or even next year. You need a 16-core CPU @ 5 GHz to play real 4K content, so save yourself the hassle and use 1080p.
I don't know what you are even talking about. My phone, which has a MediaTek Dimensity 700 processor with 8 cores at 2.2 and 2.0 GHz, can actually play 4K content, but my PC can't? Yes, seriously! I could even attach a video of it playing 4K 60 FPS YouTube videos, but I can't attach videos, and I don't want to convert one to a GIF and post it here because I'm too lazy to do that.
 
I don't know what you are even talking about. My phone, which has a MediaTek Dimensity 700 processor with 8 cores at 2.2 and 2.0 GHz, can actually play 4K content, but my PC can't? Yes, seriously! I could even attach a video of it playing 4K 60 FPS YouTube videos, but I can't attach videos, and I don't want to convert one to a GIF and post it here because I'm too lazy to do that.
It's not about CPU power, but decoding capabilities. The DSP on your MediaTek is responsible for encoding and decoding video.
A CPU can't do hardware decoding by itself, so it either offloads it to a GPU (integrated or dedicated) or falls back to software decoding, which is very slow and inefficient (and is precisely what your i5 is doing).
What I can't understand for the life of me is why your GT 610 isn't picking up the task when you use h264ify and is leaving the i5 on its own. The GT 610 simply CAN'T decode what YouTube uses for 1440p and higher (VP9/AV1).
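
One way to see what the browser itself thinks it can decode is the MediaCapabilities API; paste something like this into the DevTools console on the affected machine. The VP9 codec string and ~20 Mbps bitrate are just illustrative values for 4K60; `powerEfficient: false` generally means it will fall back to software decoding:

```ts
// Ask the browser whether it can decode 4K60 VP9, and whether the
// decode path is power-efficient (a rough proxy for hardware decode).
const info = await navigator.mediaCapabilities.decodingInfo({
  type: "file",
  video: {
    contentType: 'video/webm; codecs="vp09.00.10.08"', // VP9 profile 0
    width: 3840,
    height: 2160,
    bitrate: 20_000_000, // ~20 Mbps, a plausible 4K60 figure
    framerate: 60,
  },
});
console.log(info); // { supported, smooth, powerEfficient }
```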
 
What I can't understand for the life of me is why your GT 610 isn't picking up the task when you use h264ify and is leaving the i5 on its own
If the OP is on Win11 (as indicated in their profile), then the problem may be that there are no official drivers for that OS. Although technically the Win10 driver for the GT 610 should work.

It would help to know whether the OP even has an Nvidia driver installed, and what OS they're on.
 
If the OP is on Win11 (as indicated in their profile), then the problem may be that there are no official drivers for that OS. Although technically the Win10 driver for the GT 610 should work.

It would help to know whether the OP even has an Nvidia driver installed, and what OS they're on.
I have recently fresh-installed Windows 10, and have the latest driver my GPU supports, 391.35, installed. Also, when I was on Windows 11, I still had the 391.35 driver installed, and it was perfectly compatible. Windows 10 and 11 are literally the same thing (both are NT 10.0).
 
What I can't understand for the life of me is why your GT 610 isn't picking up the task when you use h264ify and is leaving the i5 on its own. The GT 610 simply CAN'T decode what YouTube uses for 1440p and higher (VP9/AV1).
The GT 610 is Fermi and too old to support web-browser video decoding, I believe; NVDEC is required for that, IIRC, and that's only on Kepler and up.
It's also very slow: even if the GPU doesn't support the video codec used, it still has to process the frames being displayed.

My media PC with an 8700K equivalent can play back 4K60 without hardware acceleration (mind a few dropped frames), so surely an 11600KF can handle it no problem.
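
If anyone wants to put a number on the "few dropped frames" instead of eyeballing it, the video element reports playback quality directly. A quick console snippet; the bare `video` selector is an assumption about YouTube's player markup:

```ts
// Report dropped vs. total frames for the currently playing video.
// Run in the DevTools console while the video is playing.
const video = document.querySelector("video") as HTMLVideoElement;
const q = video.getVideoPlaybackQuality();
const pct = ((100 * q.droppedVideoFrames) / q.totalVideoFrames).toFixed(2);
console.log(`${q.droppedVideoFrames}/${q.totalVideoFrames} frames dropped (${pct}%)`);
```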
 
I'm surprised anyone would use an 11600KF with a GT 610; their launch dates are almost 10 years apart.
 
I'm surprised anyone would use an 11600KF with a GT 610; their launch dates are almost 10 years apart.
Ryzen 7 5800 with an XFX R7 250X Ghost here.
 
I'm surprised anyone would use an 11600KF with a GT 610; their launch dates are almost 10 years apart.
Sucks that the OP has the KF variant; the non-F version's iGPU would otherwise be better than that display adapter.
 
Sucks that the OP has the KF variant; the non-F version's iGPU would otherwise be better than that display adapter.
No doubt. I have a system here with an i7-11700K (MCE enabled), and it plays every YT video I throw at it ultra smooth @ 1440p, 120 Hz (HDMI) with its iGPU. Although having 4600 MT/s DDR4 @ CL19 helps with GPU bandwidth too.
 
The GT 610 is Fermi and too old to support web-browser video decoding, I believe; NVDEC is required for that, IIRC, and that's only on Kepler and up.
It's also very slow: even if the GPU doesn't support the video codec used, it still has to process the frames being displayed.

My media PC with an 8700K equivalent can play back 4K60 without hardware acceleration (mind a few dropped frames), so surely an 11600KF can handle it no problem.
The funny thing is that it actually WAS able to play 4K 60 FPS videos completely smoothly a year or two ago. It has only recently started behaving like this, so I created this thread because I want to get my performance back! Could it somehow be a BIOS update, or did something change in the codecs YouTube serves since then?

I'm surprised anyone would use an 11600KF with a GT 610; their launch dates are almost 10 years apart.
Sucks that the OP has the KF variant; the non-F version's iGPU would otherwise be better than that display adapter.
As you can see in my profile, my PC is a workstation with 64 GB of RAM, not a regular gaming machine, so I don't really need a GPU much. And I have the "F" variant because the non-F version wasn't available where I live when I built my PC back in 2020. It was the only good unlocked CPU available in my area, and I somehow felt that I would need to overclock in the future, so I bought it.
 
The funny thing is that it actually WAS able to play 4K 60 FPS videos completely smoothly a year or two ago. It has only recently started behaving like this, so I created this thread because I want to get my performance back! Could it somehow be a BIOS update, or did something change in the codecs YouTube serves since then?

This was answered in the 3rd post.

There is no BIOS update that will fix this. There are only limited workarounds, like h264ify or not using YouTube. The only answer has always been to upgrade hardware. A pre-owned 1030 shouldn't cost a whole lot of money. They used to be a lot cheaper, but those times are far behind us.

You're not going to get any of that performance back unless you upgrade your 610 to something a lot stronger, if you want to watch YouTube the way you did two years ago.
 
This was answered in the 3rd post.

There is no BIOS update that will fix this. There are only limited workarounds, like h264ify or not using YouTube. The only answer has always been to upgrade hardware. A pre-owned 1030 shouldn't cost a whole lot of money. They used to be a lot cheaper, but those times are far behind us.
No, I'm asking whether a BIOS update was the reason for this degradation in performance. As I have said: it actually WAS able to play 4K 60 FPS videos completely smoothly a year or two ago! Please don't misread/misunderstand.
 
No, I'm asking whether a BIOS update was the reason for this degradation in performance. As I have said: it actually WAS able to play 4K 60 FPS videos completely smoothly a year or two ago! Please don't misread/misunderstand.
Either update your GPU, or update your CPU to a more modern one with an iGPU that has all the codec support. Everything past 8th-gen Intel with the UHD 620 iGPU should do YouTube. My own laptop is an 8th-gen with that same iGPU, and YouTube has always been smooth as butter on it compared to my old Sandy Bridge or Ivy Bridge laptops.
 
2. The GT 610 is from 2012 and does not support the VP8/VP9, AVC/HEVC, or AV1 codecs that YouTube uses today.
I have always used a GT 610 since I built this PC, and it has never received any upgrade except for the PSU. If it was able to play 4K60 videos a few years ago, then it should be able to play them now too, right?
 
I have always used a GT 610 since I built this PC, and it has never received any upgrade except for the PSU. If it was able to play 4K60 videos a few years ago, then it should be able to play them now too, right?

I think what people are saying is that YouTube has probably changed codecs, either for better quality (I doubt it) or to cut down on their hosting/serving costs. That's also why h264ify is being recommended so much: it forces YouTube to serve you an older codec that your card may be better able to handle.
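
For what it's worth, h264ify doesn't transcode anything; as I understand it, it just makes the browser claim it can't play VP9/AV1, so YouTube's player falls back to its H.264 streams (which is also why the quality tops out at 1080p on most videos). A simplified sketch of that trick, not the extension's actual code:

```ts
// Simplified sketch of the h264ify approach: report VP9/AV1 as
// unsupported so YouTube's player falls back to H.264 streams.
// Illustrative only; not the extension's real source.
const originalIsTypeSupported = MediaSource.isTypeSupported.bind(MediaSource);
MediaSource.isTypeSupported = (type: string): boolean => {
  if (/vp9|vp09|av01/i.test(type)) return false; // refuse VP9/AV1
  return originalIsTypeSupported(type);          // defer otherwise
};
```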
 
I think what people are saying is that YouTube has probably changed codecs, either for better quality (I doubt it) or to cut down on their hosting/serving costs. That's also why h264ify is being recommended so much: it forces YouTube to serve you an older codec that your card may be better able to handle.
h264ify doesn't work. First of all, it caps the max quality at 1080p; second, it doesn't use my GPU. It still uses software decoding, as I can see in Task Manager that the GPU's "Video Decode" is at 0%.
 
I have always used a GT 610 since I built this PC, and it has never received any upgrade except for the PSU. If it was able to play 4K60 videos a few years ago, then it should be able to play them now too, right?

This would be nice, but that's not the way the real world works, unfortunately.
 