| Component | Spec |
|---|---|
| Processor | Intel Core i7 9900K @ 5.0 GHz |
| Motherboard | ASRock Z370 Taichi |
| Cooling | Corsair H115i Pro w/ Noctua NF-A14 fans |
| Memory | 32GB Corsair DDR4-3000 |
| Video Card(s) | ASUS Strix GTX 1080 Ti |
| Storage | 500GB SX8200 Pro + 8TB with 1TB SSD cache |
| Display(s) | QNIX QX2710 1440p @ 120Hz |
| Case | Fractal Design Define S |
| Audio Device(s) | Onboard is good enough for me |
| Power Supply | eVGA SuperNOVA 1000W G3 |
| Software | Windows 10 Pro x64 |
> No, it doesn't even have to be overclocked to support the 1080 Ti. In any modern game, at settings and resolutions that need a 1080 Ti, the 2500K will not be the bottleneck; the 1080 Ti will be. That's why we upgrade our GPUs far more often than our CPUs. I can't think of a single recently released game where this isn't true.

I don't think that's accurate. The 2500K falls behind the i3s in a lot of tests, and an i3 (ignoring core clocks) is about as strong as the current Pentiums. It has to be overclocked to support the 1080 Ti.
> In theory, yes. In real-world use, almost never.

What he's asking for is a low-resolution, CPU-limited test, because the CPU that gets the worse result there will be the first to bottleneck future GPUs. That makes it a relevant test for anyone who plans to keep their CPU longer than their GPU (almost everyone).
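The logic behind low-res CPU testing can be sketched with a toy model (the FPS numbers below are made up for illustration, not real benchmark data): the effective frame rate is capped by whichever component is slower, so dropping the resolution raises the GPU's ceiling far above both CPUs and exposes the CPU difference that a future, faster GPU would eventually run into.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower of the two components caps the frame rate."""
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers: two CPUs with different headroom, one GPU.
cpu_a, cpu_b = 110, 160   # frames/sec each CPU could feed
gpu_4k, gpu_720p = 60, 300  # GPU throughput at high vs low resolution

# At 4K the GPU is the bottleneck, so both CPUs look identical:
print(effective_fps(cpu_a, gpu_4k), effective_fps(cpu_b, gpu_4k))    # 60 60

# At 720p the GPU ceiling rises, revealing the CPU gap that a
# future, faster GPU would hit even at high resolution:
print(effective_fps(cpu_a, gpu_720p), effective_fps(cpu_b, gpu_720p))  # 110 160
```

This is why a GPU-limited 4K benchmark can make a 2500K and a 9900K look interchangeable, while a low-res run shows which one will bottleneck first.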