
Core i5-3570K Graphics 67% Faster Than Core i5-2500K, 36% Slower Than GeForce GT 240

btarunr

Editor & Senior Moderator
An Expreview community member ran benchmarks comparing the Intel HD 4000 graphics embedded into Intel's upcoming 22 nm "Ivy Bridge" Core i5-3570K against the integrated graphics of the Core i5-2500K and a discrete NVIDIA GeForce GT 240. These tests are endorsed by the site. The suite of benchmarks included games that aren't known to be very taxing on graphics hardware by today's standards, yet are extremely popular, such as StarCraft II, Left 4 Dead 2, DiRT 3, and Street Fighter IV. Slightly more graphics-intensive benchmarks included Far Cry 2 and 3DMark Vantage. All benchmarks were run at 1280 x 720 resolution.

The Intel HD 4000 graphics core beats the HD 3000 hands down, with performance leads as high as 122% in one test. The chip produces more than playable frame rates in Left 4 Dead 2 and Street Fighter IV, both well above 50 FPS, and even DiRT 3 and Far Cry 2 run acceptably, above 30 FPS. StarCraft II is where it dips under 30 FPS, so the chip might get bogged down in intense battles; there, a mainstream discrete GeForce or Radeon is a must. On average, the graphics core embedded into the Core i5-3570K was found to be 67.25% faster than the one on the Core i5-2500K.



When pitted against the 2+ year old GeForce GT 240, the Core i5-3570K struggles. In StarCraft II it's 53.64% slower, and on average the GT 240 emerged 56.25% faster. Still, a decent effort by Intel to cash in on entry-level graphics. We are hearing good things about the HD video playback and GPU-acceleration capabilities of Intel's HD 4000 core, so there's still something to look out for. Granted, comparing the i5-3570K to the i5-2500K isn't a 100% scientific comparison, since CPU performance also factors in, but it was done purely to assess how far Intel has come with its graphics.

View at TechPowerUp Main Site
 
Do you think it can play Hard Reset maxed out at 1920x1200?
 
Do you think it can play Hard Reset maxed out at 1920x1200?

I don't think so. Hard Reset at that resolution, maxed out, is taxing even for $200 graphics cards.
 
Why is it being compared to NVIDIA and not AMD's APU?

Doesn't make much sense to me.
 
67.25% faster than a 2007 IGP is good? Then compare it to a 2009 GT 240 (probably the DDR3 version): being 36% slower is adequately respectable now in 2012. So it's maybe like an HD 5550, which is almost at modern entry level now, so not bad.
 
Why is it being compared to NVIDIA and not AMD's APU?

Doesn't make much sense to me.


Maybe those comparisons hurt Intel's ego, I don't know.
 
btarunr, you made a mistake.
On average, the GT 240 emerged 36% faster.
The HD 4000 is 36% slower than the GT 240, but flipped the other way around: 100 - 36 = 64, and 100 / 64 = 1.5625, so
the GT 240 is 56.25% faster than the HD 4000.
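
A quick sketch of that conversion in Python (the 36% figure is from the article; the rest is just arithmetic):

```python
# "X% slower" and "Y% faster" are not the same number.
# If the HD 4000 is 36% slower than the GT 240, it delivers 64% of the
# GT 240's performance, so the GT 240 delivers 1 / 0.64 of the HD 4000's.

slower = 0.36                   # HD 4000 is 36% slower than the GT 240
relative = 1.0 - slower         # HD 4000 = 64% of the GT 240
faster = 1.0 / relative - 1.0   # GT 240 relative to the HD 4000

print(f"GT 240 is {faster:.2%} faster than the HD 4000")
# -> GT 240 is 56.25% faster than the HD 4000
```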
 
The IGP is okay for most office use, internet-café web surfing, or headless-server use. It helps if you use webcl.nokia.com and Intel's WebCL software.
Still have no clue why Intel isn't involved in some scheme with a GPU company to use their IGPs, versus Intel wasting time on its own IGPs.

The best part is that when you run an i7 CPU on its IGP (no dedicated AMD/NVIDIA GPU), the CPU performs much worse.

Example:
A mobile i7-2670 runs 7-Zip (8/8 threads) at an average of 12,500, where an i7-2630 with an NVIDIA mobile GPU runs at 15,244~15,384 (8/8); same hardware except for the GPU used, with the CPU swapped out for testing. You would expect the opposite result.
 
So how does the HD 4000 compare to A8-3850/3870 graphics, then?

I understand Trinity figures aren't available yet. I can't wait for the rumored performance of Kaveri, though: finally an integrated GPU decent enough that there's absolutely no need for an additional GPU, if only they can get CPU IPC performance up from the 2006 first-generation Phenom level...
 
And the big difference comes from optimization for 3DMark; in games alone it's only 56%, still OK for an Intel IGP.
 
Still quite mediocre IMO.
 
So how does the HD 4000 compare to A8-3850/3870 graphics, then?

I understand Trinity figures aren't available yet. I can't wait for the rumored performance of Kaveri, though: finally an integrated GPU decent enough that there's absolutely no need for an additional GPU, if only they can get CPU IPC performance up from the 2006 first-generation Phenom level...

Since there shouldn't be any review of this available yet, are there any reviews of an A8-38X0 pitted against a GeForce GT 240?
 
Quite impressive. Now if you could JOIN that performance with a budget gaming card, you'd be good to go. What a shame that the company developing that concept left the market. What was it called again?

How about a Xeon dual-socket version of this chip? If it could combine the graphics performance of both chips, that would be decent enough for most people, and every reason for everyone to buy a workstation board and for Intel to sell twice as many CPU chips ;)

In fact, they could go back in time to the 386 and 387 math-coprocessor concept, only this time it would be a GPU coprocessor. They could build a sister chip with half the CPU cores but double the GPU cores/shaders, and it would make a marvellous combination.
 
2007? Are you joking? The HD 3000 launched in 2011, last year.

I think he means 2007-era discrete graphics performance.
 
Since there shouldn't be any review of this available yet, are there any reviews of an A8-38X0 pitted against a GeForce GT 240?

Found a review with both an A8-3850 and a GT 240 here

It has mixed results: the GT 240's performance relative to the A8-3850 (with RAM at 1866 MHz) ranges from 91% to 120% depending on the test.

Would prefer a more comprehensive review for this comparison but was unable to locate one :(

EDIT

With this, it seems the Core i5-3570K's graphics still aren't up to those of an A8-3850. And it's a whole new ball game once you factor in the CPU portion of the chip.
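
A rough back-of-the-envelope chain of those ratios in Python (hand-wavy, since the two reviews use different test suites; the 64% figure comes from the Expreview average, the 91-120% range from the review linked above):

```python
# Transitive estimate: HD 4000 vs. A8-3850, going through the GT 240.
# HD 4000 ≈ 64% of a GT 240 (36% slower on average, per Expreview).
# GT 240 ≈ 91%..120% of an A8-3850 with DDR3-1866 (review above).

hd4000_vs_gt240 = 0.64
gt240_vs_a8 = (0.91, 1.20)

low, high = (hd4000_vs_gt240 * r for r in gt240_vs_a8)
print(f"HD 4000 ≈ {low:.0%} to {high:.0%} of an A8-3850")
# -> HD 4000 ≈ 58% to 77% of an A8-3850
```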
 
What confuses me is why AMD puts discrete-class video on its mainstream CPUs. Why don't they put GPUs on the enthusiast range too? Surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?
 
What confuses me is why AMD puts discrete-class video on its mainstream CPUs. Why don't they put GPUs on the enthusiast range too? Surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?

It's not that simple.
 
What confuses me is why AMD puts discrete-class video on its mainstream CPUs. Why don't they put GPUs on the enthusiast range too? Surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?


The Trinity A10 lineup will come with the Radeon HD 7660D, or something that approaches or betters the current discrete 6570. Now consider that it will perform similarly to a 9800 GTX 512 MB from 2008, which had a TDP of 168 W. Today they combine the CPU and the GPU and keep it under 100 W. In four years, that's pretty amazing, wouldn't you say?

As to why they don't... it comes down to power and heat. Most OEMs that build and market the volume of general-use computers have to hit a price point and stay within "green" efficiency limits, and at this time the cooling would still need to be developed, at what I would consider a prohibitive cost. But give it two years and you'll probably be getting 7770 performance from an APU.
 
What confuses me is why AMD puts discrete-class video on its mainstream CPUs. Why don't they put GPUs on the enthusiast range too? Surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?

Because the Llano A8 is a solution for the mainstream, where you count every penny.
You don't count every penny in the enthusiast range, so a more powerful GPU integrated into a Phenom II or Bulldozer won't necessarily mean the enthusiast crowd will buy it: first, because you'd be tied to that integrated GPU until you buy a discrete card, and second, no matter how powerful the integrated GPU is, it shares system memory, and that makes performance drop; DDR3 is not comparable to GDDR5 in any way.

Still quite mediocre IMO.

Agreed.
And I'm not even counting 3D graphics quality, microstuttering, and compatibility.
 
What confuses me is why AMD puts discrete-class video on its mainstream CPUs. Why don't they put GPUs on the enthusiast range too? Surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?

The TDP would be too high, I imagine.
 