
Core i5-3570K Graphics 67% Faster Than Core i5-2500K, 36% Slower Than GeForce GT 240

Discussion in 'News' started by btarunr, Feb 20, 2012.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,870 (11.07/day)
    Thanks Received:
    13,716
    Location:
    Hyderabad, India
    An Expreview community member ran benchmarks comparing the performance of the Intel HD 4000 graphics embedded in Intel's upcoming 22 nm "Ivy Bridge" Core i5-3570K against the integrated graphics of the Core i5-2500K and a discrete NVIDIA GeForce GT 240. These tests are endorsed by the site. The suite of benchmarks included games that aren't known to be very taxing on graphics hardware by today's standards, yet are extremely popular, such as StarCraft II, Left 4 Dead 2, DiRT 3, and Street Fighter IV. Some of the slightly more graphics-intensive benchmarks included Far Cry 2 and 3DMark Vantage. All benchmarks were run at 1280 x 720 resolution.

    The Intel HD 4000 graphics core beats the HD 3000 hands down, with performance leads as high as 122% in one test. The chip produces more than playable frame rates in Left 4 Dead 2 and Street Fighter IV, both well above 50 FPS, and even DiRT 3 and Far Cry 2 run acceptably at over 30 FPS. StarCraft II is where it dropped under 30 FPS, so the chip might get bogged down in intense battles; there, a mainstream discrete GeForce or Radeon is a must. On average, the graphics core embedded in the Core i5-3570K was found to be 67.25% faster than the one on the Core i5-2500K.
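
    For a rough sense of how a figure like "67.25% faster on average" is typically arrived at, here is a minimal sketch that averages the per-test percentage leads. All frame rates below are made-up placeholders for illustration, not Expreview's actual numbers, and the simple arithmetic mean of per-test leads is an assumption about the methodology.

    # Hypothetical per-test average frame rates (FPS). These values are
    # illustrative placeholders only, NOT the actual Expreview results.
    results = {
        "StarCraft II":      {"hd4000": 28.0, "hd3000": 15.0},
        "Left 4 Dead 2":     {"hd4000": 62.0, "hd3000": 38.0},
        "DiRT 3":            {"hd4000": 34.0, "hd3000": 21.0},
        "Street Fighter IV": {"hd4000": 55.0, "hd3000": 33.0},
    }

    # Percentage lead of the HD 4000 over the HD 3000 in each test,
    # then the simple arithmetic mean of those per-test leads.
    leads = {game: (fps["hd4000"] / fps["hd3000"] - 1) * 100 for game, fps in results.items()}
    for game, lead in leads.items():
        print(f"{game}: +{lead:.1f}%")
    print(f"Average lead: +{sum(leads.values()) / len(leads):.2f}%")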


    When pitted against the 2+ year old GeForce GT 240, the Core i5-3570K struggles. In StarCraft II, it's 53.64% slower; on average, the GT 240 emerged 56.25% faster. Still, a decent effort by Intel to cash in on entry-level graphics. We are hearing nice things about the HD video playback and GPU-acceleration capabilities of Intel's HD 4000 core, so there's still something to look out for. Agreed, comparing the i5-3570K to the i5-2500K isn't a 100% scientific comparison, since CPU performance also factors in, but it was done purely to assess how far Intel has come with its graphics.

    Source: Expreview
     
    Last edited: Feb 20, 2012
    1c3d0g and nuno_p say thanks.
  2. claylomax

    claylomax

    Joined:
    Apr 10, 2010
    Messages:
    1,612 (0.95/day)
    Thanks Received:
    261
    Location:
    London
    Do you think it can play Hard Reset maxed out at 1920x1200?
     
  3. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,870 (11.07/day)
    Thanks Received:
    13,716
    Location:
    Hyderabad, India
    I don't think so. Hard Reset at that resolution, maxed out, can be sufficiently taxing on even $200 graphics cards.
     
  4. OneCool

    OneCool

    Joined:
    Nov 27, 2005
    Messages:
    849 (0.26/day)
    Thanks Received:
    68
    Location:
    Look behind you!!
    Why is it being compared to NVIDIA and not AMD's APU?

    Doesn't make much sense to me.
     
  5. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,181 (0.90/day)
    Thanks Received:
    90
    Location:
    So. Cal.
    67.25% faster than a 2007-era IGP is good? Then compared to a 2009 GT 240, probably a DDR3 version, being 36% slower is adequately respectable now in 2012. So it's maybe like an HD 5550, which would almost qualify as modern entry level now, so not bad.
     
  6. ZoneDymo

    ZoneDymo

    Joined:
    Feb 11, 2009
    Messages:
    423 (0.20/day)
    Thanks Received:
    70

    Maybe those comparisons hurt Intel's ego, I don't know.
     
  7. KRONOSFX New Member

    Joined:
    Jan 24, 2011
    Messages:
    25 (0.02/day)
    Thanks Received:
    6
    Last edited: Feb 20, 2012
  8. KRONOSFX New Member

    Joined:
    Jan 24, 2011
    Messages:
    25 (0.02/day)
    Thanks Received:
    6
    btarunr, you made a mistake.
    The HD 4000 is 36% slower than the GT 240, but on the other hand 100 − 36 = 64, and 100/64 = 1.5625, so:
    The GT 240 is 56.25% faster than the HD 4000.
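
    For anyone who wants to sanity-check that arithmetic, a minimal sketch of just the conversion (no benchmark data involved):

    # "36% slower" means the HD 4000 delivers 64% of the GT 240's performance,
    # so the GT 240's advantage is 100/64 - 1 = 0.5625, i.e. 56.25% faster.
    slower = 0.36                    # HD 4000 deficit relative to the GT 240
    hd4000_relative = 1.0 - slower   # HD 4000 at 0.64x the GT 240
    gt240_faster = 1.0 / hd4000_relative - 1.0
    print(f"GT 240 is {gt240_faster * 100:.2f}% faster")   # -> 56.25% faster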
     
    btarunr says thanks.
  9. dickobrazzz New Member

    Joined:
    May 19, 2011
    Messages:
    43 (0.03/day)
    Thanks Received:
    2
    Location:
    Russia, Moscow
    wow:eek:..beautiful!:toast:
     
  10. WarraWarra New Member

    Joined:
    Nov 23, 2010
    Messages:
    268 (0.18/day)
    Thanks Received:
    14
    The IGP is okay for most office use, internet-café web-surfing PCs, or headless server use. It helps if you use webcl.nokia.com and Intel's WebCL software.
    I still have no clue why Intel isn't in on some scheme with an IGP company to push their IGPs, instead of Intel wasting time on its own IGPs.

    The best part is that when you run an i7 CPU with only the IGP (no dedicated AMD/NVIDIA GPU), the CPU performs much worse.

    Example:
    A mobile i7-2670 runs 7-Zip 8/8 at an average of 12500, where an i7-2630 with an NVIDIA mobile GPU runs at 15244~15384 8/8; same hardware except for the GPU used, with the CPU swapped out for testing. You would expect the opposite results.
     
  11. faramir New Member

    Joined:
    May 20, 2011
    Messages:
    203 (0.16/day)
    Thanks Received:
    27
    So how does the HD 4000 compare to A8-3850/3870 graphics then?

    I understand Trinity figures aren't available yet. I can't wait for the rumored performance of Kaveri though: finally an integrated GPU decent enough to need no additional GPU at all, if only they can get the CPU IPC performance up from the 2006 first-generation Phenom level...
     
  12. DarkOCean

    DarkOCean

    Joined:
    Jan 28, 2009
    Messages:
    1,618 (0.76/day)
    Thanks Received:
    350
    Location:
    on top of that big mountain on mars(Romania)
    And the big difference comes from optimization for 3DMark; in games alone it's only 56%, still OK for an Intel IGP.
     
  13. Borc New Member

    Joined:
    Jan 5, 2012
    Messages:
    4 (0.00/day)
    Thanks Received:
    0
    2007? Are you joking? The HD 3000 was launched last year, in 2011.
     
  14. Yo_Wattup New Member

    Joined:
    Jan 26, 2012
    Messages:
    790 (0.76/day)
    Thanks Received:
    200
    Location:
    Brisbane, Australia
    Still quite mediocre IMO.
     
  15. HTC

    HTC

    Joined:
    Apr 1, 2008
    Messages:
    2,242 (0.92/day)
    Thanks Received:
    303
    Since there shouldn't be any review of this available yet, are there any reviews of the A8-38X0 pitted against a GeForce GT 240?
     
  16. Completely Bonkers New Member

    Joined:
    Feb 6, 2007
    Messages:
    2,580 (0.90/day)
    Thanks Received:
    516
    Quite impressive. Now if you could JOIN that performance with a budget gaming card, you'd be good to go. What a shame that the company developing that concept left the market. What was it called again?

    How about a Xeon dual-socket version of this chip? If it could combine the graphics performance, now that would be decent enough for most people, and every reason for everyone to buy a workstation board and for Intel to sell twice as many CPU chips ;)

    In fact, they could go back in time to the 386 and 387 math coprocessor concept. Only this time it would be GPU coprocessor. They could build a sister-chip that had half the CPU cores but double the GPU core/shaders, and it would make a marvellous combination.
     
  17. Inceptor

    Inceptor

    Joined:
    Sep 21, 2011
    Messages:
    497 (0.43/day)
    Thanks Received:
    119
    I think he means 2007 era discrete graphics performance.
     
  18. HTC

    HTC

    Joined:
    Apr 1, 2008
    Messages:
    2,242 (0.92/day)
    Thanks Received:
    303
    Found a review with both an A8-3850 and a GT 240 here

    It has mixed results: the GT 240's performance relative to the A8-3850 (with the RAM @ 1866) ranges between 91% and 120%.

    Would prefer a more comprehensive review for this comparison but was unable to locate one :(

    EDIT

    With this, it seems that the Core i5-3570K's graphics still aren't up to the graphics of an A8-3850. It's a whole new ball game when you factor in the CPU portion of the chip.
     
    Last edited: Feb 20, 2012
  19. v12dock

    v12dock

    Joined:
    Dec 18, 2008
    Messages:
    1,611 (0.74/day)
    Thanks Received:
    321
    Who did they get the design from this time around?
     
  20. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,181 (0.90/day)
    Thanks Received:
    90
    Location:
    So. Cal.
    Ah, what HD 4000? If a 4670, well yes, the GT 240 GDDR5 that came along 14 months later did beat it, but I see it only about 8% higher @ 1280x.
    http://www.techpowerup.com/reviews/MSI/GeForce_GT_240/30.html

    While it didn't do anything better than the 780G.

    I second that
     
  21. Dent1

    Joined:
    May 18, 2010
    Messages:
    3,164 (1.91/day)
    Thanks Received:
    922
    What confuses me is why AMD only puts GPUs on its mainstream CPUs. Why don't they put GPUs on the enthusiast range too? Surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?
     
  22. nuno_p

    nuno_p New Member

    Joined:
    Jan 31, 2010
    Messages:
    21 (0.01/day)
    Thanks Received:
    6
    Location:
    Abrantes\Portugal
    Its not that simple.
     
    1c3d0g says thanks.
  23. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,181 (0.90/day)
    Thanks Received:
    90
    Location:
    So. Cal.

    The Trinity A10 lineup will come with the Radeon HD 7660D or something that approaches or betters the current discrete 6570. Now consider that it will perform similarly to a 9800 GTX 512 MB from 2008, which had a TDP of 168W. Today they combine a CPU and that GPU and keep it under 100W. In 4 years, that's pretty amazing, wouldn't you say?

    As to why they don't... it comes down to power and heat. Most anyone, or the OEMs that build and market the volume of general-use computers, have to do it for a price and within "green" efficiency. At this time the cooling would still need to be developed, and I would consider that a prohibitive cost. But give it two years and you'll probably be getting 7770 performance from an APU.
     
    Last edited: Feb 20, 2012
    Dent1 says thanks.
  24. Thefumigator

    Thefumigator

    Joined:
    Jun 11, 2008
    Messages:
    417 (0.18/day)
    Thanks Received:
    66
    Because the Llano A8 is a solution for the mainstream, where you count every penny.
    You don't count every penny in the enthusiast range, so a more powerful GPU integrated into a Phenom II or Bulldozer won't necessarily mean the enthusiast crowd will buy it: first, because you will be tied to that integrated GPU until you buy a discrete card, and second, no matter how powerful the integrated GPU is, it shares memory, and that makes performance drop; also, DDR3 is not comparable to GDDR5 in any way.

    Agree.
    And I'm not counting 3D graphics quality, microstuttering, and compatibility.
     
    Dent1 says thanks.
  25. repman244

    repman244

    Joined:
    Apr 7, 2011
    Messages:
    1,104 (0.83/day)
    Thanks Received:
    456
    The TDP would be too high I imagine.
     
    Dent1 says thanks.
