
Why are there no Sandybridge reviews using top end C2Q CPUs?

Discussion in 'General Hardware' started by EastCoasthandle, Apr 12, 2011.

  1. cadaveca

    cadaveca My name is Dave

    Running dual GPUs on Sandy Bridge gives roughly a 30% performance boost over an i7 870 on socket 1156. Just ask triptex how much faster his games are now.
  2. newtekie1

    newtekie1 Semi-Retired Folder

    See my post above about dual GPUs.
  3. cadaveca

    cadaveca My name is Dave

    :toast:

    Before the launch I was set on not buying into SB at all, but now that I have it, I really think it's important to highlight how, for specific scenarios, SB is THE way to go, even with cost considered.

    I don't see SB getting a lot of hype, but it should. Intel did a good job, and I'm definitely impressed.
  4. TRIPTEX_CAN

    I was set on BD until I saw the results of SB. I might have made a mistake, but the clocks and temps speak loudly. SB is really as good as people say. Only those who haven't tried it still defend the C2Q.
  5. newtekie1

    newtekie1 Semi-Retired Folder

    I absolutely agree, and if you are paying for dual GPUs then you should be paying for proper supporting hardware as well. (Says the guy running SLI GTX 460s on a Celeron... :D)
  6. LAN_deRf_HA

    That res is exactly what makes the AnandTech benches better than most. As others have said, those sorts of tests are typically done at lower resolutions, and 1680x1050 is one of the most common high resolutions in use today. Most complaints I see against their reviews are from people who can't accept just how big the gap is between certain CPUs. I know AMD comes out looking downright useless for gaming, but many seem to forget Phenom II only performed well compared to Phenom I; it still sucks compared to Yorkfield and i5/i7. I mean, even a stock Q6600 gives an X6 a challenge. Hopefully it'll be better with Bulldozer, so I can finally switch teams without feeling like I have to lie to myself to justify it.

    The whole issue is of course complicated by which sections of games are being compared. My friend had a bug that left his E5200's multiplier stuck at half speed, 1.6 GHz instead of 3.2 GHz. Just moving around in an empty Crysis map he only lost about 5 frames with his 5850. Pretty remarkable given it's just a low-cache dual core at such crap speeds, but during AI combat he lost a good 25 fps.
  7. BababooeyHTJ New Member

    No, that doesn't make any sense at all. Who is buying a $400-500 CPU-plus-motherboard combo to game on a ~$120 monitor at low or medium settings? Those comparisons are about as useful as any other synthetic benchmark. I gloss right over those reviews.

    Secondly, as I said, saying that the CPU doesn't have any effect at those resolution and IQ settings is just ignorant.

    Lastly, where is a four-year-old 2.4 GHz quad-core CPU giving a 3.2 GHz (not including Turbo Boost) hex-core CPU a run for its money? Maybe in AnandTech's reviews. Phenom II is at worst on par with Yorkfield clock for clock and in some cases can outperform Nehalem.
  8. newtekie1

    newtekie1 Semi-Retired Folder

    IMO, both high resolutions and low resolutions are important. Low resolutions take the bottleneck off the GPU and place it on the CPU, so they highlight the real performance difference between CPUs. Higher resolutions place the bottleneck back on the GPU and show what real-world usage will be like.

    Considering 1680x1050 is the second most common monitor resolution today, it certainly is important that it is included in benchmarks. It gives people with these monitors an idea of what performance they can expect, and it is actually people with these "lower" resolutions who will see the most noticeable difference between CPUs. Oh, and just FYI, 1280x1024 is the third most common monitor resolution, so that is important in benchmarks as well, and again it will show a larger difference between CPUs than a higher resolution would.
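    A minimal sketch of that bottleneck argument in Python, with made-up throughput numbers (the fps figures are illustrative assumptions, not benchmark data): if the delivered frame rate is roughly the minimum of what the CPU can prepare and what the GPU can render at a given resolution, the CPU gap is fully visible at low resolution and vanishes once the GPU becomes the limit.

        # Hypothetical frames-per-second throughputs (illustrative only).
        CPU_FPS = {"slow_cpu": 60, "fast_cpu": 120}                     # roughly resolution-independent
        GPU_FPS = {"1024x768": 200, "1680x1050": 110, "2560x1600": 55}  # falls as pixel count rises

        for cpu_name, cpu_fps in CPU_FPS.items():
            for res, gpu_fps in GPU_FPS.items():
                fps = min(cpu_fps, gpu_fps)  # whichever side is slower sets the frame rate
                print(f"{cpu_name:8s} @ {res:9s}: {fps:3d} fps")

        # At 1024x768 the two CPUs show 60 vs 120 fps (a 2x gap); at 2560x1600
        # both land on 55 fps, because the GPU is the limit either way.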
  9. Melvis

    This is what I've seen before, and it has been said again and again: a Phenom I 9950 is equal to a Q6600, so if you think a Q6600 has any chance of competing with any X6 from AMD, you're sadly mistaken. I chose to go AMD for this reason, and that's gaming; it has been proven that an AMD CPU can hold its own just fine in gaming compared to any Intel CPU, and more so at high res. Not this low-res BS that everyone points to; who in their right mind plays games at 1024x768 on a quad-core CPU? No game I have run so far has ever maxed out my processor; it still has room to breathe.
  10. LAN_deRf_HA

    Did you miss the rest of my post? As discussed, 1680 is not low res. http://www.anandtech.com/bench/Product/53?vs=147&i=47.48.49.50

    Not bad for a chip they don't even make anymore. I'd say it competes just fine. Even more so for its replacement. http://www.anandtech.com/bench/Product/89?vs=147&i=47.48.49.50.59.60.61.62

    Again, Phenom II was decent when compared to Phenom I, which was just inexcusably awful. They needed to add two more cores to actually compete with Intel's worst chips.
  11. Melvis

    If you're talking purely about gaming benchmarks, then it's a no-brainer that an X6 isn't the best choice (that's why I got a quad core); games at this stage won't use the full power of six cores. But when it comes to everything else, the 1055T will eat that Q6600 alive, and as future games use more and more cores the X6 will only extend its lead. http://www.guru3d.com/article/phenom-ii-x6-1055t-1090t-review/10

    And once again, the 9950 is equal to a Q6600 at stock clocks, so they do not need two more cores to catch up at all. Even the 965 runs better in games than my X6.
  12. LAN_deRf_HA

    What do you mean, "if you're talking purely about gaming"? Not only is this thread about gaming, you just said you bought AMD for gaming. Now you're saying the X6 isn't the best choice for gaming. Again, I'll bring up my point about how you basically have to lie to yourself to justify buying AMD at this point in time. If you wanted better application performance, you'd have gotten an 1156 for the same price and gotten radically better gaming performance as a bonus. I bought AMD when they were the best; they aren't now and haven't been for years. I'd encourage anyone supporting current AMD offerings above the $70 price point to reconsider their reasoning. I mean, you just compared a Q6600 to a lower-clocked Phenom I. Phenom I had a severe clock-for-clock disadvantage, well outside of gaming. http://www.anandtech.com/bench/Product/53?vs=23

    This isn't news. It was a well-accepted fact at the time, as it should still be now. Reason would not lead someone to buy a Phenom I. You'd have to allow yourself to view things simply as you wished they were to justify that purchase.
  13. Melvis

    Well, if you're comparing an X6 against anything else for gaming, then that's your fault in the first place for bringing up the X6 versus an X4 on gaming performance; an X4 seems to be the better choice for gaming, as I explained above. Lie to myself? I didn't have to do any such thing. I bought both my CPUs over eight months ago, and for the price/performance they gave, they were unbeatable at that time. Now, I don't get why you're bringing up newer sockets when this is all about older sockets compared to the 2600K ONLY. You are the one saying, and I quote: "I mean even a stock Q6600 gives a X6 a challenge." :confused:

    Just a reminder that the Phenom I I posted was, at the time, the fastest AMD quad at 2.6 GHz; therefore the so-called "severe" clock-for-clock disadvantage was only 200 MHz. http://www.guru3d.com/article/amd-phenom-x4-9950-be-processor-tested/1

    I would agree with you there; it would not have been the greatest choice to get a Phenom I, which is why I held off until Phenom II came along, and it proved to be a very worthy contender, giving any C2Q a very good run for its money and in most cases better performance per dollar. I must say that I am comparing prices over here in AUS, as Intel is a LOT more expensive here than AMD.
  14. BababooeyHTJ New Member

    I agree with you, but how many people in the market for a $300 CPU like the 2600K will be using 1680x1050? What I don't like is Anand using medium settings. I'll tell you right now that most of the settings in FO3 are CPU-intensive on a modern video card.
  15. newtekie1

    newtekie1 Semi-Retired Folder

    A lot. A lot of people don't think that is a bad resolution. And a lot of people will upgrade their computer but keep the LCD they already have, and really, why not do it that way? The monitor is still very good, and someone like me, who spent a pretty penny on their 1680x1050 monitor, will probably keep it as long as it still works. Heck, all my monitors are 1680x1050 or less, with the exception of my main machine's.
  16. Crap Daddy

    For me the best screen is a 22" at 1680x1050: not too large, not too small. Perfect. It also allows you to go for max settings with GPUs that are not quite the most powerful in the world, and a good CPU adds to a great gaming experience. So I think this particular res still has life in it and deserves to be benchmarked.
  17. BababooeyHTJ New Member

    And you leave all of the settings on medium while you are at it?
  18. newtekie1

    newtekie1 Semi-Retired Folder

    I wasn't disagreeing with you on that point, hence why I didn't address it or even include it in the quote. Why would you assume that just because I disagree with part of your post, I disagree with it all? That doesn't make any sense.
  19. yogurt_21

    *raises hand* Now, I'm certainly on all maxed-out settings, just at the lower resolution.

    Some of us want an actual monitor upgrade, not just a small resolution bump while still ending up with a crappy TN panel.

    The cheapest IPS panel that meets my requirements is $500, and that's cash I don't have right now, so I'm still on my 4.5-year-old 1680x1050 monitor.

    Besides, look at the resolutions in megapixels and there's really a small difference between 1680x1050 at 1.76 megapixels and 1920x1080 at 2.07 megapixels:
    Resolution   Megapixels
    640x480      0.31
    800x600      0.48
    1024x768     0.79
    1440x900     1.29
    1280x1024    1.31
    1680x1050    1.76
    1600x1200    1.92
    1920x1080    2.07
    1920x1200    2.30
    2048x1152    2.36
    2560x1440    3.68
    2560x1600    4.10


    You only really get a big jump if you skip up to the uber resolutions.
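    To make the table above concrete, here is the plain arithmetic behind it (a megapixel count is just width times height divided by one million), as a short Python sketch:

        # Megapixels for the resolutions listed above: width * height / 1e6.
        resolutions = [(640, 480), (800, 600), (1024, 768), (1440, 900),
                       (1280, 1024), (1680, 1050), (1600, 1200), (1920, 1080),
                       (1920, 1200), (2048, 1152), (2560, 1440), (2560, 1600)]

        for w, h in resolutions:
            print(f"{w}x{h}: {w * h / 1e6:.2f} MP")

        # 1920*1080 / (1680*1050) is about 1.18, so 1920x1080 pushes only ~18%
        # more pixels than 1680x1050, while 2560x1600 pushes about 2.3x as many.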
  20. BababooeyHTJ New Member

    I probably shouldn't even have brought up the 1680x1050 part; it's really not that low. It's mostly the low IQ settings that bother me about Anand. They also used to use an even lower resolution in their CPU reviews not too long ago, IIRC (which is when I decided to stop paying attention to them).
  21. newtekie1

    newtekie1 Semi-Retired Folder

    Again, the lower resolutions, besides still being extremely popular and hence important, also show the real difference in performance between the CPUs. That is why lower resolutions and lower settings are used in CPU reviews: if people are reading a CPU review, they want to see how much actual difference there is between CPUs. Quite frankly, if you only look at large resolutions and maxed-out graphics settings, then my Celerons would show little difference compared to my i7.
  22. LifeOnMars

    LifeOnMars New Member

    Absolute tosh. :shadedshu Frame rates at any resolution are ruled by the lowest common denominator. If the CPU is slower than the GPU at a higher resolution, you are bottlenecked by the CPU, and vice versa. Therefore a Celeron compared to an i7 is still going to show a big difference in a lot of games, even at high res.

    It's very true that the gap between CPUs is reduced at higher resolutions; normally, though, this is an artificial representation because you are hitting GPU bottlenecks first.
  23. EastCoasthandle

    EastCoasthandle New Member

    Here is some food for thought.

    A 5850 at 1 GHz was used to test a Q9550 vs. an i7 860, both at 4.00 GHz. The results can be found here. If I recall correctly, an 860 was faster than a 920. Here are some results using a QX9650 vs. a 920, at stock and overclocked, but this time using dual GPUs.
  24. LifeOnMars

    LifeOnMars New Member

    In that first test: GPU bottlenecked, plain and simple. Pair a more powerful GPU with that CPU, or even a pair of GPUs, and the better CPU would shine and come into its own. Crap test.
  25. EastCoasthandle

    EastCoasthandle New Member

    I think it's clear that having a good CPU is beneficial when gaming. So, for anyone with the age-old question "Do I buy a CPU or a GPU?": if an SB CPU is an option, get that first, then get the GPU later.
