
GeForce GTX 750 Ti Benchmarked Some More

Discussion in 'News' started by btarunr, Jan 27, 2014.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,972 (10.98/day)
    Thanks Received:
    13,759
    Location:
    Hyderabad, India
    In the run-up to its rumored February 18 launch, the GeForce GTX 750 Ti, the first retail GPU based on NVIDIA's next-generation "Maxwell" architecture, is finding its way into the hands of more leaky PC enthusiasts, this time members of the Chinese PC enthusiast community site PCOnline. The site used an early driver to test the GTX 750 Ti, putting it through 3DMark 11 (performance preset) and 3DMark Fire Strike. The card scored P4188 points in the former and 3170 points in the latter. Test-bed details are not mentioned, but a stock Core i7-4770K is visible in one of the screenshots. Also accompanying the two is an alleged GPU-Z 0.7.5 screenshot of the GTX 750 Ti, which reads out its CUDA core count as 960. Version 0.7.5 doesn't support the GTX 750 Ti, but it has fall-backs that help it detect unknown GPUs, particularly from NVIDIA. Its successor, GPU-Z 0.7.6, which we're releasing later today, adds support for the chip.


    Source: PCOnline.com.cn Forums
     
  2. EzioAs

    EzioAs

    Joined:
    Oct 16, 2012
    Messages:
    113 (0.14/day)
    Thanks Received:
    14
    The 3DMark 11 performance score is only slightly higher than a GTX 460's. If this holds true after the official launch (plus better drivers), then this card seems a bit too weak.
     
  3. infernox New Member

    Joined:
    Jan 27, 2014
    Messages:
    1 (0.00/day)
    Thanks Received:
    0
    Yeah, it really does, if that's true. The GTX 650 Ti was around GTX 560 levels, and the GTX 650 Ti Boost was a great-value card close to the GTX 570/480. I was expecting the 750 Ti to be at least around GTX 660 levels, since the 650 Ti Boost can get very close to 660 levels with an OC.
     
  4. Supercrit

    Joined:
    Aug 3, 2011
    Messages:
    89 (0.07/day)
    Thanks Received:
    29
    The 450 and 550 Ti all over again; back to the midrange era of suck.
     
  5. EzioAs

    EzioAs

    Joined:
    Oct 16, 2012
    Messages:
    113 (0.14/day)
    Thanks Received:
    14
    And the top GPU costs $600+...
     
  6. NC37

    NC37

    Joined:
    Oct 30, 2008
    Messages:
    1,207 (0.54/day)
    Thanks Received:
    270
    Yet the best high-midrangers they ever put out were in that Fermi era. The Fermi lineup was at least set up well: you knew what you were paying for, and you got some of the best SLI performers out of it.

    The 600 series was actually more of a mess, with midrange chips becoming the high end and low/mid chips filling in the midrange block. We didn't get the 110s until late. The 700 series finally started getting things back on track, but now the results are skewed thanks to NVIDIA's bumblings in the 600s. Of course, AMD wasn't really mounting any competition, so they didn't help much.
     
  7. HisDivineOrder

    HisDivineOrder

    Joined:
    Aug 23, 2013
    Messages:
    96 (0.19/day)
    Thanks Received:
    23
    Not sure why anyone expects a 660 Ti for a 750 Ti product.

    The 700 series in general isn't that much higher in performance than the 600 series, so the new 700 series product seems appropriately named for the performance offered.

    This is not a high end product. This is not intended to be pushing the boundaries of performance. This is a test bed for the new architecture on a well-known fab process. This is to help them work out any deficiencies in the new design ahead of the new fab processes finally being ready and readily available.

    They've done this before. I suppose that's why I'm not surprised. The mid-range part that gets the new design (or the new fab process) has unrealistic expectations heaped upon it and then when it comes out at more or less the performance level assigned it by the very branding it was given (in this case the 750 series), people are disappointed by the new architecture.

    The real disappointment is your unrealistic expectations for a lower-than-mid-range product.

    I'll be curious to see what improvements are present in the product's architecture (i.e., mostly in the reviews), and I'm still looking forward to the "bigger" versions of Maxwell on the smaller die process, but this card was never going to be worlds better than the existing products, given what current fabs are capable of.
     
  8. james888

    james888

    Joined:
    Jun 27, 2011
    Messages:
    4,769 (3.72/day)
    Thanks Received:
    1,976
    The performance is fine if the power consumption is great.
     
    Crunching for Team TPU
  9. bogami

    bogami

    Joined:
    Jan 15, 2012
    Messages:
    285 (0.26/day)
    Thanks Received:
    10
    Location:
    Slovenia
    Core efficiency is low; it only reaches around 0.4× that of the Fermi architecture. Pixel fillrate and texture fillrate are very consistent with a GTX 480/GTX 570, despite a much higher operating frequency, which again shows the ineffectiveness of the core.
    Adapting to the upcoming 4K-resolution monitors seems like a big problem, as does next-gen OpenGL. Or is it, as usual, resisting DirectX? It's hard to say, since the drivers aren't prepared for the new architecture.
    Definitely a disappointment! 1-8, nope!
     
  10. west7

    west7

    Joined:
    Dec 1, 2013
    Messages:
    29 (0.07/day)
    Thanks Received:
    1
    this looks weak
     
  11. john_

    john_

    Joined:
    Sep 6, 2013
    Messages:
    269 (0.56/day)
    Thanks Received:
    69
    Location:
    Athens, Greece
    If you take the graphics score from a 650 Ti and then calculate what that score would be if the 650 Ti were running at the 750 Ti's GPU speed and had 960 cores instead of 768, you come up with a score close to that 5600.
    Now, Wccftech was saying that we will also see a 750 (non-Ti) with a rumored 75 W power consumption. If the 750 non-Ti follows the same logic as the 650, with the 650 having half the cores of the 650 Ti, then there isn't really any difference in power consumption compared to Kepler: from 64 W and 384 cores on the 650 non-Ti, you go to 75 W and 480 cores on the 750 non-Ti. No big deal, I think.

    Looking at these facts/rumors, and also considering that both cards are part of the 700 series, I still have huge doubts that these cards are Maxwell. I think they are Kepler rebrands.
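    The scaling estimate above can be sketched as a quick back-of-the-envelope calculation. Note the baseline score and both clock figures below are placeholder assumptions for illustration, not measured values:

    ```python
    # Naive linear scaling of a 3DMark graphics score by shader count and
    # core clock, as described in the post above. All input numbers are
    # placeholders, not measured results.

    def scaled_score(base_score, base_cores, base_clock, new_cores, new_clock):
        """Scale a score linearly with shader count and core clock (MHz)."""
        return base_score * (new_cores / base_cores) * (new_clock / base_clock)

    # Assumed figures: a 650 Ti (768 cores) baseline score of 4200 at 928 MHz,
    # scaled to the rumored 750 Ti config of 960 cores at a higher clock.
    estimate = scaled_score(base_score=4200, base_cores=768, base_clock=928,
                            new_cores=960, new_clock=1020)
    print(round(estimate))  # → 5770 with these placeholder inputs
    ```

    Whether linear scaling holds in practice depends on how bandwidth-limited the card is, which is exactly the caveat raised in the posts below about the 128-bit memory bus.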
     
  12. Ed_1

    Joined:
    Dec 14, 2006
    Messages:
    346 (0.12/day)
    Thanks Received:
    50
    Since they're still on 28 nm, why would power go down by much for the same core count? Efficiency improvements would only be small IMO, a few percent, and even that would probably need solid drivers.
     
  13. Dave65

    Dave65

    Joined:
    Mar 7, 2010
    Messages:
    131 (0.07/day)
    Thanks Received:
    15
    Location:
    Michigan
    Yep not much to see here!
     
  14. HalfAHertz

    HalfAHertz

    Joined:
    May 4, 2009
    Messages:
    1,898 (0.92/day)
    Thanks Received:
    383
    Location:
    Singapore
    It seems to be a tad memory-starved. Compared to a 660, which also comes with 960 shaders, not only does it have one fewer memory controller, but the memory speed is lower too. That seems to be the culprit and explains the low performance.
     
  15. djisas

    djisas

    Joined:
    Jun 26, 2007
    Messages:
    1,714 (0.62/day)
    Thanks Received:
    324
    Location:
    Darkness
    Performance lower than my HD 6950 running on a 3.4 GHz i3, and it will probably cost around €150-200...
     
  16. john_

    john_

    Joined:
    Sep 6, 2013
    Messages:
    269 (0.56/day)
    Thanks Received:
    69
    Location:
    Athens, Greece
    I would expect a price between the 650 Ti and the 650 Ti Boost. I think it is meant to replace both cards (in a way).
     
  17. xorbe

    Joined:
    Feb 14, 2012
    Messages:
    441 (0.42/day)
    Thanks Received:
    64
    Location:
    Bay Area, CA
    Strangled by 86.4 GB/s of bandwidth, the same as the 650 Ti. The 750 Ti will respond supremely to memory overclocking.
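    The 86.4 GB/s figure is easy to verify: peak GDDR5 bandwidth is the bus width in bytes times the effective data rate. A minimal sketch, assuming the 650 Ti's published 128-bit / 5400 MHz effective configuration:

    ```python
    # Peak memory bandwidth for a GDDR5 card:
    # (bus width in bytes) x (effective data rate).

    def mem_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
        """Peak memory bandwidth in GB/s."""
        return (bus_width_bits / 8) * effective_clock_mhz / 1000

    # 128-bit bus at 5400 MHz effective, the GTX 650 Ti configuration
    print(mem_bandwidth_gbs(128, 5400))  # → 86.4
    ```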
     
  18. v12dock

    v12dock

    Joined:
    Dec 18, 2008
    Messages:
    1,617 (0.73/day)
    Thanks Received:
    326
    I expect more GPGPU gains with Maxwell than anything else
     
  19. alwayssts

    alwayssts

    Joined:
    May 13, 2008
    Messages:
    391 (0.16/day)
    Thanks Received:
    87
    Bingo.

    If it is 960 unified shaders (meaning an SMX comprises 240 SPs instead of 192 SPs + 32 special-function units), it will require exactly 25% more bandwidth, even though the actual usefulness versus 896 (768 + 128) is only a couple to a few percent in games, beyond the increased compute performance, which granted is important for additional effects/physics/etc. Unified is the best way forward, considering GPUs are for more than rasterization, but it is not always the most efficient purely for graphics performance. I assume SFUs are quite a bit smaller than full shaders, use less power, and were tied to the internal cache (hence not needing extra bandwidth), but granted, 240 unified is a good number.

    The thing about Kepler is that while NVIDIA nailed the ratio needed for special function (which AMD does in shaders; perhaps NVIDIA will now too), the best ratio of total units is around 234-235 SPs per 4 ROPs. NVIDIA had 224 (192 + 32), while AMD has 256 in most scenarios. This is why Kepler was often more efficient but GCN stronger per clock (when both are similarly satiated, not accounting for excessive bandwidth per design). 240 is obviously very close to the ideal mark per 4 ROPs per one 'core', and realistically the closest an architecture can get, based on large, compact engines with units in multiples of 4/8/16, etc.

    That said, I truly expected something clever like 1037-1100 MHz core / 7000 MHz memory, or the outlier chance of something phenomenal, like say 192-bit, up to 1200 MHz boost, and 5500 MHz-rated (7000 MHz 1.5 V binned to 1.35 V) RAM... those would have made sense or been exciting... perhaps in six or seven months, when we get 20 nm parts.

    This... this is shit. It basically screams 'wait for the 770-style rebrand we'll try to sell you in six months with a better bandwidth config that we'll hail as a huge efficiency improvement', once GK106 chips exit the channel and no longer compete with it. AMD isn't much better with the 5 GHz -> 6 GHz RAM on the 7870 -> 270(X), but at least in that case they were probably waiting for that specific RAM to get cheaper, and the initial design wasn't supremely bottlenecked by 5 GHz (or most 5 GHz-overclockable speeds). I suppose there is a chance NVIDIA could rig future (20 nm) Maxwell designs up with a larger cache so they're less dependent on external bandwidth, and this design got the shaft because of 28 nm, but I'm not holding my breath.
     
  20. TheHunter

    TheHunter

    Joined:
    Apr 10, 2012
    Messages:
    981 (0.99/day)
    Thanks Received:
    375
    Location:
    Europa
    Looks OK for a PhysX card...
     
  21. darkangel0504

    darkangel0504

    Joined:
    May 20, 2011
    Messages:
    94 (0.07/day)
    Thanks Received:
    28
    Really high core clock.
     
  22. xorbe

    Joined:
    Feb 14, 2012
    Messages:
    441 (0.42/day)
    Thanks Received:
    64
    Location:
    Bay Area, CA
    I hate it when they jack up the clock on a low core count; it's inefficient. The 660 Ti was nice: higher core count, lower clock. Of course, that's the one they axed, along with the 650 Ti Boost.
     
  23. refillable

    Joined:
    Jun 6, 2012
    Messages:
    45 (0.05/day)
    Thanks Received:
    6
    This doesn't look good. It brings back memories of the 550 Ti. It's not even faster than the two-year-old 7850.
     

