What's the hottest new GPU technology this season?

Discussion in 'TPU Frontpage Polls' started by W1zzard, Nov 14, 2013.

What's the hottest new GPU technology this season?

Poll closed Dec 9, 2013.
  1. AMD Mantle

    4,666 vote(s)
    57.1%
  2. NVIDIA G-Sync

    1,705 vote(s)
    20.9%
  3. NVIDIA ShadowPlay

    338 vote(s)
    4.1%
  4. AMD TrueAudio

    137 vote(s)
    1.7%
  5. DirectX 11.2

    331 vote(s)
    4.1%
  6. I just want one of the new consoles

    280 vote(s)
    3.4%
  7. None of the above

    709 vote(s)
    8.7%
  1. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    10,862 (3.92/day)
    Thanks Received:
    4,211
    Location:
    Quantum well (UK)
    Well, yes actually. :rolleyes:

    I thought it was tied to AMD's graphics cards, but that slide says it's not, which is really great and I hope it succeeds.

    NVIDIA should have opened up PhysX in the same way. It's especially ridiculous that NVIDIA deliberately disables it if you have an NVIDIA card fitted alongside an AMD card.
     
  2. erocker

    erocker Super Moderator Staff Member

    Joined:
    Jul 19, 2006
    Messages:
    40,627 (12.41/day)
    Thanks Received:
    15,456
    I had to pick Mantle for the reasons it exists, with ShadowPlay a close second. I'm not qualified to talk about APIs, but ShadowPlay is an awesome feature to have.
     
    MxPhenom 216 says thanks.
  3. MxPhenom 216

    MxPhenom 216 Corsair Fanboy

    Joined:
    Aug 31, 2010
    Messages:
    10,834 (6.12/day)
    Thanks Received:
    2,836
    Location:
    Seattle, WA
    Probably the most hyped: Mantle. Probably the one that might actually be of interest to me: G-Sync.

    It is. I still need to try it out, though.
     
    Crunching for Team TPU
  4. Jack1n

    Joined:
    Oct 8, 2012
    Messages:
    1,054 (1.05/day)
    Thanks Received:
    239
    Location:
    Central Israel
    I voted for Mantle because I'm hoping it will increase the PC's market share over consoles.
     
  5. SIGSEGV

    SIGSEGV

    Joined:
    Mar 31, 2012
    Messages:
    515 (0.43/day)
    Thanks Received:
    108
    It'll never, ever happen. Trust me.
     
  6. happita

    happita

    Joined:
    Aug 7, 2007
    Messages:
    2,450 (0.85/day)
    Thanks Received:
    483
    Definitely Mantle. The biggest hope for it succeeding is that AMD has its hardware in all the next-gen consoles right now. It would be stupid if it failed, imo.

    TrueAudio is nothing new. Creative came out with EAX many years ago (I'm not sure which version introduced positional audio) and it changed the playing field for hardware-driven audio effects. AMD is just re-packaging this technology and slapping its name on it, hoping to give people an extra incentive to buy its cards (especially for HTPCs).
    Doesn't matter for me though, because I've been rocking my X-Fi since 2005 and it's still treating me great, so no complaints here :toast:
     
    Last edited: Nov 15, 2013
  7. haswrong

    Joined:
    Jun 11, 2013
    Messages:
    262 (0.35/day)
    Thanks Received:
    23
    Nobody wants the super-powerful console?
     
  8. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,430 (0.86/day)
    Thanks Received:
    509
    I vote for G-Sync. It's actually a paradigm shift in displays, and I'm eager to see it once it's released.

    Mantle, in the end, is just a performance improvement, no different from what you can get with a new graphics card. If I need more performance, I'll save up and buy a new graphics card.

    Truthfully, none of these are completely revolutionary. In ten years I hope to no longer need to sit in front of a PC and use a mouse and keyboard; I want true, immersive virtual reality. That is the paradigm shift I am looking forward to. If you remember, Carrell Killebrew of AMD envisioned Eyefinity as a stepping stone to the holodeck, which he hoped to see in 2016. I doubt that technology is going to come so soon, but that is where the gaming industry needs to go, not toward more frames per second.
     
    Last edited: Nov 15, 2013
  9. AsRock

    AsRock TPU addict

    Joined:
    Jun 23, 2007
    Messages:
    12,045 (4.10/day)
    Thanks Received:
    2,233
    Location:
    US
    Well, AMD already had positional audio, although that doesn't mean it can't be improved upon. As for Creative, they've been dead to me for so long that their name isn't worth the effort of saying.

    And having it go over the HDMI cable to my AV receiver, without having to plug in more stuff or worry about slot placement when I buy new mobos, is just a plain win. Never mind more updates too.

    I used to be into Creative, in fact, from back in the '90s until about '06, but then they went pretty much to crap, with driver problems that never got fixed.
     
  10. happita

    happita

    Joined:
    Aug 7, 2007
    Messages:
    2,450 (0.85/day)
    Thanks Received:
    483
    But you see, that's the thing. If they properly implement it in the majority of the games that get ported over to PC, you won't have to buy a new graphics card. It gives you more performance for free, which I think is pretty sweet actually. It's like how AMD's drivers improve the performance and reliability of their graphics cards; Mantle will hopefully widen that gap even further if they get it working the way they want.

    I totally agree with that statement, although I haven't had any major problems in my experience. Oh well, to each his own.
     
  11. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,430 (0.86/day)
    Thanks Received:
    509
    I see your argument, and I agree it's a good thing in that regard. However, I dislike the focus of the Mantle supporters on higher frame rates and higher detail settings as the only important things in gaming. That's the way the graphics industry has been since its inception: a sole focus on improving GPU performance. The problem I have is that games are now getting to the point of diminishing returns for utilizing that extra computing power. Having played the latest games, yes, they look good, but now I'm looking at needing double the GPU power just to enable minuscule things like more detailed shadows and particle effects. Certainly they add to the experience, but it's not the "wow" factor you would get comparing game advances even as recently as 10 years ago.

    What hardware manufacturers need is a focus on the auxiliary parts of gaming, since those can have a bigger impact on how a game is perceived than minor graphical improvements. TrueAudio and G-Sync are good ideas, since they increase immersion in the game. But the true problem is that the concept of sitting in front of a keyboard and mouse and staring in one direction is antiquated. I want a holodeck, or, more realistically, at least a 180° curved screen that takes up my entire field of view.
     
    happita says thanks.
  12. Am*

    Joined:
    Nov 1, 2011
    Messages:
    262 (0.20/day)
    Thanks Received:
    59
    This is not even a contest. G-Sync won't launch until next year and Shadowplay is last season's tech. Mantle wins, no doubt.

    TrueAudio has a lot of potential, but IMHO it needs to be more accessible. Maybe AMD can make a TrueAudio-only chip, slap it onto some laptop and desktop motherboards/sound cards, and take over as the next Creative? I hope so; I wouldn't want this sort of tech to be locked down to one GPU manufacturer. Otherwise their cards would need an accessible audio connector like S/PDIF/TOSLINK to make sense, TBH (though I can't really see that happening).
     
    Last edited: Nov 15, 2013
  13. pr0fessor

    pr0fessor

    Joined:
    Oct 29, 2013
    Messages:
    65 (0.11/day)
    Thanks Received:
    3
    Location:
    Switzerland
    HDMI carries more audio bandwidth than S/PDIF and can send DTS-HD Master Audio and Dolby TrueHD. S/PDIF is an old standard.
     
  14. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    14,292 (4.24/day)
    Thanks Received:
    7,618
    Location:
    Edmonton, Alberta
    Hmmm....? Odd timing for this, but:

    [image]
     
  15. theoneandonlymrk

    theoneandonlymrk

    Joined:
    Mar 10, 2010
    Messages:
    3,492 (1.80/day)
    Thanks Received:
    613
    Location:
    Manchester uk
  16. Fluffmeister

    Fluffmeister

    Joined:
    Dec 22, 2011
    Messages:
    983 (0.76/day)
    Thanks Received:
    362
    Definitely AMD's Mantle, what with it being "open" and all that... I can't wait to see what a GTX 780 Ti can do combined with a G-SYNC enabled monitor.

    :D

    Almost felt like trolling... sorry. :p
     
    Last edited: Nov 16, 2013
  17. EpicShweetness

    EpicShweetness

    Joined:
    Dec 1, 2011
    Messages:
    330 (0.25/day)
    Thanks Received:
    33
    Location:
    Ft Stewart
    I agree with this! :rockout:

    Although this is why Mantle and G-SYNC tickle me a little, though not really more than a chuckle.
     
  18. Fiendish New Member

    Joined:
    Mar 8, 2013
    Messages:
    2 (0.00/day)
    Thanks Received:
    1
  19. d1nky

    Joined:
    Jan 21, 2013
    Messages:
    3,803 (4.25/day)
    Thanks Received:
    1,323
    So this G-Sync is designed to eliminate tearing and reduce input lag.

    Wouldn't buying a 120 Hz 2 ms monitor (or better) do the same thing?!

    Also, the DIY kit is only for a single monitor, and the G-Sync monitors next year will only be compatible with new-gen cards and certain drivers.

    Couldn't they have adapted or widened the technology for people with flat-screen TVs, other monitors, etc.? That would be awesome, and would probably boost NVIDIA GPU sales!

    To me it's a good bit of innovation, but a bit restricted.
     
  20. Prima.Vera

    Prima.Vera

    Joined:
    Sep 15, 2011
    Messages:
    2,481 (1.78/day)
    Thanks Received:
    354
    NONE! Nothing revolutionary, mostly marketing gimmicks.
     
  21. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,430 (0.86/day)
    Thanks Received:
    509
    No, it would help slightly with tearing but it wouldn't come close to solving the problems that G-Sync solves. This is a good, short explanation of what G-Sync is, what it aims to solve, and why no current technology can achieve the same effects.

    Neither NVIDIA nor any other company could have made this backwards compatible with existing LCDs. The technology is so different from current standards, and requires such extensive hardware modification, that it's not reasonable to expect any end user to do it. However, while I'm looking forward to G-Sync, I hope to see newer monitors come with slots for these cards rather than integrated solutions, so that you don't have to pay the up-front cost of G-Sync in your monitor if you don't plan to use it. This would also broaden the choices for people who want G-Sync monitors.

    I agree the single ASUS monitor restriction at launch is a downside, but at this point it seems like any traditional launch product, which requires close collaboration among all of the companies involved. Realistically, not many revolutionary products launch with broad market support; it takes time to build up the market.

    I agree that it is restrictive. At the same time I would rather have a few validated solutions compared to a mass release with many combinations of incompatible or buggy hardware. You have to draw the line somewhere in order to make the project scope manageable.

    The biggest challenge NVidia is going to have is to convince people what G-Sync is and why they should buy it. AMD has it easy as they can say "Mantle = More FPS," but it takes an entire article or a live demonstration to convince someone of the benefits of G-Sync.
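    Editor's aside: the judder problem described above can be shown with a toy timing model. This is an illustrative sketch, not NVIDIA's actual implementation; the numbers (45 FPS source, 60 Hz panel) are assumptions chosen to make the mismatch visible. With vsync on a fixed-refresh monitor, each frame is held until the next refresh tick, so hold times are forced to whole multiples of the refresh period and frame pacing becomes uneven, which is exactly what a variable-refresh display avoids.

```python
import math

def hold_times_fixed(frame_ms, refresh_ms, n):
    """On-screen hold time of each frame with vsync on a fixed-refresh
    monitor: a new frame can only be shown at a refresh tick, so the
    previous frame stays up for a whole number of refresh periods."""
    holds, t_done, t_flip = [], 0.0, 0.0
    for _ in range(n):
        t_done += frame_ms  # this frame finishes rendering
        # first refresh tick at or after t_done (epsilon guards float noise)
        t_next = math.ceil(t_done / refresh_ms - 1e-9) * refresh_ms
        holds.append(t_next - t_flip)  # how long the previous frame was shown
        t_flip = t_next
    return holds

# 45 FPS game (22.2 ms/frame) on a fixed 60 Hz panel (16.7 ms/refresh):
fixed = hold_times_fixed(1000 / 45, 1000 / 60, 9)
print([round(h / (1000 / 60)) for h in fixed])
# → [2, 1, 1, 2, 1, 1, 2, 1, 1]  (alternating 33 ms / 17 ms holds = judder)

# A variable-refresh ("G-Sync-style") display instead holds every frame
# for exactly its own render time: a constant ~22.2 ms, i.e. even pacing.
```

    The takeaway: average frame rate is identical in both cases; only the variable-refresh display delivers it evenly.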
     
    Flibolito, d1nky and HTC say thanks.
  22. arbiter

    Joined:
    Jun 13, 2012
    Messages:
    520 (0.47/day)
    Thanks Received:
    91
    AMD claims it's open, but they know NVIDIA will never use it. It's the principle of the matter.

    NVIDIA offered to license it; AMD never took them up on it.

    Only a handful of games use it, and it's questionable how much performance it will add on the R9 290/290X side, since those GPUs run so hot it might end up being crippled.

    AMD calling it "open" is the key point. They know NVIDIA won't touch it, so this turns into just a PR stunt for AMD.

    The problem with buying a 120 Hz monitor is that you have to dumb the game's graphics down to keep it locked at that rate; G-Sync allows the monitor to run at any rate between 60 and 144 Hz, I think. So you can run max settings and not have tearing.

    Mantle could in theory be good, but I question whether the low-level access it claims compromises stability. I just don't know if we'll end up back in the Windows 9x days, where a program crash == a complete computer restart.
     
  23. d1nky

    Joined:
    Jan 21, 2013
    Messages:
    3,803 (4.25/day)
    Thanks Received:
    1,323
    I fully understand. I know NVIDIA is a business, but I wish they had taken this technology, worked with the monitor manufacturers, and done something selfless: engineered a new era of monitors that would be totally universal. I just wonder if it's revolutionary enough to justify buying a graphics card and a G-Sync monitor, though.

    I'm just thinking maybe G-Sync is a bit wasted (early days, I know) and that it could be developed into something brilliant, whereas someone with the cash could buy a 120-144 Hz monitor and an AMD GPU instead. Or people with existing NVIDIA cards could buy all-new monitors.

    I can't wait to see either! Let's see how it develops!
     
  24. arbiter

    Joined:
    Jun 13, 2012
    Messages:
    520 (0.47/day)
    Thanks Received:
    91
    It's like a lot of new tech: it's not something that can just be magically added to current monitors. Even with a 120-144 Hz monitor you will get image tearing when the FPS is lower than 120 or 144. G-Sync ditches that fixed rate, which is pretty much a revolutionary thing in monitors, a space that hasn't seen any new tech in what, 10+ years? I hope that over the next few years all monitors use something similar to G-Sync and ditch the fixed-refresh-rate crap that was kept around from old CRT monitors.
     
    Last edited: Nov 17, 2013
  25. Agility

    Agility

    Joined:
    Jun 10, 2005
    Messages:
    1,665 (0.45/day)
    Thanks Received:
    54
    Location:
    Singapore
    You do realise that your POV is basically that of an NVIDIA fanboy? No offence intended, but G-Sync is a good idea turned into a stupid PR stunt by NVIDIA by allowing only their cards to use it. It not only adds "exclusive" costs to already over-priced 1440p monitors, but loses its usefulness BECAUSE only NVIDIA cards can use it.
     