
Gigabyte GeForce GTX 1080 Ti AORUS Xtreme Edition Graphics Card Detailed

Discussion in 'News' started by Raevenlord, Mar 20, 2017.

  1. Raevenlord

    Raevenlord News Editor Staff Member

    Joined:
    Aug 12, 2016
    Messages:
    775 (2.43/day)
    Thanks Received:
    734
    It was only a matter of time before Gigabyte applied its custom treatment to the GeForce GTX 1080 Ti. The company has released some pictures of its upcoming AORUS Xtreme Edition, its take on what is currently the world's most powerful gaming graphics card. As an AORUS-branded card, the AORUS Xtreme will feature Gigabyte's Windforce cooler (triple-slot, with 3x 100 mm fans) and RGB lighting (16.8 million colors). Aiding its triple-fan cooling prowess are a direct-contact copper plate with a six-heatpipe design, as well as a built-in backplate.

    The 1080 Ti AORUS Xtreme Edition has only a single VR-link HDMI port on its front corner (the GTX 1080 version had two). On the rear I/O, however, you'll find 2x HDMI ports (ideal for VR-link), 3x DisplayPort, and 1x DVI. No information on pricing or clock speeds is available at the moment, though the card is expected to hit shelves in mid-April.

    Update: Clock speeds have been revealed by Gigabyte itself. The card's OC Mode runs 1746 MHz boost and 1632 MHz base clocks, while its Gaming Mode lowers those to 1721 MHz boost and 1607 MHz base.
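
    For perspective, the gap between the two modes is a flat 25 MHz on both clocks, roughly 1.5%; a minimal sketch tabulating only the figures quoted above:

    ```python
    # Tabulate the two factory modes using only the clock speeds quoted
    # in the update above (all figures in MHz).
    modes = {
        "OC Mode":     {"base": 1632, "boost": 1746},
        "Gaming Mode": {"base": 1607, "boost": 1721},
    }

    for name, clocks in modes.items():
        print(f"{name}: {clocks['base']} MHz base / {clocks['boost']} MHz boost")

    # Both clocks differ by the same flat offset between modes.
    delta = modes["OC Mode"]["boost"] - modes["Gaming Mode"]["boost"]
    print(f"OC Mode adds {delta} MHz (about 1.5% at these speeds)")
    ```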

     
    Last edited: Mar 27, 2017
  2. GhostRyder


    Joined:
    Apr 29, 2014
    Messages:
    3,617 (3.13/day)
    Thanks Received:
    2,088
    Location:
    Texas
    I am really digging the new Gigabyte designs; however, in my opinion it's ruined by adding back the DVI port, as I think the outputs look much cleaner without it.
     
    Caring1 and peche say thanks.
  3. Joss


    Joined:
    Sep 10, 2014
    Messages:
    274 (0.27/day)
    Thanks Received:
    113
    The plastic shroud covers the heatsink too much; it must impede airflow somewhat.
     
  4. peche

    peche Thermaltake fanboy

    Joined:
    Nov 7, 2014
    Messages:
    5,875 (6.11/day)
    Thanks Received:
    4,567
    Location:
    San Jose, Costa Rica
    I just realized that I love the LED saying 'GeForce GTX' more than any other name or brand... no brand logo, no fan-stop LED, nothing but GeForce GTX. Adding NVIDIA's logo would be nice too.

    Also, Gigabyte's cooler shroud should be a slim metallic frame... and this is a big card; I can't imagine the weight.
     
    Crunching for Team TPU
  5. chodaboy19

    Joined:
    Nov 23, 2010
    Messages:
    99 (0.04/day)
    Thanks Received:
    5
    Does the overlapping fan setup work without issues? Looks pretty interesting!
     
  6. Dj-ElectriC


    Joined:
    Aug 13, 2010
    Messages:
    2,867 (1.14/day)
    Thanks Received:
    1,365
    Trusting GB's fans is like trusting farts.

    Anybody who has worked in a lab repairing parts and PCs knows what I'm talking about.
     
  7. peche

    peche Thermaltake fanboy

    Joined:
    Nov 7, 2014
    Messages:
    5,875 (6.11/day)
    Thanks Received:
    4,567
    Location:
    San Jose, Costa Rica
    Well... I could say that about Zotac or some ASUS cards. I have never faced problems with Gigabyte's cards, though I do know about some issues with the GTX 1000 series...
     
    Crunching for Team TPU
  8. alucasa

    Joined:
    Apr 2, 2009
    Messages:
    3,505 (1.17/day)
    Thanks Received:
    2,291
    Overengineered for epeen?
     
  9. Prima.Vera


    Joined:
    Sep 15, 2011
    Messages:
    3,927 (1.86/day)
    Thanks Received:
    937
    What's with the triple slots recently???
     
  10. Joss


    Joined:
    Sep 10, 2014
    Messages:
    274 (0.27/day)
    Thanks Received:
    113
    What's with the triple fans recently??? And the RGB, and the over-the-top designs, and monster coolers for a 1060 that could do with passive cooling... etc.
    The answer is: they don't design for what we want but for what they think we should want. The explanation is out of place here because it's political/historical.
     
  11. DecanFrost New Member

    Joined:
    Mar 21, 2017
    Messages:
    3 (0.03/day)
    Thanks Received:
    0
    I just got the 1080 version of this and I have to say the cooling on this card is simply awesome. I had an MSI 980 Ti and that card was at max temp in every game (locked at 84°C); with the new card it sits at about 70°C while gaming.
     
  12. night.fox


    Joined:
    Dec 23, 2012
    Messages:
    1,376 (0.84/day)
    Thanks Received:
    920
    Location:
    south korea
    One reason that I know of is that Pascal is a much cooler chip than Maxwell.
     
  13. ddferrari

    Joined:
    Nov 28, 2015
    Messages:
    16 (0.03/day)
    Thanks Received:
    5
    Who cares how "clean" the end of the GPU looks? No one ever complained about DVI ports until the reference 1080 Ti omitted one. Now every monkey-see-monkey-doer out there is hopping on the "get rid of the DVI port, man!" train with no idea why. It makes no difference on a card that doesn't use the reference rear-exhaust cooler.

    A lot of overclockable 1440p Korean monitors have only one input, DVI, which is why they're overclockable. So no, nothing is ruined by including it, and no, I'm not going to try an adapter and hope it can handle my 120 Hz refresh rate. Spend a lot of time admiring the back of your PC, do you?
     
  14. GhostRyder


    Joined:
    Apr 29, 2014
    Messages:
    3,617 (3.13/day)
    Thanks Received:
    2,088
    Location:
    Texas
    First of all, I have mentioned it plenty, including how much better the outputs looked when the R9 Fury line came out and how the port is really unnecessary. Second, most people who don't want it object not only because it looks bad on the back of the card, but because it also blocks airflow for a tech that has become outdated and unnecessary/unused on modern setups. Third, it makes it impossible to make a card single-slot without physically modding it. I like water-cooling cards, as others do, and some upgraded cards have better VRM sections and better yields for overclocking, so they are likely to be purchased even when the buyer is not using the cooler that comes with the card.

    Most monitors in this day and age use HDMI or DisplayPort. DVI is no longer a current tech and cannot handle all the newer features and higher resolutions being supported. Just because some monitors are still around or still being purchased does not mean we should keep supporting the old standard. Adapters are cheap and easily available if you want to keep using the outdated tech, much as with VGA for a while. I know adapters do not work with those "overclockable" monitors, but that is a tiny niche compared to the rest of the market, which now uses HDMI or DP; the industry is pretty much only supporting those two on modern monitors (some keep the connector, but in many cases it cannot even drive the monitor's full performance).

    I stated it was my opinion that I did not like the look and wished the port was not there (it's called my opinion for a reason). Manufacturers do not need to keep an antiquated tech going forever, and DVI's time is up.
     
  15. ddferrari

    Joined:
    Nov 28, 2015
    Messages:
    16 (0.03/day)
    Thanks Received:
    5
    "...most people who don't want it is not only because it looks bad on the back of the card"
    Please. It's one more jack on the back of your computer. It doesn't look bad, or good... it's just there. I doubt anyone's date walked out when they spot that ghastly DVI port. Nor did those LAN party invites just dry up.

    "...it also block(s) airflow..."
    Not on the non-reference design. AIB cards don't exhaust out of the case. In fact, the reference designs, sans DVI port, are hitting 84° C and throttling in reviews, while the AIB cards with DVI are staying in the 60's. No reviewer has openly questioned the card's cooling ability because it has DVI.

    "...for a tech that has become outdated"
    Incorrect. Korean 1440p overclockable monitors are still very much current and available now on Newegg. They all use DVI exclusively. See: Crossover 2795QHD; Pixio PX277; QNIX QX2710; Overlord Tempest X2700C; etc. They are likely no smaller a niche than 4K owners at this time.

    "...and unnecessary/unused on modern setups.
    1440p @120Hz is still very much a "modern" setup, and many of those monitors use DVI-D. It is quite necessary.

    "DVI is no longer an updated tech and cannot handle all the new techs and higher resolutions."
    It handles my 1440p 120Hz signal perfectly. As shown above, this is hardly old tech.
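
    Some context on the numbers both sides are arguing about: dual-link DVI's nominal ceiling is a 330 MHz pixel clock, and 1440p @ 120 Hz needs well more than that, which is exactly why these single-input, scaler-less panels are the ones that can be "overclocked". A minimal sketch of the arithmetic (the raster totals are my assumption, approximating CVT reduced-blanking timings; they are not figures from this thread):

    ```python
    # Rough sanity check on 1440p @ 120 Hz over dual-link DVI.
    # 330 MHz is the nominal dual-link DVI pixel-clock ceiling (two TMDS
    # links at 165 MHz each). The raster totals below approximate CVT
    # reduced-blanking timings (an assumption, not data from this thread).
    DUAL_LINK_DVI_LIMIT_MHZ = 330.0

    def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
        """Pixel clock in MHz for a given total raster size and refresh rate."""
        return h_total * v_total * refresh_hz / 1e6

    # ~CVT-RB totals for 2560x1440: 160 px horizontal blanking, ~41 blank lines
    clk = pixel_clock_mhz(2720, 1481, 120)
    print(f"1440p @ 120 Hz needs ~{clk:.0f} MHz vs the {DUAL_LINK_DVI_LIMIT_MHZ:.0f} MHz nominal limit")
    ```

    At roughly 483 MHz against a 330 MHz rated link, running such a panel at 120 Hz means driving the DVI link itself out of spec, which is the "overclock" in question.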

    Thankfully, AIB partners also disagree with your opinion of DVI, so I have nothing to prove here. But it rubs me the wrong way when someone basically states, "I don't need it myself, so they should get rid of it altogether." It's just so... Millennial.
     
    Last edited: Apr 8, 2017
    Fluffmeister says thanks.
  16. Caring1


    Joined:
    Oct 22, 2014
    Messages:
    5,281 (5.39/day)
    Thanks Received:
    3,249
    Location:
    Sunshine Coast
    So are generalizations :slap:
    Everyone is entitled to their own opinion here; just don't insist yours is the only correct one.
     
    Crunching for Team TPU
  17. ddferrari

    Joined:
    Nov 28, 2015
    Messages:
    16 (0.03/day)
    Thanks Received:
    5
    A factual debate on a forum?
     
  18. GhostRyder


    Joined:
    Apr 29, 2014
    Messages:
    3,617 (3.13/day)
    Thanks Received:
    2,088
    Location:
    Texas
    In order:
    1: Uhh, well, unfortunately it looks cleaner without it and allows for single-slot modifications. My opinion is that it does not look good on the card and does not fit what I would intend the card for, and that's important to me!
    2: Aftermarket cards still blow some air out the back, not all of it like blowers do, but they still push some air that way, and the port is intrusive, so the point still stands.
    3: You can still purchase a monitor that only supports VGA; that does not make it modern. Just because a few monitors only have that connection does not make it a necessity for everyone else. If you went online and picked a monitor at random, it would most likely have HDMI. Not to mention, again, adapters exist for those still on the older tech. But I would say overclocking a monitor is more niche than 4K is... (yes, I am aware monitor overclocking does not work through adapters).
    4: It is a modern resolution and refresh rate, yes, but it uses an older tech at pretty much the maximum it will ever handle unless they decide to bring the standard back. It's the same road VGA (D-SUB) went down.
    5: Congrats, it's basically at its limit, so yes, it's modern in that sense, but everything else about it is dated.

    AIB partners also build boards in quantity, and all the upper Nvidia cards had the same outputs, DVI included, so do the math. Also, if you're going to insult me by calling me a "Millennial" for stating my opinion about a graphics card design, then I think you need to look in a mirror, because:
    A: You're stating your opinion is more valid than mine because you said it.
    B: You're stating that because you own one of these niche monitors, they need to keep supporting it because it fancies you.

    So if anything, by your definition of a "Millennial" (which is actually a term for people born from the early 1980s to the early 2000s, a definition I fall under), you are acting more that way than I am. Either way, my point stands. If you wish to keep arguing over an opinion I merely stated as my own, not directed at anyone nor intended to insult anyone, then by all means go ahead. You are almost guaranteed to receive one like from Fluffmeister for every comment, for your trouble.
     
  19. ddferrari

    Joined:
    Nov 28, 2015
    Messages:
    16 (0.03/day)
    Thanks Received:
    5
    "Aftermarket cards still blow out the back, not all the air like blowers but they still push some air that way and that is intrusive so the point still stands."

    Take a quick peek at the chart below from TechPowerUp. No AIB review from any site I've read has stated that. I'll let you take a mulligan on that one if you'd like. ;)

    GPU Temperature Comparison (Idle / Load / Gaming Noise):

    Gigabyte GTX 1080 Ti Xtreme Gaming (DVI): 46°C / 71°C / 33 dBA
    MSI GTX 1080 Ti Gaming X (DVI): 53°C / 72°C / 35 dBA
    ASUS GTX 1080 Ti STRIX OC (DVI): 46°C / 69°C / 33 dBA
    GTX 1080 Ti FE (no DVI): 34°C / 84°C / 39 dBA
    Titan X Pascal (no DVI): 42°C / 84°C / 39 dBA

    https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Ti_Xtreme_Gaming/34.html
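
    To put a number on the chart, a minimal sketch that averages the quoted load temperatures by cooler type (values copied from the chart above; the DVI flags just restate its labels):

    ```python
    # The five cards from the chart above: (name, load temp in °C,
    # gaming noise in dBA, has a DVI port); values copied from the chart.
    cards = [
        ("Gigabyte GTX 1080 Ti Xtreme Gaming", 71, 33, True),
        ("MSI GTX 1080 Ti Gaming X",           72, 35, True),
        ("ASUS GTX 1080 Ti STRIX OC",          69, 33, True),
        ("GTX 1080 Ti FE",                     84, 39, False),
        ("Titan X Pascal",                     84, 39, False),
    ]

    aib = [load for _, load, _, dvi in cards if dvi]
    ref = [load for _, load, _, dvi in cards if not dvi]
    print(f"AIB cards (with DVI) average load: {sum(aib) / len(aib):.1f} °C")
    print(f"Reference (no DVI) average load:   {sum(ref) / len(ref):.1f} °C")
    ```

    That roughly 13°C gap between the groups is the throttling headroom being argued about.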

    "A: You're stating your opinion is more valid than mine because you said it."

    No, I'm saying my opinion is more valid than yours because the entire AIB industry's actions support my view and don't align with your assertions at all.
    The only reason they omitted DVI on the reference cards is that there are major heat/throttling issues (see the chart above) and they need all the cooling help they can get, not because DVI is dead. I doubt they'd bother including DVI adapters if the port were as antiquated as you say.

    Over half the world's population now uses the internet, over 3.5 billion people. If a modest 1 in every 1,000 users has one of these monitors, that's 3.5 million of them worldwide. It's not nearly as "niche" as you claim.

    "B: You're stating that because you own one of these niche monitors, they need to keep supporting it because it fancies you."

    And you're stating that they should stop supporting DVI-D because you think the port looks icky on the back of your case. Okay... (o_O)
    The difference here is that your opinion is all about you, while mine represents me and roughly 3.5 million other folks, still using modern and desirable monitors, who would be shafted.
    So, yes: Millennial.
     
    Last edited: Apr 8, 2017
