Monday, August 14th 2017

Gigabyte Announces Its RX Vega 56 Reference Model Graphics Card

In a slightly anemic post on their website, Gigabyte has become the first of AMD's AIB partners to showcase a reference design RX Vega 56 graphics card. This was a silent addition to their website; no press releases have been sent out as of yet. There is only so much a company can say about a reference design graphics card, though, beyond coming up with interesting phrases to sell its capabilities. Specifically, Gigabyte praised:

Next Gen Compute Units
These revamped nCUs (1 nCU = 64 stream processors) are designed to operate at incredible clock speeds and deliver extreme gaming experiences with the newest high resolution and high refresh rate monitors.

Revolutionary Memory Engine
State of the art memory system shatters the limitations of traditional GPU memory, to tackle the growing high res texture packs in today's games.

Optimized Pixel Engine
Optimized rasterizer technology enhances rendering efficiency and leaves more headroom to crank up quality settings while maintaining smooth 3D rendering.

Enhanced Geometry Engine
The enhanced geometry engine can efficiently process millions of polygons that make up your game, helping boost FPS.

Features
  • Powered by AMD Radeon RX VEGA 56
  • Air Cooling System
  • 8GB 2048-bit High Bandwidth Memory (HBM2)
  • Radeon VR Ready Premium
  • 3rd Gen FinFET 14
  • Features HDMI x1 / DisplayPort x3
  • System power supply requirement: 650W
Core Clock
Boost Clock: 1471 MHz
Base Clock: 1156 MHz

The card features 3x DisplayPort 1.3 and 1x HDMI 2.0b outputs, and measures H = 40 mm, L = 280 mm, W = 127 mm (including bracket). If AMD's pricing isn't totally shipwrecked by limited availability and high demand from miners, expect this version of AMD's RX Vega to go for $399.

Source: Gigabyte
11 Comments on Gigabyte Announces Its RX Vega 56 Reference Model Graphics Card

#2
claylomax
Don't worry, that DVI port will be back when AIB partners release their versions.
#3
TheLostSwede
claylomax said:
Don't worry, that DVI port will be back when AIB partners release their versions.
Nooooooooooooooooooooooooooooooooooooooooooooo!
#4
Hood
TheLostSwede said:
Nooooooooooooooooo
I'm not sure I appreciate your attitude towards DVI ports; some of my best friends use DVI. You must be one of those port snobs I've been hearing about...
#5
P4-630
The Way It's Meant to be Played
#6
Dammeron
P4-630 said:

I don't know whether you're laughing at the comments above, or at the author's wishful thinking about Vega's availability. :P
#7
P4-630
The Way It's Meant to be Played
Dammeron said:
I don't know whether you're laughing at the comments above, or at the author's wishful thinking about Vega's availability. :p
The comments above, actually. :p
#8
GhostRyder
TheLostSwede said:
Finally, DVI is put to rest... :toast:
I sure hope we start to see less DVI on cards. They really aren't needed anymore and take up too much room on the back of the card. I would rather all ports be on the bottom of the card in one row instead of two, and the only way they seem to be willing to do that is without DVI (since the DVI port most of the time ends up by itself on the top row).
#9
Totally
GhostRyder said:
I sure hope we start to see less DVI on cards. They really aren't needed anymore and take up too much room on the back of the card. I would rather all ports be on the bottom of the card in one row instead of two, and the only way they seem to be willing to do that is without DVI (since the DVI port most of the time ends up by itself on the top row).
I don't have anything against DVI. It's just that whenever it is included, it is shamelessly pushed up into the second slot, blocking airflow and looking ugly.
#10
GhostRyder
Totally said:
I don't have anything against DVI. It's just that whenever it is included, it is shamelessly pushed up into the second slot, blocking airflow and looking ugly.
I agree, as I too don't like the blocked airflow, the look, or the forced dual-slot conversion. However, my main issue is that it's old tech, and both HDMI and (especially) DisplayPort outclass it. Some people want it for support on certain monitors, but even so, most of the new tech, resolutions, refresh rates, etc. are only going to work on HDMI and DisplayPort, so I think it should go. Adapters exist to turn a DP into a DVI if people really need it.
#11
Totally
GhostRyder said:
I agree, as I too don't like the blocked airflow, the look, or the forced dual-slot conversion. However, my main issue is that it's old tech, and both HDMI and (especially) DisplayPort outclass it. Some people want it for support on certain monitors, but even so, most of the new tech, resolutions, refresh rates, etc. are only going to work on HDMI and DisplayPort, so I think it should go. Adapters exist to turn a DP into a DVI if people really need it.
I am aware of that. But until it is cheaper to include an adapter in the box than to solder another output on there for pennies, as long as there is any demand for DVI, it will be included in order to not risk losing a sale.