Wednesday, July 19th 2017

AMD's RX Vega Low Key Budapest Event: Vega Pitted Against GTX 1080

On the first stop of AMD's two-continent-spanning RX Vega tour (which, in fairness, counts only three locations), the company pitted its upcoming RX Vega graphics card (which we expect to be its flagship offering) against NVIDIA's GTX 1080. The event itself was fairly subdued, and there was not much to see of the RX Vega graphics card - literally. Both it and the GTX 1080 were enclosed inside PC towers, and event-goers were not allowed even a glimpse of the piece of AMD hardware that has most closely approximated a unicorn in recent times.

The Vega-powered system also made use of a Ryzen 7 processor, and the cards were running Battlefield 1 (or Sniper Elite 4; there's been plenty of discussion on that point, but the first image below does show a first-person view) on nondescript monitors, one supporting FreeSync, the other G-Sync. The monitors' model names were covered by cloth so that attendees couldn't tell which system was running which graphics card, though given ASUS' partnership in the event, both were probably of ASUS make. The resolution used was 3440 x 1440, at which a GTX 1080 should manage over 60 FPS on Ultra. Users who attended the event reported that one of the systems lagged slightly in one portion of the demo, though we can't confirm which one (and I'd say that was AMD's intention).
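For context on the test resolution, here is a quick pixel-count comparison (pixel count is only a rough proxy for GPU load; frame rates don't scale perfectly linearly with it):

```python
# Pixel counts of the event's ultrawide resolution vs. common alternatives.
qhd = 2560 * 1440          # standard 1440p: 3,686,400 pixels
ultrawide = 3440 * 1440    # the event's resolution: 4,953,600 pixels
uhd = 3840 * 2160          # 4K: 8,294,400 pixels

print(ultrawide / qhd)     # ~1.34: about a third more pixels than 1440p
print(ultrawide / uhd)     # ~0.60: about 60% of the 4K pixel load
```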
All in all, I have to say this tour doesn't inspire much confidence in me. This isn't the kind of "in your face" comparison we're used to seeing from companies that know they have a winning product; were the comparison largely in AMD's favor, I posit the company would be taking full advantage of it by showcasing its performance leadership. Instead, there was an inordinate amount of smoke and mirrors here, with AMD going out of its way to prevent attendees from telling its offering apart from its competitor's.
AMD reportedly told attendees that the AMD and NVIDIA systems had a $300 price difference in AMD's favor. With all other hardware being equal, and accounting for AMD's stance that a FreeSync monitor tends to cost around $200 less than a comparable G-Sync-enabled one, that leaves around $100 in savings attributable solely to the RX Vega itself. This would place the RX Vega in the $459-$500 bracket, if current GTX 1080 pricing is what AMD considered.
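The arithmetic behind that bracket can be sketched quickly. Only the $300 system delta and the ~$200 G-Sync premium come from the event; the GTX 1080 street prices below are assumptions for illustration:

```python
# Implied RX Vega pricing from AMD's claimed figures.
system_delta = 300      # claimed total system price difference (USD)
gsync_premium = 200     # claimed G-Sync vs. FreeSync monitor premium (USD)

# Savings attributable to the GPU itself once the monitor delta is removed
gpu_delta = system_delta - gsync_premium   # 100

for gtx_1080_price in (559, 599):          # assumed GTX 1080 street prices
    print(gtx_1080_price - gpu_delta)      # implied RX Vega price: 459, 499
```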
Sources: Reddit User @ Szunyogg, RX Vega Budapest Google Photos, WCCFTech

175 Comments on AMD's RX Vega Low Key Budapest Event: Vega Pitted Against GTX 1080

#76
efikkan
Did AMD actually showcase anything new here (besides the hired help)? It would be interesting to see some performance figures.
Posted on Reply
#77
warup89
Has there ever been this much hype for an AMD GPU before? I've been following AMD ever since the ATI 9800 Pro days because I personally like them, but dang, everyone seems to be blowing this coming GPU out of the water; the 295X2 debut was not even like this, even though that was a milestone for them in terms of raw performance.

I find it funny how "naked eye FPS comparisons" are even making tech news. Like, c'mon, hahah, what's next? Footage of a PC running YouTube while using an RX?
Posted on Reply
#78
silentbogo
warup89Has there ever been this much hype for an AMD GPU before? I've been following AMD ever since the ATI 9800 Pro days because I personally like them, but dang, everyone seems to be blowing this coming GPU out of the water; the 295X2 debut was not even like this, even though that was a milestone for them in terms of raw performance.

I find it funny how "naked eye FPS comparisons" are even making tech news. Like, c'mon, hahah, what's next? Footage of a PC running YouTube while using an RX?
Well, the RX 480 was kind of the same way, except eventually we got the actual numbers. Same with Pascal.
The only difference is that Vega is taking so long from initial announcement to actual release that even pseudo-tech-enthusiast sites have completely lost interest in making bogus benchmark screenshots and fake photos for the sake of ad revenue :banghead:
Everyone is tired... Radeon Group, OEMs, fans, potential buyers... even boobs don't help in this depressing situation. :cry:
Posted on Reply
#79
cdawall
where the hell are my stars
ZoneDymono it does not.... a stock HD5870 does only 4 fps less than a stock GTX480...
and you claimed you needed a HD5970 to get the same performance... in fact from there you see a HD5970 does a lot better.


and yeah nothing looks as hot as the GTX480 as the GTX480 was extremely hot ;)




ONE GAME DOES NOT DESCRIBE A VIDEO CARD. I don't know how else that needs to be phrased for people, but it is a thing. Overall, the only same-generation AMD card that is competitive is the 5970, and yes, captain obvious, it beats it. It also consumes 30 watts less power, which I already said.

The reference 480 was hot, correct; it also consumed quite a bit of power (quite a bit less than Vega), but that design was not bad, and the Fermi 2.0 cards were absolutely beautiful.
RejZoRThere was such disparity with GeForce FX in terms of heat and consumption compared to the Radeon 9000 series, and yet they were selling despite everyone saying it's crap. You'll NEVER EVER see that with Radeon cards. As presented with endless exhibits.


2003 was the release period for the 9000 series, and AMD saw a 13% market-share increase. They saw a 21% increase with the X800 series. The market follows just fine when someone releases a good product. They saw a slight market-share bump with the Radeon 4000/5000 series as well. After that, the next "good" AMD product was the Tahiti-based 79x0 cards, which again earned them a bump. Looks like the market works fine.
Posted on Reply
#80
warup89
silentbogoWell, RX480 was kind of the same way, except eventually we got the actual numbers. Same with Pascal.
The only difference is that Vega takes so long from an initial announcement to the actual release, that even pseudo-tech-enthusiast sites completely lost interest in making bogus benchmark screenshots and fake photos for the sake of ad revenue :banghead:
Everyone is tired... Radeon Group, OEMs, fans, potential buyers... even boobs don't help in this depressing situation. :cry:
Right; usually reviews would be out by now to entice customers to buy the cards as soon as they launch. I hope this marketing strategy stops in the future, because I don't see it doing them any good, unless they are trying to hide something. Time will tell.
Posted on Reply
#81
Footman
cdawallNow let's look at this AMD card. 440 W for performance equal to a 180 W GTX 1080. Even Fermi didn't have a disparity like that. In fact, Fermi consumed half that power and competed within 30 watts of the 5970.
That's kind of how I feel, to be honest. Shame really, as I am still waiting for a decent AMD VGA card to run my FreeSync monitor at 144 Hz... Gamers Nexus was able to successfully undervolt their Vega FE and run it at 1600 MHz, so it gives me hope that we will see 1080 performance in the gaming range of cards at less than 440 W.
Posted on Reply
#82
FordGT90Concept
"I go fast!1!11!1!"
Look on the bright side: it's going to be faster than RX 580. ;)
Posted on Reply
#83
notb
TheLostSwedeThe latest generation of TN panels are actually not as bad as they used to be. See this review for example www.tomshardware.com/reviews/dell-s2417dg-24-inch-165hz-g-sync-gaming-monitor,4788-5.html
Yes, they are.
Poor viewing angles are a built-in property of TN. This really IS an issue.
That is a 24" LCD, which helps a lot. If you go for a larger TN LCD (30"+) and use it at a typical desktop viewing distance (50-70 cm), you'll already notice the difference near the edges.

From a different point of view: imagine a situation where two people at the office are looking at the same TN LCD while discussing, e.g., a PowerPoint presentation. They will see different colours on the screen. Good luck. :D
Posted on Reply
#84
Anymal
Well, for gaming, a modern TN is the best.
Any price speculations for the big and AIB Vega? HBM2 is pricey and the Vega chip is bigger than GP102; what can AMD do at all?
Posted on Reply
#85
NGreediaOrAMSlow
Hugh MungusSuppose AMD doesn't want the actual performance to leak. Guess I'm going back into hibernation.
That would make sense with product available on launch day. But a month or two later... nah!

NVIDIA could announce a product the day after and have it on the market within a month.
Posted on Reply
#86
Bytales
I got the new Samsung 32-inch 2560x1440 FreeSync 2 monitor; I'm currently using a watercooled R9 Nano.
Waiting for Vega. It'll definitely be an improvement over the Nano, that's for sure, and it will help with the monitor's 144 Hz refresh rate.
By the way, this monitor, while being VA, does a bit of HDR as well (600 nits), and it is supposedly as fast as a TN, with a 1 ms response time.
I believe THIS would be the best monitor for gaming.
If Vega performs at or above the 1080 while costing less, it's fine by me. I naturally wished for more, but hey, you can't have it all. Besides, more performance will probably be squeezed out of it as the drivers mature.

I'm already gaming decently with the R9 Nano, which is basically a standby card (I had two 1080s before, and sold them), so it can't be that bad with Vega.
Besides, what I'm really hoping for is a DUAL GPU card. Something with like four 8-pin PCI power connectors. I would get two of those just to benchmark.
Posted on Reply
#87
ZoneDymo
FordGT90ConceptLook on the bright side: it's going to be faster than RX 580. ;)
Well, that is kinda where I am surprised by AMD, in a negative sense.
Their old R9 390 is pretty much the same in performance as the RX 580...

So is Vega just a terrible architecture, or what? Can't they just fit an RX 580 with a lot more of... well, everything, and have that be the more high-end card?
I mean, unless you tell me it would not offer any more performance,
why not take an RX 580, put 16 GB of RAM, 64 ROPs and 4608 shaders or something on it, and sell it as an RX 590?
Posted on Reply
#88
r.h.p
RaevenlordOn the first stop in AMD's two-continent spanning RX Vega tour (which really only counts with three locations), the company pitted their upcoming RX Vega graphics card (we expect this to be their flagship offering) against NVIDIA's GTX 1080 graphics card. The event itself was pretty subdued, and there was not much to see when it comes to the RX Vega graphics card - literally. Both it and the GTX 1080 were enclosed inside PC towers, with the event-goers not being allowed to even catch a glimpse of the piece of AMD hardware that has most approximated a unicorn in recent times.

Geezus I'm gonna kill myself before Vega makes it out ...... Hurry Up
Posted on Reply
#89
Pap1er
I'm just wondering why people grumble about the 440 watts of power draw of a Vega FE that's liquid-cooled AND overclocked...
It does seem a little stupid, doesn't it?
An air-cooled card wouldn't draw that much power anyway; the air cooler won't allow it...
Posted on Reply
#90
the54thvoid
Intoxicated Moderator
Pap1erI'm just wondering why people grumble about the 440 watts of power draw of a Vega FE that's liquid-cooled AND overclocked...
It does seem a little stupid, doesn't it?
An air-cooled card wouldn't draw that much power anyway; the air cooler won't allow it...
The problem is that higher draw means higher clocks, which means higher performance; lower clocks and temperature/power limiting mean lower performance. The water-cooled FE is about 10% faster than the air-cooled one. Assuming RX performs 10-20% better on gaming drivers (figures plucked out of the air), air cooling will limit performance.
If you want maximum 24/7 performance, you want water cooling on any card (or a very good custom fan design).
Posted on Reply
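The moderator's rough figures above can be stacked to see what they imply. Illustrative arithmetic only: the 10% water-cooling gain and the 10-20% driver uplift are the commenter's own guesses ("plucked out of air"), not measurements:

```python
# Combining the comment's rough scaling guesses for air- vs. water-cooled Vega.
fe_air = 100.0                 # air-cooled Vega FE, normalized to 100 fps
fe_water = fe_air * 1.10       # water-cooled FE: ~10% faster
driver_uplift = 1.15           # assumed midpoint of the 10-20% guess

rx_air = fe_air * driver_uplift       # ~115: hypothetical air-cooled RX Vega
rx_water = fe_water * driver_uplift   # ~126.5: hypothetical water-cooled RX Vega
print(rx_air, rx_water)
```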
#91
Gasaraki
RejZoRGeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?
It doesn't matter if it's hot and power hungry AS LONG AS IT PERFORMS. The GTX 280, GTX 285 and GTX 295 were all furnaces, but at least they performed at the top of their class. We already have a mid-range card from AMD, the 580. We don't need another card that sucks power and produces tons of heat yet only performs at the level of the 1080. The 1080 Ti is already 30% faster than the regular 1080.
Posted on Reply
#92
RejZoR
You're going against your own "we don't care if it performs" line. Performs like what? If they pit it against the GTX 1080, price it around there, and it performs, how is that not "it performs"? The RX 480 wasn't a top-of-the-line product, but they positioned it against the GTX 1060. And it performed.
Posted on Reply
#93
Gasaraki
RejZoRYou're going against the "we don't care if it performs". Performs like what? If they pit it against GTX 1080 and price it around there and performs, how is that not "it performs"? RX 480 wasn't top of the line product, but they positioned it against GTX 1060 offering. And it performed.
If you have two cards that cost around the same and perform around the same, but one draws 100 W more power and can heat up a small room, which one do you buy?
Posted on Reply
#94
Anymal
100 W more and 1080 performance, one year later. Yes, it "performs". Big chip, HBM2... price is also a big question. I think they need better-than-GP102 (1080 Ti) performance; otherwise we will party like it's 2016.
Posted on Reply
#95
cdawall
where the hell are my stars
Pap1erI'm just wondering why people grumble about the 440 watts of power draw of a Vega FE that's liquid-cooled AND overclocked...
It does seem a little stupid, doesn't it?
An air-cooled card wouldn't draw that much power anyway; the air cooler won't allow it...
Well, issue one would be that it vastly overdraws the ATX spec.
Posted on Reply
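For reference, the over-spec point can be sanity-checked against the nominal PCIe power limits (75 W from the x16 slot, 150 W per 8-pin auxiliary connector, per the PCIe CEM specification); the 440 W figure is the overclocked, liquid-cooled Vega FE draw discussed in this thread:

```python
# Nominal PCIe power-delivery limits (PCIe CEM specification).
PCIE_SLOT_W = 75       # power available from the x16 slot itself
EIGHT_PIN_W = 150      # per 8-pin PCIe auxiliary connector

# A typical high-end board layout: slot plus two 8-pin connectors.
spec_budget = PCIE_SLOT_W + 2 * EIGHT_PIN_W   # 375 W nominal budget
observed_draw = 440                            # reported overclocked Vega FE draw

print(spec_budget, observed_draw - spec_budget)   # 375 W budget, 65 W over
```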
#96
RejZoR
GasarakiIf you have 2 cards and they cost around the same, performance is around the same but one sucks 100W more power and can heat up a small room, which one do you buy?
Well, that didn't stop people from opting for GeForce FX and Fermi, which contradicts your own words. But now that AMD is mentioned: no, no, no, you can't have higher power draw, even if you deliver performance. Why?
Posted on Reply
#97
Anymal
GasarakiIf you have 2 cards and they cost around the same, performance is around the same but one sucks 100W more power and can heat up a small room, which one do you buy?
RejZoRWell, that didn't stop people opting for GeForce FX and Fermi. Which is contradicting your own words. But now that AMD is mentioned, no no no, you can't have higher power draw, even if you deliver performance. Why?
How much did Fermi draw vs. Vega?
Posted on Reply
#98
cdawall
where the hell are my stars
RejZoRWell, that didn't stop people opting for GeForce FX and Fermi. Which is contradicting your own words. But now that AMD is mentioned, no no no, you can't have higher power draw, even if you deliver performance. Why?
Fermi actually performed well. Power draw also went down with most AIB cards; temps really hurt power consumption on those.

I also posted, with a nice neat graph, a breakdown showing AMD's market-share increase during those times. You have chosen to ignore it, however, because it doesn't fit your plight against the green.
Posted on Reply
#99
RejZoR
"Plight against the green." Are you seriously still grasping at those straws?
Posted on Reply
#100
Anymal
RejZoR"Plight against the green." Are you seriously still grasping at those straws?
All readers see your bullshit and you go on and on and on...
Posted on Reply