
AMD Radeon VII 16 GB

AMD shit the bed with HBM2. On paper it was a smart investment (it is faster), but GDDR6 makes it irrelevant for current gaming content, and it's too expensive for AMD to sell the product at a competitive price.

I had high hopes for HBM2; it's just too little, too late.
 
AMD shit the bed with HBM2. On paper it was a smart investment (it is faster), but GDDR6 makes it irrelevant for current gaming content, and it's too expensive for AMD to sell the product at a competitive price.

I had high hopes for HBM2; it's just too little, too late.
HBM2 is not too little, too late. It just isn't for the PC gaming market: there doesn't seem to be an advantage in gaming, just a high price tag.
On the other hand, for a professional card, go all in with HBM2.
 
HBM2 is not too little, too late. It just isn't for the PC gaming market: there doesn't seem to be an advantage in gaming, just a high price tag.
On the other hand, for a professional card, go all in with HBM2.
Is this not advertised as a gaming card?
 
HBM2 is not too little, too late. It just isn't for the PC gaming market: there doesn't seem to be an advantage in gaming, just a high price tag.
On the other hand, for a professional card, go all in with HBM2.
You can go back and watch the unveiling at CES. From the start, AMD marketed it as a Gaming + Creation card.
 
Can you check with Forza Horizon 4 at some point too? It has a reworked order processor/buffer which is supported by the VII, if I remember correctly.
A couple of posts above there was talk about outliers. There is something in FH4 that brings down the performance of Nvidia cards compared to AMD ones, at least at 1080p and 1440p. At 2160p the results are back to where you would roughly expect. On the other hand, TechReport in their Radeon VII review found (and AMD told them) that 8x MSAA is rough on AMD in that title.
 
I will call this card Vega VII, IDK.

People should not be criticizing Vega VII performance based on 1080p and 1440p. It is clearly a 4K card. No one makes a card with 16 GB of VRAM and 1 TB/s of bandwidth even for 1440p. That will be a thing in the future, but not for now. Having said that, it's equal to the GTX 1080 Ti and 5-10% short of the RTX 2080, so it's not a failure. Just because it has the most horrible cooler since the R9 290 doesn't mean shit.

Third-party factory models with much better coolers and core overclocks will come eventually, at least equaling the RTX 2080 FE.
 
The card isn't bad, but the part they failed on was going with 16 GB instead of 8 GB. If they had gone with 8 GB, the price undercut would have made the card a better option to look at, but at the current price point it's kind of a wash, with the checkbox win going to the RTX 2080.
 
Wow, that spiraled out of control quickly... That, and 3 days of not being on the forum, lol. So apologies for the out-of-date reply..

Do tell. Compute is the only reason I could come up with after seeing these results. Otherwise, it's slower, uses a lot more power, and is quite noisy in this form. Remind me why I should pay the same money for less?

Sorry.. what?
As I said, you could, meaning people have a choice; right or wrong, it's a choice for people to decide what they want to have :)

I was referring to the amount of cash, cards etc. that Nvidia release; they're either giving Intel ideas or just releasing a lot of cards to confuse people... Either way, this thread is the same as an AMD vs Intel one.. Everyone has a preference and will voice it high and wide... It's all a choice for whoever buys one :) If it was closer to the £600 mark or even less than that, I'd definitely consider it.. :) Sadly not enough cash to do so right now..

I never mentioned power issues; I'm guessing that's AMD giving the GPU core a little more juice than it needed, just like Vega.. Hopefully it might change in a new card release, whenever that might be, but I don't really tend to worry about power usage... Efficiency is great but still, we've all had less efficient GPUs in our time... I can remember a few :)
Seems I did mention the power issue as well... So I was kinda right :) (Makes a change! :))
 
I will call this card Vega VII, IDK.

People should not be criticizing Vega VII performance based on 1080p and 1440p. It is clearly a 4K card. No one makes a card with 16 GB of VRAM and 1 TB/s of bandwidth even for 1440p. That will be a thing in the future, but not for now. Having said that, it's equal to the GTX 1080 Ti and 5-10% short of the RTX 2080, so it's not a failure. Just because it has the most horrible cooler since the R9 290 doesn't mean shit.

Third-party factory models with much better coolers and core overclocks will come eventually, at least equaling the RTX 2080 FE.
Its performance numbers dictate it isn't. That type of bandwidth and that amount of VRAM isn't needed at 4K either. It is simply a cut-down Instinct card marketed for gaming. Nobody said it's a failure... (well, maybe one or two people did), but people are disappointed that it is, in general, slower than a 2080, costs the same, uses a lot more power, is noisy, drivers are borked at launch... etc. While I wouldn't call it a failure, there are a few shortcomings when compared to the RTX 2080 at the same price point.
 
In fact, some games in @W1zzard 's review are heavily biased toward Nvidia GPUs (Darksiders 3 gives a 30% diff, Civ6 28%, Dragon Quest XI 43% and Hitman 2 35% at 1440p). On the other side, only Strange Brigade gives a 10% diff in favor of the Radeon VII. Without those games, the difference on average would be 9% at 1440p and 5% at 4K. The 2080 would win for sure, but not by that far. Guru3D's review results on average are exactly the ones without those biased games. And some DX12 games are absent as well (Sniper Elite 4). I am not judging the professionalism of our @W1zzard, but I am suggesting some changes to his game list that would make the results better balanced.
This is something I have always complained about with those charts, because the games that put Nvidia at a huge advantage are very easy on resources; they run at over 180-250 FPS, which triggers Radeon's notorious CPU bottlenecks on DX11 draw calls when pushing such a huge number of frames, which makes no difference to playability. The RTX 2080 will be faster overall, but the gap would be smaller. The point is not to put AMD in a better light, more like using games that are well optimized regardless of the platform; games like Dragon Quest XI look like hot garbage and even a GTX 1050 can max them.
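The averaging argument above can be sketched numerically. The four "biased" deltas and the Strange Brigade result are the percentages quoted in the post (RTX 2080 lead over Radeon VII at 1440p, negative meaning a Radeon lead); the remaining titles are modelled here as a flat 9% lead, which is an illustrative assumption, not re-measured data.

```python
# Sketch: a few lopsided titles can move the overall average noticeably.
deltas = {
    "Darksiders 3": 30,
    "Civ6": 28,
    "Dragon Quest XI": 43,
    "Hitman 2": 35,
    "Strange Brigade": -10,                  # Radeon VII leads here
    **{f"other_{i}": 9 for i in range(16)},  # hypothetical mid-pack titles
}

outliers = ("Darksiders 3", "Civ6", "Dragon Quest XI",
            "Hitman 2", "Strange Brigade")

def mean_delta(data, exclude=()):
    """Plain average of the per-game deltas, optionally excluding titles."""
    vals = [v for k, v in data.items() if k not in exclude]
    return sum(vals) / len(vals)

print(f"all 21 games:     {mean_delta(deltas):.1f}%")
print(f"outliers removed: {mean_delta(deltas, exclude=outliers):.1f}%")
```

With these made-up mid-pack numbers, dropping the five outliers pulls the average lead from roughly 13% down to 9%, which is the shape of the complaint, whatever the exact figures.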
 
It's not Nvidia's fault that AMD never spent money on multi-threading optimizations in their drivers. AMD's DX11 deficit is nothing compared to OpenGL, where the performance difference is abysmal.
 
It's not Nvidia's fault that AMD never spent money on multi-threading optimizations in their drivers. AMD's DX11 deficit is nothing compared to OpenGL, where the performance difference is abysmal.
I am not blaming Nvidia; it's just that the GCN architecture itself is forward-looking and the DX11 API is obsolete, it's 10 years old! But GCN already needs a revamp: those 4 shader engines and 64 ROPs are hindering the potential, and to a lesser degree, the hardware scheduler and the tessellator.
 
With 16 GB VRAM, Radeon VII has more memory than any other graphics card below $1000, twice that of RTX 2080, which really makes no measurable difference in any of our tests.
Thanks for that :)
I found the performance per dollar charts really amusing lol.

AMD has just managed to build the second worst value consumer card ever, beaten only by the 2080Ti!
And people will still buy them. When you sell them as fast as you can make them, value is not a concept sales managers worry about. Like a shoe made by Jimmy Choo, many folks pay for status; others will pay because having nothing but the best is "their thing". With AMD unable to put anything in this space, you don't have to care about value when you have no competition.


because Lisa Su was the last person in the industry I trusted. and she lost that trust today. now i trust none of them, intel, amd, or nvidia. so i might as well just go with classic intel and nvidia for my next build. 9700k is the best gaming chip especially since i see it for $369 free ship no tax right now.
It's called capitalism... corporations are bound to serve, better said legally required to maximize profits for, their shareholders (within the law), no one else. A situation where stores can't keep cards in stock and you lower prices... that's a betrayal of your responsibilities to shareholders.


What? You can't be serious? Why would you not get a likely perfectly good cpu that is going to be likely better than the 9700K because of this GPU? What did you really expect from this GPU? Simple math told you this is where it was going to be. Besides, the i9 9900K is basically the equivalent of this GPU in the CPU space.
Because every time AMD says they will be... they ain't. And then the fans are all clamoring and exclaiming, "yeah, but it's great at (insert one of the things that 98% of consumers never do)".


i know the person who made them, I will ask him. I don't think he is trying to be misleading though, GND is a small site.
That's actually quite common... it's done to emphasize differences. 98 to 99 looks a lot bigger starting the axis at 95 than at 0 :) ...it's oft argued that using 0 makes the graphic too big.


It's a common problem with AMD across the board, even their CPU's are unnecessarily overvolted.
It's how AMD has competed since the 290x... they basically overclock the bejeezus out of their cards before boxing them. When the 290x was released, they got loads of press saying it was faster than the 780... but it was a faux victory... once the web sites had a chance to test those cards, which came clocked out of the box running at 95C, the 780 was the faster card. Since then, AMD's overclocking headroom has been limited to single digits, whereas Nvidia's is almost always double digits, sometimes breaking 30%.


I wanted an RX Vega 64, but wasn't sure about my 3-monitor setup and the 8 GB of RAM, so I have waited for the 16 GB version for more than a year. The reference design has but one problem (besides all the bugs the reviews talk of): the lack of DVI/HDMI support for 3 monitors.

What would be best for a 3 x 1920x1200 setup in Eyefinity, a DisplayPort-to-HDMI or a DVI-D converter?

BTW, the Danish price is around 5,749 dkr including tax and delivery on 22 Feb 2019; some are actually in stock, but on sale for as high as 6,500 dkr.
2160p is just 4 x 1080p, so if ya have the memory to run 4K, you have a third more memory than ya need for 3 x 1080p. This card has 3 DisplayPorts, as do most 2080s.
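The pixel arithmetic behind that reply can be sketched in a few lines. Framebuffer size scales with pixel count (at a fixed pixel format), so this is only a rough proxy for VRAM needs; textures and render targets dominate in practice. The 4-bytes-per-pixel figure assumes a plain RGBA8 buffer.

```python
# Pixel and framebuffer arithmetic for 4K vs triple-monitor Eyefinity.
def pixels(w, h):
    return w * h

def framebuffer_mib(w, h, bytes_per_pixel=4):
    """Single RGBA8 framebuffer size in MiB (rough proxy only)."""
    return pixels(w, h) * bytes_per_pixel / 1024**2

uhd = pixels(3840, 2160)               # one 4K screen
triple_1080 = 3 * pixels(1920, 1080)   # three 1080p panels
triple_1200 = 3 * pixels(1920, 1200)   # the asker's actual panels

print(uhd / triple_1080)               # 4K vs 3 x 1080p -> exactly 4/3
print(uhd / triple_1200)               # 4K vs 3 x 1920x1200 -> 1.2
print(framebuffer_mib(3840, 2160))     # ~31.6 MiB per 4K RGBA8 buffer
```

So "a third more" holds exactly for 3 x 1080p; for the asker's 1920x1200 panels the headroom is closer to a fifth, which only strengthens the point.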
 
I am not blaming Nvidia; it's just that the GCN architecture itself is forward-looking and the DX11 API is obsolete, it's 10 years old! But GCN already needs a revamp: those 4 shader engines and 64 ROPs are hindering the potential, and to a lesser degree, the hardware scheduler and the tessellator.
I remember that during the first GCN releases, GCN was praised for giving priority to DX11 over 9.
DX11 and OpenGL are "obsolete" (for gaming), but not all developers want to "waste time" on the extra work that DX12, Vulkan and Metal mean (and it's A LOT of work), so, for some years, just as it was with DX9, DX11 will still be a priority.
Also, there's the issue that Vulkan is based off OpenGL ES, which means it's not as feature-rich (bindless textures, for example), plus whatever is missing from dropping OpenGL compat.
 
I remember that during the first GCN releases, GCN was praised for giving priority to DX11 over 9.
DX11 and OpenGL are "obsolete" (for gaming), but not all developers want to "waste time" on the extra work that DX12, Vulkan and Metal mean (and it's A LOT of work), so, for some years, just as it was with DX9, DX11 will still be a priority.
Also, there's the issue that Vulkan is based off OpenGL ES, which means it's not as feature-rich (bindless textures, for example), plus whatever is missing from dropping OpenGL compat.
Vulkan is based on Mantle. Unless I am wrong, Mantle is not based on OpenGL and is even programmed differently.
 
I won't comment on the performance very much. RTX cards are very fast, so it was expected that Radeon VII will look relatively worse than Vega 64 did (Vega 64 matched 1080).

There are two more important things to take away:

1) The noise level is unacceptable
Is it the loudest card available today? Certainly near the "top". And it's an expensive, high-end model with 3 fans.
I wonder how AMD supporters feel about it? Do they think AMD lost contact with reality? Or maybe AMD simply doesn't respect their customers?

2) Many people kept saying that 7nm is the cure for all AMD problems. We get the first product and it clearly hasn't helped a lot.
Of course this node will get better over time, but AMD doesn't have time - Zen 2 is around the corner.
1. The noise:

At idle: Radeon VII = 27 dBA / MSI 2080 Gaming = 0 dBA.
Under load: Radeon VII = 43 dBA / MSI 2080 Gaming = 36 dBA ... that's 1.625 times louder... and the reference Nvidia card is quieter than MSI's factory-OC'd version.

2. It's been a long while since we have seen products from AMD competing at the top end of the gaming segment; that's why all the pre-launch talk is about 7nm and the number of cores, and no talk about results. Mantle was gonna change everything, HBM was gonna change everything, 7nm was gonna change everything. The "pump and dump" crowd use this to rake in millions from uninformed investors, and it keeps the name in the news. In the end, the new machine's value remains tied to the performance it delivers (increase in actual user productivity or FPS) versus the cost of putting it on the desk (everything inside the box or connected to it).

We could very well see significant improvements from the AIB cards, or even after what I call the 2nd beta period (the first 3 months after release) when most problems are addressed, but given the starting point we are looking at, I see this as a $550 card.


It is worth mentioning that this is the first "top" model from AMD in recent years that don't come close to their Nvidia counterpart in performance, so in essence we can call this their largest fail yet.
AMD last took the title with the 290x... it lasted about a week. They lost it in the press when the 780 Ti dropped a week later... but they actually lost it before that, when the web sites compared the 290x OC'd against the 780 OC'd... because of the decision to aggressively OC the 2xx series cards before putting them in the box, they lost the title once those cards were tested.


Just read a lot of reviews from multiple sites. The consensus is fairly clear: this is a statement card. A statement to the market that says AMD is not done making consumer GPU yet. And that is it. Value is not good for the current price ($599 would be the sweet spot).
I won't contest the $599, tho $550 seems more appropriate to my eyes, considering the PSU should be 100 watts larger... and it will need an extra fan to keep case interior temps the same. But unless the AIB folks get that noise down, that's a deal breaker... it should include a set of 30-foot cables so the PC could be placed in another room. As for the statement... statements get made every day... look at politics... the question is, is anyone taking it seriously? If the price stays at $700, I have to say no.


While I don't care what brand I buy (I buy whatever has the best price/performance/power consumption), if you buy a gaming card either from AMD or NVIDIA they are all crap for any serious CAD work and won't accelerate anything, for that you need FirePro or Quadro. With NVIDIA at least you can use CUDA even on gaming cards.
That's a common misconception, at least in the construction fields. I have been a practicing engineer for over 40 years and started my own consulting business in 1990. Because we couldn't get what we wanted without a $2,000 markup, we built our own boxes, for us and others.

The fact is, AutoCAD is almost entirely single-threaded; there's no need for, or benefit from, 6, 8, 10, or more core CPUs because the program can't use them. In addition, FirePro / Quadro are historically poor performers up against GTX cards when doing 2D and 3D CAD work. I'm not saying that a $2,000 / $4,000 FirePro / Quadro won't beat a $600 GTX card, but they win by less than 1% in 2D CAD and actually lose by a significant margin in 3D CAD. Quadro and FirePro excel at modeling, rendering and animation. Radeon has oft taken the top spot over all comers in AutoDesk Inventor. Maya most often favors Quadro, followed by FirePro, and the same goes for Solidworks.

Most architect / engineer offices I visit will typically have, say, 6-10 CAD stations, with just one on Quadro for the rendering stuff and the rest for straight 2D / 3D CAD. The one reason for using Quadro in this environment... without it, no AutoDesk support. When visiting another office some years ago, after a meeting and a lunch, we went back to the office and one of the folks was asking me about AutoCAD performance tweaks. I sat down at his workstation (Quadro) and maneuvered my way around the drawing... then opened it on my lappie (GTX) and it was faster. Like anything inside a PC case, it's wise to match your hardware and budget to your specific software requirements.
 
Vulkan is based on Mantle. Unless I am wrong, Mantle is not based on OpenGL and is even programmed differently.
Vulkan took the "command list" approach from Mantle, but it's still Khronos' work (like OpenGL). If Vulkan were only Mantle (a GCN-specific API), it wouldn't work on Nvidia and Intel hardware too.
 
1. The noise:

At idle: Radeon VII = 27 dBA / MSI 2080 Gaming = 0 dBA.
Under load: Radeon VII = 43 dBA / MSI 2080 Gaming = 36 dBA ... that's 1.625 times louder... and the reference Nvidia card is quieter than MSI's factory-OC'd version.
1.625 times what? The power of the source? Certainly not. :-D
 
This is something I have always complained about with those charts, because the games that put Nvidia at a huge advantage are very easy on resources; they run at over 180-250 FPS, which triggers Radeon's notorious CPU bottlenecks on DX11 draw calls when pushing such a huge number of frames, which makes no difference to playability.
So, Darksiders 3, Civ6, Dragon Quest XI, Hitman 2, right?
At 1440p, Radeon VII does 77.3, 88.9, 93.1 and 61.5 FPS average in these games.
At 2160p, Radeon VII does 39.3, 79.0, 44.4, 39.7.
What are you talking about? These are very far from 180-250 FPS.

I am simply suggesting not having 3 games using the same engine (Unreal Engine 4), with 2 of them showing much worse performance for a specific brand (only Senua's Sacrifice is optimised well for both AMD and Nvidia out of those 3). Outliers aren't good for objectivity. And I don't think Civ6 is neutral in how it performs for AMD. But the solution could be for @W1zzard to show the performance of all games and count only the unbiased ones (<25% diff for same-tier GPUs) in the summary. A win-win scenario for all, imho.
Unreal Engine is by far the most popular engine around. One could argue that, at 3 out of 21, it is underrepresented.
There are biased games, and there always will be. DX12 and Vulkan will make that a lot worse; with lower-level APIs, developers will write engines and games that end up more tuned to specific architectures.
 
Yes, I believe so. Ask AMD why? I'm curious how they'd answer that question lol
If the rumors about ~5000 pieces being made are true, then there's no point in asking why. Who cares? Most of us won't even see this card live.

If they decide to make more of them, then the answer is pretty obvious. I've given it a few times already in other topics.
No one buys Radeon Instinct.
AMD orders HBM possibly a year or so in advance (maybe someone here knows?), and they ended up with an awful GPU. What do you expect them to do? Throw it in the bin? They do whatever they can to sell the inventory.
In this case: they rebrand a GPGPU accelerator as a gaming card. Not a very efficient one, but with enough oomph to pull it off (performance-wise).
Mind you, it's still awful as consumer electronics.
 
In loudness, 7dB more is ~1.625 times more loud in psychoacoustics.
So first of all: I would love to see a text that describes how "n times more loud" is measured. Any source?
Of course I know we don't hear the power increase (which in this case is ~5), but where does the 1.625 come from?

Second: even assuming something called "psychoacoustics" works, what you said may only be correct for some reference levels.
Certainly, 30dB vs 23dB and 100dB vs 93dB have different values of perceived loudness increase. :)
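Where the 1.625 plausibly comes from can be shown in a few lines: the common psychoacoustic rule of thumb is that perceived loudness doubles for every +10 dB, i.e. a ratio of 2^(ΔdB/10), while sound power grows as 10^(ΔdB/10). This is a mid-level rule of thumb, not a measurement, and it says nothing about the reference-level dependence raised above.

```python
# Rule-of-thumb loudness vs power ratios for a given dB difference.
def perceived_loudness_ratio(delta_db):
    """Perceived loudness ratio: doubles per +10 dB (rule of thumb)."""
    return 2 ** (delta_db / 10)

def power_ratio(delta_db):
    """Sound power ratio: x10 per +10 dB (by definition of the decibel)."""
    return 10 ** (delta_db / 10)

delta = 43 - 36  # Radeon VII vs MSI 2080 Gaming under load, from the post
print(f"perceived: {perceived_loudness_ratio(delta):.3f}x")  # ~1.625x
print(f"power:     {power_ratio(delta):.2f}x")               # ~5.01x
```

So 2^0.7 ≈ 1.625 matches the "1.625 times louder" claim, and 10^0.7 ≈ 5 matches the "~5" power figure mentioned above.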
 
I wasn't aware you made the rules regarding how I may measure the card. So it is either good or bad based on exactly every criterion? So noise being bad, and mentioning it as such, still means the card is bad overall, even though maybe one can work around the noise in order to enjoy the performance of the card if it is good? It sounds more like you are being pretentious.

Maybe obvious to you. Maybe not to others. I will take your constructive criticism into consideration next time I decide to measure how good or bad a device is.
The first sentence is a straw man. I didn't make the rules about logical expression with the English language, which is what my post was about. Logical expression dictates, regardless of your feelings (i.e. the pretentiousness complaint), that a person can only take one position at a time, not two contradictory positions simultaneously.

When a person says a card is loud and that they can't stand noise, that person is clearly saying noise is unacceptable to them and that the card is loud. This means that the card is included in the realm of things the person cannot tolerate. This means that, for that person, the card is not good (also known as unacceptable).

Performance encompasses all aspects of how a product performs, including the noise it produces. Furthermore, regardless of how many FPS it delivers, you already declared that the product belongs to the category of unacceptable products due to its noise output and your lack of tolerance for noise. Draw a Venn diagram.

Finally, the accusation about pretentiousness is an example of concern trolling (tone complaint). It's a distraction from the point and an example of the ad hominem fallacy.

So first of all: I would love to see a text that describes how "n times more loud" is measured. Any source?
Of course I know we don't hear the power increase (which in this case is ~5), but where does the 1.625 come from?

Second: even assuming something called "psychoacoustics" works, what you said may only be correct for some reference levels.
Certainly, 30dB vs 23dB and 100dB vs 93dB have different values of perceived loudness increase. :)
It seems that this site has all the data anyone here could possibly want on this subject:

http://www.sengpielaudio.com/calculator-levelchange.htm
 