
AMD Radeon VII 16 GB

Joined
Sep 15, 2016
Messages
473 (0.17/day)
AMD shit the bed with HBM2. Theoretically it was a smart investment (it is faster) but GDDR6 makes it irrelevant with current content for gamers and it's too expensive to sell their product at a price that would be competitive.

I had high hopes for HBM2, it's just too little too late.
 
Joined
Mar 23, 2005
Messages
4,054 (0.58/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
AMD shit the bed with HBM2. Theoretically it was a smart investment (it is faster) but GDDR6 makes it irrelevant with current content for gamers and it's too expensive to sell their product at a price that would be competitive.

I had high hopes for HBM2, it's just too little too late.
HBM2 is not too little too late. But what it isn't for is the PC Gaming Market. Because there doesn't seem to be an advantage in gaming, just a high price tag.
On the other hand, For a professional card, go all in with HBM2.
 
Joined
Sep 15, 2016
Messages
473 (0.17/day)
HBM2 is not too little too late. But what it isn't for is the PC Gaming Market. Because there doesn't seem to be an advantage in gaming, just a high price tag.
On the other hand, For a professional card, go all in with HBM2.

Is this not advertised as a gaming card?
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
HBM2 is not too little too late. But what it isn't for is the PC Gaming Market. Because there doesn't seem to be an advantage in gaming, just a high price tag.
On the other hand, For a professional card, go all in with HBM2.

You can go back and watch the unveiling at CES. AMD marketed it from the start as a Gaming + Creation card.
 
Joined
Feb 3, 2017
Messages
3,475 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Can you check with Forza Horizon 4 at some point too? It has a reworked order processor/buffer which is supported by the VII, if I remember correctly.
A couple of posts above there was talk about outliers. There is something in FH4 that brings down the performance of Nvidia cards compared to AMD ones, at least at 1080p and 1440p. At 2160p the results are back to roughly where you would expect. On the other hand, TechReport found in their Radeon VII review (and AMD told them) that 8x MSAA is rough on AMD in that title.
 
Joined
Feb 22, 2009
Messages
757 (0.14/day)
System Name Lenovo 17IMH05H
Processor Core i7 10750H
Video Card(s) GTX 1660 Ti
Audio Device(s) SSL2
Software Windows 10 Pro 22H2
Benchmark Scores i've got a shitload of them in 15 years of TPU membership
I will call this card Vega VII, IDK.

People should not be criticizing Vega VII performance based on 1080p and 1440p. It is clearly a 4K card. No one makes a card with 16 GB of VRAM and 1 terabyte per second of bandwidth even for 1440p. That will be a thing for the future, but not for now. Having said that, it's equal to the GTX 1080 Ti and 5-10% short of the RTX 2080, so it's not a failure. Just because it has the most horrible cooler since the R9 290 doesn't mean shit.

Board-partner models with much better coolers and core overclocks will come eventually, at least equaling the RTX 2080 FE.
 
Joined
Jun 13, 2012
Messages
1,316 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
The card isn't bad, but where they failed was going with 16 GB instead of 8 GB. With 8 GB, a price undercut would have made the card a better option to look at, but at the current price point it's kind of a wash, with the checkbox win going to the RTX 2080.
 

phill

Moderator
Staff member
Joined
Jun 8, 2011
Messages
15,861 (3.39/day)
Location
Somerset, UK
System Name Not so complete or overkill - There are others!! Just no room to put! :D
Processor Ryzen Threadripper 3970X
Motherboard Asus Zenith 2 Extreme Alpha
Cooling Lots!! Dual GTX 560 rads with D5 pumps for each rad. One rad for each component
Memory Viper Steel 4 x 16GB DDR4 3600MHz not sure on the timings... Probably still at 2667!! :(
Video Card(s) Asus Strix 3090 with front and rear active full cover water blocks
Storage I'm bound to forget something here - 250GB OS, 2 x 1TB NVME, 2 x 1TB SSD, 4TB SSD, 2 x 8TB HD etc...
Display(s) 3 x Dell 27" S2721DGFA @ 7680 x 1440P @ 144Hz or 165Hz - working on it!!
Case The big Thermaltake that looks like a Case Mods
Audio Device(s) Onboard
Power Supply EVGA 1600W T2
Mouse Corsair thingy
Keyboard Razer something or other....
VR HMD No headset yet
Software Windows 11 OS... Not a fan!!
Benchmark Scores I've actually never benched it!! Too busy with WCG and FAH and not gaming! :( :( Not OC'd it!! :(
Wow, that spiraled out of control quickly... That and 3 days of not being on the forum lol. So apologies for the out-of-date reply..

Do tell. Compute is the only reason I could come up with after seeing these results. Otherwise, it's slower, uses a lot more power, and is quite noisy in this form. Remind me why I should pay the same money for less?

Sorry.. what?

As I said, you could, meaning people have a choice, right or wrong, it's a choice for people to decide what they want to have :)

I was referring to the amount of cash, cards etc. that Nvidia release; they're either giving Intel ideas or just releasing a lot of cards to confuse people... Either way, this thread is the same as an AMD vs Intel one. Everyone has a preference and will voice it high and wide... It's all a choice for whoever buys one :) If it was closer to the £600 mark or even less than that, I'd definitely consider it.. :) Sadly not enough cash to do so right now..

I never mentioned power issues; I'm guessing that's AMD giving the GPU core a little more juice than it needed, just like Vega.. Hopefully it might change in a new card release, whenever that might be, but I don't really tend to worry about power usage... Efficiency is great, but still, we've all had less efficient GPUs in our time... I can remember a few :)

Seems I did mention the power issue as well... So I was kinda right :) (Makes a change! :))
 
Joined
Dec 18, 2015
Messages
142 (0.05/day)
System Name Avell old monster - Workstation T1 - HTPC
Processor i7-3630QM\i7-5960x\Ryzen 3 2200G
Cooling Stock.
Memory 2x4Gb @ 1600Mhz
Video Card(s) HD 7970M \ EVGA GTX 980\ Vega 8
Storage SSD Sandisk Ultra li - 480 GB + 1 TB 5400 RPM WD - 960gb SDD + 2TB HDD
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I will call this card Vega VII, IDK.

People should not be criticizing Vega VII performance based on 1080p and 1440p. It is clearly a 4K card. No one makes a card with 16 GB of VRAM and 1 terabyte per second of bandwidth even for 1440p. That will be a thing for the future, but not for now. Having said that, it's equal to the GTX 1080 Ti and 5-10% short of the RTX 2080, so it's not a failure. Just because it has the most horrible cooler since the R9 290 doesn't mean shit.

Board-partner models with much better coolers and core overclocks will come eventually, at least equaling the RTX 2080 FE.
Its performance numbers dictate it isn't. That type of bandwidth and that amount of VRAM aren't needed at 4K either. It is simply a cut-down Instinct card marketed for gaming. Nobody said it's a failure... (well, maybe one or two people did) but people are disappointed that it is, in general, slower than a 2080, costs the same, uses a lot more power, is noisy, drivers are borked at launch... etc. While I wouldn't call it a failure, there are a few shortcomings when compared to the RTX 2080 at the same price point.
 
Joined
Aug 8, 2016
Messages
85 (0.03/day)
System Name Evo PC
Processor AMD Ryzen 5800X3D
Cooling Noctua NH15
Memory G.Skill Ripjaws 32GB 3,600MHz
Video Card(s) Gigabyte Aorus RX 7900XTX Gaming OC
Storage 1TB WD SN750/512GB Samsung 870 EVO
Display(s) LG UltraGear 27 2K 165Hz
Case Fractal Define C
Audio Device(s) Creative X-Fi Titanium
Power Supply Supernova G2 1000
Mouse MSI Vigor GM11
Keyboard MSI Vigor GK30
In fact, some games in @W1zzard 's review are heavily biased toward nVidia GPUs (Darksiders 3 gives a 30% diff, Civ6 28%, Dragon Quest XI 43% and Hitman 2 35% at 1440p). On the other side, only Strange Brigade gives a 10% diff in favor of the Radeon VII. Without those games, the difference on average would be 9% at 1440p and 5% at 4K. The 2080 would win for sure, but not by that far. Guru3D's review results on average are exactly the ones without those biased games. And some DX12 games are absent as well (Sniper Elite 4). I am not judging the professionalism of our @W1zzard but I am suggesting some changes to his game list that would make the results better balanced.

This is something I have always complained about with those charts: the games that put nVidia at a huge advantage are very easy on resources, running at over 180-250 fps, which triggers Radeon's notorious CPU bottleneck on DX11 draw calls when pushing that many frames, and which makes no difference to playability. The RTX 2080 would still be faster overall, but the gap would be smaller. The point is not to put AMD in a better light, but to use games that are well optimized regardless of platform; games like Dragon Quest XI look like hot garbage, and even a GTX 1050 can max them out.
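For what it's worth, the re-averaging being proposed is easy to sketch. The per-game gaps below are made-up placeholders purely for illustration, not the review's actual numbers; the 25% cutoff is just one possible outlier threshold:

```python
# Sketch of the proposed re-averaging: drop "biased" titles where the
# same-tier gap exceeds 25%, then average the rest. The per-game gaps
# below are illustrative placeholders, not the review's actual numbers.
gaps = {                 # positive = RTX 2080 ahead, in percent
    "Game A": 30, "Game B": 28, "Game C": 43, "Game D": 35,
    "Game E": 5, "Game F": 8, "Game G": -10, "Game H": 12,
}

THRESHOLD = 25  # percent; games beyond this count as outliers

kept = [g for g in gaps.values() if abs(g) <= THRESHOLD]
overall = sum(gaps.values()) / len(gaps)
trimmed = sum(kept) / len(kept)

print(f"all games: {overall:+.1f}%  without outliers: {trimmed:+.1f}%")
```

With these placeholder numbers the headline average shrinks by a large factor once the four outlier titles are excluded, which is the effect being described above.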
 
Joined
Oct 2, 2015
Messages
2,986 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) Gigabyte G27Q + AOC 19'
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse Logitech G203
Keyboard VSG Alnilam
Software Windows 11 x64
It's not Nvidia's fault that AMD never spent money on multi-thread optimizations on their drivers. AMD's performance in DX11 is nothing compared to OpenGL, their performance difference there is abysmal.
 
Joined
Aug 8, 2016
Messages
85 (0.03/day)
System Name Evo PC
Processor AMD Ryzen 5800X3D
Cooling Noctua NH15
Memory G.Skill Ripjaws 32GB 3,600MHz
Video Card(s) Gigabyte Aorus RX 7900XTX Gaming OC
Storage 1TB WD SN750/512GB Samsung 870 EVO
Display(s) LG UltraGear 27 2K 165Hz
Case Fractal Define C
Audio Device(s) Creative X-Fi Titanium
Power Supply Supernova G2 1000
Mouse MSI Vigor GM11
Keyboard MSI Vigor GK30
It's not Nvidia's fault that AMD never spent money on multi-thread optimizations on their drivers. AMD's performance in DX11 is nothing compared to OpenGL, their performance difference there is abysmal.

I am not blaming nVidia; it's just that the GCN architecture itself is forward-looking and the DX11 API is obsolete: it's 10 years old! But GCN already needs a revamp; those 4 shaders per CU and 64 ROPs are hindering its potential, and to a lesser degree, so are the hardware scheduler and the tessellator.
 
Joined
Mar 18, 2015
Messages
2,960 (0.90/day)
Location
Long Island
With 16 GB VRAM, Radeon VII has more memory than any other graphics card below $1000, twice that of RTX 2080, which really makes no measurable difference in any of our tests.

Thanks for that :)
I found the performance per dollar charts really amusing lol.

AMD has just managed to build the second-worst value consumer card ever, beaten only by the 2080 Ti!

And people will still buy them. When you sell them as fast as you can make them, value is not a concept sales managers worry about. Like a shoe made by Jimmy Choo, many folks pay for status; others will pay because having nothing but the best is "their thing". With AMD unable to put anything in this space, you don't have to care about value when you have no competition.


Because Lisa Su was the last person in the industry I trusted, and she lost that trust today. Now I trust none of them: Intel, AMD, or Nvidia. So I might as well just go with classic Intel and Nvidia for my next build. The 9700K is the best gaming chip, especially since I see it for $369, free shipping, no tax, right now.

It's called capitalism .... corporations are bound to serve, better said legally required to maximize profits for, their shareholders (within the law), no one else. A situation where stores can't keep cards in stock and you lower prices ... that's a betrayal of your responsibilities to shareholders.


What? You can't be serious? Why would you not get a likely perfectly good cpu that is going to be likely better than the 9700K because of this GPU? What did you really expect from this GPU? Simple math told you this is where it was going to be. Besides, the i9 9900K is basically the equivalent of this GPU in the CPU space.

Because every time AMD says they will be ... they ain't. And then the fans are all clamoring and exclaiming, "yeah, but it's great at (insert one of the things that 98% of consumers never do)".


i know the person who made them, I will ask him. I don't think he is trying to be misleading though, GND is a small site.

That's actually quite common... it's done to emphasize differences. 98 to 99 looks a lot bigger starting the axis at 95 than at 0 :) ... it's oft argued that using 0 makes the graphic too big.


It's a common problem with AMD across the board; even their CPUs are unnecessarily overvolted.

It's how AMD has competed since the 290x .... they basically overclock the bejeezus out of their cards before boxing them... when the 290x was released, they got loads of press saying how it was faster than the 780 ... but it was a faux victory .... once the web sites had a chance to test those cards, which came clocked out of the box running at 95C, the 780 was the faster card ... Since then, AMD's overclocking headroom has been limited to single digits, whereas nVidia's is almost always double digits, sometimes breaking 30%.


I wanted an RX Vega 64, but wasn't sure about my 3-monitor setup and the 8 GB of RAM, so I have waited for the 16 GB version for more than a year. The reference design has but one problem (besides all the bugs the reviews talk about): the lack of DVI/HDMI support for 3 monitors.

What would be best for a 3 x 1920x1200 Eyefinity setup: a DisplayPort-to-HDMI converter or a DVI-D solution?

BTW, the Danish price is around 5.749 DKK including tax and delivery as of 22 Feb 2019; some are actually in stock, but on sale for as high as 6.500 DKK.

2160p is just 4 x 1080p, so if ya have the memory to run 4K, ya have a third more memory than ya need for 3 x 1080p. This card has 3 DisplayPorts, as do most 2080s.
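The pixel math behind that claim checks out, though VRAM use isn't strictly proportional to pixel count, so treat it only as a rough proxy:

```python
# Pixel-count check: one 2160p panel vs. triple-monitor Eyefinity setups.
res_4k = 3840 * 2160          # one 2160p (4K UHD) panel
res_3x1080 = 3 * 1920 * 1080  # three 1080p panels
res_3x1200 = 3 * 1920 * 1200  # three 1920x1200 panels (the setup asked about)

print(res_4k / res_3x1080)    # 4K has a third more pixels than 3 x 1080p
print(res_4k / res_3x1200)    # and 20% more than 3 x 1920x1200
```

So a card with headroom for 4K has pixel-count headroom for the 3 x 1920x1200 Eyefinity setup as well.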
 
Joined
Oct 2, 2015
Messages
2,986 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) Gigabyte G27Q + AOC 19'
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse Logitech G203
Keyboard VSG Alnilam
Software Windows 11 x64
I am not blaming nVidia, is just that the GCN architecture itself is forward looking and the DX11 API is obsolete, its 10 years old! But GCN already needs a revamp, those 4 shaders per CU and 64 ROPs are hindering the potential, and on lesser degree, the hardware scheduler and the Tessellator.
I remember that during the first GCN releases, GCN was praised for giving priority to DX11 over 9.
DX11 and OpenGL are "obsolete" (for gaming), but not all developers want to "waste time" on the extra work that DX12 and Vulkan and Metal means (and it's A LOT of work), so, for some years, like it was with DX9, DX11 will still be a priority.
Also, there's the issue that Vulkan is based on OpenGL ES, which means it's not as feature-rich (bindless textures, for example), plus whatever is missing from dropping OpenGL compatibility.
 
Joined
Aug 8, 2016
Messages
85 (0.03/day)
System Name Evo PC
Processor AMD Ryzen 5800X3D
Cooling Noctua NH15
Memory G.Skill Ripjaws 32GB 3,600MHz
Video Card(s) Gigabyte Aorus RX 7900XTX Gaming OC
Storage 1TB WD SN750/512GB Samsung 870 EVO
Display(s) LG UltraGear 27 2K 165Hz
Case Fractal Define C
Audio Device(s) Creative X-Fi Titanium
Power Supply Supernova G2 1000
Mouse MSI Vigor GM11
Keyboard MSI Vigor GK30
I remember that during the first GCN releases, GCN was praised for giving priority to DX11 over 9.
DX11 and OpenGL are "obsolete" (for gaming), but not all developers want to "waste time" on the extra work that DX12 and Vulkan and Metal means (and it's A LOT of work), so, for some years, like it was with DX9, DX11 will still be a priority.
Also, there's the issue that Vulkan is based on OpenGL ES, which means it's not as feature-rich (bindless textures, for example), plus whatever is missing from dropping OpenGL compatibility.

Vulkan is based on Mantle. Unless I am wrong, Mantle is not based on OpenGL, and it is even programmed differently.
 
Joined
Mar 18, 2015
Messages
2,960 (0.90/day)
Location
Long Island
I won't comment on the performance very much. RTX cards are very fast, so it was expected that Radeon VII would look relatively worse than Vega 64 did (Vega 64 matched the 1080).

There are two more important things to take away:

1) The noise level is unacceptable
Is it the loudest card available today? Certainly near the "top". And it's an expensive, high-end model with 3 fans.
I wonder how AMD supporters feel about it? Do they think AMD lost contact with reality? Or maybe AMD simply doesn't respect their customers?

2) Many people kept saying that 7nm is the cure for all AMD problems. We get the first product and it clearly hasn't helped a lot.
Of course this node will get better over time, but AMD doesn't have time - Zen 2 is around the corner.

1. The noise:

@ Idle: AMD VII noise = 27 dBA / MSI 2080 Gaming = 0 dBA.
@ Load: AMD VII noise = 43 dBA / MSI 2080 Gaming = 36 dBA ... that's 1.625 times as loud .... the reference nVidia card is quieter than MSI's factory-OC'd version.

2. It's been a long while since we have seen products from AMD competing at the top end in the gaming segment, that's why all the pre-launch talk is about 7nm and # of cores ... and no talk about results. Mantle was gonna change everything, HBM was gonna change everything, 7 nm was gonna change everything. The "pump and dump" crowd use this to rake in millions from uninformed investors and it keeps the name in the news. In the end the new machine's value remains tied to the performance it delivers (increase in actual user productivity or fps) , versus the cost of putting it on the desk (everything inside the box or connected to it.)

We could very well see significant improvements from the AIB cards, or even after what I call the 2nd beta period (the first 3 months after release) when most problems are addressed, but given the starting point we are looking at, I see this as a $550 card.


It is worth mentioning that this is the first "top" model from AMD in recent years that doesn't come close to its Nvidia counterpart in performance, so in essence we can call this their largest fail yet.

AMD last took the title with the 290x .... it lasted about a week. They lost it in the press when the 780 Ti dropped a week later .... but they actually lost it before that, when the web sites compared the 290x OC'd with the 780 OC'd.... because of the decision to aggressively OC the 2xx-series cards before putting them in the box, they lost the title once those cards were tested.


Just read a lot of reviews from multiple sites. The consensus is fairly clear: this is a statement card, a statement to the market that says AMD is not done making consumer GPUs yet. And that is it. Value is not good at the current price ($599 would be the sweet spot).

I won't contest the $599, though $550 seems more appropriate to my eyes, considering the PSU should be 100 watts larger.... you will need an extra fan to keep the case interior at the same temps. But unless the AIB folks get that noise down, that's a deal breaker ... they should include a set of 30-foot cables so the PC could be placed in another room. As for the statement ... statements get made every day ... look at politics ... the question is, is anyone taking it seriously? If the price stays at $700, I have to say no.


While I don't care what brand I buy (I buy whatever has the best price/performance/power consumption), if you buy a gaming card either from AMD or NVIDIA they are all crap for any serious CAD work and won't accelerate anything, for that you need FirePro or Quadro. With NVIDIA at least you can use CUDA even on gaming cards.

That's a common misconception, at least in the construction fields. I have been a practicing engineer for over 40 years and started my own consulting business in 1990. Because we couldn't get what we wanted without a $2,000 markup, we built our own boxes, for us and others.

The fact is AutoCAD is almost entirely single-threaded; there's no need for, or benefit from, 6, 8, 10, or more CPU cores because the program can't use them. In addition, FirePro / Quadro are historically poor performers up against GTX cards when doing 2D and 3D CAD work. I'm not saying that a $2,000 / $4,000 FirePro / Quadro won't beat a $600 GTX card, but they win by less than 1% in 2D CAD and actually lose by a significant margin in 3D CAD. Quadro and FirePro excel at modeling, rendering and animation. Radeon has often taken the top spot over all comers in AutoDesk Inventor. Maya most often favors Quadro followed by FirePro, and the same goes for Solidworks.

Most architect / engineer offices I visit that have, say, 6 - 10 CAD stations will typically have just one on Quadro for the rendering stuff and the rest on straight 2D / 3D CAD. The one reason for using Quadro in this environment ... without it, no AutoDesk support. When visiting another office some years ago, after a meeting and a lunch, I went back to the office and one of the folks was asking me about AutoCAD performance tweaks. I sat down at his workstation (Quadro) and maneuvered my way around the drawing ... then opened it on my lappie (GTX) and it was faster. Like anything inside a PC case, it's wise to match your hardware and budget to your specific software requirements.
 
Joined
Oct 2, 2015
Messages
2,986 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) Gigabyte G27Q + AOC 19'
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse Logitech G203
Keyboard VSG Alnilam
Software Windows 11 x64
Vulkan is based on Mantle. Unless I am wrong, Mantle is not based on OpenGL, and it is even programmed differently.
Vulkan took the "command list" approach from Mantle, but it is still Khronos work (like OpenGL). If Vulkan were only Mantle (a GCN-specific API), it wouldn't work on Nvidia and Intel too.
 
Joined
Jun 28, 2016
Messages
3,595 (1.27/day)
1. The noise:

@ Idle: AMD VII noise = 27 dBA / MSI 2080 Gaming = 0 dBA.
@ Load: AMD VII noise = 43 dBA / MSI 2080 Gaming = 36 dBA ... that's 1.625 times as loud .... the reference nVidia card is quieter than MSI's factory-OC'd version.
1.625 times what? The power of the source? Certainly not. :-D
 
Joined
Feb 3, 2017
Messages
3,475 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
This is something I have always complained about with those charts: the games that put nVidia at a huge advantage are very easy on resources, running at over 180-250 fps, which triggers Radeon's notorious CPU bottleneck on DX11 draw calls when pushing that many frames, and which makes no difference to playability.
So, Darksiders 3, Civ6, Dragon Quest XI, Hitman 2, right?
At 1440p, Radeon VII does 77.3, 88.9, 93.1 and 61.5 FPS average in these games.
At 2160p, Radeon VII does 39.3, 79.0, 44.4, 39.7.
What are you talking about? These are very far from 180-250 FPS.

I am simply suggesting not having 3 games using the same engine (Unreal Engine 4), with 2 of them showing much worse performance for a specific brand (only Senua's Sacrifice is optimised well for both AMD and nVidia out of those 3). Outliers aren't good for objectivity. And I don't think Civ6 is neutral in how it performs for AMD. But the solution could be for @W1zzard to show the performance of all games and count only the unbiased ones (<25% diff for same-tier GPUs) in the summary. A win-win scenario for all, imho.
Unreal Engine is by far the most popular engine around. One could argue that at 3 out of 21 it is underrepresented.
There are biased games and there always will be. DX12 and Vulkan will make that a lot worse; with lower-level APIs, developers will write engines and games that end up being more tuned to specific architectures.
 
Joined
Jun 28, 2016
Messages
3,595 (1.27/day)
Yes, I believe so. Ask AMD why? I'm curious as to how they'd answer that question lol
If the rumors about ~5000 pieces being made are true, then there's no point in asking "why". Who cares? Most of us won't even see this card in person.

If they decide to make more of them, then the answer is pretty obvious. I've given it a few times already in other topics.
No one buys Radeon Instinct
AMD orders HBM possibly a year or so in advance (maybe someone here knows?), but they ended up with an awful GPU. What do you expect them to do? Throw it in the bin? They do whatever they can to sell the inventory.
In this case: they rebrand a GPGPU accelerator as a gaming card. Not a very efficient one, but with enough oomph to pull this off (performance-wise).
Mind you, it's still awful as consumer electronics.
 
Joined
Jun 28, 2016
Messages
3,595 (1.27/day)
In psychoacoustics, 7 dB more is ~1.625 times as loud.
So first of all: I would love to see a text that describes how "n times more loud" is measured. Any source?
Of course I know we don't hear the power increase (which in this case is ~5), but where does the 1.625 come from?

Second: even assuming something called "psychoacoustics" works, what you said may only be correct for some reference levels.
Certainly, 30dB vs 23dB and 100dB vs 93dB have different values of perceived loudness increase. :)
 
Joined
Jan 15, 2015
Messages
362 (0.11/day)
I wasn't aware you made the rules regarding how I may measure the card. So it is either good or bad based upon exactly every criteria? So noise being bad and mentioning it as such still means card is bad overall, even though that maybe one can work around the noise in turn to enjoy the performance of the card if it is good? It sounds more like you are being pretentious.

May be obvious to you. Maybe not to others. I will take your constructive criticism into consideration next time I decide to measure how good or bad a device is.
The first sentence is a straw man. I didn't make the rules about logical expression with the English language, which is what my post was about. Logical expression dictates, regardless of your feelings (i.e. the pretentiousness complaint), that a person can only take one position at a time, not two contradictory positions simultaneously.

When a person says a card is loud and that they can't stand noise, that person is clearly saying noise is unacceptable to them and that the card is loud. This means that the card is included in the realm of things the person cannot tolerate. This means that, for that person, the card is not good (also known as unacceptable).

Performance encompasses all aspects of how a product performs, including the noise it produces. Furthermore, regardless of how many FPS it delivers, you already declared that the product belongs to the category of unacceptable products due to its noise output and your lack of tolerance for noise. Draw a Venn diagram.

Finally, the accusation about pretentiousness is an example of concern trolling (tone complaint). It's a distraction from the point and an example of the ad hominem fallacy.

So first of all: I would love to see a text that describes how "n times more loud" is measured. Any source?
Of course I know we don't hear the power increase (which in this case is ~5), but where does the 1.625 come from?

Second: even assuming something called "psychoacoustics" works, what you said may only be correct for some reference levels.
Certainly, 30dB vs 23dB and 100dB vs 93dB have different values of perceived loudness increase. :)
It seems that this site has all the data anyone here could possibly want on this subject:

http://www.sengpielaudio.com/calculator-levelchange.htm
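The 1.625 figure being debated follows from the common rule of thumb (the one that calculator uses) that perceived loudness roughly doubles per +10 dB, while sound power grows tenfold per +10 dB. A quick sketch, assuming that approximation holds at these levels:

```python
# Rule-of-thumb loudness math for a level difference in dB.
# Perceived loudness is commonly approximated as doubling per +10 dB;
# sound power grows by a factor of 10 per +10 dB.
delta_db = 43 - 36  # load noise: Radeon VII vs. the MSI 2080 in the quote

loudness_ratio = 2 ** (delta_db / 10)   # ~1.62x "as loud" perceptually
power_ratio = 10 ** (delta_db / 10)     # ~5x the sound power

print(f"{loudness_ratio:.3f}x perceived, {power_ratio:.2f}x power")
```

This also explains the "~5" power factor mentioned earlier in the thread: both numbers come from the same 7 dB difference, just mapped through different scales.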
 