
AMD to Unlock Professional Features for Radeon VII to Blunt RTX 2080's Ray-tracing Edge

I'm sure that somebody is happy to get 4 TFLOPS of double precision compute power. That won't be gamers though.
So what do you think of Nvidia's double precision then, at 1/32 rate on the 2080? That wouldn't even be in the TFLOPS range, so your argument is moot. 4 TFLOPS of DP makes it the fastest consumer card on the market for DP, AFAIK.

At least there are more than two apps that use DP, eh.
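For the curious, the peak numbers thrown around here follow from a simple formula: FP32 throughput is two FLOPs (one FMA) per shader per clock, and FP64 is that divided by the card's FP64:FP32 ratio. A rough sketch in Python — the shader counts and boost clocks below are commonly cited specs, not official figures, so treat them as assumptions:

```python
def peak_tflops(shader_units, boost_clock_mhz, fp64_ratio):
    """Theoretical peak throughput: 2 FLOPs (one FMA) per shader per clock,
    with FP64 scaled down by the card's FP64:FP32 rate."""
    fp32 = 2 * shader_units * boost_clock_mhz / 1e6  # TFLOPS
    return fp32, fp32 * fp64_ratio

# Radeon VII with the unlocked 1:4 FP64 rate (assumed specs):
r7_fp32, r7_fp64 = peak_tflops(3840, 1750, 1 / 4)    # roughly 13.4 / 3.4 TFLOPS
# RTX 2080 at its 1:32 FP64 rate (assumed specs):
rtx_fp32, rtx_fp64 = peak_tflops(2944, 1710, 1 / 32)  # FP64 lands around 0.3 TFLOPS
```

Which is the point being made above: at 1:32, the 2080's double precision is an order of magnitude below the Radeon VII's.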
 
Fermi-generation Quadro 5000 and 6000 cards could enable ECC mode and lose 12.5% of memory space at the same time.
If the cards prove to be stable in professional production, then they will be very popular with pro people who are used to paying a lot for their cards.
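That 12.5% figure comes from inline ("soft") ECC: the check bits live in the same DRAM as the data, so one byte in eight is sacrificed. A trivial sketch (the 6 GB capacity below is the Quadro 6000's commonly cited spec, an assumption on my part):

```python
def usable_after_inline_ecc(total_gb, overhead=1 / 8):
    """Inline ECC carves its check bits out of the card's own DRAM,
    costing 1/8 (12.5%) of the visible capacity."""
    return total_gb * (1 - overhead)

print(usable_after_inline_ecc(6.0))  # 6 GB card -> 5.25 GB usable with ECC on
```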
 
I won't complain about more features, especially since they only need a software update, but as someone pointed out in another thread, this card has a serious identity crisis. Adding more pro-level features to a supposedly consumer product won't help clear that up.
I don't really think we can speak of a consumer product at $700 for what is only second tier in the GPU list.
 
Meanwhile, in an AI lab somewhere, scientists are playing network Doom on some Radeon Instinct accelerator cards.
Meanwhile, in an AI lab somewhere, scientists are checking whether Radeon Instinct / VII supports CUDA. No, it doesn't.
Nice move IMHO. They can place the Radeon VII as a successor to the Vega FE, which cost over $1,000 at launch. And at 30% more performance for 30% less money, the value for money is strong for whoever needs it.
With the tiny issue that Radeon VII is not a card for professional use. And this move by AMD doesn't change much.
Do you really not know why they do it?
It's meant to attract this very specific kind of customer... who will be so amazed by the "pro features", that he'll buy this for gaming. We have lots of candidates on this forum :-)
 
Meanwhile, in an AI lab somewhere, scientists are checking whether Radeon Instinct / VII supports CUDA. No, it doesn't.

With the tiny issue that Radeon VII is not a card for professional use. And this move by AMD doesn't change much.
Do you really not know why they do it?
It's meant to attract this very specific kind of customer... who will be so amazed by the "pro features", that he'll buy this for gaming. We have lots of candidates on this forum :)
You seem blinded to its use cases and veiled in green. Why argue with you, eh? You still got that 1050 Ti?
 
You seem blinded to its use cases and veiled in green. Why argue with you, eh? You still got that 1050 Ti?
Man, and here I almost believed you'd grown up or found the pills. What's wrong with you?

We're discussing professional use of GPUs and you're still using the argument of what I own (the same one for almost a year).

As for use cases: I'm replying to a particular post that mentions AI. And the case of AI is pretty simple: CUDA dominates it. There's really no point in getting a Radeon.
 
 
I don't think it has ECC, as @xkm1948 has mentioned. That makes it unsuitable for some specific tasks. The Titan V card had this issue as well (I think). ECC is what makes the actual compute cards specifically suited to those workloads.

Yep.

You cannot publish in good journals or use it in biomedical research without stating that the computation hardware has ECC support. It is not black and white, but it is seriously frowned upon. Well, at least in the field of genomics/epigenomics research that I work in.

Meanwhile, in an AI lab somewhere, scientists are checking whether Radeon Instinct / VII supports CUDA. No, it doesn't.

With the tiny issue that Radeon VII is not a card for professional use. And this move by AMD doesn't change much.
Do you really not know why they do it?
It's meant to attract this very specific kind of customer... who will be so amazed by the "pro features", that he'll buy this for gaming. We have lots of candidates on this forum :)


I don't get why one specific member is constantly downvoting you for stating a fact.

In scientific research, CUDA absolutely DOMINATES. Scientists are there to solve medical problems, not to spend day after day making the software work.

Take a look at NCBI PubMed publications with "CUDA" in the keywords, as it partially reflects the amount of work done/developed with CUDA:

https://www.ncbi.nlm.nih.gov/pubmed/?term=cuda

711 biology-related publications


OpenCL only:

https://www.ncbi.nlm.nih.gov/pubmed?term=(opencl) NOT cuda

58 publications.
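Those counts are easy to reproduce programmatically: NCBI exposes PubMed search through its E-utilities API, and the esearch endpoint with rettype=count returns XML whose Count element holds the number of matching records. A minimal sketch that just builds the query URLs (no network call, so the counts themselves are not baked in here):

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count_url(term):
    """Build an E-utilities esearch URL; with rettype=count the XML
    response carries only the <Count> of matching PubMed records."""
    return EUTILS + "?" + urlencode({"db": "pubmed", "term": term,
                                     "rettype": "count"})

cuda_url = pubmed_count_url("cuda")
opencl_url = pubmed_count_url("(opencl) NOT cuda")
```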


In my specific department, literally NO ONE buys an AMD GPU for professional work, which includes, but is not limited to, protein docking simulation and genome alignment.
 
Man, and here I almost believed you'd grown up or found the pills. What's wrong with you?

We're discussing professional use of GPUs and you're still using the argument of what I own (the same one for almost a year).

As for use cases: I'm replying to a particular post that mentions AI. And the case of AI is pretty simple: CUDA dominates it. There's really no point in getting a Radeon.
Yes, it might have, but the pricing pushed customers to make their own (Google's Tensor chips, for one: $$$$$), and AMD's take on it with the VII is capable of 8-bit RPM; I'm sure I read 4-bit too (think how many 8-bit ops fit packed into a 64-bit register; think AI quadratics now possible more easily, accelerated). That's a lot of math in a 64-wide wavefront of stream processors, and a potent AI tool.

Point: price.
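The packed-math idea is easy to illustrate: several narrow operands share one wide register, and a single operation acts on all lanes at once. A toy Python sketch of the concept — this is hand-rolled SIMD-within-a-register for illustration, not AMD's actual RPM instructions:

```python
def pack4_u8(a, b, c, d):
    """Pack four 8-bit values into one 32-bit word, the way packed math
    fits several small operands into a single wide register."""
    return a | (b << 8) | (c << 16) | (d << 24)

def unpack4_u8(w):
    """Split a 32-bit word back into its four 8-bit lanes."""
    return [(w >> s) & 0xFF for s in (0, 8, 16, 24)]

w = pack4_u8(1, 2, 3, 4)
# Four independent 8-bit adds expressed as one wide addition (no carries
# cross lanes as long as each lane result stays below 256):
w2 = w + pack4_u8(10, 20, 30, 40)
print(unpack4_u8(w2))  # [11, 22, 33, 44]
```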
 
I dont get it why one specific member is constantly down voting you for stating a fact.
I thought that was pretty obvious. Because he probably has no knowledge of the topic at hand, and because what I say (be it true or not) hurts his feelings.

I don't expect gamers on a gaming-focused forum to know much about professional or scientific computing. It shouldn't be important to them. And I'm not running around topics with gaming benchmarks saying "but it sucks in AI".
But when we touch on other topics - computing, accelerating productivity software - why not simply be honest? :-)
This is a community after all, and last time I checked, communities were about sharing knowledge and experience, not getting votes. If I cared about votes, I'd be active on Instagram or Tinder instead. ;-)

Yes, it might have, but the pricing pushed customers to make their own, and AMD's take on it with the VII is capable of 8-bit RPM; I'm sure I read 4-bit too. That's a lot of math in a 64-wide wavefront of stream processors, and a potent AI tool.

Point: price.
You don't understand costs and software development any more than I understand your quasi-English.
As far as computing goes, Nvidia GPUs cost more to buy, but they cost less to use. That's the whole story.
 
I thought that was pretty obvious. Because he probably has no knowledge of the topic at hand, and because what I say (be it true or not) hurts his feelings.

I don't expect gamers on a gaming-focused forum to know much about professional or scientific computing. It shouldn't be important to them. And I'm not running around topics with gaming benchmarks saying "but it sucks in AI".
But when we touch on other topics - computing, accelerating productivity software - why not simply be honest? :)
This is a community after all, and last time I checked, communities were about sharing knowledge and experience, not getting votes. If I cared about votes, I'd be active on Instagram or Tinder instead. ;-)


You don't understand costs and software development any more than I understand your quasi-English.
As far as computing goes, Nvidia GPUs cost more to buy, but they cost less to use. That's the whole story.
Your opinion is not mine; deal with it. Not once have you spoken anything but bullshit, and not once have you replied when I corrected your alarming lack of knowledge of RPM.

And the irony: I've been running scientific research flat out for years.

And I can show equally impressive amounts of legitimate scientific work done on AMD hardware; it's as if OpenCL and Vulkan don't exist, as does reality.

AI does not need ECC.

I fully understand where ECC is and is not needed in scientific research; the same goes for precision. I'm not arguing against some things being beyond the Radeon VII, but I'm not delusional in the opposite direction either.

And as for the GPU, I put my money where my mouth was and used it to do scientific research most of the time, with slight bouts of gaming. I accept I'm a minority, but I'm not among those who mouth off without buying their revered brand and doing something with it.

You're on here saying I can't use my Vega 64 or a VII for pro uses, yet I think I still will. I'm not getting a VII though; no money, tut. :)
 
AMD gets bashed in reviews for releasing a compute-oriented card against GeForce RTX, then adds more compute to "blunt the RTX's edge".
Makes sense to some, apparently.
 
Nice move IMHO. They can place the Radeon VII as a successor to the Vega FE, which cost over $1,000 at launch. And at 30% more performance for 30% less money, the value for money is strong for whoever needs it.
This is something AMD should have done back at CES.
This card made no sense without pro driver support.
 
This is something AMD should have done back at CES.
This card made no sense without pro driver support.
It still makes no sense.
Performance is there and now so are the features (at least some). But this is still not a workstation GPU.
Maybe it would make sense plugged into a server, but clearly not in a desktop.

Also, it still lacks pro support and stuff. I don't think any serious IT department would allow this into an office. There are policies for such things.
 
Also, it still lacks pro support and stuff. I don't think any serious IT department would allow this into an office. There are policies for such things.
That is what Quadros / Radeon Pro cards are for.
I doubt there's much non-Nvidia or non-Intel hardware in big corporations' IT departments anyway.
Those require no explanation to your boss when something goes wrong, etc.
IT department is often the first to go when there are lay-offs, so I doubt many would take the "risks" either way.

I am thinking more from the hobbyist/freelancer point of view.
 
Funny how Nvidia got bashed by some people, when they unlocked certain pro features in Titan Xp.

"Its an evil company, crippling the GPUs deliberately...", " I will never buy Nvidia again..."

Now, AMD does the same thing, and everyone applauds. Go figure...
Funny? It's not even surprising. At least not to me.
 
It's a 1080 Ti rival.

Bitter much? Don't be a r3+@#

I've had enough of people trying to justify that AMD performs better at this price point. People are too fucking dumb to remember things nowadays: a few months ago, they criticized Nvidia over and over and over again about its price-to-performance ratio; now this GPU comes along and all the fanboys are loving it. This is pure and utter bullshit. Nvidia brought many different innovations to the market; I wonder what AMD accomplished in the last 2 years? Don't get me started on the 7 nm process and HBM2; neither is AMD's own technology (7 nm is TSMC's, HBM2 is Samsung's).

This is not a gaming card. If you want to compare it to something else, compare it to a Quadro, because it has a repurposed chip from the MI50 with ECC memory. AMD is losing money just to compete with Nvidia, and investors will soon be aware of it. If and when I get one of these cards, I'll be using it to test my ROCm platform; until then, this card is just a compute card, like all other AMD cards: brute-force performance with extreme power consumption.

All of AMD's processors, since the beginning of their time, have consumed much more power compared to those from more innovative companies. They must realize that this needs to stop somewhere.
 
It's what AMD is capable of delivering at the moment. Unimpressive, but kicking it won't change much; they have nothing left in the tank to impress us. Anyone who saw how Pascal massacred Vega at the high end should have seen it coming with the introduction of the more advanced Turing. Though TBH, the way this card clocks at 7 nm is absolutely pathetic, and I think everyone expected more.
 
It's what AMD is capable of delivering at the moment. Unimpressive, but kicking it won't change much; they have nothing left in the tank to impress us. Anyone who saw how Pascal massacred Vega at the high end should have seen it coming with the introduction of the more advanced Turing. Though TBH, the way this card clocks at 7 nm is absolutely pathetic, and I think everyone expected more.

I am fine with what AMD can offer. I guess the majority of people are simply annoyed by the mental gymnastics some crazed AMD fans are performing right now to spin a not-so-desirable product into a win.
 