vega IS far better suited for (professional) compute applications
Yes, that's what I said. It's a bit odd to try to make a point against me by repeating what I said like that. Is a snarky tone supposed to make it mean something different? This is a fact, but it doesn't mean Vega can't work for gaming. I was very surprised to see Vega 20 arrive as a gaming card, but apparently it performs well there too. Nothing weird about that.
this is not about what AMD feels
Considering they're the only ones with any knowledge of how this chip performs in various workloads, it very much is. Launching it is also 100% AMD's decision, so, again, this comes down to how AMD feels about it.
memory bandwidth gives nothing to gaming performance on vega
Source?
it is not double the ROPs, they are still 64
That seems to be true, given the coverage of how this was an error in initial reports. My post was made before this was published, though.
it's not really 7 nm, actual size is closer to intel's 10 nm.
Node names have been largely unrelated to feature size in recent years - there's nothing 10nm about Intel's 10nm either - but this is still a full node shrink from 14/12nm. Intel being slightly more conservative with node naming doesn't change that.
I think I know why he says freesync doesn't work "at all" on "all" monitors, I suspect he considers the concept of a monitor not running the same hertz range in freesync as it does without it.
If that's the case, he should look up the definition of "work". At best that's stretching the truth significantly; I'd say it's an outright lie. What you're describing there is not "not working".
which makes him right about that since it is a lie sold by specifications and makes people believe freesync is full range, this says something to many people here that say AMD is not as dirty as nvidia...
Why is that? AMD helped create an open standard. How the standard is implemented is not up to them, as long as its rules are adhered to (and even then, it's VESA that owns the standard, not AMD). AMD has no power to police how monitor manufacturers implement FreeSync - which is how it should be, unlike Nvidia's cost-adding gatekeeping. AMD has never promised more than tear-free gaming (unless you're talking about FreeSync 2, which is another thing entirely), and that they've delivered. AMD even maintains a very helpful page dedicated to giving accurate information on the VRR range of every single FreeSync monitor.
Also, your standard for "as dirty as" seems... uneven, to say the least. On the one hand, we have "created an open standard and didn't include strict quality requirements or enforce them as a condition for use of the brand name", while on the other we have (for example) "attempted to use their dominant market share to strong-arm production partners out of using their most famous/liked/respected/recognized brand names for products from their main competitor". Do those look equal to you?