
AMD Reports Theft of Graphics IP, Stolen Information Not Core to Competitiveness

For some reason I really do hope it's a stupid script kiddie who got lucky, gets a life-crippling punishment for it, and sets an example for all the other kiddies as to how the world really works.
 

I agree. For some reason this makes me think of some kid being carted away in handcuffs, repeatedly saying "I'm famous!".
 
Someone is going to get sacked....

But this IP still holds value, like it or not. If someone wanted to create a GPU, they could technically use it to remake a similar version.

I mean, holding IP on a product still doesn't stop vendors from creating products that look alike or function alike.
 
Not exactly. It's only needed to curb the performance impact, because you only need full-rate shading to preserve detail at the far end of the scene when ray tracing. Ray tracing is good when a few rays can make up for rasterization's drawbacks. Tessellation was also underscored in this sampling-theory area: you get better results when the samples compound well.
One best-case scenario is ray tracing the near field versus rasterizing the far end. Rays scale across the GPU, which is always good for business.


That was what I was against. Walls aren't higher-res just because you render more; it's about the sample distribution. It has to follow the detail curve, which ray tracing does natively.
In rasterization you render in a similar pattern, but since the weight is on reads, it curbs performance. You need roughly equal read:write weights for optimal scalability, and ray tracing has that. Rasterized graphics have a low cutoff point. Think of it as 512 rays versus 512x AA sampling.
Variable rate shading should have the biggest impact at higher resolutions or AA settings, where the performance-to-image-quality trade-off will be least noticed. I don't think it will do much for the 1080p crowd, but then again it could help push higher fps for the e-league, high-refresh-rate-at-all-costs crowd who don't care how ugly it looks, so it's a win-win scenario really.
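If it helps, here's roughly what that looks like in practice: a minimal sketch of per-draw (Tier 1) variable rate shading through the D3D12 API. The SetCoarseShadingForPeriphery wrapper and its highDetail flag are made up for illustration:

```cpp
#include <d3d12.h>

// Tier 1 VRS: one shading rate applies to an entire draw call.
// D3D12_SHADING_RATE_1X1 is full-rate shading; 2X2 shades one result per
// 2x2 pixel quad, cutting pixel-shader invocations to ~25% for that draw.
void SetCoarseShadingForPeriphery(ID3D12GraphicsCommandList5* cmdList, bool highDetail)
{
    D3D12_SHADING_RATE rate = highDetail ? D3D12_SHADING_RATE_1X1
                                         : D3D12_SHADING_RATE_2X2;
    cmdList->RSSetShadingRate(rate, nullptr); // null combiners = use this rate as-is
}
```

At high output resolutions the 2x2 quads are small enough on screen that the coarser shading is hard to spot, which is why the trade-off is least noticeable there.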
 
Variable rate shading should have the biggest impact at higher resolutions or AA settings, where the performance-to-image-quality trade-off will be least noticed.
Apropos, read the sentence again. That is an oxymoron, I think. Don't believe me? You say it yourself: 'performance'.
I am a hobbyist AA enthusiast. I try to read a lot on it, since I don't understand much about computer graphics - that much I do admit. Precisely 100% of my GPU purchases have been based on AA; I would literally trade all the benefits of higher resolution just to get the same image clarity a CRT had. So be careful what you say. I'm a fragile relic, but even a broken clock is right twice a day. Hardware and software developers together are pooling 100% of their efforts to reach the tonality that CRT tube TVs had.
Resolution is a post-truth symptom suffered by the millennials, in my opinion, and I have the industry's projected development goals to back me. As you would say, all your base are belong to us. :)
 

If calculated to the second, assuming a modern digital clock (or even an analogue one with a second hand), a broken clock is correct for about 0.002% of every day. And even then, scientifically speaking, a broken measurement system is never correct; the coincidence of time passing just makes it appear so. But I know it's just a saying.
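For what it's worth, the arithmetic checks out, assuming a 12-hour dial read to the second:

```cpp
#include <cstdio>

int main()
{
    // A stopped 12-hour clock shows one fixed time, so real time matches it
    // twice per day, out of 86,400 seconds in a day.
    const double percentCorrect = 2.0 / 86400.0 * 100.0;
    std::printf("correct for %.4f%% of each day\n", percentCorrect); // ~0.0023%
    return 0;
}
```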

However, you say you buy all your GPUs based on AA, but pixelation is just another visual distortion created by technology. CRTs had 'blur' around the phosphor dots - it's how they created colour - but even a dot is not truly observed as reality in terms of what a human sees. I'm only saying this because you stated that 'Resolution is a post-truth symptom suffered by the millennials, in my opinion...'. The get-out there is that you stated it's an opinion, but resolution is the means to reduce aliasing by creating ever smaller pixels from which to create straighter, razor-sharp lines. As for industry development, we're moving to 4K, to 8K and onwards, all for greater visual fidelity. So resolution is the momentum of the industry.

CRTs were never awesome at anti-aliasing; they were superb for colour clarity and minimal input lag. But even a CRT had a pixel limit by design.

As far as variable rate shading goes, it's just another system to reduce the processing power spent on items that don't require substantive detail, thereby freeing up resources to render the remainder of the scene.

There are other software fixes for increased performance related to the focal field of view (similar to how our own vision works), reducing resource intensity in peripheral areas of vision - especially with VR, I think.
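To make that foveated idea concrete, here's a hypothetical sketch of choosing a coarser shading rate as you move away from the gaze point; the function name and thresholds are invented for illustration:

```cpp
#include <cmath>

// Map a pixel's distance from the gaze point to a shading-rate divisor:
// 1 = full rate (1x1), 2 = one shade per 2x2 block, 4 = one per 4x4 block.
int ShadingRateDivisor(float pixelX, float pixelY, float gazeX, float gazeY)
{
    const float dist = std::hypot(pixelX - gazeX, pixelY - gazeY); // pixels from fovea
    if (dist < 200.0f) return 1; // foveal region: keep full detail
    if (dist < 600.0f) return 2; // mid periphery: coarser shading
    return 4;                    // far periphery: coarsest shading
}
```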
 
Someone is going to get sacked....

But this IP still holds value, like it or not. If someone wanted to create a GPU, they could technically use it to remake a similar version.

I mean, holding IP on a product still doesn't stop vendors from creating products that look alike or function alike.
It depends on what the files actually are - there are two different claims: some say they're Verilog (HDL) files of individual features, and some say they're microcode and the like used in testing.
Neither could be used to make the same or a similar GPU. In fact, neither should be of any real interest to anyone or any company: outsiders don't know AMD's internal design language well enough to make use of the Verilog files, and if it's the microcode option, it should be just as useless.
 
resolution is the means to reduce aliasing by creating ever smaller pixels from which to create straighter, razor-sharp lines.
No sir, resolution is the brute-force method by which we bring infinitely sharp lines down to sample parity at our display resolution. It makes no fine distinction about how that abstract target is reached. Ray tracing is open about that: 500 rays per pixel is as good as 500-sample antialiasing. You don't do 500x AA because its cost-to-benefit ratio would be poor on rasterization hardware, which is why analytical AA filters were developed - to sample closer to parity with the digital assets before the downscale to display resolution occurs.
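As a minimal sketch of that sample-parity point (the names here are illustrative, not any real API): whether the N samples for a pixel come from N rays or from an N-sample supersampling grid, the pixel resolves the same way, by averaging:

```cpp
#include <cstddef>

struct Color { float r, g, b; };

// sampleFn stands in for either source of samples: the i-th ray's shaded hit,
// or the i-th subpixel sample of a supersampled raster image.
template <typename SampleFn>
Color ResolvePixel(SampleFn sampleFn, std::size_t n)
{
    Color sum{0.0f, 0.0f, 0.0f};
    for (std::size_t i = 0; i < n; ++i) {
        const Color c = sampleFn(i);
        sum.r += c.r; sum.g += c.g; sum.b += c.b;
    }
    return {sum.r / n, sum.g / n, sum.b / n}; // 500 rays ~ 500x AA, cost aside
}
```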

As far as variable rate shading goes, it's just another system to reduce the processing power spent on items that don't require substantive detail,
That's the other way around: graphics resort to variable rate shading because shading under-delivers. You don't fix something unless it's broken.

As for industry development, we're moving to 4K, to 8K and onwards, all for greater visual fidelity. So resolution is the momentum of the industry.
Yes, I have seen it quoted a couple of times, but hear me out. When your idols backed out of that claim by bringing in ray tracing first, do you consider that the goals you love and care about have been bait-and-switched? For starters, ray tracing does not have the helper-pixel artifacts commonly noticed by astute CRT lovers such as myself. You cannot fool me. I know what I see. :)
 
When your idols backed out of that claim...

Steady there, fella. They're not my idols. But I'll leave you to your views.
 
How did this turn into a CRT vs LCD/LED discussion? As far as that goes, on the colour issue, LCD/LED/OLED panels have made improvements over the years and will continue doing so. CRTs were a blurry mess; I don't exactly miss them, personally. It would be interesting to see how they'd look today on modern hardware that could offset their negative points, but that's another subject. It's all rather unrelated to variable rate shading, however, so I'm not sure what you were getting at.
 
LCDs cannot distinguish between near hues when there is motion on the screen - near hues being the antialiasing effect, which you could call ray tracing by extension. I wouldn't lump OLEDs into the same category, since they don't suffer the same motion hue loss.
As for this...
How did this turn into a CRT vs LCD/LED discussion?
They are a relic of the past. You can take it as you will.
 