Monday, January 31st 2011

NVIDIA Developing Subpixel Reconstruction Antialiasing to Compete with MLAA

NVIDIA is developing its own antialiasing (AA) technology to rival morphological antialiasing (MLAA). Subpixel reconstruction antialiasing (SRAA), which NVIDIA is currently researching, aims to provide better image quality with minimal performance penalty. It "combines single-pixel (1x) shading with subpixel visibility to create antialiased images without increasing the shading cost," as NVIDIA puts it in its research abstract. SRAA is suited to rendering engines that can't use multisample antialiasing (MSAA) because they rely on deferred shading. For such renderers, SRAA works as a post-processing pass, just like MLAA.

Where SRAA differs from MLAA is that the new algorithm can better respect geometric boundaries and has a fixed runtime independent of scene and image complexity. SRAA benefits shading-bound applications. "For example, our implementation evaluates SRAA in 1.8 ms (1280x720) to yield antialiasing quality comparable to 4-16x shading. Thus SRAA would produce a net speedup over supersampling for applications that spend 1 ms or more on shading; for comparison, most modern games spend 5-10 ms shading. We also describe simplifications that increase performance by reducing quality," claims NVIDIA.

Source: NVIDIA Research
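The post-process approach described above can be illustrated with a toy example. To be clear, the sketch below is not SRAA itself (SRAA additionally consumes subpixel geometry samples, which this omits); it only shows the general MLAA-style pattern of scanning a finished frame for hard edges and blending across them. The function name and threshold are illustrative assumptions, not anything from NVIDIA's paper.

```python
# Toy sketch of the general idea behind post-process antialiasing filters
# such as MLAA/SRAA: detect discontinuities in the finished frame, then
# blend across them. Operates on a grayscale image (list of rows) with
# values assumed to lie in [0.0, 1.0]. Hypothetical names throughout.

def post_process_aa(img, threshold=0.5):
    """Blend each pixel with its neighbors wherever a hard edge is found."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]  # copy so blending reads original values
    for y in range(h):
        for x in range(w):
            # Gather the 4-connected neighbors that exist inside the frame.
            neighbors = [img[ny][nx]
                         for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                         if 0 <= ny < h and 0 <= nx < w]
            # An "edge" pixel differs sharply from at least one neighbor.
            if any(abs(img[y][x] - n) > threshold for n in neighbors):
                # Blend the pixel with its neighborhood average.
                out[y][x] = (img[y][x] + sum(neighbors)) / (1 + len(neighbors))
    return out

# A hard vertical black/white edge gets softened:
frame = [[0.0, 0.0, 1.0, 1.0] for _ in range(4)]
smoothed = post_process_aa(frame)
```

Because the pass only reads the completed frame, it needs no extra geometry or shading samples, which is exactly why this family of filters suits deferred renderers where MSAA is impractical. SRAA's contribution, per the abstract, is adding subpixel visibility information so the blend respects geometric boundaries instead of blurring indiscriminately.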

33 Comments on NVIDIA Developing Subpixel Reconstruction Antialiasing to Compete with MLAA

#1
RejZoR
Good to hear NVIDIA is also working on this. Since AMD released MLAA I was pretty much locked to their products since I love MLAA so much. If NVIDIA pulls it off I'll again be able to choose between NVIDIA and AMD. Plus if they can make it filter the image well with very little performance hit, that's even better. I'm hoping for the GTX 600 series to already have this...
Posted on Reply
#3
ZoneDymo
by: RejZoR
Good to hear NVIDIA is also working on this. Since AMD released MLAA I was pretty much locked to their products since I love MLAA so much. If NVIDIA pulls it off I'll again be able to choose between NVIDIA and AMD. Plus if they can make it filter the image well with very little performance hit, that's even better. I'm hoping for the GTX 600 series to already have this...
GTX 600 series??
Excuse me?

This sort of tech needs to be on the 8800 series and up, not exclusive to just the latest cards.
Posted on Reply
#4
btarunr
Editor & Senior Moderator
SRAA can technically be deployed to all current-generation NV GPU users via driver update.
Posted on Reply
#5
Bjorn_Of_Iceland
I'm thinking any NV GPU is capable as long as it supports programmable shaders (G80 and above)
Posted on Reply
#6
TAViX
I hope they don't have the blurring and text fracking of the MLAA...
#7
Jack Doph
by: TAViX
I hope they don't have the blurring and text fracking of the MLAA...
That's mainly a driver issue I found.
The current driver (11.1) has removed all those oddities for me :)
Posted on Reply
#8
alwayssts
While this tech is all well and good, did anyone find the last sentence woefully out of place?

It felt like they were opportunistically alluding to the AMD default texture quality blunder. While that wasn't cool on AMD's part, I feel they've been called out on it enough (and have since fixed it.)

It's kind of like: "And one more thing...*kicks opponent in the balls when they aren't looking.*"

How tactful from a company whose hardware engineering is realistically slightly behind on average, even if its software and marketing teams are ahead. I still remember the 3DMark03 tricks before they started (insert PC phrase) companies into developing toward their hardware differences/strengths, a la Heaven/HAWX2 this go-round, PhysX/AA the last. Let's not forget doing Vantage physics scores on the GPU (when the test isn't using the GPU for anything else) when it was intended for a CPU, after conveniently buying the company that owned the API. There are many more hardware/software examples of unrepresentative performance/quality and underhanded tactics.

Not trying to be a troll or start a combative thread; nVIDIA has a good lineup with great features at admirable cost/performance ratios, and the 600 series likely will too. Just putting it out there that they don't need to lower quality settings for performance when their settings/features are almost always the optimized default, and even then there are tricks. Just because the tactic is different doesn't make it any less bullshit.
Posted on Reply
#10
RejZoR
by: ZoneDymo
GTX 600 series??
Excuse me?

This sort of tech needs to be on the 8800 series and up, not exclusive to just the latest cards.
You wish. I never said it can't be done; it WON'T be done, from a marketing perspective.
Radeon 9600 Pro was also able to utilize Adaptive AA but was never officially supported.
GeForce 6600GT was also able to utilize Transparency AA but was never officially supported.

So even if GeForce 2 could do it, they won't enable it to existing users just like AMD doesn't officially support MLAA for HD5000 users even though the cards can easily do it.
So yes, GTX 600 series...
Posted on Reply
#11
bear jesus
It's nice to see Nvidia is working on this. The more one company brings out features that the other then counters with its own version, the better for all users, as it just helps push for better and better image quality.
Posted on Reply
#12
Jack Doph
by: RejZoR
....AMD doesn't officially support MLAA for HD5000 users even though the cards can easily do it....
And it's available to 5xxx users..
Posted on Reply
#13
RejZoR
No it's not. Via hacked drivers, yes; officially, no. The only drivers that supported it were the 10.10e hotfix and supposedly the 11.1a hotfix. None of the WHQL drivers support it.
Posted on Reply
#14
Jack Doph
Yup, it is :)

This is a screenie of my HD5850 :)
Posted on Reply
#15
Jack Doph
I must add that this is actually starting to go off-topic..

For all intents and purposes, this idea of nVidia's is not a bad one.
Posted on Reply
#16
Bjorn_Of_Iceland
by: RejZoR
You wish. I never said it can't be done, it WON'T be done from a marketing perspective.
Well, they did enable the 8 series to utilize PhysX through drivers when it came out two years after the cards' debut... and that was when they were selling the 9 series and 200 series.

Their software/driver department is huge; I think they can easily pull it off (including making the 8 series capable of Subpixel Reconstruction Antialiasing... heck, even the 6 series, since it's capable of programmable shaders lol)
Posted on Reply
#17
KashunatoR
They should have done this a long time ago
Posted on Reply
#18
RejZoR
by: Bjorn_Of_Iceland
Well, they did enable the 8 series to utilize PhysX through drivers when it came out two years after the cards' debut... and that was when they were selling the 9 series and 200 series.

Their software/driver department is huge; I think they can easily pull it off (including making the 8 series capable of Subpixel Reconstruction Antialiasing... heck, even the 6 series, since it's capable of programmable shaders lol)
Well, with PhysX it's another story, because there you need a reason for developers to take the time to develop with PhysX. Whereas with SRAA it doesn't matter; it just works in any game.
Meaning they don't depend on third-party developers and as such don't need attention from the entire user base. It would be nice to have SRAA at least on the GeForce 400 series and above, but I don't think it will happen.
Posted on Reply
#19
TAViX
by: Jack Doph
That's mainly a driver issue I found.
The current driver (11.1) has removed all those oddities for me :)
I'm still on 10.10e... Too afraid to go to 11.1a :eek::o:D

Or should I....?? (5870 user)
#20
Q-ho
I hope NVIDIA will change the name, coz so far SRAA means taking a sh** in my language.
Posted on Reply
#21
char[] rager
Just a question...

Is MSAA still the best anti-aliasing in terms of picture quality? I couldn't care less about performance impact.
Posted on Reply
#22
dir_d
by: char[] rager
Just a question...

Is MSAA still the best anti-aliasing in terms of picture quality? I couldn't care less about performance impact.
I think SSAA is best, if I'm not mistaken.
Posted on Reply
#23
bear jesus
by: dir_d
I think SSAA is best, if I'm not mistaken.
I think you are right.

It just sucks that it comes with the biggest performance hit :(
Posted on Reply
#24
Jack Doph
by: TAViX
I'm still on 10.10e... Too afraid to go to 11.1a :eek::o:D

Or should I....?? (5870 user)
Well.. If you're not experiencing any issues.. is there really a compelling reason to update your drivers atm? :)
Perhaps wait for this month's release instead?
Posted on Reply
#25
ViperXTR
From what I understand, this SRAA is post-processing based, similar to MLAA. It's only applied once the full image/frame has already been rendered by the GPU. AMD used DirectCompute for their MLAA, taking advantage of the parallel processing of their GPUs to do this post-process effect at much lower cost in GPU resources. nVidia would prolly use their own CUDA engine to enable this effect. (Traditional AA is usually handled by the ROPs at the end of the render pipeline.)
Posted on Reply