Tuesday, January 7th 2014

AMD Responds to NVIDIA G-Sync with FreeSync

At CES, various display makers exhibited gaming-grade monitors featuring NVIDIA G-Sync, a display-fluidity technology that evolves V-Sync, and which we've seen with our own eyes make a tangible difference. AMD, in a back room of its CES booth, demoed what various sources are calling "FreeSync," a technology competitive with G-Sync, but one that requires neither specialized hardware nor licenses from display makers. AMD didn't give out many details on the finer workings of FreeSync, but here's what we make of it.

FreeSync taps into a lesser-known feature that AMD Radeon GPUs have had for the past three generations (i.e., since the Radeon HD 5000 series), called dynamic refresh rates. On supported displays, the feature lets the GPU spool down the refresh rate to save power without triggering a display re-initialization (the flicker that occurs when a digital display is sent a signal with a new resolution and refresh rate). Dynamic refresh is reportedly also a proposed addition to the VESA specifications, and some (if not most) display makers have implemented it. On displays that support it, AMD Catalyst drivers already use dynamic refresh rates. For display makers, supporting the technology won't require buying licenses or integrating specialized hardware into the displays.

According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, which is why NVIDIA had to deploy external hardware. Although FreeSync's results should come close to G-Sync's, NVIDIA's technology may retain an edge in output quality, because the two are implemented differently, and by that we don't just mean how the hardware is laid out on a flow chart. The goal of both technologies is the same: to make the display's refresh rate a slave to the GPU's frame rate, rather than the other way around (as with V-Sync).

In AMD's implementation, the VBLANK length (the interval between two refresh cycles during which the GPU isn't putting out a "new" frame) is variable, and the driver has to speculate what VBLANK length to set before the next frame arrives. In NVIDIA's implementation, by contrast, the display holds the VBLANK open until the next frame is received: the GPU sends out whatever frame rate the hardware can manage, while the monitor handles the "sync" part. In AMD's approach, the speculation involved in setting the right VBLANK length for the next frame could impose some software overhead on the host system; in NVIDIA's, that overhead is transferred to the display. We're looking forward to AMD's whitepaper on FreeSync. AMD holds the advantage on cost of implementation: display makers simply have to implement something VESA is already deliberating over, and the Toshiba laptops AMD used in its FreeSync demo at CES already do.

Sources: The TechReport, AnandTech
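The difference described above can be illustrated with a minimal Python sketch. The function names and the simple moving-average predictor are our own illustrative assumptions, not AMD's or NVIDIA's actual algorithms:

```python
def predict_vblank_ms(recent_frame_times_ms, min_hz=30, max_hz=60):
    """FreeSync-style: the driver guesses when the next frame will arrive
    (here, a plain average of recent frame times) and programs the VBLANK
    interval ahead of time, clamped to the panel's supported refresh window."""
    avg = sum(recent_frame_times_ms) / len(recent_frame_times_ms)
    return max(1000.0 / max_hz, min(1000.0 / min_hz, avg))

def gsync_style_refresh(frame_ready):
    """G-Sync-style: the monitor holds the current image (an open-ended
    VBLANK) and refreshes only when the next frame actually arrives."""
    return "refresh" if frame_ready else "hold last frame"
```

If the guess is wrong, because the next frame arrives earlier or later than predicted, the FreeSync-style path must either wait or repeat a frame, which is the host-side overhead the article alludes to.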

53 Comments on AMD Responds to NVIDIA G-Sync with FreeSync

#1
esrever
arterius2 said:
Wait, what exactly has AMD implemented here regarding Vsync? I'm not understanding this fully.

I mean, there really isn't a lot of options here, you either match the monitor with the GPU, or you match the GPU with the monitor, since you can't just magically "improve" performance on the GPU, then this is a hardware problem, not software.

I get that G-sync works by adjusting the monitor's refresh rate to match the GPU's frame rate - OK, fairly simple and straightforward, I get this.

But what is AMD's method here? according to my understanding of the article, AMD tries to insert "blank/fake" frames to pretentiously boost the frame rate to match that of the monitor, is that what they are doing here?

if I remember correctly, this is exactly the same method Lucidlogic tried to implement with their Virtu software which was nothing but gimmick, it only 'looked' good in benchmark scores, but it didn't fix the problem, if anything it created more stuttering and decreased overall performance.
AMD's way is exactly the same as NVIDIA's, which also uses vblanks. The only thing the G-Sync hardware does is hold a frame on the monitor side to wait for the GPU; it displays the frame once the vblank time interval is up. They both use a dynamic vblank to set a custom refresh rate.
#2
Wittermark
esrever said:
AMD's way is exactly the same as NVIDIA's, which also uses vblanks. The only thing the G-Sync hardware does is hold a frame on the monitor side to wait for the GPU; it displays the frame once the vblank time interval is up. They both use a dynamic vblank to set a custom refresh rate.
With AMD's method you still require hardware support on the monitor to alter its refresh rate, which means most people without this feature on their monitors are still out of luck.
#3
arterius2
esrever said:
AMD's way is exactly the same as NVIDIA's, which also uses vblanks. The only thing the G-Sync hardware does is hold a frame on the monitor side to wait for the GPU; it displays the frame once the vblank time interval is up. They both use a dynamic vblank to set a custom refresh rate.
oh, now I get it! I kept thinking about how these would work on my current monitors "for free," but apparently they can't, so instead of going out and buying a G-Sync monitor I have to buy one of these new VESA-compliant monitors that barely even exist atm?
#4
Xzibit
The support is already there in the VESA CVT standard. Monitor manufacturers might be paying royalties for it and not even utilizing it. That's the difference between FreeSync and G-Sync.

G-Sync is a hardware add-on you pay extra for that locks you into the Nvidia ecosystem, disables some of the monitor's features, and is limited to DisplayPort, which is also a VESA standard (irony).

FreeSync uses a VESA standard which can be active on all monitors.


Both ATI and Nvidia were in the VESA committee when these standards were made. Nvidia is just try'n to charge you hundreds extra for it.
#5
Bansaku
arterius2 said:
Past 3 generations eh? So they waited until now to tell us this? I call BS, typical AMD PR stunt.



Oh, you mean like the adaptive v-sync setting in the Nvidia control panel?


Silence nVidia fanboy!! :P
#6
namae_nanka
arterius2 said:
Past 3 generations eh? So they waited until now to tell us this? I call BS, typical AMD PR stunt.
Won't be the first time; it happened before with concurrent kernel execution and the Fermi launch. The software support, otoh...
Oh, you mean like the adaptive v-sync setting in the Nvidia control panel?

Adaptive VSync dynamically turns VSync on and off to maintain a more stable framerate.

http://www.geforce.com/hardware/technology/adaptive-vsync/technology

The dynamism isn't where you think it is.
#7
Steevo
It wasn't widely adopted and was never implemented in their drivers as an option, since so many people buy cheap, crappy monitors. For another example, ATI had tessellation years ago and it was never used, as other vendors didn't want to: their hardware didn't support it and they didn't want to add it. http://en.wikipedia.org/wiki/Truform

ATI has always been good at pushing; their failure has always been in the follow-through with the reality of it all.

Nvidia is much better at forcing and pushing, and their market share shows it.
#8
Wanton
esrever said:
AMD's way is exactly the same as NVIDIA's, which also uses vblanks. The only thing the G-Sync hardware does is hold a frame on the monitor side to wait for the GPU; it displays the frame once the vblank time interval is up. They both use a dynamic vblank to set a custom refresh rate.
Again, someone is wrong on the internet. No, G-Sync doesn't use vblank or vsync at all.
G-Sync works like this: it sends your frame to the display and then asks the display to refresh itself. If your framerate drops below 30 fps, the G-Sync monitor duplicates the last frame, but that doesn't cause stutter, because the next update can come even a few milliseconds after another update. So even a framerate that fluctuates between 20 and 40 is smooth on G-Sync.

So when you are using G-Sync there is no vsync or vblank, ever, period.

It seems like some AMD fanboys wish that FreeSync was as good as G-Sync, but it's much crappier tech.
Also, it needs new hardware in monitors and a new VESA specification, because the current VESA specification is for integrated displays only, like in laptops. That's why they are showing it on those Toshiba laptops.

FreeSync is probably cheaper tech than G-Sync, but you don't get the latency reduction you get with G-Sync.

With G-Sync you probably get better CPU usage, because the CPU doesn't have to wait for anything; with FreeSync there is still some sort of sync, I think, so there must be some CPU overhead.

NVidia added frame duplication because LCD displays really don't work well under 30 Hz.
Because of that, I think you can't go under 30 Hz with FreeSync. That means a framerate fluctuating between 20 and 40 isn't going to be smooth with FreeSync.
#9
Xzibit
Wanton said:
Again, someone is wrong on the internet. No, G-Sync doesn't use vblank or vsync at all.
G-Sync works like this: it sends your frame to the display and then asks the display to refresh itself. If your framerate drops below 30 fps, the G-Sync monitor duplicates the last frame, but that doesn't cause stutter, because the next update can come even a few milliseconds after another update. So even a framerate that fluctuates between 20 and 40 is smooth on G-Sync.

So when you are using G-Sync there is no vsync or vblank, ever, period.

It seems like some AMD fanboys wish that FreeSync was as good as G-Sync, but it's much crappier tech.
Also, it needs new hardware in monitors and a new VESA specification, because the current VESA specification is for integrated displays only, like in laptops. That's why they are showing it on those Toshiba laptops.

FreeSync is probably cheaper tech than G-Sync, but you don't get the latency reduction you get with G-Sync.

With G-Sync you probably get better CPU usage, because the CPU doesn't have to wait for anything; with FreeSync there is still some sort of sync, I think, so there must be some CPU overhead.

NVidia added frame duplication because LCD displays really don't work well under 30 Hz.
Because of that, I think you can't go under 30 Hz with FreeSync. That means a framerate fluctuating between 20 and 40 isn't going to be smooth with FreeSync.
Dude please...
PC Perspective - PCPer Live! NVIDIA G-Sync Discussion with Tom Petersen, Q&A

G-Sync essentially functions by altering and controlling the vBlank signal sent to the monitor. In a normal configuration, vBlank is a combination of the vertical front and back porch and the necessary sync time. That timing is set at a fixed stepping that determines the effective refresh rate of the monitor; 60 Hz, 120 Hz, etc. What NVIDIA will now do in the driver and firmware is lengthen or shorten the vBlank signal as desired and will send it when one of two criteria is met.

  1. A new frame has completed rendering and has been copied to the front buffer. Sending vBlank at this time will tell the screen to grab data from the card and display it immediately.
  2. A substantial amount of time has passed and the currently displayed image needs to be refreshed to avoid brightness variation.
In current display timing setups, the submission of the vBlank signal has been completely independent from the rendering pipeline. The result was varying frame latency and either horizontal tearing or fixed refresh frame rates. With NVIDIA G-Sync creating an intelligent connection between rendering and frame updating, the display of PC games is fundamentally changed.
You were saying something about fanboyism ?

Shortening and extending vBlank has been part of the VESA CVT standard since 2003. You can also switch between refresh rates on the fly. Companies were already doing that in products as far back as 2011. Look up Intel Seamless Display Refresh Rate Switching.

The G-Sync module is a TCON like those used for PSR (Panel Self Refresh). There's nothing new in G-Sync that wasn't already a VESA standard.
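The timing relationship in the quoted PCPer passage can be shown with a small calculation. The numbers below are standard public CVT-RB values for 1920x1080 at 60 Hz, used here only as an illustration, not taken from any G-Sync document:

```python
def effective_refresh_hz(pixel_clock_hz, h_total, v_active,
                         v_front_porch, v_sync, v_back_porch):
    """Refresh rate = pixel clock / total pixels per frame.
    Lengthening the vertical blanking (front porch + sync + back porch)
    stretches the frame period, lowering the effective refresh rate."""
    v_total = v_active + v_front_porch + v_sync + v_back_porch
    return pixel_clock_hz / (h_total * v_total)

# CVT-RB 1920x1080: 138.5 MHz pixel clock, 2080 total columns,
# 1080 active lines + 3 (VFP) + 5 (sync) + 23 (VBP) = 1111 total lines
base = effective_refresh_hz(138.5e6, 2080, 1080, 3, 5, 23)        # ~59.9 Hz
# Stretch the front porch by 370 lines and the rate drops toward ~45 Hz
slow = effective_refresh_hz(138.5e6, 2080, 1080, 3 + 370, 5, 23)
```

This is exactly the lever the quote describes: hold the pixel clock and active area fixed, vary only the blanking lines, and each individual frame can be given its own effective refresh period.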
#10
theoneandonlymrk
Nah, you're wrong. VBlank is poorly named; it should have been called "frame hold," because that's what it does. Via a VESA standard, the frame is held by the monitor itself, and frame smoothness is not its only purpose: power reduction is also one of its main benefits, because if nothing happens on screen the frame is held, hence saving power in many areas. It's been co-operatively worked on for some time, as I believe Intel and Nvidia are working on their own tech to use VBlank.

However, Nvidia had shitloads of Tegra 4s sitting around and thought they would slap their fanbase (just like with nv3d) with something that includes a royalty fee to them. Hell, you never know, it might catch on... or all monitor makers will realise they can do the same without paying Nvidia a penny, and Nvidia will likely roll the support in quietly later via a driver and/or new GPUs with SPECIAL G-Sync built in, only 500 extra notes.
#11
Wanton
Xzibit said:
Dude please...
You were saying something about fanboyism ?
Dude, did you read what you quoted? Even though it says vBlank below, that doesn't mean there's vBlank/vSync in use on the monitor. NVIDIA is probably sending the vBlank signal to ask for a display refresh.
  1. A new frame has completed rendering and has been copied to the front buffer. Sending vBlank at this time will tell the screen to grab data from the card and display it immediately.
So everything I wrote in my post stands.
I have owned both NVIDIA and AMD graphics cards. Both are good cards, but this time NVIDIA surely got the better tech with G-Sync.

Maybe you should watch this Youtube: KhLYYYvFp9A
#12
Xzibit
What NVIDIA will now do in the driver and firmware is lengthen or shorten the vBlank signal as desired and will send it when one of two criteria is met.
I fail to see anything about eliminating it?

I think you need to understand how a frame and its timing are composed.

Nice first 2 posts tho
:toast:
#13
vega22
RCoon said:
Freesync is supposed to be competing against GSync right? I'm fairly certain people buying GSync monitors have enough cash to splash on a good system considering they're spending so much on a GSync monitor. If this isn't for the high end market like GSync monitors, then it isn't competing at all, just another bit of free stuff for everyone.
dude you really are missing the point, nv are trying to create another market to skim the cream off.

why does anybody need gsync on their screen when the vesa standards already have systems in place to do the same thing?

because it helps feed the stock prices.....

amd have shown that the system already works in tech already out which costs nothing. nvidia think you are stupid enough to go out and buy a new nvidia approved screen to plug into your nvidia gpu to get a feature which is already out and free.

nvidia fanbois are nearly getting as gullible as apple fanbois, damn.
#14
Xzibit
theoneandonlymrk said:
Nah, you're wrong. VBlank is poorly named; it should have been called "frame hold," because that's what it does. Via a VESA standard, the frame is held by the monitor itself, and frame smoothness is not its only purpose: power reduction is also one of its main benefits, because if nothing happens on screen the frame is held, hence saving power in many areas. It's been co-operatively worked on for some time, as I believe Intel and Nvidia are working on their own tech to use VBlank.

However, Nvidia had shitloads of Tegra 4s sitting around and thought they would slap their fanbase (just like with nv3d) with something that includes a royalty fee to them. Hell, you never know, it might catch on... or all monitor makers will realise they can do the same without paying Nvidia a penny, and Nvidia will likely roll the support in quietly later via a driver and/or new GPUs with SPECIAL G-Sync built in, only 500 extra notes.
vBlank isn't a frame hold. vBlank [vertical blank] is the vertical blanking area of a frame: the space from the end of the border to the end of the blanking, vertically, in a frame.
#15
Serpent of Darkness
Wanton said:
Again someone wrong on the internet. No G-Sync doesn't use vblank or vsync at all.
G-Sync works like this it sends your frame to display and then ask display to refresh it self. If your framerate drops below 30fps g-sync monitor duplicates last frame, but that doesn't cause stutter because next update can come even after some milliseconds from another update. So even framerate that fluctuates between 20 and 40 is smooth on G-Sync.
G-Sync will have a tendency to ghost below 30 FPS...

Wanton said:

So when you are using G-Sync there is no vsync or vblank ever period.
"In AMD's implementation, VBLANK length (interval between two refresh cycles where the GPU isn't putting out "new" frames, a sort of placebo frames) is variable, and the driver has to speculate what VBLANK length to set for the next frame; whereas, in NVIDIA's implementation, the display holds onto a VBLANK until the next frame is received. In NVIDIA's implementation, the GPU sends out whatever frame-rate the hardware can manage, while the monitor handles the "sync" part." btarunar.

Wanton said:

It seems like some AMD fanboys wish that FreeSync was good as G-Sync but it's much crappier tech.
I think you meant to say "free" tech. Claiming that it is "crappier" is a biased opinion on your part, with zero proof to support your statement.

Any person can argue that your statement alone makes you biased to one side.

Wanton said:

Also it needs new hardware in monitors and new VESA specification because current VESA specification is for integrated displays only like in laptops. That why they are showing it on those Toshiba laptops.
One, from what I have read on Anandtech.com, TVs and displays that meet the VESA standard are capable of using FreeSync. Just like G-Sync, AMD Mantle, and TrueAudio, it will most likely be enabled through a driver, and it will also be tweaked with drivers on AMD's end. The cost of "FreeSync" is already in the retail price of the TV or monitor you own, have owned, and will own. Any AMD 5000-series-generation GPU and later is capable of using FreeSync. It isn't limited to just laptop displays.

Second, I think a lot of people didn't get the indirect point AMD made with the two Toshiba Satellite laptops. You have one laptop using it, and another laptop not using it. Laptop A had a set FPS, and the other had a dynamically changing FPS, because FreeSync changes the static refresh rate to a dynamic one. So at the core of AMD's indirect point: the tech is there, it works, and it's not limited to any specific brand name, display model, or type like NVidia's G-Sync is. That's part of its allure.

Wanton said:


With G-Sync you probably get better CPU usage because CPU doesn't have to wait for anything, but with FreeSync there is still some sort of sync I think, so there must be some CPU overhead.
I don't think G-Sync will improve or hinder CPU usage; G-Sync isn't designed to improve CPU usage. AMD Mantle, on the other hand, will improve CPU usage. The NVidia GPU is controlling G-Sync, not the CPU, if I am not mistaken.

In addition, I don't think FreeSync will do the same. The AMD GPU will probably control FreeSync, unless it is stated elsewhere...
#17
Wanton
Xzibit said:
vBlank isn't a frame hold. vBlank [vertical blank] is the vertical blanking area of a frame: the space from the end of the border to the end of the blanking, vertically, in a frame.
Yes, it's sad that the vBlank term is used so wrongly. When I was coding demos for the Amiga, vBlank was the time after vSync had happened and before the screen started to draw. But now, if you look at the wiki http://en.wikipedia.org/wiki/Vertical_blanking_interval
it's used as a synonym for the vertical blanking interval. That's why I said there is no vBlank anymore on a monitor with G-Sync. It's sad that people are using the wrong term for VBI, but what can you do.

There is no vSync or vBlank/VBI on the monitor when you are using G-Sync. That means there is no interval for display refresh. With G-Sync, the graphics card is asking for the display refresh. With FreeSync, the display is still controlling the refresh rate, but the display card driver is setting that interval by guessing.
#18
RCoon
Gaming Moderator
marsey99 said:
nvidia fanbois are nearly getting as gullible as apple fanbois, damn.
Wow man, calling people fanboys? Calling someone an AMD/Intel/NVidia fanboy is about the most childish thing you can say on a forum, you may as well go around calling people poopie heads...
Seriously, I expected more from you of all people.

NVidia are a business, as are AMD. The whole point of a business is to sell people something they don't necessarily need. This keeps profits up and stockholders happy. NVidia happen to have a marketing team, so they can successfully charge people for something they don't really need, because they went ahead and marketed G-Sync. AMD did nothing but sit on good tech for a few years, and they marketed it poorly. As far as I can tell, FreeSync (VBlank) is not yet a complete VESA standard, so not all monitors have it. People might call it "Free"Sync, but there's nothing stopping display manufacturers adopting it while it's not a standard, advertising their screens as having such technology, and charging more for it. It just means AMD missed out on some possible profit.
#19
Xzibit
Wanton said:
Yes, it's sad that the vBlank term is used so wrongly. When I was coding demos for the Amiga, vBlank was the time after vSync had happened and before the screen started to draw. But now, if you look at the wiki http://en.wikipedia.org/wiki/Vertical_blanking_interval
it's used as a synonym for the vertical blanking interval. That's why I said there is no vBlank anymore on a monitor with G-Sync. It's sad that people are using the wrong term for VBI, but what can you do.

There is no vSync or vBlank/VBI on the monitor when you are using G-Sync. That means there is no interval for display refresh. With G-Sync, the graphics card is asking for the display refresh, but with FreeSync the display is still controlling the refresh rate, and the display card is setting that interval by guessing.
?

Things haven't changed.

That's called the VFP [Vertical Front Porch]
vBlank is a combination of the vertical front and back porch and the necessary sync time. That timing is set at a fixed stepping that determines the effective refresh rate of the monitor; 60 Hz, 120 Hz, etc. What NVIDIA will now do in the driver and firmware is lengthen or shorten the vBlank signal as desired and will send it when one of two criteria is met.
G-Sync is just extending or reducing the lines of the VFP and VBP to fit the timing of the closest refresh cycle.
When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.
VBI is the span from the VBP of one frame to the VFP of the next frame.
#20
Wanton
Xzibit said:
?
G-Sync is just extending or reducing the lines of the VFP and VBP to fit the timing of the closest refresh cycle.
No, if that were the case you would not have the latency reduction that they are promising; it can't wait for the next refresh cycle if you want that latency reduction. I haven't seen any mention of anything like what you just quoted in NVIDIA-written texts about G-Sync. Everything written by NVIDIA employees says the graphics card is controlling the display refresh. Find me something written by an NVIDIA employee and I'll believe you.
#21
vega22
RCoon said:
Wow man, calling people fanboys? Calling someone an AMD/Intel/NVidia fanboy is about the most childish thing you can say on a forum, you may as well go around calling people poopie heads...
Seriously, I expected more from you of all people.

NVidia are a business, as are AMD. The whole point of a business is to sell people something they don't necessarily need. This keeps profits up and stockholders happy. NVidia happen to have a marketing team, so they can successfully charge people for something they don't really need, because they went ahead and marketed G-Sync. AMD did nothing but sit on good tech for a few years, and they marketed it poorly. As far as I can tell, FreeSync (VBlank) is not yet a complete VESA standard, so not all monitors have it. People might call it "Free"Sync, but there's nothing stopping display manufacturers adopting it while it's not a standard, advertising their screens as having such technology, and charging more for it. It just means AMD missed out on some possible profit.
it ain't man, it is nvidia wanting to get paid by everyone their products touch. making new standards so they get paid at the expense of us, the consumers. you think lg or samsung are going to take the hit for the extra cost of the gsync stuff? nope, it will be bang on top of the rrp.

nvidia are just out to make more money from old rope, for things which the rest of their segment were working together to fix in an attempt to keep costs down.

idk about amd missing the ball, they was playing ball waiting for the vesa standard which is now implemented in some screens. i mean amd cards have had the ability to do this for a few years now (5k cards and later). nvidia are again trying to make people spend again as their old, yet still powerful cards do not support it (650 and below...).

as for the fanbois comment, how else can you describe someone who is blind to the truth in front of them? if you think for a second that nv are doing this for our good, then it is already too late for you.
#22
arterius2
Here's what Nvidia had to say about the whole FreeSync vs G-Sync debacle:
Quote from Guru3D: http://www.guru3d.com/news_story/nvidia_responds_to_amd_freesync.html
NVIDIA

He first said, of course, that he was excited to see his competitor taking an interest in dynamic refresh rates and thinking that the technology could offer benefits for gamers. In his view, AMD's interest was validation of Nvidia's work in this area.

However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.

That said, Nvidia won't enable G-Sync for competing graphics chips because it has invested real time and effort in building a good solution and doesn't intend to "do the work for everyone." If the competition wants to have a similar feature in its products, Petersen said, "They have to do the work. They have to hire the guys to figure it out."
Basically, to sum it up, the reason why AMD showcased it on some crappy laptop is that it's the only system that could utilize this feature atm. Why is it free? Because AMD hasn't invested a lot of research, nor could it offer any form of polished product at the given moment. So I guess this ends all discussions then.
#23
theoneandonlymrk
Wanton said:
No if that would be the case you would not have that latency reduction that they are promising. So it can't wait for next refresh cycle if you want that latency reduction. What you just quoted I haven't seen any mention about anything like that on NVIDIA written texts about G-Sync. Everything written by NVIDIA employees says graphics card is controlling display refresh. Find me something written by NVIDIA employee and i'll believe you.
Look, whether you agree with something or not does not really matter; we don't have to reach a consensus here. But saying it has to come from Nvidia's mouth, I mean, what's that even about? They are not the only graphical computing experts.
And as far as me being wrong before, I might as well add: no, I was right, regardless of the how and what with. All these things hold the displayed frame until the next one is sent by the GPU, instead of updating regardless of whether there's a new frame to show. It's really that simple.

Oh, and Arterias: exactly where is this tech going to benefit AMD the most? APU-powered laptops, der. Most high-end systems just switch V-Sync on and buy enough GPU to power their 60 fps 1080p gaming at a smooth rate, and that's the bit I don't get. All this talk of G-Sync keeping it smooth between 60-20 fps; I only see 20 fps in a few benchmarks, and I wouldn't pay top end for a monitor for that.


Case in point: my guy bought the whole Nvidia 3D setup and paid 360 UK notes just for the monitor (19 inch too, and not that long ago), an Asus Nvidia special with active glasses. He finally has the GPU grunt to power most things easily, but I haven't seen him use 3D in two years. I bought a 70-quid 22" Hanns-G, so (a) I didn't get ripped off and (b) I haven't messed up any carpets with vomit.

I'm no more impressed by G-Sync, or FreeSync for that matter, than I was by 3D.
#24
Steevo
Actually, there is quite a bit of lag created between 20 and 30 FPS. Really, the graphics card could ask the monitor to change the refresh rate over HDMI to 24, 30, or 60 Hz on the fly, based on the current render time, and that fixes the (v-sync tearing) problem. I tried a 24 Hz refresh rate last night on my TV and it works fine, if a bit laggy, which is the same thing that happens with any time constraint between display frames and input.
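The idea in the comment above, dropping to the highest standard refresh rate the current render time can sustain, can be sketched like this. The rate list and the selection rule are illustrative assumptions, not how any shipping driver actually works:

```python
def pick_refresh_hz(render_time_ms, supported_hz=(24, 30, 60)):
    """Choose the highest supported refresh rate whose period is at least
    as long as the current frame render time; fall back to the slowest
    supported rate if even that can't be sustained."""
    fps = 1000.0 / render_time_ms
    candidates = [hz for hz in supported_hz if hz <= fps]
    return max(candidates) if candidates else min(supported_hz)
```

A 16 ms frame (about 62 fps) keeps the panel at 60 Hz; a 30 ms frame (about 33 fps) drops it to 30 Hz. This avoids tearing without per-frame hardware, at the cost of coarse steps rather than the per-frame timing G-Sync and FreeSync aim for.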
#25
Xzibit
arterius2 said:
Basically, to sum it up, the reason why AMD showcased it on some crappy laptop is that it's the only system that could utilize this feature atm. Why is it free? Because AMD hasn't invested a lot of research, nor could it offer any form of polished product at the given moment. So I guess this ends all discussions then.
Research? The research has been there for years.

Look up PSR [Panel Self Refresh], a far better tech that is both 2D- and 3D-compliant, with power savings allowing the monitor to be unplugged.

Youtube: xxIbNdT1fn8

Youtube: TLMb4VtCzaw

AMD Gaming Blog

Doing the work for everyone

In our industry, one of the toughest decisions we continually face is how open we should be with our technology. On the one hand, developing cutting-edge graphics technology requires enormous investments. On the other hand, too much emphasis on keeping technologies proprietary can hinder broad adoption.

It’s a dilemma we face practically every day, which is why we decided some time ago that those decisions would be guided by a basic principle: our goal is to support moving the industry forward as a whole, and that we’re proud to take a leadership position to help achieve that goal.

The latest example of that philosophy is our work with dynamic refresh rates, currently codenamed "Project FreeSync”. Screen tearing is a persistent nuisance for gamers, and vertical synchronization (v-sync) is an imperfect fix. There are a few ways the problem can be solved, but there are very specific reasons why we’re pursuing the route of using industry standards.

The most obvious reason is ease of implementation, both for us from a corporate perspective and also for gamers who face the cost of upgrading their hardware. But the more important reason is that it’s consistent with our philosophy of making sure that the gaming industry keeps marching forward at a steady pace that benefits everyone.

It sometimes takes longer to do things that way — lots of stakeholders need to coordinate their efforts — but we know it’s ultimately the best way forward. This strategy enables technologies to proliferate faster and cost less, and that’s good for everyone.

The same philosophy explains why we’re revealing technology that’s still in the development stage. Now’s our chance to get feedback from industry, media and users, to make sure we develop the right features for the market. That’s what it takes to develop a technology that actually delivers on consumers’ expectations.

And Project FreeSync isn’t the only example of this philosophy and its payoffs. We worked across the industry to first bring GDDR5 memory to graphics cards— an innovation with industry-wide benefits. And when game developers came to us demanding a low-level API, we listened to them and developed Mantle. It’s an innovation that we hope will speed the evolution of industry-standard APIs in the future.

We’re passionate about gaming, and we know that the biggest advancements come when all industry players collaborate. There’s no room for proprietary technologies when you have a mission to accomplish. That’s why we do the work we do, and if we can help move the industry forward we’re proud to do it for everyone.
The only thing Nvidia has done is try to leapfrog standards. Who are Nvidia's partners on G-Sync? None are panel producers; all are monitor sellers. It doesn't make sense to pay royalties for tech you have already received, only to replace it with another. I assume that's why G-Sync monitors cost $200+ more: they have to make up the cost of the parts they are removing.