
AMD's FreeSync?

According to videocardz.com, AMD is quietly preparing a free alternative to counter Nvidia's G-Sync. It's being called FreeSync, though that's not the final name. As usual, AMD won't charge you a cent to use it, i.e. it's FREE. The good part is that it requires no additional hardware like Nvidia's G-Sync module does. All you need is a monitor that meets the VESA standard requirements and can handle variable VBLANK. However, it'll take some (much) time to develop.

Note: take this info with a pinch of salt.
 
Great news, and I think this is better than Light Strobing, since it works with 60Hz and you don't need a 3D display.

I know G-Sync doesn't either, but Nvidia has dumped so much time and money into extravagant things that it seems they've lost touch with their focus. They don't even seem to be supporting their own games very well anymore.

There are tons of people on 700 series cards having strange problems with AC IV that those of us on AMD cards aren't seeing.
 
Nice. Perhaps they could fix the simple matter of D3D Vsync in their drivers first
 
will be watching this hehe
 
do want. sounds like it could be good.

at long last all the micro stutter and vsync issues are being sorted...
 
I am awaiting the time when it only refreshes the part of the screen that needs it: 60 frames, but only the 25% of pixels that actually changed get rendered. Copy and paste sections of the screen to overdraw, use the Z buffer to compare, and use the AA engine to refit sections of the existing frames.
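Just to illustrate that "only redraw what changed" idea, here is a purely hypothetical sketch (not anything AMD or Nvidia have shown): it splits two consecutive frames into tiles and reports which tiles actually differ, the kind of bookkeeping a partial-refresh scheme would need.

```python
# Hypothetical sketch of dirty-tile detection between two frames.
# This is NOT how FreeSync or G-Sync work; it only illustrates the
# "refresh just the changed regions" idea from the post above.
import numpy as np

TILE = 64  # tile edge in pixels (arbitrary choice)

def dirty_tiles(prev, curr, tile=TILE):
    """Return (row, col) indices of tiles whose pixels changed."""
    h, w = prev.shape[:2]
    changed = []
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            if not np.array_equal(prev[y:y + tile, x:x + tile],
                                  curr[y:y + tile, x:x + tile]):
                changed.append((y // tile, x // tile))
    return changed

# Example: two 1080p RGB frames where only one small region differs.
prev = np.zeros((1080, 1920, 3), dtype=np.uint8)
curr = prev.copy()
curr[100:150, 200:260] = 255  # simulate a small on-screen change

tiles = dirty_tiles(prev, curr)
total = len(range(0, 1080, TILE)) * len(range(0, 1920, TILE))
print(f"{len(tiles)} of {total} tiles changed")  # prints: 4 of 510 tiles changed
```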
 
I wouldn't get your hopes up yet. It may not require extra hardware, but it only works on a limited number of panels and display hardware. You almost surely will still have to replace your monitor to use it.
 
VBLANK is a VESA-standard power-saving function. I would be surprised if a majority of monitors didn't support it.
 
VBLANK is a VESA-standard function. I would be surprised if a majority of monitors didn't support it.

Certainly it's a standard function, but according to the few news articles about this, most desktop displays don't support the standard. It's a limitation of the display controller. It's just like how displays don't support certain refresh rates even though the panel and connecting cables are within specification: the display controller just isn't designed (or programmed) to support those features.
 
I wouldn't get your hopes up yet. It may not require extra hardware, but it only works on a limited number of panels and display hardware. You almost surely will still have to replace your monitor to use it.
Unlike G-Sync, which costs extra and uses proprietary hardware and a proprietary format instead of an already implemented standard....sounds.....like another PhysX attempt to capture the market by doing the same thing, but with a cost.

I am really starting to feel like Nvidia would bottle air, and sell it, and people would buy it and proclaim we don't thneed trees.
 
Unlike G-Sync, which costs extra and uses proprietary hardware and a proprietary format instead of an already implemented standard....sounds.....like another PhysX attempt to capture the market by doing the same thing, but with a cost.

I am really starting to feel like Nvidia would bottle air, and sell it, and people would buy it and proclaim we don't thneed trees.

I agree it's a better implementation because it's open, but I will bet you that monitors supporting this will still be marketed as "gaming monitors" and marked up in price even though there is no hardware change. NVidia may overprice things, but the third party manufacturers love that they can make a larger margin on accompanying products compared to free and open standards.
 
Certainly it's a standard function, but according to the few news articles about this, most desktop displays don't support the standard. It's a limitation of the display controller. It's just like how displays don't support certain refresh rates even though the panel and connecting cables are within specification: the display controller just isn't designed (or programmed) to support those features.

I've seen no information to this effect.

As far as I can tell, VBI is still included in all VESA-standard monitors, and while outmoded for graphics in favor of double buffering, it's still used for things like power saving and Macrovision encoding.

Some industrious coder needs to make a checker....
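For what it's worth, here's a rough sketch of what such a checker might look like on Linux, assuming the kernel exposes each connector's EDID under /sys/class/drm. It parses the Display Range Limits descriptor (tag 0xFD) and prints the vertical refresh range the monitor advertises; that only tells you what the EDID claims, not whether the scaler can actually vary its refresh on the fly.

```python
# Rough sketch: read each connected display's EDID on Linux and print the
# vertical refresh range from its Display Range Limits descriptor (0xFD).
# Assumes plain EDID 1.3-style descriptors and ignores the EDID 1.4 offset
# flags, so treat the output as a hint, not proof of variable-refresh support.
import glob

def vertical_range(edid: bytes):
    """Return (min_hz, max_hz) from the range-limits descriptor, or None."""
    if len(edid) < 128:
        return None
    for off in (54, 72, 90, 108):            # the four 18-byte descriptor slots
        block = edid[off:off + 18]
        # Display descriptors start with a zero pixel clock; byte 3 is the tag.
        if block[0] == 0 and block[1] == 0 and block[3] == 0xFD:
            return block[5], block[6]         # min / max vertical rate in Hz
    return None

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if not edid:
        continue                              # connector with nothing attached
    connector = path.split("/")[-2]
    rng = vertical_range(edid)
    print(connector, "->",
          f"{rng[0]}-{rng[1]} Hz" if rng else "no range-limits descriptor")
```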
 
They used an existing laptop as an example of the tech, which means the hardware support is already out there to some extent.

This isn't like Nvidia's thing where a new monitor is guaranteed; you MIGHT get lucky with this.
 
Nvidia responds to AMD's "free sync" demo

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

CES — On the show floor here at CES today, I spoke briefly with Nvidia's Tom Petersen, the executive instrumental in the development of G-Sync technology, about the AMD "free sync" demo we reported on yesterday. Alongside the demo, a senior AMD engineering executive asserted that a variable refresh rate capability like G-Sync ought to be possible essentially for free, without adding any extra costs to a display or a PC system. Petersen had several things to say in response to AMD's demo and claims.

He first said, of course, that he was excited to see his competitor taking an interest in dynamic refresh rates and thinking that the technology could offer benefits for gamers. In his view, AMD's interest was validation of Nvidia's work in this area.

However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.

That said, Nvidia won't enable G-Sync for competing graphics chips because it has invested real time and effort in building a good solution and doesn't intend to "do the work for everyone." If the competition wants to have a similar feature in its products, Petersen said, "They have to do the work. They have to hire the guys to figure it out."

This sentiment is a familiar one coming from Nvidia. The company tends to view its GeForce GPUs and related solutions as a platform, much like the Xbox One or PS4. Although Nvidia participates in the larger PC gaming ecosystem, it has long been guarded about letting its competitors reap the benefits of its work in various areas, from GPU computing to PhysX to software enablement of advanced rendering techniques in AAA games.

Like it or not, there is a certain competitive wisdom in not handing off the fruits of your work to your competition free of charge. That's not, however, how big PC players like Intel and AMD have traditionally handled new standards like USB and x86-64. (Intel in particular has done a lot of work "for everyone.")

If you recall our report from yesterday on this subject, Nvidia and AMD do seem to agree on some of the key issues here. Both firms have told us that the technology to support variable refresh rates exists in some cases already. Both have said that the biggest challenge to widespread adoption of the tech on the desktop is support among panel (and scaler ASIC) makers. They tend to disagree on the best means of pushing variable refresh tech into wider adoption. Obviously, after looking at the landscape, Nvidia chose to build the G-Sync module and enable the feature itself.

My sense is that AMD will likely work with the existing scaler ASIC makers and monitor makers, attempting to persuade them to support dynamic refresh rates in their hardware. Now that Nvidia has made a splash with G-Sync, AMD could find this path easier simply because monitor makers may be more willing to add a feature with obvious consumer appeal. We'll have to see how long it takes for "free sync" solutions to come to market. We've seen a number of G-Sync-compatible monitors announced here at CES, and most of them are expected to hit store shelves in the second quarter of 2014.
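To put rough numbers on the vblank-stretching approach Petersen describes: the interval between refreshes is h_total * v_total / pixel clock, so holding the panel in vertical blanking for extra lines simply delays the next scanout until the GPU has a new frame ready. A tiny sketch of the arithmetic, using assumed CVT reduced-blanking timings for a 1920x1080 @ 60 Hz mode (my own illustrative figures, not numbers from the article):

```python
# Illustrative arithmetic only: how stretching the vblank interval changes the
# time between refreshes. The timings approximate CVT reduced blanking for
# 1920x1080 @ 60 Hz and are assumptions, not figures from the article.
pixel_clock_hz = 138.5e6     # ~138.5 MHz
h_total = 2080               # 1920 active + horizontal blanking
v_active = 1080
v_blank_nominal = 31         # nominal vertical blanking lines (v_total = 1111)

def frame_time_ms(extra_vblank_lines=0):
    v_total = v_active + v_blank_nominal + extra_vblank_lines
    return h_total * v_total / pixel_clock_hz * 1e3

print(f"nominal interval: {frame_time_ms():.2f} ms (~60 Hz)")
# If the next frame isn't ready, the source just holds the panel in vblank a
# bit longer; ~555 extra lines stretches the interval to about 25 ms (~40 Hz).
print(f"with 555 extra vblank lines: {frame_time_ms(555):.2f} ms")
```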
 
AMD already responded to that as well.

AMD Gaming Blog said:
Doing the work for everyone

In our industry, one of the toughest decisions we continually face is how open we should be with our technology. On the one hand, developing cutting-edge graphics technology requires enormous investments. On the other hand, too much emphasis on keeping technologies proprietary can hinder broad adoption.

It’s a dilemma we face practically every day, which is why we decided some time ago that those decisions would be guided by a basic principle: our goal is to support moving the industry forward as a whole, and that we’re proud to take a leadership position to help achieve that goal.

The latest example of that philosophy is our work with dynamic refresh rates, currently codenamed "Project FreeSync". Screen tearing is a persistent nuisance for gamers, and vertical synchronization (v-sync) is an imperfect fix. There are a few ways the problem can be solved, but there are very specific reasons why we're pursuing the route of using industry standards.

The most obvious reason is ease of implementation, both for us from a corporate perspective and also for gamers who face the cost of upgrading their hardware. But the more important reason is that it’s consistent with our philosophy of making sure that the gaming industry keeps marching forward at a steady pace that benefits everyone.

It sometimes takes longer to do things that way — lots of stakeholders need to coordinate their efforts — but we know it’s ultimately the best way forward. This strategy enables technologies to proliferate faster and cost less, and that’s good for everyone.

The same philosophy explains why we’re revealing technology that’s still in the development stage. Now’s our chance to get feedback from industry, media and users, to make sure we develop the right features for the market. That’s what it takes to develop a technology that actually delivers on consumers’ expectations.

And Project FreeSync isn’t the only example of this philosophy and its payoffs. We worked across the industry to first bring GDDR5 memory to graphics cards— an innovation with industry-wide benefits. And when game developers came to us demanding a low-level API, we listened to them and developed Mantle. It’s an innovation that we hope will speed the evolution of industry-standard APIs in the future.

We’re passionate about gaming, and we know that the biggest advancements come when all industry players collaborate. There’s no room for proprietary technologies when you have a mission to accomplish. That’s why we do the work we do, and if we can help move the industry forward we’re proud to do it for everyone.

There is very little difference between what AMD & Nvidia are doing, and most of it is on the display side.

Nvidia G-Sync
add-on module for monitor sellers to add in.
*Will only work with an Nvidia GPU

Project FreeSync
A push for DP (eDP) v1.3 monitor support standards.
*Will work with any GPU that has a driver to support it
 
I feel like I remember reading one of the interviews talking about how AMD's version doesn't quite align with timings, or adds a frame of input lag, or something along those lines.

Even if G-Sync is the technically more advanced implementation, PCPer has a shot showing problems below 30 fps, so... hope something can be done about that.

Regardless, both cases are exciting.
 
Going purely by those two articles, AMD wins.

o_O
 
DisplayPort DevCon 2010

[attached slide: eDP1.3.jpg]
 
AMD: get a DisplayPort monitor with this feature, and it'll work. Some already exist.

Nvidia: get a G-Sync monitor that costs a heap more because it's all proprietary tech, and it'll work. New monitor required.
 
Essentially any monitor with advanced video processing has a small frame buffer to perform the work with; most happen to be TVs instead of "gaming monitors".
 
AMD: get a DisplayPort monitor with this feature, and it'll work. Some already exist.

Nvidia: get a G-Sync monitor that costs a heap more because it's all proprietary tech, and it'll work. New monitor required.

By the time that all the pieces are in place for Free Sync (DP 1.3 standard ratified, DP 1.3 monitors available, AMD's driver released), G-Sync will be old hat. That $100 premium to use a technology years before a standard version is implemented is well worth it. It's no different than paying $100 extra to buy a top of the line graphics card now when you could just buy a mid range card with equal performance years later.
 
By the time that all the pieces are in place for Free Sync (DP 1.3 standard ratified, DP 1.3 monitors available, AMD's driver released), G-Sync will be old hat. That $100 premium to use a technology years before a standard version is implemented is well worth it. It's no different than paying $100 extra to buy a top of the line graphics card now when you could just buy a mid range card with equal performance years later.

For first to market, I guess.

It's different because that old graphics card can fit into any PCI-E slot, whereas the G-Sync monitor will only work with certain Kepler cards. Usually people put their old GPU in one of their other systems or sell it. If you're trying to sell your G-Sync TN monitor once DP 1.3 is ratified, GOOD LUCK!!! More often than not, the feature will be included in the top lines like IPS / VA / 4K monitors.

Nvidia is also going to have to deal with making sure both of its driver paths work: G-Sync and DP v1.3. Which one do you think they will prioritize in order to look better?
 
For first to market, I guess.

It's different because that old graphics card can fit into any PCI-E slot, whereas the G-Sync monitor will only work with certain Kepler cards. Usually people put their old GPU in one of their other systems or sell it. If you're trying to sell your G-Sync TN monitor once DP 1.3 is ratified, GOOD LUCK!!! More often than not, the feature will be included in the top lines like IPS / VA / 4K monitors.

Nvidia is also going to have to deal with making sure both of its driver paths work: G-Sync and DP v1.3. Which one do you think they will prioritize in order to look better?

Any G-Sync monitor will work with any other DP source, just without the frame syncing. However, for anyone who upgrades, it's absurd to think that G-Sync won't be supported by Maxwell or Volta cards. Even in the ridiculously unlikely scenario that conventional G-Sync is completely discontinued with DP 1.3 or after Kepler cards, the only depreciation is the $100 or so of the G-Sync module. There are always people looking to buy used products at a discount. You do state that IPS / VA / 4K monitors will make current G-Sync monitors obsolete, but I disagree with you there too. >60Hz monitors are a different market than IPS, VA, and 4K monitors. There are no >60Hz IPS, VA, or 4K monitors in any major company's plans, so the niche for high refresh rate monitors will still exist even if G-Sync does not.

Regarding the second point, I agree with you in that I don't believe that NVidia's drivers will adopt frame syncing over conventional DP 1.3. However, G-Sync and DP 1.3 are not exclusive features in a monitor; it's highly likely that future G-sync monitors will have other DP 1.3 inputs just like most monitors have multiple inputs. This alone would make those monitors universally compatible with both NVidia's and AMD's implementations, since all AMD's implementation needs is DP 1.3. This is similar to how modern NVidia 3D vision monitors also accept DisplayPort, which allows them to use AMD's HD3D as well. If you're willing to pay the extra up front cost for G-sync, you won't be "locked in" to one side or the other with future monitors.
 