
AMD Responds to NVIDIA G-Sync with FreeSync

Joined
Apr 30, 2012
Messages
2,417
Likes
1,332
#51
Basically, to sum it up, the reason AMD showcased it on some crappy laptop is that it's the only system that could utilize this feature atm. Why is it free? Because AMD hasn't invested much in research and can't offer any kind of polished product at the moment. So I guess this ends all discussions then.
Research? The research has been there for years.

Look up PSR (Panel Self Refresh): a far better tech that is both 2D- and 3D-compliant and saves power by letting the panel refresh itself without the GPU driving it.



AMD Gaming Blog said:
Doing the work for everyone

In our industry, one of the toughest decisions we continually face is how open we should be with our technology. On the one hand, developing cutting-edge graphics technology requires enormous investments. On the other hand, too much emphasis on keeping technologies proprietary can hinder broad adoption.

It’s a dilemma we face practically every day, which is why we decided some time ago that those decisions would be guided by a basic principle: our goal is to support moving the industry forward as a whole, and that we’re proud to take a leadership position to help achieve that goal.

The latest example of that philosophy is our work with dynamic refresh rates, currently codenamed "Project FreeSync”. Screen tearing is a persistent nuisance for gamers, and vertical synchronization (v-sync) is an imperfect fix. There are a few ways the problem can be solved, but there are very specific reasons why we’re pursuing the route of using industry standards.

The most obvious reason is ease of implementation, both for us from a corporate perspective and also for gamers who face the cost of upgrading their hardware. But the more important reason is that it’s consistent with our philosophy of making sure that the gaming industry keeps marching forward at a steady pace that benefits everyone.

It sometimes takes longer to do things that way — lots of stakeholders need to coordinate their efforts — but we know it’s ultimately the best way forward. This strategy enables technologies to proliferate faster and cost less, and that’s good for everyone.

The same philosophy explains why we’re revealing technology that’s still in the development stage. Now’s our chance to get feedback from industry, media and users, to make sure we develop the right features for the market. That’s what it takes to develop a technology that actually delivers on consumers’ expectations.

And Project FreeSync isn’t the only example of this philosophy and its payoffs. We worked across the industry to first bring GDDR5 memory to graphics cards— an innovation with industry-wide benefits. And when game developers came to us demanding a low-level API, we listened to them and developed Mantle. It’s an innovation that we hope will speed the evolution of industry-standard APIs in the future.

We’re passionate about gaming, and we know that the biggest advancements come when all industry players collaborate. There’s no room for proprietary technologies when you have a mission to accomplish. That’s why we do the work we do, and if we can help move the industry forward we’re proud to do it for everyone.
The only thing Nvidia has done is try to leapfrog the standards. Who are Nvidia's partners on G-Sync? None are panel producers; all are monitor sellers. It doesn't make sense to pay royalties for tech you've already licensed only to rip it out and replace it with something else. I assume that's why G-Sync monitors cost ~$200 more: they have to recoup the cost of the parts they're removing.
 
Last edited:

Wanton

New Member
Joined
Jan 8, 2014
Messages
6
Likes
0
#52
Has anyone seen any other FreeSync demos besides that wind turbine one?
It isn't very comparable to games, where consecutive frames can differ hugely in rendering time.
If FreeSync works by predicting frame times and setting the refresh interval dynamically, rather than by — let's call it — frame holding like G-Sync, you will get stutter in games for sure, because you can't accurately predict frame rendering time. Maybe you can buffer some frames, but then you add latency.

Maybe VESA Direct Drive can help AMD achieve something like G-Sync in the future. But I think that needs more hardware on the graphics card side.
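To make the stutter/latency point concrete, here is a toy model (my own illustration, not how either technology is actually implemented): under fixed-interval v-sync a finished frame waits for the next refresh boundary, while under a "frame holding" scheme the display simply scans out as soon as the frame is ready, bounded only by the panel's maximum refresh rate. The 144 Hz ceiling is an assumed example value.

```python
import math

REFRESH_MS = 1000.0 / 60.0  # fixed 60 Hz scanout interval

def vsync_present_time(render_ms):
    """Fixed v-sync: the frame is shown at the next refresh boundary
    after rendering finishes, so a frame that just misses one boundary
    waits a whole extra interval."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def adaptive_present_time(render_ms, min_interval=1000.0 / 144.0):
    """Frame holding: the display holds the previous image and scans out
    the new frame as soon as it is ready, limited only by the panel's
    maximum refresh rate (assumed 144 Hz here)."""
    return max(render_ms, min_interval)

for render_ms in (10.0, 17.0, 20.0, 33.0):
    print(f"{render_ms:5.1f} ms render -> "
          f"v-sync {vsync_present_time(render_ms):5.2f} ms, "
          f"adaptive {adaptive_present_time(render_ms):5.2f} ms")
```

A 17 ms frame just misses the 16.67 ms boundary and is delayed to 33.33 ms under v-sync, while frame holding shows it at 17 ms — which is exactly why holding doesn't need to predict render times, whereas a prediction-based scheme would stutter whenever its guess was wrong.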
 
Joined
Apr 30, 2012
Messages
2,417
Likes
1,332
#53
Has anyone seen any other FreeSync demos besides that wind turbine one?
It isn't very comparable to games, where consecutive frames can differ hugely in rendering time.
If FreeSync works by predicting frame times and setting the refresh interval dynamically, rather than by — let's call it — frame holding like G-Sync, you will get stutter in games for sure, because you can't accurately predict frame rendering time. Maybe you can buffer some frames, but then you add latency.

Maybe VESA Direct Drive can help AMD achieve something like G-Sync in the future. But I think that needs more hardware on the graphics card side.
?

The G-Sync module is an add-on TCON that uses Direct Drive.

FreeSync uses the eDP TCON. Since it's standard, it won't need to be an add-on through Direct Drive or replace anything else.
 

Wanton

New Member
Joined
Jan 8, 2014
Messages
6
Likes
0
#54
?

The G-Sync module is an add-on TCON that uses Direct Drive.

FreeSync uses the eDP TCON. Since it's standard, it won't need to be an add-on through Direct Drive or replace anything else.
OK, but eDP is embedded DisplayPort. What about normal desktop displays? There are probably some people who game on laptops, but what about everyone else?