Wednesday, December 17th 2014

AMD Working on "Dynamic Frame Rate Control" Feature

AMD is working on a new software feature for its Radeon graphics cards, which it calls "Dynamic Frame Rate Control." It was revealed informally to the web by AMD director of PR Chris Hook, who goes by the handle "AMD_Chris" on various forums. Dynamic Frame Rate Control, or DFRC, is a frame-rate limiter that yields power savings when you reduce frame-rates, probably by reducing clock speeds to achieve the desired frame-rates.

Sounds a lot like V-Sync? The way AMD describes it, DFRC is a frame-rate limiter with a slider, whereas V-Sync makes the GPU output frames to match the monitor's refresh rate. When a game runs at, say, 100 FPS and you enable V-Sync to bring that down to 60 FPS, your GPU is still running at 3D-performance clocks, unless the 3D load is so low that the driver decides to change the power state altogether. DFRC probably achieves lower frame-rates by underclocking the GPU, and raising clocks whenever the scene gets more demanding and the output FPS drops below the target. Hook describes the energy savings with DFRC as "mind blowing." This piques our curiosity.
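The pacing idea behind a frame-rate limiter can be sketched in a few lines. This is a naive software sketch that sleeps away unused frame time; by all accounts DFRC would instead lower GPU clocks at the driver level, so treat this purely as an illustration of the concept, not AMD's implementation:

```python
import time

def limit_frames(render_frame, target_fps, n_frames):
    # Naive software frame limiter: after rendering each frame,
    # sleep away whatever is left of the target frame interval.
    # A driver feature like DFRC would reportedly reduce GPU
    # clocks instead of sleeping, but the pacing goal is the same.
    interval = 1.0 / target_fps              # seconds per frame
    start = time.perf_counter()
    for _ in range(n_frames):
        t0 = time.perf_counter()
        render_frame()                       # the "game" workload
        spent = time.perf_counter() - t0
        if spent < interval:
            time.sleep(interval - spent)
    elapsed = time.perf_counter() - start
    return n_frames / elapsed                # achieved average FPS
```

The power saving follows from the idle time: any part of the frame interval the GPU is not rendering (or, in DFRC's rumored case, is running at lower clocks) is energy not spent.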
Source: 3DCenter.org
Add your own comment

67 Comments on AMD Working on "Dynamic Frame Rate Control" Feature

#51
Casecutter
Blue-Knight
Considering they are... I guess they are more worried about their pockets (at the moment of purchase); otherwise it is not logical.

And I hope they're not running their cards 24/7 or for long periods each day (8+ hours)... Otherwise it is not justifiable unless AMD is bringing them twice the performance per watt; otherwise the difference in their electricity bill, added to the amount they paid for the AMD card, exceeds what the NVIDIA card costs for the same performance (i.e. they ended up paying more for the same performance).
It hasn't been as straightforward as you contend... until the 970/980 came onto the scene. Looking at average power used, the 290/780 weren't astronomically different; W1zzard had the 780 besting the 290 by 3% in perf/W at 2560x.

As you point out, nobody games 24/7; if you game about 5 hours every day, that's 20% of the month. Do folks turn off their computers nowadays? No, they "sleep" them. For that remaining 80% of the time, AMD's ZeroCore kicks in, dropping the card to around 2 W, versus the 780 maintaining 6-7 W. I like saving power, but if I'm enjoying gaming I'm less concerned while playing than when the machine is sitting there doing nothing, suckling like a vampire, which is something Nvidia still does.
Posted on Reply
#52
HumanSmoke
Casecutter
It stinks if companies have employees posting snippets like this; to me it seems to lack control of the message. Social media, while thought to be a place to generate "buzz," in these cases just ends up muddying the supposed message, as it comes across (even from a PR guy) as rogue. AMD needs to back off these social-media snippets. If they want to cover something, just say "tomorrow"... and then provide a decent overview... if it's not ready just yet, then STFU until you can provide something useful.
Unfortunately, guerrilla marketing has proven effective in the past and is often the only way for a company to fight for mindshare when it has no new product to spearhead PR.
Nvidia selectively talked up Fermi at every opportunity, when- and wherever it could, as a stopgap between GT200 and the eventual GF100 arrival. AMD's John Fruehe did likewise with the Bulldozer architecture for years during its slow gestation, and so effective was the message (and the strongarming of forum moderators to delete dissent) that the 990X/FX boards that arrived in advance of Bulldozer sold very well, in large part due to Fruehe's overselling (and outright lying) about the CPU architecture's capabilities. Social media placement, guerrilla marketing, and company-sanctioned "leaks" let a company claim plausible deniability after the fact while taking PR credit with little actual risk (see the three pages, and counting, of this thread, and the mega-hype generated by what turned out to be a mid-range card launching in India).
Posted on Reply
#53
Blue-Knight
Casecutter
As you point out nobody games 24/7; if you game like 5hr's every day that's 20% a month. Do folks turn off their computer nowadays, no they "sleep" them.
They could let it "fold" while sleeping. :)

And I used to play 12-16* hours a day around 2008-2010.

*Values are approximate.
Posted on Reply
#54
Serpent of Darkness
NC37
So its literally just an ultra dynamic over/underclocker. Does the same thing Boost modes do, just is constantly doing it instead of being boosted the moment the game starts.

Yeah, this is going to be a buggy mess. Plus likely will be some latency as it decides if and when to clock up. Try again AMD.
Last time I used Dynamic Frame Rate Control, it didn't overclock or underclock core frequencies. Personally, I think a lot of repliers in this thread have misunderstood and over-generalized the context of what AMD_Peter was trying to say. Simply, without the b.s., he is just stating that DFRC is going to be a new feature in AMD CCC and the drivers. That's wonderful. I look forward to it. I honestly don't believe this has anything to do with regulating power consumption, or with countering G-Sync, as others have suggested.

Now, do you truly understand what it does, NC37? It's very simple. DFRC basically (unless it's been changed recently) acts as a frame-rate lock. You pick a frame rate to lock, and whatever game you are playing stays at that frame rate. Say, as an example, you wanted to play Planetside 2 at 100 FPS on the highest settings. OK. So you go into RadeonPro, input 100 into the DFRC box, and click apply. What happens after that is this: you're going to see your FPS at 100. It's constant; the variance is roughly +/- 0.5 FPS. No matter where you go, what you do, or how many other players are in the area plus the particle effects, it still shows an output of 100 FPS. So what's the catch? The catch is frame time, the reciprocal of frame rate ((frames per second)^-1 = seconds per frame; Ryan Shrout, please take note of this, because you suck at math). Frame times stretch out to meet the frame rate. So you're going to sit there watching your computer take more time to produce those 100 frames, plus the CPU dealing with shadows and particle effects, and your toon could be dead in the process because you weren't able to react fast enough in a 90-versus-90-plus NC-VS shet-feast of epic proportions. While your computer is trying to produce that 100 FPS, you're going to be rocket-primaried to the face by an aimbotter on the NC side, and you won't be quick enough to react, even though the OSD says 100 FPS.
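The frame-rate/frame-time reciprocal the post leans on is easy to sanity-check with arithmetic (a quick illustration only, not tied to any AMD tool):

```python
def frame_time_ms(fps):
    # Frame time is the reciprocal of frame rate,
    # converted to milliseconds: t = 1000 / fps.
    return 1000.0 / fps

# A 100 FPS cap gives each frame a 10 ms budget,
# while a 30 FPS cap allows about 33.3 ms per frame.
```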

Does it work? Yes. It works almost flawlessly, and the transition between frames is smooth. It's almost like having G-Sync without the G-Sync, because you get 100% frame-rate uptime. Say you're only using a 60 Hz display: most likely you're going to get 60 frames out of the 100, take a 40-frame loss, and have smooth, fluid animation without any screen-tearing or runt frames, at a cost of $0.00.

As for the members who have suggested this could be used to reduce power consumption, I can see that being true only in certain situations where you want to limit your FPS below your refresh rate throughout the game. Instead of running at 45-60 FPS for 98% of the game, with those moments where it drops to 28 FPS for the other 2%, over a 6-hour play session you could just go with 30 FPS 100% of the time, and you'd consume less power in the process.

I believe the goal of DFRC was really to find other ways of fixing the issue the AMD 7990 had when it was still king: eliminate runt frames and provide smooth, fluid gameplay at no cost to the consumer or the manufacturers. That's good for consumers, but that's not good for business.

Believe me, I know what I am talking about. I used to play around with DFRC on my AMD graphics card rig. I don't anymore, because that rig has become my music and 3D-content workstation.
Posted on Reply
#55
Nordic
Input lag is a problem with V-Sync, and with any frame limiter AFAIK. So it is not that free.
Posted on Reply
#56
Casecutter
Blue-Knight
They could let it "fold" while sleeping. :)

And I used to play 12-16* hours a day around 2008-2010.

*Values are approximate.
If folding, a GPU is working, not sleeping. ;)
Posted on Reply
#57
kn00tcn
What kind of broken-telephone game is this? Where are people getting the insane idea that the card is going to do some weird power management!?

The Google-translated 3DCenter article doesn't say it, Chris didn't say it; only TPU and some users are saying it.
Posted on Reply
#58
Prima.Vera
TheGuruStud
Vsync off is unplayable. Idk who can even use this new feature without their brain hemorrhaging.
You MUST be joking!! Seriously...
Posted on Reply
#59
RejZoR
There are more games that are crap with V-Sync than those without it. Mostly because of horrible mouse lag or horrible performance in general...
Posted on Reply
#60
nemesis.ie
Casecutter
If folding, a GPU is working, not sleeping. ;)
I'm sure he meant the owner would be sleeping not the GPU. :)
Posted on Reply
#61
natr0n
With MSI Afterburner's RTSS you can set a frame-rate limit, which in turn is a power-saving feature.

Not dynamic, but it's something.
Posted on Reply
#62
Blue-Knight
natr0n
you can set a frame-rate limit, which in turn is a power-saving feature.
That's a "new" option I noticed in many overclocking utilities such as "Zotac FireStorm", "EVGA Precision", etc...

And in my opinion it arrived late... A very useful option I've always needed! :)

There's just one problem: the minimum frame rate they allow is too high. I had to hack the application to make it set lower frame-rate limits. And it worked, so it is possible.
Posted on Reply
#63
arbiter
Blue-Knight
That's a "new" option I noticed in many overclocking utilities such as "Zotac FireStorm", "EVGA Precision", etc...
Um, I've used it for years and years; it's not a "new" option.
Posted on Reply
#64
Blue-Knight
arbiter
Um, I've used it for years and years; it's not a "new" option.
I guess I was out for too long then... :laugh:
Posted on Reply
#65
BiggieShady
It all sounds exactly like a variant of the existing clock-control algorithm both the red and green teams use... the target can be power usage, temperature, and now frame rate... wait till someone has the idea to set fan noise as a target variable and calls it (Someone Is Asleep On The Couch Mode TM)
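That "target variable" framing can be sketched as a toy proportional controller; the gain and clock range below are made-up illustrative numbers, not anything from an actual AMD or NVIDIA driver:

```python
def step_clock(clock_mhz, measured_fps, target_fps,
               gain=5.0, lo=300.0, hi=1000.0):
    # One control step: nudge the clock in proportion to the
    # FPS error, clamped to a valid clock range. Real DPM state
    # machines in GPU drivers are far more involved than this.
    error = target_fps - measured_fps
    return max(lo, min(hi, clock_mhz + gain * error))
```

Swap `measured_fps` for a temperature or power-draw reading and you have the same loop with a different target variable, which is exactly the point being made above.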
Posted on Reply
#66
Mussels
Moderprator
Browser crash ate half my post. The initial quote outright states it's an FPS cap: if you want 30 FPS on your laptop, 60 FPS on your desktop, or 144 FPS on your leet gaming machine, you should be able to pick what you want and deal with the appropriate power/heat/noise without needing Vsync.

I already do this with Afterburner/RivaTuner; I've capped my FPS to 70.


Many games that might run at 40-60 FPS in-game will spike to 100-500 FPS in various menus or loading screens, causing capacitor whine, fan ramp-up and other annoyances... and this was a simple program running in the background to fix it.

It also solves problems with screen tearing and the various games that have broken Vsync (Dead Space comes to mind, and Skyrim's micro-stutter, both fixed by doing this).

See "Frame rate limit" in the middle right.

Oh, and with my wall meter I've observed power savings of up to 150 W on my system by doing this in games that didn't work well with Vsync, which meant my fans stayed at silent levels instead of load levels.
Posted on Reply
#67
JunkBear
TRWOV
Mantle does everything AMD said it would. Game developers not using it is another issue altogether.
As for DFRC, it sounds interesting. I've yet to upgrade from a 60 Hz monitor, so this could be big for me. I'm already seeing <350 W figures while gaming, but if this can slash 10% or so it would be a welcome improvement. I guess it'll launch alongside the Hawaii refresh?
I'm not into extensive understanding of design, but is it me, or are these game designers just not into newer stuff? It's cost-effective to keep using technology that's already proven and paid for long ago. Could it also just be that they prefer Intel stuff over AMD's newer development? As stated, I'm no expert, but I can only imagine what gaming and graphics applications could be now if gaming companies used more AMD graphics technology paired with AMD GPUs. As I recall, AMD was long ahead in graphics technology over Intel, while Intel was more about CPU stuff. If companies had pushed AMD stuff more, I can tell you a GPU could be small like a USB key and still deliver astonishing quality and power in full HD.
Posted on Reply
Add your own comment