Thursday, December 3rd 2020

AMD Radeon RX 6800 XT Tested on Z490 Platform With Resizable BAR (AMD's SAM) Enabled

AMD's recently-introduced SAM (Smart Access Memory) feature enables users pairing an RX 6000 series graphics card with a Ryzen 5000 series CPU to take advantage of a long-lost PCIe feature in the form of Resizable BAR. However, AMD currently markets this technology only for that particular component combination, even though the underlying technology isn't AMD's own, but rather part of the PCIe specification. It's only a matter of time until NVIDIA enables the feature for its graphics cards, and there shouldn't be any technical obstacle to enabling it on Intel platforms either. Now, we have results (courtesy of ASCII.jp) from an Intel Z490 motherboard (ASUS ROG Maximus XII EXTREME) running firmware 1002, dated November 27th, paired with AMD's RX 6800 XT. And Resizable BAR does indeed work independently of the platform.

Paired with an Intel Core i9-10900K, AMD's RX 6800 XT shows performance increases across the board in the tested games - all of them titles AMD itself has confirmed SAM works with: Assassin's Creed Valhalla, Forza Horizon 4, Red Dead Redemption 2, and Rainbow Six Siege. The results speak for themselves (SAM results are the top bars in each chart): sometimes massive improvements in minimum framerates, considerable gains in average framerates, and almost no change in the maximum framerates reported for these games on this given system. Do note that the chart for Forza Horizon 4 has an error: the tested resolution is actually 1440p, not 1080p.
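For readers who want to check whether Resizable BAR is actually in effect on their own system, the BAR sizes are straightforward to inspect. Below is a minimal sketch, assuming a Linux machine (where sysfs exposes each PCI device's resources); the device address 0000:03:00.0 is a placeholder for wherever your GPU sits on the bus:

```python
# Minimal sketch: print the BAR sizes of a GPU via Linux sysfs.
# Assumption: 0000:03:00.0 is a placeholder PCI address - substitute your own.
from pathlib import Path

gpu = Path("/sys/bus/pci/devices/0000:03:00.0")

# Each line of the 'resource' file holds "start end flags" in hex,
# one line per BAR/resource.
for bar, line in enumerate((gpu / "resource").read_text().splitlines()):
    start, end, _flags = (int(field, 16) for field in line.split())
    if end > start:
        print(f"BAR {bar}: {(end - start + 1) // 2**20} MiB")

# Without Resizable BAR, a discrete GPU's VRAM aperture reads as 256 MiB;
# with it active, it should cover the full VRAM (16 GiB on an RX 6800 XT).
```

The same sizes also show up in the Region lines of `lspci -vv` output.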
Source: ASCII.jp

59 Comments on AMD Radeon RX 6800 XT Tested on Z490 Platform With Resizable BAR (AMD's SAM) Enabled

#2
Steevo
Those are some great increases in minimum frame rates. I wonder what this will do to reviews in the next few months as it rolls out?
Posted on Reply
#4
Makaveli
This looks good.

Even if the improvement were only in minimum frame rates, that's what matters most, as you feel those more than the average and the high FPS.
Posted on Reply
#5
ChosenName
It'd be ironic if the feature were to be unlocked on Intel motherboards without the user having to own a 5000 series AMD CPU.

Hopefully this will "encourage" AMD to unlock the feature on relevant motherboards, regardless of CPU installed.
Posted on Reply
#6
milewski1015
ChosenNameHopefully this will "encourage" AMD to unlock the feature on relevant motherboards, regardless of CPU installed.
This. If it's part of the PCIe spec, why is it being limited to only the most recent hardware? I'd like a boost for my 2600/5700XT
Posted on Reply
#7
Unregistered
I would love for someone to compare wattage usage before and after. Because I too can unlock more performance for "free" by increasing the power limit to 110% on my GPU.

All I can find on PCIe 4.0 is the same quote from pretty much every source: "The increased power limit instead refers to the total power draw of expansion cards. PCIe 3.0 cards were limited to a total power draw of 300 watts (75 watts from the motherboard slot, and 225 watts from external PCIe power cables). The PCIe 4.0 specification will raise this limit above 300 watts, allowing expansion cards to draw more than 225 watts from external cables."

I can't find any real evidence on whether PCIe 4.0 allows you to pull more than 75 watts from the slot or not. Anyone, please prove me wrong and show that people aren't just renaming things in order to avoid being sued.
#9
mechtech
Ferrum MasterRed Dead min FPS increase is spectacular.
Indeed. It’s in Vulcan though so would DX be the same? Also does that indicate there might be something not optimized in Vulcan or a driver??
Posted on Reply
#10
kruk
Ferrum MasterRed Dead min FPS increase is spectacular.
It's impossible for the gains to be so high. This has to be a measuring error (possibly a single dip during the test). Probably the test was only performed once per graph, which is a big "no no" when benchmarking ...

// edit: yeah, I should definitely have read the source - they performed the test 3 times. The last sentence is incorrect, but I stand behind everything else I have written.
Posted on Reply
#11
DeathtoGnomes
a long-lost PCIe feature
This was too funny.

"did you see my keys anywhere?"
Posted on Reply
#12
TheLostSwede
News Editor
Bork BorkI would love for someone to compare wattage usage before and after. Because I too can unlock more performance for "free" by increasing the power limit to 110% on my GPU.

All I can find on PCIe 4.0 is the same quote from pretty much every source: "The increased power limit instead refers to the total power draw of expansion cards. PCIe 3.0 cards were limited to a total power draw of 300 watts (75 watts from the motherboard slot, and 225 watts from external PCIe power cables). The PCIe 4.0 specification will raise this limit above 300 watts, allowing expansion cards to draw more than 225 watts from external cables."

I can't find any real evidence on whether PCIe 4.0 allows you to pull more than 75 watts from the slot or not. Anyone, please prove me wrong and show that people aren't just renaming things in order to avoid being sued.
Do you even understand how this works? It has nothing to do with extra power. Also, how would the card know that the slot can deliver more power than the PCIe spec allows? And in this case, we're looking at a PCIe 3.0 motherboard with a PCIe 3.0 CPU, so can you please explain your flawed logic here?
This works by allowing the CPU to address the full amount of memory on the GPU, rather than just having access to a limited part of it. As such, more data can be shuffled between the two, which leads to increased performance.
Posted on Reply
#13
mechtech
Bork BorkI would love for someone to compare wattage usage before and after. Because I too can unlock more performance for "free" by increasing the power limit to 110% on my GPU.

All I can find on PCIe 4.0 is the same quote from pretty much every source: "The increased power limit instead refers to the total power draw of expansion cards. PCIe 3.0 cards were limited to a total power draw of 300 watts (75 watts from the motherboard slot, and 225 watts from external PCIe power cables). The PCIe 4.0 specification will raise this limit above 300 watts, allowing expansion cards to draw more than 225 watts from external cables."

I can't find any real evidence on whether PCIe 4.0 allows you to pull more than 75 watts from the slot or not. Anyone, please prove me wrong and show that people aren't just renaming things in order to avoid being sued.
My clothes dryer uses 5500W. That’s all the plug can supply. ;)
Posted on Reply
#14
dj-electric
dj-electricImagine a hypothetical situation, where Intel's platform could see a higher performance gain from enabling Resizable BAR...
:laugh:
Posted on Reply
#15
TheLostSwede
News Editor
krukIt's impossible for the gains to be so high. This has to be a measuring error (possibly a single dip during the test). Probably the test was only performed once per graph, which is a big "no no" when benchmarking ...
Why is it impossible? If you're going to make a claim like that, then you'd better back it up with some facts.
It's minimum FPS we're talking about here, so say there's some kind of bug in the game that gets worked around by enabling a wider "memory interface" between the CPU and GPU - why wouldn't we see a huge jump in performance for the minimum FPS?
Obviously I'm just speculating here, but you didn't even do that, you just said it's impossible or that the tester made a mistake. The latter is highly unlikely, as ASCII Japan doesn't make blunders like that; the journalists working there are not some n00bs.
Also, the average FPS is only up around 9 fps, which seems to be in line with the other games tested. As such, I still believe we're seeing some game-related issues that got worked around by enabling the Resizable BAR option.
Posted on Reply
#16
mastrdrver
mechtechIndeed. It’s in Vulcan though so would DX be the same? Also does that indicate there might be something not optimized in Vulcan or a driver??
krukIt's impossible for the gains to be so high. This has to be an measuring error (possibly a single dip during the test). Probably the test was only performed once per graph, which is a big "no no" when benchmarking ...
It's because RDR2 will stutter every once in a while when playing in borderless window mode. In full screen, Vulkan does not have this issue, and DX does not have this issue in either mode.
Posted on Reply
#17
milewski1015
mastrdrverIt's because RDR2 will stutter every once in a while when playing in borderless window mode. In full screen, Vulkan does not have this issue, and DX does not have this issue in either mode.
So why would that be remedied with BAR enabled? Or are you suggesting that they ran the "SAM off" test in borderless and then the "SAM on" test in fullscreen?
Posted on Reply
#18
Unregistered
TheLostSwedeDo you even understand how this works? It has nothing to do with extra power. Also, how would the card know that the slot can deliver more power than the PCIe spec allows? And in this case, we're looking at a PCIe 3.0 motherboard with a PCIe 3.0 CPU, so can you please explain your flawed logic here?
This works by allowing the CPU to address the full amount of memory on the GPU, rather than just having access to a limited part of it. As such, more data can be shuffled between the two, which leads to increased performance.
Ahh yes, I completely forgot that the ASUS ROG Maximus XII EXTREME is a PCIe 3.0 board, my bad. Same goes for the CPU. I'm sorry I forgot to take that into consideration, therefore making the board useless for Rocket Lake as well by not being able to utilize PCIe 4.0 either. Thank you for pointing that out. Now I can go back to making food. And once again, thank you for pointing out the flaw in my logic. Now I know that I won't be buying an ASUS ROG Maximus XII EXTREME because it'll never support PCIe 4.0.
mechtechMy clothes dryer uses 5500W. That’s all the plug can supply. ;)
And I like potatoes. ;)
#19
Rahnak
This is very interesting. And as this feature is neither limited to PCIe 4.0 nor dependent on some magic in the CPU, I hope AMD goes back and enables this across the Zen range.
Posted on Reply
#20
TheLostSwede
News Editor
Bork BorkAhh yes, I completely forgot that the ASUS ROG Maximus XII EXTREME is a PCIe 3.0 board, my bad. Same goes for the CPU. I'm sorry I forgot to take that into consideration, therefore making the board useless for Rocket Lake as well by not being able to utilize PCIe 4.0 either. Thank you for pointing that out. Now I can go back to making food. And once again, thank you for pointing out the flaw in my logic. Now I know that I won't be buying an ASUS ROG Maximus XII EXTREME because it'll never support PCIe 4.0.
Eh? Once the new CPUs are out next year, this board should support PCIe 4.0, but only for the x16 slot and possibly for one M.2 slot, depending on their design.
Also, I don't know what this has to do with anything, as the test that was performed was on PCIe 3.0 and that's what we're discussing here, if I'm not mistaken?
Posted on Reply
#21
Unregistered
TheLostSwedeEh? Once the new CPUs are out next year, this board should support PCIe 4.0, but only for the x16 slot and possibly for one M.2 slot, depending on their design.
Also, I don't know what this has to do with anything, as the test that was performed was on PCIe 3.0 and that's what we're discussing here, if I'm not mistaken?
But you just said that it's a PCIe 3.0 motherboard so that's preposterous!
#22
TheLostSwede
News Editor
Bork BorkBut you just said that it's a PCIe 3.0 motherboard so that's preposterous!
Seriously? :rolleyes:
Posted on Reply
#23
Steevo
Bork BorkBut you just said that it's a PCIe 3.0 motherboard so that's preposterous!
I don't know what you are on about, but it has NOTHING to do with the very relevant topic at hand; maybe have a snack or a juice box and come back to it later. Let me try and spell out what this improvement does, and does NOT do.

1) SAM and the PCIe 3 or 4 spec have piss all to do with what you are suggesting.
2) SAM does NOT increase the power the GPU can use.
3) SAM DOES allow the CPU to push more data to the GPU at once, instead of being limited by the prior frame buffer "window".

Imagine the GPU needs 2 GB of data from system memory. Previously it had to be sent in, let's say, 32 MB chunks, and then the GPU had to manage moving it all around once it was there and mapped into the GPU memory address space. While the GPU waits for texture data that didn't arrive in the first load, it stalls, causing lag, lower frame rates, and stutters.

The PCIe spec allows for larger data chunks to be pushed, but no one implemented it; maybe it was overlooked, maybe drivers for GPUs and other devices would have faults if it were enabled, maybe the timing choice was made to ease CPU load and prevent buffer overflows.

Modern GPUs and CPUs are able to handle DMA transfers, and with the increased core counts, maybe we have finally reached the point where larger transfers - or GPUs directly accessing system memory, with the driver running on the CPU managing it - provide increased performance instead of a loss.

Maybe the trade-off was that having the driver check whether larger transfers are actually faster would require more coding for every scenario out there, making it more difficult and potentially leading to unpredictable results.
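To put rough numbers on that chunking overhead, here's a toy model in Python. The figures are illustrative only: the 32 MB window comes from the example above, while the remap cost and bus bandwidth are assumptions made up for the sketch:

```python
# Toy model (illustrative numbers only): time to move a payload to the GPU
# through a small BAR window vs. one big mapping.
WINDOW = 32 * 2**20            # the 32 MB chunks from the example above
REMAP_OVERHEAD_US = 50         # assumed cost of moving the window (made up)
BYTES_PER_US = 16_000          # ~16 GB/s, ballpark PCIe 4.0 x16 throughput

def upload_time_us(payload_bytes: int, window: int = WINDOW) -> float:
    chunks = -(-payload_bytes // window)   # ceiling division
    return payload_bytes / BYTES_PER_US + chunks * REMAP_OVERHEAD_US

payload = 2 * 2**30  # the 2 GB of data from the example above
print(f"32 MB window: {upload_time_us(payload) / 1000:.1f} ms")
print(f"full mapping: {upload_time_us(payload, window=payload) / 1000:.1f} ms")
```

Even in this crude model the bulk transfer time barely changes; what disappears is the per-chunk stall, which hurts scattered, latency-sensitive accesses the most - consistent with minimum FPS improving far more than the average.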
Posted on Reply
#24
kruk
TheLostSwedeWhy is it impossible? If you're going to make a claim like that, then you'd better back it up with some facts.
It's minimum FPS we're talking about here, so say there's some kind of bug in the game that gets worked around by enabling a wider "memory interface" between the CPU and GPU - why wouldn't we see a huge jump in performance for the minimum FPS?
Obviously I'm just speculating here, but you didn't even do that, you just said it's impossible or that the tester made a mistake. The latter is highly unlikely, as ASCII Japan doesn't make blunders like that; the journalists working there are not some n00bs.
Also, the average FPS is only up around 9 fps, which seems to be in line with the other games tested. As such, I still believe we're seeing some game-related issues that got worked around by enabling the Resizable BAR option.
You have multiple graphs where the difference is a few percent, and then one where the minimum triples - obviously that's a large red flag. If enabling Resizable BAR triggers something in the driver that makes it work correctly and the FPS more stable, this doesn't mean that Resizable BAR improves the minimum FPS threefold. First you wait for AMD to fix the drivers so they work properly, and then you test again. If AMD can't fix the drivers, then your theory is obviously correct and Resizable BAR can also fix game bugs.

Also, you don't just publish min FPS results for the obvious outlier, as this will raise questions (everyone who has ever done benchmarking will be skeptical of the results ...). You publish frametime graphs, which explain in detail what changed when Resizable BAR got turned on. Isn't it obvious? It doesn't seem to be for them ...
Posted on Reply
#25
Searing
krukYou have multiple graphs where the difference is a few percent, and then one where the minimum triples - obviously that's a large red flag. If enabling Resizable BAR triggers something in the driver that makes it work correctly and the FPS more stable, this doesn't mean that Resizable BAR improves the minimum FPS threefold. First you wait for AMD to fix the drivers so they work properly, and then you test again. If AMD can't fix the drivers, then your theory is obviously correct and Resizable BAR can also fix game bugs.

Also, you don't just publish min FPS results for the obvious outlier, as this will raise questions (everyone who has ever done benchmarking will be skeptical of the results ...). You publish frametime graphs, which explain in detail what changed when Resizable BAR got turned on. Isn't it obvious? It doesn't seem to be for them ...
Min FPS is the result of bottlenecking; SAM removes the bottleneck, and you get more of the performance you are already receiving (the average FPS).

This isn't complicated.
Posted on Reply