
AMD Brings Smart Access Memory (Resizable BAR) Support to Ryzen 3000 Series

btarunr

Editor & Senior Moderator
AMD, at its "Where Gaming Begins Episode 3" online event, announced that it is bringing Smart Access Memory (Resizable Base Address Register) support to Ryzen 3000 series "Matisse" processors based on the "Zen 2" microarchitecture. The Ryzen 3 3200G and Ryzen 5 3400G are excluded. On the AMD platform, this PCI-SIG-standardized feature had until now been restricted to the Ryzen 5000 series, although it is already widely available on the Intel platform. Resizable BAR lets the CPU address the graphics card's entire dedicated memory as a single block, rather than through 256-megabyte apertures. For game engines able to take advantage of the feature, this can translate to a performance boost of up to 16 percent. Be on the lookout for BIOS updates from your motherboard manufacturer.
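
For readers who want to verify that the larger BAR is actually exposed after a BIOS update, one rough way (on Linux, at least) is to check the size of the GPU's prefetchable memory regions. The sketch below is purely illustrative and not an official detection method: it assumes a Linux system with the lspci utility available and treats "prefetchable BAR larger than 256 MB" as a heuristic for Resizable BAR being in effect.

```python
# Illustrative sketch (not an official check): on Linux, parse `lspci -vv`
# and flag GPUs whose prefetchable memory region exceeds the classic 256 MB
# aperture, which usually means a resized BAR is in effect.
import re
import subprocess

def gpu_prefetchable_bars():
    """Map each VGA-class device to the sizes (in MB) of its prefetchable regions."""
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    sizes, current = {}, None
    for line in out.splitlines():
        if line and not line[0].isspace():
            # Device headers are unindented, e.g. "0b:00.0 VGA compatible controller: ..."
            current = line if "VGA compatible controller" in line else None
        elif current and "prefetchable" in line and "non-prefetchable" not in line:
            # e.g. "Region 0: Memory at e0000000 (64-bit, prefetchable) [size=16G]"
            m = re.search(r"\[size=(\d+)([KMG])\]", line)
            if m:
                mb = int(m.group(1)) * {"K": 1 / 1024, "M": 1, "G": 1024}[m.group(2)]
                sizes.setdefault(current, []).append(mb)
    return sizes

if __name__ == "__main__":
    for dev, regions in gpu_prefetchable_bars().items():
        status = "likely active" if any(s > 256 for s in regions) else "likely not active"
        print(f"{dev}\n  prefetchable BARs (MB): {regions} -> resizable BAR {status}")
```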



View at TechPowerUp Main Site
 
I said this at the time it launched: I bet that the 5000 series limit is purely at launch, and it'll be backported to older CPUs and mobos over time



And it has
 
Does that include zen & zen+ o_O
 

[image: andre-3000.jpg]
 
No, looks like 3000 series only, which is reasonable enough. Two generations of support seems fair.
 

I mean, especially considering it really doesn't matter at the end of the day. Don't something like 95% of games only get a 2-3 FPS gain? I'm not sure, I haven't checked the numbers lately, but from what I remember very few games actually benefit more than 5% from it. That being said, I have had SAM enabled on my 5600X and RX 6800 non-XT since launch with zero issues, but I did notice I can get a higher RAM OC when I have SAM off... so there are tradeoffs, because a higher RAM OC gives gains too. So yeah, I'd say SAM breaks about even for those who don't want to pedal-to-the-metal OC their RAM (which I don't, for longevity reasons). I'm happy with SAM on and 2x 16 GB at 3466 14-14-14 and 1.370 V. To hit 4000 1:1 I needed SAM off and 1.5 V; that probably gives me more gains than SAM on, but eh, it's just not worth it long term, the heat, etc.
 
Nvidia is only enabling it on a per game basis as well, so it's going to have a pretty limited effect regardless
 

Aye, AMD would have been better off investing resources into more polish and more games supporting their competing version of DLSS 2.0. That's where Nvidia wins big, big respect from me. DLSS 2.0, and whatever AMD's equivalent will be, is the future. And AMD had a unique advantage in having both next-gen consoles, which would give them greater /flex when convincing game companies to use their variant over Nvidia's.

I would have put SAM on the back burner and invested more resources into the equivalent of DLSS 2.0 if I were an AMD executive. But eh, it is what it is.
 
I would think faster GPUs that can run games better at higher resolutions are the future, not upscaling to compensate for how poor RT performance is at the moment, but I guess that is just me...
 

We need both, unfortunately. My Ryzen 5600X and RX 6800 non-XT can't even run World of Warcraft at 165 FPS on a 165 Hz 1080p monitor... it can in dungeons maxed out, but not in the new Shadowlands cities (even when the cities are not busy), where I only get around 120 FPS. And yes, I can tell a difference between 120 and 165; I really prefer the smoothness of 165. I mean, it's a 16-year-old game and the latest hardware still can't max it out... at least for what I want. So yes, we need both improvements: hardware and a new form of DLSS 2.0, all of the above. I'd love to play AC Valhalla at 165 FPS at 165 Hz, but my setup at 1080p will only pull off around 102 FPS... turn it down to medium and maybe I'd get 140-ish... but the game would be more immersive to me with a steady 165 FPS. I'm not trying to sound snobby about this, it's just what I enjoy. I enjoy the smoothness /shrug
 
Excellent. Looking forward to getting my X570 board and 3700X ready for when the appropriate RTX 3080 driver drops.
it's just what I enjoy. I enjoy the smoothness /shrug
Don't feel bad about that at all, every person has different preferences for visual fidelity and framerates.

I really enjoy enabling DLSS where it's supported. To my eye, especially in motion, the differences are often imperceptible, even in lower base-resolution modes. And, for argument's sake, and for my personal preference on the balance of IQ and FPS, I'd happily deal with a slightly softer image for a significant uptick in FPS. I mean, some of these differences you need to compare screenshots to nitpick, but the difference between, say, 60 FPS and 90 FPS takes no nitpicking whatsoever to perceive; it's bloody obvious. Again, my preferences.

This era of extremely competent upscaling techniques is very interesting to me. Another example: I'd much rather play games with all the latest visual techniques, like RTRT, take the hit in resolution, and let upscaling do its best to claw that back toward native, as opposed to playing the game without the latest and greatest visuals but hitting my performance target natively.
 
I would think faster GPUs that can run games better at higher resolutions are the future, not upscaling to compensate for how poor RT performance is at the moment, but I guess that is just me...
I'm fine with games using "cheats" to get more performance. Lighting effects have been faked for years, and real-time ray tracing hasn't even reached its final form. You still get many people saying that DXR is too soon because it doesn't make rasterization feel old yet. Any performance gains in ray tracing are probably going to be translated into heavier effects in newer games.

I've been shunned for saying this, but asking for better graphics, higher resolution, and 120 FPS to become the new minimum through brute force alone can only happen if we get some kind of major, historical technical breakthrough where the hardware starts to evolve way faster than the software. 100% more performance gen-to-gen has happened before, but it quickly got neutered by the software catching up.
 
I said this at the time it launched: I bet that the 5000 series limit is purely at launch, and it'll be backported to older CPUs and mobos over time

If you had looked over at what Linux does, that would have been clear as day a while ago.
Heck, even my old Vega 64 on a 2700X (Zen+) has it under Linux, albeit with the note that such older hardware is not guaranteed to be free of quirks/problems related to it.
Take a look at this Phoronix forum thread about it, started by an engineer working at AMD (Marek Olšák).

(Ah yes, the good ol' "when in doubt about a hardware feature, look at what Linux does about it to get a better idea". A whole 60% of the time, it works every time.)
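
If you want to check what your own card exposes under Linux without extra tools, the BAR sizes can also be read straight out of sysfs. This is only a minimal sketch under assumed conditions (an AMD GPU with vendor ID 0x1002 and the standard /sys/bus/pci layout); each populated line of a device's resource file is a start/end/flags triple, so the size is end minus start plus one.

```python
# Minimal sketch: list PCI BAR sizes for AMD display adapters via sysfs.
# Assumes a Linux system; the first six lines of the "resource" file
# correspond to BAR0..BAR5 (unpopulated BARs read as all zeros).
from pathlib import Path

AMD_VENDOR_ID = "0x1002"

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    try:
        if (dev / "vendor").read_text().strip() != AMD_VENDOR_ID:
            continue
        if not (dev / "class").read_text().startswith("0x03"):  # display controller class
            continue
        resource_lines = (dev / "resource").read_text().splitlines()
    except OSError:
        continue
    print(dev.name)
    for index, line in enumerate(resource_lines[:6]):  # BAR0..BAR5
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:
            print(f"  BAR{index}: {(end - start + 1) / (1024 * 1024):.0f} MB")
```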
 
Can't wait to use it with my brand new 6800 XT, oh wait...
 
Ya know, I'd ask about the 4000 series, but thought nah I'd prolly get laughed at. :kookoo: :shadedshu:
 
We need both, unfortunately. My Ryzen 5600X and RX 6800 non-XT can't even run World of Warcraft at 165 FPS on a 165 Hz 1080p monitor... it can in dungeons maxed out, but not in the new Shadowlands cities (even when the cities are not busy), where I only get around 120 FPS. And yes, I can tell a difference between 120 and 165; I really prefer the smoothness of 165. I mean, it's a 16-year-old game and the latest hardware still can't max it out... at least for what I want. So yes, we need both improvements: hardware and a new form of DLSS 2.0, all of the above. I'd love to play AC Valhalla at 165 FPS at 165 Hz, but my setup at 1080p will only pull off around 102 FPS... turn it down to medium and maybe I'd get 140-ish... but the game would be more immersive to me with a steady 165 FPS. I'm not trying to sound snobby about this, it's just what I enjoy. I enjoy the smoothness /shrug

Show off, I'm playing Red Dead Online at like 30-35 FPS and am so happy when it jumps to 43 or so for a brief moment, so... smooth :'(

I'm fine with games using "cheats" to get more performance. Lighting effects have been faked for years, and real-time ray tracing hasn't even reached its final form. You still get many people saying that DXR is too soon because it doesn't make rasterization feel old yet. Any performance gains in ray tracing are probably going to be translated into heavier effects in newer games.

I've been shunned for saying this, but asking for better graphics, higher resolution, and 120 FPS to become the new minimum through brute force alone can only happen if we get some kind of major, historical technical breakthrough where the hardware starts to evolve way faster than the software. 100% more performance gen-to-gen has happened before, but it quickly got neutered by the software catching up.

Feel old, no, but it does reveal how, erm, badly faked the fakes sometimes are. Cyberpunk (as much of a POS as that game is) has pretty cool comparison options for running it with RT on vs. off; I think Digital Foundry had a nice vid about it, and it shows how rasterization sometimes has it completely wrong.

What I do wonder, though (I have no idea how this really works): could they not, like... run an RT pass on a game, see how it SHOULD be lit if it were realistic, and then fake that look using rasterization?
 
My Ryzen 3900X has already been working with this for months now...
 
Show off, I'm playing Red Dead Online at like 30-35 FPS and am so happy when it jumps to 43 or so for a brief moment, so... smooth :'(



Feel old, no, but it does reveal how, erm, badly faked the fakes sometimes are. Cyberpunk (as much of a POS as that game is) has pretty cool comparison options for running it with RT on vs. off; I think Digital Foundry had a nice vid about it, and it shows how rasterization sometimes has it completely wrong.

What I do wonder, though (I have no idea how this really works): could they not, like... run an RT pass on a game, see how it SHOULD be lit if it were realistic, and then fake that look using rasterization?
for a "dynamic" world that might be hard to pull off... maybe for some shadows, but true Global illumination and reflections would be hard to do if you got a character that can have any combination of weapons/clothing/colors, then you get the lights coming from skills as well. You would have to ray trace and bake a lot of different combination...

I used unity a bit, and baking is a bit tedious, there's a lot of steps. A simple scene took 30min to render on my 3700x. At his peak real time Ray tracing might enable a workflow closer to what lighting artist are doing in VFX where you stop to care about what is ray traced or not, but just play around with a few sliders to get a "mood".
However, Lumen of unreal engine 5 is interesting. It's real time lighting without any kind of baking that isn't ray traced. It looked really great in the demo, but we'll need a comparison to see just how good it really is.
 
So hold on, this is only if you have an RX 6000 card? Or does this work with RTX 3000 too?
 
Aye, AMD would have been better off investing resources into more polish and more games supporting their competing version of DLSS 2.0. That's where Nvidia wins big, big respect from me. DLSS 2.0, and whatever AMD's equivalent will be, is the future. And AMD had a unique advantage in having both next-gen consoles, which would give them greater /flex when convincing game companies to use their variant over Nvidia's.

I would have put SAM on the back burner and invested more resources into the equivalent of DLSS 2.0 if I were an AMD executive. But eh, it is what it is.

I'd take SAM over DLSS any day. The number of games that support DLSS is paltry, and the performance gain isn't huge across the board. Some games benefit by as little as 3%. SAM just works across the board.

Feel old, no, but it does reveal how, erm, badly faked the fakes sometimes are. Cyberpunk (as much of a POS as that game is) has pretty cool comparison options for running it with RT on vs. off; I think Digital Foundry had a nice vid about it, and it shows how rasterization sometimes has it completely wrong.

Rasterization is whatever the devs decided they wanted the area's lighting to look like. It's not "wrong" per se; it's as designed by the devs. RT is a lighting simulation that's closer to realism, but that doesn't mean it's going to be inherently better than a well-designed scene with rasterization. If the game devs can better convey their artistic vision with rasterization, and that happens to be unrealistic lighting-wise, so be it.

for a "dynamic" world that might be hard to pull off... maybe for some shadows, but true Global illumination and reflections would be hard to do if you got a character that can have any combination of weapons/clothing/colors, then you get the lights coming from skills as well. You would have to ray trace and bake a lot of different combination...

I used unity a bit, and baking is a bit tedious, there's a lot of steps. A simple scene took 30min to render on my 3700x. At his peak real time Ray tracing might enable a workflow closer to what lighting artist are doing in VFX where you stop to care about what is ray traced or not, but just play around with a few sliders to get a "mood".
However, Lumen of unreal engine 5 is interesting. It's real time lighting without any kind of baking that isn't ray traced. It looked really great in the demo, but we'll need a comparison to see just how good it really is.

The Unreal Engine 5 demo is pretty impressive. It has indirect lighting, zero baking, and indirect diffuse with unlimited bounces. That's far more than current graphics cards can render via ray tracing.

 
Rasterization is whatever the devs decided they wanted the area's lighting to look like. It's not "wrong" per se; it's as designed by the devs. RT is a lighting simulation that's closer to realism, but that doesn't mean it's going to be inherently better than a well-designed scene with rasterization. If the game devs can better convey their artistic vision with rasterization, and that happens to be unrealistic lighting-wise, so be it.

Well, not entirely. If they put a window somewhere and the sun is out and no light shines through the window... I mean, yeah, unless it's really meant to be designed that way, with your main character going nuts or it being horror or something, sure... but otherwise it's simply just wrong.

Like in Cyberpunk, there is an elevator where the bright sunlight inside the building is clearly visible.
Rasterized, it's barely lit at all and has light (a lamp) coming from the ceiling in the elevator itself;
with RT, however, you can see the outside light would be WAY brighter, completely drowning out the ceiling light... just like it would be in real life. So that is not really artistic vision, it's just wrong.
 
We need both, unfortunately. My Ryzen 5600X and RX 6800 non-XT can't even run World of Warcraft at 165 FPS on a 165 Hz 1080p monitor... it can in dungeons maxed out, but not in the new Shadowlands cities (even when the cities are not busy), where I only get around 120 FPS. And yes, I can tell a difference between 120 and 165; I really prefer the smoothness of 165. I mean, it's a 16-year-old game and the latest hardware still can't max it out... at least for what I want. So yes, we need both improvements: hardware and a new form of DLSS 2.0, all of the above. I'd love to play AC Valhalla at 165 FPS at 165 Hz, but my setup at 1080p will only pull off around 102 FPS... turn it down to medium and maybe I'd get 140-ish... but the game would be more immersive to me with a steady 165 FPS. I'm not trying to sound snobby about this, it's just what I enjoy. I enjoy the smoothness /shrug
How many FPS are enough? You have a serious gaming machine. I know I'm a hypocrite though: it's Thursday, there is soccer on (it's been on; don't let work know), and I am a few bevies in, as I will be getting another 6800 XT.
 
I'd take SAM over DLSS any day. The number of games that support DLSS is paltry, and the performance gain isn't huge across the board. Some games benefit by as little as 3%. SAM just works across the board.



Rasterization is whatever the devs decided they wanted the area's lighting to look like. It's not "wrong" per se; it's as designed by the devs. RT is a lighting simulation that's closer to realism, but that doesn't mean it's going to be inherently better than a well-designed scene with rasterization. If the game devs can better convey their artistic vision with rasterization, and that happens to be unrealistic lighting-wise, so be it.



The Unreal Engine 5 demo is pretty impressive. It has indirect lighting, zero baking, and indirect diffuse with unlimited bounces. That's far more than current graphics cards can render via ray tracing.


SAM does not work across the board; some titles see negative results. It's blocked at the driver level and enabled per game once a title has been confirmed to get a positive result.
Over time we'll end up with a lot of titles getting support, and that'll be great, but it's not a magic on-switch for free FPS in every title.
 
How many FPS are enough? You have a serious gaming machine. I know I'm a hypocrite though: it's Thursday, there is soccer on (it's been on; don't let work know), and I am a few bevies in, as I will be getting another 6800 XT.


Hmm, I'd say fluctuations in the 150-165 range are about perfect to my eyes. When I get 103-120 fluctuations in the Shadowlands cities it ruins my immersion a bit; it's not horrible, don't get me wrong, but it's not as smooth-looking to the eyes. I have seen a 240 Hz monitor btw, and I actually didn't like 240 Hz gaming; it almost feels like a soap opera. The 150-180 Hz range, I think, is the ultimate sweet spot. 140-190 is probably my perfect target area, no higher, no lower. I'm hoping someday to upgrade my 1080p to a 27" 1440p 190 Hz or so... it might be coming soon. I know Asus has a 180 Hz one coming, so maybe I will look into that one.
 