
AMD Brings Smart Access Memory (Resizable BAR) Support to Ryzen 3000 Series

Joined
Jun 2, 2017
Messages
7,920 (3.15/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64; Steam, GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Hmm, I'd say fluctuations in the 150-165 range are about perfect to my eyes. When I get 103-120 fluctuations in the Shadowlands cities it ruins my immersion a bit; it's not horrible, don't get me wrong, but it's not as smooth looking. I have seen a 240 Hz monitor, by the way, and I actually didn't like 240 Hz gaming; it almost feels like a soap opera. The 150-180 Hz range is the ultimate sweet spot, I think, and 140-190 is probably my perfect target area, no higher, no lower. I'm hoping someday to upgrade my 1080p to a 27" 1440p ~190 Hz monitor. It might be coming soon; I know Asus has a 180 Hz one coming, so maybe I will look into that one.
Got it, I did not realize you meant 1080p. Once you go to 1440p, 1080p will look like 720p does today.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,981 (4.60/day)
Location
Kepler-186f
I have seen 1440p; I used to own a 27" 1440p QNIX monitor but sold it to someone, and I honestly regret selling it. BUT this is a really high-quality 1080p: 12-bit IPS, 0.5 ms, lots of new tech in it. Got it at launch from Best Buy for $169 with free shipping. It's the Acer version of those new 23.8" IPS panels, but it looks gorgeous, way better than most 1080p. But yes, I know what you mean as far as clarity goes.
 
The QNIX 2710? That monitor was truly epic! A really good 1080P IPS does look sweet too though.
 

Yep (it might have been the X-Star brand, I can't remember; they were the same panel, though). Plus, keep in mind I will be taking an fps hit at 1440p, and I have a target fps range I intend to hit. For example, AC Valhalla I will play on my 1080p IPS even when I do have 1440p again, so I can turn down settings a little and hit at least the 130 fps range or so.

But games that can hit 130+ at 1440p, and there are a lot, I will play on 1440p. Plus it will be nice having two monitors, so it's a win-win. :)
 
You got it. I love gaming on my 1440p monitor, but visuals are sweet on my 4K monitor, so I use that to watch videos.
 

Yep, and I intend to get a nice 4K OLED TV someday (probably a couple of years from now, alongside, hopefully, an updated PS5 model whose VRMs don't reach 95 °C...), and games on PC that are capped to 30 or 60 fps I will play on the 4K TV. So I am going to have the trifecta of monitors going, so to speak. For now I am holding off on the PS5 until they fix the controller drift and the 95 °C VRMs.
 
Joined
Mar 11, 2019
Messages
294 (0.16/day)
No, it looks like 3000 series only, which is reasonable enough. Two generations of support seems fair.

I've seen results of ReBAR on Zen 1; there is no missing capability on the older processors, it's an artificial limitation.

Nvidia is only enabling it on a per-game basis as well, so it's going to have a pretty limited effect regardless.

Feel free to try it on more games.



SAM does not work across the board; some titles see negatives. It's blocked at the driver level and enabled on games once they've been confirmed to get a positive result.

"SAM" is all or nothing, and that "all" includes the downsides, plus a reboot is required to turn it off when performance is negatively affected.

Nvidia isn't doing "SAM"; they went one better and tied the capability into their profile system, so that games not whitelisted use traditional 256 MB uploads.

Another unfortunate situation where AMD is first to something, but their implementation is half-arsed and done better by the competition.
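To make the "256 MB uploads" point above concrete: with a legacy BAR, the CPU only sees VRAM through a 256 MB window, so large uploads mean repeatedly re-pointing that window, whereas a resized BAR exposes all of VRAM at once. A rough back-of-the-envelope sketch (all sizes here are made-up illustrations, not measurements from any driver):

```python
# Sketch: how many aperture remaps a CPU -> VRAM upload needs with a
# traditional 256 MiB BAR window versus a resized (full-VRAM) BAR.
# All numbers are hypothetical and purely illustrative.

MiB = 1024 * 1024

def remaps_needed(upload_bytes: int, bar_window_bytes: int) -> int:
    """One remap per window-full: each time the upload crosses the end of
    the currently mapped window, the driver must re-point the window."""
    return -(-upload_bytes // bar_window_bytes)  # ceiling division

vram = 16 * 1024 * MiB     # a hypothetical 16 GiB card
upload = 3 * 1024 * MiB    # a hypothetical 3 GiB asset upload

small_bar = 256 * MiB      # legacy 256 MiB BAR window
large_bar = vram           # resizable BAR exposing all of VRAM

print(remaps_needed(upload, small_bar))  # 12 remaps through the small window
print(remaps_needed(upload, large_bar))  # 1 (whole VRAM visible at once)
```

This is only the mapping arithmetic; the real-world gain (or occasional loss) per game depends on access patterns, which is presumably why both vendors gate it per title.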
 
I have the Sony XBR65X900H in my Amazon cart. It is supposed to be a housewarming gift for my sister. It literally dropped $100 overnight to $1598. I know it's not OLED, but it is better than any TV I currently own. I might make it two, but that would kill my budget.
 

Costco has the 55" LG CX OLED flagship for $1349 with an extra warranty included free. I'd rather move my couch up a little and get the 55" over the 65". OLED is just too lovely, imo.
 
I should have said I quoted CAD. That Sony TV would be $1262 US. That is still a sweet deal, though.
 
Joined
Jul 13, 2016
Messages
2,834 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Well, not entirely. If they put a window somewhere and the sun is out but no light shines through the window... I mean, yeah, unless it's really meant to be designed that way, with your main character going nuts, or it being horror or something, sure... but otherwise it's simply just wrong.

No, that'd just be an oversight from the developers. Forgetting to place a light source would have the same result if they were using RT as well. Of course it's wrong; it'd be wrong no matter what lighting technology you are using. I don't get the point of your comment; it's like you are implying that an accident on the dev's end somehow makes rasterization wrong. Makes no sense.
 
Joined
Feb 11, 2009
Messages
5,398 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4
Cooling Tuniq Tower 120 -> Custom watercooling loop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTPC case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Well, OK, let's take a step back. Games in general are meant to look realistic, right? Even in a fantasy game like, for example, Zelda, the sun emits light, and when that light is blocked by, I don't know, a mountain, it should cast a shadow, etc. We get this.
RT makes that lighting realistic, at the cost of performance, yes, but it's correct and realistic.
Rasterization is faked lighting (so to speak) and thus can actually be unrealistic; we accept how it looks in general, but that does not make it correct.

My original question was: can't the devs do an RT pass before the lighting has to be done, just to see how it should look if it were realistic, then use that as a reference when doing the faked lighting in rasterization, so it looks as realistic as RT would without the performance penalty?

(Now, obviously I get that RT lighting is also dynamic and you lose that, sure, but there is plenty of static light that is revealed to be more realistic with RT and could seemingly easily be faked. In Cyberpunk there is a bench with a light above it, used in Digital Foundry's video to illustrate RT on and off, that shows what I'm talking about.)
 
Zelda's lighting is certainly not designed to be realistic. Basic things like the sun emitting light and objects casting shadows are not evidence to the contrary.

You are mistaken in your assumption that rasterization effects can't be accurate as well, especially given that many modern game engines are adding lighting features that exceed what is currently possible with RT on modern video cards, at a fraction of the processing budget. Go and look at the Unreal Engine 5 demo.

"My original question was: can't the devs do an RT pass before the lighting has to be done, just to see how it should look if it were realistic, then use that as a reference when doing the faked lighting in rasterization, so it looks as realistic as RT would without the performance penalty?"

In order to match the RT version, though, your rasterized lighting would have to support indirect lighting. And at that point, if it does, there's really no reason to bother with an RT pre-pass, as your rasterized lighting is already as good as your RT lighting. If you are using CryEngine, Unreal Engine, or Unity, you can already get this with rasterized lighting.
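For what it's worth, the "do an RT pass up front, then reuse the result" idea in the quoted question is essentially what baked lightmaps do: an offline pass casts shadow rays against static geometry and stores the answer per texel, and the runtime rasterizer just looks the value up. A deliberately tiny 2D toy (the scene, coordinates, and function names are all invented for illustration, not from any engine):

```python
# Toy sketch of "bake the ray-traced result, sample it at runtime".
# Offline: cast a shadow ray from each ground texel toward a static light
# and record lit (1.0) or shadowed (0.0). Runtime: plain table lookup.
# Purely illustrative 2D scene, not a real renderer.

LIGHT = (5.0, 10.0)                  # static point light
BLOCKER = ((3.0, 4.0), (7.0, 4.0))   # a horizontal wall segment at y = 4

def shadow_ray_blocked(point, light, seg):
    """True if the segment occludes the ray from point to light
    (standard 2D segment-segment intersection test)."""
    (x1, y1), (x2, y2) = point, light
    (x3, y3), (x4, y4) = seg
    d = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if d == 0:
        return False  # ray parallel to the wall: never occluded here
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / d
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / d
    return 0.0 < t < 1.0 and 0.0 < u < 1.0

# Offline "RT pass": bake one shadow sample per texel along the ground (y = 0).
lightmap = {x: 0.0 if shadow_ray_blocked((float(x), 0.0), LIGHT, BLOCKER) else 1.0
            for x in range(11)}

# Runtime: the rasterizer just multiplies its surface color by the baked value.
print(lightmap[5])   # texel directly under the wall -> 0.0 (in shadow)
print(lightmap[0])   # texel off to the side        -> 1.0 (lit)
```

The trade-off matches the reply above: the baked result is as accurate as the offline rays that produced it, but it is frozen; move the light or the wall and the stored values are stale, which is exactly what per-frame RT avoids.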
 
Rasterization cannot be as accurate without incurring a significant CPU penalty to simulate unconstrained dynamic shadows and lighting.
 