
AMD Ryzen 9 3900X, SMT on vs SMT off, vs Intel 9900K

720p, like 1080p, 480p and 1440p, is a standard.
But I haven't seen any monitor with that resolution. My 20-inch monitor is 768p; every 20- and 19-inch monitor I have seen is 768p, and people game at that resolution, not 720p. I game on my monitor at 768p, not 720p.
 
Yes, you can't believe it because it's not true.

Yes, win in some games, lose in others... also, differences below 5% are within the error margin.

I think that keeping the same architecture for half a decade has had some advantages for the blue side: stability and great optimization.
 
But I haven't seen any monitor with that resolution. My 20-inch monitor is 768p; every 20- and 19-inch monitor I have seen is 768p, and people game at that resolution, not 720p. I game on my monitor at 768p, not 720p.

The video industry standard is different from monitor resolution standards. For example, 1600p, 1200p, 1024p and 768p were among the most used monitor resolutions. None of them are standard, but I understand your point.
 
But I haven't seen any monitor with that resolution. My 20-inch monitor is 768p; every 20- and 19-inch monitor I have seen is 768p, and people game at that resolution, not 720p. I game on my monitor at 768p, not 720p.
I could agree that in games like CS:GO, which is played by around 450,000 players on average (according to Steam), 768p is widely used.
https://csgopedia.com/csgo-pro-setups/ (on mobile, view in landscape mode; rotate the screen to see each player's specs)
 
Since its release, the 9400F has remained the modern CPU with the best cost/benefit ratio.

See, this is what I mean, I didn't see the media giving any attention to this little chip! I'm very surprised by these results and impressed by the 9400F. 10% slower than the 9900K in gaming? Damn. A 9400F + GTX 1660 Ti seems like a killer combo for gaming right now.
 
But I haven't seen any monitor with that resolution. My 20-inch monitor is 768p; every 20- and 19-inch monitor I have seen is 768p, and people game at that resolution, not 720p. I game on my monitor at 768p, not 720p.
It's more about removing the GPU from the equation than posting results at a resolution that may be slightly more popular.
 
Why would someone recommend an Intel/Nvidia setup in an AMD SMT On/Off thread?!
Why would anyone care?
It's more about removing the GPU from the equation than posting results at a resolution that may be slightly more popular.
This. Though I disagree with using an artificially low resolution to exaggerate a difference which doesn't scale up.
 
Sorry if my question is silly, but:
Is it possible to switch SMT on/off on the fly, without restarting or going into the BIOS?

I'm asking because if the answer is yes, then maybe it's possible to make a small background program with a database of games, CPUs and the preferred SMT option (on/off), and apply that option when, or just before, each game is started. Or maybe such an option could be added to the BIOS? A question perhaps for AMD or the BIOS makers.

In that case the best option would always be chosen for every game, and average gaming performance could be another 2-3% higher.
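To my knowledge SMT itself can't be toggled without at least a reboot, but per-process CPU affinity can approximate the same idea. A minimal sketch of such a "ghost program" in Python, assuming the psutil package; the game paths, the preference table and the assumption about how SMT siblings are numbered are all illustrative, not tested:

```python
# Sketch of the "per-game database" idea above, using per-process CPU
# affinity instead of a real SMT toggle (toggling SMT itself normally
# needs a BIOS change and a reboot). Assumes the psutil package; the
# paths and preferences below are hypothetical examples.
import subprocess

import psutil

LOGICAL = psutil.cpu_count(logical=True)
PHYSICAL = psutil.cpu_count(logical=False)

# Hypothetical per-game table: True = use all logical CPUs,
# False = restrict the game to one logical CPU per physical core.
GAME_SMT_PREFERENCE = {
    r"C:\Games\game_a.exe": True,
    r"C:\Games\game_b.exe": False,
}


def launch(game_path: str) -> None:
    use_smt = GAME_SMT_PREFERENCE.get(game_path, True)
    proc = subprocess.Popen([game_path])
    if not use_smt:
        # Assumes SMT siblings are enumerated adjacently (0/1, 2/3, ...),
        # which is the usual Windows layout -- verify on your own system.
        one_per_core = list(range(0, LOGICAL, 2))[:PHYSICAL]
        psutil.Process(proc.pid).cpu_affinity(one_per_core)


if __name__ == "__main__":
    launch(r"C:\Games\game_b.exe")
```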
 
Very true, but the topic was gaming :D

Add in background tasks and you've lost 5% more. I kept checking and rechecking benchmarks on a gaming PC I built that had all of their crap installed (for gaming only). It literally lost 5% in multithreaded tests. I guess it's fine if you don't run anything at all, but Discord and other stuff adds up.

6 cores are basically yesteryear. 6 cores without SMT are dead to me.
 
There should be no performance difference between disabling SMT and assigning cores 0-11 to a specific process with Process Lasso, so I don't think there is any need to disable SMT.
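For what it's worth, the same kind of rule can be scripted without Process Lasso. A rough sketch, again assuming psutil; the executable name is hypothetical, and whether logical CPUs 0-11 correspond to 12 physical cores or to 6 cores plus their SMT siblings depends on how the OS enumerates them:

```python
# Rough sketch: pin an already-running process to logical CPUs 0-11,
# similar to a Process Lasso affinity rule. Assumes psutil; the target
# name is hypothetical.
import psutil

TARGET_NAME = "game.exe"      # hypothetical executable name
AFFINITY = list(range(12))    # logical CPUs 0-11, as in the post above

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == TARGET_NAME:
        proc.cpu_affinity(AFFINITY)
        print(f"Pinned PID {proc.pid} to logical CPUs {AFFINITY}")
```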
 
Add in background tasks and you've lost 5% more. I kept checking and rechecking benchmarks on a gaming PC I built that had all of their crap installed (for gaming only). It literally lost 5% in multithreaded tests. I guess it's fine if you don't run anything at all, but Discord and other stuff adds up.

6 cores are basically yesteryear. 6 cores without SMT are dead to me.

When you built said gaming PC and it delivered 95FPS instead of 100FPS, did the PC collapse onto itself in sheer embarrassment causing a mini-implosion...cause that would be cool.

P.S. I feel the same way about guys whose neck tie color matches their dress shirt color as you do about 6 cores without SMT...I feel your pain.
 
When multithreading really counts, an SMT thread acts as half a core; in rendering/baking tests, 12 cores/24 threads is on average 50% faster than 12 cores alone. That's really impressive, IMO.
That's not accurate. It can act as anything between half a core (worst-case scenario: completely bandwidth-starved) and a full core (best-case scenario: in-place, compute-intensive tasks).
 
That's not accurate. It can act as anything between half a core (worst-case scenario: completely bandwidth-starved) and a full core (best-case scenario: in-place, compute-intensive tasks).
Can you link to testing that shows HT/SMT with 100%/2x gains over non-HT loads?
 
I didn't think that point was true... it was always somewhere around that 50% mark give or take many percentage points depending on the testing. I've never seen it hit 2x/100% more before... figured perhaps you have since you said as much.
 
I didn't think that point was true... it was always somewhere around that 50% mark give or take many percentage points depending on the testing. I've never seen it hit 2x/100% more before... figured perhaps you have since you said as much.
No, I said that based on how HT works. As I told you, you'd get that under a specific kind of workload (let's not kid ourselves, we're still talking some missing hardware here, not a full core), but I don't know of benchmarks that measure that.
 
No, I said that based on how HT works. As I told you, you'd get that under a specific kind of workload (let's not kid ourselves, we're still talking some missing hardware here, not a full core), but I don't know of benchmarks that measure that.
And I'm simply saying that doesn't happen (2x/100%)... Over the last 10-15 years I have been doing this, not one test I have seen puts it at that value. If you run into one, LMK so I can update the 'internal database'. :)
 
And I'm simply saying that doesn't happen (2x/100%)... Over the last 10-15 years I have been doing this, not one test I have seen puts it at that value. If you run into one, LMK so I can update the 'internal database'. :)
Now that you mentioned it, I could try and write one (God knows when I'll find the time). My PC doesn't have HT, but I think my laptop does.
 
ok... GL with that. But theory and application in the real world are clearly different. Hit me up when you can make that or run across a real world 2x result.
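Not a 2x result, but a test along those lines is easy to sketch: run the same CPU-bound toy kernel once with one worker per physical core and once with one worker per logical CPU, then compare throughput. A minimal, untuned sketch assuming Python's multiprocessing and psutil; the kernel is illustrative, and the measured ratio will depend heavily on what it stresses:

```python
# Minimal sketch: measure SMT/HT scaling by running the same CPU-bound
# toy kernel with one worker per physical core, then one per logical CPU.
# Assumes psutil; the kernel is illustrative -- a real test would pick
# workloads that stress different execution ports to probe the best case.
import time
from multiprocessing import Pool

import psutil


def busy_work(iterations: int) -> float:
    acc = 0.0
    for i in range(iterations):
        acc += (i * i) % 97 + 1.0 / (i + 1)
    return acc


def throughput(workers: int, iterations: int = 2_000_000) -> float:
    start = time.perf_counter()
    with Pool(processes=workers) as pool:
        pool.map(busy_work, [iterations] * workers)
    return workers * iterations / (time.perf_counter() - start)


if __name__ == "__main__":
    physical = psutil.cpu_count(logical=False)
    logical = psutil.cpu_count(logical=True)
    base = throughput(physical)
    smt = throughput(logical)
    print(f"{physical} workers: {base:,.0f} iterations/s")
    print(f"{logical} workers: {smt:,.0f} iterations/s")
    print(f"SMT scaling: {smt / base:.2f}x")
```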
 
In other news, Game Mode in Ryzen Master still serves a purpose, even after AMD said Windows 10 v1903 is now optimized for all Zen processors, with faster clock ramping and improved topology awareness.

Positive result: metro-zen2-1080p.png
Negative result: rainbow-zen2-1080p.png
 
In other news, Game Mode in Ryzen Master still serves a purpose, even after AMD said Windows 10 v1903 is now optimized for all Zen processors, with faster clock ramping and improved topology awareness.

Positive result: metro-zen2-1080p.png
Negative result: rainbow-zen2-1080p.png
I hope that was sarcastic, because that link says:
When we ran other game titles with ‘Game Mode’ enabled on the 3900X we found that performance dropped. We only ran Rainbow Six: Siege, Far Cry 5 and Metro Exodus in our CPU test suite and of those titles only Metro Exodus showed a performance improvement with Game Mode enabled. It is likely that 99% of game titles run better in ‘Creator Mode’ and we hope that is the case.
 