
NVIDIA GeForce RTX 3080 with AMD Ryzen 3900XT vs. Intel Core i9-10900K

awev

New Member
Joined
Sep 19, 2020
Messages
5 (0.00/day)
Thank you @lexluthermiester for the feedback, and glad you found my first post interesting. As @biffzinker shows, I can remember things, just not accurately. :( I would have offered a citation if I could have remembered where I saw it - too many news feeds, not enough time to study and memorize it all.

The reason I mentioned the code being compiled for a program preferring one chipset over another is that it looks like one of the test games is frame locked at about 52 FPS (Anno 1800) on AMD CPUs. @Selaya suggests it comes down to the I/O chiplet - more to the point, the IF for memory. While I don't disagree - yes, AMD made the mistake of going with a 12nm chiplet for a 7nm chip - I don't think that is the only thing. So, are we testing to see if a game is frame locked on a CPU or a GPU? Nice that when you are testing a GPU you are also testing the limitations of a game based on the CPU, and by extension, the coding and compilation of said game.

As to my statements about DirectX 11 & 12, well, here is some of the info: DirectX 12 - what is the big deal? People have tested different games, checked Task Manager (and other software), and reported that DirectX 12 is only utilizing a few cores/threads and ignoring about half or more of them, on games played in the last couple of years (even today's games). DirectX 12 was developed during a time when four cores and eight threads were considered high end; nowadays that is only enough to run the OS and maybe M$ Office (just kidding). I am looking to replace my video editing software because it does not properly use all of the resources available to it - both CPU cores/threads and GPU acceleration - so this is not limited to just games.
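
If anyone wants to repeat that Task Manager check themselves, here is a minimal sketch of the idea (not from the review; it assumes the third-party psutil package is installed, and the 50% "busy" threshold is an arbitrary choice) - run it in a second window while the game is active:

```python
# Rough sketch: sample per-core load the way people eyeball Task Manager.
# Assumes the third-party psutil package (pip install psutil).
import psutil

SAMPLES = 10           # number of one-second samples (arbitrary)
BUSY_THRESHOLD = 50.0  # % load above which a logical core counts as "busy" (arbitrary)

for _ in range(SAMPLES):
    # percpu=True returns one utilization figure per logical core/thread
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busy = sum(1 for load in per_core if load > BUSY_THRESHOLD)
    print(f"busy threads: {busy}/{len(per_core)}  loads: {per_core}")
```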

BIOS and chipsets can have an impact, just like the CPU can. Just look at this article from Tom's Hardware where they test boot/reboot/shutdown times. How do the BIOS and I/O controllers affect GPUs? I am not sure, yet it might be interesting to find out. Same thing with the OSes - how do they affect frame rates?

The RTX 3080 is the consumer GPU king this week - that is, until the 3090 comes out next week. And Intel still has a lead over AMD in gaming, even though Intel is still using PCIe Gen 3 on the desktop as opposed to AMD's PCIe Gen 4.

While I tend to prefer AMD (I think it's a better value for the price), I do have WinTel in the house as well. I am not a fanboy of one or the other.

P.S.: When do I get to play games on a 120" 10K monitor with a 360 Hz refresh, with a CPU and GPU able to drive it?
 
Joined
Aug 4, 2020
Messages
1,570 (1.16/day)
Location
::1
Don't you plug your GPU in the slot that uses the CPU's and not the chipset's PCIe lanes :confused:
Just realised the typo, that post made no sense at all :D
Obviously meant
Don't plug your GPU into the slots that use the chipset's PCIe lanes (use the one with direct CPU lanes, duh)
but I think you guys got the gist anyways :D

As for the memory latency, that is probably an inherent disadvantage of the chiplet vs monolithic design - whether the I/O die is 7nm or not probably isn't the dealbreaker.
 
Joined
Dec 24, 2008
Messages
2,062 (0.37/day)
Location
Volos, Greece
System Name ATLAS
Processor Intel Core i7-4770 (4C/8T) Haswell
Motherboard GA-Z87X-UD5H, Dual Intel LAN, 10x SATA, 16x power phases.
Cooling ProlimaTech Armageddon - Dual GELID 140 Silent PWM
Memory Mushkin Blackline DDR3 2400 997123F 16GB
Video Card(s) MSI GTX1060 OC 6GB (single fan) Micron
Storage WD Raptors 73 GB - RAID 1, 10,000 rpm
Display(s) DELL U2311H
Case HEC Compucase CI-6919 full tower (2003), modded .. hec-group.com.tw
Audio Device(s) Creative X-Fi Music + mods, Audigy front Panel - YAMAHA quad speakers with Sub.
Power Supply HPU-4M780-PE refurbished 23-3-2022
Mouse MS Pro IntelliMouse 16,000 DPI Pixart Paw 3389
Keyboard Microsoft Wired 600
Software Win 7 Pro x64 ( Retail Box ) for EU
Same thing with the OSes - how do they affect frame rates?

There is a simple answer and a complicated one; the simple one is free of charge.
In Windows 7, Microsoft thought to run another 30 services that operate as a kind of artificial intelligence - an automatic operating-system troubleshooter.
An expert server administrator may disable half of them, which cuts down the hardware activity wasted on pointless tasks.
In Windows 10 the number of Microsoft system services has increased further, and most of them cannot be disabled.
Additionally, Microsoft .NET is a pile of software layers; instead of working like a well-oiled machine, it is plagued by half-working security patches, forcing the OS to behave like a turtle.
These days it makes you wonder (IT experts only) how an operating system - a prisoner, tied up and handcuffed in the name of security and safety - can possibly be free enough to deal with gaming applications when the user wants to play a game.

If Microsoft were a serious software maker, they would adopt as standard a new user profile named Gaming Mode, in which all unnecessary system services are terminated.
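
To put a number on how many services are actually running, here is a minimal sketch of the inventory a hypothetical Gaming Mode profile would start from (Windows only, assumes the third-party psutil package; it only lists services and does not disable anything):

```python
# Minimal sketch (Windows only, needs the third-party psutil package):
# count the services currently running - the list you'd review before deciding
# what a "Gaming Mode" profile could safely stop. Read-only; nothing is disabled.
import psutil

running = []
for svc in psutil.win_service_iter():
    try:
        if svc.status() == "running":
            running.append((svc.name(), svc.display_name()))
    except psutil.Error:
        # a service can vanish or deny access between enumeration and query
        continue

print(f"{len(running)} services currently running")
for name, display in sorted(running):
    print(f"  {name:<40} {display}")
```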
 
Joined
Aug 9, 2006
Messages
1,065 (0.16/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
In Windows 10 the number of Microsoft system services has increased further, and most of them cannot be disabled.
Additionally, Microsoft .NET is a pile of software layers; instead of working like a well-oiled machine, it is plagued by half-working security patches, forcing the OS to behave like a turtle.

Do you remember back in the very early 2000s when Microsoft was pushing .NET as a clean, pure, rapid and easy-to-deploy framework to replace "DLL hell!!!"? Oh my! 20 years later and "DLL hell!!!" seems like paradise. .NET inter-framework version incompatibilities are infamous. The framework itself is massive, buggy, and rarely used outside of niche corporate-world uses. The JIT performance Microsoft promised would be indistinguishable from native is anything but. I CAN LITERALLY tell when an application is .NET framework based. The beautiful feeling of clicking on an executable and staring at a desktop while nothing at all happens, all the while CPU and disk activity spike to near-max levels. Ahhh, a .NET framework application at work... trying to start itself... give me back "DLL hell" any day.


 
Joined
May 31, 2014
Messages
377 (0.10/day)
System Name serenity now/Faithul Eight
Processor Amd 2400g/Amd 3800x
Motherboard Asrock ab350m Pro 4/Asrock x570 Taichi
Cooling CM Masterair G100M/Wraith
Memory G.skill 2x4gb 3200/2x16 gb 3600 G.skill
Video Card(s) igpu vega 11/3070 oc Palit
Storage Apacer pcie ssd 240 gb/Adata 512gb nvme
Display(s) 50" LG 4k hdr
Case scratch build
Power Supply inter tech 650w 80+bronze/850 phantex pro
Software Ubuntu bionic beaver 18.04 lts/W10
Joined
Apr 17, 2011
Messages
235 (0.05/day)
It should. I haven't tried it yet personally, but then why else would it be artificially limited to DX11? They're not advertising Windows 7 compatibility, but it's implied by the exclusive use of DX11.
Technically Windows 8.1 is still officially supported (extended support until January 10, 2023), I think? Though also, "technically", installing 8.1 on "newer hardware" (it's some arbitrary cutoff pre-Ryzen) isn't supported by Microsoft either, soo... I'm not sure what they're targeting?

Is there some weird advantage to running the Flight Sims on Windows Server or something? I've no idea... you'd think MS would want to promote/push DX12, and by extension Win10, outright. I love Win7 too, but by this point... come on.
 
Joined
Mar 9, 2017
Messages
27 (0.01/day)
Processor 9900k HT OFF 5.0/4.6 1.21v
Motherboard XI Gene
Cooling NH-U12A - direct die
Memory 4100 17-17-17-36 1T 1.35v / 3800 14-14-14-32 1T 1.55v
Video Card(s) 1080 ti evga black
Power Supply Seasonic 1000W

Attachments

  • 1080ti.stock.4100.c15.jpg (263.9 KB)
Joined
Mar 18, 2015
Messages
2,960 (0.89/day)
Location
Long Island
It's not that the gaming performance at low resolutions doesn't matter ... the reality is as you go up in resolution the CPU stops being the bottleneck and the GPU becomes the bottleneck. It doesn't erase the bottleneck and that's a significant distinction. I am going to go back a few years when the prevailing wisdom was "faster speed / lower CAS RAM doesn't improve gaming". After a while that changed to "faster RAM doesn't matter, as long as your testing was limited to average fps". It was shown that it had a significant impact on minimum fps. But here again ... faster RAM's impact was not being seen when the GFX card was the bottleneck. When you put x80s in SLI, that bottleneck was removed and faster RAM / lower CAS showed significant impact.

===============================

22.3 % (SLI) increase in minimum frame rates w/ C6 instead of C8 in Far Cry 2
18% (single card) / 5% (SLI) increase in minimum frame rates w/ C6 instead of C8 in Dawn of War
15% (single card) / 5% (SLI) increase in minimum frame rates w/ C6 instead of C8 in World in Conflict

Also see http://www.bit-tech.net/hardware/memory/2011/01/11/the-best-memory-for-sandy-bridge/1



===============================

Today it's more accepted that faster speed / lower CAS does affect things ... but it won't where the bottleneck lies elsewhere. In this article, today that bottleneck at 4K is the 3080 ... the line where that bottleneck kicks in will move substantially when you install a 4080 or 5080. Right now the CPU limitations can't be seen at higher resolutions because the GPU is not allowing us to see them. Car geeks know this ... your sports car's drivetrain may be incapable of being blown by your current engine, but increase engine size, add a supercharger, and the HP / torque increase will likely overstress something in the drivetrain.
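
A back-of-the-envelope way to see the "bottleneck moves, it doesn't disappear" point: a frame can't finish faster than its slowest stage, so FPS is roughly 1000 / max(CPU frame time, GPU frame time). The numbers below are made up purely for illustration, not measurements from this review:

```python
# Toy model with illustrative numbers: the slower of the CPU and GPU stages sets
# the frame time, so a faster CPU only shows up once the GPU stops being the wall.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=6.0, gpu_ms=12.0))  # ~83 FPS, GPU-bound (think 4K)
print(fps(cpu_ms=4.0, gpu_ms=12.0))  # still ~83 FPS: the faster CPU is invisible
print(fps(cpu_ms=6.0, gpu_ms=5.0))   # ~167 FPS, CPU-bound (think 1080p)
print(fps(cpu_ms=4.0, gpu_ms=5.0))   # ~200 FPS: now the CPU difference shows
```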

Second, unless you never do anything else with your PC, other uses must be considered. Unfortunately, "I'm choosing this CPU because it's almost as good at gaming and it kicks tail in Cinebench" is not an uncommon train of thought. I don't know anyone who runs Cinebench on a daily or weekly basis. It's like choosing one house over another in Nome, Alaska because it has an air conditioner.

Even if it's primarily a gaming box, after gaming, the next consideration is application performance, and where these are close or immaterial, "other factors". If you have 4k, the relevant questions are:

a) Am I the guy / gal who keeps a CPU / GPU / MoB / RAM combo for 3 -5 years or am I more likely to upgrade the GPU at some point down the line ?
b) After gaming, what do I have on my box that benefits either way from CPU choice? For me that's mainly AutoCAD and Office suites, which decidedly favor Intel. The guy I'd send my drawings to in order to get them rendered, if I ever needed that ... he'd want AMD. Then again, given that I've never needed it in the 35 years since I started my business, I could just as easily run the render overnight, in which case, whether it takes 2, 4 or 8 hours, I won't see it for at least 12.
c) If you're still betwixt and between on the decision by the time you get done with the list, it's time to look at the other factors. Power consumption is insignificant, but Intel has a substantial advantage in temps, the more important part of which is the fan noise associated with that 20C temp difference.

If you are decidedly in favor of one brand or another and can't point to a specific application that has a marked advantage on that CPU, you're doing it wrong. Often there will be a note on a requested build components list we receive saying "I chose an AMD CPU because it has more cores". When I ask "Do you have any apps that benefit from more cores?", the most common - and also the most disturbing - answer is "I don't know".

Looking at how the market is responding to the choice, there's this.

The 3900 XT is selling at $476 ... 95% of its original MSRP
The 10900KF is selling at $551... 103% of its original MSRP

What does that tell us and what questions does it pose:

a) The demand for the 10900KF exceeds supply by a wee bit.
b) The supply of the 3900 XT is a wee bit ahead of current demand.
c) Doesn't the current $75 price difference render performance differences moot ?

Again, no definitive answer; it depends. If it's just because you're impatient and want the new build with the new 3080 toy yesterday, then no. I wouldn't build a new box today under any circumstances other than "I can't work". PSU prices are up 50-90% ... case prices are up as much as 100% ... My youngest was building a new box some 2-3 months back (3800X) and the only things he was taking were the SSHD and the GFX card. So, in order to have a spare PC in the house, I bought him a new SSHD ($85) and he left his in the old box with a working Windows 10 Pro OS on it and all hardware drivers. Son No. 2 wanted one last week ... it was $254. Today it's $150.

Vendors can't make money selling what's not on their shelves, and they can't sell what is sitting on their shelves when demand is low ... so they price products to meet demand. When supply catches up with demand, prices will stabilize - not just CPUs but everything else. The consumer is the ultimate arbiter of product pricing; manufacturers can only charge what consumers are willing to pay.

So ... when asked "What CPU do I want to pair with my 3080 today?", the answer is neither. The reason the price difference doesn't matter is that (forgetting the CPU choice for a moment), with PSU, case, MoBo, storage and other prices in the stratosphere right now, I'm not willing to pay a $150-$175 price premium for "I want it now" (SSD +$0, MoBo +$10, SSHD +$65, PSU +$60, case +$40). In addition, waiting also allows the 3080 drivers to mature: fewer headaches, fewer bugs, fewer failed designs, plus the benefit of production-line improvements upping your chances in the silicon lottery. The $75 CPU savings we'd see today with a 3900 XT will be eaten up more than twice over.
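
For what it's worth, the arithmetic checks out - a quick sketch using the numbers quoted above:

```python
# Component premiums and CPU prices as quoted in the post: today's "buy it now"
# premiums on the other parts swallow the $75 CPU saving more than twice over.
premiums = {"SSD": 0, "MoBo": 10, "SSHD": 65, "PSU": 60, "case": 40}
cpu_saving = 551 - 476  # 10900KF at $551 vs 3900 XT at $476 = $75

total_premium = sum(premiums.values())  # $175
print(f"parts premium today:  ${total_premium}")
print(f"CPU saving (3900 XT): ${cpu_saving}")
print(f"premium vs saving:    {total_premium / cpu_saving:.1f}x")  # ~2.3x
```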

Somewhere between Thanksgiving and New Year's, prices will stabilize, we will know the winners and losers brand/model wise, and the price situation will be different than it is today.
 
Joined
Mar 9, 2017
Messages
27 (0.01/day)
Processor 9900k HT OFF 5.0/4.6 1.21v
Motherboard XI Gene
Cooling NH-U12A - direct die
Memory 4100 17-17-17-36 1T 1.35v / 3800 14-14-14-32 1T 1.55v
Video Card(s) 1080 ti evga black
Power Supply Seasonic 1000W
Congrats, but I'm not using the integrated benchmark - try some gameplay.
If you are willing to share the save file or script you used to test SOTTR, I'm willing to test it out.

Thanks

P.S.: I did test out 1440p and 4K, and the average fps for the 1080 Ti is spot on, in line with your results.
 

Kendragon

New Member
Joined
Sep 25, 2020
Messages
1 (0.00/day)
I am sorry, but I have to say something about the RAM used on the 3900X. I had a 3900X and a 9900K; the Ryzen shined with DDR4-3600 at CL16, and the 10900K isn't that much faster than the 9900K. I was actually able to match or beat my 9900K, both with the same RTX 2080 Ti. Also, good luck finding a 10900K for MSRP or even near it - you could buy two 3900Xs for the price of one 10900K. But I can throw a wrench in it if someone would like to send me an RTX 3080 to test on my Threadripper 3960X vs the 2080 Ti I have. I went to Threadripper mainly for the PCIe lanes. No SATA SSDs or spinning hard drives in my computer: I have two 2 TB Aorus NVMe 4.0 drives, two 1 TB Aorus NVMe 4.0 drives, and one Samsung 960 Pro NVMe 3.0. I used to have a 7820X that I missed dearly when I went to an 8700K, then the 9900K, then the 3900X, and now Threadripper 3.

Let us not forget that AMD CPUs used to best Intel CPUs in everything (Athlon 64 days). I have confidence they will do so again with Zen 3. See, not a fanboy at all. I have had AMD/ATI and Nvidia GPUs, and Intel and AMD CPUs; the Ryzen 3900X was my first step back into AMD since the Athlon 64.
 

terrorfrog

New Member
Joined
Sep 26, 2020
Messages
1 (0.00/day)
FYI, AMD didn't become better. They never were.
At some point they were close enough to Intel but offered a better price for their CPUs.

The Ryzens didn't become faster or better overnight at all; AMD just made a new form of multi-socket layout (which is indeed better than traditional multi-socket setups),
put it into one package, and offered it for cheap.

Yet they are only either an edge-case buy or a budget one, not the technically better option.
Not being certified for anything, plus a bad board and BIOS ecosystem and stability issues, still keeps Ryzen out of the ideal use cases (workstations and such).

So the only really reasonable use case is either being on a very tight budget, or having a workstation load but being able to live with stability issues.

There's a reason why there is not a single serious workstation or server board out there aside from EPYC.
Sad but true, and a good reason why Intel is still not in panic mode.
 
Joined
Mar 29, 2014
Messages
339 (0.09/day)
FYI, AMD didn't become better. They never were.
At some point they were close enough to Intel but offered a better price for their CPUs.

The Ryzens didn't become faster or better overnight at all; AMD just made a new form of multi-socket layout (which is indeed better than traditional multi-socket setups),
put it into one package, and offered it for cheap.

Yet they are only either an edge-case buy or a budget one, not the technically better option.
Not being certified for anything, plus a bad board and BIOS ecosystem and stability issues, still keeps Ryzen out of the ideal use cases (workstations and such).

So the only really reasonable use case is either being on a very tight budget, or having a workstation load but being able to live with stability issues.

There's a reason why there is not a single serious workstation or server board out there aside from EPYC.
Sad but true, and a good reason why Intel is still not in panic mode.
You are delusional. You don't even make sense. "Aside from EPYC" is like saying "aside from Xeon".

You have Threadripper for desktops. Dell and HP are both selling AMD-equipped servers.

Clock for clock, AMD has HIGHER IPC and LOWER power draw than Intel. Intel's process allows for higher clocks while AMD's allows for higher density.
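
Just to make the "clock for clock" phrasing concrete: throughput scales roughly with IPC x clock, so a higher-IPC part can match a higher-clocked one. The figures below are made up for illustration only, not benchmarks of any real CPU:

```python
# Toy comparison, illustrative numbers only: relative throughput ~ IPC x clock.
def relative_perf(ipc: float, ghz: float) -> float:
    return ipc * ghz

chip_a = relative_perf(ipc=1.10, ghz=4.6)  # higher IPC, lower clock
chip_b = relative_perf(ipc=1.00, ghz=5.0)  # lower IPC, higher clock
print(chip_a, chip_b)  # 5.06 vs 5.0 -> roughly a wash despite the clock gap
```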


And, thanks to AMD, this 10900K doesn't cost $1500 like its predecessors.
That is why it's the BETTER option for many.
 