
AMD 16-core Ryzen a Multi-Chip Module of two "Summit Ridge" Dies

Finally a decent mobo that can compete with X99. TBH the only problem with Ryzen is the mobos. I don't get why people buy Ryzen for playing at 1080p. Really a waste.

Agreed. I haven't run 1080p in like 6 years.

People buying $300+ processors and $150+ motherboards to run a $150 monitor...
 
I get the feeling that a few months from now, all quad-core CPUs will feel like a Core 2 Duo.

Again, as I mentioned before, this is the 2007-2008 Intel Core 2 Duo vs. Core 2 Quad drama all over again. We get the same type of "thinking", arguing about how useless the extra cores are and so on.

I just don't get why some people get so mad at you if you mention anything about AMD or an 8+ core chip.
The reality is a 4-core chip really has NO HOPE against an 8+ core chip as soon as applications and games start making good use of the extra cores.

Plus... if a 4-core chip is good enough for someone's use, that doesn't necessarily mean this "someone" should start a war on anyone who says "more than 4 cores".
I don't see any drama here, just as I didn't see any back then. Generic software always targets mainstream CPUs. Back then dual core was mainstream, today it's quad core. Everybody's still free to buy whatever they want, regardless.

Also, I'm not sure whether you realize it, but in the bolded part you've used the present tense in the first two sentences and the future (?) tense in the last one. Four cores will have no hope when that happens, but that certainly isn't today.
 
I figured they would have a multiplexed chipset solution to reduce the number of pins to the PCIe bus and other devices on larger chips: four pins that are quad-pumped to give 16 pins' worth of bandwidth.

But I look forward to seeing how they handle saturation and what effect it has on stability and on-die latency.
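
To make the arithmetic concrete, here is a toy sketch of the quad-pumping idea above: a narrower link transferring several times per clock to match a wider, single-pumped one. This is just the poster's speculation expressed as numbers, not how AM4 actually routes anything; the function and figures are illustrative.

```python
# Hypothetical sketch: equivalent single-pumped width of a multi-pumped link.
def effective_width(physical_pins: int, transfers_per_clock: int) -> int:
    """A link that transfers N times per clock behaves like N x the pins."""
    return physical_pins * transfers_per_clock

print(effective_width(4, 4))  # 16 -- four quad-pumped pins ~ 16 pins' worth
```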
 
And the other a dual-socket, 88-PCIe-lane type of deal.

I have a feeling the dual-socket board will be entry-level enterprise, so regular consumers like us won't be seeing it.
A good search will likely reveal one available to the regular customer - should they want one. I doubt, though, that someone looking for one would fall into the regular customer category.
 
Finally a decent mobo that can compete with X99. TBH the only problem with Ryzen is the mobos. I don't get why people buy Ryzen for playing at 1080p. Really a waste.

LOL is anyone?

Furthermore, I think people keep forgetting that the 6- and 4-core models launch in a couple of weeks. If you are making a budget 1080p build, you would be an idiot to get a $300 i7 instead of a $150 R5. That extra $150 is the difference between a high-end and a mid-range graphics card.

It's not like Ryzen can't do 60 FPS either lol.

P.S. If you get fast RAM, Ryzen meets or beats the 7700K anyway...
 
1080p should have died already.
Yeah, because mid-range cards stopped having problems handling everything at max details at FHD years ago. /s
 
Yeah, because mid-range cards stopped having problems handling everything at max details at FHD years ago. /s

I actually have to back up Bug on this one.


Don't get me wrong, I recommend 4K FreeSync IPS to everyone, but at the same time that's really only because they don't cost much more than lower-spec monitors. You spend twice as much as on 1080p to get vastly better color and contrast, plus the option of running 4K in games that are easy enough to run.

But the fact is that even the $700 1080 Ti isn't running all games at 60 FPS in 4K, and if you want 144 Hz+ gaming, 1080p is still the only real option IMO. (Don't make me laugh at 1440p @ 144 Hz; it's harder to run than 4K @ 60.)
 
.....

How is running 1440P @ 144Hz harder than 4K at 60Hz? 4K is more than twice the resolution of 1440P...
 
I actually have to back up Bug on this one.


Don't get me wrong, I recommend 4K FreeSync IPS to everyone, but at the same time that's really only because they don't cost much more than lower-spec monitors. You spend twice as much as on 1080p to get vastly better color and contrast, plus the option of running 4K in games that are easy enough to run.

But the fact is that even the $700 1080 Ti isn't running all games at 60 FPS in 4K, and if you want 144 Hz+ gaming, 1080p is still the only real option IMO. (Don't make me laugh at 1440p @ 144 Hz; it's harder to run than 4K @ 60.)

The only reason I love 4K-5K displays is the pixel density. If the PPI is around 150-200, you really don't need AA or any kind of pixel filters; turn it off and you'll get a really good frame rate overall.
I do think it's much better in general to have a higher screen resolution even if you don't max out the resolution in games. It's always nice to have apps, text, photos and everything on your system looking so clear and sharp.
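
For reference, the PPI figures being thrown around here follow from simple geometry: diagonal resolution divided by diagonal size. A minimal sketch, assuming the common 27-inch panel size discussed in this thread:

```python
# PPI = diagonal resolution in pixels / diagonal size in inches.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel, from its resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1080p: {ppi(1920, 1080, 27):.0f} PPI')  # ~82 PPI
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
print(f'27" 5K:    {ppi(5120, 2880, 27):.0f} PPI')  # ~218 PPI
```

So a 27" 4K panel sits just above the low end of the 150-200 PPI range claimed above, and 5K comfortably inside it.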
 
More pins for some clumsy gibbon to bend; time AMD dropped the pins like Intel and put them on the cheap bit, i.e. the motherboard.

16 cores means more porn tabs woo hoo.
 
.....

How is running 1440P @ 144Hz harder than 4K at 60Hz? 4K is more than twice the resolution of 1440P...
Let's see:
  • 3,840 × 2,160 × 60 = 497,664,000 pixels/second to render
  • 2,560 × 1,440 × 144 = 530,841,600 pixels/second to render
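
A quick sketch verifying those figures, with 1080p @ 144 Hz alongside for comparison. Raw pixels-per-second is only a rough proxy for GPU load (geometry and per-frame costs don't scale with resolution), but it shows why 1440p @ 144 Hz can be the heavier target:

```python
# Raw pixel throughput: horizontal x vertical x refresh rate.
modes = {
    "3840x2160 @  60 Hz": 3840 * 2160 * 60,
    "2560x1440 @ 144 Hz": 2560 * 1440 * 144,
    "1920x1080 @ 144 Hz": 1920 * 1080 * 144,
}
for name, px in modes.items():
    print(f"{name}: {px:>11,} pixels/second")
# 3840x2160 @  60 Hz: 497,664,000
# 2560x1440 @ 144 Hz: 530,841,600  (~7% more than 4K @ 60)
# 1920x1080 @ 144 Hz: 298,598,400
```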
 
More pins for some clumsy gibbon to bend; time AMD dropped the pins like Intel and put them on the cheap bit, i.e. the motherboard.

16 cores means more porn tabs woo hoo.

(image attachment)


Way ahead of you buddy
 
This is in preparation for news that CrossFire works perfectly with no SLI-style microstutter in quad-GPU configs. Plug a 16-core Ryzen in with four single-slot-thickness Radeon Pro workstation cards and now we're talking.

Nvidia abandoned multi-GPU and AMD embraced it. In the future, AMD will be releasing Vega GPUs in configurations similar to what they are doing with Ryzen; I mean multi-chip Vega and Polaris packages. Their GPU strategy for beating Nvidia will be similar to what they did to Intel with Ryzen: super fast cores and a lot of them, combined with the 14nm Samsung process, which lets AMD run relatively high voltages through their chips because of the different way Samsung does 14nm compared to Intel. While Intel wastes its time trying to get more performance out of 10nm, Samsung figured out how to efficiently run more voltage through 14nm without the poor thermals exhibited by the simpler 14nm process used on Kaby Lake.

I'm also expecting more support for water cooling from AMD. I'm thinking AMD-designed water blocks backed by manufacturer warranties, maybe partnering with PC builders to sell pre-built workstations with quad-GPU configs and a full liquid-cooling loop, backed by an AMD warranty and supported by the vendor. Something like that.
 
Don't get me wrong, I recommend 4K FreeSync IPS to everyone, but at the same time that's really only because they don't cost much more than lower-spec monitors. You spend twice as much as on 1080p to get vastly better color and contrast, plus the option of running 4K in games that are easy enough to run.

Very decent 1080p IPS monitor (not 144 Hz): €150. Entry-level 4K IPS: €450. Which begs the question: is a so-so 4K panel inherently better than a good 1080p panel? I have no idea.
 
The only reason I love 4K-5K displays is the pixel density. If the PPI is around 150-200, you really don't need AA or any kind of pixel filters; turn it off and you'll get a really good frame rate overall.
I do think it's much better in general to have a higher screen resolution even if you don't max out the resolution in games. It's always nice to have apps, text, photos and everything on your system looking so clear and sharp.
You're dumb. You need AA on a 4K 27-inch monitor.

Your eyes can see 300+ PPI at 2 feet and 700+ at 1 foot. Plus, AA still makes it smoother.

Very decent 1080p IPS monitor (not 144 Hz): €150. Entry-level 4K IPS: €450. Which begs the question: is a so-so 4K panel inherently better than a good 1080p panel? I have no idea.

all depends on what you want to do.
 
You're dumb. You need AA on a 4K 27-inch monitor.

Your eyes can see 300+ PPI at 2 feet and 700+ at 1 foot. Plus, AA still makes it smoother.



all depends on what you want to do.

1. Why the insults?
2. And available money.
 
More pins for some clumsy gibbon to bend; time AMD dropped the pins like Intel and put them on the cheap bit, i.e. the motherboard.

16 cores means more porn tabs woo hoo.

Until the pins on the motherboard kill your CPU by causing a short.
Seen it happen.

I don't support either as the "better" solution; they both have their issues, and both are equally flawed.


By the way, use Linux and Chrome for porn duties.
Linux has better security, and Chrome runs a lot better with more cores, for a better porn experience.
 
You're dumb. You need AA on a 4K 27-inch monitor.

Your eyes can see 300+ PPI at 2 feet and 700+ at 1 foot. Plus, AA still makes it smoother.



all depends on what you want to do.

Thanks for the insult.

And NO, 200 PPI is not bad at all if you carefully pick a good monitor (ofc the more PPI the better); you can still see the pixels, but it is way better than 1080p.

4K-5K without AA is way better than 1080p with 8x MSAA or even 4x SSAA. When you apply 4x SSAA, your GPU renders everything at 4 times the pixel density and downscales it to 1080p.

And if you are using a 4K screen without AA, the load on the GPU will be similar or close to that of 4x SSAA at 1080p. For extremely thin lines or edges, if your GPU can handle it, you can apply something like CMAA; it will do the job without adding too much load.
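
The pixel math behind that claim checks out at first order: 4x SSAA at 1080p shades exactly as many samples as native 4K. A minimal sketch (ignoring downscale cost, memory bandwidth, and the fact that MSAA shades fewer samples than SSAA):

```python
# Samples shaded per frame; ssaa_factor is the total sample multiplier
# (4x SSAA = render at 2x width and 2x height, then downscale).
def shaded_samples(width: int, height: int, ssaa_factor: int = 1) -> int:
    return width * height * ssaa_factor

print(f"1080p + 4x SSAA: {shaded_samples(1920, 1080, 4):,}")  # 8,294,400
print(f"4K, no AA:       {shaded_samples(3840, 2160):,}")     # 8,294,400
```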
 
Thanks for the insult.

And NO, 200 PPI is not bad at all if you carefully pick a good monitor (ofc the more PPI the better); you can still see the pixels, but it is way better than 1080p.

4K-5K without AA is way better than 1080p with 8x MSAA or even 4x SSAA. When you apply 4x SSAA, your GPU renders everything at 4 times the pixel density and downscales it to 1080p.

And if you are using a 4K screen without AA, the load on the GPU will be similar or close to that of 4x SSAA at 1080p. For extremely thin lines or edges, if your GPU can handle it, you can apply something like CMAA; it will do the job without adding too much load.
You said you don't need AA, which is factually wrong and spreads bad info. AA still makes a significant difference, especially 2x AA.

Stop trying to save face.

This game does really well without AA, but 2x still makes a good difference:
https://www.extremetech.com/gaming/...ng-were-glitching-our-way-to-gaming-nirvana/3

I play mostly older games, and AA makes a world of difference there. I think newer games have gotten better at not needing AA as much.
 
You said you don't need AA, which is factually wrong and spreads bad info. AA still makes a significant difference, especially 2x AA.

Stop trying to save face.

This game does really well without AA, but 2x still makes a good difference:
https://www.extremetech.com/gaming/...ng-were-glitching-our-way-to-gaming-nirvana/3

I play mostly older games, and AA makes a world of difference there. I think newer games have gotten better at not needing AA as much.
I was never able to see 2x MSAA's benefits; 4x is the minimum required. 2x makes so little difference that I can only see it in screenshots, or if I sit still in-game and hunt for details.

And those pictures tell us squat. Of course you're going to see jaggies when you look at a 4K screenshot on an FHD monitor (you're effectively zooming in). The question is: are those jaggies visible when the pixel is physically 4x smaller?
 
Stop trying to save face.
No... He has his idea, you have yours. You can discuss your opinions, but he's under no obligation to change his mind because of what you've said. By calling him dumb and telling him to stop trying to save face, you're just being a d*ck. That's my opinion anyway. Continue as you please.

Edit: And back on topic, these 16-core chips sound awesome! :D The 8-core Summit Ridge already runs rings around my 6700K @ 4.7 GHz.
 