
AMD Readying a 10-core AM4 Processor to Thwart Core i9-9900K?

For most users that's actually enough.
I don't disagree, but with the increasing popularity of NVMe drives another few lanes would be a nice form of future-proofing.

LOL! No it isn't. A solid 20% of my clients do either CrossFire or SLI (mostly SLI). Multi-GPU gaming has maintained a steady level of popularity for more than 10 years. It's not going anywhere.
Well, that would place them solidly in the "more money than sense" category. I suppose you can't stop people from wasting money. The RTX 2070 doesn't support SLI, meaning that the minimum price for a dual GPU setup in the future will be ~$1400. According to GamersNexus' recent test of RTX SLI, scaling is actually slightly better than it has been (probably thanks to NVLink), but still unpredictable and requires per-game profiles, limiting support to a handful of titles per year. In other words, in most games your $1400 SLI setup will perform no better than a $699 single card, let alone a $1200 Ti - which works in every game. That's dead enough for me.
 
I don't disagree, but with the increasing popularity of NVMe drives another few lanes would be a nice form of future-proofing.

Supposedly Ryzen (at least the 2600X) has 20 lanes, 16 for the video card and 4 for... whatever else, usually those NVMe drives you speak of. Beyond that the chipset provides more. Not sure what else you would be looking for.

Well, that would place them solidly in the "more money than sense" category. I suppose you can't stop people from wasting money. The RTX 2070 doesn't support SLI, meaning that the minimum price for a dual GPU setup in the future will be ~$1400. According to GamersNexus' recent test of RTX SLI, scaling is actually slightly better than it has been (probably thanks to NVLink), but still unpredictable and requires per-game profiles, limiting support to a handful of titles per year. In other words, in most games your $1400 SLI setup will perform no better than a $699 single card, let alone a $1200 Ti - which works in every game. That's dead enough for me.

This I agree with. SLI is plagued with problems and meh performance gains even in titles that support it. Even if someone handed me $10,000 and told me I MUST use it to buy myself a computer, SLI (or xfire) would still not be on the list. I have two 1070s right now only because of mining. If it weren't for that, I would still more than likely be rocking my old 660 Ti.
 
I don't disagree, but with the increasing popularity of NVMe drives another few lanes would be a nice form of future-proofing.
But..
Supposedly Ryzen (at least the 2600X) has 20 lanes, 16 for the video card and 4 for... whatever else, usually those NVMe drives you speak of. Beyond that the chipset provides more. Not sure what else you would be looking for.
This.
Well, that would place them solidly in the "more money than sense" category.
Or people that have the money and can, comfortably or not, afford the setup and want the extra performance.
limiting support to a handful of titles per year.
Rubbish, SLI/Crossfire support is driver-centric. All games will run fine in a dual GPU config.
SLI is plagued with problems and meh performance gains even in titles that support it.
Also rubbish. I haven't seen a show-stopping bug/glitch/problem in over three years, and the last one had an easy workaround until AMD fixed the driver. Haven't seen an SLI-related problem that affected the systems I've built in over five years.
 
Also rubbish. I haven't seen a show-stopping bug/glitch/problem in over three years, and the last one had an easy workaround until AMD fixed the driver. Haven't seen an SLI-related problem that affected the systems I've built in over five years.

I've seen problems with SLI personally (in my uncle's system), but granted that was ages ago with two 8800GTS 320MB cards. Between that and the nonstop lamenting over SLI/xFire before and after, to this day, all over the net is more than enough to put me off of it.
 
I've seen problems with SLI personally (in my uncle's system), but granted that was ages ago with two 8800GTS 320MB cards. Between that and the nonstop lamenting over SLI/xFire before and after, to this day, all over the net is more than enough to put me off of it.
I'm not saying dual GPU systems are without glitches and issues once in a while, but these problems, like many others, get blown way out of proportion. I've been building gaming PCs since the original Voodoo SLI and have never seen the kind of problems a lot of people lament over. The worst problem with multi-GPU setups I ever encountered was with the Voodoo2s. Even that was just a matter of figuring out what the problem was.

With the new RTX series cards, SLI seems like an attractive prospect for those who can afford it. However..
The RTX 2070 doesn't support SLI
I had to look this up. The 2070 and below will not have NVLink. That does not mean they will not still have the standard SLI bridge. Nvidia has not stated that it will not be available.
 
Supposedly Ryzen (at least the 2600X) has 20 lanes, 16 for the video card and 4 for... whatever else, usually those NVMe drives you speak of. Beyond that the chipset provides more. Not sure what else you would be looking for.
That's the 16+4+4 I mentioned above. 16 for graphics (and general usage, really), 4 for NVMe (or, again, anything, really), and 4 for the chipset link. The chipsets only provide PCIe 2.0 (8 lanes for the x70 chipsets, 6 for the x50 ones). So if you want/need more than one full-speed NVMe SSD (which is growing more likely as time passes), you need to eat into the 16 GPU lanes, which means that few if any motherboards will provide more than one NVMe port from the CPU-connected lanes - they'll use the chipset 2.0 lanes instead. Of course, running your GPU at x8 isn't actually a problem, but using the freed-up lanes for storage requires a riser card for the SSD.
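
To put rough numbers on why the chipset lanes are a poor substitute, here's a quick back-of-the-envelope sketch (the per-lane figures are approximate usable bandwidth after encoding overhead, not exact spec values):

```python
# Approximate usable bandwidth per PCIe lane, after encoding overhead:
# PCIe 2.0 uses 8b/10b encoding (~500 MB/s per lane),
# PCIe 3.0 uses 128b/130b encoding (~985 MB/s per lane).
GEN2_PER_LANE_MB = 500
GEN3_PER_LANE_MB = 985

def link_gbps(per_lane_mb: int, lanes: int) -> float:
    """Approximate usable bandwidth of a PCIe link in GB/s."""
    return per_lane_mb * lanes / 1000

print(f"CPU x4 Gen3 (dedicated NVMe): {link_gbps(GEN3_PER_LANE_MB, 4):.1f} GB/s")  # ~3.9
print(f"Chipset x4 Gen2:              {link_gbps(GEN2_PER_LANE_MB, 4):.1f} GB/s")  # ~2.0
# Everything behind the chipset also shares a single x4 Gen3 uplink to the CPU:
print(f"Chipset uplink (x4 Gen3):     {link_gbps(GEN3_PER_LANE_MB, 4):.1f} GB/s")  # ~3.9
```

So a second NVMe drive hung off the chipset tops out around 2 GB/s and shares the x4 uplink with every other chipset device, which is why the CPU-connected lanes are the ones worth fighting over.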

Rubbish, SLI/Crossfire support is driver-centric. All games will run fine in a dual GPU config.
Yes, it depends on drivers - SLI profiles in the drivers, specifically. SLI has zero effect without a bespoke profile for the game in question (activating it in a game without a profile usually leads to a tiny but measurable performance drop, bugginess, or nothing at all happening). For some games, modders even make their own profiles, with varying success. The only difference between SLI and DX12 multi-GPU in this regard is that the effort lies with Nvidia and not the developer. The statement that "all games will run fine in a dual GPU config" is thus either false (no performance scaling without a profile) or meaningless (defining "running fine" as not requiring performance scaling, invalidating the point of SLI).
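
To illustrate what "driver-centric" means in practice, here's a toy sketch of the profile-lookup logic (purely illustrative - the names and structure are invented for this example, not taken from Nvidia's actual driver):

```python
# Toy model of per-game SLI profiles. All names here are hypothetical;
# the point is that without an entry for a game, the driver has nothing
# to go on and the second GPU contributes nothing.
SLI_PROFILES = {
    "witcher3.exe": "AFR",   # alternate-frame rendering
    "gta5.exe":     "AFR2",  # a different frame-splitting mode
}

def select_sli_mode(game_exe: str) -> str | None:
    """Return the SLI rendering mode for a game, or None if no profile exists."""
    return SLI_PROFILES.get(game_exe)

if select_sli_mode("brand_new_game.exe") is None:
    # No profile: rendering falls back to a single GPU
    # (or worse, a forced mode that hurts performance).
    print("No SLI profile - the second GPU sits idle.")
```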

As for the 2070 having SLI, there are no SLI fingers visible on the back of the board (scroll down for a picture of the back). For previous cards, the SLI fingers needed a cut-out in the backplate, so unless they've redesigned the entire SLI interface, it doesn't have it. There isn't room to fit the bridge connector between the backplate and the PCB, so a cutout or fingers sticking up past the backplate would be necessary. The NVLink slot is also visible from the back on the 2080/2080 Ti. Nvidia cut SLI from the third-largest die (then the 60-series) previously, so it's no surprise if they keep to this line even now that the third-largest die is in the 70-series.

SLI gives those who can afford it the ultimate performance in the (relatively few) games that support it, but given the cost and what you gain back, it's an utter waste of money.
 
As for the 2070 having SLI, there are no SLI fingers visible on the back of the board (scroll down for a picture of the back).
That's a CGI mock-up, not an actual photograph. And the FE RTX cards and many of the AIB cards have a removable cover for the NVLink connector; the FE RTX 2070 cards likely have the same.
The statement that "all games will run fine in a dual GPU config" is thus either false (no performance scaling without a profile) or meaningless (defining "running fine" as not requiring performance scaling, invalidating the point of SLI).
What I meant was that all games will benefit from SLI/CF. I have yet to see a game that doesn't get at least some performance increase from a multi-GPU setup, when properly configured.
SLI gives those who can afford it the ultimate performance in the (relatively few) games that support it, but given the cost and what you gain back, it's an utter waste of money.
That is entirely your opinion, not shared by all.
 
That's a CGI mock-up, not an actual photograph. And the FE RTX cards and many of the AIB cards have a removable cover for the NVLink connector; the FE RTX 2070 cards likely have the same.
The same mock-ups of the 2080 and 2080 Ti have the NVLink connector (with its cover) very clearly visible (it protrudes slightly from the edge of the backplate). Official renders of the 970 and 1070 also clearly showed the SLI fingers. The renders of the 2070 show nothing but a straight edge there - clearly no NVLink connector, and also no SLI finger cutout. Official product renders for Founders Edition cards also tend to match the final product quite exactly.

What I meant was that all games will benefit from SLI/CF. I have yet to see a game that doesn't get at least some performance increase from a multi-GPU setup, when properly configured.
I think what you mean is that all games can benefit from it. Will implies that it'll happen in time, which it won't - even Nvidia doesn't have the resources to do all that development. The problem is that >95% of games never come close to "properly configured" for SLI. The vast majority never even get profiles, and many of those that do never see more than 30-40% scaling (there are exceptions - in the GamersNexus 2080 Ti SLI scaling review I referenced above they had a title with >90% scaling!). Of course, some people don't mind paying 2x the price for 1.4x the performance in <5% of titles, and that's of course their right - but that won't make me stop calling it dumb, bad value, poorly implemented, and generally problematic. 'Cause it is.
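
The value math is easy to sanity-check. Using the prices cited earlier in the thread and a generous 40% average scaling figure (both assumptions, for illustration):

```python
# Rough perf-per-dollar comparison: one ~$699 card vs. two in SLI,
# assuming ~40% average scaling in the minority of titles that support it.
single_price, single_perf = 699, 1.00
sli_price,    sli_perf    = 2 * 699, 1.40

print(f"Single: {single_perf / single_price * 1000:.2f} perf per $1000")  # ~1.43
print(f"SLI:    {sli_perf / sli_price * 1000:.2f} perf per $1000")        # ~1.00
```

Per dollar, the SLI setup delivers roughly 70% of the single card's value - and only in the games where it scales at all.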

I'm quite a fan of the concept behind SLI/CF (I even had CF 4850s back in the day, which worked decently in supported titles up until their 512MB of RAM started being an issue). The problem is that until multi-GPU can be implemented universally and transparently on a general driver basis (meaning no developer effort required, unlike DX12 multi-GPU, and much less driver development required, unlike SLI/CF), it's going to be a niche solution with disappointing results and terrible value.
 
The same mock-ups of the 2080 and 2080 Ti have the NVLink connector (with its cover) very clearly visible (it protrudes slightly from the edge of the backplate). Official renders of the 970 and 1070 also clearly showed the SLI fingers. The renders of the 2070 show nothing but a straight edge there - clearly no NVLink connector, and also no SLI finger cutout. Official product renders for Founders Edition cards also tend to match the final product quite exactly.
Until they actually officially announce that they will not have SLI, I'm not willing to accept that. They would be shooting themselves in the foot and handing AMD a whole class of customers if they didn't continue SLI on mid-range cards.
I think what you mean is that all games can benefit from it. Will implies that it'll happen in time
Right, bad choice of vocabulary.

We are way off topic here, let's rein it in..
 
Until they actually officially announce that they will not have SLI, I'm not willing to accept that. They would be shooting themselves in the foot and handing AMD a whole class of customers if they didn't continue SLI on mid-range cards.
Don't disagree here (I'm generally not a fan of making features exclusive to high-end SKUs), but seeing how they cut it from XX06 cards last generation, it'd be a bit strange for them to bring back support to this chip tier this generation. Of course, the separation of the three topmost SKUs into three separate silicon dice is itself unprecedented, so who knows what they'll end up doing?
 
Lol. This is gonna be embarrassing. I've decided to find those old slides for you, and out of all the sources, the first one that came up in Google was an article from WCCFTech with the informative title "Fake AMD Ryzen 2800X 12 Core 5.1GHz Slide Sends Media Into Frenzy" ))))
So much for keeping up with news.... :banghead:

So, all we have to go on is a now-taken-down and non-existent MSI promotional video for a B450 motherboard that claimed "8-core and up CPU" support... All clues and hints have been meticulously erased.


Not just MSI.

If you look at the manual for the ASRock X370 Pro BTC+, you can see in the BIOS that it lists CPU overclocking options for 16 cores.
 
Not just MSI.

If you look at the manual for the ASRock X370 Pro BTC+, you can see in the BIOS that it lists CPU overclocking options for 16 cores.
... Which fits perfectly with AMD moving their top-end consumer parts to the same silicon as the currently sampling 7nm EPYC with the 3000-series (with two 8-core CCXes per die for a maximum of 64 cores in 4-die EPYC). There's no reason to suspect this is relevant before then.

This aligns with AMD's current strategy, as well as reasonable expectations of its extension into the future. As such, it is the answer that requires the fewest new assumptions (no deviations from current strategy, no unknown silicon, no reconfiguration of the architecture that we don't know of) and is thus the best hypothesis according to Occam's razor.
 
Wrong thread.
Not really - we went off on a bit of a tangent for the past couple of pages. Still, not really on-topic, but neither were the posts preceding it.
 
With four cores per CCX, 10 cores aren't possible? Well then, why not go to 12 cores. And some rejects with two failed cores turned off would be the 10-core parts.
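
The binning arithmetic checks out, as a quick enumeration shows (this assumes a hypothetical two-die part with four 4-core CCXes in total and whole cores disabled per CCX - a sketch of the thread's premise, not a known AMD configuration):

```python
from itertools import product

# Hypothetical part: 4 CCXes (two per die), 4 cores each.
CCX_COUNT, CORES_PER_CCX = 4, 4

# Which total core counts are reachable by disabling failed cores?
reachable = sorted({sum(cfg) for cfg in product(range(CORES_PER_CCX + 1), repeat=CCX_COUNT)})
print(reachable)  # 0 through 16 - both 10 and 12 are reachable

# If every CCX must keep the same core count (as on existing Ryzen parts),
# only multiples of four remain:
symmetric = [n * CCX_COUNT for n in range(CORES_PER_CCX + 1)]
print(symmetric)  # [0, 4, 8, 12, 16]

# A 10-core part therefore needs uneven CCXes (e.g. 3+2+3+2) - which is
# exactly the "12-core with two failed cores turned off" idea above.
```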
 
With four cores per CCX, 10 cores aren't possible? Well then, why not go to 12 cores. And some rejects with two failed cores turned off would be the 10-core parts.
That concept has already been covered/suggested. I agree that it's possible with some sort of combination. We will see what happens.
 
With four cores per CCX, 10 cores aren't possible? Well then, why not go to 12 cores. And some rejects with two failed cores turned off would be the 10-core parts.

They will not do that for the same reason they will not make a 16-core Ryzen chip, and why I think a 10-core Ryzen 2800X is likely just a rumor. It doesn't make any sense to cannibalize your own market segments. We already have a 12-core/24-thread Threadripper chip in the 1920X, with a possible 2920X on the way. There is no reason to shoot yourself in the foot by offering a 10- or 12-core Ryzen chip.
 
There is no reason to shoot yourself in the foot by offering a 10- or 12-core Ryzen chip.
But that isn't what would happen. They would be putting inventory to use that would otherwise sit unused. That's not shooting oneself in the foot, it's being smart. Shooting themselves in the foot would be wasting those unused dies.
 
Not based on a tinge of evidence, and it would likely massively increase the current die size for Zen+. It is an idiotic article by people who need to generate buzz when there is nothing out there at all to substantiate it. Fake news, in this case.
 
That's the 16+4+4 I mentioned above. 16 for graphics (and general usage, really), 4 for NVMe (or, again, anything, really), and 4 for the chipset link. The chipsets only provide PCIe 2.0 (8 lanes for the x70 chipsets, 6 for the x50 ones). So if you want/need more than one full-speed NVMe SSD (which is growing more likely as time passes), you need to eat into the 16 GPU lanes, which means that few if any motherboards will provide more than one NVMe port from the CPU-connected lanes - they'll use the chipset 2.0 lanes instead. Of course, running your GPU at x8 isn't actually a problem, but using the freed-up lanes for storage requires a riser card for the SSD.
The X470 chipset abolished PCIe 2.0 lanes. All lanes are PCIe 3.0 in the X470 chipset.
 
The X470 chipset abolished PCIe 2.0 lanes. All lanes are PCIe 3.0 in the X470 chipset.
No.
AMD said:
PCI EXPRESS® GP*
x8 Gen2 (plus x2 PCIe® Gen3 when no x4 NVMe)
Link (scroll down the page).
 