
AMD Ryzen 9 7900X3D Now at a Mouth-watering $329

This is not true.
Back in 2007, the average salary in my country was the equivalent of $276, which means one could buy an 8800 Ultra with 3.6 average salaries.
Today, the average salary in my country is the equivalent of $1,278, which means one could buy a $1,291 graphics card with a single salary and a bit more.

There are many production areas in which, thanks to process optimisations and cheaper materials, the costs of goods end up lower than in the past.
Your standard of living has obviously gone up quite a bit. That's a good thing for you, but I am talking about the $829 USD you posted: I used an inflation calculator provided by the US government to translate that $829 into today's buying power. There will always be inflation as long as we use fiat currency.
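For anyone who wants to sanity-check that conversion: the math behind those calculators is just a ratio of price-index levels. A back-of-the-envelope sketch, with rough placeholder CPI figures rather than official BLS numbers:

```python
# Back-of-the-envelope inflation adjustment: scale a historical price
# by the ratio of consumer price index levels. The CPI values below
# are rough placeholders, not official BLS figures.
def adjust_for_inflation(price: float, cpi_then: float, cpi_now: float) -> float:
    """Carry a historical price forward by the CPI ratio."""
    return price * (cpi_now / cpi_then)

# e.g. the 8800 Ultra's $829 launch price carried from 2007 to today:
print(f"${adjust_for_inflation(829, cpi_then=207.3, cpi_now=310.3):,.0f}")
```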

I wouldn't place any huge bets on the RTX 5090. Going by credible rumors, nVidia will once again leverage a custom 5nm process for the consumer RTX 5000 series. Sure, there will probably be some optimizations but it won't be a huge step forward compared to their current custom 5nm (N4) process. It is definitely disappointing that they will not switch to a 3nm process after over two years but also understandable that they are going all-in on 3nm with AI exclusively.

Anyway, since the GPUs will be made on the same(-ish) node as the RTX 4000 series, all improvements in terms of performance and efficiency will have to come from the new architecture and there is only so much you can do with that. I would temper expectations and expect a rather moderate +30% over the RTX 4090.

I also wouldn't expect any fancy new features. They have surprised us with some nice innovative stuff like RTX Remix in previous gens but I don't see that happening again as long as all of their top talent is very busy with AI/datacenter. Let's be blunt here. nVidia is going to more or less half-ass the RTX 5000 series because it is not at all their priority at the moment. AI datacenter keeps on exploding exponentially and it is only logical and natural that nVidia have all hands on deck and are leveraging the best production node (3nm) for AI exclusively. Gaming is forced to take a backseat for now.

Finally, I wouldn't ever get the 7900X3D. You get the worst of the X3D CPUs with that model because it is a compromise on all fronts. Either get the 7800X3D (gaming focus) or the 7950X3D if you really have a use case for the additional cores (rendering, content creation or whatever). The 7900X3D is mid garbage. Personally, I do not like the X3D CPUs above the 7800X3D at all because of the requirement to run Xbox Game Bar (ewww) and to activate the abysmal 'Game Mode' in Windows just so the scheduler knows to park games on the V-Cache CCD. Those requirements are unacceptable to me.
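If you'd rather skip Game Bar entirely, you can pin a game to the V-Cache CCD by hand. A minimal sketch using Python's psutil, assuming CCD0 is the V-Cache die and maps to logical CPUs 0-11 on a 7900X3D (check your own topology first); the executable name is a placeholder:

```python
import psutil

# assumed: on a 7900X3D, CCD0 (the V-Cache die) is logical CPUs 0-11
VCACHE_CPUS = list(range(12))

def pin_to_vcache_ccd(process_name: str) -> None:
    """Restrict every process with this name to the V-Cache CCD."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            try:
                proc.cpu_affinity(VCACHE_CPUS)  # set the affinity mask
                print(f"pinned PID {proc.pid} to CPUs {VCACHE_CPUS}")
            except psutil.AccessDenied:
                print(f"no permission to pin PID {proc.pid}")

pin_to_vcache_ccd("game.exe")  # hypothetical game executable name
```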
Besides, if anyone is waiting until later this year, then I would definitely wait an additional two or three months and get the 9800X3D, which should be introduced at CES in January 2025 and available by March or so.
Thanks for the info. As you can see I'm not well informed about AMD CPUs right now. I will look into the 7800X3D. You probably saved me from spending more than I needed to.
 
People don't want the two-CCD chips, and the continual price drops tell us demand is flagging...
 
Demand for this one is pretty bad.
It's kind of a weird stepchild for gamers, since it basically acts as a 6-core in games, while in other workloads it is a lower-clocked 7900X. On top of that there is the scheduling issue.
It really wasn't attractive at all given it launched at a price higher than the 14700K, which has 8 P-cores. And this was before people realized what's going on with Raptor Lake.
Exactly. It's like, if one is only gaming on their rig, then up to 12 threads is all that's needed until next-gen consoles hit the market. I mean, with a game like Starfield for example, a 12-thread CPU is all that's recommended, and that's a late-'23 release.
Dedicated gamers would be better off spending their money on other parts of the rig for a better all-round gaming experience. Even on stuff like chairs.
 
I've been using Intel CPUs for 17 years now, since my first build, but I'm going with Ryzen on my new build later this year. There's just no reason not to, imo.

Yep, I'm in the exact same boat, except I've been using Intel for over 20 years. Will wait for the next Intel vs AMD results to make a more informed choice, but the way it's looking, I could finally be going AMD : )
 
Yep, I'm in the exact same boat, except I've been using Intel for over 20 years. Will wait for the next Intel vs AMD results to make a more informed choice, but the way it's looking, I could finally be going AMD : )
Well if it helps, almost 70% of us here at TPU use AMD.


AMD CPUs far exceed Intel CPUs in too many metrics.
 
Still pretty expensive here but looks like the 5700X3D price cut does affect Finland too. Though I'll stay with my current 5800X.
 
I'd guess laptops are one exception.
One area where I'd argue AMD is favoured, due to the GPU advantages you get now.
 
Not interested...
 
One area where I'd argue AMD is favoured, due to the GPU advantages you get now.
Favoured? In sales? There are A LOT of laptops being sold where the IGP doesn't matter.

Besides sales, AMD doesn't have any IGP advantages anymore like they did six months ago.
 
One area where I'd argue AMD is favoured, due to the GPU advantages you get now.
Yes, and AMD also has a huge advantage in laptops for the same reason we like them on desktop: efficiency.

It's just that Intel has so many OEM design wins, and they can saturate the market with cheap dual- and quad-cores with bare-minimum iGPUs.
 
If I want an X3D CPU, I get the 7800X3D.
If I want an all-rounder, I get the 7950X.

The 7900X3D just stands there without a purpose.
 
The 7900X3D should have been 8 + 4 cores, not 6 + 6.
If possible.

That's not how it works.

The hardware team takes a defective chip and cuts it down to a usable 6 + 6 cores.

In doing so, they effectively turn a chip that was not suitable for the top tier into one that works for a lower tier, and it gets sold.

How or what model is being released doesn't matter. You can still disable the 2nd CCD if that is an issue for you.

The chip performs perfectly for what it does.
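As an illustration of that salvage logic, the per-die decision is essentially the toy sketch below — purely illustrative, not AMD's actual binning flow, which also weighs clocks, leakage, and demand per SKU:

```python
# Toy model of die harvesting: fuse off failed cores, then sort the
# die into whichever SKU tier it can still serve. Purely illustrative.
from dataclasses import dataclass

@dataclass
class Chiplet:
    good_cores: int  # cores that passed test, out of 8 physical

def bin_chiplet(die: Chiplet) -> str:
    if die.good_cores >= 8:
        return "8-core CCD (7800X3D / 7950X3D tier)"
    if die.good_cores >= 6:
        return "6-core CCD (7900X3D tier)"  # fused down to 6 active cores
    return "salvage / scrap"

for good in (8, 7, 6, 4):
    print(good, "good cores ->", bin_chiplet(Chiplet(good)))
```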
 
I wish I could have paid $399 Canadian for my 7900X3D, but it is May and that means Computex. If you are looking at one now, I would get it before they are gone. Where I live the 7800X3D is $394, and then you know the price. For $5 I will always take more cores. The same argument for the 5900X vs the 5800X3D holds true for the 7900X3D vs the 7800X3D, with the difference being that you now have V-Cache either way. I actually bought a 7800X3D to see, used it for a week and returned it. Yes, I will admit that in some games the FPS counter was higher, but I also know that computing feels faster with more cores. All of this holds until the next 12-core X3D chip launches in about a month. Then we are going to see how the 7900X3D does at CPU mining.
 
That's not how it works.

The hardware team takes a defective chip and cuts it down to a usable 6 + 6 cores.

In doing so, they effectively turn a chip that was not suitable for the top tier into one that works for a lower tier, and it gets sold.
Nothing you say stops AMD from doing it, but it would change the BOM (if that's even a term used for CPUs lol).

Besides, it would allow AMD to get rid of some chiplets with only 4-5 working cores, although I bet there aren't many.

How or what model is being released doesn't matter. You can still disable the 2nd CCD if that is an issue for you.

The chip performs perfectly for what it does.
No.

Six cores with 3D V-Cache perform worse than eight, and that was my whole point.

I bet 8 + 4 would work better, and given that the 7900X3D originally cost $150 more than the 7800X3D, it wouldn't have been too much to ask for AMD to use a full 8-core 3D chiplet.

Keep the defective ones for a 5600X3D successor (shown below, using a 7900X3D with one chiplet off).
[attached screenshot: 7900X3D running with one CCD disabled]
 

In the above video, and I think another one, there's a scene where the AMD hardware specialists are given semi-defective CPUs and try to simply make a different version (fewer cores, lower clocks) out of them so they can be sold.

As far as I understand, they simply fuse things off in hardware, triggered through software, so the process is irreversible. It gives a good insight into how CPUs are made. The whole X3D thing started with a failed EPYC chip, and it was up to them to figure out what to do with it.

The additional cache had one excellent use case: gaming!
 
Nothing you say stops AMD from doing it, but it would change the BOM (if that's even a term used for CPUs lol).

Besides, it would allow AMD to get rid of some chiplets with only 4-5 working cores, although I bet there aren't many.


No.

Six cores with 3D V-Cache perform worse than eight, and that was my whole point.

I bet 8 + 4 would work better, and given that the 7900X3D originally cost $150 more than the 7800X3D, it wouldn't have been too much to ask for AMD to use a full 8-core 3D chiplet.

Keep the defective ones for a 5600X3D successor (shown below, using a 7900X3D with one chiplet off).
The 7800X3D was not available when the 7900X3D launched.
 
Funny, my i5-2500K finally died when the 7800X3D came out. I had to pay exactly 650 bucks (with tax) to buy it. A few weeks later, it was down by 150. Now? It's even cheaper, around 370. Funny thing is, the 7900X3D was even more expensive. Now it's cheaper than the 7800X3D here, and WAY cheaper than what I had to pay. Day-one scalping prices can be brutal sometimes, but also... CPU prices really do go down over time, unlike GPU prices. I can still buy an RTX 3070 for 650 bucks. Crazy, for something so old now. The 4070 is pretty much the same price too, a cool 700-750.
 
It came 37 days later, what difference does that make?
So I had a 5900X system and then decided to get the 5800X3D. I noticed that while the 5800X3D was better in gaming, generally speaking, I missed the buttery-smooth feeling that the 5900X gives you. That is why I took 4 fewer cores and saved $350: the 7900X3D was $650 when they launched, and I got one. I knew the 7800X3D was coming, but I do not want just 8 cores. If a person buys this chip, they will not be sad with the performance. BTW, they also work in A620 boards and have great power draw.
 
An 8+4 config likely doesn't exist because they were shunting as many of those dies as they could into 7950X3D and 7800X3D SKUs. It could possibly happen now, but I doubt it.

The problem with the 7900X3D is that the 7800X3D outperformed it in games for less.
 
I'd expect a lot more deals to come round in the next few weeks/months. Got to start clearing stock for the new release coming soon. Most will be aimed at US first but keep an eye out for them spreading worldwide.
 
The 7900X3D and 7950X3D are CPUs looking for a purpose.

Inter-CCD latency makes them hard to recommend for gamers. You can always disable half the cores, but then you just have an overpriced 6- or 8-core CPU; you should have bought that in the first place.

For productivity, the asymmetrical cache causes scheduling headaches and in most instances it adds nothing to performance. In reality the additional layer of cache imposes clockspeed, voltage, and cooling complications that simply aren't there on the regular 7900X and 7950X.

IMO the 7800X3D is the only X3D AM5 part worth considering, unless AMD make the bold move of adding 3D V-Cache to both CCDs on the 7900 and 7950 variants.
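For what it's worth, you can see the CCD split for yourself on Linux by grouping logical CPUs by the L3 cache they share. A rough sketch, assuming index3 corresponds to the L3 on your system (it usually does on x86):

```python
# Group logical CPUs by which L3 cache they share; on chiplet Ryzens
# each L3 domain is one CCD. Linux-only, reads sysfs. On an X3D part,
# the CCD that reports the larger L3 size is the V-Cache die.
import glob

domains: dict[str, list[str]] = {}
for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cache/index3/shared_cpu_list"):
    with open(path) as f:
        shared = f.read().strip()  # e.g. "0-5,12-17"
    cpu = path.split("/")[5]       # e.g. "cpu0"
    domains.setdefault(shared, []).append(cpu)

for shared, cpus in domains.items():
    print(f"L3 domain {shared}: {len(cpus)} logical CPUs")
```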
 
Demand for this one is pretty bad.
It's kind of a weird stepchild for gamers, since it basically acts as a 6-core in games, while in other workloads it is a lower-clocked 7900X. On top of that there is the scheduling issue.
It really wasn't attractive at all given it launched at a price higher than the 14700K, which has 8 P-cores. And this was before people realized what's going on with Raptor Lake.

Well, I would have been interested, but I would have to upgrade everything once again. Also, my PC will still be fast enough for some time. The GPU, on the other hand, will need upgrading, but that will only happen if nVidia isn't absurd with their asking prices in the future.

Sick of overpriced sand.
 