Sunday, August 6th 2023

AMD Retreating from Enthusiast Graphics Segment with RDNA4?
AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs without enough volume over the product lifecycle. The company's "Navi 21" GPU benefited from the cryptocurrency mining swell, but just like NVIDIA, the company isn't able to move enough GPUs at the high end.
With RDNA4, the company will focus on specific segments of the market that sell the most, which would be the x700-series and below. This generation will be essentially similar to the RX 5000 series powered by RDNA1, which did enough to stir things up in NVIDIA's lineup, and trigger the introduction of the RTX 20 SUPER series. The next generation could see RDNA4 square off against NVIDIA's next-generation, and hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz
363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?
Pointless to imagine the 4090 and 7900 XTX at the same price unless we also imagine the same performance. If so, I'd still buy the 7900 XTX for its modern DisplayPort 2.1 at 54 Gbps, whereas the 4090 offers me seven-year-old DP 1.4 with barely 60% of that bandwidth. In reality, though, it's 25% faster in 4K but 60% more expensive. Not for me. You need to realize that different buyers favour different features. I am aware of what is appealing about the 4090, but it's not for me. Most people enjoy drinking milk. I don't.
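The DisplayPort bandwidth gap is easy to quantify. A rough sketch (approximate blanking factor, DSC compression ignored): DP 1.4 HBR3 tops out at 32.4 Gbps raw, about 25.9 Gbps effective after 8b/10b encoding, while DP 2.1 UHBR13.5 carries 54 Gbps raw, about 52.4 Gbps effective with 128b/132b encoding.

```python
# Rough DisplayPort bandwidth check. Approximate: real modes use CVT-R2
# blanking timings and links may use DSC compression, both ignored here.

def required_gbps(width, height, hz, bpc, blanking=1.05):
    """Uncompressed bandwidth needed for a video mode, in Gbit/s."""
    bits_per_pixel = 3 * bpc          # RGB, bpc bits per channel
    return width * height * hz * blanking * bits_per_pixel / 1e9

# Effective link rates after encoding overhead
DP14_HBR3 = 32.4 * 8 / 10             # 8b/10b    -> ~25.9 Gbit/s
DP21_UHBR13_5 = 54 * 128 / 132        # 128b/132b -> ~52.4 Gbit/s

mode = required_gbps(3840, 2160, 144, 10)   # 4K 144 Hz, 10-bit
print(f"4K144 10-bit needs ~{mode:.1f} Gbit/s")
print("Fits DP 1.4 (HBR3):", mode <= DP14_HBR3)
print("Fits DP 2.1 (UHBR13.5):", mode <= DP21_UHBR13_5)
```

With these rough numbers, uncompressed 4K 144 Hz 10-bit already overruns a DP 1.4 link but fits comfortably in UHBR13.5, which is the practical difference being argued about here.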
In gaming, the difference is 34 W; at maximum, it's 41 W.
Quite embarrassing, because the RX 7600 has the power consumption of a 4060 Ti, yet only matches the non-Ti 4060 in rasterization.
You are an AMD fan, a radical one, but try not to be embarrassing. A 20% increase at 4K means a lot, the rest of the technologies leave AMD's cards looking like juniors, and you're tied to a display connector? LOL!!!!!
Even my old 3070 Ti outperforms the 7900 XTX in rendering in every program that includes OptiX.
P.S. We were talking about the 7600 versus the 4060. I don't know why you put the 4060 Ti in the same pot; it's similar to the 7600 only in terms of power consumption. And NVIDIA has the advantage with the voltage curve: with the old GTX 1080, I stabilized it at 80 W instead of 120 W, and the RTX 3070 Ti achieves the same performance at 230 W (default 290 W).
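Plain power capping of the kind described above, as opposed to full voltage-curve shaping, can be done from the command line on Linux with NVIDIA's driver tools. A minimal sketch; the 230 W figure is just the value from this post, and the valid range depends on your card and driver (shown by the query):

```shell
# Query current, default, and min/max enforceable power limits
nvidia-smi -q -d POWER

# Cap the board power limit to 230 W (needs root; must be within the
# Min/Max range reported by the query above)
sudo nvidia-smi -pl 230
```

Note that `nvidia-smi -pl` only clamps total board power; actual point-by-point voltage/frequency curve editing (as in MSI Afterburner on Windows) is not exposed through this tool.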
Nothing new.
A good strategy, imo, as long as you are the weaker (by a wide margin) player.
Chasing the enthusiast segment head-on, at the expense of the other segments, would be a poor choice.
As for the current rumour, well, rumours, leaks and other dripping matters... paint it blue and throw it into the sea.
In the TPU survey, AMD is in 2nd place at all levels, and the RTX 4000 series is on the rise compared to the RX 7000 series.
Globally, the situation is dramatic for AMD, with NVIDIA capturing 82% of the discrete video card market at the end of 2022. It is probably even more dramatic now, because AMD cannot keep up with NVIDIA's releases.
www.tomshardware.com/news/intel-looks-to-be-catching-up-with-amd-discrete-gpu-matket-share
There are figures, there are real data, and the discussions here are for the sake of discussion. If a wart on NVIDIA graphics cards bothers you, buy a slower, less power-efficient AMD video card. They don't have NVIDIA's pedigree, they don't have a good grasp of modern technologies beyond rasterization, some even have problems with drivers, but they are cheaper. Everyone chooses what they want.
Strictly for 7600 versus 4060, I would not give up the technologies offered by nVidia for 25-50 euros. My unwavering opinion.
I'm not buying anything because the choice made in 2021 was good and I can quietly wait for the next generation. No game forces me to give up ray tracing for a satisfactory framerate, nor am I forced to use DLSS. Not yet.
You sure that's a good example?
You'd better deal with the tribalism in your own mind before you project it onto others. And ask members questions, above all, before you post nonsense judgements that sound like uninformed, meaningless labels. Nvidia cards are great in OptiX, no doubt, but it's a niche workload, just as 95% of GPU owners will never encode a single media file. Media encoders/decoders are pretty similar between new cards from all three vendors, especially for H.265 and AV1. There is still a hiccup with H.264 on AMD, which is being worked on, but other than that, it's fine across various codecs and variable bitrates. See Jarred Walton's extensive testing of media engines on twelve Intel, AMD and Nvidia cards, published on Tom's Hardware. There's an informative table that compares all the cards and one i9 CPU.
RDNA3 cards have already improved greatly in HIP workloads (see Phoronix's extensive testing on this), and AMD engineers are working on the last bit of HIP RT for Radeon cards, which should release in a month or so, alongside or shortly after the ROCm package. It's all open source, so it takes longer, but once finished, there will be wide, free access to many professional and virtual GPU tools for everyone regardless of GPU brand, which is great for consumers, especially for not-so-well-off creators who cannot afford packages behind Nvidia's paywall.
Above all, many performance gaps will narrow significantly, and in some workloads the cards will trade blows. At the end of the day, this area has been significantly accelerated by AMD, the effort is visible, and a growing number of creators and professionals do not want to accept CUDA-only dominance. At some point, in a few years, Nvidia is predicted to ditch many paid subscriptions once they see that people are able to use open-source applications. Ironically, they might be tempted to suppress some open-source applications on GeForce cards, which would be absurd. There is no sane person on this planet who would not support the accelerated adoption of open source. The ever-growing Linux community is especially enthusiastic about it and cannot stand paid monopolies.

It was you who mentioned the 6700 XT, so I went to the available testing videos from Steve at HUB. Both the 6700 XT and 6800 XT in his tests show how almost three-year-old RDNA2 cards expose the absurdity of the price/performance of both the 4060 Ti 8GB and 16GB models. Watch them; the videos are eye-opening.

The 4060 would be good at $250 + game ($40 saved), and the 7600 is already good at $230 + game ($70 saved). Those cards are fine; the prices are wrong, with or without bundled games. Low-tier class-60 cards with 8GB cannot cost more than $230-250 in 2023, as their generational performance uplift is rather miserable, especially the 7600's.

By the way, Overwatch 2 is the worst-rated game on Steam, voted so by damning reviews, so this $40 game bundle with 40-series cards sounds like fresh juice and a rotten egg for breakfast. Laughable.
www.techspot.com/news/99766-overwatch-2-worst-game-steam-according-user-reviews.html

Why such a nonsense truism, dude? Did you re-discover America today? Are you enjoying bathing in Columbus's glory? Is your ego happy now? So many questions.
The TPU survey overrepresents AMD cards compared to the global GPU market share. If you add up the numbers, AMD has 34%; in reality it's less. Brain hurts. Get over yourself and tell us whether Navi 41 is cancelled or not. Nvidia is not the main topic of this article and you are spamming the thread. Why such drama, dude? Are you informed?

That survey by Jon Peddie was officially called out for being deceptive regarding the "9%" for Intel cards. It turned out that tens of thousands of rogue cards were wrongly counted in the client segment. Plus, you need to know his methodology: he does not talk to and gather data from all OEMs, and is missing a significant portion of the market.
www.jonpeddie.com/news/adjustment-to-q422-jpr-dgpu-market-report/
You should know more about AMD's discrete GPU business in a wider context before you post more braindead comments. Their client GPUs are less than 9% of total revenue and a small portion of an otherwise important product portfolio. They are not even aiming to increase GPU market share to 30% or above, as they are now heavily focused on other, more lucrative segments, such as CPUs, AI APUs/GPUs, FPGAs and consoles. Those bring over 90% of revenue. Once you understand this, it is easier to see that there is no "drama" for AMD. Grow out of that braindead narrative now. You only have one chance to be rational and reason logically with adults here.
AMD has grown in the server CPU market from almost 0% in 2017 to a projected 30% by the end of this year. It has been a monumental effort and market-share growth unheard of in server, more difficult than any attempt to change the GPU ratio with Nvidia, as Intel had an almost exclusive foothold globally, very difficult to tackle with entrenched enterprise deals. But the investment in the Zen architecture, relentless drive and R&D paid off. The entire company was invested in it.
You can't have two or three equally big pushes in different segments at the same time. AMD was a small-revenue company pre-Covid, just below $7 billion annually. They almost quadrupled this in a few years, thanks mostly to Zen, consoles, and now FPGAs and embedded too. Once they bring themselves into a roughly equal position with Intel in server, in 2-3 years, they might be able to refocus more effort and R&D into GPUs and give them a more serious push, depending on how AI pans out.
The key takeaway is that AMD neither craves nor needs more market share in client GPUs in this period, as their priority is elsewhere. They will be present with GPUs, hovering around 15-25%, but that's it. So it's completely stupid to post any narrative with "AMD GPU drama" in it. Nonsense. If you want to be Nvidia's press secretary, endlessly repeating how much market they have captured, go find a job in Nvidia's marketing department, and good luck with the good salary.

I explained to you the figures from Jon Peddie. I explained to you the situation with media engines and the constantly improving, open-source modern technologies. The drivers "issue" is nonsense, as the latest research suggests that both vendors' cards have roughly equal numbers of reported issues. In the last three years, I had three or four game crashes, mostly because I was pushing the card really hard. Adrenalin software has never been as stable and pleasant to interact with, whereas Nvidia's GUI on my laptop feels dated, from the Stone Age. They could finally modernise it too. Actually, when was the last time you used an AMD card and interacted with its software? Have you noticed the changes?

Technologies are useless if the generational performance uplift is dismal. RT is completely useless on lower-tier cards: it kills performance, and the card chokes on 8GB of VRAM in an increasing number of games. So, I'd buy the 7600 for $230 with a game ($70 saved) for my cousin's ITX build.
The 3070 and 3070 Ti are increasingly choked by 8GB of VRAM, and many Nvidia owners complain about it. They realised that having the bare minimum of VRAM does not make their cards future-proof for more demanding games. It beggars belief how Nvidia has been able to get away with low VRAM on most of their Ampere cards. Clearly, the marketing propaganda that blinded people with DLSS and RT worked to hide and mask native hardware deficiencies. See the testing in specific games by Hardware Unboxed; it's clearly visible in gameplay. Also, RT performance in several games makes them an unplayable mess, with stuttering and unloaded surfaces too. So, it depends on what you play. Most stuff will run well with reasonable settings, though. Each card has its own limits.
But, let's leave all that and discuss Navi 41 in greater detail. Will it be released or not? What are the architectural challenges and where is the rumour rooted?
That being said, btt: I'm still not 100% sold that AMD will skip the enthusiast (high-margin) market for RDNA 4. Rumors are just rumors, and rumors can also change opinions (AMD's included).
I think this is severe mismanagement on AMD's side. They should stop the development of CPUs and instead focus 100% on GPUs; there is much more money in them.
I disagree with you and think you're exactly 100% wrong, IMHO.
But they 100% won't do that, because they use RTG products for the data center (Instinct), which are high-margin and successful; they use them for consoles (not high-margin, but very successful); and they use them for APUs, which are rather successful and not bad margin-wise. And then, of course, lastly, they use them to build gaming graphics cards people can buy and put into their PCs, or buy a laptop with a Radeon GPU integrated. In other words, they have no upside in selling it off.
Oh, and that's the other problem: they could only sell it to someone like Intel, for example, a competitor. RTG cannot suddenly become an independent company like ATI again. That ship has long sailed.
Edit: there is nothing "left over"; there is just some minimum amount of wafers allotted to Radeon so it can barely succeed. In a perfect world, Radeon would get many more wafers, and even Ryzen / EPYC would. AMD cannot saturate any of these markets, maybe aside from consoles, right now.
Intel's market share in server and consumer is also higher than AMD's because they have a lot of wafers, due to producing them themselves. Not because their products are any better; they aren't, aside from some niches.
It is a critical mistake to rely only on one supplier - in this case only TSMC. There must always be diversification.
When the manufacturers are actually four (or more):
1. GLOBALFOUNDRIES
2. Intel
3. Samsung
4. TSMC
2) it's not their business to invest for other companies
3) these companies also need RESEARCH to be as good as TSMC. That's years of research, and recruiting personnel who can possibly compete with the excellent people who work at TSMC.
AMD cannot finance that or do the job for others. What AMD can do is work with others on single products, like HBM in the past, or as Nvidia did on GDDR5X and GDDR6X with Micron.
The only (fabless) company I know of that has directly subsidized nodes is Apple. They have the money to do that; AMD does not. What they get for it is priority availability for their products (like phones) on e.g. 5nm, and now 3nm for the next Apple products that are coming.
Don't forget also that GLOBALFOUNDRIES is actually the former AMD's manufacturing division.
No, GF was divested by AMD many years ago. First it was spun off into GF, which already meant AMD sold a lot of shares and control of the company to a certain investment group in the Middle East (I don't know the exact names). Before it was GF, it was part of AMD; this was a decade ago and more. Otherwise, AMD is AMD (Advanced Micro Devices, Inc.); it's not a subsidiary of anyone.
MLID has a new video claiming that Navi 4C, with between 13 and 20 (!) chiplets, was cancelled... :confused: :kookoo:
AMD went in full retard mode. The gaming market will feel the pain in the coming years.
MLID's video is named RX 8900 XTX Design Leak, Navi 43 Hopes, Nvidia Exiting High End, AMD FSR 3 | Broken Silicon 218
videocardz.com/newz/amds-canceled-radeon-rx-8000-navi-4c-gpu-diagram-has-been-partially-leaked