
AMD "Navi 31" Memory Cache Die Has Preparation for 3D Vertical Cache?

Which charts? Like I said, the 6600 is faster.
The charts in general.
Please decide: does the vast majority not care, or can the vast majority compare the RX 6600 vs the RTX 3050? And in both cases, why buy the RTX 3050?
So basically your theory is that if AMD has some hypothetical top performer in literally anything, then they could sell every heaping pile of crap they can come up with, because people will just buy it?

Why would that be advantageous to you, the consumer?
So, now you know which charts.

It's not just my theory. It's something that has been said for decades.

And please decide. Do consumers know or don't they? You keep using the "consumer knows" and "consumer doesn't know" arguments in different parts of your posts.

Why wouldn't you ignore it? Cards at that price point work fine in RT at 1080p and even 1440p, but at 4K, RT performance is crap on everything, and you'd be an idiot in my opinion to buy any card for that purpose in particular, because if games are barely playable right now at that resolution, they're gonna run like absolute crap in a few years.
Because it's not only about being able to run fine today in current games. It's also about having enough performance for the games that will come out in 2-3 years. You yourself point at future performance at the end of your post as a parameter.
But by setting your OWN bar of acceptance, you don't build an argument. You just express your opinion about your criteria and about your needs. That's NOT an argument.
 
Please decide.

No, you decide, first you told me Nvidia cards sell better because they have better RT performance, now you're telling me they sell better because people are low IQ troglodytes and they see Nvidia at the top of some random chart and they think any card that comes with a green sticker on it must be blazing fast. You didn't use those exact words but that's the implication.

But you still didn't explain why you think it's better for AMD to do that vs Nvidia.

That's NOT an argument.
My argument is pretty clear since it can be measured objectively: RT performance sucks on everything. But what's your argument for caring about it?
 
Buying close to $1,000 GPUs for 1080p and even 1440p is comical though.

I couldn't agree more... The 4070 Ti was designed from the ground up as a $400-tops GPU. They're making a killing on the backs of gamers :(

Anyway, I think Intel has a safe position there purely because of their R&D resources. But as you said... they need a product first...
 

The cost of things is going to keep going up as complexity goes up. Repeat after me: things will get more expensive each generation, and this won't change. If you want cheaper GPUs, then quit asking for better graphics, physics, and games. Demand that improvements stop. Gamers aren't a special class; gamers' demand for graphics is why they are in this situation. You wanted improvements, you wanted the PC to beat consoles, now you got it... but you don't want to pay the bill for what you asked for.
 
The cost of things is going to keep going up as complexity goes up. Repeat after me: things will get more expensive each generation, and this won't change. If you want cheaper GPUs, then quit asking for better graphics, physics, and games. Demand that improvements stop. Gamers aren't a special class; gamers' demand for graphics is why they are in this situation. You wanted improvements, you wanted the PC to beat consoles, now you got it... but you don't want to pay the bill for what you asked for.
Moore's Law moment.
 
The cost of things is going to keep going up as complexity goes up. Repeat after me: things will get more expensive each generation, and this won't change. If you want cheaper GPUs, then quit asking for better graphics, physics, and games. Demand that improvements stop. Gamers aren't a special class; gamers' demand for graphics is why they are in this situation. You wanted improvements, you wanted the PC to beat consoles, now you got it... but you don't want to pay the bill for what you asked for.

Yes, because that is precisely the logic that has caused technology growth and evolution, right? Not because the entire point of new hardware generations is to provide more for less...
 
I'm more interested to see if AMD will add in some Xilinx blocks that could be configured for either AI, raytracing, or physics calculations as needed. Although having more memory on board would be great too, if only to further reduce the need to load assets from the game drive, as well as to continue improving their SAM implementation.
 
Moore's Law is dead; the YouTuber has talked about this several times over the last few weeks, others too. It's hardly news.

He thinks it's not impossible that an OCD, double-decker-cache 7950XTX3D might turn up.

That might explain the 800-watt 4090 Ti rumour I also just heard... again.

If an 800-watt GPU is released, I think we've definitely hit my limit of "this is a f£#@££ ridiculous amount of power just to play Peggle".

I'm intrigued for sure. I think AMD will re-roll the 7900 like Vega, i.e. next time it's refined and with twice the memory; it could end up like Vega.

I tend to own all the stuff people call rotten, and yet I held that Vega 64 as lofty as a 1080 Ti. Oh, I know the Vega was beaten by the 1080, but in newer games not often by much. I gamed at 4K on it fine for years, and that's the point: it served me very well.
 
No, you decide, first you told me Nvidia cards sell better because they have better RT performance, now you're telling me they sell better because people are low IQ troglodytes and they see Nvidia at the top of some random chart and they think any card that comes with a green sticker on it must be blazing fast. You didn't use those exact words but that's the implication.

I never said anything about people being troglodytes. Putting words in my mouth to make me look bad, is this your new ultra-strong argument?

Also, I haven't changed opinions left and right as you have. Please don't accuse me of what you yourself are doing: avoiding giving a straight answer, calling my opinions assumptions, presenting your opinions as facts. Rasterization performance and raytracing performance are presented in charts. When talking about charts with one company constantly on top of them, do I have to throw out a list of 100 individual charts? They are not random. When we are talking about charts, we are talking about performance charts. Very specific. The only thing that is random here is your, let's say, arguments.

And yes, people do make choices based on charts. It's not about IQ. It's about where someone wants to spend their time. I might spend 10 hours looking at performance charts for GPUs and then go out and buy a vacuum cleaner based on some marketing material and 20-30 minutes of quick YouTube review videos. That doesn't make me a high-IQ person in GPUs and a troglodyte in vacuum cleaners. I just choose where to spend my time.

But you still didn't explain why you think it's better for AMD to do that vs Nvidia.
You lost me here. Maybe if you started answering my questions, instead of jumping left and right avoiding them or disregarding them as assumptions, I could follow you.
My argument is pretty clear since it can be measured objectively: RT performance sucks on everything. But what's your argument for caring about it?
Your argument is based on your own expectations and measurements of what you consider acceptable.
________________________________________________________

The cost of things is going to keep going up as complexity goes up. Repeat after me: things will get more expensive each generation, and this won't change. If you want cheaper GPUs, then quit asking for better graphics, physics, and games. Demand that improvements stop. Gamers aren't a special class; gamers' demand for graphics is why they are in this situation. You wanted improvements, you wanted the PC to beat consoles, now you got it... but you don't want to pay the bill for what you asked for.
Well, maybe, or maybe not. For a start, let me say that in the (distant) past we didn't have to pay extra for extra features and performance. In the (distant) past, companies had to offer MUCH MUCH MORE at (about) THE SAME PRICE to convince us to pay them again. Also, beating the consoles with a PC is not something new. You could always do that. The PC always had components strong enough to beat console hardware and its optimizations. It's not new, and it's not ridiculous to expect to beat the PS5 and Xbox with even a mid-range PC. What changed is that the current mid-range PC costs more than the old high-end PC.

Let's see some examples, which could be completely wrong, or maybe correct: increased complexity, performance, technology, and features at the same price. No software tricks (DLSS, FSR) allowed here.

GPUs

NVIDIA GeForce2 Ultra, 2000, $499​



NVIDIA GeForce FX 5900 Ultra, 2003, $499​


NVIDIA GeForce GTX 680, 2012, $499​



NVIDIA GeForce GTX 1080, 2016, $599​




NVIDIA GeForce RTX 3070, 2020, $499​


NVIDIA GeForce RTX 4060, 2023, $499​


The same is happening with other hardware, like motherboards, where features are removed and the PCB looks emptier compared to older boards, yet the price has skyrocketed. While I have ALSO argued that the costs could be justified taking some factors into consideration, I am still unsure about that.

Let's have a quick look at a few motherboards. Look at the PCB: from overcrowded to simple and almost empty, with the exception of the area around the CPU socket.

Gigabyte 990XA-UD3, AM3+ socket, around $140 (2011?)

[board photo]



MSI X470 Gaming Plus Max, AM4 socket, $120, 2018

[board photo]


ASRock B650 PG Lightning, AM5 socket, $200?, 2022

[board photo]




The Future $500 motherboard?

[concept board image]
 

Nah, I'm the guy who has played games every day for the last 26 years and bought lots of high-end GPUs during that period.

Sure, I could have fun with whatever GPU I've got, but I would definitely have more fun with a faster, more feature-packed GPU vs a slower one.
Sorry for the late reply...
I'm not criticizing your purchase choice; mine is just a call for objectivity. A 10 or even a 20 percent increase is not a big leap.
I'm sure there is research proving that biases have a great impact on how we perceive things.
 
And yes, people do make choices based on charts.
Then you're just wrong according to your own claims.

The 6600 was faster than the 3050, that's what the charts say, so it should have sold better than the 3050, except it didn't. So either accept that you're wrong, or admit that most people don't look at that stuff. It's either one or the other.

Because the alternative is that people look at the charts, they see product X on top, and then they proceed to buy product Y. How else am I supposed to interpret that other than that they're dumb?

Your argument is based on your own expectations and measurements of what you consider acceptable.
I don't think so. Thankfully, TPU has charts that show the performance hit in % when RT is enabled; even the amazing Nvidia cards lose close to 50% of performance when it's enabled. That's huge by any subjective or objective measure.
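For what it's worth, the percentage hit in those charts is just simple arithmetic. A minimal sketch, with placeholder FPS numbers that aren't from any specific review:

[CODE]
# Percentage of performance lost when enabling RT.
# FPS figures are placeholders for illustration, not from a specific review.
def rt_hit_pct(fps_raster: float, fps_rt: float) -> float:
    return (1 - fps_rt / fps_raster) * 100

print(f"{rt_hit_pct(100, 52):.0f}% lost")  # ~48%, i.e. "close to 50%"
[/CODE]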

Tell me who finds this acceptable. Do you? Mind you, this game is already 2 years old.

[chart: performance loss with RT enabled]
 
The topic is --> AMD "Navi 31" Memory Cache Die Has Preparation for 3D Vertical Cache?
Get on topic.
And stop your bickering/arguing.
 
What's the point of more cache? It already had, what, 5.3 terabytes of bandwidth for the cache chiplets? It surely never uses all of it now?

I'll bet it's so they can keep the cache ratio the same while adding another main GCD to the package.

Otherwise, adding more cache for higher bandwidth to something that's not starved for bandwidth is a completely useless endeavor; see Vega 64 for an example.
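For scale, here's the napkin math behind that 5.3 TB/s figure; the even six-way split across the MCDs is my own assumption, purely for illustration:

[CODE]
# Rough arithmetic on Navi 31's quoted peak Infinity Cache bandwidth.
# The even split across the six MCDs is an assumption, not a measured figure.
total_bw_tb_s = 5.3   # TB/s, the commonly quoted peak
mcd_count = 6         # memory cache dies on a full Navi 31

per_mcd_gb_s = total_bw_tb_s * 1000 / mcd_count
print(f"~{per_mcd_gb_s:.0f} GB/s per MCD")  # ~883 GB/s
[/CODE]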
 
RDNA3's problem is raytracing performance, and I doubt extra cache will help there.

And just to be clear, my point of view is that it doesn't matter if rasterization is still king. Raytracing is what helps (if performance is great), or doesn't (if it's not), in marketing those chips TODAY.
I feel the general assumption is that RT is very important, but in reality I don't really think so. I've been gaming without RT, and I really don't feel like I'm losing out on anything. I believe there are great games out there that don't have RT, or where enabling RT doesn't really do much in terms of visuals. And among the most common GPUs out there, the percentage of users with cards that can handle RT well is very low. RT certainly improves the visuals, but to me it is a nice-to-have rather than a must-have. If one's hardware can handle it without dropping FPS too severely into the sub-60 FPS zone, then feel free to use it. With games getting so unoptimized, even RTX 3000 series cards may struggle in new titles with RT and DLSS enabled. So one needs to decide, RT or FPS, and it's generally a no-brainer to focus on FPS for general gaming and use RT just to "sightsee" in games.

What's the point of more cache? It already had, what, 5.3 terabytes of bandwidth for the cache chiplets? It surely never uses all of it now?

I'll bet it's so they can keep the cache ratio the same while adding another main chiplet to the shader array.

Otherwise, adding more cache for higher bandwidth to something that's not starved for bandwidth is a completely useless endeavor; see Vega 64 for an example.
I believe there were some rumors previously about seeing 3D cache on Navi 3x. Cache is generally very low in latency on top of having high bandwidth. VRAM, however, tends to sacrifice latency for raw bandwidth from what I've observed. So loads that favor latency (which may not be 3D games) will run faster. I may be wrong; just a crude guess.
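To make that crude guess a bit more concrete, here's a toy latency-plus-transfer model; every number in it is invented for illustration, not measured from real hardware:

[CODE]
# Toy model: access_time = latency + bytes / bandwidth.
# All latency/bandwidth values are invented, illustrative assumptions.
def access_ns(nbytes: int, latency_ns: float, bw_gb_s: float) -> float:
    return latency_ns + nbytes / bw_gb_s  # 1 GB/s == 1 byte/ns

cache = dict(latency_ns=20.0, bw_gb_s=5300.0)  # low latency, huge bandwidth
vram = dict(latency_ns=300.0, bw_gb_s=960.0)   # higher latency, less bandwidth

for nbytes in (64, 1_000_000):  # a cache line vs a ~1 MB bulk copy
    c, v = access_ns(nbytes, **cache), access_ns(nbytes, **vram)
    print(f"{nbytes:>9} B: cache {c:7.1f} ns, VRAM {v:7.1f} ns")
# Small accesses are latency-bound (cache wins ~15x);
# big transfers are bandwidth-bound (the gap shrinks to ~6x).
[/CODE]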
 
My theory doesn't fall apart just because you say so. Especially when you base that conclusion on nothing.

People buying the RTX 3050 over the RX 6600 do it because Nvidia is winning the charts and its cards are considered to offer the best performance in everything, and especially in the feature most talked about over the last 2-3 years at least: raytracing. It's been common knowledge for about forever, and not just my opinion, that the best high-end card also sells the low-end cards. So people go and buy an RTX 3050 because Nvidia offers the fastest cards and also because its cards are considered the fastest in raytracing. That's what they've heard or been told. So people who don't know about hardware will go and buy the Nvidia card because "Nvidia is faster". The RX 6600 is the better card, and at a lower price point, but people either choose Nvidia or avoid AMD.
People are as ignorant as they can be, and they don't check anything but rather listen to others. And those others don't have to be correct in their assumptions. RT is not a selling point today. Those 3050 or 4050 cards are useless for RT, just like the 6500 is. The RT is so bad that NV had to create DLSS 3 to get frame generation going, since it's not playable on cards going for $1000. Is RT a selling point for NV cards? Absolutely not. Why do people buy NV cards? The PR and hype behind NV cards has been astronomical for generations, and the money spent on it huge. That is why people buy NV, not because of drivers not having problems or RT, because all that is not a fact. You can have your opinion, since nowadays it is important to have one and that is good, but that does not mean it is true or a fact.

I think more cache is a brilliant move to boost performance. I only hope it will not double the price of cards having it.
Considering how crazy the GPU market is, companies will use whatever means necessary for a cash grab. I sincerely hope 3D V-Cache is not going to become one.
 
People are as ignorant as they can be, and they don't check anything but rather listen to others. And those others don't have to be correct in their assumptions. RT is not a selling point today. Those 3050 or 4050 cards are useless for RT, just like the 6500 is. The RT is so bad that NV had to create DLSS 3 to get frame generation going, since it's not playable on cards going for $1000. Is RT a selling point for NV cards? Absolutely not. Why do people buy NV cards? The PR and hype behind NV cards has been astronomical for generations, and the money spent on it huge. That is why people buy NV, not because of drivers not having problems or RT, because all that is not a fact. You can have your opinion, since nowadays it is important to have one and that is good, but that does not mean it is true or a fact.

I think more cache is a brilliant move to boost performance. I only hope it will not double the price of cards having it.
Considering how crazy the GPU market is, companies will use whatever means necessary for a cash grab. I sincerely hope 3D V-Cache is not going to become one.

I agree in general, but I consider RT a selling point, no matter whether people have a clue about it or not.
Also, if people buy green no matter what, it's endgame for AMD. It's pointless to develop new archs and GPUs for PCs.
They can focus on consoles and call it a day.

The 7900s are fast enough, raster and RT, but don't sell because of their prices.
If AMD creates a 7900XTX3D that's 10-15% faster, do you think it will sell?
I believe not.
The package is not attractive. I'll write it again: RT performance, no CUDA alternative, no NVENC alternative, AV1 encoding only on the 7900s, image reconstruction tech inferior to the competition or non-existent, etc.
They have to be priced aggressively to make an impact.
 
I agree in general, but I consider RT a selling point, no matter whether people have a clue about it or not.
Also, if people buy green no matter what, it's endgame for AMD. It's pointless to develop new archs and GPUs for PCs.
They can focus on consoles and call it a day.
I get that you consider it one, and there are a lot of people who consider the same thing. But I have not found evidence or facts that would support RT being a selling point. Actually, I've recently found a lot of indications that RT has no meaningful influence on GPU market sales, and if it has any, it is minor. DLSS 3 is one of those indications. NV saw a lot of people arguing about the RT performance, and thus DLSS 3; otherwise DLSS 2 is plenty and looks better. They need that boost, because for now RT tanks the performance even on $2.5K cards that are claimed to be 2-3 times faster and still barely hit 60 FPS. That is, of course, for people who know their stuff. Most don't care about RT or simply don't know what it is.
 
I feel the general assumption is that RT is very important, but in reality I don't really think so.
In reality, yes, I agree. But people buy based on what's trending, and RT is, because let's not forget that the company pushing it controls 90% of the market, throws a gazillion dollars into marketing, and is also the top brand. If AMD were the one pushing RT, it wouldn't be important. But today it is.

RT over raster reminds me of when graphics cards were going from 16-bit color to 32-bit color. In games, probably no one could see the difference between 16-bit and 32-bit graphics, considering the quality of 3D graphics 20 years ago and the fact that we were on 15-21'' CRT monitors, but the fact is that everyone took the performance in 32-bit color into major consideration before buying a new graphics card. And 3dfx was the one suffering the worst performance drop when going from 16-bit to 32-bit color. Nvidia was the best, I think, with ATI second. Who didn't survive?

People are as ignorant as they can be, and they don't check anything but rather listen to others.
We all are. I bet everyone buys some stuff after hours of checking to see what the best option is, and other stuff that we just walk into a shop and blindly buy, because we just don't care about that category of products and "anything will do just fine".
RT is not a selling point today. Those 3050 or 4050 cards are useless for RT, just like the 6500 is.
I think it is a selling point. And I think people do NOT know that those cards are slow. I doubt people know to ask, "Yes, but is it fast in RT?".
Many, many years ago I was trying to convince someone to buy a 2GB GDDR5 model over a 4GB GDDR3 model (same GPU, I think an R5 250 or something). He went with the GDDR3 model. Not only did I lose an afternoon presenting him performance charts, he was almost annoyed at the end that I had tried to convince him to go with the 2GB model.

Is RT a selling point for NV cards? Absolutely not.
It's the main selling point, together with DLSS 2.0, and Nvidia is so certain that DLSS 3.0 will be accepted that, well, we see what they charge for their new cards. By now it should be obvious to everyone that Nvidia considers performance increases from software tricks like DLSS EQUAL to performance increases from faster hardware.

Then you're just wrong according to your own claims.
Nope. Sorry. You have to understand that when I am talking about performance charts in general, where the verdict is "Nvidia first (in general), AMD second (in general)", it's different from exhaustively searching and reading to see which one of two specific cards, the RTX 3050 and RX 6600, is better. It's not my problem if you can't or won't understand my point.

The 6600 was faster than the 3050, that's what the charts say, so it should have sold better than the 3050, except it didn't. So either accept that you're wrong, or admit that most people don't look at that stuff. It's either one or the other.
Oh come on. Just look at the market numbers.

Because the alternative is that people look at the charts, they see product X on top, and then they proceed to buy product Y. How else am I supposed to interpret that other than that they're dumb?
YOU can interpret it however you like. Just don't put words in my mouth. Either keep it to yourself, or present it as YOUR interpretation.

Tell me who finds this acceptable. Do you? Mind you, this game is already 2 years old.
That chart sends people to Nvidia.

Pity TechPowerUp doesn't have 8K or 16K charts to make your argument... stronger.



And that's it.
 
That chart sends people to Nvidia.

Slideshow-level performance sends people to Nvidia? Come on dude, you gotta be joking. Nobody looks at that and goes, "Hmm, better drop 1000 dollars to play at 30 FPS, sounds amazing".

Games with RT enabled are often unplayable without DLSS on Nvidia cards, and that's actually where Nvidia's marketing machine works best.

It's dumb shit like this that draws people in: "Woah, from 20 FPS to 80 FPS with ray tracing? Sounds amazing."

[image: marketing slide showing the jump from ~20 FPS to ~80 FPS with RT and DLSS]


Tech sites and reviewers have been shoveling this crap for years. YT channels like Digital Foundry, who are otherwise typically fairly respectable, have been praising DLSS since day one, despite the fact that even the most hardcore fanboys would agree DLSS 1 looked like garbage. Nonetheless, it was presented as a wonderful solution to the catastrophic performance you're gonna get with RT on, and not much has changed since then.

None of that stuff is a real disadvantage for AMD; the problem is that they were too late with FSR and RT support, so they missed the opportunity to present their RT performance as "passable" the way Nvidia did. I bet there are tons of people who still think RT doesn't work at all on AMD cards. The absolute performance figures are irrelevant; performance is shit on everything, and games with RT support are only playable through the use of DLSS and FSR. That frame generation stuff is the next marketing battle AMD needs to face. They already announced that FSR3 is going to have the same feature, but we'll see if they get it out in time for it to be worthwhile; otherwise Nvidia will plaster "Get a billion times more FPS" on everything once again.

AMD's problem is the same as always, they can't combat Nvidia's marketing machine.
 
Come on dude, you gotta be joking
Nope. You just miss the forest while focusing on a tree. People will look at the performance drop and see that on Nvidia it is smaller.
Then they will consider DLSS 2.x or 3.x, because everyone and their dog told them that it is better than native.

I could quote the rest of your post, especially the part about reviewers and YouTubers, because I have been saying that for years. But the thing is, whoever controls the media controls the narrative.

I have to say something about me. I am an AMD fan, NOT an Nvidia fan.
I stopped focusing on what is logical long ago and just SEE where things are going, where the media and marketing are driving people.
It's not just GPUs. It's everything. Look at CPUs. The 13900K is a 24-CORE CPU, right? Rightttttttttt................... All equal, right? Rightttttttttttttt.............
I mean, don't try to convince me of something. Just look at what IS happening.

And I have to stop here. I mean, this is a forum, not an exam for a degree or something. I can't go back and start reading everything we both wrote again just to make my next post better. No time for that.
 
I agree in general, but I consider RT a selling point, no matter whether people have a clue about it or not.
Also, if people buy green no matter what, it's endgame for AMD. It's pointless to develop new archs and GPUs for PCs.
They can focus on consoles and call it a day.

The 7900s are fast enough, raster and RT, but don't sell because of their prices.
If AMD creates a 7900XTX3D that's 10-15% faster, do you think it will sell?
I believe not.
The package is not attractive. I'll write it again: RT performance, no CUDA alternative, no NVENC alternative, AV1 encoding only on the 7900s, image reconstruction tech inferior to the competition or non-existent, etc.
They have to be priced aggressively to make an impact.
You mean the XT. Any XTX in my entire country only lasts for 2 days max. The package not being attractive is subjective to you. When they add V-Cache right over the I/O die, it should make the 4080 look like a 3060 vs a 6800 in performance. Some people buy GPUs for gaming without needing software to make you feel like you are getting high frames. Even the XT is a great card, and in Canada it is $400 cheaper.
 
You mean the XT. Any XTX in my entire country only lasts for 2 days max. The package not being attractive is subjective to you. When they add V-Cache right over the I/O die, it should make the 4080 look like a 3060 vs a 6800 in performance. Some people buy GPUs for gaming without needing software to make you feel like you are getting high frames. Even the XT is a great card, and in Canada it is $400 cheaper.
I doubt extra cache will make RDNA3 a rocket. AMD went with the cache option probably to save costs, by using a smaller data bus and slower memory. Maybe for power consumption too? But that did have a performance cost at higher resolutions. Also, AMD lowered the cache size in their new cards: they dropped it from 128MB in the RX 6900 XT to 96MB in the RX 7900 XTX. So they probably weren't seeing enough performance gains from those extra 32MBs to justify the extra cost.
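For context on the "smaller bus, slower memory" tradeoff: raw VRAM bandwidth is just bus width times data rate. A quick sketch using the commonly listed specs (treat them as illustrative):

[CODE]
# Raw VRAM bandwidth (GB/s) = bus width in bits / 8 * data rate in Gbps.
# Specs below are the commonly listed ones, used here only for illustration.
def vram_bw_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(vram_bw_gb_s(256, 16.0))  # RX 6900 XT:  512 GB/s (narrow bus + big cache)
print(vram_bw_gb_s(384, 19.5))  # RTX 3090:   ~936 GB/s
print(vram_bw_gb_s(384, 20.0))  # RX 7900 XTX: 960 GB/s
[/CODE]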
 
You mean the XT. Any XTX in my entire country only lasts for 2 days max. The package not being attractive is subjective to you. When they add V-Cache right over the I/O die, it should make the 4080 look like a 3060 vs a 6800 in performance. Some people buy GPUs for gaming without needing software to make you feel like you are getting high frames. Even the XT is a great card, and in Canada it is $400 cheaper.
That's the problem. They don't need more performance. The 7900s (both) are already fast.
Where they are not fast enough for the price, where it matters to me and for marketing purposes, is RT. They don't have the features for the price. They don't have the package for the price.

The 3D cache will only give them what they don't need: performance.
 
That's the problem. They don't need more performance. The 7900s (both) are already fast.
Where they are not fast enough for the price, where it matters to me and for marketing purposes, is RT. They don't have the features for the price. They don't have the package for the price.

The 3D cache will only give them what they don't need: performance.
Do you think I care about RT when I am trying to avoid a tank shell in Just Cause? Does ray tracing make TWWH3 play better? Will CUDA cores give me better 1% lows? Is Adobe the only software that does what it does? Regardless of the narrative (most people supporting RT ad nauseam have 2022 join dates), performance is still the deciding factor in the sale of GPUs. As much as people like to malign AMD's offerings, the 7000 series is faster than the 6000 series and is also better than the 3090 at RT, so vs a $2000 card, for half the price? That sounds pretty good to me. When my X3D chip comes in February, I will let you know how good it feels there too. 4K high is pretty enough for me in most games anyway.

I doubt extra cache will make RDNA3 a rocket. AMD went with the cache option probably to save costs, by using a smaller data bus and slower memory. Maybe for power consumption too? But that did have a performance cost at higher resolutions. Also, AMD lowered the cache size in their new cards: they dropped it from 128MB in the RX 6900 XT to 96MB in the RX 7900 XTX. So they probably weren't seeing enough performance gains from those extra 32MBs to justify the extra cost.
That is your opinion. Do you have a 6900 XT? I see you have a 580, and I promise you that when you step back and actually use the product, instead of reading about how you think it would perform, you will see what I mean. You have a 580, so I will break it down for you. A Vega 64 was as fast as 2 optimized 580s in Crossfire. The 2080 Ti was 40% faster than a Vega 64. The 5700 XT was as fast as a Vega 64 with half the power draw. The 6900 XT is about 30% faster than a 2080 Ti in some games, and faster overall in everything. I have had the XTX and the XT, and both cards blow my 6800 XT away in performance. Adding V-Cache to avoid using system memory is always better, and we don't know what SAM will look like with a V-Cache CPU and GPU working together. I will be happy to sell a couple of PCs to buy the X670E Ace Max and enjoy it. Regardless of how you feel, since the P in PC means Personal, there is no wrong choice. I will get blasted for this, but I own a 6500 XT and it is one of my favourite purchases ($149 US) of the last few releases.
 
Do you think I care about RT when I am trying to avoid a tank shell in Just Cause? Does ray tracing make TWWH3 play better? Will CUDA cores give me better 1% lows? Is Adobe the only software that does what it does? Regardless of the narrative (most people supporting RT ad nauseam have 2022 join dates), performance is still the deciding factor in the sale of GPUs. As much as people like to malign AMD's offerings, the 7000 series is faster than the 6000 series and is also better than the 3090 at RT, so vs a $2000 card, for half the price? That sounds pretty good to me. When my X3D chip comes in February, I will let you know how good it feels there too. 4K high is pretty enough for me in most games anyway.

Don't they already perform exceptionally in raster?
I mean, why do you or anyone else need more performance in raster? Even my 2080 Ti provides enough raster performance for any game today.
Let alone the 6800/6950 XTs.
RT performance is the one that matters, and the 3D V-Cache won't solve that problem.

I don't use Adobe, but engineering software from Autodesk, Leica Geosystems, Bentley, etc.
And yes, there are no alternatives to that kind of software. And yes, most of it takes advantage of CUDA cores.

I suppose it was the same with NVENC. At least the 7900s support AV1.
 