
AMD's Radeon RX 6700 Series Reportedly Launches in March

Not worth it for the price IMO.. better to wait for the 3080 SUPER or 3080 Ti, superior cards anyway.
 
They should just not bother with a 6700 and put the effort into more important projects.
Agreed. Based on preliminary specs, it's just a ray-tracing-enabled 5700 anyway, which sounds quite useless given 6800-series RT performance and 5700-series thermals.

I'm just not sure which more important projects we're talking about.
 
Agreed. Based on preliminary specs, it's just a ray-tracing-enabled 5700 anyway, which sounds quite useless given 6800-series RT performance and 5700-series thermals.

I'm just not sure which more important projects we're talking about.

A more refined 5700/5700 XT with RT enabled should be sufficient for 1080p gaming, I suppose.
 
A more refined 5700/5700 XT with RT enabled should be sufficient for 1080p gaming, I suppose.
Without RT, totally. With RT (based on 6800 performance), not so sure.

I'm having loads of fun with my 5700 XT. I just wish the VRAM chips ran a bit cooler.
 
A more refined 5700/5700 XT with RT enabled should be sufficient for 1080p gaming, I suppose.
Of " whopping" a dozen or so RT games that we have 2 years into "RT greatness", are there RT games that I've missed that gain anything from RT on?
It arguably looks wore in CP2077 with RT on (not to mention fps dive)

Given that Epic demoed UE5 without using any hardware RT on a platform that actually DOES have it, I'd be rather skeptical about hardware RT prospects in the foreseeable future:


 
TSMC really is a bottleneck at the moment.

It's why Nvidia went with Samsung and I'm hoping Intel's 14nm+++ doesn't die any time soon.

Is Polaris still made at GloFo? That's probably why AMD haven't dropped it yet in favour of the TSMC-made 5500 XT.
I believe Nvidia defected to Samsung more from a cost perspective than to avoid the bottleneck at TSMC. In this case, even Samsung is bottlenecking the supply of Ampere cards, as their CFO recently revealed. As for Intel, I doubt their 14nm+++++ is going away anytime soon, but the question is how many people would want a 14nm chip from Intel in 2021? RKL is going to be released on 14nm, but I don't think the release will give them any tangible advantage in terms of winning back lost market share. If there is any reason for them to gain market share, it has to be the lack of supply of AMD Zen 3 based chips, and the absurd prices they command now.
 
Of " whopping" a dozen or so RT games that we have 2 years into "RT greatness", are there RT games that I've missed that gain anything from RT on?
It arguably looks wore in CP2077 with RT on (not to mention fps dive)

Given that Epic demoed UE5 without using any hardware RT on a platform that actually DOES have it, I'd be rather skeptical about hardware RT prospects in the foreseeable future:


IMO the only RTX feature that makes a noticeable difference to image quality is realtime reflections, because in games where you have a lot of wet floors or vertical reflective surfaces, the difference between screen-space reflections and fully ray-traced reflections is pretty significant.

All of the other crap - global illumination, ambient occlusion, diffuse light scattering - you can barely tell apart from the raster-cheat methods, which cost almost no performance in comparison.

It's dumb, but the game that looks best with raytracing IMO is still BFV - it's just a shame it's the sort of game that can't really afford the performance hit; people are genuinely trying to hit 144fps in that title, for good reason.
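For anyone wondering why reflections specifically stand out: screen-space reflections can only sample what's already rendered on screen, so anything off-screen or occluded simply vanishes from the reflection, while a world-space ray trace can still hit it. A toy sketch of that limitation in Python (my own simplification for illustration, not any engine's actual code):

```python
# Toy 1D illustration of why screen-space reflections (SSR) miss things.
# SSR marches the reflected ray in screen space and can only return what
# is already visible in the frame; a world-space ray trace can also hit
# geometry that lies outside the frame.

SCREEN = range(0, 100)           # visible x-range of the frame
world_objects = {150: "lamp"}    # an object at x=150, outside the frame

def ssr_reflect(hit_x, ray_dx, steps=300):
    x = hit_x
    for _ in range(steps):
        x += ray_dx
        if x not in SCREEN:      # ray left the screen: SSR has no data, gives up
            return None
        if x in world_objects:
            return world_objects[x]
    return None

def rt_reflect(hit_x, ray_dx, steps=300):
    x = hit_x
    for _ in range(steps):
        x += ray_dx
        if x in world_objects:   # world-space trace doesn't care about the frame
            return world_objects[x]
    return None

print("SSR sees:", ssr_reflect(90, 1))  # None: the lamp never shows up
print("RT sees: ", rt_reflect(90, 1))   # 'lamp': the full trace finds it
```

That missing-data failure is exactly what you notice on wet floors: with SSR, reflections of things behind the camera or behind other objects just aren't there.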
 
And how, exactly, do you plan on doing 8GB on a 192-bit bus? 6GB? No problem. 12GB? No problem. 9GB? Bit of a lark, but sure. 8GB? Yeah, no. I hope they don't gimp their card with a non-standard bus just because people are repeating the opposite of the "10GB" Nvidia argument.
A 192-bit bus means 6 memory chips, and you can have a 4x 1GB + 2x 2GB configuration for a total of 8GB of VRAM; the problem is that full bandwidth is available for only 6GB, and the last 2GB would have only 1/3 of that.
Honestly, I don't think they will do such a thing. Nvidia did it with the GTX 970 and we know it didn't end well, although they also deactivated other parts without clearly mentioning it.

P.S. Even a 192-bit bus was non-standard for AMD until the RX 5600; before that they only used 64-bit, 128-bit, 256-bit, 384-bit and 512-bit buses.
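To make the bus arithmetic concrete, here's a minimal sketch of the capacity/bandwidth math (my own illustration of the rule of thumb, not any vendor's actual memory-controller logic; GDDR6 chips have 32-bit interfaces):

```python
# GDDR6 bus math on a 192-bit bus: each chip has a 32-bit interface.
BUS_WIDTH = 192
CHIP_BUS = 32
chips = BUS_WIDTH // CHIP_BUS  # 6 chips

# Uniform densities give the "clean" capacities with full bandwidth everywhere:
for density_gb in (1, 2):
    print(f"{chips} x {density_gb}GB = {chips * density_gb}GB, full {BUS_WIDTH}-bit bandwidth")

# The mixed config from the post: 4 x 1GB + 2 x 2GB = 8GB total.
# The first 1GB of every chip interleaves across all 6 chips (192-bit wide),
# but the extra 1GB on each of the two 2GB chips spans only those 2 chips
# (64-bit wide), i.e. 64/192 = 1/3 of full bandwidth -- the GTX 970 problem.
fast_gb = chips * 1
slow_gb = 2 * (2 - 1)
slow_width = 2 * CHIP_BUS
print(f"mixed: {fast_gb + slow_gb}GB total -> {fast_gb}GB at {BUS_WIDTH}-bit, "
      f"{slow_gb}GB at {slow_width}-bit ({slow_width / BUS_WIDTH:.2f}x bandwidth)")
```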
 
The fact that the other fabs are not competitive is really bad for us customers... TSMC is carrying the whole tech world.

I think ASML is kind of carrying the tech world these days: there is limited supply relative to demand for their EUV machinery, which plays a pivotal role in leading-edge nodes. Also, I don't believe Samsung is too far behind TSMC, and Intel isn't too far behind Samsung, while GlobalFoundries has kind of bowed out and shifted focus to more specialized areas.
 
And how, exactly, do you plan on doing 8GB on a 192-bit bus? 6GB? No problem. 12GB? No problem. 9GB? Bit of a lark, but sure. 8GB? Yeah, no. I hope they don't gimp their card with a non-standard bus just because people are repeating the opposite of the "10GB" Nvidia argument.

Besides, if you want 6700 performance and only 8GB of VRAM for a low price, the 5700 XT will likely plummet in price once the 6700 and 3060 are available in decent numbers, and the Vega 64 is still available for ~$250.


Yes, let's just ignore a large market for more important projects, like... more PS5s?

The lower you go down the GPU totem pole, the larger your market. AMD already tried the "ignore the $300-400 market" thing with Polaris, and it resulted in Nvidia printing a mint with the 1070 alone.
Correct, 6GB was a late night with the kids.

We will see about 5700 XT prices. Somehow I doubt they will drop much, or that there will be much stock by March, but hopefully I am wrong.

This is what I see and don’t forget to add the 13% sales tax to those prices. Doesn’t look too promising to me!
 
And how, exactly, do you plan on doing 8GB on a 192-bit bus? 6GB? No problem. 12GB? No problem. 9GB? Bit of a lark, but sure. 8GB? Yeah, no. I hope they don't gimp their card with a non-standard bus just because people are repeating the opposite of the "10GB" Nvidia argument.
Non-standard? You mean something like 6GB of fast memory on 192 bits and 2GB of slow memory on 64 bits? Would anyone really have the nerve to sell something like that as an 8GB GPU? :D /S
 
I don't know why they had to cut it down so much; the CU count would have been nicer at around 50 CUs, as the drop from the 6800 to the 6700 is gonna be a big one performance-wise.

By the time they fill out the product stack, the 6600/6500 will have under 2k SPs and will feel like a 5500/5600 refresh.
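For context on those numbers: RDNA, like GCN before it, has 64 stream processors per CU, so the CU counts being thrown around map straight to SP counts. A quick sanity check (the configurations are the rumored/hypothetical ones from this thread, not confirmed specs):

```python
# RDNA/GCN rule of thumb: 64 stream processors (SPs) per compute unit (CU).
SP_PER_CU = 64

configs = [
    ("rumored 40CU 6700", 40),        # rumored, not confirmed
    ("suggested ~50CU part", 50),     # the CU count wished for above
    ("hypothetical 3072-SP GPU", 48),
    ("Vega 56", 56),
    ("Vega 64", 64),
]
for name, cus in configs:
    print(f"{name}: {cus} CU x {SP_PER_CU} = {cus * SP_PER_CU} SPs")

# 40 x 64 = 2560 SPs, the same count as the 5700 XT, which is why a 40CU
# part reads as a "5700 XT with RT". "Under 2k SPs" for a 6600/6500 means
# fewer than 32 CUs (32 x 64 = 2048).
```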
 
I don't know why they had to cut it down so much; the CU count would have been nicer at around 50 CUs, as the drop from the 6800 to the 6700 is gonna be a big one performance-wise.

By the time they fill out the product stack, the 6600/6500 will have under 2k SPs and will feel like a 5500/5600 refresh.


I'm going to guess smaller dies scale better with voltage and cooling; they may get into the 3GHz range.
 
I don't know why they had to cut it down so much; the CU count would have been nicer at around 50 CUs, as the drop from the 6800 to the 6700 is gonna be a big one performance-wise.

By the time they fill out the product stack, the 6600/6500 will have under 2k SPs and will feel like a 5500/5600 refresh.
Thinking about it, AMD could push a 6700 XT model to fill the gap, couldn't they?
 
Polaris just refuses to die lol

Then again, it's just under 5 years old. It can still make it for some low-end applications, I guess.

And to be fair, my RX 580 still handles itself relatively well for my 1080p 60 Hz gaming, considering how old it is.

I must say that even if the RX 580 is from ancient times, it runs games well at 1080p, and still runs 1440p... One gets blinded by all the framerate numbers etc., and most people (like me) want a new card, but at the end of the day MANY of the good old cards still work just fine. It even runs Cyberpunk 2077... and yes, we don't have to have all settings on full to be happy to play.
 
Thinking about it, AMD could push a 6700 XT model to fill the gap, couldn't they?

I think it would make more sense than a "sort-of" 5700 XT with RT cores in disguise.
 
Thinking about it, AMD could push a 6700 XT model to fill the gap, couldn't they?
The rumor right now is that the 6700 XT IS the 40CU variant, meaning the exact same core count as last generation.

IMO that would be a major disappointment. For whatever reason AMD is scared of making a 3072-core GPU: they refused to scale Polaris up to 3072, same with RDNA, and Vega skipped right past it to 3584, resulting in a card that cannibalized their own Vega 64.

If the 6700 XT is a 40CU part, that would be a major issue for AMD, as the 3060 Ti would have effectively no opponent, leaving a huge gap in AMD's lineup. Makes me wonder if we will see some last-minute adjustments to their lineup.
 
The rumor right now is that the 6700 XT IS the 40CU variant, meaning the exact same core count as last generation.

IMO that would be a major disappointment. For whatever reason AMD is scared of making a 3072-core GPU: they refused to scale Polaris up to 3072, same with RDNA, and Vega skipped right past it to 3584, resulting in a card that cannibalized their own Vega 64.

If the 6700 XT is a 40CU part, that would be a major issue for AMD, as the 3060 Ti would have effectively no opponent, leaving a huge gap in AMD's lineup. Makes me wonder if we will see some last-minute adjustments to their lineup.
Ah, since the article was titled RX 6700 with no XT in sight, I thought it was about the "vanilla" 6700...

That aside, maybe they don't want to "hyper-segmentize" their product lineup? Though if I were them, I would make the RX 6700 XT with 50 CUs and leave everything below it to the original RDNA, unless there were some other issues in the way (memory bandwidth, maybe?)...
 