
4 GHz Radeon RX 7900 XT?

The 6500 XT runs at nearly 3 GHz, so? What's the big deal? :kookoo:

Frequency without any other data is meaningless.
 
I don't think that's a CPU limit; it looks more like a game engine limit.
Maybe; I didn't have the chance to check other reviews.
If the results stay nearly the same when tested with even slower CPUs, it's possibly engine limited!
The RTX 4090 doesn't seem to scale all that differently vs. Ampere at 1080p in these kinds of games (CPU or engine limited); it's just like what the reviewer said regarding driver overhead.
I thought there was a small chance the RTX 4080 16 GB would be a little faster than the 4090 at 1080p in these "problematic" games, being a 7-GPC design (nothing major, 0.5% per extra GPC, around 2% total), but now that I've thought it through more clearly, it will probably perform the same.
In any case, it will be interesting to see what AMD achieves in these cases, but moving the Infinity Cache and memory controllers to a different die may add extra latency, hurting the FHD performance potential somewhat.
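The per-GPC guess above can be sanity-checked with quick arithmetic. The 0.5%-per-GPC figure is the poster's own assumption, not a measurement; the GPC counts are the public AD102 (RTX 4090, 12 GPCs) and AD103 (RTX 4080 16 GB, 7 GPCs) configurations:

```python
# Back-of-envelope check of the per-GPC guess above.
# gain_per_extra_gpc is the poster's assumed 0.5% effect per GPC
# beyond a 7-GPC baseline in CPU/engine-limited 1080p games.
gain_per_extra_gpc = 0.005
extra_gpcs = 12 - 7  # AD102 (12 GPCs) vs AD103 (7 GPCs)

estimated_delta = gain_per_extra_gpc * extra_gpcs
print(f"Estimated 1080p delta from extra GPCs: {estimated_delta:.1%}")
```

Five extra GPCs at 0.5% each come out to roughly 2.5%, which matches the "around 2% total" ballpark in the post.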
 
DLSS 3 is said to increase it.
You have to have Reflex on for DLSS 3 to work, and its latency is slightly worse than at native resolution.
 

Yeah, DLSS 2 is higher than native and DLSS 3 is higher than DLSS 2, so how much more until it's bad?
 
Good or bad news? The reveal will be on November 3rd.
The bad news is that retail availability may be as late as late December, and it won't compete with the RTX 4090. :sleep:
The good news is that the prices could be more down-to-earth.


AMD Radeon RX 7000 "RDNA 3" Graphics Cards Allegedly Launching In December, Difficult To Compete With NVIDIA's RTX 40 GPUs (wccftech.com)
 
Won't compete in what, rasterization or ray tracing?
Or just because they don't have any kind of frame generation technique?
 

RT and DLSS are irrelevant, so if they want to be serious, I think they mean classic rasterization.
 
Doesn't matter, they won't be able to supply these in enough quantities anyway.
 
Oh, he meant both.
Moore's Law Is Dead said the one released this year probably won't beat the RTX 4090.
However, if it's only 4-6% slower, that's fine.
If AMD releases something better next year, say with GDDR7 or HBM3, that beats the RTX 4090 in 2023, Nvidia will probably release an RTX 4090 Ti then.
 
Maybe they're just not going to compete for the ultra high end. That helps with cost, at least.
 
I actually hope so. Personally, I've had enough of all the $1,000+ "supercards" lately (looking at you, Nvidia).
 

I really don't understand Nvidia. It's probably the only company that thinks in terms of brute force and maximum frames per second, with everything else being secondary. Its business model revolves around "moar frames, moar frames", no matter what.
When was the last time it made a reasonably sized chip at a reasonably placed performance tier?
Maybe never? Or 20 years ago?
 
I think they're trying to be the Rolls-Royce of GPUs: very few units for a very few rich individuals.
 
There's quite a playground in terms of clocks with an MCM-type design. They have learned from the PS5, which has fewer shaders but clocks higher, compared to the Xbox Series X, which has more shaders but clocks lower. 4 GHz is probably a boost state; the typical clock will be around 3-3.5 GHz.
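The shader-count vs clock trade-off mentioned above can be illustrated with the public console specs (PS5: 36 CUs at 2.23 GHz; Xbox Series X: 52 CUs at 1.825 GHz), using the standard FP32 formula of CUs × 64 lanes × 2 ops × clock:

```python
# FP32 throughput from compute units and clock: CUs * 64 lanes * 2 ops/cycle * GHz.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"PS5:      {tflops(36, 2.23):.2f} TFLOPS")   # fewer shaders, higher clock
print(f"Series X: {tflops(52, 1.825):.2f} TFLOPS")  # more shaders, lower clock
```

Fewer, faster units get within striking distance of a wider, slower design, which is the trade-off AMD is said to be playing with here.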
 
I wonder if AMD is willing to match Nvidia at $899 for the most cut-down Navi 31-based GPU.
We could have something like the scenario below, for example:

7950X? $1299-1399, 192 RB / 384 TU / 12288 SP / 384-bit bus / 24 GB / 192 MB IF + V-Cache / 420 W

7900X? $1099, 192 RB / 352-368 TU / 11264-11776 SP / 384-bit bus / 24 GB / 96 MB IF / 330 W (1.5x the 6950 XT at 4K with a 13900K)

7800X? $899, 160 RB / 304-320 TU / 9728-10240 SP / 320-bit bus / 20 GB / 300 W
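One way to eyeball the speculated line-up above is price per shader. Every number here is the poster's guess, not an announced product:

```python
# Price per 1000 stream processors for the rumored SKUs above
# (prices and SP counts are the poster's speculation).
rumored = {
    "7950X?": (1299, 12288),
    "7900X?": (1099, 11776),
    "7800X?": (899, 10240),
}
for name, (price_usd, sps) in rumored.items():
    print(f"{name} ${price_usd} / {sps} SP = ${1000 * price_usd / sps:.1f} per 1000 SP")
```

If those guesses held, the cheaper SKUs wouldn't actually be much better value per shader, which is typical of how vendors price a stack.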
 
How do they plan to overcome the potential heavy GPU under-utilisation caused by CPUs being too slow to feed the GPU with data? :confused:
At the same time, it's ridiculous that games don't utilise the CPUs either, leaving them at very low load, under 20-30%. :rolleyes: :confused:
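The under-utilisation worry above can be sketched with a toy model: frame rate is set by whichever side (CPU or GPU) takes longer per frame, so a GPU that outpaces the CPU just idles. All frame times are made-up numbers for illustration:

```python
# Toy bottleneck model: the slower stage per frame caps the frame rate.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 8.0  # hypothetical: CPU preps a frame in 8 ms (125 fps ceiling)
for gpu_ms in (12.0, 8.0, 4.0):  # GPU getting faster each step
    busy = min(gpu_ms / max(cpu_ms, gpu_ms), 1.0)
    print(f"GPU {gpu_ms:4.1f} ms -> {fps(cpu_ms, gpu_ms):5.1f} fps, GPU busy {busy:.0%}")
```

Once the GPU's frame time drops below the CPU's, extra GPU speed only lowers GPU utilisation instead of raising fps, which is exactly the 1080p scaling problem discussed earlier in the thread.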

7950X? $1399-1299

I don't think they will launch such a card this year, or any time soon.
 
Could've been more believable if it came from someone not named 9550pro with ATI logo on the banner :D
That's HXL, a long-time industry insider of two decades or so, responsible for hundreds, maybe even thousands, of accurate and verified leaks over the years.

If I had to guess, they work at a high level at Sapphire or PowerColor, based on their info from VR-Zone back in the day.
 
Speaking of leaks, do you have access to Chiphell? I can't access it.

We need some fresh leaks.

Does AMD plan to change the naming scheme this generation, for example dropping the old RX 7900 XT naming to something more interesting like Radeon XI 100?

[attached screenshot: list of recent Chiphell threads]
 
That's a shit thread from May about PCIe 5.0 with no replies.
The 8th October thread is just speculation over an Igor's Lab PCB image.
 
What matters in this market is performance per dollar and power efficiency. My guess is the 40-series will be Nvidia's biggest commercial flop in a long time. You can see how desperate they are to try and shift the old 30-series, even at MSRP or close to it.

Absolute performance really doesn't matter at all now from a commercial perspective. The first company that can put out a card at a normal price point of $600 or less, with a slight performance bump over last gen and 200 W power consumption, will clean up the market.
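The perf-per-dollar and perf-per-watt argument above can be made concrete with a quick comparison. The performance index, prices, and board powers here are hypothetical placeholders, not benchmark results:

```python
# Hypothetical value comparison: relative perf index, price in USD, board power in W.
cards = {
    "last-gen card at MSRP":  (100, 650, 300),
    "new $600 card, +15%":    (115, 600, 200),
}
for name, (perf, price_usd, watts) in cards.items():
    print(f"{name:24s} perf/$: {perf / price_usd:.3f}  perf/W: {perf / watts:.3f}")
```

Even a modest performance bump wins decisively on both metrics at that price and power, which is the post's point about where the commercial battle actually is.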
 