AMD Ryzen 7000X3D Announced, Claims Total Dominance over Intel "Raptor Lake," Upcoming i9-13900KS Deterred

What do you mean, similar setup? Yes, he has a different CPU cooler, so?
In the video you posted, at the timestamp you selected, for whatever reason it is GPU bound: GPU usage is at 100%. How you think this has anything to do with CPU performance, I don't know.

Also: different time of day, different GPU, different place on the map, and it's not clear the settings are the same…
 
In the video you posted, at the timestamp you selected, for whatever reason it is GPU bound: GPU usage is at 100%. How you think this has anything to do with CPU performance, I don't know.
Yes, and 5 seconds later it drops to 70-80%. Here you go with a 4080: exact same scene, same framerate, and the GPU is sitting at 50%. Clearly GPU bound, right? I mean, come on now, can't we just admit what's in front of us? Have we come to this?
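
If anyone wants to settle this properly instead of eyeballing an overlay at one timestamp, here's a rough sketch that logs GPU load over a whole run (assumes an NVIDIA card with nvidia-smi on the PATH; the 60-second window is arbitrary):

```python
# Log GPU utilization once per second so one busy or idle moment
# doesn't get mistaken for the whole run being GPU- or CPU-bound.
# Assumes an NVIDIA GPU with nvidia-smi on the PATH; 60 s is arbitrary.
import subprocess
import time

samples = []
for _ in range(60):
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"]
    )
    samples.append(int(out.splitlines()[0]))  # first GPU only
    time.sleep(1)

avg = sum(samples) / len(samples)
print(f"GPU load avg: {avg:.0f}%, min: {min(samples)}%, max: {max(samples)}%")
# Sustained ~100% => GPU-bound; consistently well below => the CPU is the limit.
```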

 
It's literally 5 seconds after the timestamp...
And I literally could not care less.

The video you posted of the 5800X3D also jumps into the settings like 100 times per minute, and everyone knows that Cyberpunk settings only update properly after a full re-launch of the game.

Not convinced, at all.
Do you have any credible sources?
 
And I literally could not care less.

The video you posted of the 5800X3D also jumps into the settings like 100 times per minute, and everyone knows that Cyberpunk settings only update properly after a full re-launch of the game.

Not convinced, at all.
Do you have any credible sources?
I can make you a video of the exact same scene, changing my settings 200 times per minute. Actually, I already have a video on my channel on the same road with all those NPCs; it doesn't drop below 140. I mean, come on, let's face it: you don't want it to be true, therefore you'll pretend it isn't. It's fine. You can ask anyone with a 3D to post some numbers, but they'll avoid it like the plague; I've been trying for a year now :roll:

If you don't care, then why are you asking for proof, lol
 
I can make you a video of the exact same scene, changing my settings 200 times per minute. Actually, I already have a video on my channel on the same road with all those NPCs; it doesn't drop below 140. I mean, come on, let's face it: you don't want it to be true, therefore you'll pretend it isn't. It's fine. You can ask anyone with a 3D to post some numbers, but they'll avoid it like the plague; I've been trying for a year now :roll:
So no credible sources. OK

Have you tried sending the details of the spot where this happens to TPU, HUB, or Eurogamer? They usually like doing silly tests like this, trying to find the worst-performing spots.
 
So no credible sources. OK

Have you tried sending the details of the spot where this happens to TPU, HUB, or Eurogamer? They usually like doing silly tests like this, trying to find the worst-performing spots.
What do you consider a credible source? I don't understand.

HUB has no clue what they are doing; their numbers never make sense. I've tested most of the games in their benchmark suite and I'm getting wildly different results, with worse RAM than they are running.

Here is a video in Spider-Man with HUB's settings. He is getting a 122 average if I remember correctly; I get 140 lows or something :roll:

 
Someone who understands that test setup matters.
How do you expect the same test setup between a 5800X3D and a 12900K? From the mobo to the RAM, everything is going to be different. Anyway, I get it: it's something you don't want to accept, so nothing will convince you. Fine, let's move on.
 
How do you expect the same test setup between a 5800X3D and a 12900K? From the mobo to the RAM, everything is going to be different. Anyway, I get it: it's something you don't want to accept, so nothing will convince you. Fine, let's move on.
I mean the test setup as a whole, not just some hardware components.

What you do before you launch the game, which save file you use, what other software is running, how the results are collected, what you do in-game.
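
For what it's worth, "how the results are collected" is the easy part to pin down: tools like CapFrameX or PresentMon dump frame times to a CSV, and then both sides can run the same math. A minimal sketch, assuming a PresentMon-style CSV with an MsBetweenPresents column:

```python
# Compute average FPS and 1% lows from a frame-time capture so two runs
# can be compared with identical math instead of reading an overlay by eye.
# Assumes a PresentMon-style CSV with an "MsBetweenPresents" column (ms).
import csv

def summarize(path):
    with open(path, newline="") as f:
        frametimes = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
    # 1% low: average FPS over the slowest 1% of frames.
    worst = sorted(frametimes, reverse=True)[:max(1, len(frametimes) // 100)]
    low_1pct = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct

avg, lows = summarize("capture.csv")  # hypothetical capture file
print(f"avg: {avg:.1f} fps, 1% lows: {lows:.1f} fps")
```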
 
I mean the test setup as a whole, not just some hardware components.

what you do before you launch the game, which save file to use, what other sw is running, how the results are collected, what do you do in game.
I have a feeling that if the 3D weren't getting slaughtered, none of that would have bothered you, but that's just me.
 
Rofl. Just check my message history. I'm always very critical when it comes to proper test setups.
Yeah, but the thing is, how do you know anyone has a proper test setup? How do you know HUB's tests are reliable? You don't know, and you can't know.

You can only buy the hardware yourself and try to replicate their numbers. I did the former, but I couldn't do the latter. Only in a handful of games did his benchmark numbers align with mine. In some games (Assetto Corsa, for example) I needed to run the game on the E-cores (lol) to get the framerate he was getting.
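
For anyone wanting to try the E-core thing themselves: you don't need BIOS tricks, you can just pin the game's process to a chosen set of logical cores. A rough sketch with psutil; the process name and core IDs below are assumptions you'd adjust for your own chip:

```python
# Pin a running game to a chosen set of logical cores to test how much
# P-cores vs E-cores matter for its framerate. Requires: pip install psutil.
# The process name and core IDs are assumptions; adjust for your system.
import psutil

GAME_EXE = "Cyberpunk2077.exe"    # hypothetical target process
CORES = [0, 1, 2, 3, 4, 5, 6, 7]  # logical IDs of the cores to allow

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(CORES)  # restrict the scheduler to these cores
        print(f"Pinned PID {proc.pid} to cores {CORES}")
```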
 
Yeah, but the thing is, how do you know anyone has a proper test setup? How do you know HUB's tests are reliable? You don't know, and you can't know.

You can only buy the hardware yourself and try to replicate their numbers. I did the former, but I couldn't do the latter. Only in a handful of games did his benchmark numbers align with mine. In some games (Assetto Corsa, for example) I needed to run the game on the E-cores (lol) to get the framerate he was getting.
How do I know? Most proper testing sites post their test methodology online for people to comment on. And I know for sure that the test setup is shit when I compare videos posted by multiple sources on YouTube.

If test setup doesn't matter, I guess you'd be fine with one CPU being tested with the camera pointing at a wall the whole time, and the other in the busiest place in the game.
 
How do I know? Most proper testing sites post their test methodology online for people to comment on. And I know for sure that the test setup is shit when I compare videos posted by multiple sources on YouTube.

If test setup doesn't matter, I guess you'd be fine with one CPU being tested with the camera pointing at a wall the whole time, and the other in the busiest place in the game.
Uhm, but you can see the run in the video, so what camera pointed at a wall are you talking about?
 
For God's sake. So does test setup matter or not?
I don't even know what that means. You mean clean Windows installs and background tasks running? Sure, they matter, a tiny bit. Nobody is going to post benchmarks on the internet while running Cinebench in the background. Test setup matters when the difference between two CPUs is like 5-15%. When the framerate is close to double, yeah, test setup is kinda irrelevant; nothing can close that gap.
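
To put rough numbers on that (purely illustrative assumptions, not measurements): even if setup differences were worth a generous ±10%, they could erase a 15% gap but not a near-double one:

```python
# Back-of-the-envelope: can test-setup differences fake a given CPU gap?
# The noise figure and ratios are illustrative assumptions, not measurements.
setup_noise = 0.10  # assume setup differences are worth up to ~10% either way
ratios = {"small gap": 1.15, "near-double": 1.90}  # cpu_a fps / cpu_b fps

for label, ratio in ratios.items():
    # Worst case: the noise flattered one CPU and hurt the other.
    floor = ratio * (1 - setup_noise) / (1 + setup_noise)
    verdict = "could be noise" if floor <= 1.0 else "gap survives"
    print(f"{label}: measured {ratio:.2f}x, worst-case {floor:.2f}x -> {verdict}")
```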
 
I have a feeling that if the 3d wasnt getting slaughtered none of that would have bothered you but thats just me.
What other reviewers show the 5800X3D getting "slaughtered" by the 12900K in CP 2077?

I don't think you're showing us the difference between the CPUs; I think you're showing us your bias.

Here are TPU's results with a 3080:

[attached: four TPU Cyberpunk 2077 benchmark charts]
 
What other reviewers show the 5800X3D getting "slaughtered" by the 12900K in CP 2077?

I don't think you're showing us the difference between the CPUs; I think you're showing us your bias.

Here are TPU's results with a 3080:

[attached: four TPU Cyberpunk 2077 benchmark charts]
Those numbers are with RT off. RT is insanely heavy on the CPU.


Most reviewers indeed do not test Cyberpunk with RT on in CPU reviews, for obvious reasons ;)

There are a couple that do; I can link them if you are interested.
 
This seems like the Ryzen to buy. If they had had this available in the fall, I'd have bought it instead of Alder Lake. Hopefully they have a good launch.
If the 7800X3D is as good as this article suggests, then Intel have a big problem.

I'm gonna seriously consider this for my 2700K upgrade once the reviews are out. Will be really nice to dodge the E-core bullet, if nothing else.
E-cores aren't an issue. I'm still on Windows 10 LTSC and they've never once been an issue with what I do on this computer (mostly games and browsing).
That said, I would rather have a giant blob of cache than the E-cores, but it is what it is. I think the E-cores do provide some performance uplift just from the sheer quantity of extra cores handling background stuff.
 
Those numbers are with RT off. RT is insanely heavy on the CPU.

That's complete nonsense. I just ran the Cyberpunk benchmark out of curiosity with and without RT, and CPU usage was higher without RT, which is expected because the game is pushing significantly more frames. I also ran it at a locked framerate and it was still roughly the same; no "insanely heavy" usage.
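
If anyone wants to repeat that check with data rather than glancing at Task Manager, here's a minimal sketch that logs overall CPU load during the benchmark (assumes psutil is installed; the 90-second window is an assumption, start the benchmark right after launching it):

```python
# Log overall CPU load once per second while the in-game benchmark runs,
# so RT-on vs RT-off (or framerate-capped) runs can be compared with data.
# Requires: pip install psutil. The 90 s window is an assumption.
import psutil

psutil.cpu_percent(interval=None)  # prime the counter; first call is meaningless
readings = [psutil.cpu_percent(interval=1) for _ in range(90)]
print(f"avg CPU load: {sum(readings) / len(readings):.1f}%, "
      f"peak: {max(readings):.1f}%")
```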
 