
AMD Ryzen ThreadRipper is Capable of Running Crysis without a GPU

Everyone can run Crysis without a GPU.

Almost every CPU from this decade is capable of running games in software renderer mode.


Above is a video of my very own 10-year-old Xeon running Crysis much like Linus did.

That nonsense aside, you'll still need a GPU to run games at the highest quality and resolution with decent FPS.
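
For anyone wondering what "software renderer mode" actually means here: the CPU takes over the per-pixel work the GPU would normally do. Below is a minimal, hypothetical sketch (plain C++, nothing from CryEngine or the video) of that idea - rasterizing a single triangle into a framebuffer entirely on the CPU and dumping it to a PPM file.

```cpp
// Minimal sketch of what a "software renderer" does: the CPU, not the GPU,
// walks every pixel and decides its colour. Real engines (or wrappers like
// WARP/SwiftShader) do this per triangle, per frame. Everything here is
// illustrative only - no relation to how Crysis itself renders.
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

// Signed area test: positive when point c lies to the left of edge a->b.
static float edge(const Vec2& a, const Vec2& b, const Vec2& c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

int main() {
    const int W = 320, H = 240;
    std::vector<unsigned char> rgb(W * H * 3, 20);   // dark grey background

    // One triangle in screen space (counter-clockwise for this edge test).
    Vec2 v0{40.0f, 200.0f}, v1{160.0f, 30.0f}, v2{290.0f, 210.0f};

    // The CPU-bound part: test every pixel against the triangle.
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};
            float w0 = edge(v1, v2, p), w1 = edge(v2, v0, p), w2 = edge(v0, v1, p);
            if (w0 >= 0 && w1 >= 0 && w2 >= 0) {       // pixel is inside
                unsigned char* px = &rgb[(y * W + x) * 3];
                px[0] = 230; px[1] = 120; px[2] = 40;  // flat orange shading
            }
        }
    }

    // Dump the frame as a binary PPM so the result can be inspected.
    FILE* f = std::fopen("frame.ppm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P6\n%d %d\n255\n", W, H);
    std::fwrite(rgb.data(), 1, rgb.size(), f);
    std::fclose(f);
    return 0;
}
```

Scale that inner loop up to millions of triangles plus shading and texturing every frame, and it becomes clear why even a 64-core chip only manages low settings.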

Was it really necessary to murder that poor turtle?
 
By definition, running it poorly is still running it.

Also why are you clarifying that it's in software mode? Of course it's in software mode. If it were in hardware mode it would be running on the GPU instead of entirely on the CPU, which is the point of the demonstration.

Lesser CPUs would be utterly incapable of this demonstration. The fact we've gone from being unable to run Crysis on many dedicated cards, to being able to run Crysis without any sort of dedicated GPU at all is a remarkable technological achievement and shows just how far we've come - and this demo is, well, sure, it's not bar charts and graphs, but it's a fun little experiment with a surprising result.

It would be interesting if they actually showed how many 3990Xs it takes to run the game smoothly :laugh:
A 2-, 4- or 8-socket system?
 
Oh, yes - GPU hardware ray-tracing is a lie... The best-case and most honest scenario is the one given on AMD's slide:

Only *selective* parts of a scene can be ray-traced by individual GPUs, but fully ray-tracing successive frames demands so much more computing power that it's... just not possible in the next generation, or the one after... and we shall see how many generations after that... Therefore, the idea that GPUs could've started out doing fully ray-traced rendering is... hmmm...
Quake II RTX would like to have a word with you.
Although not real-time, check the presentations on what professional tools (there should be some Arnold videos from around the RTX launch) are doing with some help from hardware.
There are performance limitations to everything; this does not make hardware acceleration a lie.
It would be interesting if they actually showed how many 3990Xs it takes to run the game smoothly :laugh:
A 2-, 4- or 8-socket system?
Judging from the task manager in the video, there is something incredibly inefficient in the software renderer.
 
There's really nothing wrong with Linus's content. No, it's not as in depth as an Anandtech or AdoredTV article, but that's not the point.

He's rarely outright wrong, he usually admits it when he IS wrong, and the greatest sin he commits on any sort of regular basis is just not going into enough depth. He's not lying to his viewers, he doesn't shill, he doesn't encourage fanboying, he's just creating content that is geared towards younger people getting into tech, and not crusty nerds who already have opinions on everything.

If you're tech literate enough to be annoyed at the stuff Linus oversimplifies or omits from explanations, then you're literally not the kind of person that needs Linus. And that's fine. It's not for you. Move on. It's great for the people who it's aimed at, and he's not *wrong* for deciding to explain things at a level of complexity his audience is comfortable with.

If you want more you have plenty of great options - Gamers Nexus, AdoredTV, etc.

Man you put that well. I used to get all cringy with Linus but it's an acquired taste, I guess. Not at all 100% horrible to watch, sometimes... :D And it's true, the guy isn't spreading BS; he does like his headlines though, but that is part of the job.

Steve is still the best TV guy of them all though. With Linus you can see he's acting. With Steve, he's just Steve being eloquent, most of the time, with a smile. It's pleasant to watch. Somewhat long-winded. But OK.

The rest... straight up dustbin material IMO. With Adored I usually find my eyebrows up on my forehead in amazement over the content and the delivery... shock horror
 
Judging from the task manager in the video, there is something incredibly inefficient in the software renderer.


Yes.
If Crysis puts a maximum load of only around 10% on the CPU, then this test is really meaningless and not serious.
The software renderer just isn't doing its job.
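
A quick, hedged sanity check on that: if the reading really is roughly 10% aggregate load on a 128-thread part, Amdahl's law puts a hard ceiling on how much of the renderer can actually be parallel. A minimal sketch (the 10% figure is just taken from the video, nothing is measured here):

```cpp
// Back-of-the-envelope Amdahl's law check: if a renderer keeps only ~10% of a
// 128-thread CPU busy, its parallel fraction must be low. Speedup over one
// core is 1 / ((1 - p) + p / n) for parallel fraction p and n threads.
#include <cstdio>
#include <initializer_list>

static double amdahl(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    const int n = 128;  // logical processors on a 3990X
    for (double p : {0.50, 0.90, 0.99, 0.999}) {
        std::printf("parallel fraction %.3f -> speedup %6.1fx on %d threads\n",
                    p, amdahl(p, n), n);
    }
    // Even at p = 0.99 the speedup caps near 56x, well under half the chip;
    // ~10% aggregate utilisation (~13 threads' worth of work) points to a
    // renderer that is mostly serial.
    return 0;
}
```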
 
The drama queen and most biased youtuber. The one we need.
Man you put that well. I used to get all cringy with Linus but it's an acquired taste, I guess. Not at all 100% horrible to watch, sometimes... :D And it's true, the guy isn't spreading BS; he does like his headlines though, but that is part of the job.

Steve is still the best TV guy of them all though. With Linus you can see he's acting. With Steve, he's just Steve being eloquent, most of the time, with a smile. It's pleasant to watch. Somewhat long-winded. But OK.

The rest... straight up dustbin material IMO. With Adored I usually find my eyebrows up on my forehead in amazement over the content and the delivery... shock horror

I think it's a bit unfair to label a YouTuber or reviewer based on their supposed fanboy alignment. Every one of them has something to offer to someone; you don't have to like it and you don't have to watch them, and offering your "fact-based" opinion to everyone who never asked for it just makes you look bad. @Vayra86 is correct, YouTubers are an acquired taste. I certainly don't like all of them, but then again one of them might actually make one video I think is worth watching.
 
Man you put that well. I used to get all cringy with Linus but it's an acquired taste, I guess. Not at all 100% horrible to watch, sometimes... :D And it's true, the guy isn't spreading BS; he does like his headlines though, but that is part of the job.

Steve is still the best TV guy of them all though. With Linus you can see he's acting. With Steve, he's just Steve being eloquent, most of the time, with a smile. It's pleasant to watch. Somewhat long-winded. But OK.

The rest... straight up dustbin material IMO. With Adored I usually find my eyebrows up on my forehead in amazement over the content and the delivery... shock horror
I prefer Linus among YouTube tech reviewers, but Steve would be my second pick. While Steve's often more informative, Linus is usually more entertaining, despite doing really cringe stuff somewhat frequently. I'm sure it's intentional the majority of the time; I think he's kind of adapted and started doing more of it over the years. Honestly, he had a good gig, knew it, and capitalized on it - can't fault him for that. He's unusual, but funny to listen to.
 
Well crap. I guess we need a substitute for "can it run Crysis?" If a CPU can, then it's not much of a challenge any more.

It only took 13 years for CPUs to catch up to GPUs. Moore's Law at work.
 
Well crap. I guess we need a substitute for "can it run Crysis?" If a CPU can, then it's not much of a challenge any more.

It only took 13 years for CPUs to catch up to GPUs. Moore's Law at work.

Studios don't really do that anymore - release games that run totally crippled on mainstream hardware at launch. Instead they resort to an Ultra or Extreme setting that offers little visual benefit for major performance hits.

Maybe Star Citizen would qualify... that is, if it had been released two years ago lol

Nah I think the next battle will be some killer app with heavy RT in it. That'll be the new battleground. But even there you see a quick response to any complaints about performance. And stuff like Control also runs well, really, given the right hardware. I'll keep saying it though... physics. That is part of what made Crysis special too, and it's still far too absent.
 
Almost every CPU made in the last decade can achieve the same.
Laughable statement. No, they cannot.
Seems like they can, see below.

Everyone can run Crysis without a GPU.

Almost every CPU from this decade is capable of running games in software renderer mode.


Above is a video of my very own 10-year-old Xeon running Crysis much like Linus did.

That nonsense aside, you'll still need a GPU to run games at the highest quality and resolution with decent FPS.
That video shows GPU usage in GPU-Z. Sorry, you were not running in CPU exclusive mode.
I did not look at the video full screen. I saw the gauges for the GPU clock & fan...

This begs the question: does it really take a 64-core CPU to run Crysis at a playable framerate?
 
Laughable statement. No, they cannot.


That video shows GPU usage in GPU-Z. Sorry, you were not running in CPU exclusive mode.

He was running in Windows 7 compatibility mode ;)
 
“AdoredTV” Lol... Can't watch him because of what a big AMD fanboy and Intel/NVIDIA hater he is...

Gamers Nexus all the way, how it should be.

AMD fanboy bad.

NV fanboy good.

Ahhh.... I gets it.

Always get your information from a variety of sources. :)
 
AMD initially builds CPUs for server/compute work, then puts a knockoff without support for ECC or "that many threads" out for consumers. Nothing wrong with it, just don't expect some cheaper alternative to the Epycs.

Intel does the same thing. All CPUs are designed with compute in mind. Consumers get a working knockoff. They don't need to run two different lines for consumer and enterprise.

Vega is a good example of that as well. Vega is a knockoff of the MI60 or so that probably didn't pass certain conditions required for enterprise.
 
I think it's a bit unfair to label a YouTuber or reviewer based on their supposed fanboy alignment. Every one of them has something to offer to someone; you don't have to like it and you don't have to watch them, and offering your "fact-based" opinion to everyone who never asked for it just makes you look bad. @Vayra86 is correct, YouTubers are an acquired taste. I certainly don't like all of them, but then again one of them might actually make one video I think is worth watching.
I don't want to label YouTubers, but GN is an exception. The way GN and his fanboys talk, it's as if he's the only one telling the truth while everyone else is lying. See the comment I replied to.
 
So if I've been reading all the reviews on this awesome CPU correctly, no current version of Windows 10 is optimized yet to fully utilize all of the 3990X's cores?

That being said, being able to run Crysis, albeit at low resolution, is very impressive. It's only a matter of time before AMD incorporates a GPU on one of these.
 
Kudos to AMD, though, for letting this SKU exist for the consumer HEDT segment.
It will, for certain, force Microsoft and software developers to get their act together and start optimising their software for this kind of chip.
Because this is the future, and AMD will make sure to release more cores in every segment in the coming years.

So if I've been reading all the reviews on this awesome CPU correctly, no current version of Windows 10 is optimized yet to fully utilize all of the 3990X's cores?

That being said, being able to run Crysis, albeit at low resolution, is very impressive. It's only a matter of time before AMD incorporates a GPU on one of these.


AMD officially says that there is no performance difference between Windows 10 Pro and Windows 10 Enterprise.
However, you need to have the latest updates installed - version 1909, build 18363.657.
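
One concrete reason the Windows build matters: above 64 logical processors, Windows splits CPUs into processor groups, and software that never asks about the other groups is stuck on at most 64 threads. A minimal sketch using the documented kernel32 calls (assuming a 3990X or similar, which typically shows up as two groups of 64):

```cpp
// Enumerate Windows processor groups. A process that ignores groups (or an
// older scheduler) can end up confined to a single group of 64 threads.
#include <windows.h>
#include <cstdio>

int main() {
    WORD groups = GetActiveProcessorGroupCount();
    DWORD total = GetActiveProcessorCount(ALL_PROCESSOR_GROUPS);

    std::printf("Active processor groups: %u\n", (unsigned)groups);
    for (WORD g = 0; g < groups; ++g) {
        std::printf("  group %u: %lu logical processors\n",
                    (unsigned)g, (unsigned long)GetActiveProcessorCount(g));
    }
    std::printf("Total logical processors: %lu\n", (unsigned long)total);
    // On a 64C/128T 3990X this typically reports two groups of 64; software
    // that never touches the second group leaves half the chip idle.
    return 0;
}
```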
 
I don't want to label YouTubers, but GN is an exception. The way GN and his fanboys talk, it's as if he's the only one telling the truth while everyone else is lying. See the comment I replied to.
Could you please provide an example of why you've got such a hateboner for GN?
 
I actually ran Crysis using the Microsoft WARP software library a couple of years ago, on a Haswell i5, 4 cores, 4 threads. It was running on low at something like 800x600 resolution and it was not smooth at all.
I was actually thinking, when Threadripper came out, that it would be interesting to see how it works on one of those.
I don't know what library Linus used, but it was not utilizing the CPU properly. If the software library could really use those 128 threads it should run butter smooth.

Unfortunately there is no software rendering library currently under active development. Both solid libraries are old and not maintained.
The last ones were WARP from Microsoft and SwiftShader from a company acquired in the meantime by Google.
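
For reference, WARP still ships as part of Direct3D, and an application can ask for it explicitly. A minimal sketch of creating a D3D11 device on the WARP rasterizer - this is just the documented API path; whether Linus's run used anything like it is an assumption, and Crysis itself obviously wasn't written this way:

```cpp
// Creating a Direct3D 11 device on WARP, Microsoft's software rasterizer.
// This is how an application opts into CPU-only rendering.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL level{};

    HRESULT hr = D3D11CreateDevice(
        nullptr,                 // default adapter (ignored for WARP)
        D3D_DRIVER_TYPE_WARP,    // software rasterizer instead of the GPU
        nullptr, 0,              // no software DLL, no creation flags
        nullptr, 0,              // let the runtime pick a feature level
        D3D11_SDK_VERSION,
        &device, &level, &context);

    if (SUCCEEDED(hr)) {
        std::printf("WARP device created, feature level 0x%x\n", (unsigned)level);
        context->Release();
        device->Release();
    } else {
        std::printf("D3D11CreateDevice(WARP) failed: 0x%08lx\n", (unsigned long)hr);
    }
    return 0;
}
```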
 
Quake II RTX would like to have a word with you.
Although not real-time, check the presentations on what professional tools (there should be some Arnold videos from around the RTX launch) are doing with some help from hardware.
There are performance limitations to everything; this does not make hardware acceleration a lie.
Judging from the task manager in the video, there is something incredibly inefficient in the software renderer.

Man, this is not a ray tracing thread, but... how should I put it... the facts would like a word with you... I won't continue this here, but feel free to create a new thread explaining real-time ray tracing, the way you see it...
 