
Minecraft to Get NVIDIA RTX Ray-tracing Support

There is one drawback: Windows 10.
 
I guess the overwhelming number of light sources couldn't have run on the unused CPU cores or shaders... Wait... Someone already did this...
 
They just found the engine was too archaic to get it done.

The engine they are implementing this new stuff in is new, DX12-only, and hardly archaic.
 
Finally a game to take advantage of the powah of my PC.
 
I'm still pretty stunned by the fact that no studio whatsoever (Mojang/Microsoft themselves, why not?) replicated Minecraft in C++ with a real custom engine and Lua mods.
Java is so outdated for this kind of thing...

And it wouldn't take that much time... with a good small team, maybe 5-6 months?

Minecraft is not that complex. Once the first graphical layer is mastered, the rest is really easy, even for a beginner.
 


/sarcasm



Bleh - seems like a waste of time/resources. Then again, Nvidia's CEO did say we should be buying cards that support RTRT... maybe this was why?
 
I don't get all the hate; no one is forcing people to play with ray tracing on.

Amazing how we got to the point of criticizing game improvements, just because the order of the day is to bash everything Nvidia-related.

Sad times indeed...
 
Amazing how we got to the point of criticizing game improvements, just because the order of the day is to bash everything Nvidia-related.
Probably because RTX is rendered irrelevant in price/performance in the high-end market. The 1080 Ti offered super good value, in my opinion at least, and when a previous-gen high-end card costs half as much as a current high-end one that is only 25-35% faster, the new card is pretty poor value unless you have money to throw away.
 
The baseline for performance comparisons should probably be the RT shaders in the SEUS pack, which do not perform too well either. I am sure Nvidia will muddy things with more and different effects to avoid direct comparisons, but another, lighter point of comparison between RTX and GTX cards that leverages DXR would be cool either way.

The guy who made the RT shaders said in an interview that Minecraft is a nice starting point for RT because the game assets translate well to a BVH, which makes some parts of the RT implementation easier. I would assume it's thanks to big blocks instead of more complex shapes :D
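Purely as an illustration of why blocky geometry is BVH-friendly (a generic sketch, not anything from that interview or from Mojang's code; the AABB class and forBlock helper are made up), every full block is already a tight axis-aligned box, so the leaf nodes of a BVH practically build themselves:

// Hypothetical sketch: a 1x1x1 voxel at integer grid coordinates is its own
// tight axis-aligned bounding box, which is exactly what a BVH leaf stores.
class AABB {
    final float minX, minY, minZ, maxX, maxY, maxZ;

    AABB(float minX, float minY, float minZ,
         float maxX, float maxY, float maxZ) {
        this.minX = minX; this.minY = minY; this.minZ = minZ;
        this.maxX = maxX; this.maxY = maxY; this.maxZ = maxZ;
    }

    // No mesh analysis needed: the block's grid position defines the bound.
    static AABB forBlock(int x, int y, int z) {
        return new AABB(x, y, z, x + 1, y + 1, z + 1);
    }
}

Triangle soups from arbitrary meshes need per-triangle bounds and far more splitting heuristics; a voxel grid skips most of that.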

It's an interesting project, no doubt, much like the Quake 2 one. I think these types of games are indeed the best 'first' implementations of RT, simply because there is an actually playable game behind it and the enhancements are very much in your face all the time.

That said, I would not touch Minecraft with a 10ft pole today, but ok.

The engine they are implementing this new stuff in is new, DX12-only, and hardly archaic.

Aha, so it's going to be platform-limited to their PC/X1 versions, and it won't be feasible on any other port. Definitely not going the Super Duper way then; that's good.

I don't get all the hate; no one is forcing people to play with ray tracing on.

Amazing how we got to the point of criticizing game improvements, just because the order of the day is to bash everything Nvidia-related.

Sad times indeed...

Don't mistake a lack of enthusiasm for hate ;) The reason it might be seen as hate is that a group somehow seems to think you have to be standing there cheering on Nvidia for all the greatness they bring to our lives, or something. Against that, almost everyone who thinks otherwise looks like 'a hater'.

Surprise, there is a middle ground, where you just wait it out and judge things by what they are to you and you alone, not what they are to an internet movement. And if you take that stance, it's pretty difficult to be cheering for Nvidia right now.
 
Probably because RTX is rendered irrelevant in price/performance in the high-end market. The 1080 Ti offered super good value, in my opinion at least, and when a previous-gen high-end card costs half as much as a current high-end one that is only 25-35% faster, the new card is pretty poor value unless you have money to throw away.

Don't mistake a lack of enthusiasm for hate ;) The reason it might be seen as hate is that a group somehow seems to think you have to be standing there cheering on Nvidia for all the greatness they bring to our lives, or something. Against that, almost everyone who thinks otherwise looks like 'a hater'.

Surprise, there is a middle ground, where you just wait it out and judge things by what they are to you and you alone, not what they are to an internet movement. And if you take that stance, it's pretty difficult to be cheering for Nvidia right now.

It is one thing to criticize Nvidia as a company; it is another to criticize improvements or technology advancements, and therefore everything that can make use of them.

It doesn't make any sense, in my opinion; it would be like criticizing OLED just because someone may not like the leading manufacturer, LG, for example.
 
I'm still pretty stunned by the fact that no studio whatsoever (Mojang/Microsoft themselves, why not?) replicated Minecraft in C++ with a real custom engine and Lua mods.

Someone missed my post above... the engine has several C++ incarnations now.
 
It is one thing to criticize Nvidia as a company; it is another to criticize improvements or technology advancements, and therefore everything that can make use of them.

It doesn't make any sense, in my opinion; it would be like criticizing OLED just because someone may not like the leading manufacturer, LG, for example.

The criticism isn't aimed at the technology; it is aimed at the way Nvidia forges its way into it and tries to push it. The motivation is clearly not 'the technology' but raking in cash by selling a promise that, thus far, is rather empty. We can't gauge Turing with respect to its RT capability - yet. But it already affects us in the generational performance increases, and the prices those are sold at.

Simply put, we don't like paying for technology we can't use yet. It'd be like selling us a 4K TV without any 4K content available. Or a QLED TV that tries really hard to make us think it's actually OLED, if we keep to your example :)

And something could definitely be said about LG patenting OLED the way it did and locking it down entirely for the rest of the market, at the same time relegating Samsung to its QLED and micro-LED research, which is in principle inferior. Nvidia didn't do quite that, but due to its timing (and high-end monopoly) the effect is rather close: there is only one supplier of the hardware.
 
It is one thing to criticize Nvidia as a company; it is another to criticize improvements or technology advancements, and therefore everything that can make use of them.
The performance impact is beyond absurd, and this is not viable until GPUs are powerful enough.
 
Someone missed my post above... the engine has several C++ incarnations now.
Speaking of your DX12 part? On my Win7? Sure!

What I mean is: the Java version should not exist at all as the "base game". Is part of it written in C++, C, or assembly? Pretty damn sure some parts are.
But the core of the engine still rests on Java.

It's cool for object visualization, but not for a game.
 
I'm thinking MC should run great with ray tracing on... it doesn't use much GPU power as it is...
 
But the core of the engine still rests on Java.

Nope, the Java version doesn't even get feature equivalency with the fresh new W10 rewrite anymore. Hint: it won't be getting ray tracing.

That, and speaking as a Java dev: Java is not the old engine's issue. Notch learning to code was.

Java is a RAM hog, but it is actually pretty performant and has world-class multithreading (which Notch never used).
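Just to illustrate that last point (a generic sketch with a made-up ChunkRelighter class and per-chunk light arrays, nothing resembling Minecraft's real internals), spreading per-chunk work across every core in modern Java takes very little code:

import java.util.List;

// Hypothetical example: fan per-chunk work out across all CPU cores.
class ChunkRelighter {
    // parallelStream() uses the common ForkJoinPool, which is sized to the
    // number of available cores by default, so each chunk's work can run
    // on a separate core with no manual thread management.
    static void relightAll(List<int[]> chunkLightLevels) {
        chunkLightLevels.parallelStream().forEach(ChunkRelighter::relight);
    }

    // Dummy per-chunk workload: bump every light level, capped at 15.
    static void relight(int[] lightLevels) {
        for (int i = 0; i < lightLevels.length; i++) {
            lightLevels[i] = Math.min(15, lightLevels[i] + 1);
        }
    }
}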

Is part of it written in C++, C, or assembly?

The whole modern version of the game is.
 
I was shocked to see Reuters cover this:

NVIDIA is apparently making a big deal out of this.


Nope, the Java version doesn't even get feature equivalency with the fresh new W10 rewrite anymore. Hint: it won't be getting ray tracing.

That, and speaking as a Java dev: Java is not the old engine's issue. Notch learning to code was.

Java is a RAM hog, but it is actually pretty performant and has world-class multithreading (which Notch never used).
There are lots of problems with using Java for anything.
#1: OpenGL gets poor performance on AMD cards, because AMD's OpenGL focus is on professional software.
#2: Java-specific quirks like the lack of unsigned integer types can cause serious memory and/or performance problems, because you're either forced to use types twice as wide (with half the range wasted) or you're constantly converting back and forth to work around the fact that nothing is unsigned.
#3: JVM performance is generally not as good as .NET Framework performance; with the same code run on both, .NET is usually a great deal faster.
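To make the #2 point concrete (a generic illustration with made-up names, not anything from Minecraft's code): storing an unsigned 8-bit value in a Java byte means masking off the sign extension on every read:

// Generic illustration of the unsigned-type workaround in Java:
// byte is signed (-128..127), so an unsigned 8-bit value (0..255)
// has to be masked back to its intended range on every read.
class UnsignedWorkaround {
    static int lightLevel(byte[] packed, int index) {
        // Without the mask, stored values >= 128 would come back negative.
        return packed[index] & 0xFF;
    }

    static void setLightLevel(byte[] packed, int index, int value) {
        packed[index] = (byte) value;  // narrowing keeps the low 8 bits
    }
}

The alternative is to store everything in the next-wider signed type and waste half its range, which is the memory cost mentioned above.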


The UWP version of Minecraft... is locked into Microsoft's ecosystem, whereas the Java version isn't. They both suck for their own reasons.
 
Finally gamers get the benefits of their RT cores. They can get ray-traced graphics in Minecraft, Tetris, and perhaps Quake.

At last they got their money's worth :D
 
Yawn. Sildur's shaders on High and OptiFine have looked this good for years now. Typical Nvidia.
 
From what I read in the comments on geforceuk's Instagram post, this is done via shaders?

I could be wrong, I didn't dive into this since it's RTX, meh... And this is a game that could be ported to the NES, meh.
 
AMD should add DXR support to their drivers like Nvidia did with Pascal; it would be nice to see how they compare beyond that Crytek demo no one can try.
 
The bulk of his revenue comes from selling non-RTX GPUs, namely the GTX 16-series.

Sure, they have DXR over shaders, but it's unplayable. It's a tick-box feature at best.
I thought the most popular card according to Steam polls was the 2060.
 
There are lots of problems with using Java for anything.

We've been over this time and again. No, there aren't. I get that you don't like Java, but please quit that old song that how you actually code in it has jack to do with its performance issues, or lack thereof.

Besides, even if that were true, unsigned integer arithmetic was introduced with Java 8, which is nearly EOL by now...
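For reference, the Java 8 additions are helper methods on the existing integer classes rather than new types; a quick standalone example:

// Java 8 unsigned helpers on the standard Integer class.
class UnsignedDemo {
    public static void main(String[] args) {
        int a = 0xFFFFFFFE;  // reads as -2 when treated as signed

        System.out.println(Integer.toUnsignedLong(a));      // 4294967294
        System.out.println(Integer.toUnsignedString(a));    // "4294967294"
        System.out.println(Integer.divideUnsigned(a, 3));   // unsigned division
        System.out.println(Integer.compareUnsigned(a, 1));  // positive: a > 1 unsigned
    }
}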

The UWP version of Minecraft... is locked into Microsoft's ecosystem, whereas the Java version isn't. They both suck for their own reasons.

This I don't disagree with but... that wasn't the point either.
 