Monday, August 19th 2019

Minecraft to Get NVIDIA RTX Ray-tracing Support

Minecraft is the perfect gaming paradox. It's a stupidly popular title with simple graphics that can run on practically any Windows machine, yet it supports the latest 3D graphics APIs such as DirectX 12. The title now adds another feather to its technical feature-set cap with support for NVIDIA RTX real-time ray tracing. RTX will be used to path-trace realistic lighting effects: light shafts, global illumination, shadows, ambient occlusion, and simple reflections. "Ray tracing sits at the center of what we think is next for Minecraft," said Saxs Persson, Franchise Creative Director of Minecraft at Microsoft. "RTX gives the Minecraft world a brand-new feel to it. In normal Minecraft, a block of gold just appears yellow, but with ray tracing turned on, you really get to see the specular highlight, you get to see the reflection, you can even see a mob reflected in it."

NVIDIA and Microsoft have yet to announce a release date for this feature update, and it remains to be seen how hardcore crafters take to it. Looking at images 1 and 2 (below), we can see that the added global illumination / bloom blurs out objects in the distance, giving crafters the impression that draw distance is somehow affected. Crafters demand the highest possible draw distance, with nothing blurring their view. We can't wait to try this out ourselves to see how RTX affects very large builds.
A video presentation by NVIDIA follows.


66 Comments on Minecraft to Get NVIDIA RTX Ray-tracing Support

#26
Steevo
I guess the overwhelming amount of lighting sources couldn't have run on the unused CPU cores or shaders.... Wait.... Someone already did this....
Posted on Reply
#27
R-T-B
Vayra86: they just found the engine was too archaic to get it done.
The engine they are implementing this new stuff in is new, DX12-only, and hardly archaic.
Posted on Reply
#28
oxrufiioxo
Finally a game to take advantage of the powah of my PC.
Posted on Reply
#30
Renald
I'm still pretty stunned by the fact that no studio whatsoever (Mojang/Microsoft themselves, why not) replicated Minecraft in C++ with a real custom engine and Lua mods.
Java is so outdated for this kind of thing...

And it wouldn't take that much time... With a good small team, maybe 5-6 months?

Minecraft is not that complex. Once the graphical first layer is mastered, the rest is really easy, even for a beginner.
Posted on Reply
#31
neatfeatguy
Bleh - seems like a waste of time/resources. Then again, the Nvidia CEO did say we should be buying cards that support RTRT... maybe this was why?
Posted on Reply
#32
kings
I don't get all the hate; no one is forcing people to play with ray tracing on.

Amazing how we got to the point of criticizing game improvements, just because the order of the day is to bash everything Nvidia-related.

Sad times indeed...
Posted on Reply
#33
Unregistered
kings: Amazing how we got to the point of criticizing game improvements, just because the order of the day is to bash everything Nvidia-related.
Probably because RTX is rendered irrelevant by price/performance in the high-end market. The 1080 Ti offered super good value, in my opinion at least, and when a current high-end card costs twice as much as a previous-gen high-end card for only 25-35% more performance, it's pretty poor value unless you have money to throw away.
Posted on Edit | Reply
#34
Vayra86
londiste: Baseline for performance comparisons should probably be the RT shaders in the SEUS pack, which do not perform too well either. Although I am sure Nvidia will make things fuzzy with more and different effects to avoid direct comparisons. Another, lighter, point of comparison between RTX and GTX cards that leverages DXR would be cool either way.

The guy who made the RT shaders said in an interview that Minecraft is a nice starting point for RT because the game assets translate well to a BVH, which makes some parts of the RT implementation easier. I would assume it's thanks to big blocks instead of more complex shapes :D
It's an interesting project no doubt, much like the Quake 2 one. I think these types of games are indeed the best 'first' implementations of RT, simply because there is an actually playable game behind it and the enhancements are very much in your face all the time.

That said, I would not touch Minecraft with a 10 ft pole today, but ok.
R-T-B: The engine they are implementing this new stuff in is new, DX12-only, and hardly archaic.
Aha, so it's going to be platform-limited to their PC/X1 versions and it won't be feasible on any other port. Definitely not going the Super Duper way then, that's good.
kings: I don't get all the hate; no one is forcing people to play with ray tracing on.

Amazing how we got to the point of criticizing game improvements, just because the order of the day is to bash everything Nvidia-related.

Sad times indeed...
Don't mistake hate for a lack of enthusiasm ;) The reason it might be seen as hate is that a group seems to think you have to stand there cheering Nvidia on for all the greatness they bring to our lives, or something. Against that, almost everyone who thinks otherwise looks like 'a hater'.

Surprise, there is a middle ground, where you just wait it out and judge things by what they are to you and you alone, not what they are for an internet movement. And if you take that stance, it's pretty difficult to be cheering for Nvidia right now.
Posted on Reply
#35
kings
Xx Tek Tip xX: Probably because RTX is rendered irrelevant by price/performance in the high-end market. The 1080 Ti offered super good value, in my opinion at least, and when a current high-end card costs twice as much as a previous-gen high-end card for only 25-35% more performance, it's pretty poor value unless you have money to throw away.
Vayra86: Don't mistake hate for a lack of enthusiasm ;) The reason it might be seen as hate is that a group seems to think you have to stand there cheering Nvidia on for all the greatness they bring to our lives, or something. Against that, almost everyone who thinks otherwise looks like 'a hater'.

Surprise, there is a middle ground, where you just wait it out and judge things by what they are to you and you alone, not what they are for an internet movement. And if you take that stance, it's pretty difficult to be cheering for Nvidia right now.
It is one thing to criticize Nvidia as a company; it is another to criticize improvements or technology advancements, and therefore all that can make use of them.

It doesn't make any sense, in my opinion; it would be like criticizing OLED just because someone may not like the leading manufacturer, LG, for example.
Posted on Reply
#36
R-T-B
Renald: I'm still pretty stunned by the fact that no studio whatsoever (Mojang/Microsoft themselves, why not) replicated Minecraft in C++ with a real custom engine and Lua mods.
Someone missed my post above... the engine has several C++ incarnations now.
Posted on Reply
#37
Vayra86
kings: It is one thing to criticize Nvidia as a company; it is another to criticize improvements or technology advancements, and therefore all that can make use of them.

It doesn't make any sense, in my opinion; it would be like criticizing OLED just because someone may not like the leading manufacturer, LG, for example.
The criticism isn't aimed at the technology; it is aimed at the way Nvidia forges its way into it and tries to push it. The motivation is clearly not 'the technology' but raking in cash by selling a promise that, thus far, is rather empty. We can't gauge Turing with respect to its RT capability - yet. But it does affect us already in the generational performance increases, and the prices those are sold at.

Simply put, we don't like paying for technology we can't use yet. It'd be like selling us a 4K TV without any 4K content available. Or a QLED TV that tries really hard to make us think it's actually OLED, if we keep to your example :)

And something could definitely be said about LG patenting OLED the way it did and locking it down entirely for the rest of the market, at the same time relegating Samsung to its QLED and MicroLED research, which is in principle inferior to it. Nvidia didn't do quite that, but due to its timing (and high-end monopoly), the effect is rather close - there is only one supplier of the hardware.
Posted on Reply
#38
Unregistered
kings: It is one thing to criticize Nvidia as a company; it is another to criticize improvements or technology advancements, and therefore all that can make use of them.
The performance impact is beyond absurd; it is not viable until GPUs are powerful enough.
Posted on Edit | Reply
#39
Renald
R-T-B: Someone missed my post above... the engine has several C++ incarnations now.
Speaking of your DX12 part? On my Win7? Sure!

What I mean is: the Java version should not exist at all as the "base game". Is a part of it written in C++, C or Assembly? Pretty damn sure some parts are.
But the core of the engine rests on Java.

It's cool for object visualization, but not for a game.
Posted on Reply
#40
Basard
I'm thinking MC should run great with ray tracing on... it doesn't use much GPU power as it is...
Posted on Reply
#41
R-T-B
Renald: But the core of the engine rests on Java.
Nope, the Java version doesn't even get feature equivalency with the fresh new W10 rewrite anymore. Hint: it won't be getting ray tracing.

That, and speaking as a Java dev, Java is not the old engine's issue. Notch learning to code was.

Java is a RAM hog, but it is actually pretty performant and has world-class multithreading (which Notch never used).
Renald: Is a part of it written in C++, C or Assembly?
The whole modern version of the game is.
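As an aside, the multithreading point above can be made concrete. This is a minimal, hypothetical sketch (not Minecraft code; the "chunk" jobs and block counts are invented for illustration) of Java's standard executor framework fanning independent work across a thread pool:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ChunkPoolDemo {
    public static void main(String[] args) throws Exception {
        // One worker thread per available core.
        ExecutorService pool =
            Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());

        // Submit 16 independent "chunk" jobs; each returns a fake block count.
        List<Future<Integer>> results = new ArrayList<>();
        for (int chunk = 0; chunk < 16; chunk++) {
            final int id = chunk;
            results.add(pool.submit(() -> id * 4096)); // 16x16x16 blocks per section
        }

        // Gather results; Future.get() blocks until each job completes.
        int total = 0;
        for (Future<Integer> f : results) {
            total += f.get();
        }
        pool.shutdown();

        System.out.println(total); // sum of (0..15) * 4096 = 491520
    }
}
```

Nothing here is exotic: `ExecutorService` and `Future` have been in the standard library since Java 5, which is part of the point being made about the old engine leaving them unused.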
Posted on Reply
#42
FordGT90Concept
"I go fast!1!11!1!"
I was shocked to see Reuters cover this:
www.reuters.com/article/us-nvidia-gaming/microsoft-nvidia-team-up-for-more-realistic-visuals-on-minecraft-game-idUSKCN1V90HS

NVIDIA is apparently making a big deal out of this.
R-T-B: Nope, the Java version doesn't even get feature equivalency with the fresh new W10 rewrite anymore. Hint: it won't be getting ray tracing.

That, and speaking as a Java dev, Java is not the old engine's issue. Notch learning to code was.

Java is a RAM hog, but it is actually pretty performant and has world-class multithreading (which Notch never used).
There's lots of problems with using Java for anything. #1: OpenGL gets poor performance on AMD cards, because AMD's OpenGL focus is on professional software. #2: Java-specific things like the lack of unsigned integer types can cause serious memory and/or performance problems, because you're either forced to use types that are double the length (with half of the values being useless) or you're constantly converting between types to work around the fact that nothing is unsigned. #3: JVM performance is generally not as good as .NET Framework performance; running the same code on both, .NET is usually a great deal faster.


The UWP version of Minecraft... is locked into Microsoft's ecosystem, whereas the Java version isn't. They both suck, for their own reasons.
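The unsigned-integer pain point (#2) above can be sketched in a few lines. This is a minimal illustration (the "packed" value is invented, not Minecraft code) of the two standard workarounds: widening to `long`, and the unsigned helper methods Java 8 later added to `Integer`:

```java
public class UnsignedDemo {
    public static void main(String[] args) {
        int packed = 0xFFFFFFFE;            // intended as unsigned 4294967294
        System.out.println(packed);         // prints -2: the top bit reads as a sign bit

        // Workaround 1 (any Java version): widen to long, doubling storage
        // just to keep the value non-negative.
        long widened = packed & 0xFFFFFFFFL;
        System.out.println(widened);        // prints 4294967294

        // Workaround 2 (Java 8+): keep the int, interpret it as unsigned
        // via the helper methods on Integer.
        System.out.println(Integer.toUnsignedString(packed));       // 4294967294
        System.out.println(Integer.divideUnsigned(packed, 2));      // 2147483647
        System.out.println(Integer.compareUnsigned(packed, 1) > 0); // true
    }
}
```

Note that Java 8 added unsigned *operations*, not unsigned *types*: the value is still stored in a signed `int`, which is the nuance behind the exchange further down the thread.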
Posted on Reply
#43
forman313
Finally, gamers get the benefit of their RT cores. They can get ray-traced graphics in Minecraft, Tetris and perhaps Quake.

At last they got their money's worth :D
Posted on Reply
#44
tabascosauz
Yawn. Sildur's shaders High and Optifine have looked this good for years now. Typical Nvidia.
Posted on Reply
#45
Keullo-e
S.T.A.R.S.
From what I read in the comments on geforceuk's Instagram post, this is done via shaders?

I could be wrong; I didn't dive into this, since RTX... meh. And this game could be ported to the NES, meh.
Posted on Reply
#46
lexluthermiester
Xuper: ps: Whatever you disagree with, don't bother to reply!
Stuff that! You don't get to tell everyone what to do. To that, I am going to disagree. A 2080 Ti is not needed; a 2060 @ 1080p will do just fine.
Posted on Reply
#47
Fluffmeister
AMD should add DXR support to their drivers like Nv did with Pascal, it would be nice to see how they compare above and beyond that Crytek demo no one can try.
Posted on Reply
#48
yotano211
btarunr: A bulk of his revenues comes from selling non-RTX GPUs, namely the GTX 16-series.

Sure, they have DXR over shaders, but it's unplayable. It's a tickbox feature at best.
I thought the most popular card according to the Steam polls was the 2060.
Posted on Reply
#49
R-T-B
FordGT90Concept: There's lots of problems with using Java for anything.
We've been over this time and again. No, there aren't. I get that you don't like Java, but please quit that old song; how you actually code in it, not the language, is what determines its performance issues, or lack thereof.

Besides, even if that were true, unsigned integer arithmetic was introduced with Java 8, which is nearly EOL by now...
FordGT90Concept: The UWP version of Minecraft... is locked into Microsoft's ecosystem, whereas the Java version isn't. They both suck, for their own reasons.
This I don't disagree with, but... that wasn't the point either.
Posted on Reply
#50
FordGT90Concept
"I go fast!1!11!1!"
R-T-B: We've been over this time and again. No, there aren't. I get that you don't like Java, but please quit that old song; how you actually code in it, not the language, is what determines its performance issues, or lack thereof.
It's not difficult to do a comparison. :P
R-T-B: Besides, even if that were true, unsigned integer arithmetic was introduced with Java 8, which is nearly EOL by now...
Which debuted in 2014, five years after Minecraft made its public debut, and too late to bother fixing when, by that point, 90% of the effort went into porting to other platforms... and none of them use the JVM.

Even a similar game, Rising World, made on the JVM, has scrapped future JVM development in favor of Unity. Video games on the JVM are a rare breed for a reason.

Even Microsoft and NVIDIA overlooked the JVM version in favor of the UWP version, because NVIDIA has no intention of supporting RTX on OpenGL.
Posted on Reply