News Posts matching #DLSS


NVIDIA Announces DLSS 2.0 and New GeForce 445.75 Game Ready Drivers

NVIDIA today announced its new Deep Learning Super Sampling 2.0 (DLSS 2.0) performance-enhancement feature, being distributed through the new GeForce 445.75 Game Ready drivers. DLSS 2.0 is NVIDIA's second attempt at the holy grail of a performance boost at acceptable levels of quality loss (think what MP3 did to WAV). It works by rendering the 3D scene at a lower resolution than what your display is capable of, and upscaling it while reconstructing details using a pre-trained deep neural network. Perhaps the two biggest differences between DLSS 2.0 and the original DLSS, which made its debut with the GeForce RTX 20-series in 2018, are the lack of a need for game-specific content to train the DLSS neural net, and the implementation of a rendering technique called temporal feedback.

As mentioned earlier, DLSS 2.0 offers image quality comparable to the original resolution while rendering only 1/4 or 1/2 the pixels. It then uses new temporal feedback techniques to reconstruct details in the image. DLSS 2.0 is also able to use the tensor cores on GeForce RTX GPUs "more efficiently," executing "2x faster" than the original DLSS. Lastly, DLSS 2.0 gives users greater control over image quality through modes that affect the rendering resolution of your game: quality, balanced (1:2), and performance (1:4), where the ratio denotes rendering resolution to display resolution. Resolution scaling is a sure-shot way to gain performance, but at noticeable quality loss. DLSS uses AI to restore some of the details. The difference between the performance gained from resolution scaling and the cost of the AI-based image quality enhancement is the net DLSS performance uplift. In addition to DLSS 2.0, the GeForce 445.75 drivers come game-ready for "Half-Life: Alyx."
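The mode ratios above can be turned into concrete render resolutions. A minimal sketch, assuming the ratios apply to total pixel count as described (so the per-axis scale is the square root of the ratio); the quality mode's exact ratio isn't stated in the text, so only balanced and performance are modeled:

```python
import math

# Pixel-count ratios (render : display) for DLSS 2.0 modes, per the text.
MODE_RATIOS = {"balanced": 1 / 2, "performance": 1 / 4}

def render_resolution(display_w, display_h, mode):
    """Derive the internal render resolution from a pixel-count ratio.

    The ratio applies to the total pixel count (width x height), so the
    per-axis scale factor is its square root.
    """
    scale = math.sqrt(MODE_RATIOS[mode])
    return round(display_w * scale), round(display_h * scale)

# 4K display in performance mode (1:4 pixels) -> half resolution per axis.
print(render_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

This also makes the "1/4 the pixels" claim tangible: performance mode at 4K renders at 1920x1080, exactly a quarter of the display's pixels.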

DOWNLOAD: NVIDIA GeForce 445.75 Game Ready Drivers with DLSS 2.0

AORUS Announces the All-New AORUS 17 Flagship Laptop

Top-of-the-line gaming brand AORUS revealed its all-new 17-inch flagship gaming laptop, the AORUS 17, at the Gamescom 2019 trade show in Cologne, Germany. AORUS takes the lead yet again through a collaboration with world-renowned switch manufacturer OMRON to innovate and develop a set of unique mechanical switches tailored for the AORUS 17; not only do the keys offer exceptional durability, they also offer one of the best-feeling key sets that gamers can find on a laptop. The AORUS greatness continues with the combination of a brand-new 8-core Intel CPU, an NVIDIA RTX graphics chip with ray tracing technology, and an exclusive WINDFORCE INFINITY cooling system; with these specs, the AORUS 17 sits steadily on the high-end gaming throne.

AORUS leads the industry again by working with world-renowned mechanical switch manufacturer OMRON to create a unique set of mechanical keys for the AORUS laptop, with gamer-oriented design details including an optimal 2.5 mm key travel and an actuation point of 1.6 mm. This gives gamers both the sensational touch and sound of a crisp blue switch, letting them enjoy the qualities of a full mechanical keyboard right on their AORUS laptop. AORUS goes further by redesigning the keycaps to produce stunning backlit keys: the unique "concentric" keycaps let the LED underneath each key shine through evenly, increasing overall lighting intensity by 27%. Combined with the AORUS-exclusive FUSION 2.0 keyboard customization software, gamers can truly create a unique personal style.

NVIDIA Also Releases Tech Demos for RTX: Star Wars, Atomic Heart, Justice Available for Download

We've seen NVIDIA's move to provide RTX effects on older, non-RT-capable hardware today being met with what the company was certainly expecting: a cry of dismay from users who now get to see exactly what their non-Turing NVIDIA hardware is capable of. The move from NVIDIA could be framed as a way to democratize access to RTX effects via Windows DXR, enabling users of its GTX 16-series and 10-series GPUs to take a look at the benefits of raytracing; but also as an upgrade incentive for those who now see how their performance lags without the new specialized Turing cores to handle the added burden.

Whatever your side of the fence on that issue, however, NVIDIA has provided users with one more raytraced joy today. Three of them, in fact, in the form of three previously-shown tech demos. The Star Wars tech demo (download) is the most well known, certainly, with its studies on reflections on Captain Phasma's breastplate. Atomic Heart (download) is another one that makes use of RTX for reflections and shadows, while Justice (download) adds caustics to that equation. If you have a Turing graphics card, you can test these demos in their full glory, with added DLSS for improved performance. If you're on Pascal, you won't have that performance-enhancing mode available, and will have to slog it through software computations. Follow the embedded links for our direct downloads of these tech demos.

NVIDIA Extends DirectX Raytracing (DXR) Support to Many GeForce GTX GPUs

NVIDIA today announced that it is extending DXR (DirectX Raytracing) support to several GeForce GTX graphics models beyond its GeForce RTX series. These include the GTX 1660 Ti, GTX 1660, GTX 1080 Ti, GTX 1080, GTX 1070 Ti, GTX 1070, and GTX 1060 6 GB. The GTX 1060 3 GB and lower "Pascal" models don't support DXR, nor do older generations of NVIDIA GPUs. NVIDIA has implemented real-time raytracing on GPUs without specialized components such as RT cores or tensor cores, by essentially implementing the rendering path through shaders, in this case, CUDA cores. DXR support will be added through a new GeForce graphics driver later today.

The GPU's CUDA cores now have to calculate BVH traversal, intersection, reflection, and refraction. The GTX 16-series chips have an edge over "Pascal" despite lacking RT cores, as the "Turing" CUDA cores support concurrent INT and FP execution, allowing more work to be done per clock. NVIDIA, in a detailed presentation, listed the kinds of real-time ray-tracing effects enabled by the DXR API, namely reflections, shadows, advanced reflections and shadows, ambient occlusion, global illumination (unbaked), and combinations of these. The company put out detailed performance numbers for a selection of GTX 10-series and GTX 16-series GPUs, and compared them to RTX 20-series SKUs that have specialized hardware for DXR.
Update: Article updated with additional test data from NVIDIA.

Further Optimizations to NVIDIA RTX, DLSS For Battlefield V

DICE and NVIDIA have been hard at work on their partnership to bring RTX and DLSS to Battlefield V. It seems the tech is a constant work in progress, as this isn't the first time the companies have introduced optimizations to the game's handling of DLSS and RTX since its release. According to the patch notes from the latest update, the Trial by Fire Update #2, there have been further optimizations to RTX on Ultra - with increased ray trace counts to improve the quality of reflections, which will definitely hit performance further.

Additionally, DLSS now supports rendering in borderless mode, and DLSS sharpness has also been improved. This likely means that NVIDIA's servers are still hard at work processing their "ground truth" image for the available scenarios in-game, further optimizing image quality. This is one of those rare technologies that will be improving with time, bringing the "fine wine" argument to (likely) its clearest scenario yet.

Anthem Gets NVIDIA DLSS and Highlights Support in Latest Update

Saying Anthem has had a rough start would be an understatement, but things can only get better with time (hopefully, anyway). This week saw an update to the PC version that brought along with it support for NVIDIA's new DLSS (Deep Learning Super Sampling) technology to be used with their new Turing-microarchitecture GeForce RTX cards. NVIDIA's internal testing shows as much as 40% improvement in average FPS with DLSS on relative to off, as seen in the image below, and there is also a video to help show graphical changes, or lack thereof in this case. DLSS on Anthem is available on all RTX cards at 3840x2160 resolution gameplay, and on the RTX 2060, 2070, and 2080 at 2560x1440. No word on equivalent resolutions at a non-16:9 aspect ratio, and presumably 1080p is a no-go as first discussed by us last month.

Note that we will NOT be able to test DLSS on Anthem, a result of the game's five-activation limit on hardware configurations. This prevented us from doing a full graphics card performance test, but our article on the VIP demo is still worth checking out if you are curious. In addition to DLSS, Anthem also has NVIDIA Highlights support for GeForce Experience users to automatically capture and save "best gameplay moments", with a toggle option to enable this setting in the driver. A highlight is generated for an apex kill, boss kill, legendary kill, multi kill, overlook interaction, or a tomb discovery. More on this in the source linked below in the full story.

Shadow of the Tomb Raider RTX Patch Now Available: RTX and DLSS Enabled

A new patch has become available for Shadow of the Tomb Raider, updating the game with the latest graphical technologies in the form of RTX and DLSS. The PC port of the game has been handled by developer Nixxes, which partnered with NVIDIA to work on adding ray-tracing-enabled shadows to the game (there's a thematic coherence there if I've ever seen one).

NVIDIA: Image Quality for DLSS in Metro Exodus to Be Improved in Further Updates, and the Nature of the Beast

NVIDIA, in a blog post/Q&A on its DLSS technology, promised implementation and image quality improvements for its Metro Exodus rendition of the technology. If you'll remember, AMD recently advocated other, non-proprietary ways of achieving the desired anti-aliasing quality across resolutions, such as TAA and SMAA, saying that DLSS introduces "(...) image artefacts caused by the upscaling and harsh sharpening." NVIDIA in its blog post has dissected DLSS and its implementation, also clarifying some lingering questions on the technology and its resolution limitations that some of us here at TPU had already wondered about.

The blog post describes some of the limitations in DLSS technology, and why exactly image quality issues might pop up here and there in titles. As we knew from NVIDIA's initial RTX press briefing, DLSS basically works on top of an NVIDIA neural network. Called NGX, it processes millions of frames from a single game at varying resolutions with DLSS, and compares the output to a given "ground truth image" - the highest-quality possible output sans any shenanigans, generated from pure raw processing power. The objective is to train the network toward generating this image without the performance cost. The resulting DLSS model is then made available for NVIDIA's clients to download and run locally on their RTX graphics cards, which is why DLSS image quality can improve over time. This also helps explain why closed implementations of the technology, such as 3DMark's Port Royal benchmark, show such incredible image quality compared to, say, Metro Exodus - there is a very, very limited number of frames that the neural network needs to process to achieve the best image quality.
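The training objective described above can be sketched in miniature. This is a hypothetical illustration, not NVIDIA's actual NGX pipeline: the "network" here is a toy nearest-neighbor upscaler, and the loss is plain mean squared error against the ground-truth frame, which is the quantity a training loop would drive toward zero:

```python
def upscale_nearest(lowres, factor):
    """Toy stand-in for the neural upscaler: nearest-neighbor 1-D upscale."""
    return [px for px in lowres for _ in range(factor)]

def mse(predicted, ground_truth):
    """Mean squared error between the upscaled frame and the ground truth."""
    assert len(predicted) == len(ground_truth)
    return sum((p - g) ** 2 for p, g in zip(predicted, ground_truth)) / len(predicted)

# One "frame" as a flat row of pixel intensities.
ground_truth = [0.0, 0.25, 0.5, 0.75]   # the high-quality reference render
lowres = [0.0, 0.5]                     # the same frame at half resolution
predicted = upscale_nearest(lowres, 2)
loss = mse(predicted, ground_truth)     # training would minimize this value
```

The fewer distinct frames a title can produce (as in a fixed benchmark like Port Royal), the easier it is to push this loss down, which matches the article's point about closed implementations.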
Forumites: This is an Editorial

Tight Squeeze Below $350 as Price of GTX 1660 Ti Revealed

NVIDIA is reportedly pricing the GeForce GTX 1660 Ti at USD $279 (baseline pricing), which implies pricing of custom-designed and factory-overclocked cards scraping the $300-mark. The card is also spaced $70 apart from the RTX 2060, which offers not just 25% more CUDA cores, but also NVIDIA RTX and DLSS technologies. In media reporting of the card so far, it is being compared extensively to the GTX 1060 6 GB, which continues to go for under $230. Perhaps NVIDIA is planning a slower non-Ti version to replace the GTX 1060 6 GB under the $250-mark. That entry would place three SKUs within $50-70 of each other, a tight squeeze. Based on the 12 nm TU116 silicon, the GTX 1660 Ti is rumored to feature 1,536 CUDA cores, 96 TMUs, 48 ROPs, and a 192-bit-wide GDDR6 memory interface handling 6 GB of memory at 12 Gbps (288 GB/s). This GPU lacks RT cores.
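The 288 GB/s figure follows directly from the rumored memory specs. A quick sanity check: peak bandwidth is bus width in bits times per-pin data rate, divided by 8 bits per byte:

```python
def memory_bandwidth_gbps(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_width_bits * data_rate_gbps / 8

# 192-bit GDDR6 interface at 12 Gbps per pin.
print(memory_bandwidth_gbps(192, 12))  # 288.0
```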

AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

A report via PCGamesN places AMD's stance on NVIDIA's DLSS as a rather decided one: the company stands for further development of SMAA (Enhanced Subpixel Morphological Anti-Aliasing) and TAA (Temporal Anti-Aliasing) solutions on current, open frameworks, which, according to AMD's director of marketing, Sasa Marinkovic, "(...) are going to be widely implemented in today's games, and that run exceptionally well on Radeon VII", instead of investing in yet another proprietary solution. While AMD pointed out that DLSS' market penetration was low, that's not the main issue of contention. In fact, AMD goes head-on against NVIDIA's own technical presentations, which compare DLSS' image quality and performance benefits against a native-resolution, TAA-enhanced image - AMD says that SMAA and TAA can work equally well without "the image artefacts caused by the upscaling and harsh sharpening of DLSS."

Of course, AMD may only be speaking from the point of view of a competitor that has no competing solution. However, company representatives said that they could, in theory, develop something along the lines of DLSS via a GPGPU framework - a task for which AMD's architectures are usually extremely well-suited. AMD hasn't taken its eyes off DLSS-style approaches entirely, either, as AMD's Nish Neelalojanan, a Gaming division exec, talks about potential DLSS-like implementations across "Some of the other broader available frameworks, like WindowsML and DirectML", saying these are "something we [AMD] are actively looking at optimizing… At some of the previous shows we've shown some of the upscaling, some of the filters available with WindowsML, running really well with some of our Radeon cards." So whether this is an actual image-quality philosophy, or just a competing technology's TTM (time to market) play, only AMD knows.

NVIDIA DLSS and its Surprising Resolution Limitations

TechPowerUp readers today were greeted with our PC port analysis of Metro Exodus, which also contained a dedicated section on NVIDIA RTX and DLSS technologies. The former brings real-time ray tracing support to an already graphically intensive game, and the latter attempts to assuage the performance hit via NVIDIA's new proprietary alternative to more traditional anti-aliasing. There was definitely a bump in performance from DLSS when enabled, however we also noted some head-scratching limitations on when and how it can even be enabled, depending on the in-game resolution and RTX GPU employed. We then set about testing DLSS on Battlefield V, which also became available today, and it was then that we noticed a trend.

Take Metro Exodus first, with the relevant notes in the first image below. DLSS can only be turned on for specific combinations of RTX GPUs ranging from the RTX 2060 to the RTX 2080 Ti, and NVIDIA appears to be limiting users to a class-based system. Users with the RTX 2060, for example, can't even use DLSS at 4K and, more egregiously, owners of the RTX 2080 and 2080 Ti cannot enjoy RTX and DLSS simultaneously at the most popular in-game resolution of 1920x1080, which would be useful for reaching high FPS rates on 144 Hz monitors. Battlefield V has a similar, yet even more divided, system wherein the gaming flagship RTX 2080 Ti cannot be used with RTX and DLSS at even 1440p, as seen in the second image below. This brought us back to Final Fantasy XV's own DLSS implementation last year, which was all-or-nothing at 4K resolution only. What could have prompted NVIDIA to carry this out? We speculate further past the break.
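The per-title gating described above can be written out as a small lookup. This is an illustrative reconstruction from the combinations mentioned in the text, not NVIDIA's actual whitelist, and the Battlefield V 1080p entry is inferred from "at even 1440p":

```python
# GPU/resolution combinations the text reports as blocked, per title.
DLSS_BLOCKED = {
    "Metro Exodus": {
        ("RTX 2060", "3840x2160"),     # RTX 2060 can't use DLSS at 4K
        ("RTX 2080", "1920x1080"),     # top cards locked out of 1080p
        ("RTX 2080 Ti", "1920x1080"),
    },
    "Battlefield V": {
        ("RTX 2080 Ti", "1920x1080"),  # inferred: blocked below 1440p too
        ("RTX 2080 Ti", "2560x1440"),  # flagship blocked at even 1440p
    },
}

def dlss_allowed(title, gpu, resolution):
    """True if the given GPU/resolution combo is not on the block list."""
    return (gpu, resolution) not in DLSS_BLOCKED.get(title, set())

print(dlss_allowed("Battlefield V", "RTX 2080 Ti", "2560x1440"))  # False
```

Writing it out this way makes the "class-based system" visible: availability depends on the pairing of GPU tier and resolution, not on either alone.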

Battlefield V Gets NVIDIA DLSS Support

Battlefield V became the first AAA title to support NVIDIA Deep Learning Super Sampling, or DLSS, a new-generation image-quality enhancement feature exclusive to NVIDIA GeForce RTX 20-series graphics cards, since it requires tensor cores. The feature was introduced as part of the comprehensive Battlefield V Chapter 2: Lightning Strikes Update late Tuesday. To use it, DXR must be enabled in the game. In its release notes for the update, EA-DICE describes DLSS as a feature "which uses deep learning to improve game performance while maintaining visual quality." The developers also improved the way the deploy screen displays on ultrawide monitors on the PC, particularly with the "Rotterdam" map.

Update: We have posted an article, taking a closer look at the DLSS implementation in Battlefield V.

NVIDIA DLSS Technology Coming to Battlefield V Soon According to DICE Update Notes

In a case of "Oops, we didn't mean to", DICE's update notes for Battlefield V came out at least a day before they were supposed to. While DICE quickly took to social media to mention these update notes were not necessarily final, everyone was quick to notice that the PC-specific improvements section listed NVIDIA DLSS support being added on February 25. We were able to take a look at DLSS in action on Battlefield V at the NVIDIA suite during CES 2019, and it made a vast difference in overall performance and graphics alike, especially since we could now turn on NVIDIA RTX and not get a massive decrease in average framerate.

Jaqub Ajmal, a producer at DICE for the game, soon tweeted to clarify that the company is still working on this implementation and does not actually have a set date yet. The actual update notes that go out tomorrow (still Feb 11 in North America at the time of posting) may well say something else instead. Regardless of whether this happens on Feb 25 or not, we here at TechPowerUp will take a closer look at DLSS and the in-game effects, so be on the lookout for that. In the meantime, let us know your thoughts on DLSS coming to game titles and your expectations for the future.

3DMark Adds NVIDIA DLSS Feature Performance Test to Port Royal

Did you see the NVIDIA keynote presentation at CES this year? For us, one of the highlights was the DLSS demo based on our 3DMark Port Royal ray tracing benchmark. Today, we're thrilled to announce that we've added this exciting new graphics technology to 3DMark in the form of a new NVIDIA DLSS feature test. This new test is available now in 3DMark Advanced and Professional Editions.

3DMark feature tests are specialized tests for specific technologies. The NVIDIA DLSS feature test helps you compare performance and image quality with and without DLSS processing. The test is based on the 3DMark Port Royal ray tracing benchmark. Like many games, Port Royal uses Temporal Anti-Aliasing. TAA is a popular, state-of-the-art technique, but it can result in blurring and the loss of fine detail. DLSS (Deep Learning Super Sampling) is an NVIDIA RTX technology that uses deep learning and AI to improve game performance while maintaining visual quality.

ZOTAC Introduces ZBOX Magnus EC52070D Mini PC

ZOTAC Technology, a global manufacturer of innovation, today releases a new generation of MAGNUS E Series Mini PC built for hardware enthusiasts. The MAGNUS is re-engineered to give powerful performance for creators and prosumers in a compact and lightweight enclosure. With an 8th Generation Intel processor, a discrete GeForce RTX 2070 graphics card and equipped with premium connectivity, the all-new MAGNUS E Series Mini PC powers everything from demanding workloads, creative workflows, home entertainment experiences, and much more.

The MAGNUS EC52070D features a 6-core Intel Core i5-8400T processor with boost up to 3.3GHz, giving it the power to handle simultaneous, compute-intensive, multithreaded workloads. Users can get responsive processing on creative workflows such as photo post-processing, illustration, audio processing, video production, live-streaming, and more. With ready support for up to 32GB of high-speed DDR4 memory, NVMe M.2 SSD, 2.5-inch large capacity storage and Intel Optane memory technology for high performance storage, MAGNUS is capable of loading files and applications faster.

Final Fantasy XV Benchmark Gets DLSS Update, GeForce RTX 2080 Performance Tested

Square Enix has just updated their Final Fantasy XV Benchmark to version 1.2, adding support for NVIDIA's DLSS (Deep Learning Super Sampling) technology. The new release will still allow users to test any graphics card(s) they have, just as it did before. That said, owners of NVIDIA's RTX 2070, 2080, and 2080 Ti get the benefit of having access to DLSS for improved image quality and performance. NVIDIA claims that performance will improve by up to 38% with DLSS alone. To verify that, we ran a few tests of our own.

Preliminary testing was done using Corsair's Vengeance 5180 Gaming PC, which is equipped with an Intel i7-8700, 16 GB of 2666 MHz DDR4, and an NVIDIA GeForce RTX 2080. At 3840x2160 with the highest possible settings, DLSS offered a 36% increase in performance. This is very close to NVIDIA's specified increase and within the expected margin of error. When compared to the older GTX 1080 Ti, which was paired with a stock Intel i7-8700K and 32 GB of 3466 MHz memory, we see the GeForce RTX 2080 and GTX 1080 Ti offer roughly the same level of performance. Therefore, DLSS really is the difference maker here, allowing for better performance and image quality. It should also be noted that both systems used the same NVIDIA 416.94 WHQL drivers.

Square Enix Cancels Future Final Fantasy XV Development, At Least We Have The Benchmark With DLSS Support

A few months ago, the creation of Luminous Productions by members of Square Enix made us wonder whether it could affect the development of Final Fantasy XV for PC. The company had to reorganize its resources, and that affected the roadmap for this project, which has finally been cancelled. The departure of Hajime Tabata, director of development, has made Square Enix decide to cancel any further development, including DLCs and patches alike, for the game. This especially hurts the PC version, given that it is in need of some gameplay fixes sooner rather than later, as the comments on the Steam store page would tell you. The company is having a rough time lately, as it reported a $33 million loss in its latest financial briefing.

The demo-benchmark of Final Fantasy XV was in fact one of the surprises at the NVIDIA GeForce RTX 20 Series announcement, and thanks to it we were able to see how DLSS technology effectively posed a promising alternative to traditional anti aliasing techniques. With the project ending in the current state, we'll have to forget about these theoretical enhancements, and other eye-catching new features such as Vulkan API support.

NVIDIA Releases Comparison Benchmarks for DLSS-Accelerated 4K Rendering

NVIDIA released comparison benchmarks for its new AI-accelerated DLSS technology, which is part of their new Turing architecture's claim to fame. Using the Infiltrator benchmark with its stunning real-time graphics, NVIDIA showcased the performance benefits of using DLSS-improved 4K rendering instead of the usual 4K rendering + TAA (Temporal Anti-Aliasing). Using a Core i9-7900X 3.3 GHz CPU paired with 16 GB of Corsair DDR4 memory, Windows 10 (v1803) 64-bit, and version 416.25 of the NVIDIA drivers, the company showed the tremendous performance improvements that can be achieved by pairing Turing's architectural strengths with the prowess of DLSS in putting tensor cores to use in service of more typical graphics processing workloads.

The results speak for themselves: with DLSS at 4K resolution, the upcoming NVIDIA RTX 2070 convincingly beats its previous-gen counterpart by doubling its performance. Under these particular conditions, the new king of the hill, the RTX 2080 Ti, convincingly beats the previous gen's halo product, the Titan Xp, with a 41% performance lead - but so does the new RTX 2070, which is being sold at half the asking price of the original Titan Xp.