
Editorial: NVIDIA's Move From GTX to RTX Speaks to Belief in Revolutionary Change in Graphics

Geezus, what were they thinking in the marketing department... Let's raise pricing like Apple does and charge around $1,100 for the premium model.

The lack of AMD competing in the high end is what creates this madness. AMD needs to get its act together; that 7nm Vega ain't going to cut it.
 
Yeah, sadly for 7nm Vega, poor Volta is already powering the world's most powerful supercomputer.
 
Raytracing is no more a "gimmick" than DirectX 12 was a "gimmick". We're still waiting for the good games to use it natively, yet many jumped on the hype train and bought cards that were supposed to be more "future proof". Most of those buyers have already replaced their hardware, or are about to, anyway…
This. Most likely I'll just pick up whatever RTX 3070 comes out, once I replace my ancient Ivy Bridge build.
I remember hearing years ago that the majority of games would be using DX12 and Vulkan by now. Turns out DX11 is still the dominant API; nobody wants to spend too much time optimizing for DX12 when they could just let Nvidia or AMD optimize their drivers for them.
Doom/Wolfenstein II on Vulkan is amazing, though.
 
Turing brings mixed precision too. With Vega having been on the market for a while, presumably loads of games support rapid packed math by now?
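For anyone wondering what "rapid packed math" actually buys you: two FP16 values share one 32-bit register lane, so each instruction can retire two half-precision operations. Here's a minimal Python/numpy sketch of the packing idea; numpy only emulates the storage layout (the doubled ALU rate is a Vega/Turing hardware property), and none of the names here come from any game or driver.

Code:
# Two FP16 values occupy exactly one 32-bit word -- that's the "packing"
# in rapid packed math. On the GPU the ALU then operates on both 16-bit
# lanes per instruction; numpy here only demonstrates the storage layout.
import numpy as np

pair = np.array([1.5, 2.25], dtype=np.float16)  # two half-precision values
word = pair.view(np.uint32)[0]                  # viewed as one 32-bit word
print(f"two FP16 values in one dword: 0x{word:08x}")

# The bandwidth half of the win: the same element count in FP16 moves half
# the bytes of FP32, independent of the doubled ALU rate.
n = 1 << 20
print(np.ones(n, dtype=np.float16).nbytes, "bytes (FP16) vs",
      np.ones(n, dtype=np.float32).nbytes, "bytes (FP32)")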
 
Do you understand just how useful ray-tracing is? When every movie studio on the planet uses it for their CGI, it's big, and has been for decades. And now it's come to the consumer. If you fail to understand what ray-tracing has done for the world and its potential for gaming, that is only your failure.
How about every major Hollywood production that uses CGI? Welcome to the real world. The future is now.
People here are calling ray-tracing a scam.
After commenting, they go back to watching the new Netflix movie filled with CGI they don't even know is CGI... because of ray-tracing. Ignorance never goes away, does it?

You guys really fail over and over to understand something. The CGI made for Hollywood's movies is NOT real-time; it's pre-rendered and post-processed over multiple layers, over very long periods of time. And yet it's not very realistic for living characters, like Thanos from Avengers, who was so fake and unrealistic I thought I was watching a CGI cartoon. Extremely poor rendering compared to the Autobots from Transformers, for example, where the CGI was exemplary. - Sure, it's not all raytracing, but...
 
https://videocardz.com/newz/nvidia-geforce-rtx-2070-does-not-have-nvlink

So it doesn't support dual GPU, big dea-

The story continues. The GeForce RTX 2070 might not even be using the TU104 GPU, but a mid-range TU106 instead. The card clearly has a different board and a different Device ID. It seems that the whole GPU segmentation has shifted and we are now paying more for the same GPU classes than before.

Wait, what!? Are we in a time where a low-end codename now costs $500? (106 was used for low-end models before, 104 for mid-range, and 100/102 for high-end.)
 
PCWorld reported during their Full Nerd show that RTX demos still exhibited noise.

RTX also relies on a fancy new denoising module. Denoising is very important in ray tracing because you can only cast a limited number of rays from each pixel in the virtual camera. So unless you leave your ray tracer running long enough to fill in the scene, you have a lot of unpleasant-looking “bald spots” or “noise.” There are some great research projects that help optimize which rays are cast, but you still wind up with a lot of noise. If that noise can be reduced separately, you can produce quality output much faster than if you need to address the issue by sending out that many more rays. Nvidia uses this technique to help it create frames more quickly.
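To make the "bald spots" point concrete, here's a toy Python sketch. It has nothing to do with NVIDIA's actual denoiser; the box filter and scene below are made up for illustration. Each pixel averages N random ray samples, so the raw error only shrinks as 1/sqrt(N), while a cheap spatial filter recovers a much cleaner image from the same few rays.

Code:
# Toy model of Monte Carlo ray-tracing noise: per pixel we average N noisy
# "ray samples" around the true radiance, then compare raw vs. denoised error.
import numpy as np

rng = np.random.default_rng(0)

def render(truth, rays_per_pixel):
    """Monte Carlo estimate: average N noisy samples per pixel."""
    h, w = truth.shape
    samples = truth[None] + rng.normal(0.0, 0.3, (rays_per_pixel, h, w))
    return samples.mean(axis=0)

def box_denoise(img, radius=2):
    """Naive box filter; real denoisers are edge-aware or learned."""
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    k = 2 * radius + 1
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

truth = np.linspace(0.0, 1.0, 64)[None].repeat(64, axis=0)  # smooth gradient
rmse = lambda a: np.sqrt(np.mean((a - truth) ** 2))
for n in (1, 4, 64):
    noisy = render(truth, n)
    print(f"{n:3d} rays/px: raw RMSE {rmse(noisy):.3f}, "
          f"denoised RMSE {rmse(box_denoise(noisy)):.3f}")

More rays per pixel do cut the noise, but only with square-root returns, which is exactly why offloading the cleanup to a separate denoising pass is such a big win for real-time frame budgets.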
 
Isn't that supposed to be resolved by the tensor cores running the denoising algorithm, or is it still not enough?
 
Your comments will fall on deaf ears here. We don't care about consoles. Until consoles have the versatility and configurability of a PC, they will never have what it takes to compete.
Except for the millions of customers they have...

PC gaming has always been the luxury buyer's choice; the graphical power stomps all over consoles, no debate there, and you can customise to your heart's content.

The difference IS shrinking with each generation, though. Put the average gamer in a room with a PS4/XB1 and a PC with a 1070 or better, and they'd be hard-pressed to find the enormous differences you'd have gotten a decade ago.

A PS5/XB2 is going full 4K next gen; the newest Xbox is already capable of it. It's pure 'fanboyism' to think otherwise. If Nvidia continues this trend of overpriced cards, then there's only going to be one winner.
 
You guys really fail over and over to understand something. The CGI made for Hollywood's movies is NOT real-time; it's pre-rendered and post-processed over multiple layers, over very long periods of time. And yet it's not very realistic for living characters, like Thanos from Avengers, who was so fake and unrealistic I thought I was watching a CGI cartoon. Extremely poor rendering compared to the Autobots from Transformers, for example, where the CGI was exemplary. - Sure, it's not all raytracing, but...
I've done my fair share of realistic 3D modeling, with ray-tracing and without. I am well aware that it used to take many hours to render a scene. You know why it takes so long? Usually because of ray-tracing, aka lighting! And Nvidia does it in real time! Post-processing is easy and requires next to no performance; nearly all games are post-processed to some extent.

The reason Thanos looks unrealistic is the 3D model, the texturing, and especially the animation. Ray-tracing does nothing for those things. Ray-tracing doesn't mean photorealism. It is a key ingredient for achieving photorealism, but it alone is not it. Honestly, when was the last time you watched a modern Hollywood movie and shouted "that shadow is unrealistic!"? Never, because it is ray-traced. I said something about ignorance, and this is what I meant. People don't even realize how amazeballs it is to have real-time ray-traced lights.
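And for anyone who hasn't seen the guts of it, ray-traced shadows really are just an intersection test repeated millions of times per frame. Here's a toy Python sketch with one shadow ray against one occluding sphere (all scene values invented for illustration): occlusion falls straight out of the geometry, with none of the shadow-map tricks rasterizers rely on.

Code:
# One shadow ray per shaded point: if the ray toward the light hits the
# occluder first, the point is in shadow. Scene values are made up.
import numpy as np

def hits_sphere(origin, direction, center, radius):
    """Ray-sphere test: solve |o + t*d - c|^2 = r^2 for the nearest t > 0."""
    oc = origin - center
    b = np.dot(oc, direction)          # direction is unit-length
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return False                   # ray misses the sphere entirely
    t = -b - np.sqrt(disc)
    return t > 1e-4                    # small epsilon avoids self-shadowing
    # (A full tracer would also require t < distance-to-light.)

light = np.array([5.0, 5.0, 0.0])
occluder = np.array([2.5, 2.5, 0.0])   # sphere between the light and origin

for point in (np.array([0.0, 0.0, 0.0]), np.array([0.0, 5.0, 0.0])):
    to_light = light - point
    d = to_light / np.linalg.norm(to_light)
    shadowed = hits_sphere(point, d, occluder, 1.0)
    print(point, "-> shadowed" if shadowed else "-> lit")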
 
People don't even realize how amazeballs it is to have real-time ray-traced lights.
It's not so amazing anymore when you have to pay something like 1300€ for ~30 FPS @ 1080p... :laugh::laugh::laugh::laugh::banghead:
 
Except for the millions of customers they have...

PC gaming has always been the luxury buyer's choice; the graphical power stomps all over consoles, no debate there, and you can customise to your heart's content.
I am one of those people. I have 8 Nintendo consoles, 5 Nintendo portables, 3 Sony consoles, 2 Sony portables, and a number of classic gaming consoles. I am well aware of how popular and well-loved console gaming is. What I meant by the above comment was that consoles will never compete with a well-made mid-range PC in raw power.
The difference IS shrinking with each generation, though. Put the average gamer in a room with a PS4/XB1 and a PC with a 1070 or better, and they'd be hard-pressed to find the enormous differences you'd have gotten a decade ago.
My current PC is built around a 7-year-old CPU/mobo/RAM with other parts that are up to 12 years old (sound card); the newest part is the video card, which is a 1080. Yet even with my older GTX 770, this system still kicks the Xbox and PS4 Pro (which I own) in the nads performance-wise. Sure, a well-made PC is pricier, but you get more bang with it too. And you will always be able to do more with a PC than just game.
 
I am one of those people. I have 8 Nintendo consoles, 5 Nintendo portables, 3 Sony consoles, 2 Sony portables, and a number of classic gaming consoles. I am well aware of how popular and well-loved console gaming is. What I meant by the above comment was that consoles will never compete with a well-made mid-range PC in raw power.

My current PC is built around a 7-year-old CPU/mobo/RAM with other parts that are up to 12 years old (sound card); the newest part is the video card, which is a 1080. Yet even with my older GTX 770, this system still kicks the Xbox and PS4 Pro (which I own) in the nads performance-wise. Sure, a well-made PC is pricier, but you get more bang with it too. And you will always be able to do more with a PC than just game.
Mid-range PC topping out at $1,000, i.e. excluding the monitor? Don't you think that's an unfair comparison? What's the latest price of these consoles?
 
Mid-range PC topping out at $1,000, i.e. excluding the monitor? Don't you think that's an unfair comparison? What's the latest price of these consoles?
The last mid-range PC I built for a client came in at $676 for the tower alone, all new parts. I don't count the monitor because, when comparing to a console, you still have to buy a display/TV, and those prices can range all over the place, unless you already have one, which is often the case.
 
"Relative Shader Performance" ?? nGreedia's leather guy never fails to amaze with those bug words saying absolutely nothing...
Jeegarayyyzzz!!!
 