
Tom's Hardware Editor-in-Chief's stance on the RTX 20 Series: JUST BUY IT

[attached image]

:laugh::laugh::laugh::laugh::laugh::laugh::roll::roll:
 
We need to start an OT thread just for this meme, I think.

I have tons of it (both made by myself and collected from others). Ready to post if someone starts a thread. :laugh::D
 
I just watched a video from Hardware Unboxed, and one of their viewers, The Crazy Old Coot, asked if nVidia was going to provide a jar of Vaseline with each RTX 20XX card. It would be very much appreciated if nVidia did.
 
I have tons of it (both made by myself and collected from others). Ready to post if someone starts a thread. :laugh::D
This one is among my favorites

[attached image: rtx.png]
 
I love dogs and girls in thigh high boots, so RTX ON is a win for me.
 
What if this isn't insanity, but the genuine truth? What if the guy knows something we don't? Just imagine that he's speaking to you. He knows. HE KNOWS YOU'LL DIE SOON.

WHAT IF PREORDERING A 2080 TI WILL SAVE YOUR LIFE?

WHAT IF HE KNOWS YOU ONLY HAVE A YEAR LEFT TO LIVE?

WAR, WAR IS COMING, AND IT'S RAY-TRACED. I CAN SEE IT FLASH IN RAY-TRACED EYES.
 
What if this isn't insanity, but the genuine truth? What if the guy knows something we don't? Just imagine that he's speaking to you. He knows. HE KNOWS YOU'LL DIE SOON.

WHAT IF PREORDERING A 2080 TI WILL SAVE YOUR LIFE?

WHAT IF HE KNOWS YOU ONLY HAVE A YEAR LEFT TO LIVE?

WAR, WAR IS COMING, AND IT'S RAY-TRACED. I CAN SEE IT FLASH IN RAY-TRACED EYES.

War... war never changes.
 
30fps at 1080p...

Most gamers will want at least 60 fps at 1080p, which means the 2080 Ti needs to be twice as fast, or 20 GR/s. We're talking about 40 billion transistors, by the way.

60 fps at 1080p costs about $250 now... GTX 1060/RX 580... which, coincidentally, are the most popular gaming cards. Those cards have fewer than 6 billion transistors.

So how long will it be until a card twice as fast as the 2080 Ti retails for $250? Until that day comes to pass, developers have little reason to consider DXR.


Tom's Hardware bought a ticket for the crazy train.
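
For reference, here's the back-of-envelope version of that math, assuming ray-tracing throughput scales linearly with transistor count (my assumption; the inputs are NVIDIA's published 18.6 billion transistors and ~10 GR/s for the 2080 Ti):

# Rough scaling sketch: assumes RT throughput scales linearly
# with transistor count, which is a big, unproven assumption.
tu102_transistors = 18.6e9  # RTX 2080 Ti (TU102)
tu102_gigarays = 10.0       # NVIDIA's quoted ~10 GR/s (30 fps at 1080p)

speedup = 60 / 30                                 # want 60 fps, not 30
needed_gigarays = tu102_gigarays * speedup        # 20 GR/s
needed_transistors = tu102_transistors * speedup  # ~37 billion

print(f"{needed_gigarays:.0f} GR/s, ~{needed_transistors / 1e9:.0f}B transistors")
# -> 20 GR/s, ~37B transistors; call it 40 billion with overhead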
 
I just wonder what the MO is.

There is absolutely no way you can't know people will have your head for this, especially as the editor-in-chief of a major tech outlet. There HAS to be a reason. One where it doesn't matter if a lot of people think he's crazy and/or dumb and full of shit.

FWIW, I kinda get the "saves money" argument. Say you decide it's not worth it and buy a 1080/Ti now instead. A couple of years later, RT takes off and you get a 2080 anyway. Assuming market prices stay the same, you'll pay a whole 1080 Ti more just to ultimately arrive at the same point. Maybe the 20xx cards are more money now, but it's less money later for those who are going to upgrade anyway. I think that's a reasonably fair thing to say about early adoption in general. Though in that case, the real way to save ALL of your money is to just stick with what you have. Don't buy a 1080. Don't buy a 2080. Just. Wait. Just a little while longer...
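
To put hypothetical numbers on that (prices assumed flat; the dollar figures are invented just to show the shape of the argument):

# Hypothetical prices, only to illustrate the "saves money" logic.
price_1080ti = 700  # assumed street price
price_2080 = 800    # assumed street price

buy_2080_now = price_2080                         # one purchase: 800
buy_1080ti_then_2080 = price_1080ti + price_2080  # two purchases: 1500

print(buy_1080ti_then_2080 - buy_2080_now)  # -> 700: "a whole 1080 Ti more"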

The thing about early adoption is... ...usually you have some real picture of the performance potential. Like, even if it isn't coming into play now, we'd already be able to see how it's going to be really useful once better integrated. Usually that conversation starts during development... ...as in, before it gets anywhere near the mainstream market, not right when it drops, lmao. By the time it hits, we know what it is and what it can do.

Nvidia instead waited till the last minute and dumped a bunch of hype. I think that's where a lot of the suspicion comes from. If they've been working on it for this long, why say nothing until now and then drop all of the hype at once? Instead you get a sudden, aggressive tap on the shoulder... ...you turn around and there's a guy with hip sunglasses screaming "RAY TRACING BROOOH!!!"

And you're just standing there dumbfounded, thinking "Wtf is ray tracing?!" There really can only be one reaction there. When all you see is the latest and greatest tech you've never heard of and not a smidgen of established metrics to go by, yeah... "HUH?"

It's just... it assumes that you will want/need the performance and features of these cards soon enough to justify jumping on now... ...which realistically nobody knows just yet. It could even be a whole other generation before it takes off. What then? In that way, I think it's dumb to buy one now just to save later. You only think it's good foresight, when really it's a total fallacy. You could just as easily be stuck footing the bill for something you'll never want or need.

And then, most of us would be happy to NOT pay for RT and just get that large, assured improvement in general performance. You assume general performance goes up, but in this case we really don't know by how much yet. And that's a problem, because general performance will ALWAYS count - now and forever. That's where most people's "money's worth" is.

But then, at the end of the day... ...this is all bullshit. For that very reason! Nobody knows how they actually perform yet! Only a complete idiot buys something without knowing what it is! Especially when it costs so much. It's just insanity. He's very logical and concise. And very irresponsible.

I have a feeling this guy doesn't have that friend that puts a hand on his shoulder... "*sigh* Dude..."
 
Games are a bonus at this stage anyway; it's the Quadros where NVIDIA are going to see nice revenue gains, as Turing is just so much better than anything else in the pro workstation market and can make huge corporations' workflows considerably faster.

Game adoption is already better than TrueAudio and RPM combined, but that is hardly a surprise. Naturally, it also doesn't help that AMD have no answer to the two-year-old GP102.
 
We're talking about 40 billion transistors, by the way.

That's... a damn lot of transistors to do ray tracing. Perhaps they need to find a way to make the process more efficient, rather than throwing more hardware at it.
 
Was Tom's Hardware good in the past, or was it always this Intel + Nvidia biased?

That editor sounds like someone being forced to shill RTX. He starts by saying how the arch is new and how you should wait, and then goes full retard. With that last "when you die" bit, he seems to be mocking the entire article on purpose.
 
There HAS to be a reason.
A big fat stack of cash, perhaps. Maybe even one of those briefcases with row after row of non-sequential hundred-dollar bills. You know, like you see in the movies.
 
That's... a damn lot of transistors to do ray tracing. Perhaps they need to find a way to make the process more efficient, rather than throwing more hardware at it.
That is doing it efficiently (faking it). Doing it without faking takes somewhere in the PFLOPS range of compute performance. The RTX 2080 Ti is 13.4 TFLOPS. In other words, to not fake it, you'd need at least 75 RTX 2080 Ti cards in one system.
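
That works out as follows (the ~1 PFLOPS floor is the poster's assertion, not a published spec; the 13.4 TFLOPS is the 2080 Ti's FP32 throughput):

import math

pflops_floor = 1.0e15    # asserted brute-force floor: ~1 PFLOPS
tflops_2080ti = 13.4e12  # RTX 2080 Ti FP32 throughput

print(math.ceil(pflops_floor / tflops_2080ti))  # -> 75 cards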
 
I'm aware we already have "fake" shadows and lighting because ray tracing takes too much... what I was saying is, I wonder if there's a way to make real ray tracing more efficient, in a way that isn't "fake". Maybe not, but if not, ray tracing should probably never happen, at least in real time, like in games. It's too silly an effect for the power required to produce it, anyway.
 
I didn't make this, but thought I should share
[attached image: 7883.jpeg]
 
I'm aware we already have "fake" shadows and lighting because ray tracing takes too much... what I was saying is, I wonder if there's a way to make real ray tracing more efficient, in a way that isn't "fake". Maybe not, but if not, ray tracing should probably never happen, at least in real time, like in games. It's too silly an effect for the power required to produce it, anyway.

I don't even care about "fake" lighting (or fake whatever), as long as it doesn't ruin the scene I'm actually looking at. Making it all work (and hiding the flaws) is the art of games. It's not supposed to be real. It's staging, just like films.

Suddenly, I'm reminded of the movie Three Amigos, where those bitter Germans wanted to kill the Amigos, because they found out they were mere actors. /not exactly related

The quest for "realism" is not my thing. Games are smoke and mirrors and tricks... with fun pattern-recognition puzzles in between. Not a damn simulation. edit: And if you really want me to be immersed, just don't write a shit story (or premise).

edit: I don't know how anyone can examine a 3D model (in its raw form... meshes/textures/etc.) and expect anything real from it. 50 years from now, people will probably laugh at how primitive it all is. No amount of polishing that turd is going to make it better.
 
Wow. And people wonder why I use Tom's as an example of how NOT to do tech journalism.
 
This shit keeps popping up on my Google news feed, and I'm fucking infuriated that such a green cock-gobbling whore keeps showing up in it.

They seem to be crying out "put your baby in me" to Nvidia while spreading their whore lips. Or maybe that's just what it takes to get a review card from Nvidia anymore.
 