
Tom's Hardware Editor-in-Chief's stance on the RTX 20 Series: JUST BUY IT

What's "expensive jewelry" and "thigh high boots" entail?

Is this when I can run Forza 7 and Doom at 4k better than Nvidia? *ahem probably the only things worth bragging about*
 
What makes Vega better to me is Freesync. Straight up battling cards against each other won't do it for me.
For me, it has a lot to do with driver support. In the Linux ecosystem I prefer open-source drivers. I'm pretty sure I've said this before, but I'd be willing to overlook the moral implications of nVidia's business practices and driver decisions if the price were right, but it obviously isn't. I have a huge issue with nVidia not even being willing to release their firmware so the open-source drivers can stand a chance.

So, in the words of Linus Torvalds: "nVidia, f**k you!"
 
I see more Ryzen fanboys than Vega. The reason I'd consider them fanboys is that they have a habit of pointing out Intel's imminent demise. No rational person would think that's happening anytime soon :p

I don't see AMD digging a grave for Intel any time soon, but Intel certainly has been tripping all over themselves lately... everyone's kinda disillusioned with them, especially in the server space, for all these security holes they have to plug at the cost of performance (again, that's mostly in the server space), and personally I'm not a fan of what they've been doing with desktop chips either. I'm okay with small performance increases and process improvements (which makes them more efficient, though all this security nonsense kinda takes away from that performance), and I'm even okay with the paste up to a point, but they really screwed the pooch on all of that, all while locking down overclocking unless you bought the expensive K platform... so yeah, I'm liking AMD better than Intel right now, for a lot of reasons, and it makes me really hope Zen 2 shits all over Intel so I can maybe buy one and get the performance I want and none of the crap I don't from Intel. Now if everything stayed the same as it was before Ryzen and Spectre/Meltdown, I'd probably grudgingly get an Intel, cause that's where the performance would be.

I'm also liking AMD more than nVidia right now since all this RTX stuff happened. We don't know enough about it yet to be sure, but as of now, to me, it really seems like a big waste of silicon for a mostly useless gimmicky thing (yet again) from nVidia, which is going to be expensive and proprietary to their hardware (like G-Sync) and should hopefully fizzle out and die. I mean, kudos to nVidia for trying to make ray tracing actually possible, and it can look quite nice, but what they need to realize is that expensive, and worse yet, proprietary tech isn't good for anyone. They want to maintain a competitive edge by making cool stuff you can only get with nVidia, but that way of doing things only sucks for everyone. G-Sync is a little different, but proprietary tech like PhysX, Hairworks, and now this, where it only works in a few titles that nVidia works with the game developers and nowhere else to implement it, just sucks. It winds up being used in very few games and it's not accessible to anyone who doesn't use nVidia. I understand not simply giving stuff to your competition, but is it really worth it when only a few titles use it?
 
Oh, I don't disagree with anyone about those practices.. or being happy with AMD's resurgence. I just mean the gloating is a bit premature at times (to me, that's the sign of fanboy-ism). Gonna take a helluva lot more than that.
 
Uuu... so many people beating the drum of "Tom's Hardware published an article of herculean stupidity that pertained to the idea of pre-ordering based on nothing more than promises."
How funny it is considering what was going on before Ryzen and Vega came out.
I bet most of the other company fanboys here would already order Navi if someone wanted to take their money:)

It's just sad that the same people who praise the other company for "fine wine", "future proofing" and "innovation" are now criticizing NV for pushing RTRT. :-D
This is the most advanced card available today. Just live with it. NV got so far ahead in performance that they finally had a moment to do something interesting.

And I simply knew HBM would come up here (mentioned by the usual members). :-D

Um, I read Tom's previews for Ryzen AND Threadripper. While they stated that they sounded good IF they were that big an improvement, they clearly stated in both articles, "Wait for the reviews, we NEVER recommend pre-ordering without reviews." I picked up a 1700X from Microcenter AFTER reading the reviews on launch day. My, how things have changed at Tom's in such a short time.
 
And the "future proof" argument is here once again. :)
Vega 56 and 64 were tuned to match the 1070 and 1080 in raw performance (at the cost of power consumption). They do, up to 4K.

How big a resolution are we talking about? Will Vega beat the 1080 at 8K? 12K? I'm wondering how far we have to go.
And is it just how far in time? Or maybe we'll also have to move to a parallel universe where people have different eyes? :)
Because the way I see it, we're unlikely to move far beyond 4K. RTRT, on the other hand, seems like the next great advancement. So which company is more future proof now?
I am talking about graphics-heavy games that will use over 12GB of VRAM in maybe 1-2 years and will have no problem running on Vega 56/64 with HBCC enabled, as opposed to the Pascal line, which won't have enough VRAM for them. And ray tracing on the best of the new nVidia GPUs will have to run at 1080p and under 60FPS for over $1,000. What kind of progress is it to force someone to play at 1080p to have RT instead of 4K? Should someone sell their new 4K monitor and go back to the old 1080p one, since 1080p will look blurry on the 4K panel?
 
What kind of progress is it to force someone to play at 1080p to have RT instead of 4K?

It isn't forced. You can turn ray-tracing off.

Should someone sell their new 4K monitor and go back to the old 1080p one, since 1080p will look blurry on the 4K panel?

That shouldn't happen on a monitor with a smart scaler.
 
Wait, I thought that article is just a sarcastic write up lol
OK then, I ain't setting a digital foot at Tom's ever again...
 
It isn't forced. You can turn ray-tracing off.



That shouldn't happen on a monitor with a smart scaler.
Forced on anyone wanting the new tech, of course. Don't pretend you didn't get it...

All 4K monitors have this scaler you mentioned? I hope so...
 
I am talking about graphics-heavy games that will use over 12GB of VRAM in maybe 1-2 years and will have no problem running on Vega 56/64 with HBCC enabled, as opposed to the Pascal line, which won't have enough VRAM for them.
LOL. Like what? Games paid for as you go by generating crypto on the side? :-)

Even if this happens, NV will have another generation in stores and the "something-TX 3060" will match a Vega 64 for a fraction of the price and generated heat. Also with all current optimizations and a warranty! :-)

Also, do you really think game studios will make games that can't run on mainstream cards made by the dominant manufacturer? You can't be that naive.

And ray tracing on the best of the new nVidia GPUs will have to run at 1080p and under 60FPS for over $1,000. What kind of progress is it to force someone to play at 1080p to have RT instead of 4K? Should someone sell their new 4K monitor and go back to the old 1080p one, since 1080p will look blurry on the 4K panel?
No one is forcing anyone to use it. NV simply gives you an opportunity to have beautiful visuals at the cost of resolution or fps. Thanks to the RTX move game studios might embrace RTRT and - maybe - in 2-3 years it will become a ubiquitous and affordable feature.

Also, my dear Watson, I'd say that quite a big chunk of the gaming community today plays at 1080P on 4K LCDs.

BTW: if your hardware can't properly scale 1080P to 2160P, maybe it's time to switch brands. :-D
 
BTW: if your hardware can't properly scale 1080P to 2160P, maybe it's time to switch brands. :-D
But I thought that scaling is generally not a good thing to do. Either do it right (native resolution) or not at all.
 
All 4K monitors have this scaler you mentioned? I hope so...

No, but they should. It's a crapshoot, honestly.

Don't pretend you didn't get it...

I didn't buy this gen, if that is what you mean...

But I thought that scaling is generally not a good thing to do. Either do it right (native resolution) or not at all.

Integer-divisible scaling can be done perfectly if the scaler has half a brain.
 
But I thought that scaling is generally not a good thing to do. Either do it right (native resolution) or not at all.
Why would it be?
Viewing 1080p output on a 2160p screen simply means showing each source pixel on 4 adjacent screen pixels. Of course, that's only if it's done properly.

Of course this is not how a generic scaling algorithm works.
When you open a 100x100 px image and save it as 71x71 px, the algorithm resamples the pixels it needs with an interpolation filter (bicubic, windowed-sinc/Lanczos and so on). Results vary from very good to very sad. :-)
If you have a single black pixel on a white background, such a filter will smear it into a soft grey blob, possibly with ringing around it, and so on. :-)
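To make that concrete, here's a minimal Python sketch (a toy example of my own; it assumes NumPy and Pillow are installed, neither of which is mentioned in this thread) contrasting exact 2x integer doubling, which is what a 1080p-on-2160p "smart scaler" should do, with a generic resampling filter on the single-black-pixel case:

```python
import numpy as np
from PIL import Image

# Toy source frame: a single black pixel on a white background.
src = np.full((100, 100), 255, dtype=np.uint8)
src[50, 50] = 0

# Integer scaling: every source pixel becomes a 2x2 block of identical
# screen pixels (like 1080p -> 2160p), so nothing is blurred or invented.
doubled = np.repeat(np.repeat(src, 2, axis=0), 2, axis=1)
assert doubled.shape == (200, 200)
assert doubled[100:102, 100:102].max() == 0   # the black pixel stays pure black

# Generic, non-integer resampling: 100x100 -> 71x71 with a windowed-sinc
# (Lanczos) filter. The lone black pixel gets spread across its neighbours.
resampled = Image.fromarray(src).resize((71, 71), resample=Image.LANCZOS)
print(np.asarray(resampled)[33:38, 33:38])    # grey values around the old pixel
```

The exact grey values depend on the filter, but the point stands: integer doubling is lossless, while arbitrary-ratio resampling always has to invent in-between pixels.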
 
The general consensus here seems to be "if" it works properly. A lot of non-native resolutions look like shit on a lot of screens. That said, not sure who is buying a 4k screen to run 1080p on it, unless they're hoping for some monster GPU that can one day hopefully run it...
 
I am! :-D
4K is great for tasks I normally do - like coding, data analysis or photo editing (and many more that I don't care about as much).
And 1080p is a great resolution for gaming. You're not missing details and games still look great. And I can live with a cheap and almost noiseless PC.

I don't think I'll even feel a need for more resolution in games. RTRT (and general evolution of realism) is another thing entirely. :-)
 
Sure, 4k is a lot of resolution, and is good for productivity tasks, because you have more space to work with... but any GPU can render that. If I had a 4k screen, I'd probably want to run games at 4k... unless 1080p actually looks good on such a screen, and I'm running into performance issues.
 
The general consensus here seems to be "if" it works properly. A lot of non-native resolutions look like shit on a lot of screens.
That's the point that I was trying to make. Most of us who've been around the block a few times have probably seen a lot of examples where display scaling (more often than not) looked like crap.
 
I am laughing my ass off at the people touting ray tracing like it's some super game-changing tech.

It isn't, and it has been around since the '90s. There's a reason we don't use it: the requirements are absurd for very little improvement.

You can get damn close using light maps and creative shader-fu if you know what you are doing; global illumination will get you 95% of the way there and doesn't need special hardware.
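For anyone wondering what "light maps" buy you in practice, here is a hedged, minimal Python sketch (the shade function and the toy gradient are mine, not from any particular engine): the expensive global-illumination bounces are computed offline when the lightmap is baked, and runtime shading collapses to a per-texel multiply that needs no special hardware.

```python
import numpy as np

def shade(albedo, lightmap, ambient=0.03):
    """Combine a surface's albedo texture with its baked lightmap.

    albedo   : HxWx3 array in [0, 1]
    lightmap : HxWx3 array of lighting precomputed offline, in [0, 1]
    """
    # Runtime cost is just a multiply-add per texel; all the bounce
    # lighting was already folded into `lightmap` during the bake.
    return np.clip(albedo * (lightmap + ambient), 0.0, 1.0)

# Toy data: a flat grey wall lit by a baked gradient, e.g. light
# falling off with distance from a window.
albedo = np.full((4, 4, 3), 0.5)
falloff = np.linspace(1.0, 0.1, 4).reshape(1, 4, 1)
lightmap = np.broadcast_to(falloff, (4, 4, 3))

print(shade(albedo, lightmap)[0])  # brightest texels nearest the "window"
```

The trade-off, of course, is that baked lighting is static; that's exactly the gap real-time ray tracing is supposed to close.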
 
Did the convo take a turn toward buying a 4K monitor to play at console res?

Next we'll have checkerboard upscaling vs DLSS.
 
I'm honestly to the point where I think that they're trying to take the idea of realism way too damn far. Uncanny valley comes to mind.
 
Sure, 4k is a lot of resolution, and is good for productivity tasks
Yeah, I hate this word. There are names for such tasks, you know? :-) Coding, financial analysis, data mining, video encoding... These are different tasks, with different loads, each giving an advantage to different CPUs.
Because once you divide computer use cases into gaming and productivity, it seems like you don't care much about the latter. :-)
And where would we put watching movies?
So maybe let's divide computer tasks into productivity and time-wasting? That would look nice on a website header. :-D
What about reading e-books? Should we divide them? I mean... Lean For Dummies is definitely productive compared to the latest X-men comics, right?
And don't laugh at PDFs, because a few hundred pages full of graphs and formulas can seriously whip a CPU. :-)

You see... gaming is pretty homogeneous, so making some general performance conclusions makes sense. The rest is a lot more complicated. :-)
 
I divide tasks like that because they take different hardware to work efficiently. A computer with an i5 8400 and a GTX 1080 (may as well throw in the old 16GB RAM too) will do great at gaming, but not so much at anything that takes a lot of CPU grunt to run, or a ton of RAM. Just because a machine is good at gaming, or good at AutoCAD, doesn't mean it'll be good at everything.
 
Usually if you build a gaming system that's built simply for the sake of overkill, it'll chew through damn near anything (short of heavy server load) you throw at it and not even sweat.
 