
NVidia, What gives? Seriously?!?

I don't agree, but it is what it is and it has been for a long, long time. I don't believe anything can be done about it, as an outsider looking in.

Besides.. Steve buys his own cards does he not? He seems to be doing ok..
 
Wait, what? We're talking about a corporation. You know, guys who would kidnap your children and sell you back their organs if there was a long term profit to be had.
So your solution to that is to just let them, shut up about it, and accept that that's life? Sorry, but as I said, some of us actually want the world to not be terrible. I know, it's crazy, right?
 
A question: was your comparison pic from a game with an actual XSX version, or from an XO(X) game running in backwards compatibility? I'm assuming the latter given the severe lack of real current-gen titles, which again undermines your comparison - it's running the same code with largely the same settings, so it stands to reason that the visual differences will be minimal. Frame rates and smoothness? Those are radically improved. And of course in time the graphical fidelity will also show itself to be much improved.

As for people "making millions doing reviews" - I don't think anyone outside of Linus is doing that. Please stop presenting tech reviewers as if they are some highly privileged class - it's not only untrue, it's disingenuous and makes you come off as either willfully naive or biased, as you're presenting an obviously untrue argument in favor of major corporations.

There is of course reason to be skeptical of the relationship between ad payments, product access and review content, as this type of corruption is still relatively prevalent across the globe in all industries. It is something end users need to be aware of and critical towards when reading reviews, and it is something that sites' and channels' editorial policies need to explicitly account for to maintain any semblance of journalistic integrity. Thankfully there are a decent amount who do, and who even explicitly design their reviews around this (such as going back and re-testing their launch review samples compared to retail samples to check for cherry-picked samples).
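
As an aside, the retail-vs-press-sample retests mentioned above boil down to a fairly simple comparison. Here's a minimal sketch of the idea with invented benchmark scores (not data from any actual retest):

```python
from statistics import mean

# Hypothetical scores from the same benchmark, run on the original press
# sample and on a few off-the-shelf retail cards (all numbers invented).
press_sample_runs = [10450, 10480, 10510]
retail_cards = {
    "retail #1": [10390, 10410, 10420],
    "retail #2": [10300, 10350, 10330],
    "retail #3": [10460, 10440, 10470],
}

press_avg = mean(press_sample_runs)
for name, runs in retail_cards.items():
    delta = press_avg / mean(runs) - 1
    print(f"{name}: press sample is {delta:+.1%} faster")

# If the press sample consistently sits several percent ahead of every retail
# card, that's a cherry-picking red flag; within a percent or two it's just
# ordinary silicon and cooling variance.
```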

But in the end, I think you're going way too far in your arguments here. I can agree with a lot of your base assumptions - the close relations to the industry necessary to produce this kind of review content are indeed a huge risk factor in terms of maintaining journalistic integrity and producing content that is as unbiased as possible. The massively skewed power dynamics between reviewers and corporations of course make reviewers vulnerable in myriad ways. However, you are taking this and seemingly concluding that reviewers can't be trusted, period, and that corruption is the norm, not the exception. There's a significant logical disconnect here, and one that entirely ignores the human factors involved in this. Most reviewers are relatively ordinary people, with relatively ordinary morals, and would thus not be comfortable producing consciously biased content over time - that kind of stuff does some seriously nasty shit to your psyche. Nobody likes being a tool for propaganda except for zealots. And among prominent tech 'tubers, how many zealots do you know of? I could name a few, but none are a part of what we're discussing here.

Hand-selected how? Silicon binning? I'm not personally worried about cherry-picked samples, no, as it wouldn't make much of a difference at all. If the reviewers' GPUs boost ... let's say 100MHz higher than everyone else's, that's still a tiny increase when it comes down to it. And besides, the breadth of reviews out there makes this more or less impossible, especially when you have sites like GN who go out of their way to control for these things by buying further samples and doing comparison reviews at a later point.
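
To put rough numbers on that (a back-of-the-envelope sketch with assumed clocks, not figures from any review, and generously assuming frame rate scales linearly with core clock):

```python
# Illustrative only: how much could a "golden" press sample that boosts
# 100 MHz higher than retail cards actually matter?
typical_boost_mhz = 1900   # assumed typical sustained boost clock
press_sample_mhz = 2000    # assumed cherry-picked sample, +100 MHz

clock_advantage = press_sample_mhz / typical_boost_mhz - 1
# Games scale sub-linearly with core clock (memory bandwidth, power and CPU
# limits all get in the way), so linear scaling is a generous upper bound.
fps_advantage_upper_bound = clock_advantage

print(f"Clock advantage: {clock_advantage:.1%}")               # ~5.3%
print(f"FPS advantage (upper bound): {fps_advantage_upper_bound:.1%}")
```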

Wait, was Zen 2's disadvantage in gaming somehow undercommunicated? To me it was plenty clear that they performed notably behind Intel in gaming, and especially in lightly threaded and latency-sensitive workloads such as esports titles. I can't think of any review that didn't highlight this - but most of that was overshadowed by the superior value proposition and superior performance in a wide array of tasks, sure. That every reviewer was suddenly a video editor is ... again, this looks like an attempt at making a fair point, but you're taking it too far. CPU reviews have always focused on - and by necessity must focus on - a wide array of tasks. Video editing and 3D rendering have risen in popularity as benchmarks as those workloads have become far more common over the past decade. Does that make them the main workload of most PC enthusiasts? Of course not. Nor have I seen many reviews saying as much - most are pretty clear that if all you care about is gaming, the 9600K or 9700K were better than the 3600 or 3700X. Highlighting that the latter are superior for other workloads or mixed workloads (such as gaming while CPU-encoding video) is the type of nuance a review needs, as it should cover the needs of as many readers as possible. So unless you can give examples of reviews actually arguing that the gaming performance deficit shouldn't matter to people who only game, or that rendering/video editing is a more important and common workload than gaming, then you're taking this way, way too far.

Sorry, who is whining here? The person saying "no, I won't compromise my editorial policies in order to get early access to review products", or the massive corporation throwing a hissy-fit over some imagined slight by a relatively small product review channel?

... they work for the same channel, no? Under the same review policies? Do you honestly believe that they work entirely independently from each other? Don't be daft. Stop reading distribution of work as if it says anything about the values of the person doing the work - there are hundreds of possible reasons why work is distributed the way it is, and none of them add up to "reviewer A thinks feature X is shit, reviewer B thinks it's cool, so reviewer B does that content". That's a bad-faith argument which assumes the reviewers have an agenda beyond investigating the performance and properties of the products they review, and it utterly ignores the facts of the degree of planning and cooperation involved in even low-level media production.

And besides, your analogy is wildly inaccurate. An electric car can't run on fossil fuels; an RTX GPU can still do rasterization - and it's still the main workload of the GPU. A more accurate analogy would be if Toyota sent out a Prius hybrid (not the plug-in type) to a reviewer and demanded that they prioritize covering the electric operation of the car in the review, rather than the main fossil fuel powered operation of the car, as "that is what the industry and drivers are interested in".

The problem here is you have what looks like blind faith in the advertisers - reviewers who get freebies from those big evil corporations, and make money by people clicking on their videos and related ads. Yes that's the advertising / marketing business.

You also don't understand that bias is not just something introduced via corporate pressures; the community itself can and does create bias. Say something the community doesn't like, and your revenue goes down. Ever notice how these review sites tend to say the same thing? Go somewhere the audience is different, and you'll see different conclusions; you might even see different benchmarks. Funny how that works, isn't it?

Same thing happens on all kinds of topics. Cars, motorcycles, TVs, skate boards, whatever.

All these things mash the majority of review sites into one mold. It's why most of them run the same 'benchmarks', they test the same games, they come to the same conclusions. Haven't you noticed, very very few of these sites run an MS Office benchmark (TPU does, yes I know). And yet, they all have an opinion on 'productivity'. Did you know there are over a billion instances of MS Office running in the world, and over 120 million active Office 365 accounts? What about MS Teams - there are over 500,000 *organizations* using that. How can anyone talk about productivity with a straight face and not talk about this?

So instead we're all going to run Cinema 4D benchmarks, but no one runs Premiere Pro? This is a freaking joke. Adobe is by far the largest media creation tool set in existence.

So no, these people are not unbiased 3rd party observers. They are in the business of telling people what they want to hear, getting access to products, and putting food on the table for their families. There's nothing unethical about that, a lot of people are in that business, but people should understand what they are looking at. It is up to the viewer to find truth in the mix, but it isn't going to be handed to you on a silver platter like you seem to think.
 
Office is not a performance application so why benchmark it lol
 
So no, these people are not unbiased 3rd party observers. They are in the business of telling people what they want to hear, getting access to products, and putting food on the table for their families.
I know this happens, but jesus man, are that many people/reviewers really that weak? LOL

I know, personally, dozens of them from large sites (Tom's/Anand) to small, and I can tell you with the utmost certainty: these people are not shills.

The problem with running some of these tests (games too) is that they are not canned. So a process to review something has to materialize. You can use PCMark 10 for Office productivity, but it's a shitshow for consistency. Games too... you say XX title(s) are the most popular, so why are these not getting benched? Because some of those titles are multiplayer and not repeatable, or don't have a canned benchmark. Going to a specific spot in a game and doing the same exact thing for multiple runs each time is a royal PITA. MAD respect for W1z who does that in some/many games. But the point is to find the RELATIVE difference between cards, not directly in-game performance. Anyone with half a clue knows these benchmarks don't represent all parts of the game.
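
To make the "relative difference" point concrete, here's a minimal sketch (with made-up FPS numbers) of what a reviewer is actually computing from repeated runs of the same test pass:

```python
from statistics import mean, stdev

# Hypothetical repeated runs of the same in-game test pass (FPS).
# The absolute numbers are invented; what matters is the gap vs. the noise.
card_a_runs = [101.2, 99.8, 100.6, 100.9]
card_b_runs = [88.4, 89.1, 87.9, 88.7]

avg_a, avg_b = mean(card_a_runs), mean(card_b_runs)
noise_a = stdev(card_a_runs) / avg_a   # run-to-run variation, as a fraction of the mean
noise_b = stdev(card_b_runs) / avg_b

relative_gap = avg_a / avg_b - 1

print(f"Card A: {avg_a:.1f} FPS (±{noise_a:.1%} run-to-run)")
print(f"Card B: {avg_b:.1f} FPS (±{noise_b:.1%} run-to-run)")
print(f"Relative difference: {relative_gap:.1%}")
# The relative gap is only meaningful if it's clearly larger than the
# run-to-run noise - which is exactly why non-repeatable multiplayer
# scenarios make for poor benchmarks.
```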

Can you do custom runs? Sure! Have a file or two to render in Blender, say... or figure out the command line options in some of these tests. It isn't easy to develop testing where there isn't any to begin with. Then you get into the minutiae of 'why did he use that rendering option' or 'why didn't they use this filter', etc... in other words, reviewers can't win/please everyone. This is why we always say to look at more than one review. If you use the Adobe suite, check out a review that covers it... really simple... better hope they are testing it the way you use it, however.

As a reviewer, I don't tell users what they want to hear, I tell users what the hardware I reviewed told me. I don't care which way the audience leans. Are there some potato sites out there who are shills? Sure are! Is it anywhere close to a majority? Faaaaak no... it's an extreme MINORITY if anything. As an EIC/Editor and Reviewer, your assertion that we're all biased is just plain and simply wrong. Again, I'm sure some are influenced by that... but not all, not a majority... an extreme minority.
 
The problem here is you have what looks like blind faith in the advertisers - reviewers who get freebies from those big evil corporations, and make money by people clicking on their videos and related ads. Yes that's the advertising / marketing business.
A) I'm a media researcher. I'm quite aware of how biases in media are formed, expressed, reinforced, challenged, and so on, thank you.
B) I don't tend to have blind faith in much of anything. My trust in the tech media is both moderate and well reasoned. Please check your condescending BS at the door.
C) You're overestimating the homogenizing effect of audience tastes - this has more of an effect on form and presentation than on content, though of course it has some effect on content also. But arguing that community pressures are causing, for example, reviewers to present Nvidia (or AMD) in a better light? Nah. At least not the ones I would take seriously - TPU, AnandTech, GN, and a handful of others. Those also, for the record, typically publish their test methodologies as well as the reasoning behind the development of those methodologies, allowing readers to examine this for themselves and see if they agree with the choices made.
D) There are absolutely some weird and poor choices of benchmarks out there. Cinebench is definitely one. But there are also many good ones, as well as good reasons for not choosing things that would have been very interesting. Multiplayer games and various online (i.e. constantly updated) games are for example near impossible to benchmark in a reliable way, and if you can't benchmark reliably, not doing it at all is definitely the right choice. There are also valid questions about whether benchmarking a bot match in a multiplayer game is representative of real-world online play, etc. Benchmarking also tends to focus on challenging loads, which explains why esports games tend to have a very low priority (yes, some can be demanding, but most run passably on just about anything). There's also an argument to be made about when good performance at ultra actually matters (I'd say never, as Ultra is inevitably wasteful), and AAA and story-heavy games are definitely places where I would think most players care more about the style and aesthetics of the game than fast-paced competitive titles (though I by no means think that style or aesthetics are of no importance there - just that fluidity matters more).
E) There are absolutely people running Office benchmarks (AnandTech for example uses BapCo SysMark extensively in their system testing), though the reality is that while there are billions of Office users out there, the vast majority of them use it for things that run fine on 5-year-old 15W u-series laptop chips. There are of course some with massively demanding workloads like enormous spreadsheets etc., but those are a tiny minority. Measuring the performance of office tasks relevant to the majority of users is rather meaningless given how light a load they represent, so it's a better use of reviewers' time to test more demanding common workloads.
F) Are there no Premiere Pro benchmarks out there? What? I'm sorry, but what reviews are you reading/watching? Premiere Pro export time benchmarks are a dime a dozen, and even LTT (who are definitely more in the "entertainment" than "trustworthy review" category) places a lot of focus on Premiere Pro timeline performance (logical for them, really).

Lastly, I never claimed that reviewers are unbiased 3rd party observers. There is no such thing as an unbiased human being. In any matter, in any situation, ever. Period. However, the observable level of bias in the tech review sites I follow is generally low enough to not make much of a difference, and many of them do a good job of explicitly addressing and countering their own biases. You, on the other hand, seem to be arguing that reviewers are both biased in favor of the companies "giving them free stuff" (which, to be honest, assumes a level of unprofessionalism on their part that says more about you than about them, unless you have actual proof), and in favor of whatever their audience prefers. That sounds ... problematic, to say the least. What do they then do if those two don't align? Who are they more beholden to? And do they then have no journalistic integrity and self-respect whatsoever? Your analysis of this seems shallow, simplistic, and overly black and white, and while, as I said previously, I agree with a lot of your base assumptions, I entirely disagree with your bombastic and all-encompassing black-and-white conclusions.
 
Storm in a tea cup.
Whining because he wasn't going to get any more free samples.
NOTHING stopped him from buying product and then reviewing.
That statement shows you entirely missed the reason why this is a problem and the reason why the public outrage is both justified and needed.

Sorry, who is whining here? The person saying "no, I won't compromise my editorial policies in order to get early access to review products", or the massive corporation throwing a hissy-fit over some imagined slight by a relatively small product review channel?
Exactly right!

I can see the problem, but who cares? Don't you guys have your own minds as consumers?
Sure people do, but none of us are psychic, clairvoyant or mind-readers. We cannot know how well tech products work until such performance is shown and demonstrated. This is why retail stores have display models, so people can see for themselves how a product works and what features it has to offer. Tech retailers don't have that level of display capability in the modern world any more, which is why tech reviewers on the web and Youtube exist: to show us all how things work so we can decide for ourselves. Independent reviewers are an integral and essential part of the tech industry. For you to quite casually make the above statement shows that you have a partial or complete lack of understanding of how this part of the industry works.
For me ray tracing is years from being useful.
Very ignorant opinion. It's more useful with an RTX 3000 card than it was with an RTX 2000, and it was very useful and impressive then. While AMD's first go at RTRT is behind NVidia's current offering, it's still respectable and useful. Ray-tracing is the future of lighting in games, full stop, end of discussion. Raster lighting has gone as far as it can go without shifting to light ray path traced methods, which is effectively ray-tracing in real-time.
 
What on earth are you on about? Not only is that statement completely nonsensical in this context; if you send out a product for review to the press, you have zero say over that review. Reviewers are not beholden to product makers, nor could they be, as that would make their reviews fundamentally untrustworthy, and thus worthless.

Or, if I give you a supercar that blazes on a straight road (RT, DLSS), but you live in the real world, which is mostly >95% city streets/roads aka rasterization (it should be higher than 95%, but keeping it conservative)... am I right to insist you drive only on highways and straight roads and ignore city streets and roads?

I've seen this argument so much, but when and where was it ignored? IIRC even W1zzard posts his ray tracing performance at the end on a single page, with only one or two RT games used. Is he a problem?

It wasn't ignored by any stretch of the term; in fact, it got its own video.

If they just want a PR release, they don’t need reviewers and can do it themselves. You take a chance when you ask for your product to be reviewed. As I said earlier, they can’t have it both ways.

Good thing it wasn't.

Sure, Nvidia pulled plenty of dirty tricks over the years. Sure, like the company itself, some of their PRs are bullies and assholes. Bryan Del Rizzo is probably the worst of them.

I don't follow YouTubers. I prefer written reviews. Never seen even a single Hardware Unboxed review… but according to their own tweet:

[screenshot of Hardware Unboxed's tweet attached]


There is no denial.

What the hell is "focusing on rasterization"? Ray tracing is the current biggest thing, and being pushed in all AAA titles. It is not an exclusive feature, it is being discussed for over a decade now.

If they want to stick to DOS gaming that's their business, but they cannot expect RTX samples. It's like letting a vegetarian review meat-based meals.

I hope they send him samples of GeForce 2 MX200 AGP in every launch from now on.
 
I don't follow YouTubers. I prefer written reviews. Never seen even a single Hardware Unboxed review… but according to their own tweet:

There is no denial.
Why are you posting when you have no idea what's going on? You haven't watched their videos. Their coverage of DLSS and RTX is very good, even quoted in nvidia marketing slides.
 
I don't follow YouTubers. I prefer written reviews. Never seen even a single Hardware Unboxed review… but according to their own tweet:

There is no denial.
They're accused of focusing on rasterisation, not admitting it, dammmn.

Why are you here if this means naught to you?! Why?

Who gives a shit if you only do written reviews?!

And are you blind to the fact that they also release every review in written, published form too?

Does that even matter?

The point is Nvidia are trying to guide the narrative of reviews, and that's not on.
 
There was something similar with one of the biggest IT Tech Sites in Germany (heise.de) in 2018.
When nVidia sent out a new pro forma NDA, heise.de refused to sign it and instead chose to publish the NDA in full length.
Heise.de has not gotten any samples from nVidia since that time and has therefore bought its review samples itself. Their reviews of nVidia products differ quite a bit from what you read everywhere else, to this day.

Didn't even hear about this, thanks for the heads up. For this reason and many more (GeForce Partner Program, hoarding the best-binned GPUs from AIBs and keeping them for their Founders cards, blatant price-gouging, advertising fake MSRPs, skimping out on VRAM, poor driver support and worse aging on any card older than 2 gens), I will not buy another Ngreedia card. I've already seen multiple videos on how badly old Nvidia cards have aged against the AMD cards they supposedly beat back in the day (see the 290X vs 780 Ti and how badly the Nvidia card has aged in comparison).
 
I don't follow YouTubers. I prefer written reviews. Never seen even a single Hardware Unboxed review… but according to their own tweet:

There is no denial.
You’re obviously operating in an echo chamber. You have no idea what is going on and your half-cocked comments show it. Go back to sleep, Rip Van Winkle.
 
If it's not true, why did they say:

"Their reasoning is that we are focusing on rasterization instead of ray tracing."
Because that's what Nvidia presented as their reasoning. It being a false claim just underscores the absurdity of this whole deal, but even then, attempting to restrict the editorial freedom of reviewers? That's mob tactics. If your products are good, reviews will make them look good, and you shouldn't have a problem. If your products are good, but you would like reviewers to parrot your PR lines instead of following their own priorities? Hire a bigger PR department, as you've fundamentally misunderstood what reviewers are and do.
 
If it's not true, why did they say:

"Their reasoning("Nvidia's") is that we("Hwub") are focusing on rasterization instead of ray tracing." = Allegation
That's what Nvidia accused Hwub of in the email.

Their. = Nvidia , that's not them admitting they're not focusing on rays(oops).

It's Hwub telling you what Nvidia said FFS.


Hwub have focused videos on ray's and includes ray's in all reviews now, they already did, just not enough for Nvidia.
 
Storm in a tea cup.
Whining because he wasn't going to get any more free samples.
NOTHING stopped him from buying product and then reviewing.

Except you know, income and economics. Doubt that's an issue for him but for many, it could be.
 
You and RandallFlagg both seem like you are trying to derail the thread with nonsense. Knock it off.

You know, after Valantar devolved into personal attacks with his nonsense post, where he also revealed his role as a "media analyst" aka social media monitor / propaganda artist / marketing lackey / messaging control, I was just not going to respond.

But now here you are, playing your childish game. You guys are the forum thugs here; every forum has them. He called Nvidia the media mob; what a hypocrite.

These are corporations and they don't play by your make believe double standards, they never have and they never will.

Nvidia's only mistake here was sending an email - they should have just kept their mouths shut - Like AMD mostly did.

Oh, didn't think about that?

AMD withheld the R9 Nano from Techreport and KitGuru. Quite likely others. Those are media outlets a few orders of magnitude bigger than that pea-shooter, two-guys-in-a-back-bedroom-with-a-camcorder HW Unboxed channel.

Doesn't fit with your media control narrative? So sad..

2015: AMD withholds samples from KitGuru:


And did the same thing to TechReport:


Oh heck, let's just tear them up. AMD bribed people to crash an Nvidia event. They still have these groups in operation. Team Red is a real thing. It is basically a propaganda / disinformation campaign, part of their marketing department. Email pic and article from Forbes below.

Now, this I would say, is playing dirty. Really unethical.

Are you dumb enough to think they have changed? Just a question.

Who is it Valantar works for now?

[Attachment: Amd_witholding_samples.JPG]
You know, after Valantar devolved into personal attacks with his nonsense post, where he also revealed his role as a "media analyst" aka social media monitor / propaganda artist / marketing lackey / messaging control, I was just not going to respond.

AMD withheld the R9 Nano from Techreport and KitGuru. Quite likely others.
Thank You for proving my point. Anything else you'd like to embarrass yourself with?
You know, after Valantar devolved into personal attacks with his nonsense post
He wasn't attacking you. He was pointing out the flaws in your logic and you were harassing him about it. Big difference there.
It is OK to disagree.
True. But they aren't disagreeing to make a point that has much merit, they're disagreeing and arguing to derail the thread. There's a difference.
 
Thank You for proving my point. Anything else you'd like to embarrass yourself with?


True. But they aren't disagreeing to make a point that has much merit, they're disagreeing and arguing to derail the thread. There's a difference.

The point is very simple but apparently beyond your cognitive ability.

Anyone who believes that big companies don't exert influence over first take reviewers is a fool, that would be you as you've demonstrated.

Nvidia's mistake was sending an email. They should have simply blacklisted them. If you think that doesn't happen all the time, like I said, you'd be a fool.

And no, you can't read reviewers' minds. You don't know what happens behind the scenes. Your assumptions aren't truth. You're not that smart.
 
The point is very simple but apparently beyond your cognitive ability.

Maybe we should all just bring this down a notch.
 