
Some people accuse TPU of 'bias' towards Nvidia.

I wouldn't say AMD shines; I would say the idea behind DX12 can utilize a design that isn't really targeted directly at gaming.

Not trying to be a jerk, but what really is the difference? When DX12 games are on hand, AMD generally does much better. Much of the time, even better than their NV contemporaries. If that isn't shining, I don't really know what it is. Compared to AMD's DX11 performance, DX12 is clearly shining.

At the end of the day, I'm still trying to work out what is so special about DX12
It is supposed to let the developers get closer to 'bare metal'. That is, it is supposed to get rid of all the overhead of Microsoft's bloated, slow API calls. In theory, this should allow developers to seriously optimize their stuff for better performance. You can see that it works when you look at AMD's performance jump. They have all that 'theoretical, extra horsepower' that gets all clogged up in fat DX11 calls but is allowed to breathe in DX12.
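
To make the 'closer to the metal' idea concrete, here is a minimal sketch of the D3D12 submission model (illustrative only, not from any game in this thread; assumes the Windows 10 SDK). The application creates its own command queue, records command lists, submits them explicitly and synchronizes with a fence, work the D3D11 driver does implicitly behind every API call:

```cpp
// Minimal D3D12 submission sketch (illustrative only; assumes the Windows 10 SDK,
// link against d3d12.lib). Error handling omitted for brevity.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main()
{
    // Create the device on the default adapter.
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // The application owns the submission queue; in D3D11 the driver hides this.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // Command memory and a command list; a real engine keeps one set per thread
    // so recording can be spread across CPU cores.
    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&allocator));
    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator.Get(), nullptr, IID_PPV_ARGS(&list));

    // ... record state changes and draw calls here; the runtime does almost no
    // per-call hazard tracking, which is where the CPU overhead savings come from ...
    list->Close();

    // Explicit, batched submission of everything recorded above.
    ID3D12CommandList* lists[] = { list.Get() };
    queue->ExecuteCommandLists(1, lists);

    // The application also owns CPU/GPU synchronization via fences.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence.Get(), 1);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);
    return 0;
}
```

None of this makes a GPU faster by itself; it just moves the scheduling and validation cost out of the driver and into code the developer has to write and tune.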

However, what we get is a bunch of lazy developers/publishers who don't want to go that route and get down and do what it takes to program at the lower levels, i.e., it costs more and takes longer. The farther abstracted you get from machine language, the lower the performance. That is why applications that need to be the fastest are written at the lowest levels.

This is why I see problems with RTX acceptance. Besides console adoption, if it is not stupid easy and doesn't have high returns, it isn't getting done.
 
I see your point. I guess what leaves me off-put is the constant comparison to Nvidia. Which is really my problem. I have no interest in Nvidia or their products...for personal reasons. Thus...I don't really care how other products stack up compared to theirs. I don't care how theirs perform. I don't care how much theirs cost. I don't care if for the same price theirs is better. I don't care if for a lesser price theirs is better. I JUST DO NOT CARE ABOUT NVIDIA. But I realize other people do. And that for a review, something usually needs to be compared to something else to make certain points about it. I just wish it wasn't always a comparison to some Nvidia something. Is that sensible or reasonable? Probably not. Nonetheless...I don't like it.

Long story short...it's me. Not you. And I know that. But I still gotta be me.


I care generally first about price to performance. Most people generally do. Could reviews be restructured to only include results from Nvidia cards in an Nvidia card review and only results from AMD cards in an AMD card review? Sure. I just don't see it being a thing, as most people want that data aggregated together. As for TPU's review... it was alright but left me with some questions for W1zz.

1. Did you experience any of the buggy drivers that the other reviewers dealt with?
2. Would you consider revisiting the card if EKWB or other manufacturer did a full cover water block for the card? (to see if it helped both stock and OC results..)

As for Vega II... I just can't imagine being a reviewer and recommending it. Unless they can work out a deal with SK/Samsung on HBM2 costs and get a smaller PCB and maybe do a FuryX/Vega64 LC style card to keep thermals in check? I just don't see Vega II surviving as an air-cooled card.

Edit-

It's why I bought the Vega64 Liquid. @ $525 brand new, I couldn't pass it up. Its price to performance was better than anything else in that performance bracket.

I could have sold it 4 months later for $1200 but I kept it, 'cause I was happy with it. Drivers have matured. It's a great-looking card.
 
That is also the problem that swings the other way: developers can be lazy, but equally in their defence they can't squeeze out all the performance that, say, NV can.

Quantum Break was a Windows Store DX12-only game, but when it went to Steam, the DX11 version performed far better... whilst looking identical, of course.

https://www.computerbase.de/2016-09/quantum-break-steam-benchmark/3/

Because one benefits doesn't mean the other should lose, hell if W1zz isn't busy enough he should bench both. :P
 
That is also the problem that swings the other way: developers can be lazy, but equally in their defence they can't squeeze out all the performance that, say, NV can.

Quantum Break was a Windows Store DX12-only game, but when it went to Steam, the DX11 version performed far better... whilst looking identical, of course.

https://www.computerbase.de/2016-09/quantum-break-steam-benchmark/3/

Because one benefits doesn't mean the other should lose, hell if W1zz isn't busy enough he should bench both. :p

NV should benefit as well, it is just that AMD will/does benefit more based on their architecture. I have no idea how Turing fares in this, but this was the case for Maxwell and possibly Kepler as well. I haven't really been paying attention as I gave up on DX12 mostly.

I am actually continually impressed at how much NV continues to squeeze out of DX11. I just wonder when they won't be able to squeeze out anymore.

I care generally first about price to performance. Most people generally do.

I hope not, because that metric is really only good for comparing cards in the same bracket. There are only two possible metrics that really matter: price and performance. Cards can have great price to performance and great performance, but if you can't afford it, what good is it? Conversely, a card can have great price to performance and a great price, but if it doesn't have the performance you need, what good is it?
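
As a toy illustration of that point (hypothetical card names and made-up numbers, not review data): price and performance act as separate gates, and performance per dollar only ranks whatever survives both.

```cpp
// Toy illustration (made-up numbers, not benchmark data): filter by what you can
// afford and by the performance you need first, then rank the survivors by
// performance per dollar.
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

struct Card { std::string name; double price_usd; double avg_fps_4k; };

int main()
{
    std::vector<Card> cards = {            // hypothetical figures
        {"Card A",  699.0, 60.0},
        {"Card B",  499.0, 48.0},
        {"Card C", 1199.0, 75.0},
    };
    const double budget     = 750.0;       // "can I afford it?"
    const double fps_needed =  50.0;       // "does it perform enough?"

    std::vector<Card> shortlist;
    for (const auto& c : cards)
        if (c.price_usd <= budget && c.avg_fps_4k >= fps_needed)
            shortlist.push_back(c);

    // Perf per dollar only decides between cards that already passed both gates.
    std::sort(shortlist.begin(), shortlist.end(),
              [](const Card& a, const Card& b) {
                  return a.avg_fps_4k / a.price_usd > b.avg_fps_4k / b.price_usd;
              });

    for (const auto& c : shortlist)
        std::cout << c.name << ": " << c.avg_fps_4k / c.price_usd
                  << " fps per dollar\n";
}
```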
 
NV should benefit as well, it is just that AMD will/does benefit more based on their architecture. I have no idea how Turing fares in this, but this was the case for Maxwell and possibly Kepler as well. I haven't really been paying attention as I gave up on DX12 mostly.

I am actually continually impressed at how much NV continues to squeeze out of DX11. I just wonder when they won't be able to squeeze out anymore.

Well they should, but as has been mentioned many times before, Nvidia are far better at maximizing their hardware; they don't need DX12 to fully exploit their hardware as it is already being fully utilised. As you say, their DX11 path is honed to perfection; devs manage to get their hands dirty in DX12 and reduce performance more often than not.

On paper AMD often offer far more TFLOPS, but in reality they still end up losing.
 
I'm sure AMD would have loved to release this card in 2016, but they would have needed a time machine to get the tech required to make this card back then.

If anything this is a quick-money card, which would only be better at $460-500, but profits are higher at $700. So if anything, lower the cost in a few months.
 
The only thing that I can attribute to bias is whatever gimmick Nvidia is currently running being a perpetual 'con' at the end of AMD card reviews, regardless of the number of game titles, if any, that actually support the feature, e.g. no PhysX then, and no RTX today.
 
If anything this is a quick-money card, which would only be better at $460-500, but profits are higher at $700. So if anything, lower the cost in a few months.

Hmm, not what I've been reading. Lots of "insider sources" claiming that Radeon VII costs about exactly as much to make as it sells for. I mean the VRM and RAM components alone cost something like $400 if these sources are to be believed.
 
Did anyone here other than myself get the AMD email in regards to the Radeon VII? Link below the 4K 144Hz monitor.

4K60 Support

The older DisplayPort 1.2 is capable of 3840×2160 (4K) at 60 Hz, or 1080p at 144 Hz. DisplayPort 1.3, announced in September 2014, is capable of 8K at 60 Hz or 4K at 120 Hz! ... DisplayPort can run multiple monitors from a single cable: you can use hubs or displays that support daisy chaining. (Jun 2, 2015)

If so, check one spec: never mind the memory, 7nm & so on, rather the 4K performance. One should expect more than that from a card with such advancements, & come 2-3 more years, the market will be saturated with 4K monitors that'll run at 144Hz, or do 120FPS in games. Here's one & it's not exactly new.

https://www.theverge.com/circuitbre...c-hdr-4k-144hz-gaming-monitor-announced-specs

https://www.amd.com/en/products/gra...pJobID=1460608619&spReportId=MTQ2MDYwODYxOQS2

The DisplayPort 1.4 spec's 8K is the same as 1.3's; '4K60' sounds more like DP 1.3. There are 8K monitors on the market; I wonder how the Radeon VII will benchmark on one? Looking at Newegg, these cards are all sold out at $699.99 a pop. I don't see full DP 1.4 specs, nor should we expect it at that price. DP 1.4 monitors finally arrived for mainstream users in 2018; expect GPUs to be tested to the limit.

https://www.newegg.com/GraphicsCardsPromoStore/EventSaleStore/ID-2041834

DP 1.4 specs, to include support for 8K at 60Hz.

https://arstechnica.com/gadgets/201...an-drive-8k-monitors-over-a-usb-type-c-cable/
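
For a rough sense of why the exact DisplayPort level matters, here is a back-of-the-envelope check (my own numbers, ignoring blanking overhead and assuming 8 bits per colour channel): 4K60 fits comfortably within DP 1.2's HBR2 link, while 4K144 RGB exceeds even the HBR3 rate used by DP 1.3/1.4 unless chroma subsampling or DSC is used.

```cpp
// Back-of-the-envelope DisplayPort bandwidth check (ignores blanking overhead).
// Usable link rates after 8b/10b coding: HBR2 (DP 1.2) ~17.28 Gbit/s,
// HBR3 (DP 1.3/1.4) ~25.92 Gbit/s.
#include <cstdio>

// Uncompressed video data rate in Gbit/s for an RGB signal.
double gbps(int width, int height, int refresh_hz, int bits_per_pixel)
{
    return double(width) * height * refresh_hz * bits_per_pixel / 1e9;
}

int main()
{
    const double hbr2 = 17.28, hbr3 = 25.92;
    double uhd60  = gbps(3840, 2160,  60, 24);   // 8 bpc RGB
    double uhd144 = gbps(3840, 2160, 144, 24);

    std::printf("4K60  needs ~%.1f Gbit/s (fits HBR2: %s)\n", uhd60,  uhd60  <= hbr2 ? "yes" : "no");
    std::printf("4K144 needs ~%.1f Gbit/s (fits HBR3: %s)\n", uhd144, uhd144 <= hbr3 ? "yes" : "no");
    // 4K144 RGB exceeds even HBR3, which is why such monitors lean on 4:2:2
    // chroma subsampling or DSC (a DP 1.4 feature) to reach 144 Hz.
    return 0;
}
```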

I believe that NVIDIA will beat AMD to the punch here & am not biased, rather telling it like it is. Give AMD credit for keeping NVIDIA from charging $1,000+ for RTX 1070's, which is a winner for all, yet they're playing by (or writing) their own rules. This Radeon VII release is not DP 1.4 compliant by any stretch of the imagination.

Cat
 
Hmm, not what I've been reading. Lots of "insider sources" claiming that Radeon VII costs about exactly as much to make as it sells for. I mean the VRM and RAM components alone cost something like $400 if these sources are to be believed.

I will pm you a video
 
One of the only reasons I still even casually hang around here is that the reviews are regularly top notch.

As far as I can tell, there is no bias, only angry users with an axe to grind one way or another for their favorite brand.

I will pm you a video

I love how you just PM things to avoid general discussion as of late.
 
Well they should, but as has been mentioned many times before, Nvidia are far better at maximizing their hardware,

Nvidia's software profiling lab & devrel setups are second to none.

they don't need DX12 to fully exploit their hardware as it is already being fully utilised.

It's more accurate to say NV's uarch benefits less under DX12/Vulkan, but that doesn't mean it's necessarily slower in all DX12 games.

As you say their DX11 path is honed to perfection, devs manage to get their hands dirty in DX12 and reduce performance more often than not.

Funny, I thought we were playing the dev's games - not the IHV's. ;) DX12 needs to be built into the engine from the ground up to benefit, which means high man hour cost for game engine devs that studios don't want to risk.

On paper AMD often offer far more TFLOPS, but in reality they still end up losing.

Which games are ALU limited, & for what portion of a frame? How does geometry/pipeline/back-end compare? It should be quite clear to most that Turing has a better uarch, with a much better cache pipeline resulting in fewer stalls/bubbles & better bandwidth & memory utilisation with seemingly lower peak resource values. It's also easier/more flexible to pack in more ALUs rather than fixed-function logic. The tradeoff (in part) is silicon area/cost.

Incidentally, whatever happened to FCAT and TWIMTBP?
 
What's wrong with AMD drivers? Much better than NVIDIA's IMO; the rest is par for the course.

Try installing AMD's latest 'recommended' drivers on an MSI Radeon 7770 GHz Edition & the driver issues will be seen fast. :)

This is why I stick with 15.7 on Windows 7 & whatever Microsoft throws at me on Windows 10. The last time I updated to the latest, it was a disaster. Had to use the DDU tool from here & re-update Windows 10 to their recommended driver; now I can get a bit of an overclock w/out crashing. Maybe on newer cards these perform better; mine was from 2012.

Maybe it's luck; on Windows I have yet to receive a bad NVIDIA driver package. Unfortunately I can't say the same for Linux, where I've stopped trying to add a PPA for the latest. This means I'm like 7-8 major driver releases behind, yet as long as these work (just like the older AMD ones on Windows), I'm good. Newer drivers aren't always better & this applies to much of a computer's hardware.

Cat
 
New USA, get butthurt over everything, SMFH!
 
Did anyone here other than myself get the AMD email in regards to the Radeon VII? Link below the 4K 144Hz monitor.

If so, check one spec: never mind the memory, 7nm & so on, rather the 4K performance. One should expect more than that from a card with such advancements, & come 2-3 more years, the market will be saturated with 4K monitors that'll run at 144Hz, or do 120FPS in games. Here's one & it's not exactly new.

https://www.theverge.com/circuitbre...c-hdr-4k-144hz-gaming-monitor-announced-specs

https://www.amd.com/en/products/graphics/amd-radeon-vii?utm_source=silverpop&utm_medium=email&utm_campaign=38807948&utm_term=btn_ Learn More&utm_content=global-general-product-technology-2019Feb-RadeonVII-en (1):&spMailingID=38807948&spUserID=ODQ0MTcwODE0MjQ1S0&spJobID=1460608619&spReportId=MTQ2MDYwODYxOQS2

The DisplayPort 1.4 spec's 8K is the same as 1.3's; '4K60' sounds more like DP 1.3. There are 8K monitors on the market; I wonder how the Radeon VII will benchmark on one? Looking at Newegg, these cards are all sold out at $699.99 a pop. I don't see full DP 1.4 specs, nor should we expect it at that price. DP 1.4 monitors finally arrived for mainstream users in 2018; expect GPUs to be tested to the limit.

https://www.newegg.com/GraphicsCardsPromoStore/EventSaleStore/ID-2041834

DP 1.4 specs, to include support for 8K at 60Hz.

https://arstechnica.com/gadgets/201...an-drive-8k-monitors-over-a-usb-type-c-cable/

I believe that NVIDIA will beat AMD to the punch here & am not biased, rather telling it like it is. Give AMD credit for keeping NVIDIA from charging $1,000+ for RTX 1070's, which is a winner for all, yet they're playing by (or writing) their own rules. This Radeon VII release is not DP 1.4 compliant by any stretch of the imagination.

Cat
By that same token, Maxwell cards weren't HDMI 2.0 compliant either, though we saw the same bandwagon effect claiming they were.
Truth is, AMD just claimed the 'VRR' HDMI designation, so it is Nvidia who has to open up support. I believe there are big market trends at play here, so DP 1.4 will never be in TVs, for instance. It is for desktop displays only and not as integral to the industry as AMD sees it.
PS: Also, selecting variables is the same as cherry-picking which results come out on top, and between these two it is "Intel versus AMD", so let's not attach bias to any party. It is so by designation.
 
Not trying to be a jerk, but what really is the difference? When DX12 games are on hand, AMD generally does much better. Much of the time, even better than their NV contemporaries. If that isn't shining, I don't really know what it is. Compared to AMD's DX11 performance, DX12 is clearly shining

The idea is that it shows AMD shining. It doesn't show that; it shows that Nvidia cards are good in all aspects of gaming. There shouldn't be sudden massive gains purely because someone else developed a workaround for your product not being well rounded.
 
I'm with the rest of the oldschool gang who come from a time when people thought TPU was ATi biased. I even thought so at the time... not as much as to think that TPU were fudging reviews to make ATi look better than they were (or make nVidia look worse, etc), but it just seemed that ATi got more attention, with all the softmods, and, of course, ATiTool... though nVidia cards got attention, too. Perhaps more things were possible with ATi cards at that specific time, so that's why it looked that way to me?

Anyways, biased is one thing TPU is not. W1zzard can't help it if the numbers favor nVidia. They have the superior product. Somebody out there can't see through their own bias, so they point the bias finger at TPU. :ohwell:
 
There shouldn't be sudden massive gains purely because someone else developed a workaround for your product not being well rounded.

I don't disagree, but we're talking about semantics.
 
New USA, get butthurt over everything, SMFH!
Pretty sure it's an international forum that happens to use English. Perhaps that makes it confusing, and that's why you made your snide comment about a butthurt USA, which has nothing to do with the topic?
 
Pretty sure it's an international forum that happens to use English. Perhaps that makes it confusing, and that's why you made your snide comment about a butthurt USA, which has nothing to do with the topic?

Yeah, his comment is irrelevant to this topic. I think he has been drinking vodka again.
 
If we could get over the preponderance of Intel CPUs in testing, I believe we would shed all doubt on this matter, because let's face it, the differences are not so virtual. I appreciate TPU holding its own and not delving into the mainstream, but you've given past guidance on hardware as a whole. It is not like PC hardware is stopping any time soon. Demonstrability is great on a single system, but what about representability? A CPU that had a reputation for supply challenges is the basis point? I suppose not - not for everyone.
 
I still think the 1080Ti is the best buy, just not willing to throw that kind of money away.
Weeeellll, situation dependent... a 1080Ti is still 900+ CHF for me (a 2080 likewise) and the RVII is positioned as it should be, between these two and the 2070 (~700 CHF), and performance-wise at 2K it is exactly where it is...

For me it would be an RVII instead of a 2080 or 2070.

As for the bias... either I see none, or I don't really care about it if there ever was one...

eh? wth?
Vodka is made from potatoes
Not only... some are made from grains... but I doubt potatoes would relate vodka to Idaho... ah well, whatever...
 