
nVidia Fanboy Here - Moving on from GTX 760 2GB - Should I go AMD?

Wait for AMD's new lineup and benchmarks; your CPU might be your biggest bottleneck if you were going for a 1070.
How about an RX 480 and more memory, or a faster/bigger SSD?

There are only 3 cards you should be looking at this summer:
  1. RX 480 ($199)
  2. GTX 1070 ($449)
  3. GTX 1080 ($799)
We all suspect AMD has yet another Polaris 10 card up their sleeve that is faster than the RX 480.
Simply wait a month or two; AMD is expected to release the RX 480 on the 29th of June, and there is also the PC Gaming Show on the 13th,
where AMD might reveal some more info on new cards.

AMD has been pretty tight-lipped on their cards this time, no big leaks at all.
Upgrading from a GTX 760 to an RX 480 or 480X will be a hell of an upgrade.
 
  • GTX 1070 ($449)
  • GTX 1080 ($799)
The 1080 is not $799; stop using third-party price gougers' listings. When the AIB card makers get their cards out, you can get a 1080 for pretty close to $600, and the 1070 will likely be around $400. The 1080 Founders Edition is $699, not $799.
 
After my experience with dual cards, I will never do it again. I will buy one really good card and use whatever money is left to buy whiskey.

absolutely the best quote in this thread
 
Similar query as the OP, and also a $400 budget: GTX 660 OC going to either a GTX 1070 or an RX 480 (or an RX 480X/490 if there is one). Either way, both will be a huge jump for me. Some say the 1070 will be overkill for my current 1080p HDTV and 900p monitor, but I would like to max out all details at those resolutions, and I don't intend to upgrade my GPU for years to come (I got my GTX 660 a few months after release way back), though I might go to a next-gen i5 or whatever the Zen CPUs bring next year.

Same thoughts about multi-GPU: I'd prefer a fast single card over SLI/CF, and my current board doesn't support either anyway.

Also, since I'm using a testbed-like layout on my current setup (Aerocool Dead Silence), I'm wondering if a blower-type cooler might be a better pick than the custom ones.
 
No, you are going to want to avoid CrossFire at all costs. Reason? Your motherboard, and AMD's decision to use the PCI-E bus for CrossFire data sent between the two cards. Sure, it technically supports CrossFire, but the slots will dramatically cripple CrossFire.
PCI-E cripples CrossFire? Why would you say this when it is totally untrue?
I ran benches only last week as I was flashing my 290X CrossFire cards with different custom BIOSes I had altered. I ran the Heaven bench maxed out with 8xAA, extreme tessellation, etc.
A single card was 57 fps. Put the other card in and ran it again: 112 fps. Yeah, that sure is crippled, right?
As for the original question: I would go single card if possible, but I also reckon waiting a few more months to see what AMD Vega brings would be a wise move. DX12 and Vulkan will more likely favour AMD hardware, since those APIs are Mantle-based.
 
absolutely the best quote in this thread


It's obvious that one should always buy the single fastest card one can afford. The problem is: what do you do when the single fastest card isn't fast enough? The question of single vs dual is simple at first, but not when you get into high-resolution setups. And now we have 4K 144 Hz panels coming out soon; the bar just got raised again.
 
PCI-E cripples crossfire? Why would you say this when it is totally untrue?

Yes, a PCIe 2.0 x4 slot provided by the chipset as OP has will cripple crossfire.
 
Yes, a PCIe 2.0 x4 slot provided by the chipset as OP has will cripple crossfire.


I think that's obvious even without this thread, but some have been using that as a reason to take a dig at AMD for using the PCIe bus to perform CrossFire. I'm pretty sure the poster you are quoting is talking about that, not x4 slots off the chipset.
 
Price/perf of the cards, based on current leaks/lowest prices (from the ng thread); note how the 1070 is roughly on par with the 390/970:

[image: Y1tTbJ5.png — price/performance chart]
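For what it's worth, the perf-per-dollar math behind a chart like that is trivial to redo yourself once real reviews land. A quick Python sketch; the prices and relative-performance numbers below are rough placeholders pulled from the guesses in this thread, not measured data:

```python
# Hypothetical price/perf comparison. Prices (USD) and relative
# performance (GTX 970 = 100) are illustrative guesses, NOT benchmarks.
cards = {
    "GTX 970":  (280, 100),
    "R9 390":   (290, 103),
    "RX 480":   (199, 105),   # leaked price, assumed perf
    "GTX 1070": (449, 160),
    "GTX 1080": (699, 200),
}

# Sort by performance per dollar, best value first.
by_value = sorted(cards.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True)
for name, (price, perf) in by_value:
    print(f"{name:9s} perf/$: {perf / price:.3f}")
```

With these placeholder numbers the RX 480 comes out well ahead on value, which is exactly why everyone is telling the OP to wait for its launch.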


It should be an obvious red flag to most that AMD uses just ONE title, Ashes of the Singularity, to showcase 480 CrossFire performance.
Nah, they leaked Doom performance, for instance, and in the show they've shown how the 480 runs other games.

They used only one title to take on the 1080.

It's their own fault.
Yeah, why would you believe them, when there is the other camp, with the wooden screws; those are surely to be trusted.
 
Price/perf of the cards, based on current leaks/lowest prices (from the ng thread); note how the 1070 is roughly on par with the 390/970:

[image: Y1tTbJ5.png — price/performance chart]
Nah, they leaked Doom performance, for instance, and in the show they've shown how the 480 runs other games.
Without any independent testing, this graph is pretty much worthless. Their "leaked" Doom performance doesn't say much either, since we don't know much about the settings and such. And as the other poster said, the AotS performance can just be tossed in the trash.
 
I haven't read the whole thread, but on what resolution are you playing? 1080p or do you plan on playing on three monitors? If just 1080p, I would honestly just wait for the 480/s and see how that does.
 
Without any independent testing, this graph is pretty much worthless.
No, definitely not worthless (especially the Pascal vs previous-gen part), merely unconfirmed for the 480.
Oh, and in this graph the 480 is assumed to be C4-ish, so between the 970 and 980, not between the 980 and Fury.
 
This thread interests me. As an R9 290 CrossFire user, I am beginning to regret it, what with the shoddy drivers on AMD's side and game makers stupidly siding with one camp (e.g. Tomb Raider, The Witcher 3).

Since it's not mentioned: does one GTX 1080 wreck an R9 290 CrossFire setup? Anyone know?
 
AMD has sucked for a really long time

Hmm? The 7870 GHz Edition, the 7970? The R9 290/290X? The R9 390/390X? (Fury was a mild disappointment, but it was still a fine card...)

I mean, my current 980 does not offer me anything more than my previous 290 did... although I did get the 290 for $150, while the 980 I have is priced at $620 where I am... and I don't even consider a 1070 a tempting upgrade (let alone the 1080).


Lucky that I am on neither side.

The 1080 is not $799; stop using third-party price gougers' listings. When the AIB card makers get their cards out, you can get a 1080 for pretty close to $600, and the 1070 will likely be around $400. The 1080 Founders Edition is $699, not $799.

Haha... a non-Founders 1080 is 895 CHF where I am, and Founders cards are 759 CHF ($925.22 and $784.63 respectively). The price intended by NVIDIA (i.e. the new $699 reference, not $599, since the reference cards are $699) will never be seen by end consumers; rather, expect the prices I stated above.

Since it's not mentioned: does one GTX 1080 wreck an R9 290 CrossFire setup? Anyone know?
Probably... but it would be better if another CFX user answered (strangely, I suspect they will have a different experience and opinion). Also: is it needed? Even a solo 290X is still plenty for now and probably for a while yet (so is a 980, etc.).

What I mean is, depending on the price you paid, if you aren't the "OEMEGEE I want the latest cardz, older ones are sh*t no matter the price" type and your current setup gives you satisfactory results, there is nothing to worry about (if I hadn't got my 980 I would still be using my 290, probably until the Vega release).
 
Yes. CrossFire is giving me a headache and I am going back to a single card (AMD CrossFire makes me not want to do it again). I am not sure about SLI, but do people have the same issues? (Flickering, frame stuttering, etc.)
 
does 1 GTX1080 wreck a R290 x-fire?
If CF works, they'd be roughly the same; if not, then of course.
Think of the 1080 as a well-OCed 980 Ti.
 
Yes. CrossFire is giving me a headache and I am going back to a single card (AMD CrossFire makes me not want to do it again). I am not sure about SLI, but do people have the same issues? (Flickering, frame stuttering, etc.)
I had SLI (no CFX, despite my "both brands" history), and a single card was better in every case... SLI and CFX are not, IMHO, reliable tech. That's mainly the fault of game developers, not "shitty AMD drivers"... NVIDIA is far worse on the driver side, and I speak from personal experience (and I only mean the driver side, not multi-GPU support... SLI tends to be a little more viable than CFX, but that was quite some time ago).
 
I'm pretty sure the poster you are quoting is talking about that and not x4 slots off the chipset.

Isn't the OP's board using an x4 slot for the CrossFire setup? That's all it has, right?

but it would be better if some other CFX user would answer

I have R9 290X Sapphire Tri-X cards in CrossFire. My best 980 Ti (Gigabyte G1 Gaming Edition) beats them in a lot of benchmarks, but they all play games about the same (fast and smooth).
CrossFire 290X cards are a decent solution while waiting for new products to arrive.

This is a Heaven result for two Sapphire Tri-X R9 290Xs in CrossFire (both running at x8 speed on the PCI-E bus):

[image: 290 Crossfire Max.JPG — Heaven benchmark result]
 
Isn't the OP's board using an X4 slot for the crossfire setup? That's all it has, right?



I have R9-290X Sapphire Tri-X cards in crossfire. My best 980Ti (Gigabyte G1 Gaming Edition) beats them in a lot of benchmarks, but they all play games about the same. (fast and smooth)
Crossfired 290X cards are a decent solution while waiting for new products to arrive.


Yeah, which is why CrossFire/SLI is a terrible idea for him and not recommended on that point alone. Speaking of which, it's probably time for the OP to move up to a better foundation.
 
The only rumors are the ones put out by AMD, so take those with a grain of salt. Independent reviewers will likely see two 480s at around the same performance as a 1070. If you get a 1070 for $400, then you can have two 4 GB 480s at that price; getting the same 8 GB could be $100 more for the 8 GB premium. The other thing is 300 W vs 150 W draw.

1) The RX 480 won't go on sale at $250 but at $230, according to all the info (not rumours).

2) TDP isn't power consumption. The RX 480 is more likely to consume 110-130 W, judging by its die size and 14 nm process along with the 1266 MHz clock. If OCed by much it will consume more, but it will go up against costlier, stronger GPUs for a fraction of their price.

So, let's wait for a review to judge the RX 480, eh? And we should hope for it to be the best GPU ever in its class, as that would lower the price of the more powerful cards.
 
Hmm, thanks for the feedback. It literally tears my heart apart to see The Witcher 3 item icons flickering non-stop, which annoys the crap out of me. And it being an "NVIDIA" game with their stupid GameWorks and whatnot, I still can't understand why developers side with one company to create a stink-hole for the other fanboy camp... I mean... seriously? What era are we living in?
 
I would wait until the RX 480(X) is actually released and benchmarked before dropping any money on a new graphics card. I was very impressed when I went from a 760 to a 970, and the 480 is supposed to exceed the 970's performance, so that could be a very attractive option for $200.

As for the GTX 1070, you know exactly what you're getting: 980 Ti-beating performance for much less cash and much less heat. Given that, I foresee the 1070 being in very short supply for quite some time after launch, which will push the price up enough that $400 may not cover it.

I really don't want to change motherboards since I'd have to reinstall my OS and I really don't want to do that

You don't. My current Windows 7 install started out life on an ASRock Z77 board, then I went to a Gigabyte board, then to a different ASRock board, and finally to the current Gigabyte board I have. As long as you keep the same chipset, your vital BIOS options like the RAID config are the same on both old and new boards, and you plug the same SATA cables into the same ports on the new board, everything should just work. There will probably be some driver installs necessary, and Windows will require you to reactivate, but that's literally it.

(For the record, my current Windows 7 install has remained the same over four different motherboards, two different CPUs, three different boot hard disks/SSDs (disk cloning FTW), and five different graphics cards, and it remains rock solid. The only instability/blue screens I've had were when I clocked the CPU or graphics card too far.)
 
I think that's obvious even without this thread, but some have been using that as a reason to take a dig at AMD for using the pcie bus to perform crossfire. I'm pretty sure the poster you are quoting is talking about that and not x4 slots off the chipset.

That is why I specifically said "Sure, it [the motherboard] technically supports CrossFire, but the slots will dramatically cripple CrossFire."

I made it very clear in my first post that the issue was the motherboard and its poor slot arrangement, and that CrossFire communication over the PCI-E bus just makes that issue worse; the issue is the motherboard and its x4 2.0 slot.
 
Does anyone know how much bandwidth crossfire communication uses? (just wondering)
 
Does anyone know how much bandwidth crossfire communication uses? (just wondering)

That's hard to tell, but testing shows there is no loss. IMO, there are probably more pressing things to worry about than lamenting AMD's XDMA engine.

True to their promises, AMD has delivered a PCIe based Crossfire implementation that incurs no performance penalty versus CFBI, and on the whole fully and sufficiently resolves AMD's outstanding frame pacing issues.

http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/4
https://community.amd.com/community/gaming/blog/2015/05/11/modernizing-multi-gpu-gaming-with-xdma
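Since nobody posted actual numbers, here's a back-of-the-envelope sketch of the bandwidth question. All figures are rough assumptions (finished-frame copies aren't the only XDMA traffic, and real transfers are driver-managed), just to show the scale involved:

```python
# Rough estimate: how much of a PCIe 2.0 x4 link would AFR frame
# transfers eat? Assumptions, not measurements.

lane_bw_mb_s = 500                 # PCIe 2.0: ~500 MB/s usable per lane after 8b/10b
link_bw_mb_s = 4 * lane_bw_mb_s    # x4 slot off the chipset: ~2000 MB/s

# In alternate-frame rendering, the secondary card ships its finished
# frames to the primary card for display. Assume 1080p at 32-bit color
# and 60 fps total, so the secondary sends ~30 frames per second.
frame_mb = 1920 * 1080 * 4 / 1e6   # ~8.3 MB per frame
xfer_mb_s = frame_mb * 30          # ~250 MB/s of frame traffic

print(f"link: {link_bw_mb_s} MB/s, AFR frame traffic: {xfer_mb_s:.0f} MB/s "
      f"({100 * xfer_mb_s / link_bw_mb_s:.0f}% of the x4 link)")
```

By this crude math the frame copies alone fit comfortably in an x4 2.0 link at 1080p, though higher resolutions, inter-frame resources, and sync traffic all add on top, which is where the earlier x4-slot concern comes from.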
 