
Six Year Old GTX 1060 Beats Intel Arc A380, GeForce GTX 1630 and Radeon RX 6400, Wins TPU popularity contest

A GTX 1080, or even more so a GTX 1080 Ti, is juuuuuuuuust fine for a 2022 build, not even shabby. I think it will still be fine in 2023. The people who bought one at launch (and still have it) made a really good investment.

I disagree. Pascal owners love their cards for good reason, but I keep seeing people swearing by these ancient graphics cards as if they were something special. You simply need to experience Ampere if you think that's something to write home about. An RTX 3050 is more than an adequate match for a GTX 1070, maybe even a 1070 Ti, and that's in the games that Pascal can run decently (DX11).

I know I may sound harsh calling Pascal ancient, but it really is an architecture from 2016. That's six years ago now, more than half a decade. To put that into perspective, that's the year the iPhone 7 and the Galaxy S7 launched. See how those two mighty flagships compare to even a midrange phone like a Galaxy A33 5G today? Tech has moved forward so much, and I am all too happy to acknowledge Pascal's age. Some people aren't willing to, but then again, many would defend using Windows 7 in the current year, and I think that is completely :kookoo: myself.

End of the day, you know what's best for your own personal needs. I can respect that. But I personally won't recommend a Pascal architecture graphics card for a modern build.
 
I just sold a GTX 1080 for £200. It's better than anything else you can buy for £200 right now, so as far as I'm concerned the buyer got a good deal. 3050 is almost 50% more than that and 3060 is almost double.

If it were me buying a new card right now I'd be grabbing a new RX 6600 8GB; you can find them in the UK for £250 and they're far and away the best performance/£ of anything new right now. They're not even 25% faster than a 1080 on average though, so a £200 GTX 1080 is still a reasonable buy if you either don't have a £250 budget, or if you specifically want an Nvidia card (perhaps for the drivers, the encoder, CUDA support, whatever...)
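For what it's worth, here's a rough back-of-the-envelope way to look at the performance/£ argument. The prices are the ones from this post; the relative performance figures are illustrative placeholders (GTX 1080 = 100), not benchmark results, so swap in numbers from whatever review you trust.

Code:
// Back-of-the-envelope performance-per-pound comparison.
// Prices are the ones quoted above; the relative performance index
// (GTX 1080 = 100) is a hypothetical placeholder, not benchmark data.
#include <cstdio>

struct Card {
    const char* name;
    double priceGBP;   // approximate street price
    double relPerf;    // hypothetical index, GTX 1080 = 100
};

int main()
{
    const Card cards[] = {
        { "GTX 1080 (used)", 200.0, 100.0 },
        { "RX 6600 8GB",     250.0, 115.0 },  // "not even 25% faster than a 1080"
        { "RTX 3050",        290.0,  80.0 },  // "almost 50% more" than the 200 quid
        { "RTX 3060",        390.0, 120.0 },  // "almost double" the 200 quid
    };

    for (const Card& c : cards)
        std::printf("%-16s %6.0f GBP   perf %5.1f   perf/GBP %.3f\n",
                    c.name, c.priceGBP, c.relPerf, c.relPerf / c.priceGBP);
    return 0;
}

With placeholder numbers like these, the used 1080 comes out ahead of everything and the RX 6600 leads among the new cards, which is roughly the point I'm making above.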
 

That may be so, but at the same time, the buyer probably isn't going after state-of-the-art technology, which is largely the point of my argument in favor of the Arc GPU. It's the one with the freshest technology in it, even if the drivers are not yet ready. These things aren't developed overnight, and I think it's quite unfair to measure this infant product stack against the decades of driver expertise you will find in both NV and AMD GPUs right now.

Other than that, both Ampere and RDNA 2 products are on the same list, and at this kind of price point there are things I personally value more than raw power. But you do you, fellas; that's the beauty of the PC. At the end of the day, what makes it special is that you can tailor it to your own needs.
 
I agree with you about the freshest tech and feature set, but I don't think Intel's 1st-gen GPUs are the ones to buy for that. By the time their drivers are in a decent state, and by the time DX9 and DX11 performance is less relevant, this first-gen will already be retired/obsolete.

We know there are uncorrectable errors in the silicon itself (Intel admitted as much), and I believe their second-generation attempt will be the one to seriously consider. They're still new at dGPUs, and this first gen is full of mistakes and things that qualify as "trial and error" lessons.

We just have to hope that Intel persists with this loss-leader and makes it to a second generation. The fear is that it won't turn an immediate profit and so the stupid board of directors will just bow to shortsighted shareholder pressure to can the whole GPU lineup. Intel's expertise and in-house fabs could make Intel the #1 player in time, but it could easily take a decade for that to happen, if it happens at all. They have the money and they have the supply chain; all they need is commitment and persistence.
 

Agreed, it is a bumpy start, but it is a start nonetheless. One must crawl before they can walk. I am sure many people thought the same when the GeForce 256 and the Radeon launched back in 1999-2000, or when unified shader cards launched with the G80 in 2006. New tech often comes with big changes, and change can understandably be scary.

RDNA 2 is by all means a polished and well maintained architecture, so I'd nudge people who want a more carefree experience towards it.

Those who seek raw performance will weigh their options and take whatever path they deem best (such as buying older, less efficient but powerful hardware, even if it's behind on features or approaching end of life), and then there are those like me. I'm excited by new technology, that matters more to me, and that's why I like to keep things fresh :)

I can't wait for RDNA 3, for example. I'm gonna have a field day with it.
 
Hi,
GPUs are only just now seeing price reductions, so repeating the Nvidia PR line "just buy it" for RTX cards is insane.

I won't be buying a new GPU until I have a real need for one, so the 980 Ti / 1080 Ti / Titan Xp will keep doing their thing until they can't anymore, and I sure wouldn't waste any time or money on a 3050 or 3060, lol, that's just crazy talk. Unless of course they came in a laptop, but for a desktop that's just funny :laugh:
 

Correct me if I'm wrong, but don't you use Windows 7? If your operating system doesn't support any of these newer graphics APIs, there is really nothing for you to see. It stands to reason that the 10 series would be fully capable of offering everything that the WDDM 1.1 model was designed to do.

My argument is by no means the same as Avram Piltch's infamous "just buy it" pitch, since we're talking about budget hardware. I fully realize that I'm playing devil's advocate.
 
Hi,
There are Win 7 drivers for the 20 & 30 series, so not sure what your point is.
My point is that your statement is just wrong; maybe the devil made you do it, stands to reason :laugh:

I use 7, 10 and 11.
 

The point is that even if drivers exist (setting aside the fact that they're out of date), it doesn't mean you can use all of the functionality available, because your OS isn't aware of what those features are or what they're for.

You being blissfully oblivious to what I'm talking about doesn't make me wrong, but I'll tell you this: just having multi-plane overlay support makes the upgrade worth it in my eyes. MPOs are unfortunately not supported on Pascal, though AMD does support them on Vega.
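If anyone wants to poke at this themselves, here's a minimal sketch of how you could query overlay support on Windows. It assumes Windows 8.1+ and a D3D11-capable card, and it uses DXGI's CheckOverlaySupport, which reports hardware overlay plane support for a given format rather than whether DWM is actually using MPOs, so treat it as a rough indicator rather than a definitive test.

Code:
// Minimal sketch: ask DXGI whether the primary output reports hardware
// overlay support (what multi-plane overlay composition relies on).
// Build (MSVC): cl /EHsc overlay_check.cpp d3d11.lib dxgi.lib
#include <cstdio>
#include <d3d11.h>
#include <dxgi1_3.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main()
{
    // CheckOverlaySupport needs a device to test against, so create a basic one.
    ComPtr<ID3D11Device> device;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr))) {
        std::puts("Failed to create a D3D11 device.");
        return 1;
    }

    // Walk from the device to its adapter and the first output (primary display).
    ComPtr<IDXGIDevice> dxgiDevice;
    device.As(&dxgiDevice);
    ComPtr<IDXGIAdapter> adapter;
    dxgiDevice->GetAdapter(&adapter);
    ComPtr<IDXGIOutput> output;
    if (FAILED(adapter->EnumOutputs(0, &output))) {
        std::puts("No outputs found on the default adapter.");
        return 1;
    }

    // IDXGIOutput3 (DXGI 1.3, Windows 8.1+) exposes CheckOverlaySupport.
    ComPtr<IDXGIOutput3> output3;
    if (FAILED(output.As(&output3))) {
        std::puts("IDXGIOutput3 not available (needs DXGI 1.3).");
        return 1;
    }

    UINT flags = 0;
    output3->CheckOverlaySupport(DXGI_FORMAT_B8G8R8A8_UNORM, device.Get(), &flags);
    std::printf("Overlay support for B8G8R8A8_UNORM: 0x%X (%s)\n", flags,
                (flags & DXGI_OVERLAY_SUPPORT_FLAG_DIRECT) ? "direct overlays supported"
                                                           : "no direct overlay support");
    return 0;
}

On a card/driver combo without hardware overlays this should print 0x0; on one that exposes them you'll typically see the DIRECT (and sometimes SCALING) flag set.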
 
1060 6gb and 1080ti will never die, iconic beasts
 
...These things aren't developed overnight, and I think it's quite unfair to measure this infant product stack against the decades of driver expertise you will find in both NV and AMD GPUs right now...
I feel I have to reply to this - who has been the leading graphics chip producer (regardless of the GPU size) for ages and with the biggest market share? Intel.
If they aren't able to produce stable drivers now, it's not likely they will any time soon.
If we look at Nvidia from the GeForce 256 onwards: they already had very stable drivers even with the Riva TNT2, and more so with the GF1 (256). It's not that the GeForce 256 was anything new, just a continuation and an upgrade of their architecture, and the same goes for every chip above it (quite an achievement really; the key is probably discipline, which Intel lacks).
 

Developing iGPUs is not the same thing.

"Our software release on our discrete graphics was clearly underperforming," said Gelsinger. "We thought that we would be able to leverage the integrated graphics software stack, and it was wholly inadequate for the performance levels, gaming compatibility, etc. that we needed. So we are not hitting our four million unit goal in the discrete graphics space, even as we are now catching up and getting better software releases."

Riva was done at a different time; everything was much simpler. And both Nvidia and especially AMD, even with decades of experience with high-performance drivers, still put out the same spectacularly shitty drivers. You can easily find AMD releasing an updated driver for a single game that gives 30% more performance (and I remember cases of 40% and more in my time with AMD).

"you have an AMD GPU with a product name starting ‘Radeon RX 6…’ the driver should deliver the following performance improvements in these games:
  • World of Warcraft: Shadowlands – up to 30%
  • Assassin’s Creed Odyssey – up to 28%"
 

Yup, and the biggest mistake with Arc is that whoever was in charge of the software development (folks blaming Raja again, he's like the boogeyman or something) thought they could leverage the existing driver code base from their latest-generation integrated graphics and work from there. Let's just say that wasn't such a great idea.
 

"Thought they could" is the weird part. What were they doing? I'm sure they've had a prototype for a long time now; I can't understand how any of this came as a surprise to Intel. Back in February they were promising us the moon.

The argument is that the iGPUs never received optimized drivers, so wtf, why didn't they try that first before jumping head-first into making new GPUs with drivers they never got to optimize on the existing iGPUs? There's an insane amount of bad workmanship and leadership at Intel.
 

I can totally imagine that this preposterous idea was imposed by the suits Gelsinger delegated to take care of the graphics division as he restructured the company.

I'm also fairly sure the engineers warned them, but the executives haven't the faintest clue; they probably saw that their integrated graphics ran CS:GO and Dota and refused to allow development of an all-new stack, which is a multi-million dollar investment. Except that this time they can't just call in the Linux wiz kids they hire to maintain their open-source iGPU drivers, so the whole burden fell on the hardware itself.

At least Gelsinger is personally owning up to that mistake, and I strongly feel that he's done very well by Intel in his tenure as CEO thus far.
 
Maybe it's a culture thing inside Intel, but in my company we are encouraged to point out things like this that can be the difference between a lot of money won or lost, and we'd probably get a reward for it.
This company has a massive problem inside.
 

Most mega corps have people in charge who shouldn't be, you know, the usual. But hopefully it's a learning experience.
 