
NVIDIA RTX A2000

Because AMD is genuinely competing this time around.
When there's competition, use up all the headroom.
No competition? Focus on efficiency and save that headroom for reliable, predictable future products (for the shareholders).
(See: Intel running quad cores almost unchanged for 10+ years, with only memory clocks really changing.)
I see what you mean. Though on my end, it's sad to see competition being more important than consumer needs. I get it, competition is good for the market. It's just not good for the final product. I mean, OK, everybody cried about Intel releasing extremely similar quad cores for such a long time, but was anybody forced to buy them? Not really. You could get away with using Sandy or Ivy bridge for a decade, which I think was great. I have no intention of upgrading my PC every year unless I need to, and I have no intention of swapping out a perfectly good PSU just to be able to feed a 300+ W graphics card with 5 power connectors.
 
Because it'd make little sense from a binning perspective. The A2000 has about as many cores active as the 3060 - the point of the 3050 is to use up the dies that didn't quite make it to a 3060 or A2000, but aren't so broken as to be wholly useless.
Since the vast majority of users aren't doing SFF, the 3050 consuming 150W for higher performance is just fine, if it means more availability through binning.
After all, that's why the A2000 exists - an option for those who cannot use a 3050 instead.
The only reason the RTX 3050 is configured that way is to allow Nvidia to switch to the GA107 chip as soon as this craze goes away and prices return to normal (you can see the GA106 chip is cut down to match a full GA107 exactly). In fact, there is no chip so defective that 40% of the cores, 50% of the memory controller and 50% of the PCIe lanes need to be disabled while it still works fine.
Yes, chips going into the RTX 3050 are pretty bad bins, but the GA106 in the RTX 3060 is already cut down quite a bit, and the A2000 even more. Perfect chips go to laptops, defective chips which can clock high but need more power go to the 3060, and defective chips that don't clock as fast but have no power leakage go to the A2000.
The 3050 is an artificially cut-down chip made to meet a performance target - a marketing stunt to rain on the 6500 XT's parade, if you will.
 
Since the vast majority of users aren't doing SFF, the 3050 consuming 150W for higher performance is just fine, if it means more availability through binning.
After all, that's why the A2000 exists - an option for those who cannot use a 3050 instead.
You seem to be missing some important context.
 
Do enlighten me then.
The 75W/SFF crowd is an incredibly tiny one, and guess what - an option (the RTX A2000) actually exists for them.
Generally speaking, it is just much more profitable to jam more power into your dies to increase performance vs jamming more cores/CUs/w/e into your die, so from a business perspective all the current SKUs make perfect sense.
I don't necessarily agree w/ how NV/AMD(/Intel) operate at all, but from their perspective all of this is perfectly logical. And I can see that.

So yeah, deal w/ it.
 
Do enlighten me then.
The 75W/SFF crowd is an incredibly tiny one, and guess what - an option (the RTX A2000) actually exists for them.
Generally speaking, it is just much more profitable to jam more power into your dies to increase performance vs jamming more cores/CUs/w/e into your die, so from a business perspective all the current SKUs make perfect sense.
I don't necessarily agree w/ how NV/AMD(/Intel) operate at all, but from their perspective all of this is perfectly logical. And I can see that.

So yeah, deal w/ it.
I've just had a look... the A2000 isn't available through UK retailers. I've managed to find one on eBay for £660. So no, SFF people don't have an option.

Other than that, I see your point.
 
I've just had a look... the A2000 isn't available through UK retailers. I've managed to find one on eBay for £660. So no, SFF people don't have an option.

Other than that, I see your point.
I mean, I feel you, but technically that's still availability. 'Twas no different w/ uhm, ordinary GPUs until a (short) while ago after all, anyway.
 
I see what you mean. Though on my end, it's sad to see competition being more important than consumer needs. I get it, competition is good for the market. It's just not good for the final product. I mean, OK, everybody cried about Intel releasing extremely similar quad cores for such a long time, but was anybody forced to buy them? Not really. You could get away with using Sandy or Ivy bridge for a decade, which I think was great. I have no intention of upgrading my PC every year unless I need to, and I have no intention of swapping out a perfectly good PSU just to be able to feed a 300+ W graphics card with 5 power connectors.

Yes, many people were forced to buy them.
Anyone who needed a new PC with warranty was forced to, anyone upgrading and wanting warranty, and so on.

We had almost 10 years where if a part failed, you had to risk the second hand market, or buy a new mobo/CPU combo even if you had working parts. There's a reason my 2500K sat in a box for 5+ years: I couldn't get a reliable mobo to use it with, and buying new still got me a quad core, so I kept it in case I DID find a mobo.


This GPU wasn't made for the home user, no matter their intended use - it's for those half-height Dell and HP prebuilts that need to reach minimum spec for 'workstation' applications and get the certified professional drivers.
 
I see what you mean. Though on my end, it's sad to see competition being more important than consumer needs. I get it, competition is good for the market. It's just not good for the final product. I mean, OK, everybody cried about Intel releasing extremely similar quad cores for such a long time, but was anybody forced to buy them? Not really. You could get away with using Sandy or Ivy bridge for a decade, which I think was great. I have no intention of upgrading my PC every year unless I need to, and I have no intention of swapping out a perfectly good PSU just to be able to feed a 300+ W graphics card with 5 power connectors.

Competition can be good, but it can also be bad.

If we go back to the Intel stagnation from Sandy Bridge to the start of Coffee Lake, the benefit of that era was that game devs were forced to rein in their coding to work on what was out there, and people with Sandy Bridge kept their CPUs for the majority of a decade without needing to upgrade.

Now CPUs are progressing rapidly from generation to generation again, and not just the CPU but also the chipset. Tech has accelerated, which basically means PCs will feel obsolete quicker, making PC gaming more expensive, at least in theory. I've excluded the current pricing crisis, as that seems to have other causes, so my cost argument is based on frequency of upgrading.

With GPUs, it seems the pressure to compete with a competitive AMD is going to lead to a power-hungry 4000 series, which might need a new PSU standard. Another cost to add to the equation. It remains to be seen whether PCI Express Gen 3 will be OK for 4080s and 4090s.
 
Yes, many people were forced to buy them.
Anyone who needed a new PC with warranty was forced to, anyone upgrading and wanting warranty, and so on.
If you want warranty, you always have to buy new parts. That was the case 10 years ago, and that's the case today as well. What I meant is, people on Sandy / Ivy Bridge were settled for a decade. My brother still rocks a first gen Core i5 (Westmere, I think?), and only now is he starting to feel the need to upgrade.

We had almost 10 years where if a part failed, you had to risk the second hand market, or buy a new mobo/CPU combo even if you had working parts.
How is that different from any other era in IT history (or even nowadays)?

This GPU wasn't made for the home user, no matter their intended use - it's for those half-height Dell and HP prebuilts that need to reach minimum spec for 'workstation' applications and get the certified professional drivers.
I get that. :) All I'm saying is that it could very well be a commercial (gamer) product too. I see no reason why my new GeForce should require a kilowatt power supply and a new power connector simply because its designation isn't A-something.
 
If you want warranty, you always have to buy new parts. That was the case 10 years ago, and that's the case today as well. What I meant is, people on Sandy / Ivy Bridge were settled for a decade. My brother still rocks a first gen Core i5 (Westmere, I think?), and only now is he starting to feel the need to upgrade.


How is that different from any other era in IT history (or even nowadays)?


I get that. :) All I'm saying is that it could very well be a commercial (gamer) product too. I see no reason why my new GeForce should require a kilowatt power supply and a new power connector simply because its designation isn't A-something.
The question was asked: did Intel force you to buy them?

The answer was yes, people were forced to buy devices, basically the same, year after year.

School IT department? Every year, new purchases... saaaaame as last year. Except higher prices and more bundled garbage.
 
The question was asked: did Intel force you to buy them?

The answer was yes, people were forced to buy devices, basically the same, year after year.

School IT department? Every year, new purchases... saaaaame as last year. Except higher prices and more bundled garbage.
Who was forced? I wasn't.

My school certainly didn't buy tech every year. They jumped from 486 and Pentium 1 all the way to Core 2 Duo and Quad. I graduated in that era, but I wouldn't be surprised if they still used those PCs even now.

I'm talking about home gaming anyway. The deals Intel cut with companies and other agencies don't concern me. ;)
 
Isn't PNY the only authorized manufacturer for Nvidia professional GPUs?
No, but you probably wouldn't be able to outright purchase a Lenovo-branded RTX A5000/6000 from some place like Best Buy or most other places where you would typically buy a PNY or similarly graded consumer card. To begin with, it's a ball ache for the average Joe like myself to get just one unless you're with a company that orders enterprise systems/parts directly from Lenovo. I'm not 100% certain, but I think one of the main reasons for this is simply the level of quality control. If Lenovo is catering to customers such as the US government, which pays a lot more for better QC on products it buys in bulk, I don't think they're going to produce extra to sell to the rest of us and expect to make much of a profit from it, since they're not going to make the same product with two different levels of QC.
 
Who was forced? I wasn't.

My school certainly didn't buy tech every year. They jumped from 486 and Pentium 1 all the way to Core 2 Duo and Quad. I graduated in that era, but I wouldn't be surprised if they still used those PCs even now.

I'm talking about home gaming anyway. The deals Intel cut with companies and other agencies don't concern me. ;)
Oops, sorry - forgot that you as an individual matter more than all other metrics.

Yes, many places are forced to buy new hardware. Lucky you for not being one of them.
School students here? Forced to buy laptops and iPads (yep, MUST be Apple) - and if Intel releases shit, you get shit for your money.
 
Oops, sorry - forgot that you as an individual matter more than all other metrics.

Yes, many places are forced to buy new hardware. Lucky you for not being one of them.
School students here? Forced to buy laptops and iPads (yep, MUST be Apple) - and if Intel releases shit, you get shit for your money.
We never had any Apple stuff in my school! :eek: Maybe I would have grown to like it a bit more if we did. We'll never know now. :ohwell:
 
Okay, a different question, since apparently it hasn't been asked before:
As w/ all Quadros this uses the Quadro driver, yes? Any issues/things of note w/ them?

Game benchmarks seem to have run fine. I guess GeForce Experience is out of the question (not that I care, but yea), or is there some sorcery & witchcraft you could do w/ NVCleanstall to convince it that it's actually a GeForce card or something?
 
Okay, a different question, since apparently it hasn't been asked before:
As w/ all Quadros this uses the Quadro driver, yes? Any issues/things of note w/ them?

Game benchmarks seem to have run fine. I guess GeForce Experience is out of the question (not that I care, but yea), or is there some sorcery & witchcraft you could do w/ NVCleanstall to convince it that it's actually a GeForce card or something?
Yes, Quadro drivers, and no GeForce Experience (which to me is a benefit, also because after Nvidia got hacked there are fake drivers floating around).
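For anyone wanting to double-check what actually ends up installed, nvidia-smi reports the GPU name and driver version, which you can compare against Nvidia's official driver listings. A minimal Python sketch, assuming only that nvidia-smi is on the PATH (this is an editorial illustration, not something from the review or the posts above):

# Minimal sketch: print each GPU's name and the installed driver version
# by shelling out to nvidia-smi. Assumes nvidia-smi is on the PATH.
import subprocess

out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=name,driver_version",
     "--format=csv,noheader"],
    text=True,
)
for line in out.splitlines():
    # rsplit so a comma in the GPU name can't break the parse
    name, version = (field.strip() for field in line.rsplit(",", 1))
    print(f"{name}: driver {version}")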
 
We never had any Apple stuff in my school! :eek: Maybe I would have grown to like it a bit more if we did. We'll never know now. :ohwell:
Big long story incoming about dealing with decades of forced hardware choices. Bonus cookies for anyone who can actually read it all.


Work part:
Once a school or business gets offered cheaper prices to go exclusive, they do. Always. The true cost is that it means that's all those people learn to use, and it has a long, ongoing snowball effect (look at how graphic designers think Mac is the only way, because those art schools got their cheap Macs - and then the students had to sell kidneys to get the software for themselves later).

I won't go into details (legally can't), but I've worked at pizza stores and burger stores as IT support, and we were contractually forced to buy certain hardware only, from certain brands only - and for the entire store at a time. No single purchases. Owners didn't usually notice that part.

Running decades-old Linux variants off 400 MHz Pentiums in 2021 was... look, I still have trauma. Not like it was forced across a few hundred stores or anything, nah, can't possibly be.
(When a store upgraded one machine they had to upgrade ALL machines - so the old ones got traded around to try and prevent stores needing full upgrades.)
We had a store get new hardware and the new OS that came with it, and rather than train the old staff on the new software, they fired everyone and trained new people, because the old system was just too far apart from the new one - it was impossible to unlearn.

Small country town with literally two high schools, across the road from each other. If one school fell behind in something - they just shared.

My generation in that town was forced to use either Dell pieces of poop (mmmm, Pentium 4) or Apple iMac G3s.
We had two high schools next to each other, both forced by contract to use one supplier only.
The Catholic college had to use Apple; the government one had to use Dell.
Us Aussies had invented Wi-Fi back then, but it wasn't commercially available - so no laptops or iPads in that era.

What this meant is the Catholic college kids grew up using Apple and it's all they knew, while the public school kids grew up using Windows.
Oh, except that the Catholic college had to send their kids over to our school to use the Dells, because the Macs couldn't get any spare parts and weren't compatible with the early online systems and word processing apps of the era that the government required.
This was pre-Wikipedia, pre-Google, pre-Wi-Fi... look, I'm old. You wanted information? You had to fire up Encyclopedia Britannica, search the info, print it, quit EB, open up... was it still Word and Office back then? Retype it, print it, hand it in. It was rather shite.
So we had one school with working computers and one with MS Paint and CD-ROM drives but no software - because you had to buy Apple's approved specialty Mac software... which was simply not available, or really highly priced (cheap hardware, expensive software).

So y'know - an entire school forced by contract to use computers that didn't work with any of the required software or systems. Most of the kids from that Apple school I still talk to gave up on tech and can't even do a copy-paste. Yeah, it's f*cking sad, and I wouldn't believe it if they didn't keep asking me for help.

Why does this happen? Because companies give out huge discounts and rebates if you buy into their ecosystem exclusively. Apple will give their stuff out free to schools if it means kids grow up using it, so their parents buy it at home, and they get a lifetime of software purchase income. Once people spend money and find it can't be transferred OUT of that ecosystem, they tend to stick around even longer.

Skipping ahead through the decades, we had 'One Laptop per Child Australia', which of course used the cheapest, shittiest laptops out there. These things were under-spec and cheaply made - they did technically do what was required for the classrooms, but they were fragile, broke easily on their own, and the moment the software requirements changed (like adding in a mandated, bloated antivirus like McAfee or Norton) they became near useless.

Following that, some schools thought: oh, let's use something EASIER and just get a bulk deal on iPads.
Apple gave huge discounts and locked the iPads into school accounts only, with total oversight (so zero resale value - locked and disabled outside the agreed-upon apps and initial school accounts).
Kids bring 'em home... and find they can't do half of what they need. Oh sure, they can log in to submit their work, but they'd have to do the work on a fully working PC or Mac, email it to the iPad, then submit it to the school websites... except half the time they couldn't do that, since Apple didn't allow apps to share files and had no file browser.

Every year, they had to buy the new series. If kids broke any the previous year (they're kids, they broke tons), they couldn't be replaced with the original model - making classes split between different tech, needing different instructions for the kids.
So that quickly turned into every kid in every class needing the new model iPad every year (I swear we did this with the TI-8x calculators in my day too) at school prices - and if they managed to keep it working, they couldn't log out of the school account and use it for personal use anyway - and within a few years it'd suddenly stop getting app updates and become useless for the school-approved apps too.




My kid's 8; he's been trained (via gaaaaames) to use iOS, Windows 7, 10 and 11, Edge, Chrome, Firefox, Discord, and basic file browsing/copy-pasting.

Their solution this year? One country-wide app that only works on iOS and Android (phone only - no tablet, PC or web browser support) and is a glitchy fucking mess. I'm meant to download homework, print it, have him complete it, and upload photos of it.


TL;DR: Tech companies play politics forcing sales of what they want sold, and when people grow up using it or spend their entire education using it - they're locked in for life.
 
^ Yep. Microsoft bought out Michigan State University back in 2000/2001. I spent my first year in CSE learning on Sun Unix systems. Second year, it was "here's a free volume license of Windows XP and Visual Studio" and ALL new PCs in the entire engineering building. I was totally lost.

In other news - still no ETA on my PNY A2000 from ShopBLT, ordered months ago. I cancelled my A4000 order the other day after it was pushed back to July. I'm content to sit and wait now; the A4000 is in stock everywhere else with prices crashing. $1600 a little over a month ago, and I'm seeing brand new Dell pulls for $1150 on eBay now. NIB cards for $1250-$1300.
 
Available for 880 EUR in Hungary, while the 3060 starts at around 600 EUR.

We had almost 10 years where if a part failed, you had to risk the second hand market, or buy a new mobo/CPU combo even if you had working parts.
We have a very strong second hand market here (much more reasonable prices than eBay), so it was never a problem.

School IT department? Every year, new purchases... saaaaame as last year. Except higher prices and more bundled garbage.
In the third world, schools don't have such contracts. Instead, it meant their purchases hadn't become obsolete by the time they got the resources for another upgrade.
 
What I'm surprised by is that no game in your benchmark suite requires more than 6GB of VRAM, even at 4K. Is that normal? Do no games actually require more than 6GB of VRAM, or am I missing something, like the card not being powerful enough for the VRAM to be a limiting factor?
 
Yup. A lot of games will run on 6GB. They just don't run as well as they would on a card with more VRAM.
I watched a few videos saying even 8GB cards throttle in some games at 4K with max textures, like Doom Eternal. So I assumed 6GB would be an issue at 4K. Not that the A2000 is a 4K card, but with DLSS it's possible to get decent performance.
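If you want to see for yourself whether a given game actually pushes past 6GB, one low-effort way is to poll nvidia-smi while you play and record the peak. A minimal Python sketch, assuming nvidia-smi is on the PATH and reading only the first GPU (the one-second interval is an arbitrary choice, not anything from the review):

# Minimal sketch: poll VRAM usage once a second via nvidia-smi and track the peak.
# Stop it with Ctrl+C when you're done playing.
import subprocess
import time

def vram_used_mib() -> int:
    # memory.used with "nounits" returns a plain MiB number, one line per GPU.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0].strip())

peak = 0
try:
    while True:
        used = vram_used_mib()
        peak = max(peak, used)
        print(f"current: {used} MiB, peak: {peak} MiB", end="\r")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"\npeak VRAM observed: {peak} MiB")

Note this measures allocated VRAM, not strictly required VRAM - games often reserve more than they actively use, so a game allocating more than 6GB doesn't automatically mean it needs it.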
 
It really would, unless you turn some settings down. The A2000 should really be considered a great 1080p ray tracing card for small-form-factor PCs if using it for gaming, though that is not its primary intended focus by Nvidia.
That's good to know, thank you.
 