
NVIDIA RTX A2000

School students here? Forced to buy laptops and iPads (yep, MUST be Apple) - and if Intel releases shit, you get shit for your money.
WTF is this shit? Somebody should sue them ASAP. Outside of IT class, students don't need any computer at all and definitely shouldn't be lugging one around either. And any attempt to force students into a single brand should guarantee some jail time and permanent revocation of any permit to work in education.
 
Try this one from the UK
Student Punished for Arriving With 93 Percent Battery in Her iPad at UK School (news18.com)

(But lets keep this on topic, my fault for derailing originally)
 

People have been forced into a specific brand of calculator and other items for decades. That's how it works. The school standardizes on item X and you need to go out and get it.

Like it or not, the iPad has taken over the tablet market. And for higher-end art classes, compsci, or straight science, the MacBook Pro is what's supported at most schools now, and Windows laptops mean get thine ass back to the lower-level classes with the stupids.
 
So apparently I forgot to cancel one of my RTX A2000 orders. It showed up in the mail the other day. $582 new-in-box, figure I'll go with it.
And yeah, it's twice as fast as an RTX 3080 (with the reg hack) in SolidWorks 2022 with FSAA and the Enhanced Graphics option turned on.


[Attached benchmark screenshots: 5800X_16GB_RTX3080 Game FSAA_on EG_on.png, 5800X_16GB_RTXA2000_FSAA_on EG_on.png]
 
So now I have had the pleasure of trying out the RTX A2000 this weekend. A pleasant experience.

I have only tested one game (Wolfenstein II: The New Colossus) so far, as I have only had time for gaming and testing this weekend. But I will share my personal experiences.

Performance is definitely good for the size and form factor. I have compared it to my old GTX 1650 with GDDR5 memory, a low-profile cooler, and a 75-watt TDP rating - so the slowest of the original GTX 1650 versions. But since that card also has a low-profile cooler and a similar TDP rating, it's a very fair comparison.

At 1440p and medium settings, my GTX 1650 could barely manage 90 fps at best, and I could not go higher due to the VRAM limit. The RTX A2000 at 1440p with all settings at high or ultra manages at least 110 fps, hovered between 120 and 130 fps most of the time, and hit up to 145 fps at best. So a significant jump, with even 5 watts less used.

Fan noise was quite pleasant as well. Idle is 3,000 rpm, and at 100% fan speed my card goes to 6,500 rpm. In normal use it settles in at around 4,000 rpm, or 50% fan speed, with the temperature at around 72 to 75 degrees Celsius. You can manually lock the fan to a given speed it will keep, from 30% to 100% fan speed.

Overclocking is quite good in the game I tried. I dialed in +280 for the GPU core clock and +1300 for the memory in MSI Afterburner. That results in the GPU clock jumping from 1,170 MHz to between 1,300 and 1,350 MHz just from the OC, despite the power limit. In an area I tested where I got 130 fps at stock, the OC raised that to 141 fps. So there is some OC potential in this card. I might have to dial this down, as it might not be stable in other games.

The power target can be adjusted from 100% down to 14%, or from 70 watts down to 10 watts. But despite that, the card can't be limited to less than 45 watts: at around a 60% power target or lower, the card will keep pulling 45 watts. You can still save around 25 watts of power if you lower the power target, but you can't go above the 70 watts. It's locked.
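The power-target behavior above can be sketched in a few lines. To be clear, this is just my reading of the observed behavior, not a documented spec: the 14-100% slider range and the ~45 W floor are measurements from this card, and the clamping logic is an assumption.

```python
# Sketch of the observed A2000 power-target behavior (assumptions, not spec):
# the slider spans 14-100 % of the 70 W board power, but below roughly a
# 60 % target the card keeps drawing about 45 W anyway.

BOARD_POWER_W = 70  # hard cap; cannot be raised
FLOOR_W = 45        # observed minimum draw under load

def effective_power(target_percent: float) -> float:
    """Wattage the card settles at for a given power-target percentage."""
    clamped = min(max(target_percent, 14), 100)   # slider limits
    requested = BOARD_POWER_W * clamped / 100
    return max(requested, FLOOR_W)                # the ~45 W floor wins

print(effective_power(100))  # 70.0
print(effective_power(60))   # 45.0 - requested 42 W, floor kicks in
print(effective_power(14))   # 45.0 - slider says ~10 W, card still pulls ~45
```

So the only range where the slider actually changes the draw is roughly 65-100%, which matches the ~25 W of savings mentioned above.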

The good things
Fast for a mini-ITX card size.
Very power efficient.
Despite the high idle fan rpm, idle is close to silent. At 4,000 rpm you can hear it, but it's far from annoying.

The bad things
It's expensive.
I would have wished NVIDIA had used all 75 watts PCIe can deliver, to give the card as much power headroom as possible.
8 GB of VRAM would have been more fitting for this card, at least when we talk about gaming.
6 GB of VRAM is still a limiting factor for this card.

So if you can live with the price and you need a small card for your mini-ITX build, yes, I will recommend the card. It's definitely going to be a nice upgrade for me over the GTX 1650.
 
Thanks for the report, that was great. I have a slot-power-only GTX 1050 Ti and it prefers to draw about 68.5 W max, though it will occasionally stray to 69 W and very rarely will momentarily touch 70 W. I assume your A2000 does the same. With my card, it's already exiting its most efficient range, so those extra watts wouldn't make much of a difference, but your card is likely operating well within its max efficiency range, so every extra watt would have been a nice boost. Even so, the times when those extra FPS will actually be noticeable in-game will probably be quite rare, and +280/1300 is already pretty nice.

What CPU/mobo are you using? Mine is just in an OptiPlex 9020, i7-4790, 16 GB Dell RAM, so losing a little efficiency on my 1050 Ti is a minor thing, as the rest of the system fails to be beastly.
 
Somebody correct me if I'm wrong, but I think those cards only touch south of 70 W, even with an official TDP of 75 W, to accommodate any potential power spikes (70 W is an average, not a maximum). If you had an average consumption of 75 W with spikes of up to 80-85 W, the PCIe slot wouldn't be able to handle it. By spikes, I mean the kind that last milliseconds and so can't be detected by software.
 
Indeed, graphics card manufacturers have to be very careful when pulling power from only the PCIe slot, as drawing more could damage the mobo. There's so much more headroom and leeway with a PCIe power connector.
 
Yes, the RTX A2000 is pulling the exact same wattage as your GTX 1050 Ti does: around 67 to 69 watts.

I am using the card in a dual system, meaning two computers in one case. The A2000 is hooked to a Ryzen 5 5600X CPU, an ASUS ROG Strix B550-I Gaming mini-ITX motherboard, and 32 GB of DDR4 RAM in a custom dual-system build. You can see my system under "project logs".
 
I'm trying to determine if I can use this card to drive two 5k displays (with TB3 inputs). The review says 4x 4k are supported, but this PDF from Nvidia says 4x 5k60 is ALSO supported. Which one of those is mistaken, the article or the info sheet? Anyone have any real-world experience with 5k on this card?
 
Trust your NVIDIA source, it says this:

[Attached screenshot of NVIDIA's display support spec]

The fine print says you'll need two DP 1.4a monitors to do it.
 

Well, I can't find a 5k display that I like and can afford two of. The Apple Studio Display is *baller* but doesn't really play that nicely with Windows/PC. The LG UltraFine is… not well made, even though the panel is great. I'll just use the LG ultrawide 5k2k I already have. It's really nice, even though it doesn't have the 220 PPI of a good 5k display.

What I really want is a 37"-diagonal 21:9 display, 7680 (8k) wide by 2880 (3k) tall, at 220 PPI. That would be PERFECT. But this does not exist, as far as I can tell...
 
Why 5k? Is 4k not a large enough resolution for what you do?

Video editing maybe?
No video editing. I just really like big high-DPI screens. Ideally 220 DPI.

For a mid-30s-ish ultrawide at 220 DPI, do some math and you end up at 8k3k. I'd pay $3-4k for that. (I mean, preferably less, but lots of pixels costs $$.)

Integrated GPUs can now drive full 8k over DisplayPort 1.4, so it's just cost and market problems.
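The "do some math" step is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch, using the hypothetical 8k3k panel from the posts above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The hypothetical 21:9 panel: 7680x2880 at 220 PPI needs ~37" diagonal.
print(round(math.hypot(7680, 2880) / 220, 1))  # 37.3 inches
print(round(ppi(7680, 2880, 37.3), 1))         # 219.9 PPI
```

Which is how a 220 DPI mid-30s ultrawide lands at 8k3k and nowhere smaller.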
 
Fair enough. If you want high DPI, get yourself a 32" 4k display. That should satisfy you.
 
Doubt it. The only things that come close to what he's wanting are the 32" 6K Retina display from Apple and the 8K screen from Dell. The rate things are going, I don't think we see the monitor he wants within the next 4 years, if at all. And if you're buying $4,000 monitors, why are you dinking around with a $450 entry-level workstation card? :p

*edit* almost forgot, found this neat little calculator for Size/Resolution/PPI
 
My point was that for the size and resolution it would be close enough and good for what they want, though a 27" 4k display would be closer. The only display that gets really close would be the following:
27" 5120x2880 at 217 PPI
https://toolstud.io/video/screensiz...unit=inch&resolution_w=5120&resolution_h=2880
But it's also $1,300 and the pixel response time is 14 ms! I wouldn't touch that if they paid me.
 
32" 4k is great.

I run at 150% scale to get exactly (and it is mathematically exact) the same size as my 1440p 32" display.

At 100% scale everything's too damn small to read and use anyway; higher DPI doesn't benefit you at that level. (And that's why 28" and smaller 4k displays are useless: you can't use them without the high-DPI scaling negating a lot of the benefits.)
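The "mathematically exact" bit checks out: 150% scaling on a 4k panel gives exactly the logical resolution of a 1440p panel, so on two 32" screens the UI renders the same physical size. A quick sanity check:

```python
def logical_resolution(width_px: int, height_px: int, scale: float):
    """Logical (effective) resolution after OS display scaling."""
    return width_px / scale, height_px / scale

# 4k at 150 % scale == 1440p at 100 % scale, exactly
print(logical_resolution(3840, 2160, 1.5))  # (2560.0, 1440.0)
```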
 
This would work great for a dedicated streaming/broadcasting computer setup with Wirecast, Overband, Streamlabs, or vMix software.
Now it's the turn of professional graphics: the AMD Radeon PRO W6400, or the new W7800.
 
Can I use MSI Afterburner with this card and undervolt it?

From what I tried, the fan control works, but I can't undervolt - it does nothing.
 
Hardware mods exist for voltage control; I'm unsure about software.
Have you ticked "unlock voltage control" in Afterburner, and have you tried using a curve with Ctrl+F?

(On a 3090 I don't need that ticked for the curve to work.)
 