Sunday, May 8th 2016

NVIDIA GeForce GTX 1080 Does Away with D-Sub (VGA) Support

NVIDIA appears to have done away with support for the legacy D-Sub (VGA) analog connector with its latest GeForce GTX 1080 graphics card. The card's DVI connector has no wiring for analog signals, retail cards won't include DVI to D-Sub dongles, and even aftermarket dongles won't work. What you get on the card instead are one dual-link DVI-D, one HDMI 2.0b, and three DisplayPort 1.4 connectors. NVIDIA's rival AMD dropped D-Sub support on its high-end graphics cards back in 2013, with the Radeon R9 290 series.

81 Comments on NVIDIA GeForce GTX 1080 Does Away with D-Sub (VGA) Support

#51
cdawall
where the hell are my stars
FordGT90Concept: Are there any DisplayPort KVMs that support MST?
Not that I have seen locally.
#52
FordGT90Concept
"I go fast!1!11!1!"
Startech has a dual DisplayPort KVM but it is literally dual DisplayPort (2 inputs per port, 2 outputs):
www.startech.com/Server-Management/KVM-Switches/2-Port-Dual-DisplayPort-USB-KVM-Switch-with-Audio-and-USB-20-Hub~SV231DPDDUA

MST is apparently broken when switching... probably because of plug and play, much like why converters don't work so well when switching. DisplayPort needs a keep-alive standard, and I don't think any of the newer versions of DisplayPort add that functionality.
#53
webdigo
Anyone with decent IT knowledge knows about the famous Sony FW900 CRT monitor. I still use this one. I use a high-quality VGA to DVI cable, with the VGA end on the monitor and the DVI end on my GTX 760.
Will I not be able to use a 1080 card then?
#54
AsRock
TPU addict
zithe: Nice! Maybe Walmart will discover DVI cables and start stocking them in their pile of old tech in store near their monitor aisle, instead of stocking VGA cables when none of the systems they sell in store (that I could find) have VGA outputs.
They do already ;). Also, why spend $6+ on a cable when going to a second-hand store or car boot / yard sale will get you one?
#55
cdawall
where the hell are my stars
webdigo: Anyone with decent IT knowledge knows about the famous Sony FW900 CRT monitor. I still use this one. I use a high-quality VGA to DVI cable, with the VGA end on the monitor and the DVI end on my GTX 760.
Will I not be able to use a 1080 card then?
Nope.
#56
webdigo
cdawall: Nope.
I might be missing something, but the DVI-D end of the cable goes to the video card, and the VGA end to my FW900. The signal still gets converted to VGA, so is that where the issue is?
#57
Dethroy
VGA should've been dead long ago. Nvidia should get rid of DVI as well.
Next step: get rid of HDMI. I know that's only wishful thinking and highly unlikely to happen, but we should have dreams, right?
#58
cdawall
where the hell are my stars
webdigo: I might be missing something, but the DVI-D end of the cable goes to the video card, and the VGA end to my FW900. The signal still gets converted to VGA, so is that where the issue is?
Right, there is no magical converter. DVI-I is the one that carried analogue signals; DVI-D does not.
#59
R-T-B
cdawall: No decent refresh rate monitor will be VGA...
CRT begs to differ, and yes, it still does have fans out there. Limited market, but it exists.
#60
cdawall
where the hell are my stars
R-T-B: CRT begs to differ, and yes, it still does have fans out there. Limited market, but it exists.
Still not 144 Hz... I am not anti-CRT, but the three people left with the ViewSonic and Sony 27" tubes will be fine on 980 Tis... the resolution isn't exactly high enough to avoid being maxed out by that card anyway.
#61
R-T-B
cdawall: Still not 144 Hz... I am not anti-CRT, but the three people left with the ViewSonic and Sony 27" tubes will be fine on 980 Tis... the resolution isn't exactly high enough to avoid being maxed out by that card anyway.
Oh I agree. I was just pointing out that "decent" refresh CRTs do exist (decent in my mind being like 72 Hz and higher).
#62
cdawall
where the hell are my stars
R-T-B: Oh I agree. I was just pointing out that "decent" refresh CRTs do exist (decent in my mind being like 72 Hz and higher).
That's not a "decent" refresh rate lol. Throwaway LCD monitors can do that.
#63
R-T-B
cdawall: That's not a "decent" refresh rate lol. Throwaway LCD monitors can do that.
I guess I'm misinterpreting "decent." I'm going by that meaning "basically average/maybe slightly above average."
#64
Prima.Vera
webdigo: Anyone with decent IT knowledge knows about the famous Sony FW900 CRT monitor. I still use this one. I use a high-quality VGA to DVI cable, with the VGA end on the monitor and the DVI end on my GTX 760.
Will I not be able to use a 1080 card then?
Yes, but you will need to buy a digital->analog converter box with the proper ports.
Something like this:
www.gefen.com/kvm/ext-dvi-2-vgan.jsp?prod_id=9569
or
www.monoprice.com/product?p_id=8214
#66
cdawall
where the hell are my stars
R-T-B: I guess I'm misinterpreting "decent." I'm going by that meaning "basically average/maybe slightly above average."
Decent would be at least 100 Hz to me. Anything less is just plain Jane average.
#67
Ubersonic
Solidstate89: It's not "going the budget" route, it's removing support for a connector that hasn't seen widespread support in almost a decade.
It is about cost. They haven't technically even removed any connectors; the DVI-D port occupies the exact same space the DVI-I port did, the only difference being that it's cheaper and offers less functionality as a result. As for your decade comment, even as recently as three years ago companies were releasing decent IPS 1080p screens with just VGA input, which many users bought as second/third screens. This is why VGA still had widespread support with both vendors until 2013, and with NVIDIA until now.
GoldenX: The problem with the digital standards is the competition between them: HDMI, DVI, DisplayPort. VGA was the only analog output; the digital ones should be merged into a single standard.
That's the idea with DisplayPort. DVI is a late-90s standard designed to replace VGA/component in computing, HDMI is an early-2000s standard designed to replace SCART/component in television, and DisplayPort is a 21st-century standard designed to replace both DVI and HDMI in computing and television.
#68
Frick
Fishfaced Nincompoop
cdawall: Decent would be at least 100 Hz to me. Anything less is just plain Jane average.
Your world view is skewed, man.
#69
bobrix
Ubersonic: It is about cost. They haven't technically even removed any connectors; the DVI-D port occupies the exact same space the DVI-I port did, the only difference being that it's cheaper and offers less functionality as a result. As for your decade comment, even as recently as three years ago companies were releasing decent IPS 1080p screens with just VGA input, which many users bought as second/third screens. This is why VGA still had widespread support with both vendors until 2013, and with NVIDIA until now.
So did they ditch only DVI-I, or the RAMDAC too? If the RAMDAC is still there, maybe there will be other versions with DVI-I and analogue lines.
I'm using VGA for a CRT (MB DiamondPro) at 110 Hz to play CS:GO and other fast FPS games as my second monitor.
#70
FordGT90Concept
"I go fast!1!11!1!"
DirectX Diagnostics claims my R9 390 has a RAMDAC, but I doubt it does. If they get rid of DVI-A support, there is no reason to keep the RAMDAC either.
#71
Kadano
Prima.Vera: Yes, but you will need to buy a digital->analog converter box with the proper ports.
Something like this:
www.gefen.com/kvm/ext-dvi-2-vgan.jsp?prod_id=9569
or
www.monoprice.com/product?p_id=8214
These are limited to a 165 MHz pixel clock, just like mini-DisplayPort to VGA adapters (which are only $5-10, by the way). To drive a Sony GDM-FW900 at 1920x1200 at 96 Hz, you need at least a 221 MHz pixel clock (probably even 230).
For CRT users, I think the GTX 9xx series will be the one to keep. And concerning true contrast / black levels and input lag, CRTs are still state of the art until OLED monitors become affordable (which will probably take no less than 5 years), so I'll be holding out with my GTX 960 for quite some time.
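For anyone checking the math, here is a minimal sketch (not from the thread) of the pixel-clock arithmetic in Python. The blanking_overhead factor is an assumption used purely for illustration; real analog timings (GTF/CVT) compute horizontal and vertical blanking explicitly and often add more overhead than shown here.

def pixel_clock_mhz(h_active, v_active, refresh_hz, blanking_overhead=0.0):
    # Approximate pixel clock in MHz for a video mode.
    # blanking_overhead is the assumed fraction of extra (non-visible)
    # pixels per frame; 0.0 counts active pixels only.
    total_pixels_per_frame = h_active * v_active * (1.0 + blanking_overhead)
    return total_pixels_per_frame * refresh_hz / 1e6

# 1920x1200 @ 96 Hz with active pixels only is already ~221 MHz,
# well past the 165 MHz limit of single-link DVI / cheap DP-to-VGA adapters.
print(round(pixel_clock_mhz(1920, 1200, 96), 1))        # 221.2
# Any blanking pushes the requirement higher; 4% is an illustrative value.
print(round(pixel_clock_mhz(1920, 1200, 96, 0.04), 1))  # 230.0

Even a modest blanking allowance lands in the ~230 MHz range mentioned above, which is why a 165 MHz converter box cannot drive the FW900 at that mode.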
#72
arbiter
About the only way I can see people still using analog for video is some type of game capture, but if by some small chance they are still using a CRT monitor, they probably need to be hit upside the head.
#73
GoldenX
You obviously don't live in an "emerging economy" country. All low-end LCD/LED monitors here are VGA. The client doesn't want to pay the extra US$33 (approx.) that buys you a DVI/HDMI screen.
Not all countries are first-world economies.
#74
FordGT90Concept
"I go fast!1!11!1!"
arbiter: About the only way I can see people still using analog for video is some type of game capture, but if by some small chance they are still using a CRT monitor, they probably need to be hit upside the head.
#1 advantage to VGA: no HDCP.
#75
Solidstate89
GoldenX: You obviously don't live in an "emerging economy" country. All low-end LCD/LED monitors here are VGA. The client doesn't want to pay the extra US$33 (approx.) that buys you a DVI/HDMI screen.
Not all countries are first-world economies.
They won't be buying high-end, modern graphics cards on those budgets either. So what's your point?