
R.I.P. VGA: Nvidia's GeForce GTX 1080 dumps analog support, following Intel and AMD's lead

Nvidia appears to be joining the post-analog revolution. One notable item is missing from Nvidia’s recently unveiled GeForce GTX 1080 graphics card: a DVI port with wiring for analog signals, also known as DVI-I. Instead, the GTX 1080 packs a digital-only DVI-D port. That means the reference card has no native support for VGA, as first reported by TechPowerUp.

If this is a sign of things to come, the end of native analog support would be a significant change for Nvidia. It’s easy to find versions of Nvidia’s current flagship card—the $1,000 Titan X—with a DVI-I port, for example. Graphics cards that natively support analog connections typically include either an actual VGA port or a DVI-I port with a DVI-I-to-VGA adapter in the box.

If Nvidia has no plans to continue supporting VGA, it could mean we are finally nearing the end of a technological line that began nearly thirty years ago. VGA first came into existence in 1987 and has been a mainstay on PCs and monitors ever since.

HP’s 22cwa 21.5-inch 1080p monitor is modern, cheap, and rocking a legacy VGA option.

In fact, it’s not hard to find new flat-screen LCD monitors still rocking a VGA port. Just type “computer monitor VGA” into Amazon’s search box and you’ll find a number of options for going analog. This is largely because there’s still enough demand for the legacy technology from enterprises and hobbyists with older gear such as projectors and monitors.

But time may finally be running out for VGA. Both AMD and Intel said they would end chipset support for VGA by 2015, with Intel's Skylake platform ending native VGA support. AMD went as far as to phase out even DVI support in its Fury graphics card lineup. Now it looks like Nvidia may be following suit.

It’s hard to blame AMD, Nvidia, and their partners for dumping support for legacy technologies. DVI is no longer under active development and is far bulkier than HDMI and DisplayPort connectors. That means keeping these technologies imposes design constraints on newer cards.

The impact on you at home: Just because Nvidia’s reference cards are dumping native VGA support doesn’t necessarily mean it’ll actually disappear. If card manufacturers feel there’s high enough demand for analog they could add a DVI-I port to custom versions of the cards. But don’t count on it. When AMD did away with DVI last year, the custom graphics cards introduced by partners like Asus and Sapphire only added back a DVI-D port—which, of course, lacks native analog support.

IDG Insider

IDG Connect
