DVI-D adapters - How do they work?

My mate is over at my place trying to connect to my projector, which only has a VGA interface.
His video card only has two DVI-D connectors and one HDMI connector, and he forgot to bring the VGA adapter that came with his card.

So I grabbed my DVI adapter, only to discover it won't fit because it's for DVI-I, not DVI-D.

His adapter looks like this:
http://images.cecompass.com/productimages/D/DVI_D_VGA_ADAP/DVI_D_VGA_ADAP_B.jpg

Since DVI-D basically has no analog pins, I'm curious how his VGA adapter would work, given that only a digital signal is output.

I'm reading two different stories here. The first is that these adapters work with most modern devices that use a VGA connector, because such devices can detect when a digital signal is carried instead of analog and switch accordingly.
The second is that some video cards can send an analog signal down the DVI interface and automatically switch between analog and digital depending on the device connected.
I don't know if either piece of information is correct, but the latter seems more plausible.

I'm also reading that these adapters don't work for a lot of people, so it appears that not all video cards support them.

Either way, it's a real nuisance if I need to connect a VGA display to such video cards.
It seems that only the newer video cards have gone this way. I don't know the reason for using DVI-D; perhaps it's a cost-cutting measure, if there's no DAC chip on the board?

geek_nzoomed, Jun 27, 11:00 am

I've got a new GTX-980 with a DVI-I port, so I don't think we've seen the last of them. It also has a DVI-D port, which cannot be used with a passive adapter.

geek_stevexc, Jun 27, 2:49 pm

Manufacturers have to start phasing out VGA support sometime to force people to upgrade hardware.

geek_spyware, Jun 27, 7:30 pm

Skylake won't support VGA, I think, so no more native VGA support from Intel CPUs.

geek_ross1970, Jun 27, 7:38 pm

But isn't it all done with a separate DAC chip anyway?
VGA is such a widely used interface that pretty much all TVs have it.

Most laptops I buy still have a VGA connector too, and it seems most people still use it for presentations, so I think it will be years before VGA goes away.

Anyway, none of this answers my question, as the video card in question is able to connect to VGA despite the interface not actually having any analog pins on the plug.

geek_nzoomed, Jun 27, 7:51 pm

DVI-D is digital; VGA is analogue. A simple passive converter will not work. Notice how all DVI-to-VGA adapters have four pins around the wide flat pin on the DVI side; a DVI-D plug does not have those pins. The colour channels on a VGA plug match those pins on a DVI-I plug, but match nothing on a DVI-D plug.
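To illustrate: here's a rough sketch of the passive wiring (pin labels follow the DVI and VGA DE-15 specs; the helper function and data structures are made up purely for illustration, not from any real driver):

```python
# Analog contacts that exist on a DVI-I plug but are absent on DVI-D.
DVI_I_ANALOG_PINS = {
    "C1": "analog red",
    "C2": "analog green",
    "C3": "analog blue",
    "C4": "analog horizontal sync",
    "C5": "analog ground (return for C1-C4)",
}

# A passive adapter simply wires each DVI analog pin straight through
# to the matching VGA (DE-15) pin -- no conversion chip involved.
PASSIVE_WIRING = {
    "C1": 1,    # VGA pin 1: red video
    "C2": 2,    # VGA pin 2: green video
    "C3": 3,    # VGA pin 3: blue video
    "C4": 13,   # VGA pin 13: horizontal sync
    "8":  14,   # DVI pin 8 (analog vertical sync) -> VGA pin 14
}

def passive_adapter_works(connector_pins: set) -> bool:
    """A passive adapter only carries a picture if every analog source
    pin it wires up actually exists on the graphics card's connector."""
    return all(pin in connector_pins for pin in PASSIVE_WIRING)

dvi_i_pins = {"C1", "C2", "C3", "C4", "C5", "8"}  # DVI-I: analog pins present
dvi_d_pins = set()                                 # DVI-D: no analog contacts

print(passive_adapter_works(dvi_i_pins))  # True: signal passes straight through
print(passive_adapter_works(dvi_d_pins))  # False: nothing to wire to
```

The point being that the adapter is just wire: if the card never drives those analog pins (or the plug doesn't have them), there is nothing for the VGA side to receive.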

geek_acura, Jun 27, 8:02 pm

Those passive adapters don't actually work for this - there are a lot of different adapters and cables out there which aren't really useful in any way.

geek_vtecintegra, Jun 27, 9:36 pm

What model video card BTW?

Almost all cards still have a DVI-I connector (aside from the new AMD cards, which have no DVI at all).

geek_vtecintegra, Jun 27, 9:38 pm

There is a cheap hardware chip to do the conversion - the same as in all the HDMI-to-VGA adapters. It's usually even powered from the HDMI port.

geek_richms, Jun 27, 11:06 pm

It's an XFX R9-290. The adapter that shipped with his card from the factory works perfectly, so there must be some hardware conversion being done, either on the card or by a chip in the adapter itself that's powered from the DVI port - I don't know which. I have seen adapters on eBay that do require power from a USB port, and I expect those will do the conversion properly, but I do find it very unusual that this card has no DVI-I port.

geek_nzoomed, 6 days, 12 hours
