Today’s video technology is moving away from analog (VGA) toward digital connection formats (HDMI, DVI, DisplayPort). However, there are still situations where you need to convert between display technologies, usually because the ports on the computer and the display don’t match. Video signals can be adapted in either direction (DVI to VGA and VGA to DVI), but there are limitations and significant cost differences.

DVI to VGA

A DVI-to-VGA adapter is a cable or a small device. It allows a display with a VGA connector (input) to receive an analog signal from a DVI-A (analog only) or DVI-I (analog and digital) connector on a computer’s GPU. These adapters are inexpensive; they are occasionally packaged with video cards and are available from many online retailers for around $5.

VGA to DVI

There’s also the much less common scenario in which the GPU in an older computer needs to send its signal to a newer display that has no VGA connector. In this case, a VGA-to-DVI converter is required to create a digital representation of the analog signal. These devices are harder to find and often cost upwards of $100. If you find yourself in this situation, we recommend upgrading your video card or buying a new computer instead.

Signal quality considerations

Please note that converting between these two technologies may result in a loss of video quality. VGA, even in its highest-resolution form, SVGA, is only capable of an 800 x 600-pixel resolution. Therefore, if your computer is sending out a higher-resolution DVI signal, it will be restricted to the lower resolution of VGA. Additionally, analog VGA signals are susceptible to quality loss depending on how the cable was made.
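
To put that resolution gap in concrete terms, here is a quick back-of-the-envelope comparison. The 1920 x 1080 DVI source resolution is an assumed example for illustration, not a figure from this article.

# Rough pixel-count comparison between an assumed DVI source resolution
# and the 800 x 600 ceiling of SVGA.
dvi_width, dvi_height = 1920, 1080    # example DVI source resolution (assumption)
vga_width, vga_height = 800, 600      # SVGA maximum

dvi_pixels = dvi_width * dvi_height   # 2,073,600 pixels
vga_pixels = vga_width * vga_height   # 480,000 pixels

print(f"DVI source: {dvi_pixels:,} pixels")
print(f"SVGA output: {vga_pixels:,} pixels")
print(f"Reduction: roughly {dvi_pixels / vga_pixels:.1f}x fewer pixels")

In this example the adapted picture carries only about a quarter of the pixels the DVI source can produce, which is why the downgrade is noticeable on larger screens.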