Difference Between VGA and DVI
Video Graphics Array (VGA) and Digital Visual Interface (DVI) are two methods of connecting a monitor to a computer. The main difference between the two is that VGA is an analog standard while DVI is a digital one.
DVI is the next logical step for connecting a computer to a monitor. Video signals originate as digital data but are converted to analog before leaving the computer's GPU through the VGA port. VGA was created because all of the monitors at the time were based on CRTs, which are analog in nature. Instead of transmitting digital data and having the monitor convert it to an analog signal, having the GPU do the conversion before transmission was the more economical route.
The advent of LCDs meant that the data was digital at both the origin and the destination, yet it still had to be converted to an analog signal because VGA was the standard interface at the time. This placed an additional load on both the graphics card and the monitor. The conversion could also lead to inaccurate output, especially on LCDs, where the signal had to be converted back to digital. Certain pixels might not appear as they should because of the multiple conversions applied to the data.
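The effect of that digital-to-analog-to-digital round trip can be sketched in a few lines. This is a simplified illustration, not a model of real VGA hardware: the 0.7 V signal range is the nominal VGA video level, but the small gain error in the hypothetical `dac` function is an invented stand-in for real-world converter imperfections.

```python
# Sketch: why a digital -> analog -> digital round trip can alter pixel values.
# The 0.5% gain error is illustrative, not a measured VGA figure.

def dac(value, gain=1.005):
    """Convert an 8-bit pixel value to an 'analog' voltage (0-0.7 V range)."""
    return (value / 255.0) * 0.7 * gain  # slight gain error in the converter

def adc(voltage):
    """Convert the analog voltage back to an 8-bit pixel value."""
    code = round((voltage / 0.7) * 255)
    return max(0, min(255, code))  # clamp to the valid 8-bit range

original = [0, 100, 200, 255]
recovered = [adc(dac(v)) for v in original]
changed = [(o, r) for o, r in zip(original, recovered) if o != r]
print(changed)  # some pixel values come back slightly different
```

A purely digital link has no such converters in the path, so the pixel values arrive bit-for-bit identical.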
Later on, DVI was added to most LCD monitors and graphics cards so that the digital data could be transmitted without modification or conversion. This meant that each pixel on the LCD would appear exactly as the computer intended, since no conversion was involved. DVI would eventually become widespread enough to replace VGA entirely and render it obsolete.
The digital nature of the information passing through a DVI cable also means there is a lower probability of the signal being distorted in transit. Digital signals are discrete, so minor changes do not affect the final data. The analog signal in a VGA cable, by contrast, can be distorted, especially when the cable is not properly shielded; this can lead to flickering or banding on the monitor. Despite being superior to VGA, DVI cables must still stay within a maximum length so that data loss does not occur.
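The noise-resilience point can be demonstrated with a toy simulation. The noise amount, signal levels, and 0.5 threshold below are all illustrative choices, not properties of real DVI or VGA electronics:

```python
import random

random.seed(42)  # deterministic noise for a repeatable demo

def add_noise(samples, amount=0.05):
    """Add small random distortion, as an imperfect cable might."""
    return [s + random.uniform(-amount, amount) for s in samples]

# Analog: the voltage itself encodes the brightness, so any noise
# shifts the displayed value directly.
analog = [0.25, 0.50, 0.75]
analog_received = add_noise(analog)

# Digital: each bit is sent as a high or low level and re-thresholded
# at the receiver, so small distortion is discarded entirely.
bits = [0, 1, 1, 0, 1]
levels = [1.0 if b else 0.0 for b in bits]
digital_received = [1 if v > 0.5 else 0 for v in add_noise(levels)]

print(analog_received)           # slightly wrong brightness values
print(digital_received == bits)  # True: the bits survive intact
```

Only when the distortion grows large enough to push a level past the threshold (as on an over-long cable) does the digital link fail, which is why DVI still has a maximum cable length.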
Summary:
1. DVI is digital while VGA is analog
2. VGA is for CRT monitors while DVI is best for LCD monitors
3. Using VGA for LCD monitors leads to multiple conversions that might slightly alter the final image
4. DVI is newer and would soon make VGA obsolete
5. Both DVI and VGA cables are still limited to a maximum length