VGA is short for Video Graphics Array. A VGA port uses a 15-pin D-sub (DE-15) connector, usually located on the back or side of a computer. For many years VGA was the standard interface for connecting a monitor or projector to a computer.
The VGA standard was developed by IBM in 1987 as a replacement for the MDA, CGA and EGA standards. To understand why the VGA video card mattered at the time, here is a summary of the standards that preceded it:
- The MDA (Monochrome Display Adapter) standard displayed only monochrome text at a resolution of 720×350 pixels and was used for non-graphics applications;
- The CGA (Color Graphics Adapter) standard added color, but graphics were limited to 320×200 pixels with only 4 colors on screen;
- The EGA (Enhanced Graphics Adapter) standard offered 16 colors and graphics at up to 640×350 pixels.
VGA raised the display resolution of the time to 640×480 pixels, with 256 simultaneous colors chosen from a palette of 262,144. Shortly thereafter, resolution rose to 800×600, which became the default setting for Windows operating systems until the 2000s.
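Those two color figures follow directly from the hardware: VGA's DAC stores 6 bits per red, green and blue channel (18 bits total), while each pixel holds an 8-bit index into a 256-entry palette. A minimal sketch of the arithmetic:

```python
# VGA palette arithmetic:
# - the DAC uses 6 bits per R/G/B channel -> 18-bit palette entries
# - each pixel is an 8-bit index into a 256-entry palette
bits_per_channel = 6
palette_colors = 2 ** (3 * bits_per_channel)  # total displayable colors
bits_per_pixel = 8
simultaneous_colors = 2 ** bits_per_pixel     # colors on screen at once

print(palette_colors)       # 262144
print(simultaneous_colors)  # 256
```

So "256 colors from a palette of 262,144" means any 256 of the 2^18 possible DAC colors can be loaded into the palette at one time.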
Today the VGA standard has been superseded by SVGA and, on modern computers, by DVI and HDMI, which allow higher resolutions. Even so, many computers still include a VGA port, and adapters that convert a DVI or HDMI signal to VGA are available if you need such an output and your computer doesn't have one.
The same is true of modern LCD and LED flat-panel monitors and televisions, many of which still offer a VGA connector as one of their two or three inputs for connecting a computer.