Video Graphics Array (VGA) was introduced as a display interface by IBM in 1987. It was quickly adopted as the industry standard in personal computer graphics hardware, and virtually every PC, display monitor, and graphics card produced after 1990 offers a VGA connection as its baseline interface. This made VGA the common denominator for video connectivity and signal reproduction between a computer and a monitor across the vast majority of machines sold. Today, however, VGA is considered somewhat outdated. This is largely due to the proliferation of digital interfaces – particularly High-Definition Multimedia Interface (HDMI) and Digital Visual Interface (DVI) – which serve as affordable, user-friendly alternatives offering superior signal quality when compared to analogue interfaces such as VGA.
VGA cables carry an analogue video signal, as opposed to the binary-encoded digital signal carried by both HDMI and DVI. A VGA connection is not strictly limited to low resolutions – a monitor can be driven at higher resolutions over VGA by utilising higher signal frequencies – but image quality suffers as a result. The visual quality of an analogue VGA signal depends directly on the quality and length of the cable, with significant degradation over longer cable runs, and this degradation becomes even more apparent when outputting higher resolutions through a VGA interface. As such, VGA cables are much more temperamental than both HDMI and DVI when it comes to signal reproduction: the quality of a VGA image varies noticeably with cable length and manufacturer, as well as internal cable design.
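To see why higher resolutions demand higher signal frequencies over an analogue link, consider a rough back-of-the-envelope calculation of the pixel clock a given video mode requires. This is an illustrative sketch only: the 1.35 blanking-overhead factor is an assumed approximation for typical timings, not a figure from any standard.

```python
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_factor=1.35):
    """Approximate the analogue pixel clock (MHz) needed for a video mode.

    blanking_factor is an assumed overhead for horizontal/vertical blanking
    intervals; real timings vary by standard and monitor.
    """
    return width * height * refresh_hz * blanking_factor / 1e6

# Compare the frequencies needed for a classic VGA mode versus HD modes.
for w, h, r in [(640, 480, 60), (1024, 768, 60), (1920, 1080, 60)]:
    print(f"{w}x{h}@{r}Hz -> ~{approx_pixel_clock_mhz(w, h, r):.0f} MHz")
```

Under these assumptions, 640x480 at 60 Hz needs a pixel clock of roughly 25 MHz, while 1920x1080 at 60 Hz needs around 168 MHz – nearly seven times higher, which is why cable quality and length matter so much more at high resolutions.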