Computers and other media devices use video interfaces to connect to displays and put content on-screen. There are multiple competing standards, which can cause some confusion. Should you go for HDMI? What if you want to enjoy 8K content? We take a look at the most popular interfaces and explain what each offers and when you should choose it.
Here’s a list of the more common interfaces found on monitors, graphics cards and other media devices:
DisplayPort is a relatively new standard compared to others in our round-up. It can comfortably output 4K content, and 8K if you're rocking version 1.3 support. The latest release is 1.4, which introduced numerous improvements and pushed the standard further into supporting higher-quality content. It has also become a common sight alongside HDMI on various consumer devices.
As well as being able to drive super-high resolutions like 7680×4320 (this is 8K, folks) at 60Hz, it's also possible to "chain" multiple displays from a single DisplayPort outlet thanks to Multi-Stream Transport (MST). Just keep an eye on the maximum output supported by the version of DisplayPort you have: with DisplayPort 1.4, for example, you could drive two 4K displays with only one hook-up.
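To see why those resolution caps fall where they do, here's a back-of-the-envelope sketch in Python. The effective link rates (about 17.28 Gbps for DisplayPort 1.2's HBR2 and 25.92 Gbps for 1.3/1.4's HBR3, after 8b/10b line coding) come from the DisplayPort spec; the calculation deliberately ignores blanking overhead, so treat it as a rough budget check, not a timing calculation.

```python
# Rough check of uncompressed video payload vs. DisplayPort link rates.
# Effective bandwidths (after 8b/10b coding) per the DisplayPort spec:
# HBR2 (DP 1.2) ~17.28 Gbps, HBR3 (DP 1.3/1.4) ~25.92 Gbps.

def payload_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s (ignores blanking overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HBR2 = 17.28  # Gbps, DisplayPort 1.2
HBR3 = 25.92  # Gbps, DisplayPort 1.3/1.4

one_4k = payload_gbps(3840, 2160, 60)  # ~11.9 Gbps: fits within HBR2
two_4k = 2 * one_4k                    # ~23.9 Gbps: fits HBR3, hence 2x 4K over MST
one_8k = payload_gbps(7680, 4320, 60)  # ~47.8 Gbps: exceeds even HBR3 uncompressed

print(f"4K60: {one_4k:.1f} Gbps, 2x 4K60: {two_4k:.1f} Gbps, 8K60: {one_8k:.1f} Gbps")
```

Note that 8K at 60Hz exceeds even HBR3 uncompressed at 8-bit color, which is why it relies on chroma subsampling or, on DisplayPort 1.4, Display Stream Compression.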
DisplayPort is also utilized by AMD for FreeSync, which is another bonus for gaming and media consumption. We've gone into some detail about AMD FreeSync (as well as NVIDIA G-Sync) before, but it essentially reduces screen tearing and input lag by synchronizing the display's refresh rate with the graphics card's output. DisplayPort can be harder to find on monitors and TVs than the more widely available HDMI, but many consider it the go-to interface for PC displays.
Bottom line: DisplayPort is perfect for those who require FreeSync/G-Sync support for gaming, or who are looking to future-proof their setup for 4K and 8K experiences.
High Definition Multimedia Interface (HDMI) is arguably the most popular video interface in use today. It's on TVs, video game consoles, and most graphics cards and laptops. It can power 4K content, much like DisplayPort, but refresh rates depend on which version of HDMI your device sports. HDMI 2.0 brought increased bandwidth back in 2013 (18 Gbps, up from 1.4's 10.2 Gbps), allowing more data to be transferred between devices, while 1.4 caps out at 4096×2160 (4K) at 30Hz.
HDMI 2.0 not only brought 4K at 60Hz, it also enabled greater color depth for HDR content, which is set to be the next big hit for gaming and media consumption with the Xbox One S. Most new products ship with HDMI 2.0, while version 2.1 is expected to further enhance HDR and UHD experiences. You will, of course, need a display that supports HDR color depth and UHD resolution to take advantage of these technologies.
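The 4K30-versus-4K60 split between HDMI versions comes down to the TMDS clock ceiling. A quick sketch, using the maximum TMDS clocks from the HDMI specs (340 MHz for 1.4, 600 MHz for 2.0) and the standard CTA-861 timing totals for 3840×2160 (which include blanking):

```python
# Why HDMI 1.4 tops out at 4K30 while HDMI 2.0 handles 4K60:
# the TMDS pixel-clock ceiling. Caps per the HDMI specifications.

HDMI_1_4_MAX_MHZ = 340
HDMI_2_0_MAX_MHZ = 600

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a full timing (active pixels + blanking)."""
    return h_total * v_total * refresh_hz / 1e6

# Standard CTA-861 totals for 3840x2160 are 4400 x 2250
uhd30 = pixel_clock_mhz(4400, 2250, 30)  # 297 MHz: fits under HDMI 1.4's cap
uhd60 = pixel_clock_mhz(4400, 2250, 60)  # 594 MHz: only fits under HDMI 2.0's cap

print(f"4K30 needs {uhd30:.0f} MHz, 4K60 needs {uhd60:.0f} MHz")
```

Since 594 MHz blows well past HDMI 1.4's 340 MHz limit but sits under 2.0's 600 MHz, 4K at 60Hz needs HDMI 2.0 end to end.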
FreeSync can also work over HDMI, so long as your monitor supports it. Be sure to do a little digging before buying your next monitor to confirm it does.
Bottom line: Generally speaking, HDMI is your best bet to have peace of mind that it’ll be compatible with everything else you plan to hook up. It’s not quite as advanced as DisplayPort, but it’s pretty much found on all TVs, graphics cards and other products.
Digital Visual Interface, better known as DVI, is a common connection for hooking a PC up to a monitor, and is generally considered the successor to the older VGA standard. To complicate matters for consumers, there are actually numerous versions of DVI:
- DVI-A — Analog signal
- DVI-D — Digital signal
- DVI-I — Integrated digital and analog signal
You're more likely to find DVI-D and DVI-I on today's graphics cards and monitors, since the older DVI-A isn't really up to scratch anymore. These two versions also come in single- and dual-link variants, which dictate just how much data the cable can carry. Single-link tops out at 1920×1200, while dual-link bumps things up to 2560×1600. DVI is still popular, but is slowly being phased out in favor of HDMI and DisplayPort.
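Those single- and dual-link caps follow from DVI's pixel-clock limit: a single link carries at most a 165 MHz pixel clock, and dual-link doubles the data pairs for an effective 330 MHz. A quick check, assuming reduced-blanking (CVT-RB) timing totals at 60Hz:

```python
# Single-link DVI is capped at a 165 MHz pixel clock; dual-link
# doubles the TMDS data pairs for an effective 330 MHz.
# Timing totals below are CVT reduced-blanking (CVT-RB) at 60 Hz.

SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 330

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a full timing (active pixels + blanking)."""
    return h_total * v_total * refresh_hz / 1e6

wuxga = pixel_clock_mhz(2080, 1235, 60)  # 1920x1200: ~154 MHz, single link is enough
wqxga = pixel_clock_mhz(2720, 1646, 60)  # 2560x1600: ~269 MHz, needs dual link

print(f"1920x1200@60: {wuxga:.0f} MHz, 2560x1600@60: {wqxga:.0f} MHz")
```

So 1920×1200 squeezes under the single-link ceiling only with reduced blanking, while 2560×1600 clearly requires dual-link.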
Bottom line: Looking to hook up multiple monitors and don't need to go beyond 1440p? Dual-link DVI-D or DVI-I is more than enough (single-link tops out below 1440p), though you'll need to switch to HDMI or DisplayPort for 4K and above.
VGA is super old. Video Graphics Array is a format that has been around since the days of CRT monitors and has largely been phased out in the home. In practice you'll typically top out at 1920×1080, and many modern graphics cards don't even have VGA ports anymore (though your motherboard may have one for diagnostic and backup purposes).
1080p isn't the ceiling for VGA in a technical sense, however. Depending on connectors and cabling, it's possible to achieve higher resolutions, albeit at reduced quality compared to other interfaces. Luckily, if you happen to be stuck with a VGA output on a motherboard or an input on an older monitor, there are converters available that take the analog VGA signal and turn it into the more widely supported HDMI or DisplayPort.
Max. resolution: 2048×1536*
Bottom line: Don't use VGA unless you absolutely have to. It remains useful mainly for connecting older devices, displays and projectors.
* Depends on components and cabling deployed.