- slide 1 of 5
Taking It Slow
As we mentioned, the difference between VGA and DVI is, on its own, almost never worth upgrading more than a cable. The previous article explains the different kinds of DVI cable and which type you will need. As you upgrade your equipment, however, you will want to move to the newer standard. DVI is backwards compatible with VGA, which makes your life a lot easier: you can upgrade to DVI one piece of equipment at a time. If you buy the right gear and the right cables (again, see the previous article), your DVI stuff will work with your VGA stuff just fine.
- slide 2 of 5
If you are shopping for a monitor but your computer only outputs an analog graphics signal, don’t let that hold you back from going DVI. Just make sure you get a monitor that has a VGA port, or that can at least accept analog input via a DVI-I port. Newegg lists this in monitor specifications, under Input Video Compatibility. Most monitors can accept analog input, but there are exceptions, particularly at the higher end, where manufacturers figure you are unlikely to plunk a grand down on a monitor if you don’t have a recent graphics card. If the monitor doesn’t have a VGA port, you will need an inexpensive VGA to DVI-I cable or adaptor. Such an adaptor, which was included with a BFG graphics card, is pictured at right. One can just make out the little plastic caps, included on good adaptors and cables, used to protect the connectors in transit.
Don’t skimp and buy a VGA-only LCD monitor. You will regret it the next time you upgrade your graphics. There is also a theoretical possibility that an unscrupulous manufacturer could use DVI-I’s ability to carry an analog signal to offer a DVI-equipped monitor that actually uses an analog VGA signal. I have never heard of this occurring, but keep an eye out for manufacturers you have never heard of and prices that compare too favourably to similar equipment. Make sure you don’t just have support for your existing analog gear, but are really moving to digital; again, this is something Newegg lists under Input Video Compatibility. You need “Analog RGB” to be backward compatible and “Digital” to be future proof.
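The matching rules described above are simple enough to sketch. The following is a purely illustrative model (the function and dictionary names are my own, not from any real tool): each connector type carries some set of signals, and a card and monitor can talk over whatever signal types they have in common, given a plain cable or passive adaptor.

```python
# Illustrative sketch: which signal types each connector can carry.
# Connector names follow the article; the code itself is hypothetical.
CONNECTOR_SIGNALS = {
    "VGA": {"analog"},
    "DVI-D": {"digital"},            # digital only
    "DVI-I": {"analog", "digital"},  # carries both
}

def compatible_signals(card_ports, monitor_ports):
    """Return the signal types a card and monitor can agree on,
    assuming a simple cable or passive adaptor between them."""
    card = set()
    for port in card_ports:
        card |= CONNECTOR_SIGNALS[port]
    monitor = set()
    for port in monitor_ports:
        monitor |= CONNECTOR_SIGNALS[port]
    return card & monitor

# An analog-only card can still drive a DVI-I monitor (via a VGA to
# DVI-I adaptor), but not a digital-only DVI-D monitor:
print(compatible_signals({"VGA"}, {"DVI-I"}))  # {'analog'}
print(compatible_signals({"VGA"}, {"DVI-D"}))  # set()
```

This is why the article recommends checking for “Analog RGB” support on the monitor: a DVI-I port keeps the analog path open, while DVI-D closes it.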
- slide 3 of 5
What about the Other Way?
Going the other way is even easier. If you are hanging on to a CRT clunker or VGA-only LCD while buying a new graphics card (or computer, if you don’t use graphics cards), go crazy. For the time being, graphics cards still have Digital to Analog Converters (DACs). Even if the card doesn’t have a VGA connector, it can use DVI-I to send the analog signal. You will just need an inexpensive DVI-I to VGA cable or adaptor.
Actually, using that adaptor or cable is a good idea. Most graphics cards (about two-thirds) have two DVI-I connectors rather than including a VGA one, and those that do have the VGA connector tend to be from the lower or older end of the spectrum. Using an adaptor greatly increases the selection available to you. Plus, the adaptors are often included with graphics cards.
- slide 4 of 5
Another detail to check out, if you want to play high definition content on your computer, or there is any chance you might decide to in the future, is HDCP. High-bandwidth Digital Content Protection is a DRM scheme included on some high def media. If your equipment doesn’t support HDCP, it won’t be able to deliver the media at full resolution, or at all. Looking for HDCP capable monitors and graphics cards doesn’t significantly drive up price or reduce selection, so you may as well get it. Even if you never make use of it, three years from now, the family member you give it to or person you sell it to might appreciate it.
This wasn’t the case only a couple of years ago. When HDCP showed up on media in 2006, it wasn’t in a lot of hardware on store shelves, let alone on users’ desks. Many people found that their monitors and graphics cards, even very expensive, recently purchased ones, would have to be replaced to play the new content.
So even if your monitor or graphics card uses a digital signal, it may not support HDCP. While you will benefit from digital video in computing applications, playing HDCP content will require an HDCP card and monitor. That means that if you are already half-way to DVI, you can still get to digital in one upgrade, but getting HDCP capability may require a second.
For instance, suppose your monitor is digital but your graphics card is analog. You buy a digital card and get the benefits of digital quality, but you can’t play HDCP content because your monitor doesn’t support it. The next time you upgrade your monitor, you can get an HDCP-ready one. You are now DVI/HDCP capable, provided you had the good sense to buy an HDCP-capable graphics card.
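The two-ends-required logic above can be sketched in a few lines. This is a hypothetical illustration (the function and its return strings are my own): digital picture quality needs DVI on both ends, and protected HD playback additionally needs HDCP on both ends.

```python
# Hypothetical sketch of the upgrade logic: each capability requires
# support on BOTH the graphics card and the monitor.
def playback_capability(card_digital, card_hdcp,
                        monitor_digital, monitor_hdcp):
    if card_digital and monitor_digital and card_hdcp and monitor_hdcp:
        return "digital picture + HDCP content"
    if card_digital and monitor_digital:
        return "digital picture, no HDCP content"
    return "analog picture only"

# Digital HDCP card paired with a digital monitor that lacks HDCP:
# you get digital quality, but protected content still won't play.
print(playback_capability(True, True, True, False))
# prints "digital picture, no HDCP content"
```

This is the scenario from the example above: one upgrade gets you the digital picture, but HDCP only arrives once the second component supports it too.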
In summary, you should look for HDCP and DVI support when you perform a monitor or graphics card upgrade. Neither standard restricts you to using only equipment that shares it, so you can work through the transitions one upgrade at a time. You don’t get the extra functionality immediately when upgrading just one of the monitor or graphics card, but, for little or no cost and little reduction in selection, it paves the way to bring in the advantages when you upgrade the other component.
- slide 5 of 5
Up Next: HDMI
Before you fork over any cash, our next article reveals that HDMI is identical to, or slightly better than, DVI at carrying digital video. Plus, HDMI is easier to use, smaller, and has many other features that make it more attractive. It hasn’t yet caught on to the extent DVI has, however, which can limit selection. And there is no difference between it and DVI in terms of how your monitor will look. This means it is only worth going to HDMI in some cases, which we explain in the next article.