Choosing a Video Card for a Widescreen HD 1080 Monitor


For more than 50 years, televisions and then computer monitors were manufactured using the standard 4:3 ratio. This means that no matter how large the monitor was, its ratio of width to height was 4 to 3. Until quite recently, most people assumed that televisions and monitors had to be this shape. The only clue the average person had that this was not the case came from viewing a movie in a theater versus on a television. So-called pan-and-scan transfers cut off the picture at the left and right to fill a 4:3 television or monitor, while letterboxing preserved the wide image by adding black bars above and below it. The advent of widescreen monitors, now standard for new computer purchases, has opened a new era of computing by increasing the monitor’s viewable real estate and breaking out of the 4:3 standard.

Do You Need a Special Video Card for a Widescreen HD Monitor?

The short answer is no. The belief that widescreen HD monitors require special video cards derives from a combination of two sources. First, most Windows users know that you can change the display resolution by right-clicking on the desktop, clicking on settings, and then choosing a resolution from a drop-down menu. What most users do not know is that the choices displayed in that drop-down menu are what Windows, the video card manufacturer, and the monitor’s manufacturer want you to see. The fact is, most video cards are capable of displaying popular widescreen resolutions such as 1680x1050 and 1920x1080 HD, which are 16:10 and 16:9 resolutions respectively.
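Those ratio labels fall out of simple arithmetic: divide a resolution's width and height by their greatest common divisor. A quick sketch in Python, using the resolutions mentioned above:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a pixel resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return (width // d, height // d)

# 1920x1080 reduces to 16:9, the HD widescreen ratio.
print(aspect_ratio(1920, 1080))  # -> (16, 9)

# 1680x1050 reduces to 8:5, which is the same ratio as 16:10.
print(aspect_ratio(1680, 1050))  # -> (8, 5)

# A classic 1024x768 monitor reduces to the old 4:3 standard.
print(aspect_ratio(1024, 768))   # -> (4, 3)
```

Note that 1680x1050 reduces to 8:5 rather than literally 16:10; the two describe the same shape.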

Video cards are capable of displaying many more resolutions than those that show up in that drop-down menu. However, Windows, your graphics card manufacturer, and your monitor’s manufacturer do not want to confuse you with hundreds of choices. The choices in the menu are the most common resolutions that your monitor is capable of displaying. It is actually possible to damage some monitors, particularly older CRT models, by choosing an incorrect resolution, so these three players only show you the resolutions that are both safe and popular.
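The filtering described above can be pictured as a simple intersection: the card can drive a long list of modes, and the menu offers only those the monitor also reports as supported. A toy sketch; the mode lists here are illustrative, not what any real driver or monitor actually reports:

```python
# Hypothetical modes a video card can drive -- illustrative only.
card_modes = {
    (640, 480), (800, 600), (1024, 768), (1280, 720), (1280, 1024),
    (1366, 768), (1440, 900), (1600, 1200), (1680, 1050), (1920, 1080),
    (2048, 1536),  # ...and many more the menu never shows
}

# Hypothetical modes a widescreen monitor reports as supported.
widescreen_monitor_modes = {
    (1280, 720), (1366, 768), (1440, 900), (1680, 1050), (1920, 1080),
}

# The drop-down menu offers only the safe overlap of the two lists.
menu_choices = sorted(card_modes & widescreen_monitor_modes)
print(menu_choices)
```

Swap in a 4:3 monitor's mode list and the overlap contains only 4:3 choices, which is exactly why the menu changes when the monitor does.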

If you have a 4:3 monitor hooked up to your computer, the menu will only show 4:3 resolution choices. This is why so many people assume that you need a special graphics card to use a widescreen monitor: they assume that the menu lists every resolution the video card is capable of displaying.

The second reason has more to do with marketing than with user misconceptions. Marketing campaigns run by computer manufacturers must cater to the lowest common denominator, the least computer-savvy consumers. Consequently, these campaigns use slogans such as “widescreen capable” or “widescreen ready.” In fact, just about any video card from the past 10 years or so is “widescreen ready.” These marketing slogans are designed to attract the consumers least knowledgeable about computer components. By billing a computer as “widescreen ready,” manufacturers lead these consumers to believe that they need something special to run widescreen resolutions.

Although it is beyond the scope of this document to discuss the technical aspects involved, both Windows and your graphics card are capable of displaying custom (user-designated) resolutions. Many gamers choose custom, non-standard resolutions because they believe it gives them an advantage over other players in multiplayer video games. The truth is that your card can do far more than what Windows or hardware manufacturers tell you. Running a custom resolution simply involves registering the resolution with the driver so that it appears in the drop-down menu.


Do not assume that what Windows and hardware manufacturers tell you about monitor resolutions is the final word on what your graphics card can do. The two reasons discussed above show how the belief is perpetuated that a computer’s graphics card is limited to what Windows and computer manufacturers say it can do. The fact is, you probably need nothing special to run widescreen resolutions on your new widescreen monitor. Your graphics card is more than likely ready to serve up 16:10 or 16:9 high-definition resolutions. Of course, checking your card’s technical specifications will tell you for sure.