Introduction
You want to know a secret? None of the current ATI or NVIDIA graphics cards will support the full capabilities of Windows Vista.
But let’s start from the beginning. This story begins with my upcoming LCD Monitor Round-Up. As you know, a good monitor should last several years and outlive every other component in your PC, other than perhaps a keyboard or a mouse. So, when it came time to do another review of LCD monitors, my attention turned towards “Windows Vista-ready” monitors: those with HDCP support. After all, it makes no sense to recommend a monitor that will become obsolete in just a few months.
At the time I started my article, there were only 10 PC monitors with DVI/HDCP support (we’re reviewing 5 of them). I was disappointed, but what surprised me is that many of these monitor manufacturers weren’t even advertising their HDCP support. For monitors, HDCP support is the single most important feature for a “future-proof” purchase.
What is HDCP?
HDCP stands for High-bandwidth Digital Content Protection, a content protection scheme initiated by Intel and developed with Silicon Image. It is mandatory for full-resolution playback of HD DVD and Blu-ray discs. If you want to watch movies at 1920x1080, your entire system will need to support HDCP. If you don’t have HDCP support, you’ll only get a quarter of the pixels. A 75% loss in picture detail is a pretty big deal. Wouldn’t you be angry if your car was advertised as doing 16 mpg, and you only got 4 mpg? Or if you bought a 2 GHz CPU and found out that it only ran at 500 MHz?
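To put numbers on that claim, here’s a quick back-of-the-envelope calculation. This is a minimal sketch assuming the commonly cited constrained-output behavior, where protected content is downscaled by half in each dimension when HDCP is missing:

```python
# Sanity check of the "quarter of the pixels" claim.
# Assumption: without HDCP, protected HD content is downscaled by
# half in each dimension (the commonly cited constrained-output behavior).

full_w, full_h = 1920, 1080
constrained_w, constrained_h = full_w // 2, full_h // 2  # 960 x 540

full_pixels = full_w * full_h                        # 2,073,600 pixels
constrained_pixels = constrained_w * constrained_h   # 518,400 pixels

loss = 1 - constrained_pixels / full_pixels
print(f"{constrained_w}x{constrained_h}: {loss:.0%} of the pixels are gone")  # 75%
```

In other words, 960x540 output carries only 25% of the pixels of a full 1080p frame, which is where the 75% figure comes from.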
As part of the Windows Vista-ready monitor article, I was going to publish a list of all the graphics cards that currently support HDCP. After all, I remember GPUs dating as far back as the Radeon 8500 boasting of HDCP support.
Turns out, we were all deceived.
Source: www.firingsquad.com/hardware/ati_nvidia_hdcp_support/default.asp
So for all of you who went out and splurged on a big graphics card... well, think again. This is a serious concern.