Early adopters have been wringing their hands for months over the news that many Windows Vista-capable PCs and monitors may not play HD DVD and Blu-ray content at their full high-definition resolutions--if at all--because the discs contain copy protection that requires special hardware to play. The good news: Vendors are finally beginning to ship that hardware. The bad news: Even now you have to choose your upgrades carefully, or you could end up with a bad picture.
HDCP for Your PC
People dramatically smarter than I am have already gone to great lengths to describe, discuss, and dissect the content-protection scheme of today's high-definition discs and how Windows Vista will handle them. Two of my favorites include articles by PCW's own Scott Spanbauer and the clever folks over at Ars Technica. If you want the nitty-gritty details, check out those articles.
Here's the simple version: To prevent evil pirates from duplicating their high-definition content, the studios implemented a content-protection scheme that will allow full-resolution playback of Blu-ray or HD DVD movies on your Vista PC only if you have a graphics card and monitor that include support for a technology called HDCP (High-bandwidth Digital Content Protection).
Here's another take: If you've recently spent a small fortune on a high-end graphics card (or two) and a giant LCD monitor (or two) in order to watch hi-def Blu-ray or HD DVD content later, there's a pretty good chance you're about to get very angry.
How angry? Full-resolution playback means the ability to view this content at the high-def standard resolution of 720p, 1080i, or 1080p. So if either your graphics card or your monitor lacks an HDCP-based digital output--instead using an analog VGA or component connection--the operating system will either a) downgrade the content to a much lower resolution such as 540p or b) block it altogether.
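The decision logic above can be modeled as a short sketch. To be clear, the function name and its parameters are illustrative assumptions, not part of any real Windows API; the resolutions are the ones discussed in this article:

```python
# Illustrative model of the HDCP playback policy described above.
# Names and parameters are hypothetical, not any real Vista API.

def playback_resolution(card_hdcp, monitor_hdcp, digital_link, strict=False):
    """Return the resolution the player allows, or None if playback is blocked."""
    if card_hdcp and monitor_hdcp and digital_link:
        return "1080p"   # full-resolution path: HDCP end to end over a digital link
    if strict:
        return None      # disc flags could demand blocking playback outright
    return "540p"        # otherwise content is downgraded to a lower resolution

print(playback_resolution(True, True, True))    # HDCP card + monitor + digital link
print(playback_resolution(True, False, True))   # non-HDCP monitor: down-rezzed
```

The point of the sketch: the weakest link wins. An analog VGA or component connection anywhere in the chain, or one non-HDCP device, is enough to lose full resolution.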
Feel free to howl in disapproval now. Trust me, you're not alone (this applies to HD televisions, too).
Now, to be clear, the Hollywood studios behind Blu-ray and HD DVD haven't said if or when they'll actually enable the technology that blocks playback. In fact, it looks like most first-generation discs don't have the feature turned on. According to Richard Doherty, research director for consumer research firm Envisioneering, the studios aren't about to set a firm date for implementation.
"I keep hearing 'Not anytime soon,'" he says, "but nobody will put a date on it." Doherty points out that about 30 million satellite receivers have their own content-protection technology built in, but nobody's turned it on yet.
He may have more faith in the industry than the average PC user. I'm with PCW's resident Blu-ray and HD DVD expert, Senior Associate Editor Melissa Perenson, who says simply, "They put it in there for a reason."
Oh, and if you think you're going to do an end-run around this issue by sticking with Windows XP, you can forget about it. As Spanbauer points out, Windows XP's security and driver models lack the ability to support HDCP.
Finding HDCP Stuff
For some time now ATI and nVidia have been saying that their latest graphics chips offer "HDCP support." Unfortunately, as enthusiast site FiringSquad points out, just because a GPU supports HDCP doesn't mean the graphics board where it resides does, too.
Truth is, beyond a few isolated products, the June launch of boards based on nVidia's dual-chip GeForce 7950 GX2 package was the first across-the-board implementation of HDCP. And just last week several vendors announced plans to launch additional HDCP-enabled cards based on both nVidia and ATI chips.
As always, the new technology will appear first at the top end and then trickle down. For example, a spokesperson for board maker XFX says it will soon launch high-end 7900 GTX boards with HDCP support, while midrange and low-end boards will ship later.
Once you have the right type of graphics board, you need to connect it to the right type of monitor--which means matching the board's connector to the monitor's. Currently, two digital graphics-card-to-monitor connections support HDCP: DVI and HDMI.
The familiar DVI (Digital Visual Interface) port is what you'll find on most of today's high-end graphics cards and monitors. Unfortunately, an HDCP-enabled port looks just like the garden-variety DVI port.
An HDMI (High-Definition Multimedia Interface) connection should be familiar to owners of recent-vintage HD televisions. HDMI combines audio and video into one handy connection, but it's still fairly new on graphics cards and monitors.
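The matching rule boils down to this: the card and the monitor must share at least one HDCP-capable digital port type. A minimal sketch, assuming hypothetical port lists rather than any real device query:

```python
# Hypothetical check: do a card and monitor share an HDCP-capable
# digital connection? Per the article, DVI and HDMI qualify;
# VGA and component are analog and do not.
HDCP_CAPABLE = {"DVI", "HDMI"}

def hdcp_link_possible(card_ports, monitor_ports):
    """True if both devices offer at least one common HDCP-capable port type."""
    return bool(HDCP_CAPABLE & set(card_ports) & set(monitor_ports))

print(hdcp_link_possible({"DVI", "VGA"}, {"DVI"}))      # shared DVI: True
print(hdcp_link_possible({"VGA"}, {"DVI", "HDMI"}))     # analog-only card: False
```

The catch, as noted above, is that a physical DVI port alone proves nothing--the port must actually be HDCP-enabled, and it looks identical either way, so you still have to check the spec sheet.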
Now it's time to find a monitor. Here again, things get complicated. For example, Dell's current generation of well-reviewed wide-screen monitors--including its 20.1-inch, 24-inch, and 30-inch models--all support HDCP. Unfortunately, none of its standard-width models currently support the standard. A Dell spokesperson says there's a simple reason for that: The HDCP technology wasn't ready when the company launched its current 17- and 19-inch standard-aspect-ratio monitors. The company will likely add the support to future revs of the monitors.
You'll find similar situations with other major vendors such as NEC, Samsung, Sony, and ViewSonic. So just because you know that one of a company's LCDs offers HDCP support, never assume its other products also support the standard. Be sure to read the fine print.
Regardless, once you've laid out the cash for your HDCP-enabled graphics card and monitor, and bought a $1000 Blu-ray drive, you're all set to watch HD on your PC. Now all you have to do is sit back and wait patiently for Microsoft to ship Vista...