Need Video Card for...Video!?
October 26, 2008 1:37 PM

What's the best video card to get for xvid/divx and h.264 decoding?

Long story short: I've given up on Cable TV, most of what I watch now is either streamed from hulu and netflix or downloaded from the internets. Getting a faster connection has improved my streaming performance, but watching ostensibly "HD" h.264 videos has been somewhat less than satisfying. There's still a lot of noise, and sometimes visible interlacing.

I'm using Media Player Classic under Vista Ultimate for the most part, although I've got VLC and WMP and Quicktime and everything else on this machine. I'm running a GeForce 7600GT with component out on a Core 2 Quad machine.

I've heard here and there that newer video cards "help out" with video decoding, but I don't really need to free up CPU time or anything; what I'm after is better-looking output. Games are less of a concern since I'm more or less a full-time Xbot now, so if the GeForce 9600 does it as well as the 9800 GTX+ or whatever, I'd rather save the money. Also, since I'm outputting to an LCD HDTV, would simply going from a component video connection to an HDMI connection solve my problem?
posted by Oktober to Computers & Internet (7 answers total) 2 users marked this as a favorite
 
Best answer: I'm not sure that a video card is really what you're after. As far as I know, offloading the decoding to the video card only helps in terms of CPU utilization, not video quality. If you're getting interlacing and noise while watching on a computer, it sounds like the video was just improperly encoded.

I recently upgraded to a budget 2.0 GHz C2D system and was surprised to find that it still couldn't decode 1080p in software without stuttering. Given that my old P4 could do 720p without a problem, I didn't think 50% more resolution would be that taxing for a much faster, dual-core CPU.
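
(Back-of-the-envelope arithmetic, just to show the jump is bigger than "50% more resolution" suggests:)

```python
# 1080p has roughly 2.25x the pixels of 720p per frame, so the decode
# workload grows far more than the 50% jump in line count implies.
pixels_720p = 1280 * 720      # 921,600 pixels/frame
pixels_1080p = 1920 * 1080    # 2,073,600 pixels/frame

print(f"720p:  {pixels_720p:,} px/frame")
print(f"1080p: {pixels_1080p:,} px/frame")
print(f"ratio: {pixels_1080p / pixels_720p:.2f}x")  # ~2.25x
```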

I ended up buying a Radeon 3650 and am using it in conjunction with Media Player Classic Home Cinema, a fork of the no-longer-developed original. It supports hardware H.264 (DXVA) acceleration for Radeon HD and GeForce 8/9-series cards, along with a host of other improvements aimed at HTPC users. I'm now playing 1080p Blu-ray re-compresses with 0% CPU utilization.
posted by gngstrMNKY at 2:07 PM on October 26, 2008


Response by poster: I don't think there's a magic bullet here, really. I figure that by making tweaks at every level (MPC-HC instead of MPC, HDMI instead of component, hardware decoding instead of software...) I'll improve the overall quality.
posted by Oktober at 2:40 PM on October 26, 2008


Are you outputting directly at your LCD's native resolution?

Hardware decoding instead of software should make exactly zero difference to video quality, except for frame rate if your CPU can't keep up: a given frame decodes to the same bits whether the work is done in hardware or in software.

I expect that mostly you're just seeing the limits of low-bitrate HD. The stuff you download, and even more so the stuff you stream, is usually cut down to 10 Mb/s or (much) less, versus the roughly 20 Mb/s you get from OTA HD or the up to 48 Mb/s you can get from a Blu-ray disc.
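
To put rough numbers on it (ballpark figures only, assuming 1080p at 24 fps):

```python
# Rough bits-per-pixel comparison for the bitrates above (ballpark values).
sources = {
    "streamed/downloaded HD": 10e6,  # ~10 Mb/s or (much) less
    "OTA broadcast HD":       20e6,  # ~20 Mb/s
    "Blu-ray":                48e6,  # up to ~48 Mb/s
}

pixels_per_frame = 1920 * 1080
fps = 24

for name, bitrate in sources.items():
    bpp = bitrate / (pixels_per_frame * fps)
    print(f"{name:24s} {bitrate / 1e6:4.0f} Mb/s -> {bpp:.2f} bits/pixel/frame")
```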
posted by ROU_Xenophobe at 3:41 PM on October 26, 2008


Response by poster: Yeah, for the most part I'm watching 1280x720 anime episodes that are being output to a 720p LCD.
posted by Oktober at 3:52 PM on October 26, 2008


Best answer: The deal is: if you have the correct H.264/MPEG-2/VC-1 decoders set up to use DXVA, high-end video cards can perform hardware deinterlacing (and occasionally IVTC) that is superior in quality to what you can accomplish with software deinterlacing at playback. That said, properly encoded 720p material shouldn't have interlacing artifacts, period, and high-quality software deinterlacing done before re-encoding should in most cases be indistinguishable from hardware deinterlacing at playback.

Anime is a special case because of the weird field patterns and mix of source frame rates. If your source material has been resized before deinterlacing and/or is flagged as progressive, no video card will fix the problem.
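
If you want to check how a given file is actually flagged, here's a quick sketch (it assumes ffprobe is installed and on your PATH; field_order is the relevant stream property):

```python
# Sketch: ask ffprobe whether the video stream is flagged progressive or interlaced.
# A file flagged "progressive" won't benefit from any deinterlacer, hardware or not.
import json
import subprocess
import sys

def stream_flags(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=width,height,field_order",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    s = json.loads(out)["streams"][0]
    return s["width"], s["height"], s.get("field_order", "unknown")

if __name__ == "__main__":
    w, h, order = stream_flags(sys.argv[1])
    print(f"{w}x{h}, field_order={order}")
```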

IMO, you don't have a hardware deficiency, but some of the material you're watching was improperly processed.
posted by Inspector.Gadget at 4:07 PM on October 26, 2008


You're unlikely to see any improvement over CoreAVC (a fast software decoder) by buying a high-end graphics card, and a high-end card isn't even necessary for hardware (DXVA) h.264 decoding - the most recent integrated chipsets (780G motherboards) can handle it, as can the last generation or two of low-end discrete cards (e.g. the Radeon HD 3450).
It sounds like your quality issues stem from lower-quality source files; the way to improve that material is to use post-processing filters to mask the artifacts. The simple way to do this is to insert FFDshow into the playback chain; people at AVSForum have come up with much more elaborate methods, however.
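
If you'd rather bake the cleanup into the file than configure it in the player, here's a rough equivalent using ffmpeg's filters instead of FFDshow (yadif for deinterlacing, hqdn3d for light denoise; the filter strengths and filenames are just placeholders to tune):

```python
# Sketch: re-encode with a deinterlace + mild denoise chain via ffmpeg.
# This approximates a player-side post-processing chain; values are placeholders.
import subprocess

def clean_up(src, dst):
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-vf", "yadif=mode=1,hqdn3d=2:1:2:3",  # deinterlace, then mild denoise
         "-c:v", "libx264", "-crf", "20", "-preset", "slow",
         "-c:a", "copy",
         dst],
        check=True,
    )

clean_up("episode.mkv", "episode_clean.mkv")
```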
posted by unmake at 8:33 PM on October 26, 2008


Artifacts in video are almost always going to come from the source material, from the encoder (improper deinterlacing), or from low bitrates. Film grain can be a desired effect. Your decoder is not the problem. Other than deinterlacing (which is hard to do properly, especially in real time), there's nothing you can do to improve quality.

If you're having problems with 1080p playback, a much cheaper solution ($15) would be to buy CoreAVC for decoding. It is very fast and can use multiple CPU cores, unlike most other software H.264 decoders at the moment. I see 50-60% CPU usage on an original Core 2 Duo.
posted by easyasy3k at 8:35 PM on October 26, 2008

