Choppy, stuttering video on new 4K computer monitor
July 6, 2020 7:14 AM   Subscribe

I recently bought a 32-inch 4K monitor (model ViewSonic VX3276-4K-MHD). I tried playing a bunch of 4K test videos that I found here, and they all play with noticeable choppiness. I tried Google Chrome, Edge, and Firefox. The same thing happens when I play a 4K video file using VLC Player. How should I fix this?

My Windows 10 computer is about seven years old, but the graphics card is newer (about three years old, I think). Here are the specs for the hardware:
Computer:
Operating System: Windows 10 64-bit
Hard drive: SSD
Processor: Intel(R) Core(TM) i5-2320 CPU @ 3.00GHz (4 CPUs), ~3.0GHz
Memory: 32GB RAM
Motherboard: ASUSTek Sabertooth Z77 LGA 1155
DirectX Version: DirectX 12

Graphics card:
Card name: AMD Radeon (TM) R7 360 Series
Manufacturer: Advanced Micro Devices, Inc.
Chip type: AMD Radeon Graphics Processor (0x665F)
DAC type: Internal DAC(400MHz)
Display Memory: 18378 MB
Dedicated Memory: 2024 MB
Shared Memory: 16354 MB
Current Mode: 3840 x 2160 (32 bit) (29Hz)
HDR Support: Supported
Display Topology: Internal
Monitor Name: Generic PnP Monitor
Monitor Model: VX3276-UHD
Native Mode: 3840 x 2160(p) (60.000Hz)
Output Type: HDMI
Monitor Capabilities: HDR Supported (BT2020RGB BT2020YCC Eotf2084Supported )
I found out that I wasn't using the latest graphics drivers, so I downloaded an installer from the AMD website, and now I'm current.

My Internet connection is a fast fiber optic line with download speeds over 400 Mbps and upload around 180 Mbps (tested using testmy.net).

I realize that I don't have a high-end gaming rig, but I guess I'm surprised that my existing hardware can't seem to handle 4K videos.

What should I do about this? Would a better graphics card solve the problem? If so, which card should I get (assuming that cost is somewhat of an issue)? I'm not a gamer, but I'd like to be able to watch 4K videos.
posted by akk2014 to Computers & Internet (22 answers total) 1 user marked this as a favorite
 
It's the video card - it may be able to display 4K images, but video at that resolution requires a more powerful card.
posted by pipeski at 7:30 AM on July 6, 2020 [2 favorites]


I'm not so sure about it being the graphics card. Yeah, it's not going to render a game at 4K, but simply decoding and displaying a 4K video should work fine, assuming the drivers are correctly using hardware decoding. At least I think so; I don't have experience with that exact GPU.

I would blame drivers (and playback falling back to CPU decoding), but you say you've installed them. I think all your playback attempts have been via YouTube, either in the browser or using VLC's YouTube mode. Have you tried downloading a 4K video file and playing it locally?

(Also to eliminate the obvious; are you sure it's not a bandwidth / buffering problem?)
posted by Nelson at 7:44 AM on July 6, 2020


4K and above can be pretty taxing on cards - the videos are compressed really tightly to be streamable, and they use a different codec than most other videos (H.265). Newer graphics cards have specific hardware for decoding it. I ended up replacing my RX 480 (slightly newer than your card) to stop stuttering on VR videos, which have a similar resolution problem.

You can make sure you have the right codecs by installing a codec pack like K-Lite.
posted by Wulfhere at 8:10 AM on July 6, 2020


Current Mode: 3840 x 2160 (32 bit) (29Hz)

You are currently running at a lower frame rate than the 60Hz that your monitor and video card are capable of. This is probably because the R7 360 does not support HDMI 2.0 (which is required if you want to do 4K at 60Hz over HDMI).

You can probably get a full 60 Hz and much smoother video if you can use DisplayPort instead of HDMI to connect your monitor to the graphics card. I found various forum posts that confirm this.
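
If you want to verify what mode Windows is actually driving after swapping cables (the same numbers dxdiag reports), here's a minimal Python sketch, standard library only and Windows-specific, that asks the Win32 EnumDisplaySettings API for the current mode:

    import ctypes

    # Partial DEVMODEW layout -- only the fields up to the ones we need.
    # dmSize tells Windows how much of the struct we declared.
    class DEVMODE(ctypes.Structure):
        _fields_ = [
            ("dmDeviceName", ctypes.c_wchar * 32),
            ("dmSpecVersion", ctypes.c_ushort),
            ("dmDriverVersion", ctypes.c_ushort),
            ("dmSize", ctypes.c_ushort),
            ("dmDriverExtra", ctypes.c_ushort),
            ("dmFields", ctypes.c_ulong),
            ("dmUnion", ctypes.c_byte * 16),   # position/orientation union
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", ctypes.c_wchar * 32),
            ("dmLogPixels", ctypes.c_ushort),
            ("dmBitsPerPel", ctypes.c_ulong),
            ("dmPelsWidth", ctypes.c_ulong),
            ("dmPelsHeight", ctypes.c_ulong),
            ("dmDisplayFlags", ctypes.c_ulong),
            ("dmDisplayFrequency", ctypes.c_ulong),
        ]

    ENUM_CURRENT_SETTINGS = -1  # ask for the mode in use right now

    dm = DEVMODE()
    dm.dmSize = ctypes.sizeof(DEVMODE)
    ctypes.windll.user32.EnumDisplaySettingsW(
        None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))
    print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz")

If that prints 29 or 30 Hz at 3840x2160, the cable/link is the bottleneck, not the decoder.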
posted by mbrubeck at 8:19 AM on July 6, 2020 [2 favorites]


Oh, I missed that; that 30Hz is definitely bad and you want to fix it first.

(I believe that video card is capable of H.265 hardware decoding.)
posted by Nelson at 8:40 AM on July 6, 2020


Also, check that your monitor isn't set up to do "motion smoothing" or similar. That will absolutely make video look awful, and it adds so much latency as to make gaming impossible.

Cue rant about configuration defaults on consumer displays being a collection of all the worst possible choices.
posted by sourcequench at 9:28 AM on July 6, 2020


Response by poster: Switching to a DisplayPort cable did increase the frame rate to 60 Hz, so thanks go out to mbrubeck for that suggestion. The YouTube test videos still have very slight pauses in them, but maybe that's not caused by the graphics card. I downloaded a 4K .mp4 movie trailer and played it in VLC Player. It seems to play fine, but it's a bit hard to tell for sure, because there are so many rapid edits in the trailer. I need to find a better video file to use for testing.
posted by akk2014 at 9:30 AM on July 6, 2020 [1 favorite]


Perhaps this might help: I suddenly experienced incredibly choppy graphics recently, and it was driving me crazy. I eventually discovered that the latest Windows update turned on "Game mode", which is supposed to optimize the PC for gaming, including frame rates. Definitely not the case for me! I turned it off, and everything went back to being nice and fast and not choppy. You can find it in Settings as a subsection of Gaming.
posted by odin53 at 10:01 AM on July 6, 2020


With a different monitor, I mistakenly plugged in both the monitor display cable (the one that screws on) and an HDMI cable. Strange things happened. I see on the ViewSonic site that your monitor has both inputs. Are both plugged into your computer?

Just a wild chance since it happened to me.
posted by tmdonahue at 11:34 AM on July 6, 2020


That video card is an AMD 'Sea Islands' variant codenamed 'Tobago'. You may have purchased it recently, but the chip that powers it is 2013-2015 vintage and would struggle with 4K. Newer drivers may have optimizations, but the bottleneck is the hardware.
posted by Jessica Savitch's Coke Spoon at 3:36 PM on July 6, 2020


Response by poster: @Jessica: Thanks. You're probably right. Do you have some recommendations on what I should be considering for a replacement card? I'm not a gamer.
posted by akk2014 at 4:48 PM on July 6, 2020


If you aren't wedded to ATI, Nvidia's GTX 1650 cards will decode H.264 and H.265 as well as anything else. You need not spend the extra money on the Super variant if you aren't doing anything 3D-heavy. If you are, something like the 1660 Super would be a more appropriate minimum, though you still won't get 4k60 out of it.

What you want is the lowest-grade card using the most modern chipset you can get so that it will support hardware decoding of modern codecs. The integrated Intel GPU you have only does H.264, though it will do it at 4k60, IIRC.
posted by wierdo at 10:17 PM on July 6, 2020


Piggybacking on wierdo's Nvidia suggestion, the comparable AMD/ATI card would be a Radeon RX 560/570 or such if you'd rather stay there. I can say from experience that it will indeed push 4K video at 60Hz just fine. YouTube will always be a bit stuttery, I think, due to the overhead of doing that sort of thing in a browser and the age of the CPU in there, though a fancier video card would help some.
posted by mrg at 9:41 AM on July 7, 2020


I just realized that I was a bit unclear in my answer. A GTX 1650 or any other card with a hardware decoder will happily decode and display 4k60 video using any of the supported codecs, even with your relatively old CPU. I'm pretty sure current versions of every browser support offloading video decode duties.

What it will not do is render modern 3D games at 4k60. The same goes for the ATI equivalents. (The 1660 Super is just barely enough to do 4k30 with middling quality settings, FWIW)
posted by wierdo at 11:41 AM on July 7, 2020


Response by poster: So, just to follow up: I bought a new card (the XFX RX 570 4GB GDDR5 RS XXX Edition PCI-Express 3.0 Graphics Card)... and the result was exactly the same. I see no difference between the new card and the old one. The 4K YouTube videos still play with a small amount of choppiness -- nothing horrible, but not what I was hoping to see.
posted by akk2014 at 5:16 PM on July 11, 2020


Try using youtube-dl or a similar tool to download the video file, then play it in VLC. If that works smoothly, you are having a browser problem.

Also, YouTube 4K videos aren't always H.265. I believe all new 4K videos are, but some of the older ones use a different codec (VC-1 maybe? I can't remember right now) that isn't fully hardware-accelerated on any PC hardware.

The latter issue would be distinguishable from a general browser problem by trying 4K Netflix videos, which AFAIK are all H.265.
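
As a rough sketch of that download-and-check workflow in Python (assuming youtube-dl and ffprobe are on your PATH; the URL is just a placeholder):

    import subprocess

    url = "https://www.youtube.com/watch?v=XXXXXXXXXXX"  # placeholder

    # Grab the 2160p video stream plus audio, merged into test4k.mkv.
    subprocess.run(["youtube-dl",
                    "-f", "bestvideo[height=2160]+bestaudio/best",
                    "--merge-output-format", "mkv",
                    "-o", "test4k.%(ext)s",
                    url], check=True)

    # Ask ffprobe which video codec the file actually uses (vp9, h264,
    # hevc, ...) -- that's what determines hardware decode support.
    codec = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1",
         "test4k.mkv"],
        capture_output=True, text=True, check=True).stdout.strip()
    print("video codec:", codec)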
posted by wierdo at 5:35 PM on July 11, 2020


I forgot to mention that the Windows 10 task manager has a page that will show the resource utilization of the various video card components and, at least on my system, breaks out the video decode and encode engines in separate graphs. It might be worth checking that to make sure the hardware acceleration is actually working on your system.
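
For a command-line version of the same test, you can force ffmpeg down the Windows hardware-decode path and watch those Task Manager graphs while it runs (a sketch, assuming an ffmpeg build with d3d11va support and the test file from the earlier step):

    import subprocess

    # Decode through the Direct3D 11 hardware path and discard the frames.
    # While this runs, Task Manager's "Video Decode" graph should spike
    # and the CPU should stay mostly idle; if it's the other way around,
    # hardware decode isn't engaging.
    subprocess.run(["ffmpeg", "-hwaccel", "d3d11va",
                    "-i", "test4k.mkv",
                    "-f", "null", "-"], check=True)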
posted by wierdo at 5:38 PM on July 11, 2020


Response by poster: @wierdo: Here's a screenshot of my Task Manager window while I was playing one of the 4K YouTube videos. The CPU load hovered around 90 to 100%. Is that normal? It seems like the CPU is doing all the work when the graphics card ought to be doing it.
posted by akk2014 at 9:52 PM on July 11, 2020


It looks like the hardware decoder isn't being used at all for whatever reason. The "stats for nerds" button on YT should show you what codec is in use. I bet it is a VP9 video.

Having just looked it up on Wikipedia, it looks like the RX570 uses the Polaris core, which does not support VP9 decode since it predates Video Core Next. You'd need a Navi card like the RX5500 to get VP9 decode support on AMD (or Renoir if you want 8K decode, though that seems unnecessary).

For completeness' sake and distant future Googlers, the Nvidia equivalent is called PureVideo. It turns out VP9 decode began to be supported with the GTX 10xx series.
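
Speaking of future Googlers: the codec string in that "stats for nerds" overlay identifies the codec family by its prefix. Here's a tiny Python helper to interpret it (the decode-support notes are just this thread's findings circa 2020):

    # Interpret the codec string from YouTube's "stats for nerds" overlay,
    # e.g. "vp09.00.51.08" or "avc1.640033".
    CODEC_FAMILIES = {
        "vp09": "VP9 -- needs VCN (Navi) on AMD or GTX 10xx+ on Nvidia",
        "vp8":  "VP8",
        "avc1": "H.264/AVC -- hardware-decoded by nearly any modern GPU",
        "av01": "AV1 -- barely any hardware decode support as of 2020",
    }

    def identify(codec_string: str) -> str:
        prefix = codec_string.split(".")[0]
        return CODEC_FAMILIES.get(prefix, "unknown codec: " + prefix)

    print(identify("vp09.00.51.08"))  # VP9 ...
    print(identify("avc1.640033"))    # H.264/AVC ...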
posted by wierdo at 10:56 PM on July 11, 2020


FWIW, the cheapest RX5500 I'm seeing on Amazon is $170. Even the cheapest GTX1650 non-Super is $150 right now. GT 1030s are available starting at $83, but their 3D performance would be complete crap compared to an RX570, being a several-year-old budget card.
posted by wierdo at 11:12 PM on July 11, 2020


Response by poster: You're right about the VP9 codec. Here's the "Stats for Nerds" screenshot. Overall, I'm less worried about YouTube than about Netflix and Hulu. I was just using YouTube as a quick test of my new 4K monitor. If Netflix and Hulu work OK with that card, then I'll probably keep it. Maybe.

There is one other complexity here. I said I'm not a gamer, but there is one new game coming out that I wanted to play. It's called Peace Island, and it's an open-world exploration game where you're a cat exploring an island (I'm a huge cat lover). When I run the alpha of that game with the video quality set to "medium", it looks like I'm totally maxing out the graphics card. You can see the Task Manager screenshot here.

I can't really buy a whole new computer right now, so I'm stuck with my seven-year-old motherboard and Intel i5 processor. I can return the existing RX570 card to Amazon and buy something better, but I'd prefer not to spend more than about $200. Is there another card I should consider?

Thanks for all your help.
posted by akk2014 at 6:17 AM on July 12, 2020


Best answer: My desktop PC is an Ivy Bridge i5, only a couple percent faster, at best, than what you've got, and I've had no trouble playing modern games like Red Dead Redemption 2, Ace Combat 7, and a few others at 30+ FPS at 1080p with a GTX 1660 Super that cost $220 early this year.

It takes a complete monster of a video card to do native resolution 4k gaming in most games. That's why your video card is maxing out before your old CPU.

I'd try setting the game to 1440p or 1080p before worrying about a different card if you're OK with not having VP9 support. The difference in visual quality isn't actually that huge in most cases. In games that have resolution scaling options, I like to run the game at 4k with 0.5x scaling so the UI and HUD, where the difference in sharpness is most visible, will render at 4k but the actual 3D scene will render at 1080p so my framerates don't suck too bad. (In most games, I'm fine with even 30 fps. I prefer higher quality settings/more fancy effects over the extra frames or higher resolution)

Also, I'd expect Peace Island to get better in terms of performance as it matures. Rarely are alpha or even beta games well optimized, so they run slow compared to how they eventually will.

Digital Foundry did a great video sometime between December and February comparing the RX570, RX5500/XT, and the GTX 1650/1660/Super cards that should give you a good idea of the relative performance of the options.

If the extra $20-30 isn't an issue for you, you might consider exchanging the RX570 for an RX5500 or 5500XT. The 5500 or 5500XT would be more future-proof in the sense that it wouldn't be a huge bottleneck if you were to upgrade your motherboard and CPU to a recent 6- or 8-core Ryzen in the next year or two, and it would not be such overkill in your current system that you'd be wasting money.
posted by wierdo at 4:14 PM on July 12, 2020 [1 favorite]


This thread is closed to new comments.