Photos, Videos, Stills, and Captures -- Are they the same resolution?
December 4, 2022 8:36 AM
I have a Samsung S22+. I can take a picture. I can record a video and, while recording, take a still. I can also take the video and then export stills from it in Google Photos. Are these equivalent in resolution and quality, or are they different? How do they compare?
It occurred to me recently that, given the number of times I miss a shot, maybe I should just record video and then export stills, but when I tried that I'm not sure the pics were the same resolution. In particular, I thought faces looked kind of weird and pixelated. But maybe it was lighting or cropping or something? I'm about to do my son's xmas portraits. Can I record video, take stills while videoing, and then pull extra shots from the video, or will that be worse quality than just taking stills and risking missing the perfect moment?
Best answer: The default S22 photo resolution is roughly comparable to a 4K video frame, to save space (though you can set it to capture up to 9 times more pixels), but stills pulled from video will be much less optimised. The photo settings and algorithms do a lot of heavy lifting in modern phone cameras. Try burst mode instead.
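For a rough sense of scale, here is a quick back-of-the-envelope comparison in Python; the still dimensions are assumptions based on commonly reported S22-series figures rather than anything stated in this thread, so check your own camera settings:

    # Back-of-the-envelope pixel counts. The still dimensions below are assumed
    # (commonly reported S22-series figures), not taken from this thread.
    video_4k_frame = 3840 * 2160    # ~8.3 MP in a single 4K video frame
    photo_default  = 4000 * 3000    # ~12 MP, typical default still resolution (assumed)
    photo_full_res = 8160 * 6120    # ~50 MP, full-resolution still mode (assumed)

    for label, pixels in [("4K video frame", video_4k_frame),
                          ("default still", photo_default),
                          ("full-res still", photo_full_res)]:
        print(f"{label:15s} {pixels / 1e6:5.1f} MP")

However you slice it, a single 4K frame carries fewer pixels than even the default still.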
posted by I claim sanctuary at 8:53 AM on December 4, 2022
Best answer: Another factor: video is compressed very differently from stills. I'm not deeply familiar with what current-generation codecs are doing, but at a high level, lossy image compression, for video or stills, relies on visual tricks that exploit the limits of human perception to reduce the amount of data needed to reproduce the image. Still image compression (like JPEG) uses techniques optimized for a single image, since that's the only data the codec has. Video compression, though, often uses the relationships between frames to identify and discard data in ways a viewer won't notice during normal playback.

If you take a still from a compressed video, some of those compression artifacts can become much more visible, because you're staring at one frame for a long time instead of seeing it as part of a sequence where the defects in any single frame get lost in the larger motion. Still frames from compressed video won't always look terrible, but I'd expect a frame grab to almost always look worse than an equivalent still photo. You'll also get stacked artifacts if you take a video still and then save it as an image: you've got whatever lossy compression happened to the video frame, plus a second round of lossy compression when that frame is saved out as a JPEG. You could try it in a low-stakes context and see whether you're satisfied with the outcome.
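If you want to try that experiment, here's a minimal sketch using OpenCV (assuming you've installed it with pip install opencv-python; the file names are hypothetical placeholders). It grabs one frame from a clip and re-saves it as a JPEG, which is exactly where the second round of lossy compression gets stacked on top of the video codec's:

    import cv2  # pip install opencv-python

    # Hypothetical file name; substitute a real clip copied from your phone.
    cap = cv2.VideoCapture("clip.mp4")
    cap.set(cv2.CAP_PROP_POS_MSEC, 5000)   # seek roughly 5 seconds in
    ok, frame = cap.read()                 # this frame is already decoded from lossy video
    cap.release()

    if ok:
        h, w = frame.shape[:2]
        print(f"Extracted frame: {w}x{h} ({w * h / 1e6:.1f} MP)")
        # Saving as JPEG adds a second round of lossy compression on top of the codec's.
        cv2.imwrite("frame_grab.jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 95])
    else:
        print("Could not read a frame from the video")

Compare frame_grab.jpg side by side with a real photo taken at the same moment and judge for yourself.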
Agree that burst mode might be more likely to have positive results, if your phone's camera supports it.
posted by Alterscape at 9:46 AM on December 4, 2022
The still you export from Google Photos will not necessarily be the same quality as what you'd get out of the camera app because of differences in processing and the fact that Photos is working from the compressed video file, not raw sensor data.
It is true that on modern phones you don't really take photos any more. If the phone has a zero shutter lag (ZSL) feature, it is always recording frames to an in-memory buffer while the camera app is open and simply saves one when you hit the shutter button. The ZSL feature works with raw frames from the camera sensor, though, not the downscaled and already-processed video.
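To make the ring-buffer idea concrete, here's a toy Python sketch of the concept; it's not Samsung's actual pipeline, just an illustration of keeping the last few frames in memory so a shutter press can reach slightly back in time:

    from collections import deque

    class ZeroShutterLagBuffer:
        """Conceptual sketch only: hold the most recent raw frames so a shutter
        press can save a frame captured just before the button event."""

        def __init__(self, size=8):           # toy size; real buffers cover a fraction of a second
            self.frames = deque(maxlen=size)  # old frames fall off automatically

        def on_new_sensor_frame(self, raw_frame):
            self.frames.append(raw_frame)     # called continuously while the camera app is open

        def on_shutter_press(self):
            # Return the newest buffered frame (a real pipeline might pick the
            # sharpest, or merge several frames for noise reduction).
            return self.frames[-1] if self.frames else None

    buf = ZeroShutterLagBuffer()
    for i in range(20):
        buf.on_new_sensor_frame(f"raw_frame_{i}")
    print(buf.on_shutter_press())  # -> raw_frame_19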
That said, "good enough" is good enough. If you're missing stuff you wish you had been quick enough to capture, do what works for you. For the most part, something is better than nothing!
posted by wierdo at 10:23 AM on December 4, 2022
I looked up that phone, and it appears to have three cameras: a 12 MP ultrawide, a 50 MP wide angle, and a 10 MP telephoto. So a 4K video frame (about 8.3 MP) would be at least slightly lower in resolution than a still from any of those cameras.
Of course, resolution isn't directly related to image quality, but it's also unlikely that you'd be able to pull a still from your video that's higher quality than an actual still, even if they were the same resolution. The camera will probably use a much slower shutter speed when shooting video, which means there will be some motion blur in each frame. That doesn't matter in a video, where you're seeing one picture after another 30 times per second, but it definitely matters when you're looking at a single frame.
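To put rough numbers on the blur difference, here's a small calculation. It assumes the common 180-degree shutter convention for video (shutter time = 1 / (2 x frame rate)) and a plausible daylight still exposure; the phone's actual auto-exposure choices will differ:

    # Rough shutter-speed comparison. The 180-degree rule (shutter = 1 / (2 * fps))
    # is a common video convention, used here as an assumption, and 1/500 s is just
    # a plausible daylight still exposure, not a measured S22+ value.
    video_fps = 30
    video_shutter = 1 / (2 * video_fps)    # 1/60 s per video frame
    still_shutter = 1 / 500                # assumed daylight still exposure

    print(f"Video frame exposure:   ~1/{round(1 / video_shutter)} s")
    print(f"Typical still exposure: ~1/{round(1 / still_shutter)} s")
    print(f"A moving subject smears about {video_shutter / still_shutter:.0f}x farther "
          f"across a video frame than across the still")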
posted by jonathanhughes at 8:49 AM on December 4, 2022