Avoiding Blurriness at Start & End of Video (Even After Using setPreferredVideoStabilizationMode: AVCaptureVideoStabilizationModeAuto)

Swift: extracting image from video comes out blurry even though video looks sharp?

Your code works without errors or problems. I tried it with a video of my own, and the grabbed image was not blurry.

I would try to debug this by using a different timescale for CMTime.

With CMTimeMake, the first argument is the value and the second argument is the timescale.

Your timescale is 1, so the value is expressed in seconds: a value of 0 means the 1st second, a value of 1 means the 2nd second, and so on. More precisely, it designates the first frame at or after that point in the timeline.

With your current CMTime it grabs the first frame of the first second: that's the first frame of the video (even if the video is less than 1s).

With a timescale of 4, each value unit would represent a quarter of a second, and so on.
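To make the value/timescale relationship concrete, here are a few examples (written in the same pre-Swift-3 CMTimeMake style as the line further down):

import CoreMedia

CMTimeMake(0, 1)   // 0.00 s: the very start of the video
CMTimeMake(1, 1)   // 1.00 s into the video
CMTimeMake(1, 4)   // 0.25 s (timescale 4 = quarter-second units)
CMTimeMake(3, 24)  // 0.125 s: the 4th frame of a 24 fps video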

Try finding a CMTime that falls right on a steady frame (it depends on your video's frame rate; you'll have to experiment).

For example, if your video is at 24 fps, then to address individual frames the timescale should be 24 (that way each value unit represents exactly one frame):

let cgImage = try imgGenerator.copyCGImageAtTime(CMTimeMake(0, 24), actualTime: nil)
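Note that copyCGImageAtTime is the Swift 2 spelling; current Swift names it copyCGImage(at:actualTime:). Also, by default AVAssetImageGenerator is allowed to return a frame near the requested time rather than the exact one. Here is a minimal sketch in modern Swift that pins the generator to the exact frame (the frame(at:from:) helper is my own illustration):

import AVFoundation
import UIKit

// Grabs the exact frame at `seconds` from the video at `url`.
func frame(at seconds: Double, from url: URL) throws -> UIImage {
    let asset = AVURLAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)

    // Without zero tolerances, the generator may snap to a nearby
    // keyframe instead of the frame you asked for.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    generator.appliesPreferredTrackTransform = true

    // Timescale 600 is a common multiple of typical frame rates (24, 25, 30, 60).
    let time = CMTime(seconds: seconds, preferredTimescale: 600)
    let cgImage = try generator.copyCGImage(at: time, actualTime: nil)
    return UIImage(cgImage: cgImage)
}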

On the other hand, you mention that only the first and last frames of the video are blurry. As you rightly guessed, that is probably the actual cause of your issue, and it is caused by a lack of device stabilization at those moments.

A note: the video's encoding might also play a role. Some MPEG encoders create incomplete, interpolated frames that are "recreated" when the video plays, but these frames can appear blurry when grabbed with copyCGImageAtTime. The only solution I've found for this rare problem is to grab another frame just before or just after the blurry one.
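If you hit that case, one workaround is to step forward by a single frame duration and grab again. A rough sketch, assuming a 24 fps video and the generator configured as above:

import CoreMedia

let fps: Int32 = 24
let blurryTime = CMTimeMake(value: 0, timescale: fps)  // the frame that came out blurry
let oneFrame = CMTimeMake(value: 1, timescale: fps)    // duration of a single frame
let retryTime = CMTimeAdd(blurryTime, oneFrame)        // one frame later
// let cgImage = try generator.copyCGImage(at: retryTime, actualTime: nil)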

Swift: record video using the same method as taking a photo

Your AVCaptureSession currently only has one output, which is an AVCaptureStillImageOutput. If you want to capture video, you need to add either an AVCaptureVideoDataOutput or an AVCaptureMovieFileOutput as an output to your capture session.

See the AV Foundation Programming Guide, section "Still and Video Media Capture".
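A minimal sketch of what adding and driving an AVCaptureMovieFileOutput could look like (the Recorder class and the output file name are my own illustration, not from your code):

import AVFoundation

final class Recorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let movieOutput = AVCaptureMovieFileOutput()

    // Add the movie output alongside your existing still image output.
    func attach(to session: AVCaptureSession) {
        if session.canAddOutput(movieOutput) {
            session.addOutput(movieOutput)
        }
    }

    func startRecording() {
        let url = URL(fileURLWithPath: NSTemporaryDirectory())
            .appendingPathComponent("capture.mov")
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    func stopRecording() {
        movieOutput.stopRecording()
    }

    // Called once the movie file has been written to disk.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        print("Finished recording to \(outputFileURL)")
    }
}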

iOS: How to activate image stabilization on an AVCaptureSession?

Firstly, you've forgotten to add your AVCaptureStillImageOutput to the AVCaptureSession. You must do that before querying its capabilities!

captureSession.addOutput(stillImageOutput)

Secondly, neither Digital nor Optical Image Stabilisation is supported on the front camera.

Thirdly, on the back camera of supported devices (digital stabilisation appears to be available from the 5S up), AVCaptureStillImageOutput's automaticallyEnablesStillImageStabilizationWhenAvailable defaults to YES, so if you switch to the back camera you will already be using some form of image stabilisation.
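You can verify this once the output is attached to a session with a back-camera input. A sketch using the AVCaptureStillImageOutput API of this answer's era (since deprecated in favour of AVCapturePhotoOutput):

import AVFoundation

let captureSession = AVCaptureSession()
let stillImageOutput = AVCaptureStillImageOutput()

// The output must belong to the session (with a camera input added)
// before these queries return meaningful values.
if captureSession.canAddOutput(stillImageOutput) {
    captureSession.addOutput(stillImageOutput)
}

if stillImageOutput.isStillImageStabilizationSupported {
    // Defaults to true, so digital stabilization is applied automatically.
    stillImageOutput.automaticallyEnablesStillImageStabilizationWhenAvailable = true
    print("Stabilization active:", stillImageOutput.isStillImageStabilizationActive)
}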

NB: Optical Image Stabilisation is only available on the 6+ and 6S+ (although the linked technote has not been updated for the 6S models yet).

AVCaptureSession is not giving a good photo quality and good resolution

When using the preset AVCaptureSessionPresetPhoto with an AVCaptureStillImageOutput, I'm able to capture images on an iPhone 4S at a resolution of 3264x2448, which is the exact same resolution that the built-in camera application yields. The same is true for the iPhone 4, Retina iPad, etc., so if you use that preset with a still image output, the sample buffer you get back from -captureStillImageAsynchronouslyFromConnection:completionHandler: will be at the native camera resolution.
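If you aren't seeing the native resolution, check the session preset. A sketch of the relevant configuration (again in the deprecated AVCaptureStillImageOutput API this answer was written against):

import AVFoundation

let session = AVCaptureSession()
session.sessionPreset = .photo // full native still-image resolution

let stillImageOutput = AVCaptureStillImageOutput()
if session.canAddOutput(stillImageOutput) {
    session.addOutput(stillImageOutput)
}

// Assumes a camera input has already been added, so a video connection exists.
if let connection = stillImageOutput.connection(with: .video) {
    stillImageOutput.captureStillImageAsynchronously(from: connection) { buffer, _ in
        guard let buffer = buffer,
              let data = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        else { return }
        print("Captured \(data.count) bytes at native resolution")
    }
}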

Regarding photo quality, remember that the built-in camera application can capture high-dynamic-range (HDR) photos by quickly acquiring images at different exposure levels. We do not have access to this via the standard AV Foundation APIs, so all we get is a single image at a defined exposure level.

If you turn HDR off, the image quality looks identical to me. Here is a zoomed-in portion of a photo captured using an AVCaptureStillImageOutput:

[Image: zoomed-in crop captured with AVCaptureStillImageOutput]

and here is one from the built-in photo application:

[Image: zoomed-in crop from the built-in camera application]

Ignoring the slight differences in lighting due to a little shift in camera direction, the resolution and fidelity of images captured both ways appear to be the same.

I captured the first image using the SimplePhotoFilter example application from my open source GPUImage framework, replacing the default GPUImageSketchFilter with a GPUImageGammaFilter that didn't adjust the gamma at all and simply acted as a passthrough.


