Grab Frames from Video Using Swift

Thanks to @eric-d who found this post:
iOS Take Multiple Screen Shots

I managed to find out that adding:

    assetImgGenerate.requestedTimeToleranceAfter = kCMTimeZero
    assetImgGenerate.requestedTimeToleranceBefore = kCMTimeZero

...to my function does the trick.

My updated function looks like this:

    func generateThumnail(url: NSURL, fromTime: Float64) -> UIImage {
        let asset: AVAsset = AVAsset.assetWithURL(url) as! AVAsset
        let assetImgGenerate: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset)
        assetImgGenerate.appliesPreferredTrackTransform = true
        assetImgGenerate.requestedTimeToleranceAfter = kCMTimeZero
        assetImgGenerate.requestedTimeToleranceBefore = kCMTimeZero
        var error: NSError? = nil
        let time: CMTime = CMTimeMakeWithSeconds(fromTime, 600)
        let img: CGImageRef = assetImgGenerate.copyCGImageAtTime(time, actualTime: nil, error: &error)
        let frameImg: UIImage = UIImage(CGImage: img)!
        return frameImg
    }

    let grabTime = 1.22
    img = generateThumnail(urlVideo, fromTime: Float64(grabTime))

Swift - get all frames from video

When generating more than one frame, Apple recommends using the method:
generateCGImagesAsynchronously(forTimes:completionHandler:)
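A minimal sketch of that batch API (the function name and the one-frame-per-second sampling are my own choices for illustration; the completion counting is a simplification):

```swift
import AVFoundation
import UIKit

// Request one frame per whole second of the video, all in a single batch.
func generateAllFrames(from videoURL: URL, completion: @escaping ([UIImage]) -> Void) {
    let asset = AVAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    let seconds = Int(CMTimeGetSeconds(asset.duration))
    let times = (0..<seconds).map {
        NSValue(time: CMTimeMakeWithSeconds(Float64($0), preferredTimescale: 600))
    }

    var frames = [UIImage]()
    var handled = 0
    generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, _ in
        // The handler is called once per requested time, on a background queue.
        if result == .succeeded, let cgImage = cgImage {
            frames.append(UIImage(cgImage: cgImage))
        }
        handled += 1
        if handled == times.count {
            completion(frames)
        }
    }
}
```

The batch call lets the generator decode the requested times in order, which is typically faster and lighter on memory than repeated synchronous copyCGImage calls in a loop.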

Still, if you prefer to follow your current approach, there are a couple of improvements you could make to reduce memory usage:

  • You are instantiating AVAsset and AVAssetImageGenerator inside the loop; you could instantiate them just once and pass them to the method generateFrames.
  • Remove the line

        UIImageWriteToSavedPhotosAlbum(frameImg, nil, nil, nil) // I saved here to check

    because saving every frame to the photos album takes extra memory.

The final result could look like this:

    var videoFrames = [UIImage]()
    let asset: AVAsset = AVAsset(url: self.mutableVideoURL as URL)
    let assetImgGenerate: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset)
    assetImgGenerate.appliesPreferredTrackTransform = true
    let duration: Float64 = CMTimeGetSeconds(asset.duration)
    let durationInt: Int = Int(duration)

    for index: Int in 0 ..< durationInt {
        generateFrames(
            assetImgGenerate: assetImgGenerate,
            fromTime: Float64(index))
    }

    func generateFrames(
        assetImgGenerate: AVAssetImageGenerator,
        fromTime: Float64)
    {
        let time: CMTime = CMTimeMakeWithSeconds(fromTime, 600)
        let cgImage: CGImage?

        do {
            cgImage = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
        } catch {
            cgImage = nil
        }

        guard let img: CGImage = cgImage else {
            return // `continue` is invalid here: this is a function body, not a loop
        }

        let frameImg: UIImage = UIImage(cgImage: img)
        videoFrames.append(frameImg)
    }

Update for Swift 4.2

    var videoUrl: URL // use your own url
    var frames: [UIImage] = []
    private var generator: AVAssetImageGenerator!

    func getAllFrames() {
        let asset: AVAsset = AVAsset(url: self.videoUrl)
        let duration: Float64 = CMTimeGetSeconds(asset.duration)
        self.generator = AVAssetImageGenerator(asset: asset)
        self.generator.appliesPreferredTrackTransform = true
        self.frames = []
        for index: Int in 0 ..< Int(duration) {
            self.getFrame(fromTime: Float64(index))
        }
        self.generator = nil
    }

    private func getFrame(fromTime: Float64) {
        let time: CMTime = CMTimeMakeWithSeconds(fromTime, preferredTimescale: 600)
        let image: CGImage
        do {
            image = try self.generator.copyCGImage(at: time, actualTime: nil)
        } catch {
            return
        }
        self.frames.append(UIImage(cgImage: image))
    }
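Extraction can take a while for longer videos, so callers may want to keep it off the main thread. A usage sketch (assuming the members above live in a view controller):

```swift
DispatchQueue.global(qos: .userInitiated).async {
    self.getAllFrames()
    DispatchQueue.main.async {
        // self.frames is now populated; safe to touch the UI here.
        print("Extracted \(self.frames.count) frames")
    }
}
```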

Extracting frame from video in Swift 3

I understand your problem completely. A few days back I was facing it too, which is why I developed a complete solution: showing a live camera preview, arranging it properly on the view, getting camera frames continuously, and converting the frames into UIImages efficiently, without memory leaks. Please adapt the solution to your needs. It is written for Swift 4.2 and was developed with Xcode 10.0.

The GitHub repo for this is: https://github.com/anand2nigam/CameraFrameExtractor

Please use a real iPhone or iPad to test the application, because camera capture will not work in the simulator. Let me know how the app works for you, and if you need any help, do contact me.
I hope it solves your problem. Happy learning.

Create frames from video in Swift (iOS)

I tried the solution from Amin Benarieb, and it seems to work:

    static func toImages(fromVideoUrl url: URL) -> [UIImage]? {
        let asset = AVAsset(url: url)
        guard let reader = try? AVAssetReader(asset: asset) else { return nil }
        let videoTrack = asset.tracks(withMediaType: .video).first!
        let outputSettings = [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)]
        let trackReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: outputSettings)
        reader.add(trackReaderOutput)
        reader.startReading()
        var images = [UIImage]()
        while reader.status == .reading {
            autoreleasepool {
                if let sampleBuffer = trackReaderOutput.copyNextSampleBuffer(),
                   let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                    let ciImage = CIImage(cvImageBuffer: imageBuffer)
                    images.append(UIImage(ciImage: ciImage))
                }
            }
        }
        return images
    }
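One caveat with the code above: a UIImage built with UIImage(ciImage:) has no bitmap backing, so calls such as pngData() or jpegData(compressionQuality:) return nil on it. If the frames need to be saved or processed outside of an image view, rendering through a CIContext first avoids that (a sketch; creating the context once and reusing it is assumed):

```swift
import UIKit
import CoreImage

let ciContext = CIContext() // expensive to create, so make one and reuse it

func bitmapBackedImage(from ciImage: CIImage) -> UIImage? {
    // Render the CIImage into a real CGImage, then wrap it in a UIImage.
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```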

How to get frames from a local video file in Swift?

I believe AVAssetReader should work. What did you try? Have you seen this sample code from Apple? https://developer.apple.com/library/content/samplecode/ReaderWriter/Introduction/Intro.html

How to read a video file from disk in real time in iOS

In order to play back and process a video in real time, you can use the AVPlayer class. The simplest way to live-process video frames is through a custom video composition on the AVPlayerItem.

You might want to check out this sample project from Apple where they highlight HDR parts in a video using Core Image filters. It shows the whole setup required for real-time processing and playback.
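The core of that setup can be sketched as follows (the sepia filter is just a placeholder; swap in whatever Core Image processing you need):

```swift
import AVFoundation
import CoreImage

// Attach a composition that hands every video frame to a Core Image
// filter before AVPlayer displays it.
func addLiveFilter(to playerItem: AVPlayerItem) {
    let filter = CIFilter(name: "CISepiaTone")!
    playerItem.videoComposition = AVMutableVideoComposition(
        asset: playerItem.asset,
        applyingCIFiltersWithHandler: { request in
            filter.setValue(request.sourceImage, forKey: kCIInputImageKey)
            // Fall back to the unfiltered frame if the filter fails.
            request.finish(with: filter.outputImage ?? request.sourceImage, context: nil)
        })
}
```

Then playing the item with AVPlayer(playerItem:) shows the filtered video as usual.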

How to get frame from video on iOS

Thanks to @shallowThought, I found an answer in this question: Grab frames from video using Swift.

You just need to add these two lines:

    assetImgGenerate.requestedTimeToleranceAfter = kCMTimeZero
    assetImgGenerate.requestedTimeToleranceBefore = kCMTimeZero

