Save AVCaptureVideoDataOutput to movie file using AVAssetWriter in Swift

I was able to figure out how to use AVAssetWriter. In case anyone else needs help, the code I used is as follows:

func setUpWriter() {
    do {
        outputFileLocation = videoFileLocation()
        videoWriter = try AVAssetWriter(outputURL: outputFileLocation!, fileType: AVFileType.mov)

        // Add the video input.
        videoWriterInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 720,
            AVVideoHeightKey: 1280,
            AVVideoCompressionPropertiesKey: [
                AVVideoAverageBitRateKey: 2300000,
            ],
        ])
        videoWriterInput.expectsMediaDataInRealTime = true

        if videoWriter.canAdd(videoWriterInput) {
            videoWriter.add(videoWriterInput)
            print("video input added")
        } else {
            print("no input added")
        }

        // Add the audio input.
        audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
        audioWriterInput.expectsMediaDataInRealTime = true

        if videoWriter.canAdd(audioWriterInput!) {
            videoWriter.add(audioWriterInput!)
            print("audio input added")
        }

        videoWriter.startWriting()
    } catch {
        debugPrint(error.localizedDescription)
    }
}

func canWrite() -> Bool {
    return isRecording && videoWriter != nil && videoWriter?.status == .writing
}


// Video file location method.
func videoFileLocation() -> URL {
    let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0] as NSString
    let videoOutputUrl = URL(fileURLWithPath: documentsPath.appendingPathComponent("videoFile")).appendingPathExtension("mov")
    do {
        if FileManager.default.fileExists(atPath: videoOutputUrl.path) {
            try FileManager.default.removeItem(at: videoOutputUrl)
            print("file removed")
        }
    } catch {
        print(error)
    }

    return videoOutputUrl
}

// MARK: AVCaptureVideoDataOutputSampleBufferDelegate
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

    let writable = canWrite()

    if writable, sessionAtSourceTime == nil {
        // Start the session at the first buffer's timestamp.
        sessionAtSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        videoWriter.startSession(atSourceTime: sessionAtSourceTime!)
        //print("Writing")
    }

    if output == videoDataOutput {
        connection.videoOrientation = .portrait

        if connection.isVideoMirroringSupported {
            connection.isVideoMirrored = true
        }
    }

    if writable,
       output == videoDataOutput,
       videoWriterInput.isReadyForMoreMediaData {
        // Write the video buffer.
        videoWriterInput.append(sampleBuffer)
        //print("video buffering")
    } else if writable,
              output == audioDataOutput,
              audioWriterInput.isReadyForMoreMediaData {
        // Write the audio buffer.
        audioWriterInput?.append(sampleBuffer)
        //print("audio buffering")
    }
}

// MARK: Start recording
func start() {
    guard !isRecording else { return }
    isRecording = true
    sessionAtSourceTime = nil
    setUpWriter()
    print(isRecording)
    print(videoWriter)
    switch videoWriter.status {
    case .writing:
        print("status writing")
    case .failed:
        print("status failed")
    case .cancelled:
        print("status cancelled")
    case .unknown:
        print("status unknown")
    default:
        print("status completed")
    }
}

// MARK: Stop recording
func stop() {
    guard isRecording else { return }
    isRecording = false
    videoWriterInput.markAsFinished()
    print("marked as finished")
    videoWriter.finishWriting { [weak self] in
        self?.sessionAtSourceTime = nil
    }
    //print("finished writing \(self.outputFileLocation)")
    captureSession.stopRunning()
    performSegue(withIdentifier: "videoPreview", sender: nil)
}

I now have another problem: this solution doesn't work when I use AVCaptureMetadataOutput, AVCaptureVideoDataOutput and AVCaptureAudioDataOutput together. The app crashes when I add AVCaptureAudioDataOutput.
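Not part of the original answer, but for context: a minimal sketch of how the video and audio data outputs might be attached to the session, delivering both delegate callbacks on one serial queue. The names `videoDataOutput`, `audioDataOutput`, `recordingQueue`, and `captureSession` are assumptions, and this is not a confirmed fix for the crash above.

```swift
// Sketch only (hypothetical names): attach both data outputs and deliver
// their sample buffers on a single serial queue.
let recordingQueue = DispatchQueue(label: "recording.queue")

let videoDataOutput = AVCaptureVideoDataOutput()
videoDataOutput.setSampleBufferDelegate(self, queue: recordingQueue)
if captureSession.canAddOutput(videoDataOutput) {
    captureSession.addOutput(videoDataOutput)
}

let audioDataOutput = AVCaptureAudioDataOutput()
audioDataOutput.setSampleBufferDelegate(self, queue: recordingQueue)
if captureSession.canAddOutput(audioDataOutput) {
    captureSession.addOutput(audioDataOutput)
}
```

Using `canAddOutput` before each `addOutput` at least surfaces which output the session refuses, which can help narrow down where the crash comes from.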

Recording Video Using AVCaptureVideoDataOutput in Swift 3

After spending quite some time on this, I found out how to record video while also receiving the pixel data to do some basic analysis on the live feed.

First I set up the AVAssetWriter, calling this function before issuing the actual record command:

var sampleBufferGlobal : CMSampleBuffer?
let writerFileName = "tempVideoAsset.mov"
var presentationTime : CMTime!
var outputSettings = [String: Any]()
var videoWriterInput: AVAssetWriterInput!
var assetWriter: AVAssetWriter!


func setupAssetWriter() {

    eraseFile(fileToErase: writerFileName)

    presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBufferGlobal!)

    outputSettings = [AVVideoCodecKey: AVVideoCodecH264,
                      AVVideoWidthKey: NSNumber(value: Float(videoWidth)),
                      AVVideoHeightKey: NSNumber(value: Float(videoHeight))] as [String: Any]

    videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)

    assetWriter = try? AVAssetWriter(outputURL: createFileURL(writerFileName), fileType: AVFileTypeQuickTimeMovie)

    assetWriter.add(videoWriterInput)
}

I wrote another function to do the recording, and I call it from the captureOutput function right after copying the sample buffer to sampleBufferGlobal (sampleBufferGlobal = sampleBuffer) in that same function.

func writeVideoFromData() {

    if assetWriter?.status == AVAssetWriterStatus.unknown {
        assetWriter?.startWriting()
        assetWriter?.startSession(atSourceTime: presentationTime)
    }

    if assetWriter?.status == AVAssetWriterStatus.writing {

        if videoWriterInput.isReadyForMoreMediaData {

            if videoWriterInput.append(sampleBufferGlobal!) == false {
                print("we have a problem writing video")
            }
        }
    }
}
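The captureOutput hookup described above might look roughly like this (a sketch using the Swift 3 delegate signature; the `isRecording` flag is an assumption, not part of the original answer):

```swift
// Sketch of the delegate described above: copy the buffer, then record.
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    sampleBufferGlobal = sampleBuffer

    // ... run the pixel analysis on sampleBuffer here ...

    if isRecording {   // hypothetical flag set when the user taps record
        writeVideoFromData()
    }
}
```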

Then, to stop recording, I used the following function:

func stopAssetWriter() {

    videoWriterInput.markAsFinished()

    assetWriter?.finishWriting(completionHandler: {

        if self.assetWriter?.status == AVAssetWriterStatus.failed {
            print("creating movie file failed")
        } else {
            print("creating movie file was a success")

            DispatchQueue.main.async {
                // Update the UI here, e.g. show a preview of the recorded file.
            }
        }
    })
}

Capturing video and saving it via AVAssetWriter

I am not familiar with UISaveVideoAtPathToSavedPhotosAlbum. But browsing Stack Overflow and GitHub, many people use PHPhotoLibrary, and so do I. Regardless of the URL, the code below adds the video to the photo library.

https://developer.apple.com/documentation/photokit/phassetchangerequest/1624057-creationrequestforassetfromvideo

1) Info.plist
Add a new key-value pair with the + button. Select "Privacy - Photo Library Usage Description" as the key, and set the value to something like "save video in photo library".
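Equivalently, if you edit the Info.plist source directly, the raw key behind that entry is NSPhotoLibraryUsageDescription (the description string here is just an example):

```xml
<key>NSPhotoLibraryUsageDescription</key>
<string>save video in photo library</string>
```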

2) code

fileWriter.finishWriting(completionHandler: {
    let status = PHPhotoLibrary.authorizationStatus()

    if status == .notDetermined || status == .denied {
        // No access granted yet.
        PHPhotoLibrary.requestAuthorization({ auth in
            if auth == .authorized {
                saveInPhotoLibrary(url)
            } else {
                print("user denied access to photo library")
            }
        })
    } else {
        // Access already granted by the user.
        saveInPhotoLibrary(url)
    }
})

private func saveInPhotoLibrary(_ url: URL) {
    PHPhotoLibrary.shared().performChanges({
        // Add the video to the photo library here.
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
    }) { completed, error in
        if completed {
            print("save complete! path: " + url.absoluteString)
        } else {
            print("save failed")
        }
    }
}

Hope this helps.
GW

AVFoundation: How to write video to file in real time instead of using exportAsync?

It's not clear where your video is coming from, but exportAsync makes it sound like you're using AVAssetExportSession with an existing file or composition.

  1. capture your video (and audio?) frames

    a. if from an existing composition or file, with AVAssetReader

    b. if from the camera, with AVCaptureSession etc
  2. progressively write the frames to file using AVAssetWriter & AVAssetWriterInput

If you're expecting the writing to file to be interrupted for some reason, consider setting the AVAssetWriter's movieFragmentInterval property to something small.
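The camera-capture path of the steps above can be sketched roughly like this (a minimal outline, not a complete implementation; `videoOutputURL` and the output settings are assumptions):

```swift
// Rough outline of step 2: progressively write frames with AVAssetWriter.
let writer = try AVAssetWriter(outputURL: videoOutputURL, fileType: .mov)
let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1280,
    AVVideoHeightKey: 720,
])
input.expectsMediaDataInRealTime = true   // needed for live capture
writer.add(input)

// If writing may be interrupted, keep fragments small so a partial
// file remains playable.
writer.movieFragmentInterval = CMTimeMakeWithSeconds(1, preferredTimescale: 600)

writer.startWriting()

// Then, in the capture delegate, start the session at the first buffer's
// timestamp and append while the input is ready:
//   writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(buffer))
//   if input.isReadyForMoreMediaData { input.append(buffer) }
```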

Corrupt video capturing audio and video using AVAssetWriter

I figured it out. I was setting the assetWriter.startSession source time to 0, and then subtracting the start time from the current CACurrentMediaTime() when writing the pixel data.

I changed the assetWriter.startSession source time to the CACurrentMediaTime() and don't subtract the current time when writing the video frame.

Old start session code:

assetWriter.startWriting()
assetWriter.startSession(atSourceTime: kCMTimeZero)

New code that works:

let presentationStartTime = CMTimeMakeWithSeconds(CACurrentMediaTime(), 240)

assetWriter.startWriting()
assetWriter.startSession(atSourceTime: presentationStartTime)

AVAssetWriter - Capturing video but no audio

Ripped my hair out for days on this. My mistake was simple: the delegate method was being called, but it returned BEFORE reaching the audio statements. These were the culprits, which needed to be moved to after the audio-processing portion of my code:

if connection.isVideoOrientationSupported {
    connection.videoOrientation = currentVideoOrientation()
} else {
    return
}

if connection.isVideoStabilizationSupported {
    //connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationMode.auto
}
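One way to reorder the delegate so the video-only early return can no longer skip the audio path might look like this (a sketch, not the answerer's actual code; `videoWriterInput`, `audioWriterInput`, and `audioDataOutput` are assumed names):

```swift
// Sketch: handle audio before any video-only early return.
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

    if output == audioDataOutput {
        // Append audio first, so the video-only `return` below can't drop it.
        if audioWriterInput.isReadyForMoreMediaData {
            audioWriterInput.append(sampleBuffer)
        }
        return
    }

    // Video-only configuration; returning here no longer affects audio.
    if connection.isVideoOrientationSupported {
        connection.videoOrientation = currentVideoOrientation()
    } else {
        return
    }

    if videoWriterInput.isReadyForMoreMediaData {
        videoWriterInput.append(sampleBuffer)
    }
}
```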

