AVAssetWriter AVVideoExpectedSourceFrameRateKey (frame rate) ignored

You can control the timing of each sample you append to your AVAssetWriterInput directly with CMSampleBufferCreateCopyWithNewTiming.

You need to adjust the timing in the CMSampleTimingInfo you provide.
Retrieve the current timing info with CMSampleBufferGetOutputSampleTimingInfoArray, then, for each sample, compute the duration that yields 12 frames per second and adjust the presentation and decode timestamps to match that new duration.
You then make your copy and feed it to your writer's input.

Let's say you have existingSampleBuffer:

CMSampleBufferRef sampleBufferToWrite = NULL;
CMSampleTimingInfo sampleTimingInfo = {0};

CMSampleBufferGetSampleTimingInfo(existingSampleBuffer, 0, &sampleTimingInfo);

// modify duration & presentationTimeStamp
sampleTimingInfo.duration = CMTimeMake(1, 12); // or whatever frame rate you desire
sampleTimingInfo.presentationTimeStamp = CMTimeAdd(previousPresentationTimeStamp, sampleTimingInfo.duration);
previousPresentationTimeStamp = sampleTimingInfo.presentationTimeStamp; // should be initialised before passing here the first time

OSStatus status = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, existingSampleBuffer, 1, &sampleTimingInfo, &sampleBufferToWrite);

if (status == noErr) {
    // you can write sampleBufferToWrite
}

I'm making some assumptions in this code:

  • The sample buffer contains only one sample
  • The sample buffer contains uncompressed video (otherwise, you need to handle decodeTimeStamp as well)
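The same retiming idea can be sketched in Swift. This is a hedged sketch, not the original answer's code: the function name, the targetFPS parameter, and the writerInput object are my own assumptions, and for compressed video you would also need to adjust decodeTimeStamp.

```swift
import AVFoundation
import CoreMedia

// Sketch only: retime one sample buffer to a fixed target frame rate
// and hand it to an AVAssetWriterInput. Assumes uncompressed video
// (decodeTimeStamp is left untouched).
func retimeAndWrite(_ sample: CMSampleBuffer,
                    previousPTS: inout CMTime,
                    targetFPS: Int32,
                    writerInput: AVAssetWriterInput) {
    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(sample, at: 0, timingInfoOut: &timing)

    // Fix the duration for the desired rate, then advance the PTS by it.
    timing.duration = CMTimeMake(value: 1, timescale: targetFPS)
    timing.presentationTimeStamp = CMTimeAdd(previousPTS, timing.duration)
    previousPTS = timing.presentationTimeStamp

    var retimed: CMSampleBuffer?
    let status = CMSampleBufferCreateCopyWithNewTiming(
        allocator: kCFAllocatorDefault,
        sampleBuffer: sample,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleBufferOut: &retimed)
    if status == noErr, let retimed = retimed {
        writerInput.append(retimed)
    }
}
```

`previousPTS` is passed `inout` so the caller can initialize it once (e.g. to `.zero`) before the first sample, mirroring the `previousPresentationTimeStamp` variable in the Objective-C version.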

Using AVAssetWriter to re-encode H264 mov file - how to set frame-rate?

As mentioned in the question, I used SDAVAssetExportSession for ease of video export. I made some small changes to it that let me change the frame rate easily.

The main gist is that you can change the frame rate with AVMutableVideoComposition by setting its frameDuration property to match your desired frame rate, and passing this composition object to the AVAssetReaderVideoCompositionOutput used in the transcoding.

In SDAVAssetExportSession's buildDefaultVideoComposition method, I modified it to look a bit like this:

- (AVMutableVideoComposition *)buildDefaultVideoComposition
{
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    AVAssetTrack *videoTrack = [[self.asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    // ...

    videoComposition.frameDuration = CMTimeMake(1, myDesiredFramerate);

    // ...

    return videoComposition;
}

That did the trick.
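To show how the composition plugs into the reader side, here is a hedged Swift sketch (the function name, desiredFPS parameter, and pixel format choice are my own assumptions, not part of SDAVAssetExportSession):

```swift
import AVFoundation

// Sketch: build an AVAssetReaderVideoCompositionOutput whose video
// composition carries a custom frameDuration, for use in transcoding.
func makeVideoOutput(for asset: AVAsset, desiredFPS: Int32) -> AVAssetReaderVideoCompositionOutput? {
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return nil }

    // Start from the asset's own properties, then override the frame duration.
    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.frameDuration = CMTimeMake(value: 1, timescale: desiredFPS)

    let output = AVAssetReaderVideoCompositionOutput(
        videoTracks: [videoTrack],
        videoSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                        kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange])
    output.videoComposition = composition
    return output
}
```

The returned output would then be added to an AVAssetReader and drained in the usual copyNextSampleBuffer loop.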

Reduce Video Frame Rate While Keeping Same Length/Duration Using AVAssetWriter

For each dropped frame, you need to compensate by doubling the duration of the sample you're going to write.

while videoInput.isReadyForMoreMediaData {
    if let sample = assetReaderVideoOutput.copyNextSampleBuffer() {
        if counter % 2 == 0 {
            var timingInfo = CMSampleTimingInfo()
            var newSample: CMSampleBuffer?

            // Should check this call succeeded
            CMSampleBufferGetSampleTimingInfo(sample, 0, &timingInfo)

            timingInfo.duration = CMTimeMultiply(timingInfo.duration, 2)

            // Again, should check this call succeeded
            CMSampleBufferCreateCopyWithNewTiming(nil, sample, 1, &timingInfo, &newSample)
            videoInput.append(newSample!)
        }
        counter = counter + 1
    }
}

How do I control AVAssetWriter to write at the correct FPS

I'm reaching here, but I think this is where you're going wrong. Think of your video capture as a pipeline.

(1) Capture buffer -> (2) Do Something With buffer -> (3) Write buffer as frames in video.

Sounds like you've successfully completed (1) and (2): you're getting the buffers fast enough and you're processing them so you can vend them as frames.

The problem is almost certainly in (3) writing the video frames.

https://developer.apple.com/reference/avfoundation/avmutablevideocomposition

Check out the frameDuration setting on your AVMutableVideoComposition; you'll need something like CMTimeMake(1, 60) // 60 FPS or CMTimeMake(1, 240) // 240 FPS to get what you're after (telling the video to WRITE this many frames and encode at this rate).

Using AVAssetWriter, it's exactly the same principle, but you set the expected frame rate in the AVAssetWriterInput outputSettings by adding AVVideoExpectedSourceFrameRateKey inside the AVVideoCompressionPropertiesKey dictionary.

NSDictionary *videoCompressionSettings = @{
    AVVideoCodecKey  : AVVideoCodecH264,
    AVVideoWidthKey  : @(videoWidth),
    AVVideoHeightKey : @(videoHeight),
    AVVideoCompressionPropertiesKey : @{
        AVVideoExpectedSourceFrameRateKey : @(60),
        AVVideoAverageBitRateKey          : @(bitsPerSecond),
        AVVideoMaxKeyFrameIntervalKey     : @(1)
    }
};

To expand a little more: you can't strictly control or sync your camera capture exactly to the output/playback rate. The timing just doesn't work that way and isn't that exact, and of course the processing pipeline adds overhead. When you capture frames they are time-stamped, which you've seen, but in the writing/compression phase the writer uses only the frames it needs to produce the output specified for the composition.

It goes both ways: you could capture only 30 FPS and write out at 240 FPS, and the video would display fine; you'd just have a lot of frames "missing" and being filled in by the algorithm. You can even vend only 1 frame per second and play back at 30 FPS; the two are separate from each other (how fast I capture vs. how many frames I present per second).

As to how to play it back at different speed, you just need to tweak the playback speed - slow it down as needed.

If you've correctly set the time base (frameDuration), it will always play back "normal" - you're telling it "playback is X frames per second". Of course, your eye may notice a difference (almost certainly between low FPS and high FPS), and the screen may not refresh that fast (above 60 FPS), but regardless the video will be at a "normal" 1x speed for its time base. For example, if my time base is 120 and I slow playback to 0.5x, I now effectively see 60 FPS, and one second of playback takes two seconds.

You control the playback speed by setting the rate property on AVPlayer https://developer.apple.com/reference/avfoundation/avplayer
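As an illustrative sketch (the file URL is a placeholder, not from the original answer), slowing playback to half speed in Swift looks like this:

```swift
import AVFoundation

// Sketch: play a clip at half speed via AVPlayer's rate property.
let player = AVPlayer(url: URL(fileURLWithPath: "/path/to/video.mov"))
player.play()
player.rate = 0.5   // 0.5x: one second of media takes two seconds to show
```

Setting rate above 1.0 speeds playback up instead; rate = 0 pauses.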

AVAssetWriter / AVAssetWriterInputPixelBufferAdaptor - black frames and frame rate

You don't need to count frame timestamps on your own. You can get the timestamp of the current sample with

CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

However, it seems to me you are just passing the pixel buffer of the frame to the adaptor without modifications. Wouldn't it be easier to pass the sample buffer itself directly to the assetWriterInput like the following?

[self.assetWriterInput appendSampleBuffer:sampleBuffer];

