Pulling Data from a CMSampleBuffer in Order to Create a Deep Copy

Alright, I think I finally got it. I created a helper extension to make a full copy of a CVPixelBuffer:

extension CVPixelBuffer {
    /// Creates a deep copy of the pixel buffer, duplicating its pixel data.
    func copy() -> CVPixelBuffer {
        precondition(CFGetTypeID(self) == CVPixelBufferGetTypeID(), "copy() cannot be called on a non-CVPixelBuffer")

        var _copy: CVPixelBuffer?
        CVPixelBufferCreate(
            nil,
            CVPixelBufferGetWidth(self),
            CVPixelBufferGetHeight(self),
            CVPixelBufferGetPixelFormatType(self),
            CVBufferGetAttachments(self, kCVAttachmentMode_ShouldPropagate)?.takeUnretainedValue(),
            &_copy)

        guard let copy = _copy else { fatalError("CVPixelBufferCreate failed") }

        CVPixelBufferLockBaseAddress(self, kCVPixelBufferLock_ReadOnly)
        CVPixelBufferLockBaseAddress(copy, 0)

        if CVPixelBufferIsPlanar(self) {
            // Planar formats (e.g. the 420 formats the camera delivers): copy each plane.
            for plane in 0..<CVPixelBufferGetPlaneCount(self) {
                let dest = CVPixelBufferGetBaseAddressOfPlane(copy, plane)
                let source = CVPixelBufferGetBaseAddressOfPlane(self, plane)
                let height = CVPixelBufferGetHeightOfPlane(self, plane)
                let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(self, plane)

                memcpy(dest, source, height * bytesPerRow)
            }
        } else {
            // Non-planar formats (e.g. BGRA) have a single base address.
            memcpy(CVPixelBufferGetBaseAddress(copy),
                   CVPixelBufferGetBaseAddress(self),
                   CVPixelBufferGetHeight(self) * CVPixelBufferGetBytesPerRow(self))
        }

        CVPixelBufferUnlockBaseAddress(copy, 0)
        CVPixelBufferUnlockBaseAddress(self, kCVPixelBufferLock_ReadOnly)

        return copy
    }
}

Now you can use this in your didOutputSampleBuffer method:

guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

let copy = pixelBuffer.copy()

toProcess.append(copy)

But be aware: one such pixel buffer takes up about 3 MB of memory at 1080p, which means that after 100 frames you are already holding roughly 300 MB, which is about the point at which the iPhone says STAHP (and the app crashes).
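To keep that growth bounded, one option is to cap how many copies you retain and drop the oldest frame once the cap is hit. A minimal sketch of that idea (the maxBufferedFrames limit is an illustrative value of my own, not part of the original answer):

let maxBufferedFrames = 60   // roughly 180 MB at ~3 MB per 1080p frame

toProcess.append(pixelBuffer.copy())
if toProcess.count > maxBufferedFrames {
    toProcess.removeFirst()   // discard the oldest cached frame instead of growing without bound
}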

Note that you don't actually need to copy the CMSampleBuffer itself: for a video frame its only real payload is the CVPixelBuffer, so copying that is enough.

Deep Copy of Audio CMSampleBuffer

Here is a working solution I finally implemented. I sent this snippet to Apple Developer Technical Support and asked them to verify whether it is a correct way to copy an incoming sample buffer. The basic idea is to copy the AudioBufferList, then create a CMSampleBuffer and set the AudioBufferList on that new sample buffer.

AudioBufferList audioBufferList;
CMBlockBufferRef blockBuffer;
// Create an AudioBufferList containing the data from the CMSampleBuffer,
// and a CMBlockBuffer which references the data in that AudioBufferList.
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);

// Stage a byte-for-byte copy of the AudioBufferList struct; its buffer pointers
// still reference the data kept alive by blockBuffer.
NSUInteger size = sizeof(audioBufferList);
char buffer[size];
memcpy(buffer, &audioBufferList, size);
NSData *bufferData = [NSData dataWithBytes:buffer length:size];
const void *copyBufferData = [bufferData bytes];

CMSampleBufferRef copyBuffer = NULL;
OSStatus status = -1;

/* Format description */
AudioStreamBasicDescription audioFormat = *CMAudioFormatDescriptionGetStreamBasicDescription((CMAudioFormatDescriptionRef)CMSampleBufferGetFormatDescription(sampleBuffer));

CMFormatDescriptionRef format = NULL;
status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &audioFormat, 0, NULL, 0, NULL, NULL, &format);
if (status != noErr)
{
    NSLog(@"Error in CMAudioFormatDescriptionCreate");
    CFRelease(blockBuffer);
    return;
}

/* Create the sample buffer */
CMItemCount framesCount = CMSampleBufferGetNumSamples(sampleBuffer);
// Note: the duration assumes a 44.1 kHz stream; derive it from
// audioFormat.mSampleRate if your format can differ.
CMSampleTimingInfo timing = {.duration = CMTimeMake(1, 44100),
                             .presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
                             .decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)};

status = CMSampleBufferCreate(kCFAllocatorDefault, NULL, NO, NULL, NULL, format, framesCount, 1, &timing, 0, NULL, &copyBuffer);

if (status != noErr) {
    NSLog(@"Error in CMSampleBufferCreate");
    CFRelease(blockBuffer);
    return;
}

/* Copy the AudioBufferList into the sample buffer */
AudioBufferList receivedAudioBufferList;
memcpy(&receivedAudioBufferList, copyBufferData, sizeof(receivedAudioBufferList));

// Creates a CMBlockBuffer containing a copy of the data from the AudioBufferList.
status = CMSampleBufferSetDataBufferFromAudioBufferList(copyBuffer, kCFAllocatorDefault, kCFAllocatorDefault, 0, &receivedAudioBufferList);
if (status != noErr) {
    NSLog(@"Error in CMSampleBufferSetDataBufferFromAudioBufferList");
    CFRelease(blockBuffer);
    return;
}

// The block buffer was retained for us by
// CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer; release it now that
// its contents have been copied into copyBuffer.
CFRelease(blockBuffer);

Code-Level Support answer:

This code looks ok (though you’ll want to add some additional error
checking). I've successfully tested it in an app that implements the
AVCaptureAudioDataOutput delegate
captureOutput:didOutputSampleBuffer:fromConnection: method to
capture and record audio. The captured audio I'm getting when using
this deep copy code appears to be the same as what I get when directly
using the provided sample buffer (without the deep copy).

Apple Developer Technical Support

Create a copy of CMSampleBuffer in Swift 2.0

The problem is that you're attempting to use the variable bufferCopy before it is initialized.

You've declared a type for it, but haven't allocated the memory it's pointing to.

You should instead create CMSampleBuffers using CMSampleBufferCreate: https://developer.apple.com/library/tvos/documentation/CoreMedia/Reference/CMSampleBuffer/index.html#//apple_ref/c/func/CMSampleBufferCreate

You should then be able to copy the buffer into it (as long as the format matches the buffer you're copying from).
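For the uninitialized-variable part specifically, the quickest fix in Swift is to let Core Media allocate the buffer for you. A minimal sketch using CMSampleBufferCreateCopy with the current Swift API names; note that, as pointed out further down, this only produces a shallow copy:

var bufferCopy: CMSampleBuffer?
let status = CMSampleBufferCreateCopy(allocator: kCFAllocatorDefault,
                                      sampleBufferToCopy: sampleBuffer,
                                      sampleBufferOut: &bufferCopy)
guard status == noErr, let copy = bufferCopy else { return }
// `copy` is now a valid, allocated sample buffer (still sharing the original's data).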

Creating a copy of CMSampleBuffer in Swift returns OSStatus -12743 (Invalid Media Format)

I was able to fix the problem by creating a format description from the newly created image buffer and using it instead of the format description from the original sample buffer. Unfortunately, while that fixes the problem here, the format descriptions then don't match, which causes problems further down.

How can I save the CMSampleBuffer from didOutputSampleBuffer into a CFArrayRef (or another container)?

CMSampleBufferCreateCopy performs only a shallow copy, according to the answer above on how to deep copy a pixel buffer.

Instead, recreate the sample buffer with a new, deep-copied pixel buffer; then the sample buffers can be cached.

But this approach is costly: I can't hold too many CMSampleBufferRefs in memory (each takes about 3 MB), and the deep copy itself takes time.
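Putting the pieces together, here is a rough Swift sketch of that "recreate the sample buffer around a deep-copied pixel buffer" approach. It relies on the CVPixelBuffer.copy() extension from the top of this page, builds the format description from the new image buffer (the fix for the -12743 error above), and reuses the original timing. It uses the current Core Media Swift names and skips most error handling, so treat it as an outline rather than production code:

import CoreMedia
import CoreVideo

func deepCopy(of sampleBuffer: CMSampleBuffer) -> CMSampleBuffer? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let copiedPixelBuffer = pixelBuffer.copy()

    // Describe the copied pixel buffer; reusing the original sample buffer's
    // format description can trigger -12743 if it no longer matches.
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: copiedPixelBuffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    // Carry the original timing over so the copied frame keeps its timestamps.
    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(sampleBuffer, at: 0, timingInfoOut: &timing)

    var copy: CMSampleBuffer?
    CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                       imageBuffer: copiedPixelBuffer,
                                       dataReady: true,
                                       makeDataReadyCallback: nil,
                                       refcon: nil,
                                       formatDescription: format,
                                       sampleTiming: &timing,
                                       sampleBufferOut: &copy)
    return copy
}

The returned buffer owns its own pixel data, so it can safely be cached, but each copy still costs the roughly 3 MB per 1080p frame mentioned above.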


