Create CMSampleBufferRef from an AudioInputIOProc
Three things look wrong:

1. You declare that the format ID is `kAudioFormatMPEG4AAC`, but configure it as LPCM. So try `monoStreamFormat.mFormatID = kAudioFormatLinearPCM;`. You also call the format "mono" when it's configured as stereo.
2. Why use `mach_timebase_info`, which could leave gaps in your audio presentation timestamps? Use the sample count instead: `CMTime presentationTime = CMTimeMake(numSamplesProcessed, 44100);`
3. Your `CMSampleTimingInfo` looks wrong, and you're not using `presentationTime`. You set the buffer's duration as 1 sample long when it can be `numSamples`, and its presentation time to zero, which can't be right. Something like this would make more sense: `CMSampleTimingInfo timing = { CMTimeMake(numSamples, 44100), presentationTime, kCMTimeInvalid };`
And some questions:

- Does your `AudioBufferList` have the expected 2 `AudioBuffer`s?
- Do you have a runnable version of this?
p.s. I'm guilty of it myself, but allocating memory on the audio thread is considered harmful in audio dev.
Deep Copy of Audio CMSampleBuffer
Here is a working solution I finally implemented. I sent this snippet to Apple Developer Technical Support and asked them to check whether it is a correct way to copy an incoming sample buffer. The basic idea is to copy the `AudioBufferList`, then create a `CMSampleBuffer` and set the `AudioBufferList` on that sample buffer.
AudioBufferList audioBufferList;
CMBlockBufferRef blockBuffer;
//Create an AudioBufferList containing the data from the CMSampleBuffer,
//and a CMBlockBuffer which references the data in that AudioBufferList.
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);
NSUInteger size = sizeof(audioBufferList);
char buffer[size];
memcpy(buffer, &audioBufferList, size);
//This is the Audio data.
NSData *bufferData = [NSData dataWithBytes:buffer length:size];
const void *copyBufferData = [bufferData bytes];
copyBufferData = (char *)copyBufferData;
CMSampleBufferRef copyBuffer = NULL;
OSStatus status = -1;
/* Format Description */
AudioStreamBasicDescription audioFormat = *CMAudioFormatDescriptionGetStreamBasicDescription((CMAudioFormatDescriptionRef)CMSampleBufferGetFormatDescription(sampleBuffer));
CMFormatDescriptionRef format = NULL;
status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &audioFormat, 0, NULL, 0, NULL, NULL, &format);
if (status != noErr)
{
    NSLog(@"Error in CMAudioFormatDescriptionCreate");
    CFRelease(blockBuffer);
    return;
}
/* Create sample buffer */
CMItemCount framesCount = CMSampleBufferGetNumSamples(sampleBuffer);
CMSampleTimingInfo timing = {
    .duration = CMTimeMake(1, 44100),
    .presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
    .decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)
};
status = CMSampleBufferCreate(kCFAllocatorDefault, NULL, false, NULL, NULL, format, framesCount, 1, &timing, 0, NULL, &copyBuffer);
if (status != noErr) {
    NSLog(@"Error in CMSampleBufferCreate");
    CFRelease(blockBuffer);
    return;
}
/* Copy BufferList to Sample Buffer */
AudioBufferList receivedAudioBufferList;
memcpy(&receivedAudioBufferList, copyBufferData, sizeof(receivedAudioBufferList));
//Creates a CMBlockBuffer containing a copy of the data from the
//AudioBufferList.
status = CMSampleBufferSetDataBufferFromAudioBufferList(copyBuffer, kCFAllocatorDefault, kCFAllocatorDefault, 0, &receivedAudioBufferList);
if (status != noErr) {
    NSLog(@"Error in CMSampleBufferSetDataBufferFromAudioBufferList");
    CFRelease(blockBuffer);
    return;
}
CFRelease(blockBuffer);
Code-Level Support answer:

"This code looks OK (though you'll want to add some additional error checking). I've successfully tested it in an app that implements the AVCaptureAudioDataOutput delegate `captureOutput:didOutputSampleBuffer:fromConnection:` method to capture and record audio. The captured audio I'm getting when using this deep copy code appears to be the same as what I get when directly using the provided sample buffer (without the deep copy)."

— Apple Developer Technical Support
Creating `CMSampleBufferRef` from a .mov file
Finally, I fixed my problem using `AVMutableComposition`. Here is my code:
AVMutableComposition *mixComposition = [AVMutableComposition new];
AVMutableCompositionTrack *mutableCompVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVURLAsset *videoAsset = [[AVURLAsset alloc]initWithURL:3SecFileURL options:nil];
CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, [videoAsset duration]);
CGAffineTransform rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
[mutableCompVideoTrack setPreferredTransform:rotationTransform];
CMTime currentCMTime = kCMTimeZero;
for (NSInteger count = 0; count < 5; count++)
{
    [mutableCompVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:currentCMTime error:nil];
    currentCMTime = CMTimeAdd(currentCMTime, [videoAsset duration]);
}
NSString *fullMoviePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[@"moviefull" stringByAppendingPathExtension:@"mov"]];
NSURL *finalVideoFileURL = [NSURL fileURLWithPath:fullMoviePath];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetPassthrough];
[exportSession setOutputFileType:AVFileTypeQuickTimeMovie];
[exportSession setOutputURL:finalVideoFileURL];
// Use the composition's duration directly: reusing only its .value with
// CMTimeMake(val, 1) would reinterpret it at timescale 1 and stretch the range.
CMTimeRange range = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
[exportSession setTimeRange:range];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch ([exportSession status])
    {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@ %@", [[exportSession error] localizedDescription], [[exportSession error] debugDescription]);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export canceled");
            break;
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Export complete!");
            break;
        default:
            NSLog(@"Export status: %ld", (long)[exportSession status]);
            break;
    }
}];