Recording to AAC from RemoteIO: data is getting written but file unplayable


So I finally sorted this out! Ugh, what an information scavenger hunt.

Anyway, here is the bit in the docs for ExtAudioFile that I missed: I wasn't setting the kExtAudioFileProperty_CodecManufacturer property. Data was being written to my .m4a file, but it was unreadable at playback. To sum up: I have a bunch of AUSamplers -> AUMixer -> RemoteIO. A render callback on the RemoteIO instance writes the data out to disk in compressed .m4a format. So it is possible to generate compressed audio on the fly (iOS 5 / iPad 2).

Seems pretty robust - I had some printf statements in the render callback and the write still worked fine.

Yay!

ExtAudioFileProperty_CodecManufacturer
The manufacturer of the codec to be used by the extended audio file object. Value is a read/write UInt32. You must specify this property before setting the kExtAudioFileProperty_ClientDataFormat property, which in turn triggers the creation of the codec. Use this property in iOS to choose between a hardware or software encoder, by specifying kAppleHardwareAudioCodecManufacturer or kAppleSoftwareAudioCodecManufacturer.
Available in Mac OS X v10.7 and later.
Declared in ExtendedAudioFile.h.

    // specify codec
    UInt32 codec = kAppleHardwareAudioCodecManufacturer;
    UInt32 size = sizeof(codec);
    result = ExtAudioFileSetProperty(extAudioFileRef,
                                     kExtAudioFileProperty_CodecManufacturer,
                                     size,
                                     &codec);

    if (result) printf("ExtAudioFileSetProperty %ld\n", (long)result);

RemoteIO recorded audio file is either silent or 4KB

Aside from the (fairly arduous but better covered by Apple's example code) setup process of RemoteIO itself, the key points of insight were:

  1. Using the same AudioStreamBasicDescription (*audioFormat) that I used to set up the stream in the first place. I don't know how long I spent trying to set up a new one with slightly different parameters, based on other questions and posts. Just referencing the stream attributes from my ivar was sufficient.
  2. Set an "isRecording" bool so that you can turn write-to-file on and off without having to tear down and re-set-up your RemoteIO session.
  3. It is OK to write to the file right in the recordingCallback, as long as you do it asynchronously (ExtAudioFileWriteAsync). Lots of info talks about doing it in the playbackCallback or setting up some third audioFileWriteCallback instead; those approaches gave me silent files or 4KB (i.e. empty) files. Don't do it.
  4. Also, be sure to use a copy of the ioData that got passed into the callback.

in recordingCallback after AudioUnitRender into bufferList:

    AudioDeviceManager *THIS = (__bridge AudioDeviceManager *)inRefCon;
    if (THIS->isRecording) {
        ExtAudioFileWriteAsync(THIS->extAudioFileRef, inNumberFrames, bufferList);
    }
  5. Start and stop recording functions, for reference:


-(void)startRecording {

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *destinationFilePath = [documentsDirectory stringByAppendingPathComponent:kAudioFileName];
    CFURLRef destinationURL = CFURLCreateWithFileSystemPath(kCFAllocatorDefault, (CFStringRef)destinationFilePath, kCFURLPOSIXPathStyle, false);

    OSStatus status;

    // create the capture file
    status = ExtAudioFileCreateWithURL(destinationURL, kAudioFileWAVEType, &audioFormat, NULL, kAudioFileFlags_EraseFile, &extAudioFileRef);
    CFRelease(destinationURL); // the file object holds its own reference; release ours to avoid a leak
    if (status) NSLog(@"Error creating file with URL: %ld", (long)status);

    // use the same "audioFormat" AudioStreamBasicDescription we used to set up RemoteIO in the first place
    status = ExtAudioFileSetProperty(extAudioFileRef, kExtAudioFileProperty_ClientDataFormat, sizeof(AudioStreamBasicDescription), &audioFormat);
    if (status) NSLog(@"Error setting client data format: %ld", (long)status);

    // initialize the file for writing before the render callback starts
    // calling ExtAudioFileWriteAsync
    ExtAudioFileSeek(extAudioFileRef, 0);
    ExtAudioFileWrite(extAudioFileRef, 0, NULL);

    isRecording = YES;
}

- (void)stopRecording {

    isRecording = NO;
    OSStatus status = ExtAudioFileDispose(extAudioFileRef);
    if (status) printf("ExtAudioFileDispose %ld\n", (long)status);
}

That's it!

Recording from RemoteIO: resulting .caf is pitch shifted slower + distorted

Ok - found some code that solves this - though I don't fully understand why.

I had been setting mBitsPerChannel to 16 for both the RemoteIO output stream and the ExtAudioFile. The result was slowed-down, scratchy audio. Setting the ExtAudioFile's mBitsPerChannel to 32 and adding the kAudioFormatFlagsNativeEndian flag solves the problem: the .caf audio is perfect (while leaving the RemoteIO output stream settings as they were).

But then also setting the RemoteIO output stream settings to match my new settings also works. So I'm confused. Shouldn't this work so long as the AudioStreamBasicDescription settings are symmetrical for the RemoteIO instance and the ExtFileRef?

Anyway... the working setting is below.

size_t bytesPerSample = sizeof(AudioUnitSampleType); // 4 on iOS, where AudioUnitSampleType is 8.24 fixed point

AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate       = graphSampleRate;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mBytesPerPacket   = bytesPerSample;
audioFormat.mBytesPerFrame    = bytesPerSample;
audioFormat.mFramesPerPacket  = 1;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel   = 8 * bytesPerSample;
audioFormat.mReserved         = 0;

ExtAudioFileWrite to m4a/AAC failing on dual-core devices (iPad 2, iPhone 4S)

I had a very similar problem where I was attempting to use Extended Audio File Services in order to stream PCM sound into an m4a file on an iPad 2. Everything appeared to work except that every call to ExtAudioFileWrite returned the error code -66567 (kExtAudioFileError_MaxPacketSizeUnknown). The fix I eventually found was to set the "Codec Manufacturer" to software instead of hardware. So place

UInt32 codecManf = kAppleSoftwareAudioCodecManufacturer;
ExtAudioFileSetProperty(FileToWrite, kExtAudioFileProperty_CodecManufacturer, sizeof(UInt32), &codecManf);

just before you set the client data format.

This would lead me to believe that Apple's hardware codecs can only support very specific encoding, but the software codecs can more reliably do what you want. In my case, the software codec translation to m4a takes 50% longer than writing the exact same file to LPCM format.

Does anyone know whether Apple specifies somewhere what their audio codec hardware is capable of? It seems that software engineers are stuck playing the hours-long guessing game of setting the ~20 parameters in the AudioStreamBasicDescription and AudioChannelLayout for the client and for the file to every possible permutation until something works...

RemoteIO and Recording AAC on iOS 6

Problem solved. Use the software codec:

// specify codec
UInt32 codec = kAppleSoftwareAudioCodecManufacturer;
UInt32 codecSize = sizeof(codec);
status = ExtAudioFileSetProperty(recorderState.audioFile,
                                 kExtAudioFileProperty_CodecManufacturer,
                                 codecSize,
                                 &codec);

Before I was using kAppleHardwareAudioCodecManufacturer... evidently the hardware codec can't handle interruptions. Hope others find this useful!

iOS audio unit RemoteIO playback while recording

I have not used VoIP or kAudioSessionCategory_PlayAndRecord. But if you want to record/transmit voice picked up from the mic and play back incoming data from network packets: there is a good Apple sample that includes both mic and playback. Also, if you have not read Apple's doc on this, I would strongly recommend it.

In short: you need to create an AudioUnit instance and configure two callbacks on it: one for the mic and one for playback. The mic callback supplies the data picked up from the mic; you can then convert it and transmit it to other devices with whatever network protocol you choose. The playback callback is where you supply the incoming data from other network devices for playback.


