iOS: Sample Code for Simultaneous Record and Playback

As suggested by Viraj, here is the answer.

Yes, you can achieve very good results using AVFoundation. First, note that for both the player and the recorder, activation is a two-step process.

First you prime it.

Then you play it.

So, prime everything. Then play everything.
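
With AVAudioPlayer/AVAudioRecorder that pattern looks roughly like this (a sketch in the Swift 2 style used elsewhere on this page; `playbackURL`, `recordURL`, and the empty settings dictionary are placeholders, not the answerer's actual code):

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
try session.setActive(true)

let player = try AVAudioPlayer(contentsOfURL: playbackURL)
let recorder = try AVAudioRecorder(URL: recordURL, settings: [:])

// Step 1: prime everything (allocate buffers, warm up the hardware)
player.prepareToPlay()
recorder.prepareToRecord()

// Step 2: start everything back to back, so the actual start times
// land as close together as possible
player.play()
recorder.record()
```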

This will get your latency down to about 70 ms. I tested this by recording a metronome tick, then playing it back through the speakers while holding the iPhone up to the speakers and simultaneously recording.

The second recording had a clear echo, which I found to be ~70ms. I could have analysed the signal in Audacity to get an exact offset.

So, to line everything up, I just call performSelector:withObject:afterDelay: with a delay of 70.0/1000.0 seconds.
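
In Swift, that compensation might look like this (the `startRecording` selector name is hypothetical; the 70 ms figure is the one measured above and may differ on your hardware):

```swift
// Start the recorder ~70 ms late to compensate for the measured
// output latency; "startRecording" is a hypothetical method here
self.performSelector("startRecording", withObject: nil, afterDelay: 70.0 / 1000.0)
```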

There may be hidden snags. For example, the delay may differ from device to device, or even vary with device activity. The thread could even get interrupted/rescheduled between starting the player and starting the recorder.

But it works, and is a lot tidier than messing around with audio queues / units.

Playback and Recording simultaneously using Core Audio in iOS

The RemoteIO Audio Unit can be used for simultaneous record and play. There are plenty of examples of recording with RemoteIO (aurioTouch) and of playing with RemoteIO. Just enable both the unit's input and output, and handle both buffer callbacks.
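
A bare-bones sketch of that setup (Swift 2 era, matching the longer example further down this page; error handling omitted, and the bus numbers follow the usual convention of 1 for input and 0 for output):

```swift
import AudioToolbox

// Sketch: create a RemoteIO unit and enable both its microphone side
// (input scope, bus 1) and its speaker side (output scope, bus 0)
var desc = AudioComponentDescription(
    componentType: kAudioUnitType_Output,
    componentSubType: kAudioUnitSubType_RemoteIO,
    componentFlags: 0,
    componentFlagsMask: 0,
    componentManufacturer: kAudioUnitManufacturer_Apple)

var remoteIO: AudioUnit = nil
AudioComponentInstanceNew(AudioComponentFindNext(nil, &desc), &remoteIO)

var one: UInt32 = 1
AudioUnitSetProperty(remoteIO, kAudioOutputUnitProperty_EnableIO,
                     kAudioUnitScope_Input, 1, &one, UInt32(sizeof(UInt32)))
AudioUnitSetProperty(remoteIO, kAudioOutputUnitProperty_EnableIO,
                     kAudioUnitScope_Output, 0, &one, UInt32(sizeof(UInt32)))
// ...then set the stream formats and the two buffer callbacks
```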

How to record and play audio simultaneously in iOS using Swift?

You are setting the InputCallback and RenderCallback methods incorrectly. The other settings seem OK, so your init method should look like this:

init() {

    var status: OSStatus

    do {
        // Configure the session for simultaneous play and record up front
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
        try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(preferredIOBufferDuration)
    } catch let error as NSError {
        print(error)
    }

    // Describe the unit: VoiceProcessingIO adds echo cancellation;
    // use kAudioUnitSubType_RemoteIO if you want the unprocessed signal
    var desc = AudioComponentDescription()
    desc.componentType = kAudioUnitType_Output
    desc.componentSubType = kAudioUnitSubType_VoiceProcessingIO
    desc.componentFlags = 0
    desc.componentFlagsMask = 0
    desc.componentManufacturer = kAudioUnitManufacturer_Apple

    let inputComponent: AudioComponent = AudioComponentFindNext(nil, &desc)

    status = AudioComponentInstanceNew(inputComponent, &audioUnit)
    checkStatus(status)

    // Enable IO on the input bus (kInputBus = 1) and output bus (kOutputBus = 0)
    var flag = UInt32(1)
    status = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, kInputBus, &flag, UInt32(sizeof(UInt32)))
    checkStatus(status)

    status = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, kOutputBus, &flag, UInt32(sizeof(UInt32)))
    checkStatus(status)

    // 16-bit signed mono PCM at 8 kHz
    var audioFormat = AudioStreamBasicDescription()
    audioFormat.mSampleRate = 8000
    audioFormat.mFormatID = kAudioFormatLinearPCM
    audioFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked
    audioFormat.mFramesPerPacket = 1
    audioFormat.mChannelsPerFrame = 1
    audioFormat.mBitsPerChannel = 16
    audioFormat.mBytesPerPacket = 2
    audioFormat.mBytesPerFrame = 2

    // Note: the property data size must be sizeof(AudioStreamBasicDescription),
    // not sizeof(UInt32)
    status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, kInputBus, &audioFormat, UInt32(sizeof(AudioStreamBasicDescription)))
    checkStatus(status)

    status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, kOutputBus, &audioFormat, UInt32(sizeof(AudioStreamBasicDescription)))
    checkStatus(status)

    // Set input/recording callback on the input bus
    var inputCallbackStruct = AURenderCallbackStruct(inputProc: recordingCallback, inputProcRefCon: UnsafeMutablePointer(unsafeAddressOf(self)))
    status = AudioUnitSetProperty(audioUnit, AudioUnitPropertyID(kAudioOutputUnitProperty_SetInputCallback), AudioUnitScope(kAudioUnitScope_Global), kInputBus, &inputCallbackStruct, UInt32(sizeof(AURenderCallbackStruct)))
    checkStatus(status)

    // Set output/renderer/playback callback on the output bus
    var renderCallbackStruct = AURenderCallbackStruct(inputProc: playbackCallback, inputProcRefCon: UnsafeMutablePointer(unsafeAddressOf(self)))
    status = AudioUnitSetProperty(audioUnit, AudioUnitPropertyID(kAudioUnitProperty_SetRenderCallback), AudioUnitScope(kAudioUnitScope_Global), kOutputBus, &renderCallbackStruct, UInt32(sizeof(AURenderCallbackStruct)))
    checkStatus(status)

    // We supply our own buffers in the recording callback
    flag = 0
    status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_ShouldAllocateBuffer, kAudioUnitScope_Output, kInputBus, &flag, UInt32(sizeof(UInt32)))
    checkStatus(status)
}

Try with this code and let us know if that helps.
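
The init above refers to `recordingCallback` and `playbackCallback`, which are not shown in the question. A sketch of what they might look like in the same Swift 2 era style (the `AudioController` class name, its `audioUnit` and `bufferList` properties, and the placeholder logic are assumptions, not the asker's actual code):

```swift
import AudioToolbox

// Sketch only: the input callback pulls the captured frames out of the
// unit; the render callback outputs silence as a stand-in for your samples
func recordingCallback(inRefCon: UnsafeMutablePointer<Void>,
        ioActionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>,
        inTimeStamp: UnsafePointer<AudioTimeStamp>,
        inBusNumber: UInt32,
        inNumberFrames: UInt32,
        ioData: UnsafeMutablePointer<AudioBufferList>) -> OSStatus {
    let controller = unsafeBitCast(inRefCon, AudioController.self)
    // Render the microphone samples into a buffer list we own, then hand
    // them to the app (e.g. push them onto a ring buffer for playback)
    return AudioUnitRender(controller.audioUnit, ioActionFlags, inTimeStamp,
                           inBusNumber, inNumberFrames, &controller.bufferList)
}

func playbackCallback(inRefCon: UnsafeMutablePointer<Void>,
        ioActionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>,
        inTimeStamp: UnsafePointer<AudioTimeStamp>,
        inBusNumber: UInt32,
        inNumberFrames: UInt32,
        ioData: UnsafeMutablePointer<AudioBufferList>) -> OSStatus {
    // Fill ioData with previously recorded samples; silence shown here
    let buffers = UnsafeMutableAudioBufferListPointer(ioData)
    for buffer in buffers {
        memset(buffer.mData, 0, Int(buffer.mDataByteSize))
    }
    return noErr
}
```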

Problems Recording and Playing Back Audio Simultaneously

The appropriate way to do this is via the AudioUnit APIs, even though it seems like a common scenario that should be handled by higher-level APIs.

I wrote a small demo app using AudioUnit. You're free to try it out and modify it to suit your purpose. The demo app records audio and plays it back simultaneously, but it's best to use earphones to hear the effect.

Record and play simultaneously on iOS (Phonegap build)

In order to be able to simultaneously play media and record audio, one must set the category property of the AVAudioSession to AVAudioSessionCategoryPlayAndRecord. To do so you must deploy a custom iOS plugin that sets the corresponding value.
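
On the native side, such a plugin essentially has to make this one call (shown in Swift for consistency with the rest of this page, though the Cordova Media Plugin itself is Objective-C):

```swift
import AVFoundation

do {
    // Allow playback and recording to coexist in the same session
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Could not configure audio session: \(error)")
}
```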

At the time of writing, this process is not quite straightforward because of a bug in the Cordova Media Plugin. Before starting to record, the plugin indiscriminately sets the category of the AVAudioSession to AVAudioSessionCategoryRecord. Because of that, once recording has started, playing back the desired media becomes impossible unless you explicitly set the AVAudioSession category to play-and-record after calling the record method. This means you can only start playing your media after recording has started.

However, this workaround may not be acceptable in scenarios that require media to start playing before sound is recorded. Thus, I have filed a bug report regarding the issue, which you may find here:

iOS Media plugin: cannot play video and record audio simultaneously

Plus, I have applied the fix and performed the following pull request:

Check if avSession.category is already set to "AVAudioSessionCategoryPlayAndRecord" before recording

It's really a minor fix (literally two lines of code), so I believe it will soon be applied to the master branch.
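
The fix boils down to a guard along these lines (sketched in Swift; the plugin itself is Objective-C, so this is illustrative only):

```swift
// Only downgrade to record-only if the app hasn't already
// configured the session for play-and-record
let session = AVAudioSession.sharedInstance()
if session.category != AVAudioSessionCategoryPlayAndRecord {
    try session.setCategory(AVAudioSessionCategoryRecord)
}
```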

Record sound and play simultaneously in iphone SDKs programmatically

Check out this sample from Apple.

http://developer.apple.com/library/ios/#samplecode/aurioTouch/Introduction/Intro.html

It should get you started.


