AVAudioPCMBuffer built programmatically, not playing back in stereo

Unsafe pointers in Swift take some getting used to.

floatChannelData.memory[j] only accesses the first channel of data. To access the other channel(s), you have a couple of choices:

Using advancedBy

// Channel numbering starts at 0

// Get a channel pointer aka UnsafePointer<UnsafeMutablePointer<Float>>
let channelN = floatChannelData.advancedBy(channelNumber)

// Get channel data aka UnsafeMutablePointer<Float>
let channelNData = channelN.memory

// Get first two floats of channel channelNumber
let floatOne = channelNData.memory
let floatTwo = channelNData.advancedBy(1).memory

Using Subscript

// Get channel data aka UnsafeMutablePointer<Float>
let channelNData = floatChannelData[channelNumber]

// Get first two floats of channel channelNumber
let floatOne = channelNData[0]
let floatTwo = channelNData[1]

Using subscript is much clearer, and the step of advancing the pointer and then
manually accessing memory becomes implicit.


For your loop, try accessing all channels of the buffer by doing something like this:

for i in 0..<Int(capacity) {
    for n in 0..<Int(buffer.format.channelCount) {
        barBuffer.floatChannelData[n][i] = buffer.floatChannelData[n][i]
    }
}

Hope this helps!
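Putting the pieces together, here is a minimal sketch of building a stereo buffer programmatically and writing the same samples to both channels. It uses modern Swift syntax (where `.memory` became `.pointee` and `advancedBy(_:)` became `advanced(by:)`); the sample rate, frame count, and sine frequency are illustrative, not from the question:

```swift
import AVFoundation

// Illustrative values: 44.1 kHz stereo, 0.1 s of a 440 Hz sine wave.
let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)!
let frameCount: AVAudioFrameCount = 4410
let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount)!

if let channels = buffer.floatChannelData {
    for frame in 0..<Int(frameCount) {
        let sample = sinf(2 * .pi * 440 * Float(frame) / 44100)
        // Write the sample into every channel, not just channel 0.
        for channel in 0..<Int(format.channelCount) {
            channels[channel][frame] = sample
        }
    }
}
// Don't forget this, or the buffer reports zero valid frames.
buffer.frameLength = frameCount
```

The outer subscript (`channels[channel]`) selects the channel pointer; the inner one (`[frame]`) indexes into that channel's samples.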

Play segment of AVAudioPCMBuffer

You can create a new buffer that's a segment of the original. AVAudioPCMBuffer and AudioBufferList are a bit of a pain in Swift. There are a few ways to do this; if you are using floats you can access AVAudioPCMBuffer.floatChannelData, but here's a method that works for both float and int samples.

func segment(of buffer: AVAudioPCMBuffer, from startFrame: AVAudioFramePosition, to endFrame: AVAudioFramePosition) -> AVAudioPCMBuffer? {
    let framesToCopy = AVAudioFrameCount(endFrame - startFrame)
    guard let segment = AVAudioPCMBuffer(pcmFormat: buffer.format, frameCapacity: framesToCopy) else { return nil }

    let sampleSize = buffer.format.streamDescription.pointee.mBytesPerFrame

    let srcPtr = UnsafeMutableAudioBufferListPointer(buffer.mutableAudioBufferList)
    let dstPtr = UnsafeMutableAudioBufferListPointer(segment.mutableAudioBufferList)
    for (src, dst) in zip(srcPtr, dstPtr) {
        memcpy(dst.mData, src.mData?.advanced(by: Int(startFrame) * Int(sampleSize)), Int(framesToCopy) * Int(sampleSize))
    }

    segment.frameLength = framesToCopy
    return segment
}
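A usage sketch of the function above. `fullBuffer` here is a hypothetical stand-in for a buffer you have already filled with audio:

```swift
import AVFoundation

// Hypothetical stand-in for an already-filled source buffer.
let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)!
let fullBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 4096)!
fullBuffer.frameLength = 4096

// Copy frames 1024..<2048 into a new, independent buffer.
let slice = segment(of: fullBuffer, from: 1024, to: 2048)
print(slice?.frameLength ?? 0) // 1024
```

Because the samples are memcpy'd, the returned segment owns its own storage and stays valid even if the original buffer is later reused.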

AVAudioPCMBuffer Memory Management

My rule inside audio callback functions, blocks, or taps is to always immediately copy any data to be processed out of the PCM buffers into your own private sample buffers. You can then safely publish your copy and/or process the copied data at your own convenience.

This is because the underlying PCM buffers might be updated by a separate RemoteIO Audio Unit thread running in a hard real-time (Mach kernel) context. It's therefore possible that some audio unit implementations ignore concurrency locks entirely in order to meet their hard real-time completion deadlines.
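A sketch of that copy-out rule using an input-node tap. This assumes a configured AVAudioEngine and uses modern Swift API names; it is one way to apply the rule, not the only one:

```swift
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode            // non-optional in recent SDKs
let format = input.outputFormat(forBus: 0)

input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    guard let source = buffer.floatChannelData else { return }
    let frames = Int(buffer.frameLength)
    // Copy out immediately: one Swift Array per channel. The tap's buffer
    // may be recycled by the audio thread once this block returns.
    var copies: [[Float]] = []
    for channel in 0..<Int(buffer.format.channelCount) {
        copies.append(Array(UnsafeBufferPointer(start: source[channel], count: frames)))
    }
    // Hand `copies` off (e.g. to a serial queue) and process at your leisure.
}

// engine.prepare(); try engine.start()  // start the engine to begin receiving buffers
```

The `Array(UnsafeBufferPointer(...))` step is what turns the transient pointer into an owned copy that is safe to publish to other threads.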

Play audio from AVAudioPCMBuffer with AVAudioEngine

Skip the raw NSData format

Why not use AVAudioPlayer all the way? If you absolutely need NSData, you can always load it from the soundURL below. In this example, the disk buffer is something like:

let soundURL = documentDirectory.URLByAppendingPathComponent("sound.m4a")

It makes sense to record directly to a file anyway for optimal memory and resource management. You get NSData from your recording this way:

let data = NSFileManager.defaultManager().contentsAtPath(soundURL.path!)

The code below is all you need:

Record

if !audioRecorder.recording {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setActive(true)
        audioRecorder.record()
    } catch {}
}

Play

if !audioRecorder.recording {
    do {
        audioPlayer = try AVAudioPlayer(contentsOfURL: audioRecorder.url)
        audioPlayer.play()
    } catch {}
}

Setup

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
    audioRecorder = try AVAudioRecorder(URL: self.directoryURL()!,
                                        settings: recordSettings)
    audioRecorder.prepareToRecord()
} catch {}

Settings

let recordSettings = [AVSampleRateKey: NSNumber(float: Float(44100.0)),
                      AVFormatIDKey: NSNumber(int: Int32(kAudioFormatMPEG4AAC)),
                      AVNumberOfChannelsKey: NSNumber(int: 1),
                      AVEncoderAudioQualityKey: NSNumber(int: Int32(AVAudioQuality.Medium.rawValue))]

Download Xcode Project:

The full project, which records and plays on both simulator and device, is available from Swift Recipes.


