Stream Data from Network in AVAudioEngine, Is It Possible?

Stream data from network in AVAudioEngine, is it possible?

Apple's documentation doesn't specify whether it can work with an MP3 hosted online, but as soon as I try to initialize an AVAudioFile with a remote MP3 URL it crashes instantly with a fatal exception. I think that answers your question.
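
For illustration, here's a minimal sketch of that failure (the URL is hypothetical); downloading the data to a local file first and opening the copy is the usual workaround:

import AVFoundation

// Minimal sketch with a hypothetical URL: AVAudioFile only accepts local
// file URLs, so the remote init fails at runtime (the answer above reports
// a fatal exception).
func openRemoteMP3() {
    let remoteURL = URL(string: "https://example.com/track.mp3")!
    do {
        _ = try AVAudioFile(forReading: remoteURL) // fails: not a file URL
    } catch {
        print("AVAudioFile cannot open remote URLs: \(error)")
    }

    // Downloading to disk first and opening the local copy does work.
    do {
        let localURL = FileManager.default.temporaryDirectory
            .appendingPathComponent("track.mp3")
        try Data(contentsOf: remoteURL).write(to: localURL)
        _ = try AVAudioFile(forReading: localURL)
    } catch {
        print("Download or open failed: \(error)")
    }
}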

iOS 8 AVAudioEngine: how to send microphone data over Multipeer Connectivity?

I haven't tried this solution:

- (NSOutputStream *)startStreamWithName:(NSString *)streamName
                                 toPeer:(MCPeerID *)peerID
                                  error:(NSError **)error

You can get at the float samples through the property buffer.floatChannelData. You can then write that float data as raw bytes to the NSOutputStream and send it.
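
Here's a sketch of the sending side in Swift, assuming a mono tap on the input node and an output stream that has already been opened (the function name is mine):

import AVFoundation

// Hedged sketch of the sending side: tap the input node and write the raw
// float samples to an already-opened output stream. Mono is assumed here.
func streamMicrophone(from engine: AVAudioEngine, to outputStream: OutputStream) {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        guard let channel = buffer.floatChannelData?[0] else { return }
        let byteCount = Int(buffer.frameLength) * MemoryLayout<Float>.size
        channel.withMemoryRebound(to: UInt8.self, capacity: byteCount) { bytes in
            _ = outputStream.write(bytes, maxLength: byteCount)
        }
    }
}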

On the client side, you can receive the stream like this:

- (void)session:(MCSession *)session
didReceiveStream:(NSInputStream *)stream
        withName:(NSString *)streamName
        fromPeer:(MCPeerID *)peerID
{
    stream.delegate = self;
    [stream scheduleInRunLoop:[NSRunLoop mainRunLoop]
                      forMode:NSDefaultRunLoopMode];
    [stream open];
}
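
And a hypothetical delegate sketch in Swift (the class name is mine) that pulls raw bytes off the stream as they arrive; turning those bytes back into audio buffers is covered in the last answer below:

import Foundation

// Hypothetical receiver sketch: read raw bytes off the input stream as they
// arrive. Converting them back into audio buffers is shown further below.
final class StreamReceiver: NSObject, StreamDelegate {
    func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
        guard eventCode == .hasBytesAvailable,
              let input = aStream as? InputStream else { return }
        var chunk = [UInt8](repeating: 0, count: 4096)
        let read = input.read(&chunk, maxLength: chunk.count)
        if read > 0 {
            // hand chunk[0..<read] to the audio pipeline
        }
    }
}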

But before you try this, you could send random values (i.e. white noise) instead of the real float data, so you can make sure the time slot for sending these buffers (we are talking about real-time audio) is wide enough.
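
A tiny sketch of that test, generating random samples in [-1, 1] to send in place of the microphone data:

// Sketch of the white-noise test: random samples in [-1, 1] sent in place
// of the real microphone data.
var noise = [Float](repeating: 0, count: 1024)
for i in noise.indices { noise[i] = Float.random(in: -1 ... 1) }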

Update 15.10.2014
I found exactly what you need:
http://robots.thoughtbot.com/streaming-audio-to-multiple-listeners-via-ios-multipeer-connectivity

How can mp3 data in memory be loaded into an AVAudioPCMBuffer in Swift?

I went down the rabbit hole on this one. Here is what probably amounts to a Rube Goldberg-esque solution:

A lot of the pain comes from calling the C Audio Toolbox APIs from Swift.

func data_AudioFile_ReadProc(_ inClientData: UnsafeMutableRawPointer, _ inPosition: Int64, _ requestCount: UInt32, _ buffer: UnsafeMutableRawPointer, _ actualCount: UnsafeMutablePointer<UInt32>) -> OSStatus {
    let data = inClientData.assumingMemoryBound(to: Data.self).pointee
    let bufferPointer = UnsafeMutableRawBufferPointer(start: buffer, count: Int(requestCount))
    // Clamp the requested range to the data's length; otherwise reads near
    // the end of the file would walk past the end and crash.
    let lowerBound = min(Int(inPosition), data.count)
    let upperBound = min(lowerBound + Int(requestCount), data.count)
    let copied = data.copyBytes(to: bufferPointer, from: lowerBound ..< upperBound)
    actualCount.pointee = UInt32(copied)
    return noErr
}

// Reports the total size of the in-memory Data to Audio File Services.
func data_AudioFile_GetSizeProc(_ inClientData: UnsafeMutableRawPointer) -> Int64 {
    let data = inClientData.assumingMemoryBound(to: Data.self).pointee
    return Int64(data.count)
}

extension Data {
    // Decodes compressed audio data (e.g. MP3) held in memory into an
    // AVAudioPCMBuffer in the requested client format.
    func convertedTo(_ format: AVAudioFormat) -> AVAudioPCMBuffer? {
        var data = self

        // Open the in-memory data with Audio File Services via the two
        // callbacks above, instead of a file on disk.
        var af: AudioFileID? = nil
        var status = AudioFileOpenWithCallbacks(&data, data_AudioFile_ReadProc, nil, data_AudioFile_GetSizeProc, nil, 0, &af)
        guard status == noErr, af != nil else {
            return nil
        }

        defer {
            AudioFileClose(af!)
        }

        // Wrap the audio file in an ExtAudioFile, which can decode to PCM.
        var eaf: ExtAudioFileRef? = nil
        status = ExtAudioFileWrapAudioFileID(af!, false, &eaf)
        guard status == noErr, eaf != nil else {
            return nil
        }

        defer {
            ExtAudioFileDispose(eaf!)
        }

        // Ask ExtAudioFile to convert to the caller's format on read.
        var clientFormat = format.streamDescription.pointee
        status = ExtAudioFileSetProperty(eaf!, kExtAudioFileProperty_ClientDataFormat, UInt32(MemoryLayout.size(ofValue: clientFormat)), &clientFormat)
        guard status == noErr else {
            return nil
        }

        if let channelLayout = format.channelLayout {
            var clientChannelLayout = channelLayout.layout.pointee
            status = ExtAudioFileSetProperty(eaf!, kExtAudioFileProperty_ClientChannelLayout, UInt32(MemoryLayout.size(ofValue: clientChannelLayout)), &clientChannelLayout)
            guard status == noErr else {
                return nil
            }
        }

        // The total length in frames determines the PCM buffer's capacity.
        var frameLength: Int64 = 0
        var propertySize: UInt32 = UInt32(MemoryLayout.size(ofValue: frameLength))
        status = ExtAudioFileGetProperty(eaf!, kExtAudioFileProperty_FileLengthFrames, &propertySize, &frameLength)
        guard status == noErr else {
            return nil
        }

        guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(frameLength)) else {
            return nil
        }

        // Scratch AudioBufferList used for each 512-frame read.
        let bufferSizeFrames = 512
        let bufferSizeBytes = Int(format.streamDescription.pointee.mBytesPerFrame) * bufferSizeFrames
        let numBuffers = format.isInterleaved ? 1 : Int(format.channelCount)
        let numInterleavedChannels = format.isInterleaved ? Int(format.channelCount) : 1
        let audioBufferList = AudioBufferList.allocate(maximumBuffers: numBuffers)
        for i in 0 ..< numBuffers {
            audioBufferList[i] = AudioBuffer(mNumberChannels: UInt32(numInterleavedChannels), mDataByteSize: UInt32(bufferSizeBytes), mData: malloc(bufferSizeBytes))
        }

        defer {
            for buffer in audioBufferList {
                free(buffer.mData)
            }
            free(audioBufferList.unsafeMutablePointer)
        }

        // Read and decode until ExtAudioFileRead reports no more frames,
        // appending each chunk to the PCM buffer.
        while true {
            var frameCount: UInt32 = UInt32(bufferSizeFrames)
            status = ExtAudioFileRead(eaf!, &frameCount, audioBufferList.unsafeMutablePointer)
            guard status == noErr else {
                return nil
            }

            if frameCount == 0 {
                break
            }

            let src = audioBufferList
            let dst = UnsafeMutableAudioBufferListPointer(pcmBuffer.mutableAudioBufferList)

            if src.count != dst.count {
                return nil
            }

            for i in 0 ..< src.count {
                let srcBuf = src[i]
                let dstBuf = dst[i]
                // dstBuf.mDataByteSize tracks the frames written so far, so
                // advancing by it appends after the previous chunk.
                memcpy(dstBuf.mData?.advanced(by: Int(dstBuf.mDataByteSize)), srcBuf.mData, Int(srcBuf.mDataByteSize))
            }

            pcmBuffer.frameLength += frameCount
        }

        return pcmBuffer
    }
}

A more robust solution would read the sample rate and channel count from the file itself and give the caller the option to preserve them.
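
For example, a hedged sketch (the helper name is mine) that queries the file's native format via kExtAudioFileProperty_FileDataFormat:

import AVFoundation
import AudioToolbox

// Hedged sketch: query the file's own data format so the sample rate and
// channel count can be preserved rather than hard-coded by the caller.
func nativeFormat(of eaf: ExtAudioFileRef) -> AVAudioFormat? {
    var asbd = AudioStreamBasicDescription()
    var size = UInt32(MemoryLayout.size(ofValue: asbd))
    guard ExtAudioFileGetProperty(eaf, kExtAudioFileProperty_FileDataFormat, &size, &asbd) == noErr else {
        return nil
    }
    return AVAudioFormat(streamDescription: &asbd)
}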

Tested using:

let url = URL(fileURLWithPath: "/tmp/test.mp3")
let data = try! Data(contentsOf: url)

let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: false)!
if let d = data.convertedTo(format) {
    let avf = try! AVAudioFile(forWriting: URL(fileURLWithPath: "/tmp/foo.wav"), settings: format.settings, commonFormat: format.commonFormat, interleaved: format.isInterleaved)
    try! avf.write(from: d)
}
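
To tie this back to the first question, here's a minimal sketch that plays the converted buffer through AVAudioEngine; keep the engine and player alive for as long as playback should run:

import AVFoundation

// Minimal sketch: schedule the converted buffer on a player node.
// Error handling and audio session setup omitted.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

func play(_ buffer: AVAudioPCMBuffer) throws {
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: buffer.format)
    try engine.start()
    player.scheduleBuffer(buffer, completionHandler: nil)
    player.play()
}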

Trying to stream audio from microphone to another phone via Multipeer Connectivity

Your AVAudioPCMBuffer serialisation/deserialisation is wrong.

Swift 3's casting has changed a lot and seems to require more copying than Swift 2.

Here's how you can convert between [UInt8] and AVAudioPCMBuffer. N.B.: this code assumes mono float data at 44.1 kHz; you may want to change that.

func copyAudioBufferBytes(_ audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    let srcLeft = audioBuffer.floatChannelData![0]
    let bytesPerFrame = audioBuffer.format.streamDescription.pointee.mBytesPerFrame
    let numBytes = Int(bytesPerFrame * audioBuffer.frameLength)

    // initialize bytes to 0 (how to avoid?)
    var audioByteArray = [UInt8](repeating: 0, count: numBytes)

    // copy data from buffer
    srcLeft.withMemoryRebound(to: UInt8.self, capacity: numBytes) { srcByteData in
        audioByteArray.withUnsafeMutableBufferPointer {
            $0.baseAddress!.initialize(from: srcByteData, count: numBytes)
        }
    }

    return audioByteArray
}

func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    // format assumption! make this part of your protocol?
    // (both initializers are failable on current SDKs, hence the !)
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: true)!
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)!
    audioBuffer.frameLength = frameLength

    let dstLeft = audioBuffer.floatChannelData![0]
    // for stereo
    // let dstRight = audioBuffer.floatChannelData![1]

    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!).bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }

    return audioBuffer
}
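
A quick round-trip check of the two helpers above (the test values are mine), useful to verify the serialisation before putting Multipeer Connectivity in the middle:

import AVFoundation

// Hypothetical round trip: buffer -> bytes -> buffer, with a simple ramp
// as the test signal.
let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: true)!
let src = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: 1024)!
src.frameLength = 1024
for i in 0 ..< 1024 {
    src.floatChannelData![0][i] = Float(i % 100) / 100 // ramp test signal
}

let bytes = copyAudioBufferBytes(src)
let restored = bytesToAudioBuffer(bytes)
assert(restored.frameLength == src.frameLength)
assert(restored.floatChannelData![0][10] == src.floatChannelData![0][10])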

