Play audio from AVAudioPCMBuffer with AVAudioEngine
Skip the raw NSData format. Why not use AVAudioPlayer all the way? If you positively need NSData, you can always load such data from the soundURL below. In this example, the disk buffer is something like:
let soundURL = documentDirectory.URLByAppendingPathComponent("sound.m4a")
It makes sense to record directly to a file anyway, for optimal memory and resource management. You get NSData from your recording this way:
let data = NSFileManager.defaultManager().contentsAtPath(soundURL.path())
The code below is all you need:
Record
if !audioRecorder.recording {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setActive(true)
        audioRecorder.record()
    } catch {
        print("Could not activate the audio session: \(error)")
    }
}
Play
if !audioRecorder.recording {
    do {
        audioPlayer = try AVAudioPlayer(contentsOfURL: audioRecorder.url)
        audioPlayer.play()
    } catch {
        print("Could not create the audio player: \(error)")
    }
}
Setup
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
    audioRecorder = try AVAudioRecorder(URL: self.directoryURL()!,
                                        settings: recordSettings)
    audioRecorder.prepareToRecord()
} catch {
    print("Could not set up the recorder: \(error)")
}
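The setup above calls a directoryURL() helper that is not shown. A minimal sketch, assuming you record into the app's Documents directory under the same "sound.m4a" file name used earlier (the helper's body is an assumption, not part of the original):

```swift
import AVFoundation

func directoryURL() -> NSURL? {
    // Assumption: record to Documents/sound.m4a, matching the soundURL example above.
    let fileManager = NSFileManager.defaultManager()
    let urls = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    guard let documentDirectory = urls.first else { return nil }
    return documentDirectory.URLByAppendingPathComponent("sound.m4a")
}
```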
Settings
let recordSettings = [
    AVSampleRateKey: NSNumber(float: Float(44100.0)),
    AVFormatIDKey: NSNumber(int: Int32(kAudioFormatMPEG4AAC)),
    AVNumberOfChannelsKey: NSNumber(int: 1),
    AVEncoderAudioQualityKey: NSNumber(int: Int32(AVAudioQuality.Medium.rawValue))
]
Download Xcode Project:
You can find this very example, as a full project that records and plays on both simulator and device, at Swift Recipes.
How to play audio from AVAudioPCMBuffer converted from NSData
I ended up using an Objective-C function; the data gets converted fine:
- (AudioBufferList *)getBufferListFromData:(NSData *)data
{
    if (data.length > 0)
    {
        NSUInteger len = [data length];
        // You can probably use Byte *, void *, or Float32 * here;
        // for a raw byte copy it should not make a difference.
        Byte *byteData = (Byte *)malloc(len);
        if (byteData)
        {
            memcpy(byteData, [data bytes], len);
            // Wrap the copied bytes in a single-buffer AudioBufferList.
            AudioBufferList *theDataBuffer = (AudioBufferList *)malloc(sizeof(AudioBufferList));
            theDataBuffer->mNumberBuffers = 1;
            theDataBuffer->mBuffers[0].mDataByteSize = (UInt32)len;
            theDataBuffer->mBuffers[0].mNumberChannels = 1;
            theDataBuffer->mBuffers[0].mData = byteData;
            return theDataBuffer;
        }
    }
    return NULL;
}
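For the AVAudioEngine route the title refers to, here is a hedged sketch of going from raw PCM bytes in NSData to an AVAudioPCMBuffer scheduled on an AVAudioPlayerNode. The mono, 44.1 kHz, non-interleaved Float32 format is an assumption; adjust it to match whatever your data actually contains:

```swift
import AVFoundation

// Sketch: wrap raw Float32 PCM bytes in an AVAudioPCMBuffer and play it.
// Assumes mono, 44.1 kHz, non-interleaved Float32 samples.
func playPCMData(data: NSData) {
    let format = AVAudioFormat(standardFormatWithSampleRate: 44100.0, channels: 1)
    let bytesPerFrame = format.streamDescription.memory.mBytesPerFrame
    let frameCount = UInt32(data.length) / bytesPerFrame

    let buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: frameCount)
    buffer.frameLength = frameCount
    // Copy the raw samples into the buffer's first channel.
    memcpy(buffer.floatChannelData.memory, data.bytes, Int(data.length))

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attachNode(player)
    engine.connect(player, to: engine.mainMixerNode, format: format)
    do {
        try engine.start()
        player.scheduleBuffer(buffer, completionHandler: nil)
        player.play()
    } catch {
        print("Could not start the audio engine: \(error)")
    }
}
```

Keep the engine and player alive for as long as playback runs; if they go out of scope, the sound stops.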