How to monitor for AVAudioPlayerNode finished playing
Instead of using avPlayer.scheduleFile(file, at: nil), use the form of the method that takes a completion handler:
avPlayer.scheduleFile(file, at: nil) {
    // call your completion function here
    print("Done playing")
}
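Note that the completion handler is invoked on a background thread. On iOS 11 and later you can also use the variant that takes a completionCallbackType, so the callback fires only after the audio has actually been played back rather than merely consumed by the engine. A hedged sketch (avPlayer and file are assumed to exist as in the snippet above):

```swift
import AVFoundation

// .dataPlayedBack fires when playback of the scheduled data finishes.
// The handler runs on a background thread, so hop to the main queue
// before touching any UI.
avPlayer.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { _ in
    DispatchQueue.main.async {
        print("Done playing (data played back)")
    }
}
```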
Swift AVAudioEngine crash: player started when in a disconnected state
The error message "player started when in a disconnected state" indicates a problem with the connection chain: either the nodes were not attached to the engine, or they were not linked together properly. Because both the audioFilePlayer and timePitch nodes were attached, the problem most likely lies in these two lines:
audioEngine.connect(audioFilePlayer, to:mainMixer, format: audioFileBuffer.format)
audioEngine.connect(timePitch, to: audioEngine.outputNode, format: audioFile.processingFormat)
The connection should link all components together:
audioFilePlayer -> timePitch -> audioEngine.mainMixerNode (or outputNode)
So the connection should look like:
audioEngine.connect(audioFilePlayer, to: timePitch, format: audioFile.processingFormat)
audioEngine.connect(timePitch, to: audioEngine.outputNode, format: audioFile.processingFormat)
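Putting it together, a minimal setup showing the full attach-then-connect chain might look like this (the names audioEngine, audioFilePlayer, timePitch, and audioFile mirror the answer above; the file URL is hypothetical):

```swift
import AVFoundation

let audioEngine = AVAudioEngine()
let audioFilePlayer = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()
let audioFile = try AVAudioFile(forReading: someFileURL) // someFileURL is assumed

// 1. Attach every node to the engine before connecting anything.
audioEngine.attach(audioFilePlayer)
audioEngine.attach(timePitch)

// 2. Connect the nodes in one unbroken chain:
//    audioFilePlayer -> timePitch -> mainMixerNode
//    (the main mixer is implicitly connected to the output node).
audioEngine.connect(audioFilePlayer, to: timePitch, format: audioFile.processingFormat)
audioEngine.connect(timePitch, to: audioEngine.mainMixerNode, format: audioFile.processingFormat)
```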
I hope this helps.
AVAudioEngine crashes when plug headphones in or out
You need to register for the AVAudioEngineConfigurationChange notification and take the necessary steps to rebuild the connections to the output node.
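A sketch of what that registration could look like; the reconnection logic is app-specific, and engine and playerNode are assumed to be your existing nodes:

```swift
import AVFoundation

// When a route change (e.g. plugging headphones in or out) invalidates
// the graph, the engine posts this notification and stops itself.
NotificationCenter.default.addObserver(
    forName: .AVAudioEngineConfigurationChange,
    object: engine,
    queue: nil
) { _ in
    // Re-establish the connections your graph needs, then restart.
    engine.connect(playerNode, to: engine.mainMixerNode, format: nil)
    try? engine.start()
}
```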
AKPlayer crashes when playing from buffer on channelCount condition
OK, this was a decidedly uncool debugging session. I had to investigate AVAudioEngine and how this kind of scenario could be done there, even though that was not the final result I was looking for. The quest helped me understand how to solve it with AudioKit (half of my app is implemented using AudioKit's tools, so it doesn't make sense to rewrite it with AVFoundation).
AVFoundation solution:
private let engine = AVAudioEngine()
private let bufferSize = 1024
private let p: AVAudioPlayerNode = AVAudioPlayerNode()

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(.playAndRecord, mode: .default, options: .defaultToSpeaker)
} catch {
    print("Setting category to .playAndRecord failed.")
}

let inputNode = self.engine.inputNode
engine.connect(inputNode, to: engine.mainMixerNode, format: inputNode.inputFormat(forBus: 0))

// !!! the following lines are the key to the solution:
// !!! the player has to be attached to the engine before it is connected
engine.attach(p)
engine.connect(p, to: engine.mainMixerNode, format: inputNode.inputFormat(forBus: 0))

do {
    try engine.start()
} catch {
    print("could not start engine \(error.localizedDescription)")
}

recordBufferAndPlay(duration: 4)
recordBufferAndPlay(duration: 4)
recordBufferAndPlay function:
func recordBufferAndPlay(duration: Double) {
    let inputNode = self.engine.inputNode
    let total: Double = AVAudioSession.sharedInstance().sampleRate * duration
    let totalBufferSize: UInt32 = UInt32(total)

    let recordedBuffer: AVAudioPCMBuffer! = AVAudioPCMBuffer(pcmFormat: inputNode.inputFormat(forBus: 0), frameCapacity: totalBufferSize)

    var alreadyRecorded = 0
    inputNode.installTap(onBus: 0, bufferSize: 256, format: inputNode.inputFormat(forBus: 0)) {
        (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
        recordedBuffer.copy(from: buffer) // this helper function is taken from AudioKit!
        alreadyRecorded += Int(buffer.frameLength)
        print(alreadyRecorded, totalBufferSize)

        if alreadyRecorded >= totalBufferSize {
            inputNode.removeTap(onBus: 0)
            self.p.scheduleBuffer(recordedBuffer, at: nil, options: .loops, completionHandler: {
                print("completed playing")
            })
            self.p.play()
        }
    }
}
AudioKit solution:
In the AudioKit solution, these lines should be invoked on your AKPlayer object. Note that this should be done before you actually start your engine.
self.player.buffering = .always
AudioKit.engine.attach(self.player.playerNode)
AudioKit.engine.connect(self.player.playerNode, to: self.mixer.inputNode, format: AudioKit.engine.inputNode.outputFormat(forBus: 0))
Then the recording is done much as you would do it in AVAudioEngine: you install a tap on your node (microphone or other node) and record the buffer of PCM samples.