Playing an Audio File Repeatedly with AVAudioEngine

Playing an audio file repeatedly with AVAudioEngine

I found the solution in another question, asked and also self-answered by @CarveDrone, so I've just copied the code he used:

import UIKit
import AVFoundation

class aboutViewController: UIViewController {

    var audioEngine: AVAudioEngine = AVAudioEngine()
    var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()

        let filePath: String = NSBundle.mainBundle().pathForResource("chimes", ofType: "wav")!
        println("\(filePath)")
        let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
        let audioFile = AVAudioFile(forReading: fileURL, error: nil)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)
        audioFile.readIntoBuffer(audioFileBuffer, error: nil)

        let mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
        audioEngine.startAndReturnError(nil)

        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: .Loops, completionHandler: nil)
    }
}

The only thing you have to change is the filePath constant. Here is the link to the original answer: Having AVAudioEngine repeat a sound

Having AVAudioEngine repeat a sound

After hours and hours of searching, this did it:

import UIKit
import AVFoundation

class aboutViewController: UIViewController {

    var audioEngine: AVAudioEngine = AVAudioEngine()
    var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        let filePath: String = NSBundle.mainBundle().pathForResource("chimes", ofType: "wav")!
        println("\(filePath)")
        let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
        let audioFile = AVAudioFile(forReading: fileURL, error: nil)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)
        audioFile.readIntoBuffer(audioFileBuffer, error: nil)

        let mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
        audioEngine.startAndReturnError(nil)

        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: .Loops, completionHandler: nil)
    }

    ...
}
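
Both answers above use pre-Swift 2 APIs (NSBundle, println, startAndReturnError, .Loops). As a rough modern-Swift equivalent of the same buffer-looping approach (not part of either original answer; the class name and the optional-based error handling are my own choices), a sketch might look like this:

import UIKit
import AVFoundation

class LoopingChimesViewController: UIViewController {

    let audioEngine = AVAudioEngine()
    let audioFilePlayer = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Load "chimes.wav" (the resource used in the answers above) into a PCM buffer.
        guard let fileURL = Bundle.main.url(forResource: "chimes", withExtension: "wav"),
              let audioFile = try? AVAudioFile(forReading: fileURL),
              let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                                     frameCapacity: AVAudioFrameCount(audioFile.length)),
              (try? audioFile.read(into: audioFileBuffer)) != nil else { return }

        // Attach the player, connect it to the main mixer, and start the engine.
        audioEngine.attach(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to: audioEngine.mainMixerNode, format: audioFileBuffer.format)
        try? audioEngine.start()

        // Schedule the buffer with .loops so it repeats until the node is stopped.
        audioFilePlayer.scheduleBuffer(audioFileBuffer, at: nil, options: .loops, completionHandler: nil)
        audioFilePlayer.play()
    }
}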

AVAudioEngine play / loop audio, multiple buttons

You can handle this behavior by using the completionHandler parameter of .scheduleBuffer.

For example, you could do something like this:

var nextAudioFilePath: String
var isPlaying: Bool = false

@IBAction func playLoopA() {
    guard let path = Bundle.main.path(forResource: "audioFileA", ofType: "wav") else { return }
    nextAudioFilePath = path
    guard !isPlaying else { return }
    play()
}

@IBAction func playLoopB() {
    guard let path = Bundle.main.path(forResource: "audioFileB", ofType: "wav") else { return }
    nextAudioFilePath = path
    guard !isPlaying else { return }
    play()
}

private func play() {
    let fileURL = URL(fileURLWithPath: nextAudioFilePath)
    ...
    // When the current buffer finishes, play whichever file is "next" at that point.
    playerNode.scheduleBuffer(audioFileBuffer, at: nil, options: [], completionHandler: { [weak self] in
        self?.play()
    })
}
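
The `...` in play() above is the answer's own elision and is left as posted. Purely as an illustration of the same pattern, a self-contained version might look like the sketch below; the class name LoopSwitcher, the enqueue(resource:) helper standing in for the two @IBActions, and the lazy engine setup are my assumptions, not part of the original answer:

import AVFoundation

final class LoopSwitcher {

    private let audioEngine = AVAudioEngine()
    private let playerNode = AVAudioPlayerNode()
    private var nextAudioFilePath: String?
    private var isPlaying = false

    init() {
        audioEngine.attach(playerNode)
    }

    // Remember which file should play next; start the chain only if nothing is playing yet.
    func enqueue(resource: String) {
        guard let path = Bundle.main.path(forResource: resource, ofType: "wav") else { return }
        nextAudioFilePath = path
        guard !isPlaying else { return }
        play()
    }

    private func play() {
        guard let path = nextAudioFilePath,
              let audioFile = try? AVAudioFile(forReading: URL(fileURLWithPath: path)),
              let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                            frameCapacity: AVAudioFrameCount(audioFile.length)),
              (try? audioFile.read(into: buffer)) != nil else {
            isPlaying = false
            return
        }

        // Connect and start lazily, using the first file's format
        // (assumes both files share the same sample rate and channel count).
        if !audioEngine.isRunning {
            audioEngine.connect(playerNode, to: audioEngine.mainMixerNode, format: audioFile.processingFormat)
            try? audioEngine.start()
        }

        isPlaying = true
        // When the current pass finishes, schedule whichever file is "next" at that moment,
        // so a button press swaps loops at the end of the pass instead of cutting it off.
        playerNode.scheduleBuffer(buffer, at: nil, options: []) { [weak self] in
            self?.play()
        }
        playerNode.play()
    }
}

Two buttons would then simply call enqueue(resource: "audioFileA") and enqueue(resource: "audioFileB"), mirroring the @IBActions above.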

Cannot play sound with AVAudioEngine

I ran your code and it worked fine, but there are a couple of things I would check:

  1. Make sure your file can actually be found: temporarily switch to AVAudioPlayer just to check that the program knows where the file is and that it can play it (a minimal sketch follows at the end of this answer).
  2. Second, check where you are putting your statements; my example code is below.

    import UIKit
    import AVFoundation

    class aboutViewController: UIViewController {

        var audioUrl = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("chimes", ofType: "wav")!)
        var audioEngine = AVAudioEngine()
        var myPlayer = AVAudioPlayerNode()

        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view.

            audioEngine.attachNode(myPlayer)
            let audioFile = AVAudioFile(forReading: audioUrl, error: nil)
            var audioError: NSError?
            audioEngine.connect(myPlayer, to: audioEngine.mainMixerNode, format: audioFile.processingFormat)
            myPlayer.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
            audioEngine.startAndReturnError(&audioError)
        }

        override func didReceiveMemoryWarning() {
            super.didReceiveMemoryWarning()
            // Dispose of any resources that can be recreated.
        }

        @IBAction func testSound(sender: AnyObject) {
            myPlayer.play()
        }
    }

Hope this helps, ask if you have any questions!
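
For the first check, a minimal AVAudioPlayer sanity test in modern Swift might look like the following; the function and property names are illustrative assumptions, while "chimes.wav" is the resource used in this answer:

import AVFoundation

// Point 1 above: if AVAudioPlayer can find and play the file, the bundle path and
// the asset are fine, and any remaining problem is in the AVAudioEngine setup.
var sanityCheckPlayer: AVAudioPlayer?   // keep a strong reference, or playback stops immediately

func sanityCheckChimes() {
    guard let url = Bundle.main.url(forResource: "chimes", withExtension: "wav") else {
        print("chimes.wav was not found in the bundle")
        return
    }
    sanityCheckPlayer = try? AVAudioPlayer(contentsOf: url)
    sanityCheckPlayer?.play()
}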

Playing a stereo audio buffer from memory with AVAudioEngine

I have fixed the issue!

I tried a lot of solutions and ended up completely rewriting the audio engine section of my app. I now have the AVAudioEngine and AVAudioPlayerNode declared within the ViewController class as follows:

class ViewController: UIViewController {

    var audioEngine: AVAudioEngine = AVAudioEngine()
    var playerNode: AVAudioPlayerNode = AVAudioPlayerNode()

    ...

I am still unclear whether it is better to declare these globally or as class properties in iOS; however, I can confirm that my application plays audio with them declared inside the ViewController class. I do know that they shouldn't be declared inside a function, because they will be deallocated and playback will stop when the function goes out of scope.

However, I still was not getting any audio output until I set the AVAudioPCMBuffer.frameLength to frameCapacity.

I could find very little information online about creating a new AVAudioPCMBuffer from an array of floats, but setting frameLength turned out to be the missing step I needed to make my outputBuffer playable. Before I set it, it was 0 by default.

frameLength isn't a required parameter when constructing the buffer (the AVAudioPCMBuffer initializer only takes a format and a frameCapacity), but it is important: my buffer wasn't playable until I set it manually, after creating the buffer instance:

let bufferFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: hrtfSampleRate, channels: 2, interleaved: false)!
let frameCapacity = UInt32(audioFile.length)
guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: bufferFormat, frameCapacity: frameCapacity) else {
    fatalError("Could not create output buffer.")
}
outputBuffer.frameLength = frameCapacity // Important!

This took a long time to figure out; hopefully it will help someone else in the future.
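
As a rough sketch of the "array of floats" step discussed above, filling a non-interleaved stereo AVAudioPCMBuffer from two Float arrays could look like this; the helper name, the 44.1 kHz default sample rate, and the equal-length channel assumption are mine rather than from the original code:

import AVFoundation

// Copy two Float arrays (left and right channels) into a non-interleaved
// stereo AVAudioPCMBuffer, setting frameLength so the buffer is actually audible.
func makeStereoBuffer(left: [Float], right: [Float], sampleRate: Double = 44_100) -> AVAudioPCMBuffer? {
    guard left.count == right.count,
          let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                     sampleRate: sampleRate,
                                     channels: 2,
                                     interleaved: false),
          let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: AVAudioFrameCount(left.count)) else {
        return nil
    }

    // frameLength defaults to 0; without setting it the buffer renders as silence.
    buffer.frameLength = buffer.frameCapacity

    // floatChannelData points at one Float pointer per (non-interleaved) channel.
    if let channels = buffer.floatChannelData {
        for i in 0..<left.count {
            channels[0][i] = left[i]
            channels[1][i] = right[i]
        }
    }
    return buffer
}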

loop audio with AVAudioEngine

The problem is these lines:

let audioFile = try AVAudioFile(forReading: url)
playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)

Where’s the loop? Nowhere. That code plays the file once.

You cannot loop with a file in AVAudioEngine. You loop with a buffer. You read the file into a buffer and call scheduleBuffer(buffer, at: nil, options: .loops).
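
A minimal modern-Swift sketch of that buffer-based approach, assuming the player node is already attached to a running engine and connected to its mixer as in the earlier answers:

import AVFoundation

// Read the whole file into a PCM buffer and schedule it with the .loops option.
func scheduleLoopingFile(at url: URL, on playerNode: AVAudioPlayerNode) throws {
    let audioFile = try AVAudioFile(forReading: url)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                        frameCapacity: AVAudioFrameCount(audioFile.length)) else { return }
    try audioFile.read(into: buffer)

    // .loops repeats the buffer until the node is stopped or another buffer interrupts it.
    playerNode.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
    playerNode.play()
}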

Swift: Play gapless audio with AVAudioEngine (AVAudioPlayerNode)?

Finally! I solved this issue with AudioQueue. If anyone is struggling with the same issue in Swift, this AudioQueue sample code saved me:

https://gist.github.com/zonble/635ea00cb125bc50b3f5880e16ba71b7


