AVAudioPlayer.play() Works But AVAudioPlayerNode.play() Fails

How to monitor for AVAudioPlayerNode finished playing

Instead of using avPlayer.scheduleFile(file, at: nil), use the overload of the method that takes a completion handler:

avPlayer.scheduleFile(file, at: nil) {
    // call your completion function here
    print("Done playing")
}
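Note that with this form the handler fires once the player has consumed the scheduled data, which can be slightly before the sound actually finishes at the output. On iOS 11 and later there is a variant that takes a completionCallbackType; a minimal sketch, assuming avPlayer and file are already set up as above:

```swift
// Assumes avPlayer (AVAudioPlayerNode) and file (AVAudioFile) exist, as above.
// .dataPlayedBack fires after the audio has actually played out the output.
avPlayer.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { _ in
    // The handler runs on a background thread; hop to main before touching UI.
    DispatchQueue.main.async {
        print("Done playing")
    }
}
```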

AVAudioFile doesn't play in AVAudioEngine

Declare your AVAudioPlayerNode *player as a global (or as a property), so it is not deallocated while audio is still playing.

See this related question: Can't play file from documents in AVAudioPlayer

AVAudioEngine throws exception when connecting AVAudioPlayerNode to output

Apparently, the exception thrown is normal: it occurs in every condition and environment I have tested, and it does not interfere with normal functionality.

The reason why no sound was played is that the AVAudioEngine and AVAudioPlayerNode objects were released as soon as the function returned, because they had no strong pointers keeping them alive. I have fixed the issue by keeping those two objects as properties.
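A minimal sketch of that fix (class and method names are illustrative), holding the engine and player node in properties so they outlive the method call:

```swift
import AVFoundation

final class SoundPlayer {
    // Held as properties so both objects stay alive after play(url:) returns.
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()

    func play(url: URL) throws {
        let file = try AVAudioFile(forReading: url)
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
        try engine.start()
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    }
}
```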

Repeating Audio in WatchKit (AVAudioPlayer?)?

If your audio file fits into memory, you could read it into an AVAudioPCMBuffer and schedule it with the AVAudioPlayerNodeBufferLoops option (N.B. only tested on the simulator!):

// _audioPlayer is an AVAudioPlayerNode; asset is an AVAudioFile opened for reading.
AVAudioFormat *outputFormat = [_audioPlayer outputFormatForBus:0];
NSError *error = nil;

// Read the whole file into a source buffer.
__block AVAudioPCMBuffer *srcBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:asset.processingFormat frameCapacity:(AVAudioFrameCount)asset.length];

if (![asset readIntoBuffer:srcBuffer error:&error]) {
    NSLog(@"Read error: %@", error);
    abort();
}

// Convert the source buffer into the player's output format.
AVAudioPCMBuffer *dstBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)asset.length];

AVAudioConverter *converter = [[AVAudioConverter alloc] initFromFormat:srcBuffer.format toFormat:dstBuffer.format];
AVAudioConverterOutputStatus status = [converter convertToBuffer:dstBuffer error:&error withInputFromBlock:^AVAudioBuffer * _Nullable(AVAudioPacketCount inNumberOfPackets, AVAudioConverterInputStatus * _Nonnull outStatus) {
    if (srcBuffer) {
        // Hand the source buffer to the converter exactly once.
        AVAudioBuffer *result = srcBuffer;
        srcBuffer = NULL;
        *outStatus = AVAudioConverterInputStatus_HaveData;
        return result;
    } else {
        *outStatus = AVAudioConverterInputStatus_EndOfStream;
        return NULL;
    }
}];

assert(status != AVAudioConverterOutputStatus_Error);

// Schedule the converted buffer to loop indefinitely.
[_audioPlayer scheduleBuffer:dstBuffer atTime:nil options:AVAudioPlayerNodeBufferLoops completionHandler:nil];
[_audioPlayer play];

Playing an audio file repeatedly with AVAudioEngine

I found the solution in another question, asked and self-answered by @CarveDrone, so I've just copied the code he used:

class aboutViewController: UIViewController {

    var audioEngine = AVAudioEngine()
    var audioFilePlayer = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        // Load the bundled audio file into a PCM buffer.
        let fileURL = Bundle.main.url(forResource: "chimes", withExtension: "wav")!
        let audioFile = try! AVAudioFile(forReading: fileURL)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = AVAudioFrameCount(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount)!
        try! audioFile.read(into: audioFileBuffer)

        // Wire the player node into the engine and start it.
        let mainMixer = audioEngine.mainMixerNode
        audioEngine.attach(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
        try! audioEngine.start()

        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(audioFileBuffer, at: nil, options: .loops, completionHandler: nil)
    }
}

The only thing you have to change is the file name. Here is the link to the original answer: Having AVAudioEngine repeat a sound

Cannot play sound with AVAudioEngine

I ran your code and it worked fine. There are two things I would check:

  1. Make sure your file can be found: temporarily play it with AVAudioPlayer, just to confirm the program knows where the file is and can play it.
  2. Check where you are putting your statements; my example is below.

    import UIKit
    import AVFoundation

    class aboutViewController: UIViewController {

        let audioUrl = Bundle.main.url(forResource: "chimes", withExtension: "wav")!
        var audioEngine = AVAudioEngine()
        var myPlayer = AVAudioPlayerNode()

        override func viewDidLoad() {
            super.viewDidLoad()

            // Do any additional setup after loading the view.

            // Attach and connect the player node, then schedule the file.
            audioEngine.attach(myPlayer)
            let audioFile = try! AVAudioFile(forReading: audioUrl)
            audioEngine.connect(myPlayer, to: audioEngine.mainMixerNode, format: audioFile.processingFormat)
            myPlayer.scheduleFile(audioFile, at: nil, completionHandler: nil)
            try! audioEngine.start()
        }

        override func didReceiveMemoryWarning() {
            super.didReceiveMemoryWarning()
            // Dispose of any resources that can be recreated.
        }

        @IBAction func testSound(_ sender: AnyObject) {
            myPlayer.play()
        }
    }

Hope this helps, ask if you have any questions!


