How to Set AVAudioEngine Input and Output Devices (Swift/macOS)

So I filed a support request with Apple on this and another issue, and the response confirms that an AVAudioEngine can only be assigned to a single aggregate device (that is, a device with both input and output channels). The system default units effectively create an aggregate device internally, which is why they work. I've also found an additional issue: if the input device has output capabilities as well (and you activate the inputNode), then that device has to be both the input and the output device, as otherwise the output appears not to work.

So the answer is: I think there is no answer.
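Since the support answer says the engine wants a single device with both input and output channels, one possible workaround is to build an aggregate device in code via CoreAudio. This is only a sketch, not part of the support answer; the function name, device name, and UID strings are placeholders - look up the real device UIDs with AudioObjectGetPropertyData.

```swift
import CoreAudio

// Sketch: combine a separate input and output device into one aggregate
// device that AVAudioEngine can use. All names/UIDs here are placeholders.
func makeAggregateDevice(inputUID: String, outputUID: String) -> AudioObjectID {
    let description: [String: Any] = [
        kAudioAggregateDeviceNameKey: "Engine Aggregate",
        kAudioAggregateDeviceUIDKey: "com.example.engine-aggregate",
        kAudioAggregateDeviceSubDeviceListKey: [
            [kAudioSubDeviceUIDKey: inputUID],
            [kAudioSubDeviceUIDKey: outputUID],
        ],
    ]
    var aggregateID = AudioObjectID(0)
    let status = AudioHardwareCreateAggregateDevice(description as CFDictionary,
                                                    &aggregateID)
    print("create aggregate status = \(status)")
    return aggregateID
}
```

The resulting AudioObjectID could then be assigned to the engine's I/O unit the same way as a regular device ID.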

Set AVAudioEngine Input and Output Devices

Ok, after re-reading the docs for the 10th time, I noticed AVAudioEngine has members inputNode and outputNode (not sure how I missed that!).

The following code seems to do the job:

AudioDeviceID inputDeviceID = 53; // get this using AudioObjectGetPropertyData
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AudioUnit audioUnit = [[engine inputNode] audioUnit];

OSStatus error = AudioUnitSetProperty(audioUnit,
                                      kAudioOutputUnitProperty_CurrentDevice,
                                      kAudioUnitScope_Global,
                                      0,
                                      &inputDeviceID,
                                      sizeof(inputDeviceID));

I borrowed the non-AVFoundation C code from the CAPlayThrough example.
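For a Swift project, the same property can be set without dropping into Objective-C. A minimal sketch, assuming you have already looked up a valid AudioDeviceID (the helper name is mine, not from the original answer):

```swift
import AVFoundation
import CoreAudio

// Hypothetical helper: point the engine's input at a given CoreAudio device.
// deviceID must be a valid ID obtained via AudioObjectGetPropertyData;
// the hard-coded 53 in the snippet above was just an example value.
func setInputDevice(_ deviceID: AudioDeviceID, on engine: AVAudioEngine) -> OSStatus {
    guard let audioUnit = engine.inputNode.audioUnit else {
        return kAudioUnitErr_Uninitialized
    }
    var device = deviceID
    return AudioUnitSetProperty(audioUnit,
                                kAudioOutputUnitProperty_CurrentDevice,
                                kAudioUnitScope_Global,
                                0,
                                &device,
                                UInt32(MemoryLayout<AudioDeviceID>.size))
}
```

A returned OSStatus of 0 (noErr) indicates the device was accepted.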

How to get real-time microphone input in macOS?

I entered the following code into testRecord.swift:

import Foundation
import AVFoundation

print("starting")

public let audioEngine = AVAudioEngine()

var flag = 0

func startRecording() throws {
    let inputNode = audioEngine.inputNode
    let srate = inputNode.inputFormat(forBus: 0).sampleRate
    print("sample rate = \(srate)")
    if srate == 0 {
        exit(0)
    }

    let recordingFormat = inputNode.outputFormat(forBus: 0)
    inputNode.installTap(onBus: 0,
                         bufferSize: 1024,
                         format: recordingFormat) { (buffer: AVAudioPCMBuffer, when: AVAudioTime) in
        let n = buffer.frameLength
        let c = buffer.stride
        if flag == 0 {
            print("num samples = \(n)")
            print("num channels = \(c)")
            flag = 1
        }
    }

    try audioEngine.start()
}

func stopRecording() {
    audioEngine.stop()
}

do {
    try startRecording()
} catch {
    print("error starting recording: \(error)")
}

usleep(UInt32(1000 * 1000)) // sleep 1 second before quitting
stopRecording()
print("done")
exit(0)

I compiled testRecord.swift using swiftc on macOS 10.14.5 / Xcode 10.2.1, then tried to run the result from Terminal. The first time it ran, macOS asked whether Terminal could have microphone permissions. I replied yes, but there was no output.

But then on subsequent runs it output:

starting
sample rate = 44100.0
num samples = 4410
num channels = 1
done

So it might be that you need to grant your app microphone permission under System Preferences > Security & Privacy > Privacy > Microphone.
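To make that failure mode explicit, you can check the authorization status up front with AVCaptureDevice before starting the engine. This is a sketch of the idea, not part of the original answer:

```swift
import AVFoundation

// Check microphone permission before starting the engine, so a denied
// permission explains a sample rate of 0 or a tap that never fires.
switch AVCaptureDevice.authorizationStatus(for: .audio) {
case .authorized:
    print("microphone access already granted")
case .notDetermined:
    AVCaptureDevice.requestAccess(for: .audio) { granted in
        print("microphone access \(granted ? "granted" : "denied")")
    }
default: // .denied or .restricted
    print("enable microphone access in System Preferences > Security & Privacy > Privacy > Microphone")
}
```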

AVAudioEngine audio stops when switching the audio output device

I heard back from Apple tech support, and as Dad commented here, when the audio engine configuration changes an AVAudioEngineConfigurationChange notification is sent.

At that point the nodes are detached, so the audio graph needs to be reconstructed and the engine restarted to resume playback on the new output device.

I'm including below the complete AppDelegate that tests the original premise of the question. On starting the app I call both setupAudio() to load the audio and playAudio() to start playback.

Additionally, each time the audio engine configuration changes, I call playAudio() again to restart playback:

import Cocoa
import AVFoundation

@main
class AppDelegate: NSObject, NSApplicationDelegate {

    var engine = AVAudioEngine()
    var player = AVAudioPlayerNode()

    // Load the audio file
    let file = try! AVAudioFile(forReading: URL(fileURLWithPath: Bundle.main.path(forResource: "sheep1.m4a", ofType: nil)!))
    var buffer: AVAudioPCMBuffer!

    func applicationDidFinishLaunching(_ aNotification: Notification) {
        setupAudio()
        playAudio()
    }

    /// Call this only ONCE
    func setupAudio() {
        // Attach the audio player
        engine.attach(player)

        // Load the audio buffer
        let fileFormat = file.processingFormat
        let fileFrameCount = UInt32(file.length)
        buffer = AVAudioPCMBuffer(pcmFormat: fileFormat, frameCapacity: fileFrameCount)
        file.framePosition = .zero
        try! file.read(into: buffer!, frameCount: fileFrameCount)

        // Observe changes in the audio engine configuration
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(handleInterruption),
                                               name: .AVAudioEngineConfigurationChange,
                                               object: nil)
    }

    /// Call this every time you want to restart audio
    func playAudio() {
        // Connect to the mixer
        let mainMixer = engine.mainMixerNode
        engine.connect(player, to: mainMixer, format: file.processingFormat)

        // Start the engine
        try! engine.start()

        // Play the audio
        player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
        player.play()
    }

    @objc func handleInterruption(notification: Notification) {
        playAudio()
    }
}

This code works for me on Big Sur.

Tap Mic Input Using AVAudioEngine in Swift

It might be that your AVAudioEngine is going out of scope and being released by ARC ("If you liked it then you should have put retain on it...").

The following code (engine is moved to an ivar and thus sticks around) fires the tap:

import Cocoa
import AVFoundation

class AppDelegate: NSObject, NSApplicationDelegate {

    let audioEngine = AVAudioEngine()

    func applicationDidFinishLaunching(_ aNotification: Notification) {
        let inputNode = audioEngine.inputNode
        let bus = 0
        inputNode.installTap(onBus: bus, bufferSize: 2048, format: inputNode.inputFormat(forBus: bus)) {
            (buffer: AVAudioPCMBuffer, time: AVAudioTime) in
            print("sfdljk")
        }

        audioEngine.prepare()
        try! audioEngine.start()
    }
}

(I removed the error handling for brevity)
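One related detail: a bus can hold only one tap, so if you start and stop repeatedly, remove the existing tap before installing a new one, and stop the engine when you're done. A small sketch (the function name is mine):

```swift
import AVFoundation

// Tear down the tap installed above: a bus holds only one tap, so remove
// it before installing another on the same bus, then stop the engine.
func stopListening(_ engine: AVAudioEngine) {
    engine.inputNode.removeTap(onBus: 0)
    engine.stop()
}
```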


