macOS/Swift Capture Audio with AVCaptureSession
It's called for me. You don't show how you use it, but maybe your AudioCaptureSession
is going out of scope and being deallocated.
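If that is the case, here is a minimal sketch of keeping the session alive by storing it in a long-lived property. The class and property names are my own, not taken from the question:

```swift
import AVFoundation

// Hypothetical wrapper: the session must live in a stored property that
// outlives the capture; if it is only a local variable, ARC deallocates
// it and the delegate callbacks never fire.
final class AudioRecorder: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    // Strong references keep the session and output alive for the object's lifetime.
    private let session = AVCaptureSession()
    private let output = AVCaptureAudioDataOutput()

    func start() {
        guard let device = AVCaptureDevice.default(for: .audio),
              let input = try? AVCaptureDeviceInput(device: device) else { return }
        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "audio.capture"))
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Called on the capture queue for as long as `session` stays allocated.
    }
}
```

Note that the AudioRecorder instance itself must also be retained (e.g. as a property of a view controller), not created as a local variable.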
How to record my Mac's internal sound (not the microphone) using AVCaptureSession?
I have not found an easy built-in way to do it, but it turns out the original code can record the audio, provided additional software is installed on the machine: Soundflower.
Soundflower is an open-source kernel extension for macOS, designed to create a virtual audio output device that can also act as an input.
With Soundflower installed, you can configure macOS using the Applications / Utilities / Audio MIDI Setup app to send the audio to both the virtual and the real audio devices. This way the code above captures the audio from Soundflower, but you can still hear it on your normal audio output device.
The setup of the Audio MIDI Setup application is described here: How can I send my computer's audio to multiple outputs?
Using AVCapture session to capture audio frames
It turns out (as @RhythmicFistman suggested) I was just missing the setting that specifies whether or not I wanted the frames to be interleaved. So my settings look like this now:
let audioSettings = [AVFormatIDKey: kAudioFormatLinearPCM,
                     AVSampleRateKey: 48000,
                     AVLinearPCMBitDepthKey: 16,
                     AVLinearPCMIsFloatKey: false,
                     AVLinearPCMIsNonInterleaved: false] as [String : Any]
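For context, here is a sketch of where that dictionary plugs in: on macOS, AVCaptureAudioDataOutput exposes an audioSettings property that accepts it (the variable name is illustrative):

```swift
import AVFoundation

let audioOutput = AVCaptureAudioDataOutput()

// Interleaved 16-bit integer PCM at 48 kHz. Note the double negative:
// AVLinearPCMIsNonInterleaved = false requests *interleaved* frames.
// (This key is one of the few AVFoundation settings keys without a
// "Key" suffix in its name.)
audioOutput.audioSettings = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVSampleRateKey: 48_000,
    AVLinearPCMBitDepthKey: 16,
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMIsNonInterleaved: false
]
```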
How to add multiple audio AVCaptureDevice to an AVCaptureSession
I thought multiple audio and video AVCaptureInputs were unsupported, yet I couldn't find any documentation on that either.
A while ago this person cleverly managed to create multiple video inputs and outputs by using addOutputWithNoConnections(), then manually creating the connections to the inputs (but why no addInputWithNoConnections()?):
https://stackoverflow.com/a/30191013/22147
That could be worth trying with audio! Please report back with your results if you try this!
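Here is a hedged sketch of what that technique might look like for audio; the device discovery and the per-device wiring are my assumptions, not tested code:

```swift
import AVFoundation

// Sketch of the linked technique applied to audio: add inputs and outputs
// without implicit connections, then wire each input's ports to its own
// dedicated output manually.
let session = AVCaptureSession()
session.beginConfiguration()

let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInMicrophone],
                                                 mediaType: .audio,
                                                 position: .unspecified)
for device in discovery.devices {
    guard let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else { continue }
    session.addInputWithNoConnections(input)

    let output = AVCaptureAudioDataOutput()
    session.addOutputWithNoConnections(output)

    // Connect only this input's audio ports to this output.
    let audioPorts = input.ports.filter { $0.mediaType == .audio }
    let connection = AVCaptureConnection(inputPorts: audioPorts, output: output)
    if session.canAddConnection(connection) {
        session.addConnection(connection)
    }
}

session.commitConfiguration()
```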
However, I like to wrap the input devices in an aggregate audio device and then configure the AVCaptureSession to use that. This gives you the convenience of working with a single "device", and you don't have to worry about multiple clocks and timestamps because both devices are synchronised.
You can create an aggregate audio device manually in Audio MIDI Setup.app: https://stackoverflow.com/a/65704755/22147
or programmatically, using AudioHardwareCreateAggregateDevice(): https://stackoverflow.com/a/56415699/22147
If you use the programmatic route, you can hide the resulting aggregate device by setting kAudioAggregateDeviceIsPrivateKey to true. You might want to do this to stop users messing with it.
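A sketch of the programmatic route with the private flag set; the device UIDs below are placeholders you would replace with real UIDs queried from Core Audio:

```swift
import CoreAudio

// Build a private aggregate of two sub-devices, identified by UID.
// Both sub-device UIDs here are placeholders; look up real ones via
// the kAudioDevicePropertyDeviceUID property on each device.
let description: [String: Any] = [
    kAudioAggregateDeviceNameKey: "My Capture Aggregate",
    kAudioAggregateDeviceUIDKey: "com.example.capture-aggregate", // assumed UID
    kAudioAggregateDeviceSubDeviceListKey: [
        [kAudioSubDeviceUIDKey: "BuiltInMicrophoneDevice"], // placeholder
        [kAudioSubDeviceUIDKey: "SoundflowerEngine:0"]      // placeholder
    ],
    // Hide the aggregate from Audio MIDI Setup and other apps.
    kAudioAggregateDeviceIsPrivateKey: 1
]

var aggregateID = AudioObjectID(0)
let status = AudioHardwareCreateAggregateDevice(description as CFDictionary, &aggregateID)
if status == noErr {
    // Locate the matching AVCaptureDevice by UID and hand it to the
    // AVCaptureSession. Tear it down when finished:
    // AudioHardwareDestroyAggregateDevice(aggregateID)
}
```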
AVCaptureSession Record Video With Audio
You have not included the audio device:
NSError *error = nil;
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (audioInput && [session canAddInput:audioInput]) {
    [session addInput:audioInput];
}
between beginConfiguration and commitConfiguration. It'll work!
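For completeness, a Swift sketch of the same idea, with the audio input added inside the begin/commitConfiguration pair (the device availability checks are my additions):

```swift
import AVFoundation

let session = AVCaptureSession()
session.beginConfiguration()

// Video input.
if let videoDevice = AVCaptureDevice.default(for: .video),
   let videoInput = try? AVCaptureDeviceInput(device: videoDevice),
   session.canAddInput(videoInput) {
    session.addInput(videoInput)
}

// Audio input -- the piece missing from the original code.
if let audioDevice = AVCaptureDevice.default(for: .audio),
   let audioInput = try? AVCaptureDeviceInput(device: audioDevice),
   session.canAddInput(audioInput) {
    session.addInput(audioInput)
}

let movieOutput = AVCaptureMovieFileOutput()
if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }

session.commitConfiguration()
session.startRunning()
```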