Play segment of AVAudioPCMBuffer
You can create a new buffer that's a segment of the original. AVAudioPCMBuffer and AudioBufferList are awkward to work with in Swift. There are a few ways to do this: if you are using floats, you can access AVAudioPCMBuffer.floatChannelData, but here's a method that works for both float and int samples.
func segment(of buffer: AVAudioPCMBuffer, from startFrame: AVAudioFramePosition, to endFrame: AVAudioFramePosition) -> AVAudioPCMBuffer? {
    let framesToCopy = AVAudioFrameCount(endFrame - startFrame)
    guard let segment = AVAudioPCMBuffer(pcmFormat: buffer.format, frameCapacity: framesToCopy) else { return nil }

    let sampleSize = buffer.format.streamDescription.pointee.mBytesPerFrame
    let srcPtr = UnsafeMutableAudioBufferListPointer(buffer.mutableAudioBufferList)
    let dstPtr = UnsafeMutableAudioBufferListPointer(segment.mutableAudioBufferList)
    for (src, dst) in zip(srcPtr, dstPtr) {
        memcpy(dst.mData, src.mData?.advanced(by: Int(startFrame) * Int(sampleSize)), Int(framesToCopy) * Int(sampleSize))
    }

    segment.frameLength = framesToCopy
    return segment
}
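As noted above, for float formats you can slice via floatChannelData directly. Here is a minimal sketch of that variant; the function name, format, and frame range are illustrative, not part of the original answer:

```swift
import AVFoundation

// Float-only variant: copy a frame range per channel using floatChannelData.
func floatSegment(of buffer: AVAudioPCMBuffer,
                  from startFrame: AVAudioFramePosition,
                  to endFrame: AVAudioFramePosition) -> AVAudioPCMBuffer? {
    let frameCount = AVAudioFrameCount(endFrame - startFrame)
    guard let src = buffer.floatChannelData,
          let out = AVAudioPCMBuffer(pcmFormat: buffer.format, frameCapacity: frameCount),
          let dst = out.floatChannelData else { return nil }
    for channel in 0..<Int(buffer.format.channelCount) {
        // Copy frameCount samples starting at startFrame for this channel.
        dst[channel].update(from: src[channel] + Int(startFrame), count: Int(frameCount))
    }
    out.frameLength = frameCount
    return out
}

// Example: slice frames 100..<200 out of a 1000-frame mono buffer.
let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)!
let source = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 1_000)!
source.frameLength = 1_000
for i in 0..<1_000 { source.floatChannelData![0][i] = Float(i) }
let slice = floatSegment(of: source, from: 100, to: 200)
```

This only works for formats whose common format is `.pcmFormatFloat32`; for int formats, use the generic `memcpy` version above.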
How to play a looping compressed soundtrack without using too much RAM?
As a workaround, I decided to create a wrapper that splits the audio into chunks of a few seconds, buffering and playing them one at a time through an AVAudioPlayerNode.
As a result, only a few seconds of audio are in RAM at any time (twice that while buffering).
It brought the memory usage for my use case from 350 MB down to less than 50 MB.
Here is the code; feel free to use or improve it (it's a first version). Any comments are welcome!
import Foundation
import AVFoundation
public class AVAudioStreamPCMPlayerWrapper
{
    public var player: AVAudioPlayerNode
    public let audioFile: AVAudioFile
    public let bufferSize: TimeInterval
    public let url: URL

    public private(set) var loopingCount: Int = 0
    /// Equal to the repeatingTimes passed in the initialiser.
    public let numberOfLoops: Int
    /// The time, passed as an initialisation parameter, by which the player preloads the next buffer to get a smooth transition.
    /// The default value is 1 s.
    /// Note: better not to go under 1 s, since the buffering mechanism is triggered with only relative precision.
    public let preloadTime: TimeInterval

    public private(set) var scheduled: Bool = false

    private let framePerBuffer: AVAudioFrameCount
    /// Identifies the schedule cycle we are executing.
    /// Since dispatched work items can't be stopped once they are scheduled,
    /// we need to be sure that a work item only runs for the current playing cycle:
    /// for example, the player may have been stopped and restarted before the async call executed.
    private var scheduledId: Int = 0

    /// The time since the track started.
    private var startingDate: Date = Date()
    /// The date used to measure the difference between the moment the buffering should have occurred and the moment it actually did.
    /// With it we can adjust the next buffering trigger so that the delay doesn't accumulate.
    private var lastBufferingDate = Date()

    /// This class allows us to play a sound, once or multiple times, without overloading the RAM.
    /// Instead of loading the full sound into memory it only reads a segment of it at a time, preloading the next segment to avoid stutter.
    /// - Parameters:
    ///   - url: The URL of the sound to be played.
    ///   - bufferSize: The size of the segment of the sound being played. Must be greater than preloadTime.
    ///   - repeatingTimes: How many times the sound must loop (0: played only once; 1: played twice, i.e. repeated once;
    ///     -1: repeats indefinitely).
    ///   - preloadTime: 1 should be the minimum value, since the preloading mechanism may not be triggered precisely on time.
    /// - Throws: Rethrows the error AVAudioFile throws if it couldn't be created with the URL passed as a parameter.
    public init(url: URL, bufferSize: TimeInterval, isLooping: Bool, repeatingTimes: Int = -1, preloadTime: TimeInterval = 1) throws
    {
        self.url = url
        self.player = AVAudioPlayerNode()
        self.bufferSize = bufferSize
        self.numberOfLoops = repeatingTimes
        self.preloadTime = preloadTime
        try self.audioFile = AVAudioFile(forReading: url)

        framePerBuffer = AVAudioFrameCount(audioFile.fileFormat.sampleRate * bufferSize)
    }

    public func scheduleBuffer()
    {
        scheduled = true
        scheduledId += 1
        scheduleNextBuffer(offset: preloadTime)
    }

    public func play()
    {
        player.play()
        startingDate = Date()
        scheduleNextBuffer(offset: preloadTime)
    }

    public func stop()
    {
        reset()
        scheduleBuffer()
    }

    public func reset()
    {
        player.stop()
        player.reset()
        scheduled = false
        audioFile.framePosition = 0
    }

    /// The first time this method is called, the trigger is offset by the preload time; since subsequent calls
    /// are already offset, we don't need to offset them again.
    private func scheduleNextBuffer(offset: TimeInterval)
    {
        guard scheduled else { return }

        if audioFile.length == audioFile.framePosition
        {
            guard numberOfLoops == -1 || loopingCount < numberOfLoops else { return }
            audioFile.framePosition = 0
            loopingCount += 1
        }

        let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat, frameCapacity: framePerBuffer)!
        let frameCount = min(framePerBuffer, AVAudioFrameCount(audioFile.length - audioFile.framePosition))

        do
        {
            try audioFile.read(into: buffer, frameCount: frameCount)

            DispatchQueue.global().async(group: nil, qos: DispatchQoS.userInteractive, flags: .enforceQoS) { [weak self] in
                self?.player.scheduleBuffer(buffer, at: nil, options: .interruptsAtLoop)
                self?.player.prepare(withFrameCount: frameCount)
            }

            let nextCallTime = max(TimeInterval(Double(frameCount) / audioFile.fileFormat.sampleRate) - offset, 0)
            planNextPreloading(nextCallTime: nextCallTime)
        } catch
        {
            print("audio file read error : \(error)")
        }
    }

    private func planNextPreloading(nextCallTime: TimeInterval)
    {
        guard self.player.isPlaying else { return }

        let id = scheduledId
        lastBufferingDate = Date()

        DispatchQueue.global().asyncAfter(deadline: .now() + nextCallTime, qos: DispatchQoS.userInteractive) { [weak self] in
            guard let self = self else { return }
            guard id == self.scheduledId else { return }

            let delta = -(nextCallTime + self.lastBufferingDate.timeIntervalSinceNow)
            self.scheduleNextBuffer(offset: delta)
        }
    }
}
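The answer doesn't show how to wire the wrapper into an engine; here is a hypothetical usage sketch. The file name is a placeholder, and the chunk/preload values are arbitrary example choices:

```swift
import AVFoundation

// Hypothetical setup for AVAudioStreamPCMPlayerWrapper (defined above).
// "soundtrack.m4a" is a placeholder; substitute your own compressed file.
let engine = AVAudioEngine()

if let url = Bundle.main.url(forResource: "soundtrack", withExtension: "m4a") {
    do {
        // 5-second chunks, looping indefinitely, preloading 1 s ahead.
        let wrapper = try AVAudioStreamPCMPlayerWrapper(url: url,
                                                        bufferSize: 5,
                                                        isLooping: true,
                                                        repeatingTimes: -1,
                                                        preloadTime: 1)
        engine.attach(wrapper.player)
        engine.connect(wrapper.player, to: engine.mainMixerNode,
                       format: wrapper.audioFile.processingFormat)
        try engine.start()

        wrapper.scheduleBuffer() // queue the first chunk
        wrapper.play()
    } catch {
        print("Could not start playback: \(error)")
    }
}
```

Note that the wrapper creates its own AVAudioPlayerNode, so the caller is responsible for attaching and connecting `wrapper.player` before starting the engine.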
Schedule buffer with AVAudioPCMBuffer Int16 data
The AVAudioMixerNode is good for sample rate conversions, but for broad format changes like Int16 to Float, you're probably better off converting yourself. For performance, I suggest using vDSP from the Accelerate framework.
import Cocoa
import AVFoundation
import Accelerate
import PlaygroundSupport

let bufferSize = 512
let bufferByteSize = MemoryLayout<Float>.size * bufferSize

var pcmInt16Data: [Int16] = []
var pcmFloatData = [Float](repeating: 0.0, count: bufferSize) // allocate once and reuse

// one buffer of noise as an example
for _ in 0..<bufferSize {
    let value = Int16.random(in: Int16.min...Int16.max)
    pcmInt16Data.append(value)
}

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let audioFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000.0, channels: 1)!
let mixer = engine.mainMixerNode

engine.attach(player)
engine.connect(player, to: mixer, format: audioFormat)
engine.prepare()

do {
    try engine.start()
} catch {
    print("Error info: \(error)")
}

player.play()

if let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(bufferSize)) {
    let monoChannel = buffer.floatChannelData![0]
    // Int16 ranges from -32768 to 32767 -- we want to convert and scale these to Float values between -1.0 and 1.0
    var scale = Float(Int16.max) + 1.0
    vDSP_vflt16(pcmInt16Data, 1, &pcmFloatData, 1, vDSP_Length(bufferSize)) // Int16 to Float
    vDSP_vsdiv(pcmFloatData, 1, &scale, &pcmFloatData, 1, vDSP_Length(bufferSize)) // divide by scale
    memcpy(monoChannel, pcmFloatData, bufferByteSize)
    buffer.frameLength = UInt32(bufferSize)
    player.scheduleBuffer(buffer, completionHandler: nil) // load more buffers in the completionHandler
}

PlaygroundPage.current.needsIndefiniteExecution = true
If instead you'd like to play an AVAudioFile, use the AVAudioPlayerNode.scheduleFile() and .scheduleSegment() methods rather than trying to read the Int16 data directly from a WAV/AIFF. Pay attention to the AVAudioFile.processingFormat property and use it as the format of the connection from the player to the mixer.
import Cocoa
import PlaygroundSupport
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let playEntireFile = true

func playLocalFile() {
    // file needs to be in ~/Documents/Shared Playground Data
    let localURL = playgroundSharedDataDirectory.appendingPathComponent("MyAwesomeMixtape6.aiff")
    guard let audioFile = try? AVAudioFile(forReading: localURL) else { return }
    let audioFormat = audioFile.processingFormat

    let mixer = engine.mainMixerNode
    engine.attach(player)
    engine.connect(player, to: mixer, format: audioFormat)
    engine.prepare()

    do {
        try engine.start()
    } catch {
        print("Error info: \(error)")
    }

    player.play()

    if playEntireFile {
        player.scheduleFile(audioFile, at: nil, completionHandler: nil)
    } else { // play a segment
        let startTimeSeconds = 5.0
        let durationSeconds = 2.0
        let sampleRate = audioFormat.sampleRate
        let startFramePosition = startTimeSeconds * sampleRate
        let durationFrameCount = durationSeconds * sampleRate
        player.scheduleSegment(audioFile,
                               startingFrame: AVAudioFramePosition(startFramePosition),
                               frameCount: AVAudioFrameCount(durationFrameCount),
                               at: nil,
                               completionHandler: nil)
    }
}

playLocalFile()

PlaygroundPage.current.needsIndefiniteExecution = true
For remote files, try AVPlayer.
import Cocoa
import AVFoundation
import PlaygroundSupport

var player: AVPlayer?

func playRemoteFile() {
    guard let remoteURL = URL(string: "https://ondemand.npr.org/anon.npr-mp3/npr/me/2020/03/20200312_me_singapore_wins_praise_for_its_covid-19_strategy_the_us_does_not.mp3"
    ) else { return }
    player = AVPlayer(url: remoteURL)
    player?.play()
}

playRemoteFile()

PlaygroundPage.current.needsIndefiniteExecution = true