Swift Solid Metronome System

A metronome built purely with NSTimer will not be very accurate, as Apple explains in their documentation.

Because of the various input sources a typical run loop manages, the effective resolution of the time interval for a timer is limited to on the order of 50-100 milliseconds. If a timer’s firing time occurs during a long callout or while the run loop is in a mode that is not monitoring the timer, the timer does not fire until the next time the run loop checks the timer.

I would suggest using an NSTimer that fires on the order of 50 times per desired tick (for example, if you would like 60 ticks per minute, you would set the NSTimeInterval to about 1/50 of a second).

You should then keep a CFAbsoluteTime that records the "last tick" time and compare it to the current time. If the absolute value of the difference between the current time and the "last tick" time is less than some tolerance (I would make this about 4 times the timer's firing interval; for example, if you chose 1/50 of a second per NSTimer fire, apply a tolerance of around 4/50 of a second), you can play the "tick."

You may need to calibrate the tolerances to get to your desired accuracy, but this general concept will make your metronome a lot more accurate.

Here is some more information on another SO post. It also includes some code that uses the theory I discussed. I hope this helps!

Update
The way you are calculating your tolerance is incorrect. In your calculations, notice that the tolerance is inversely proportional to the square of the bpm. The problem with this is that the tolerance will eventually become smaller than the interval at which the timer fires. Take a look at this graph to see what I mean. This will cause problems at high BPMs.

The other potential source of error is your upper bounding condition. You don't actually need to check an upper limit on your tolerance, because theoretically the timer should have already fired by then. Therefore, if the elapsed time is greater than the theoretical time, you can fire regardless. (For example, if the elapsed time is 0.1 s and the actual time at the true BPM should be 0.05 s, you should go ahead and fire the timer anyway, no matter what your tolerance is.)
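To make the crossover concrete, here is a small illustrative Swift snippet (the 0.1 * (60/bpm)^2 tolerance is a hypothetical stand-in for the asker's formula, which isn't reproduced here; the timer is assumed to fire 50 times per second):

import Foundation

let timerFireInterval = 1.0 / 50.0    // the high-frequency timer fires 50 times per second
for bpm in stride(from: 60.0, through: 300.0, by: 60.0) {
    // Hypothetical inverse-square tolerance, only for illustration
    let tolerance = 0.1 * pow(60.0 / bpm, 2)
    print("bpm \(bpm): tolerance \(tolerance) s, timer interval \(timerFireInterval) s")
}
// Once the tolerance drops below the timer interval (here somewhere between 120 and 180 BPM),
// the "within tolerance" window can fall entirely between two timer fires and ticks get skipped.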

Here is my timer "tick" function, which seems to work fine. You need to tweak it to fit your needs (with the downbeats, etc.) but it works in concept.

func tick(timer: NSTimer) {
    let elapsedTime: CFAbsoluteTime = CFAbsoluteTimeGetCurrent() - lastTick
    let targetTime: Double = 60 / timer.userInfo!.objectForKey("bpm")!.doubleValue!
    if (elapsedTime > targetTime) || (abs(elapsedTime - targetTime) < 0.003) {
        lastTick = CFAbsoluteTimeGetCurrent()
        // Play the click here
    }
}

My timer is initialized like so:

nextTimer = NSTimer(timeInterval: (60.0/Double(bpm)) * 0.01, target: self, selector: "tick:", userInfo: ["bpm": bpm], repeats: true)
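For readers on current Swift, the same approach translates roughly as follows (a sketch only, using the interval factor and tolerance from above; bpm and the click-playing code are assumed to live in your own class):

import Foundation

// Sketch of the same high-frequency-timer approach with the modern Timer API.
var bpm: Double = 60
var lastTick = CFAbsoluteTimeGetCurrent()

let tickTimer = Timer.scheduledTimer(withTimeInterval: (60.0 / bpm) * 0.01, repeats: true) { _ in
    let elapsed = CFAbsoluteTimeGetCurrent() - lastTick
    let target = 60.0 / bpm
    if elapsed > target || abs(elapsed - target) < 0.003 {
        lastTick = CFAbsoluteTimeGetCurrent()
        // play the click here
    }
}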

Timing issues: Metronome using AVAudioEngine scheduleBuffer's completion handler

As Phil Freihofner suggested above, here's the solution to my own problem:

The most important lesson I learned: the completionHandler callback provided by the scheduleBuffer method is not called early enough to trigger re-scheduling of another buffer while the first one is still playing. This results in (inaudible) gaps between the sounds and messes up the timing. There must already be another buffer "in reserve", i.e. one that was scheduled before the current one finishes playing.

Using the completionCallbackType parameter of scheduleBuffer didn't change the timing of the completion callback much: when set to .dataRendered or .dataConsumed the callback was still too late to re-schedule another buffer, and .dataPlayedBack only made things worse :-)
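For reference, the variant discussed here looks roughly like this (a sketch; player and buffer stand for the AVAudioPlayerNode and AVAudioPCMBuffer from the full listing below):

import AVFoundation

// Sketch of scheduleBuffer with an explicit completionCallbackType (iOS 11+).
func scheduleWithCallback(player: AVAudioPlayerNode, buffer: AVAudioPCMBuffer) {
    player.scheduleBuffer(buffer, at: nil, options: [], completionCallbackType: .dataRendered) { callbackType in
        // As described above, even .dataRendered / .dataConsumed arrive too late
        // to schedule the next buffer without a gap, and .dataPlayedBack is later still.
    }
}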

So, to achieve seamless playback (with correct timing!) I simply activated a timer that triggers twice per period. All odd numbered timer events will re-schedule another buffer.

Sometimes the solution is so easy it's embarrassing... But sometimes you have to try almost every wrong approach first to find it ;-)

My complete working solution (including the two sound files and the UI) can be found here on GitHub:

https://github.com/Alexander-Nagel/Metronome-using-AVAudioEngine

import UIKit
import AVFoundation

private let DEBUGGING_OUTPUT = true

class ViewController: UIViewController {

    private var engine = AVAudioEngine()
    private var player = AVAudioPlayerNode()
    private var mixer = AVAudioMixerNode()

    private let fileName1 = "sound1.wav"
    private let fileName2 = "sound2.wav"
    private var file1: AVAudioFile! = nil
    private var file2: AVAudioFile! = nil
    private var buffer1: AVAudioPCMBuffer! = nil
    private var buffer2: AVAudioPCMBuffer! = nil

    private let sampleRate: Double = 44100

    private var bpm: Double = 133.33
    private var periodLengthInSamples: Double {
        60.0 / bpm * sampleRate
    }
    private var timerEventCounter: Int = 1
    private var currentBeat: Int = 1
    private var timer: Timer! = nil

    private enum MetronomeState { case running, stopped }
    private var state: MetronomeState = .stopped

    @IBOutlet weak var beatLabel: UILabel!
    @IBOutlet weak var bpmLabel: UILabel!
    @IBOutlet weak var playPauseButton: UIButton!

    override func viewDidLoad() {

        super.viewDidLoad()

        bpmLabel.text = "\(bpm) BPM"

        setupAudio()
    }

    private func setupAudio() {

        //
        // MARK: Loading buffer1
        //
        let path1 = Bundle.main.path(forResource: fileName1, ofType: nil)!
        let url1 = URL(fileURLWithPath: path1)
        do {
            file1 = try AVAudioFile(forReading: url1)
            buffer1 = AVAudioPCMBuffer(
                pcmFormat: file1.processingFormat,
                frameCapacity: AVAudioFrameCount(periodLengthInSamples))
            try file1.read(into: buffer1!)
            buffer1.frameLength = AVAudioFrameCount(periodLengthInSamples)
        } catch { print("Error loading buffer1 \(error)") }

        //
        // MARK: Loading buffer2
        //
        let path2 = Bundle.main.path(forResource: fileName2, ofType: nil)!
        let url2 = URL(fileURLWithPath: path2)
        do {
            file2 = try AVAudioFile(forReading: url2)
            buffer2 = AVAudioPCMBuffer(
                pcmFormat: file2.processingFormat,
                frameCapacity: AVAudioFrameCount(periodLengthInSamples))
            try file2.read(into: buffer2!)
            buffer2.frameLength = AVAudioFrameCount(periodLengthInSamples)
        } catch { print("Error loading buffer2 \(error)") }

        //
        // MARK: Configure + start engine
        //
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: file1.processingFormat)
        engine.prepare()
        do { try engine.start() } catch { print(error) }
    }

    //
    // MARK: Play / Pause toggle action
    //
    @IBAction func buttonPresed(_ sender: UIButton) {

        sender.isSelected = !sender.isSelected

        if state == .running {

            //
            // PAUSE: Stop timer and reset counters
            //
            state = .stopped

            timer.invalidate()

            timerEventCounter = 1
            currentBeat = 1

        } else {

            //
            // START: Pre-load first sound and start timer
            //
            state = .running

            scheduleFirstBuffer()

            startTimer()
        }
    }

    private func startTimer() {

        if DEBUGGING_OUTPUT {
            print("# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # ")
            print()
        }

        //
        // Compute interval (in seconds) for 2 timer events per period and set up timer
        //
        let timerIntervalInSeconds = 0.5 * self.periodLengthInSamples / sampleRate

        timer = Timer.scheduledTimer(withTimeInterval: timerIntervalInSeconds, repeats: true) { timer in

            //
            // Only for debugging: Print counter values at the beginning of the timer event
            //
            if DEBUGGING_OUTPUT {
                print("timerEvent #\(self.timerEventCounter) at \(self.bpm) BPM")
                print("Entering \ttimerEventCounter: \(self.timerEventCounter) \tcurrentBeat: \(self.currentBeat) ")
            }

            //
            // Schedule next buffer at 1st, 3rd, 5th & 7th timerEvent
            //
            var bufferScheduled: String = "" // only needed for debugging / console output
            switch self.timerEventCounter {
            case 7:

                //
                // Schedule main sound
                //
                self.player.scheduleBuffer(self.buffer1, at: nil, options: [], completionHandler: nil)
                bufferScheduled = "buffer1"

            case 1, 3, 5:

                //
                // Schedule subdivision sound
                //
                self.player.scheduleBuffer(self.buffer2, at: nil, options: [], completionHandler: nil)
                bufferScheduled = "buffer2"

            default:
                bufferScheduled = ""
            }

            //
            // Display current beat & increase currentBeat (1...4) at 2nd, 4th, 6th & 8th timerEvent
            //
            if self.timerEventCounter % 2 == 0 {
                DispatchQueue.main.async {
                    self.beatLabel.text = String(self.currentBeat)
                }
                self.currentBeat += 1; if self.currentBeat > 4 { self.currentBeat = 1 }
            }

            //
            // Increase timerEventCounter, two events per beat.
            //
            self.timerEventCounter += 1; if self.timerEventCounter > 8 { self.timerEventCounter = 1 }

            //
            // Only for debugging: Print counter values at the end of the timer event
            //
            if DEBUGGING_OUTPUT {
                print("Exiting \ttimerEventCounter: \(self.timerEventCounter) \tcurrentBeat: \(self.currentBeat) \tscheduling: \(bufferScheduled)")
                print()
            }
        }
    }

    private func scheduleFirstBuffer() {

        player.stop()

        //
        // Pre-load accented main sound (for beat "1") before the trigger starts
        //
        player.scheduleBuffer(buffer1, at: nil, options: [], completionHandler: nil)
        player.play()
        beatLabel.text = String(currentBeat)
    }
}

Thanks so much for your help everyone! This is a wonderful community.

Alex

Accuracy of NSTimer

Here's a class you can use to do what you want:

@interface StopWatch()
@property ( nonatomic, strong ) NSTimer * displayTimer ;
@property ( nonatomic ) CFAbsoluteTime startTime ;
@end

@implementation StopWatch

-(void)dealloc
{
    [ self.displayTimer invalidate ] ;
}

-(void)startTimer
{
    self.startTime = CFAbsoluteTimeGetCurrent() ;
    self.displayTimer = [ NSTimer scheduledTimerWithTimeInterval:0.05 target:self selector:@selector( timerFired: ) userInfo:nil repeats:YES ] ;
}

-(void)stopTimer
{
    [ self.displayTimer invalidate ] ;
    self.displayTimer = nil ;

    CFAbsoluteTime elapsedTime = CFAbsoluteTimeGetCurrent() - self.startTime ;
    [ self updateDisplay:elapsedTime ] ;
}

-(void)timerFired:(NSTimer*)timer
{
    CFAbsoluteTime elapsedTime = CFAbsoluteTimeGetCurrent() - self.startTime ;
    [ self updateDisplay:elapsedTime ] ;
}

-(void)updateDisplay:(CFAbsoluteTime)elapsedTime
{
    // update your label here
}

@end

The key points are:

  1. Do your timing by saving the system time into a variable when the stop watch is started.
  2. When the stop watch is stopped, calculate the elapsed time by subtracting the stop watch's start time from the current time.
  3. Update your display using your timer. It doesn't matter whether this timer is accurate, because the elapsed time doesn't depend on it. If you want to guarantee display updates at least every 0.1 s, you can set the timer interval to half the minimum update time (0.05 s). A Swift sketch of the same idea follows below.
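The same idea as a minimal Swift sketch (the onUpdate closure is a stand-in for whatever label update you do; it is not part of the answer above):

import Foundation

// Minimal Swift sketch: the saved start time is the source of truth,
// the timer only drives display refreshes.
class StopWatch {
    private var displayTimer: Timer?
    private var startTime: CFAbsoluteTime = 0

    var onUpdate: ((CFAbsoluteTime) -> Void)?   // e.g. update your label here

    func start() {
        startTime = CFAbsoluteTimeGetCurrent()
        displayTimer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            self.onUpdate?(CFAbsoluteTimeGetCurrent() - self.startTime)
        }
    }

    func stop() {
        displayTimer?.invalidate()
        displayTimer = nil
        onUpdate?(CFAbsoluteTimeGetCurrent() - startTime)
    }
}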

-[UIApplication delegate] must be called from main thread only

Just call it from the main thread like this.

Objective-C

dispatch_async(dispatch_get_main_queue(), ^{
    [[[UIApplication sharedApplication] delegate] fooBar];
});

Swift

DispatchQueue.main.async {
    YourUIControlMethod()
}

Reaching out to your app delegate like this is a hint that your architecture could use a little cleanup.

You can call delegates from any thread you want. You only need to make sure you're on the main thread for UIKit calls, or on whatever thread your Core Data objects expect. It all depends on the API contract your objects have.
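If you want to catch accidental background-thread calls early during development, a cheap sanity check (not part of the answer above) is a dispatch precondition at the top of your UI-touching method:

import Dispatch

// Asserts that this code is running on the main queue
// before any UIKit / app delegate work happens.
func updateUI() {
    dispatchPrecondition(condition: .onQueue(.main))
    // ... UIKit calls here ...
}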


