Saturday, March 27, 2021

sleep_until() and steady_clock loop drifting from real time on macOS

Good evening everyone. I'm trying to learn concurrency using C++ Concurrency in Action by Anthony Williams. Having read the first two chapters, I thought I'd code a simple metronome that runs in its own thread:

#include <iostream>
#include <thread>
#include <chrono>
#include <vector>

class Metro
{
public:
    // beats per minute
    Metro(int bpm_in);
    void start();

private:
    // In milliseconds
    int mPeriod;
    std::vector<std::thread> mThreads;

private:
    void loop();
};

Metro::Metro(int bpm_in):
    mPeriod(60000/bpm_in)
{}

void Metro::start()
{
    mThreads.push_back(std::thread(&Metro::loop, this));
    mThreads.back().detach();
}

void Metro::loop()
{
    auto x = std::chrono::steady_clock::now();
    while(true)
    {
        x += std::chrono::milliseconds(mPeriod);
        std::cout << "\a" << std::flush;
        std::this_thread::sleep_until(x);
    }
}

Now, this code seems to work properly, except for the time interval: with bpm = 60 (so mPeriod = 1000 ms), the measured period is more than 1100 ms. I read on cppreference.com that sleep_until is not guaranteed to wake the thread exactly at the requested time, but that lack of precision should not change the average period; it should only delay individual "tics" within the time grid. Am I understanding that correctly? I assumed that storing steady_clock::now() only once, before the loop, and then advancing the target by fixed increments would be the correct way to avoid adding drift on every cycle. Nevertheless, I also tried changing the update of x inside the while loop to

x = std::chrono::steady_clock::now() + std::chrono::milliseconds(mPeriod);

but the period increases even more. I also tried std::chrono::system_clock and high_resolution_clock, but the period didn't improve. The properties I care about for this application are monotonicity and steadiness, which steady_clock provides. My questions: is there anything fundamentally wrong in my code? Am I missing something about how to use std::chrono clocks and sleep_until? Or is this kind of approach inherently imprecise?

I started analyzing the period by simply comparing it against some known metronomes (Logic Pro, Ableton Live, a few mobile apps), and then recorded the output sound to get a better measurement. Maybe the sound buffer adds some delay of its own, but the same problem occurs when the program outputs a plain character instead. Also, what concerns me is the drift, not a single tic being slightly out of time.

I'm compiling from the macOS 10.15 Terminal with g++ -std=c++11 -pthread and running on an Intel i7-4770HQ.
