Period of repeated timeouts



The behavior of GLib timeouts is, I believe, somewhat impractical. I am
referring to the fact that, quoting the documentation, they "[do] not try
to 'catch up' time lost in delays".

Suppose, for example, that I want to update a clock on the screen, with
second accuracy. The obvious way to do that is something like:

	g_timer_reset(clock_timer);               /* rewind the reference timer */
	g_timeout_add(1000, update_clock, NULL);  /* call update_clock every 1000ms */
	g_timer_start(clock_timer);               /* start measuring from now */

and have update_clock return TRUE so that it is rescheduled every second.
But it will not work: the actual period between calls to update_clock will
be something like 1005ms (a reasonable figure with HZ=100, for example).
After two minutes, the clock will be updated more than half a second after
the actual second change, and after three minutes or so, the display will
miss a whole second.
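
For what it is worth, the drift is easy to observe. Here is a minimal,
self-contained sketch (the report() callback and its output format are
mine, added just for the measurement):

	#include <glib.h>

	static GTimer *t;
	static guint n = 0;

	static gboolean
	report (gpointer data)
	{
		n++;
		g_print ("tick %u: %.0f ms elapsed (ideal: %u ms)\n",
		         n, g_timer_elapsed (t, NULL) * 1000, n * 1000);
		return TRUE;	/* keep the same fixed 1000ms timeout running */
	}

	int
	main (void)
	{
		GMainLoop *loop = g_main_loop_new (NULL, FALSE);

		t = g_timer_new ();
		g_timeout_add (1000, report, NULL);
		g_main_loop_run (loop);
		return 0;
	}

The gap between the elapsed and ideal columns grows steadily instead of
staying bounded.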

There is a trivial solution to this problem: in g_timeout_dispatch,
g_timeout_set_expiration is called with current_time; if it were called
with timeout_source->expiration instead, we would get an accurate period.

Of course, I am not suggesting that particular change: all hell would break
loose for existing programs if the system is too heavily loaded to service
all timeouts in due time.
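
In the meantime, the accurate period can be recovered in user code by
chaining one-shot timeouts whose delay is computed against a fixed epoch.
A minimal sketch, assuming update_clock() stands in for the real redraw
code:

	#include <glib.h>

	static GTimer *clock_timer;
	static guint   ticks = 0;

	static gboolean tick (gpointer data);

	static void
	update_clock (void)
	{
		/* stand-in for the real display update */
	}

	/* Schedule the next one-shot timeout so that it fires on the next
	   whole second measured from clock_timer's start, not 1000ms after
	   "now"; this way delays do not accumulate. */
	static void
	schedule_next_tick (void)
	{
		gdouble elapsed_ms = g_timer_elapsed (clock_timer, NULL) * 1000.0;
		gint delay;

		ticks++;
		delay = (gint) (ticks * 1000 - elapsed_ms);
		if (delay < 0)
			delay = 0;	/* already late: fire as soon as possible */
		g_timeout_add ((guint) delay, tick, NULL);
	}

	static gboolean
	tick (gpointer data)
	{
		update_clock ();
		schedule_next_tick ();
		return FALSE;	/* one-shot: the next tick was rescheduled above */
	}

	int
	main (void)
	{
		GMainLoop *loop = g_main_loop_new (NULL, FALSE);

		clock_timer = g_timer_new ();	/* g_timer_new() also starts the timer */
		schedule_next_tick ();
		g_main_loop_run (loop);
		return 0;
	}

The key point is that the delay for tick N is derived from N and the
epoch, never from "now plus 1000ms", so late dispatches do not add up.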

Nevertheless, I am sure that the need for timeouts with an accurate period
over the long run is not uncommon at all. Therefore, GLib should maybe
include some new support for that need.
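
Purely as a hypothetical sketch of what the interface might look like
(g_timeout_add_accurate does not exist in GLib; the name and prototype
are mine):

	/* Hypothetical -- not an existing GLib function.  Behaves like
	   g_timeout_add(), except that each new expiration is computed from
	   the previous scheduled expiration instead of the current time, so
	   the long-run average period stays exactly `interval' ms. */
	guint g_timeout_add_accurate (guint       interval,
	                              GSourceFunc function,
	                              gpointer    data);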

I am just about to write support for such timeouts in a project of mine. I
am wondering whether I should make it more reusable than my immediate needs
require and try to contribute it to GLib.

Would you be interested in such functions?



