Re: microseconds ought to be enough for anybody

On Nov 1, 2010, at 9:06 AM, Ryan Lortie wrote:

> Hello
> 
> After an extensive process of dithering and soul-searching, I've come to
> the following conclusions about time in GLib:
> 
>  - we have too many time types
> 
>      no explanation required.
> 
>  - microseconds ought to be enough for anybody
> 
>     A nanosecond is on the scale of individual clock cycles and
>     processors are getting slower, not faster.  I have an extremely
>     hard time imagining this sort of precision being meaningful (much
>     less, actually useful) in our platform.  Heck, our mainloop is
>     still based on milliseconds.
> 
>  - double isn't as good as int64
> 
>     Using doubles for time intervals is extremely wasteful.  Doubles
>     were designed to represent extremely small and extremely large
>     numbers -- orders of magnitude that we never meaningfully encounter
>     while dealing with time.  Those extra bits used to express order of
>     magnitude are lost for expressing actual information.  If we used
>     fixed point, we'd get those 12 extra bits back.
> 
>     I also don't like the strange feeling that I get when I consider
>     that the precision of double-based time expression slowly decreases
>     with each passing year.  Not that it's really a practical
>     consideration -- just a weird feeling.
> 
>  - use of int64-as-microseconds is already extremely common
> 
>     This is another reason to avoid doubles.
> 
>  - who cares about stupid platforms?
> 
>     I spent a significant amount of time hand-wringing over
>     hypothetical "stupid" platforms that have the epoch of their
>     monotonic clock in biblical times.  Although that's a theoretical
>     possibility, I don't know of any such platforms and if they do
>     exist, we can cross that bridge when we get there.  Our primary
>     concern should be Linux -- and Linux is quite sane here.
> 
> The conclusion of all of this is one point: barring substantial
> complaints, the be-all and end-all of time in glib is going to be
> microseconds stored in a gint64.
> 
> GDateTime already calls this a "GTimeSpan".  That's OK.
> 
> 
> I will do the following:
> 
>  - rip out the newly introduced GTimeSpec and all APIs using it
> 
>  - deprecate GTimeVal and all APIs using it
> 
>  - replace both with APIs using an int64 of microseconds
> 
>  - deprecate GTimer
> 
>    with int64-based time, it's trivially easy to do it yourself
> 
> 
> Somewhat related (but not directly): I'd like to deprecate GDate (and
> the half-dozen or so related types) in favour of eventual replacement
> with a GDateTime-style immutable opaque structure.  I guess we could
> call that GDay (pronounced "g'day") for lack of a better available name,
> or we could wait until glib4 and replace GDate with a new GDate at that
> point.
> Probably we should consider support for non-Gregorian calendars before
> that point anyway (the fabled GCalendar API).
> 
> I'm going to start pushing this work into a branch today and I'll merge
> it to master quite soon unless there are objections.
> 

+1
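
On the double-vs-int64 point: the precision argument is easy to make
concrete.  A double spends its bits on order of magnitude, so a double
holding "seconds since the epoch" has only 52 mantissa bits left for
the value itself; around now (late 2010) the gap between adjacent
representable values is already about a quarter of a microsecond, and
it doubles every time the timestamp crosses a power of two.  A quick
back-of-envelope check in C -- the 1288600000 constant below is just
"roughly now", nothing magic:

    #include <math.h>
    #include <stdio.h>

    int
    main (void)
    {
      /* ~Nov 2010 as seconds-since-1970; any value of this size works */
      double now = 1288600000.0;

      /* gap between 'now' and the next representable double above it */
      double step = nextafter (now, now + 1.0) - now;

      printf ("resolution near 2010: %g s (~%.3f us)\n", step, step * 1e6);
      return 0;
    }

An int64 of microseconds, by contrast, is exact to the microsecond
across its entire range.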
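
And on dropping GTimer: agreed.  Assuming the replacement API ends up
looking something like a g_get_monotonic_time() that returns monotonic
microseconds in a gint64 (I'm guessing at the name here), the
do-it-yourself version really is trivial:

    #include <glib.h>

    int
    main (void)
    {
      gint64 start, elapsed_us;

      start = g_get_monotonic_time ();
      g_usleep (10000);              /* stand-in for real work: ~10 ms */
      elapsed_us = g_get_monotonic_time () - start;

      g_print ("elapsed: %" G_GINT64_FORMAT " us\n", elapsed_us);
      return 0;
    }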

I don't see any reason to have a replacement for GDate. Just add any extra functionality it provides to GDateTime -- which, with int64 microseconds, is good for roughly +/- 292,000 years from whatever its epoch date is, at microsecond resolution. That should be sufficient for everyone except astronomers, who have their own time libraries anyway.
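
For the curious, that range figure is just G_MAXINT64 microseconds
converted to years:

    #include <glib.h>

    int
    main (void)
    {
      /* G_MAXINT64 microseconds -> seconds -> Gregorian years */
      gdouble years = (gdouble) G_MAXINT64
                      / G_USEC_PER_SEC
                      / (86400.0 * 365.2425);

      g_print ("range: +/- %.0f years\n", years);   /* ~292277 */
      return 0;
    }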

As for bizarre OSes with epochs beginning at some time before the common era, well, if glib is ever ported to one, a simple ifdef can just add the correction value and be done with it. No worries.
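
To illustrate, the entire port-time fix could live in one spot.  This
is a hypothetical sketch only -- SOME_ODD_OS, odd_os_clock_usec() and
ODD_EPOCH_OFFSET_US are invented names, not any real platform or API:

    #include <glib.h>

    static gint64
    get_monotonic_usec (void)
    {
    #ifdef SOME_ODD_OS
      /* normalize the platform's far-off epoch once, at the source */
      return odd_os_clock_usec () - ODD_EPOCH_OFFSET_US;
    #else
      return g_get_monotonic_time ();
    #endif
    }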


Regards,
John Ralls



