Re: On the usefulness of g_ascii_dtostr



On Tue, 2007-05-15 at 22:20 -0400, Tim Evans wrote:
> g_ascii_dtostr converts a double to a string with enough precision that 
> if the string is converted back to a double the value will be the same. 
>   This is, I imagine, an operation that is commonly needed, and often
> done incorrectly when people just use "%g" in a printf.
> 
> I think that g_ascii_dtostr could be a lot more useful if it added 
> another constraint: that it will produce the *shortest* string that 
> fulfills it's current design.  For example, rather than 
> g_ascii_dtostr(1.1) producing "1.1000000000000001" it would produce 
> "1.1".  This functionality is found in other languages, such as 
> Double.toString in Java.
> 
> I've attached two files: dtoa.c and g_fmt.c.  The first is the 
> code that seems to be used by everyone else who does this task; the 
> second is basically an example of how to use it to produce "%g"-type 
> output.  The files are under an MIT-like license.
> 
> Does anyone think that this type of function would be useful in GLib?
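
For reference, a quick demonstration of the behaviour described above
(a rough sketch; the exact digit strings assume IEEE 754 doubles):

  #include <glib.h>
  #include <stdio.h>

  int
  main (void)
  {
    gchar buf[G_ASCII_DTOSTR_BUF_SIZE];

    /* printf's "%g" keeps only 6 significant digits (and is
     * locale-dependent), so it does not round-trip: */
    printf ("%g\n", 0.1 + 0.2);   /* prints "0.3" -- lossy */

    /* g_ascii_dtostr() round-trips, but prints every digit: */
    printf ("%s\n", g_ascii_dtostr (buf, sizeof buf, 1.1));
    /* prints "1.1000000000000001" */

    return 0;
  }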

I totally agree with you on the usefulness of what you suggest.
However, a complete implementation of dtostr in GLib would be
wasteful, impractical, and problematic; you can see that already in
the size of the implementation you sent.

I can think of a completely different technique that works most of the
time and is much, much simpler: detect that by adding an undetectable
amount of error you can round "1.1000000000000001" down by one digit.
When you do that, it goes straight to the desired "1.1".  The same
applies to .999999999999 sequences.
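
One way to sketch that idea in code: reformat with increasing
precision and stop as soon as the string parses back to the same
double, i.e. as soon as the rounding error has become undetectable.
(A rough sketch only; shortest_dtostr is a hypothetical helper, not
existing GLib API, and NaN would need special-casing.)

  #include <glib.h>

  static gchar *
  shortest_dtostr (gchar *buf, gint buf_len, gdouble d)
  {
    gint prec;

    for (prec = 1; prec <= 17; prec++)
      {
        gchar format[8];

        /* try "%.1g", "%.2g", ... until the result round-trips */
        g_snprintf (format, sizeof format, "%%.%dg", prec);
        g_ascii_formatd (buf, buf_len, format, d);
        if (g_ascii_strtod (buf, NULL) == d)
          return buf;
      }

    /* not reached for finite doubles; full precision as fallback */
    return g_ascii_dtostr (buf, buf_len, d);
  }

For 1.1 this exits at precision 2 with "1.1", since that string
already converts back to exactly the same double.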

-- 
behdad
http://behdad.org/

"Those who would give up Essential Liberty to purchase a little
 Temporary Safety, deserve neither Liberty nor Safety."
        -- Benjamin Franklin, 1759





