Re: Getting greatest decimal accuracy out of G_PI



On Fri, 2 Feb 2007 23:43:05 +0100
David Nečas (Yeti) <yeti physics muni cz> wrote:

>On Fri, Feb 02, 2007 at 04:30:17PM -0500, zentara wrote:
>> Please pardon my basic questions, but I am self taught
>> and run into questions I can't solve thru googling.
>
>The problem of this question is rather that it's completely
>unrelated to Gtk+.

Well, I'm wondering why the header defines G_PI to 49 decimal places
(if I counted right) when the biggest floating-point type Gtk2 uses only
holds about 15.  So I'm wondering whether someone knew of something in
Gtk2 that could handle a number that precise?
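
Here is a minimal sketch of what I mean (just my own test, assuming gcc
and the GLib headers; compiled with something like
gcc pi.c $(pkg-config --cflags glib-2.0) -o pi):

  /* G_PI is an unsuffixed floating constant, so it has type double no
   * matter how many digits appear in the header; only about 15-16
   * significant decimal digits survive the assignment. */
  #include <glib.h>
  #include <stdio.h>

  int main (void)
  {
      double pi = G_PI;

      /* Digits past roughly the 16th no longer match pi -- they are
       * just the decimal expansion of the nearest representable double. */
      printf ("%.50f\n", pi);
      return 0;
  }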

>> I see that G_PI is defined in the header to
>> 3.14159265358979323846
>
>I wonder where you found this.  gtypes.h defines G_PI:
>
>  #define G_PI    3.1415926535897932384626433832795028841971693993751

Yeah, you are right.  I copied it off

 glib-2.12.9/docs/reference/glib/html/glib-Numerical-Definitions.html

correctly, but I must have made a cut-and-paste typo somewhere.  Sorry.
It was still greater precision than I could get with long double.

>
>The standard IEEE 754 double precision floating point type
>has 64 bits, of which 52 bits is mantissa, that's 15-16
>decimal digits.  The standard type obviously has the same
>precision everywhere because if it differed it would not be
>the standard type any more.
>
>Both i386 (or rather its 387 math coprocessor) and x86_64
>are capable of calculation with 80bit floating point
>numbers, mantissa is 64 bits, that's 19-20 decimal digits.
>These numbers are available as the `long double' type.  long
>double is implementation-defined; it can look different on
>other platforms.
>
>Yeti

Thank you for your help.  After looking through a bunch of comp.lang.c
posts where this was asked, the best answer I saw was

"C only guarantees 10 digits of precision with double and long double.
Since IEEE representations have become widespread 15 or so is common
but the requirement is still only 10. "

And that is what I'm seeing..... 15 decimal places of accuracy.
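
A quick way to check what the local C implementation actually promises
is <float.h>; just a sketch in plain standard C, nothing GLib-specific:

  /* DBL_DIG and LDBL_DIG give the number of decimal digits that double
   * and long double preserve.  The C standard only requires 10, but
   * with IEEE 754 doubles DBL_DIG is typically 15, and the x87 80-bit
   * long double usually reports LDBL_DIG of 18. */
  #include <float.h>
  #include <stdio.h>

  int main (void)
  {
      printf ("DBL_DIG      = %d\n",  DBL_DIG);
      printf ("LDBL_DIG     = %d\n",  LDBL_DIG);
      printf ("DBL_EPSILON  = %g\n",  DBL_EPSILON);
      printf ("LDBL_EPSILON = %Lg\n", LDBL_EPSILON);
      return 0;
  }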

It also answered a question lurking in the back of my mind: why
Fortran is still so widely used in numerical computation instead
of C.

zentara


-- 
I'm not really a human, but I play one on earth.
http://zentara.net/japh.html


