Re: Getting greatest decimal accuracy out of G_PI
- From: Ed Catmur <ed catmur co uk>
- To: gtk-list gnome org
- Subject: Re: Getting greatest decimal accuracy out of G_PI
- Date: Fri, 02 Feb 2007 22:29:09 +0000
On Fri, 2007-02-02 at 16:30 -0500, zentara wrote:
> Please pardon my basic questions, but I am self taught
> and run into questions I can't solve thru googling.
>
> I see that G_PI is defined in the header to
> 3.14159265358979323846
>
> So on my 32 bit Athlon machine, I get limited precision
> to 15 decimal places, do I need a 64 bit machine for better
> precision? Or is there a different format to use?
Google? Did you try Wikipedia?
double is 64 bits on essentially every platform you'll meet, 32-bit Athlon
included; its 52-bit mantissa (53 significant bits counting the implicit
leading 1) gives about 15-16 decimal digits. (Remember, 1 bit is roughly
0.3 bans, i.e. 0.3 decimal digits.) long double may have the precision
you're after; again, see Wikipedia.
Ed