Re: Getting greatest decimal accuracy out of G_PI



Before this discussion gets any further, maybe the original poster
could tell us what he intends to *do* with the highly accurate value
of pi he is after? I am certainly no expert on numerical computation,
but I know that using the wrong approach in numerical calculations
can quickly reduce any initial accuracy a *lot*. Just having a highly
accurate initial value of pi, as a long double or whatever, is pointless
if one then uses it in a silly way in calculations, or if the actual
*variable data* used in the calculations has much less accuracy.

--tml


