## [Help-gsl] gsl_sf_beta efficiency question

**From**: Helfer Thomas

**Subject**: [Help-gsl] gsl_sf_beta efficiency question

**Date**: Fri, 19 Jan 2007 18:57:34 +0100

Hi,
I need to use the Beta function with GSL version 1.6. I have tried to
compute it either by calling gsl_sf_beta directly or by using the relation
Beta(w,z) = Gamma(w)*Gamma(z)/Gamma(w+z) with calls to gsl_sf_gamma.
I measured the CPU time of each solution using the rdtsc
instruction (my processor is an Intel P4).
The results were quite surprising: computing the Beta function via the
Gamma function (gsl_sf_gamma) seems to be about twice as fast as
calling the Beta function (gsl_sf_beta) directly.
To compare the CPU times, I ran a small loop (10 iterations) calling
each solution. Here are the results:
CPU frequency: 3.19213e+09 Hz

| Loop | Beta (cycles) | Gamma (cycles) |
|------|---------------|----------------|
| 1    | 150736        | 11936          |
| 2    | 8632          | 4896           |
| 3    | 8280          | 4760           |
| 4    | 8280          | 4752           |
| 5    | 8288          | 4752           |
| 6    | 8536          | 4760           |
| 7    | 8296          | 4760           |
| 8    | 8280          | 4760           |
| 9    | 8280          | 4760           |
| 10   | 8280          | 4760           |
I'm also surprised to notice that the first iteration costs much more CPU
time than the following ones.
I've attached my C++ test to this mail.
Regards,
Helfer Thomas

*Attachment:* `gsl_test_beta.cxx` (C++ source)
