

From: Max Belushkin
Subject: Re: SV: [Help-gsl] On conjugate gradient algorithms in multidimensional minimisation problems.
Date: Fri, 16 Dec 2005 21:35:40 +0100
User-agent: Thunderbird 1.5 (X11/20051025)

Brian Gough wrote:
> Just for clarification, what type of function are you minimising?

It's a standard chi squared of a model function (which is quite complicated, but has only 9 parameters for the problem in this post). The model itself is, technically, a sum of "a/(b+x)" terms. The chi squared is computed in the standard way from this model, the data, and the errors on the data. It is this chi squared that is fed to the fit, and the gradient is computed numerically in each parameter; a sketch of the setup follows below.
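
Roughly, the setup looks like the following minimal sketch using GSL's gsl_multimin_fdfminimizer interface with the Fletcher-Reeves conjugate gradient minimizer (gsl_multimin_fdfminimizer_conjugate_fr). The data points, the toy model, the starting point, and the step/tolerance values below are placeholders for illustration, not the actual problem from this post; the numerical gradient is done here by central differences.

#include <stdio.h>
#include <gsl/gsl_errno.h>
#include <gsl/gsl_vector.h>
#include <gsl/gsl_multimin.h>

#define NPAR  9   /* number of model parameters, as in the post */
#define NDATA 3   /* hypothetical: a few data points just for illustration */

/* Hypothetical data (x_i, y_i, sigma_i); the real data come from the fit. */
static const double xd[NDATA] = { 0.5, 1.0, 2.0 };
static const double yd[NDATA] = { 1.8, 1.1, 0.6 };
static const double sd[NDATA] = { 0.1, 0.1, 0.1 };

/* Toy model: a sum of a/(b+x) terms built from consecutive parameter pairs
   (the real model is more complicated; the 9th parameter is unused here). */
static double model(const gsl_vector *p, double x)
{
  double sum = 0.0;
  size_t k;
  for (k = 0; k + 1 < NPAR; k += 2)
    sum += gsl_vector_get(p, k) / (gsl_vector_get(p, k + 1) + x);
  return sum;
}

/* Chi squared computed in the standard way from model, data and errors. */
static double chisq_f(const gsl_vector *p, void *params)
{
  double chi2 = 0.0;
  size_t i;
  (void) params;
  for (i = 0; i < NDATA; i++) {
    double r = (yd[i] - model(p, xd[i])) / sd[i];
    chi2 += r * r;
  }
  return chi2;
}

/* Numerical gradient by central differences in each parameter. */
static void chisq_df(const gsl_vector *p, void *params, gsl_vector *g)
{
  const double h = 1e-6;
  gsl_vector *q = gsl_vector_alloc(p->size);
  size_t j;
  gsl_vector_memcpy(q, p);
  for (j = 0; j < p->size; j++) {
    double pj = gsl_vector_get(p, j), fp, fm;
    gsl_vector_set(q, j, pj + h);  fp = chisq_f(q, params);
    gsl_vector_set(q, j, pj - h);  fm = chisq_f(q, params);
    gsl_vector_set(q, j, pj);
    gsl_vector_set(g, j, (fp - fm) / (2.0 * h));
  }
  gsl_vector_free(q);
}

static void chisq_fdf(const gsl_vector *p, void *params, double *f, gsl_vector *g)
{
  *f = chisq_f(p, params);
  chisq_df(p, params, g);
}

int main(void)
{
  const gsl_multimin_fdfminimizer_type *T = gsl_multimin_fdfminimizer_conjugate_fr;
  gsl_multimin_fdfminimizer *s = gsl_multimin_fdfminimizer_alloc(T, NPAR);
  gsl_vector *x = gsl_vector_alloc(NPAR);
  gsl_multimin_function_fdf func;
  size_t iter = 0;
  int status;

  func.f = &chisq_f;
  func.df = &chisq_df;
  func.fdf = &chisq_fdf;
  func.n = NPAR;
  func.params = NULL;

  gsl_vector_set_all(x, 1.0);                /* hypothetical starting point */
  gsl_multimin_fdfminimizer_set(s, &func, x, 0.01, 1e-4);

  do {
    iter++;
    status = gsl_multimin_fdfminimizer_iterate(s);
    if (status) break;                       /* e.g. the line search made no progress */
    status = gsl_multimin_test_gradient(s->gradient, 1e-3);
  } while (status == GSL_CONTINUE && iter < 1000);

  printf("chi^2 = %g after %zu iterations (status %d)\n", s->f, iter, status);

  gsl_multimin_fdfminimizer_free(s);
  gsl_vector_free(x);
  return 0;
}

It compiles with something like "gcc file.c -lgsl -lgslcblas -lm". Swapping in gsl_multimin_fdfminimizer_conjugate_pr or gsl_multimin_fdfminimizer_vector_bfgs only changes the single line that picks T.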

Back to your question - I'm minimizing a real function without any singularities in the region of minimization. I don't see why the exact form of a continuous, non-oscillatory function should affect the performance of conjugate gradient algorithms, when even a simple random walk algorithm does at least part of the job.



