[Scilab-users] leastsq question : what is 'gopt' useful for?

Samuel Gougeon sgougeon at free.fr
Tue Mar 22 21:54:10 CET 2016


Hi,

On 22/03/2016 10:55, Stéphane Mottelet wrote:
> Hello,
>
> On 22/03/2016 10:41, antoine.monmayrant at laas.fr wrote:
>> Hi everyone,
>>
>> I have a very general and naive question concerning leastsq: what am 
>> I to do with "gopt", the "gradient of f at xopt"?
>>
>> Is there a way to link it to the confidence interval for each 
>> parameter of my fit?
> Not really, but since leastsq is a wrapper for optim, which returns 
> the gradient at the returned "optimal" solution, it is also returned. 
> However, if the final gradient is far from the zero vector, then all 
> confidence intervals based on the inverse of the Fisher matrix 
> (computed with the Jacobian) are meaningless, since these "linear" 
> statistics rely on an expansion whose first-order term (involving the 
> gradient) is assumed to vanish... Hence, having access to the final 
> gradient can be of interest.
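To make the point concrete, here is a minimal sketch (in Python/NumPy for portability; the same computation translates directly to Scilab). The model, data, and the Gauss-Newton loop standing in for leastsq/optim are all hypothetical; the point is the final check that the gradient (what gopt reports) is near zero before trusting the Fisher-matrix intervals:

```python
import numpy as np

# Hypothetical example: fit y = a*exp(b*t) to noisy data, then build
# first-order ("linear") confidence intervals from the Jacobian.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
true_p = np.array([2.0, -1.5])
y = true_p[0] * np.exp(true_p[1] * t) + 0.01 * rng.standard_normal(t.size)

def residuals(p):
    return p[0] * np.exp(p[1] * t) - y

def jacobian(p):
    # d r_k / d p_j, analytic for this simple model
    J = np.empty((t.size, 2))
    J[:, 0] = np.exp(p[1] * t)
    J[:, 1] = p[0] * t * np.exp(p[1] * t)
    return J

# Gauss-Newton iteration, a stand-in here for leastsq/optim
p = np.array([1.0, -1.0])
for _ in range(50):
    J, r = jacobian(p), residuals(p)
    p = p - np.linalg.solve(J.T @ J, J.T @ r)

r, J = residuals(p), jacobian(p)
grad = 2.0 * J.T @ r      # gradient of sum(r**2): this is what gopt reports

# The "linear" statistics below are only meaningful if this gradient ~ 0:
assert np.linalg.norm(grad) < 1e-6

# Fisher-matrix-based covariance and 95% half-widths
sigma2 = (r @ r) / (t.size - p.size)
cov = sigma2 * np.linalg.inv(J.T @ J)
half_width = 1.96 * np.sqrt(np.diag(cov))
print("p =", p, " 95% half-widths =", half_width)
```

If the assertion on the gradient norm fails (e.g. the iteration stopped early, or on a bound), the covariance computed afterwards should not be trusted.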

Note also that, if bound constraints are set on x, a non-zero gradient 
can be returned when xopt lies on the boundary, where fun() has no true 
minimum, just a descending slope cut off at a low value.

>> For the moment, I know how to estimate these confidence intervals 
>> when I have access to the Jacobian matrix of my fit function.
I am not sure that we are speaking about the same Jacobian.
On one hand, we have some coordinates c(i) (say, of space: c(1)=x, 
c(2)=y, etc.) on which fun() depends. On the other hand, we have 
parameters p(j) on which fun() also depends.
As good quiet parameters, the p(j) have fixed values, whereas the c(i) 
are varied by leastsq().

AFAIU, Stéphane's answer assumes that the x "passed" to leastsq() is the 
full set gathering {c(i)} AND {p(j)}.
Whereas, still AFAIU, you seem interested in getting the sensitivity of 
fun() with respect to each parameter p(j) around the minimum value of 
fun({c(i)}) (the parameters p(j) being fixed in the fun() definition).
Here, I don't think we can speak of more than sensitivity. "Confidence" 
is not the proper term, since the parameter values p(j) are 
deterministic and fixed: they are not random variables.
To assess this sensitivity, you will need the Jacobian of fun(), BUT 
with respect to the p(j), not with respect to the c(i)!
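This Jacobian with respect to the parameters can be estimated by finite differences while holding the coordinates fixed. A minimal sketch (Python/NumPy; fun, c, and p are hypothetical names, not anything leastsq defines):

```python
import numpy as np

# Hypothetical fun(c, p): depends on coordinates c AND parameters p.
def fun(c, p):
    return p[0] * np.sin(c[0]) + p[1] * c[1] ** 2

def jacobian_wrt_p(fun, c, p, h=1e-6):
    """Forward-difference Jacobian of fun with respect to the parameters
    p, the coordinates c being held fixed (the opposite of what
    leastsq varies)."""
    base = fun(c, p)
    J = np.empty(len(p))
    for j in range(len(p)):
        dp = p.copy()
        dp[j] += h
        J[j] = (fun(c, dp) - base) / h
    return J

c = np.array([0.5, 2.0])
p = np.array([3.0, 1.0])
Jp = jacobian_wrt_p(fun, c, p)   # here ~ [sin(0.5), c(2)^2] = [0.479..., 4.0]
print(Jp)
```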

To get what you want, I would suggest running leastsq() with 
x = {c(i)} U {p(j)}.
Then, unless the optimum is reached on a boundary, the absolute value of 
the derivative of order *2* of fun() along each p(j) direction, 
evaluated at xopt, will be related to the "confidence interval" (unless 
the derivative of order 2 vanishes as well... then order 3, etc.).
A more pragmatic way to get this interval could be to evaluate fun() 
around xopt, varying (with grand()) only the p(j) component of xopt 
whose confidence you want to assess, and measuring the spread of fun()'s 
values.
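The pragmatic approach above can be sketched as a small Monte Carlo loop (Python/NumPy, with grand() replaced by NumPy's random generator; fun and xopt are hypothetical placeholders):

```python
import numpy as np

# Hypothetical objective with a known minimum at xopt = (1, 2), and
# deliberately 10x stiffer along the second component.
def fun(x):
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.0) ** 2

xopt = np.array([1.0, 2.0])
rng = np.random.default_rng(42)

def spread(fun, xopt, j, scale=0.1, n=2000):
    """Std. deviation of fun() when only component j of xopt is
    jittered by a Gaussian of the given scale."""
    vals = []
    for _ in range(n):
        x = xopt.copy()
        x[j] += scale * rng.standard_normal()
        vals.append(fun(x))
    return np.std(vals)

s0, s1 = spread(fun, xopt, 0), spread(fun, xopt, 1)
print(s0, s1)   # s1 should be roughly 10x s0: fun is stiffer along p(2)
```

A large spread along a direction means the objective is sensitive to that component, i.e. a tight "interval"; a small spread means the component is poorly determined.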

>> Could "gopt" be of some use to estimate the confidence intervals when 
>> the Jacobian matrix is not known?
As Stephane said, in no way.

HTH
Samuel
