<html>
<head>
<meta content="text/html; charset=windows-1252"
http-equiv="Content-Type">
</head>
<body text="#000000" bgcolor="#FFFFFF">
<div class="moz-cite-prefix">Hi,<br>
<br>
On 22/03/2016 10:55, Stéphane Mottelet wrote:<br>
</div>
<blockquote cite="mid:56F11692.2090508@utc.fr" type="cite">Hello,
<br>
<br>
On 22/03/2016 10:41, <a class="moz-txt-link-abbreviated" href="mailto:antoine.monmayrant@laas.fr">antoine.monmayrant@laas.fr</a> wrote:
<br>
<blockquote type="cite">Hi everyone,
<br>
<br>
I have a very general and naive question concerning leastsq:
what am I to do with "gopt", the "gradient of f at xopt"?
<br>
<br>
Is there a way to link it to the confidence interval for each
parameter of my fit?
<br>
</blockquote>
Not really, but since leastsq is a wrapper for optim, which
returns the gradient at the returned "optimal" solution, leastsq
returns it as well. However, if the final gradient is far from
the zero vector, then any confidence interval based on the
inverse of the Fisher matrix (computed from the Jacobian) is
meaningless, since these "linear" statistics rely on an
expansion whose first-order term (involving the gradient) is
supposed to vanish... Hence, having access to the final gradient
can be of interest.
<br>
</blockquote>
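For what it's worth, those "linear" statistics can be sketched as follows. This is a toy illustration in Python/NumPy rather than Scilab, with a made-up linear model (so the Jacobian is exact and the final gradient really does vanish), using the standard formula cov(p) ~ s^2 * inv(J'J) with J the Jacobian of the residuals at the optimum:

```python
import numpy as np

# Made-up linear model y = a + b*t fitted by least squares.
# For residuals r(p) with Jacobian J = dr/dp at p_opt, the linear statistics give
#   cov(p_opt) ~ s^2 * inv(J' J),   s^2 = sum(r^2) / (n - m).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
a_true, b_true = 1.0, 0.5
y = a_true + b_true * t + 0.1 * rng.standard_normal(t.size)

J = np.column_stack([np.ones_like(t), t])      # Jacobian of the model wrt (a, b)
p_opt, *_ = np.linalg.lstsq(J, y, rcond=None)  # least-squares estimate
r = y - J @ p_opt                              # residuals at the optimum
s2 = (r @ r) / (t.size - p_opt.size)           # residual variance estimate
cov = s2 * np.linalg.inv(J.T @ J)              # Fisher-matrix-based covariance
half_width = 1.96 * np.sqrt(np.diag(cov))      # ~95% confidence half-widths

print(p_opt, half_width)
```

For a nonlinear model the same formula holds only near a true interior optimum, which is exactly why a non-vanishing final gradient invalidates it.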
<br>
We may add that, if bound constraints are set on x, a non-zero
gradient can also be returned when xopt lies on a point of the
boundary where fun() has no true minimum, just a descending slope
that the bounds interrupt at a low value. <br>
<br>
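To illustrate that boundary effect with a made-up one-dimensional example (Python/NumPy, not from the thread): minimizing f(x) = (x+1)^2 over the bounds [0, 1] stops at x = 0, where the gradient is 2, not 0.

```python
import numpy as np

# Made-up example: f(x) = (x + 1)^2 restricted to [0, 1].
# The unconstrained minimum is at x = -1, outside the bounds, so the
# constrained "optimum" sits on the boundary with a non-zero gradient.
xs = np.linspace(0.0, 1.0, 101)
f = (xs + 1.0) ** 2
x_opt = xs[np.argmin(f)]      # 0.0: the lower bound
g_opt = 2.0 * (x_opt + 1.0)   # gradient there: 2.0, far from zero
print(x_opt, g_opt)
```

So a large returned gopt does not necessarily mean the optimizer failed; it may simply mean the solution is pinned to a bound.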
<blockquote cite="mid:56F11692.2090508@utc.fr" type="cite">
<blockquote type="cite">For the moment, I know how to estimate
these confidence intervals when I have access to the Jacobian
matrix of my fit function.
<br>
</blockquote>
</blockquote>
I am not sure that we are speaking about the same Jacobian. <br>
On one hand, we have some coordinates c(i) (say, spatial ones:
c(1)=x, c(2)=y, etc.) <br>
on which fun() depends. On the other hand, we have parameters p(j)
on which fun() also depends.<br>
Being good quiet parameters, the p(j) have fixed values, whereas the
c(i) are varied by leastsq().<br>
<br>
AFAIU, Stéphane's answer assumes that the x "passed" to leastsq() is
the full set gathering {c(i)} AND {p(j)}.<br>
Whereas, still AFAIU, you seem interested in the sensitivity of
fun() with respect to each <br>
parameter p(j) around the minimum value of fun({c(i)}) (the
parameters p(j) being fixed in the fun() <br>
definition).<br>
Here, I don't think we can speak of more than sensitivity.
"Confidence" is not the proper term, <br>
since the parameter values p(j) are deterministic and fixed: they
are not random variables.<br>
To assess this sensitivity, you will need the Jacobian of fun(), BUT
with respect to the p(j), not wrt the c(i)!<br>
<br>
To get what you want, I would suggest running leastsq() with x =
{c(i)} U {p(j)}.<br>
Then, unless the optimum is reached on a boundary, the absolute
value of the derivative of order <br>
<b>2</b> of fun() along each p(j) direction, evaluated at xopt, will
be related to the "confidence interval"<br>
(unless the derivative of order 2 vanishes as well... then the one
of order 3, etc.).<br>
A more pragmatic way to get this interval could be to evaluate fun()
around xopt, varying<br>
(with grand()) only the p(j) component of xopt whose confidence you
want, and measuring the <br>
spread of fun()'s values.<br>
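A rough sketch of that pragmatic check, in Python/NumPy (np.random playing the role of Scilab's grand(), and a hypothetical quadratic cost standing in for fun() near xopt):

```python
import numpy as np

# Hypothetical cost around a fitted optimum p_opt = (2.0, 0.8):
# the second parameter has 10x the curvature, hence more sensitivity.
def cost(p):
    return (p[0] - 2.0) ** 2 + 10.0 * (p[1] - 0.8) ** 2

rng = np.random.default_rng(1)
p_opt = np.array([2.0, 0.8])
sigma = 0.05                  # size of the probing perturbation

spreads = []
for j in range(2):            # perturb one component of p_opt at a time
    samples = []
    for _ in range(1000):
        p = p_opt.copy()
        p[j] += sigma * rng.standard_normal()
        samples.append(cost(p))
    spreads.append(np.std(samples))

print(spreads)                # larger spread => more sensitive parameter
```

With this toy cost, the spread along the second direction comes out clearly larger, reflecting the larger curvature along it.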
<br>
<blockquote cite="mid:56F11692.2090508@utc.fr" type="cite">
<blockquote type="cite">Could "gopt" be of some use to estimate
the confidence intervals when the Jacobian matrix is not known?
<br>
</blockquote>
</blockquote>
As Stéphane said, not at all.<br>
<br>
HTH<br>
Samuel<br>
<br>
</body>
</html>