[scilab-Users] Academic question

Charles Basenga Kiyanda cbk at lanl.gov
Tue Jan 25 16:49:11 CET 2011


You could try a genetic-type algorithm, such as differential evolution 
(DE). I don't think there's a Scilab implementation (at the very least, 
I don't think there's one distributed with Scilab).

See information (and code for several ports) here:
http://www.icsi.berkeley.edu/~storn/code.html

Implementing the basic algorithm in Scilab (or in any language, for 
that matter) is quick and easy.
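To give an idea of how little code is involved, here is a minimal sketch of the classic DE/rand/1/bin scheme. It is written in Python for illustration (the loop structure translates almost line for line to Scilab), and the function name and default parameter values (population size, F, CR) are my own choices of typical values, not taken from the book:

```python
import random

def differential_evolution(cost, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin: minimize cost(x) within box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialise the population uniformly inside the bounds.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost_vals = [cost(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct members, all different from i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantees one mutated component
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    # Mutation: base vector plus scaled difference vector.
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip back into the box
                else:
                    v = pop[i][j]  # crossover: keep the parent's component
                trial.append(v)
            # Greedy selection: the trial replaces the parent only if better.
            trial_cost = cost(trial)
            if trial_cost <= cost_vals[i]:
                pop[i], cost_vals[i] = trial, trial_cost
    best = min(range(pop_size), key=lambda k: cost_vals[k])
    return pop[best], cost_vals[best]
```

Note that the cost function is only ever *evaluated*, never differentiated, which is exactly what you need when the cost comes from an FEA run rather than an analytic expression.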

There are several references on that page. I recommend the first few 
chapters of "Differential Evolution - A Practical Approach to Global 
Optimization" by Ken Price, Rainer Storn, and Jouni Lampinen.

Otherwise, look into algorithms such as 'simulated annealing'. I've 
heard good things about it, although I've lately been helping 
colleagues who relied on it for some minimization problems migrate to 
DE, which can be more robust and efficient.

Good luck,

Charles


On 01/25/2011 04:38 AM, Carrico, Paul wrote:
> Dear All,
>
> For some time I have been testing Scilab macros for optimization
> purposes; "optimization" is to be understood here as an inverse method
> for fitting parameters (i.e. so that the finite element analysis fits
> the test measurements).
>
> I have tested basic macros such as fminsearch for unbounded fitting,
> as well as the Nelder-Mead one for bracketed parameter fitting.
> Basically, the "cost function" is the normalized sum of squared errors
> (SSE) between the FEA results and the measurements.
>
> In my FEAs the variables can be material parameters, spring
> stiffnesses, damping ratios and so on: the simplex method is well
> suited, since these variables do not appear analytically in the cost
> function and since it is not necessary to compute the gradient vector
> or the Hessian matrix ...
>
> While this method is robust, it nevertheless remains rather slow!
>
> Is there a way, or another macro I can use, to reduce the number of
> loops and consequently the computation time?
>
> Note: I was thinking of the optim macro, but from my understanding the
> cost function needs to be at least twice differentiable (for the
> gradient and the Hessian), i.e. analytically linked to the
> parameters ... doesn't it?
>
> Thanks in advance for any advice (and for a better understanding)
>
> Paul
>
> --------------------------------------------------------------------------------
>
> This email and any files transmitted with it are confidential and intended solely for the use of the individual or entity to whom they are addressed. If you have received this email in error, please contact the sender and delete the email from your system. If you are not the named addressee you should not disseminate, distribute or copy this email.
>
