[Scilab-users] {EXT} Re: horner test

Rafael Guerra jrafaelbguerra at hotmail.com
Fri Oct 28 13:06:18 CEST 2016


Hi again Christophe,

After revision, and to be fair to 'optim': if one uses the same fitting function (to compare apples to apples), then optim finds the solution accurately, but it is indeed much slower than linear regression:
Method#0 (optim):   time1= 0.015600 ; error= 0.027010
Method#1 (lin reg):  time2= 0.000000 ; error= 0.027010

See the modified code below:

////// START OF CODE
clear;
clf;

M = [
0.003       170000
0.0028      208980
0.0026      261261
0.0024      336000
0.0022      439530
0.002       570000
0.0018      785568
0.0016      1110375
0.0014      1667952
0.0012      2646000
0.001       4500000
0.0008      8812500
0.0006      20875000
0.0004      70500000 ];

x = M(:,1);
y = M(:,2);

function f = fmodel(a,x)
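     // model in log space: log(y) = a(1) + a(2)*log(x)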
     f = a(1) + a(2)*log(x);
endfunction

function f = costfun(a,x,y)
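    // cost: 2-norm of the residuals between the data and the model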
    f = norm(y - fmodel(a,x));
endfunction

timer();
// Method#0: optim
a0 = [1;-1];
ly = log(y);
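// NDcost wraps costfun (computing the gradient numerically) so optim can minimize it from a0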
[f,aop] = optim(list(NDcost,costfun,x,ly),a0);
t1 = timer();
// Method#1: linear regression
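// least-squares fit of log(y) = A(1)*log(x) + A(2), i.e. the power law y = exp(A(2))*x^A(1)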
A = [log(x), ones(x)]\log(y);
y2 = exp(A(2)).*x.^A(1);
t2 = timer();

str = sprintf("Scilab''s optim result:  y = %.4f*exp(%.3f*log(x))\n",exp(aop(1)),aop(2));
y1 = fmodel(aop,x);
plot(x,y1,"blue",x,log(y2),"--c",x,ly,"*r")
h = gca();
h.title.text= str;
h.title.font_size = 4;
xgrid
printf("\nMethod#0 (optim):   time1= %f ; error= %f",t1,norm(y1-ly))
printf("\nMethod#1 (lin reg):  time2= %f ; error= %f",t2,norm(log(y2)-ly))

////// END OF CODE

Regards,
Rafael

-----Original Message-----
From: users [mailto:users-bounces at lists.scilab.org] On Behalf Of Rafael Guerra
Sent: Friday, October 28, 2016 11:24 AM
To: Users mailing list for Scilab <users at lists.scilab.org>
Subject: Re: [Scilab-users] {EXT} Re: horner test

Hi Christophe,

The linear regression method and functions that you propose below do fit the data much better than the optim solution.

Thanks and regards,
Rafael

-----Original Message-----
From: users [mailto:users-bounces at lists.scilab.org] On Behalf Of Dang Ngoc Chan, Christophe
Sent: Friday, October 28, 2016 10:04 AM
To: Users mailing list for Scilab <users at lists.scilab.org>
Subject: Re: [Scilab-users] {EXT} Re: horner test

Hello,

> From: users [mailto:users-bounces at lists.scilab.org] On Behalf Of Rafael Guerra
> Sent: Thursday, October 27, 2016 20:23
>
> (note: your message was not published)
> […]
> function f = fmodel(a,x)
>     f = exp(a(1) + a(2)*x + a(3)*x.^a(4));
> endfunction
>
> function f = costfun(a,x,y)
>     f = norm(y - fmodel(a,x));
> endfunction
>
> a0 = [0;0;0;0];
> [f,aop]= optim(list(NDcost,costfun,x,y),a0);

As I didn't know why the message was not published, I answered directly to the sender (hope he/she got it).
I personally recommended first looking for a power function

A = [log(x), ones(x)]\log(y)

which gives a satisfactory result in itself.

plot(x, exp(A(2))*x.^A(1));
plot(x, y, "o");
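
To see the fitted exponent explicitly, one can simply print the coefficients (a minimal sketch, reusing A from the fit above):

mprintf("fitted power law:  y ~ %g * x^%g\n", exp(A(2)), A(1));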

Then, as the fitted power turns out to be close to -3, we can also try a 3rd-degree polynomial in 1/x:

invx = x.^-1;
X = [invx.^3, invx.^2, invx, ones(x)];
B = X\y;

plot(x, X*B);
plot(x, y, "o");
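
For a quick quantitative comparison, one could also look at the residual norms of the two fits (a rough sketch, assuming x, y, A, X and B are defined as above):

mprintf("power law : error = %f\n", norm(exp(A(2))*x.^A(1) - y));
mprintf("1/x cubic : error = %f\n", norm(X*B - y));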

All this is less straightforward and automated than your method, but maybe more stable (and faster?), as it only uses linear regression.

regards

--
Christophe Dang Ngoc Chan
Mechanical calculation engineer


