[Scilab-users] Iterative regression

samaelkreutz mariajovera at icloud.com
Wed Mar 12 08:14:00 CET 2014


Hello!
I'm trying to fit a model by linear regression over M bins and then do a
standard regression. My model has the following form:

log(PGA) = d1*e1 + d2*e2 + ... + d(max number of M bins)*e(max number of M bins)
           + a*log(R/Rref) + b*(R - Rref)

with one dummy variable per M bin, where d1, d2, etc. = 1 for data falling
within M bins 1, 2, etc. and 0 otherwise.
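
For instance, the dummy part of the design matrix can be built like this (just
a sketch: here "bin" is assumed to be a column vector holding the M-bin index,
1 to 14, of each record):

// bin(i) = index of the M bin that record i falls in (assumed given)
nobs  = size(bin, 1);          // number of records
nbins = 14;                    // number of M bins
W = zeros(nobs, nbins);
for i = 1:nobs
    W(i, bin(i)) = 1;          // dummy = 1 only in the record's own bin
end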


What I've done:
For the dummy variables I built a matrix of weights, W. I have 14 bins...


// Term a*log(R/Rref)
A = log(R ./ Rref);

// Term (R - Rref)
B = R - Rref;            // Rref is a scalar, so it is subtracted from every element of R

// Build the design matrix and solve the least-squares problem
MI = [W A B];

// The coefficients x = [coef(1...14); a; b]
x = MI \ log(PGA);

This gives me 14 constants, one for each bin, plus my coefficients "a" and "b".
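
To see how well this one-shot solution fits, I can look at the residuals (just
a quick check, nothing more):

// quick check of the one-shot fit: residuals of log(PGA)
res = log(PGA) - MI*x;
mprintf("RMS residual = %g\n", sqrt(mean(res.^2)));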

My question is: I obtained the following coefficients, but some of them don't
fit well. My teacher said: "Make a simple regression, obtain the coefficients,
and then iterate again to get the final coefficients."
X =-4.6946
   -4.6215
   -4.3964
   -4.2399
   -3.8835
   -3.6527
   -3.5499
   -3.4174
   -3.3223
   -3.0215
   -2.7988
   -2.4148
   -2.4318
   -2.0816
   -0.0003

But I don't know how to do that. It is necessary to include all the bins,
because all of them are linked; otherwise I could fit a specific one and find
"the best coefficients", but that isn't the case.
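
One thing I could imagine (I'm not sure this is what my teacher means) is an
alternating fit that keeps all the bins in the problem: re-fit the 14 bin
constants with a and b held fixed, then re-fit a and b with the bin constants
held fixed, and repeat until nothing changes. A sketch, reusing W, A, B and
the one-shot solution x from above:

y  = log(PGA);
d  = x(1:14);                  // bin constants from the one-shot solve
ab = x(15:16);                 // [a; b] from the one-shot solve
for it = 1:100
    d_new  = W \ (y - [A B]*ab);       // step 1: a, b fixed, re-fit the bin constants
    ab_new = [A B] \ (y - W*d_new);    // step 2: bin constants fixed, re-fit a and b
    if norm([d_new; ab_new] - [d; ab]) < 1d-10 then
        d = d_new; ab = ab_new;
        break
    end
    d = d_new; ab = ab_new;
end
x_iter = [d; ab];              // refined coefficient vector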

I need some ideas. I was thinking of linear regression with gradient descent,
but I only found examples with 2 variables and I have 14 u_u
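
From what I understand, the gradient-descent update is the same whatever the
number of columns, so the 2-variable examples should generalise to my 16
columns (14 dummies + a + b). A rough sketch (the step size "alpha" below is
just an illustration and would have to be tuned; scaling the columns of MI
first would probably help):

y = log(PGA);
[n, p] = size(MI);             // p = 16 columns: 14 dummies + a + b
theta  = zeros(p, 1);          // starting point
alpha  = 1d-4;                 // step size (illustrative value only)
for it = 1:10000
    grad  = (MI' * (MI*theta - y)) / n;   // gradient of the cost (1/(2n))*||MI*theta - y||^2
    theta = theta - alpha*grad;           // one descent step on all p coefficients
end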










