[Scilab-users] interp and memory
Federico Miyara
fmiyara at fceia.unr.edu.ar
Mon Mar 8 09:33:18 CET 2021
Jean-Yves,
It seems the variables are not so huge: xp has 1,000,000 components, as
does z, while x and y have 10,000 components each.
But I guess splin() obtains the derivatives by solving a linear system
of equations, in this case 10,000 x 10,000. Even though the system's
matrix is tridiagonal (only the diagonal, sub-diagonal and
super-diagonal components differ from 0), which surely reduces the
computational load, it is still quite a large system, and the code
probably gets stuck here.
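(As a side note, a tridiagonal system of that size should be tractable with Scilab's sparse solver; here is a rough, untested sketch, where the matrix values are purely illustrative and not the actual spline coefficients:)

```scilab
// Build an n-by-n tridiagonal matrix in sparse form and solve it.
// The values 4 and 1 are placeholders, not splin()'s real system.
n  = 10000;
ij = [(1:n)' (1:n)'; (1:n-1)' (2:n)'; (2:n)' (1:n-1)'];
v  = [4*ones(n,1); ones(n-1,1); ones(n-1,1)];
A  = sparse(ij, v, [n n]);  // only ~3n nonzeros stored
b  = rand(n, 1);
c  = A \ b;                 // sparse solve, far cheaper than a full 10000x10000 system
```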
A lighter (and probably better) way to do what you seem to be looking
for is to resample your x-y data by a factor of 100 using intdec().
However, doing that 1000 times may still take quite a long time (I
haven't tested it). If you really need it, it would probably be better
to implement the oversampling algorithm from scratch, so that the
smoothing filter is designed only once instead of letting intdec()
design the same filter over and over again.
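For instance, something along these lines (a sketch only; the factor of 100 matches your code, but I haven't run it):

```scilab
// Upsample a 10000-point signal by a factor of 100 with intdec(),
// as a lighter alternative to splin()/interp() on the full grid.
n = 1e6;
x = 1:100:n;       // 10000 sample points
y = rand(x);       // the coarse data
z = intdec(y, 100); // rate change by 100: about 1e6 output points
```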
Regards,
Federico Miyara
On 08/03/2021 04:33, Jean-Yves Baudais wrote:
> Hello,
>
> Is my code wrong or is there a real memory problem with the function
> interp in the following code? (Scilab fills up the memory and of
> course stops.)
>
> for i=1:1000
>   mprintf("%d\n", i);
>   n = 1e6;
>   xp = 1:n;
>   x = 1:100:n;
>   y = rand(x);
>   d = splin(x, y);
>   z = interp(xp, x, y, d);
> end
>
> Thanks,
>
> Jean-Yves
> _______________________________________________
> users mailing list
> users at lists.scilab.org
> http://lists.scilab.org/mailman/listinfo/users
>