[Scilab-users] scilab memory usage

paul.carrico at free.fr
Sun Jan 5 10:45:29 CET 2014


Hi

Have a look at the "csvread" function to read txt/csv files very quickly, even huge files with millions of lines ...
... but I have not figured out what exactly you want to do.
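A minimal sketch of what I mean (the filename and separator here are just assumptions, adapt them to your files):

```scilab
// Read a numeric CSV file into a matrix in one call
// (default separator is ",")
M = csvread("data.csv");

// Or specify the separator and decimal mark explicitly,
// e.g. for ";"-separated files with "." decimals
M = csvread("data.csv", ";", ".");

disp(size(M));   // rows x columns actually read
```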

Paul

----- Original Message -----
From: "Stephan Schreckenbach" <step.schr19 at yahoo.de>
To: users at lists.scilab.org
Sent: Saturday, January 4, 2014 23:31:28
Objet: [Scilab-users] scilab memory usage




Hi, 
I want to write a script that reads data from txt or csv files and does some 
calculations on them. In the future I might evaluate large data files, 
run computation-intensive parameter optimizations, and do some loops with a high number of one-dimensional calculations. 
Therefore I worry about speed and memory usage. 
I run Windows XP with 3 GB RAM on an Intel Core 2 Duo. 



Since Scilab is dynamically typed and does not require variable declarations, how much memory is required for each constant or variable, be it a string, an integer, or a double? 
I still wonder whether I should script in Excel VBA or in Scilab. 
I will store my results in Excel anyway, so it might therefore be easier to use VBA. 
From what I have read, Scilab can use up to 2 GB of RAM, VBA only 500 MB. 
But in VBA data types can be declared, which saves some memory. 
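(Declared types are not exclusive to VBA. As a rough sketch of what I understand of Scilab's behavior, numbers are stored as doubles, 8 bytes per element, by default, but explicit integer types can be requested to reduce memory; the sizes in the comments are my own estimates:)

```scilab
// By default every numeric matrix is stored as doubles (8 bytes/element)
d = zeros(1000, 1000);        // 10^6 doubles -> roughly 8 MB

// Explicit integer types use less memory per element
i = int8(zeros(1000, 1000));  // 10^6 int8 values -> roughly 1 MB

// typeof() reveals the storage type
disp(typeof(d));              // "constant" (double matrix)
disp(typeof(i));              // "int8"
```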

Scilab can do multithreading, but not on Windows, which is what I use. 
VBA can't do it either. 
VBA might have a larger user community and knowledge base; Scilab might be easier to script. 



I would like to start out with either Scilab or VBA and then stick to one of them, due to the large amount of code 
and subscripts to write. 



So, what are your suggestions? 
Thanks very much in advance for your help! 



_______________________________________________
users mailing list
users at lists.scilab.org
http://lists.scilab.org/mailman/listinfo/users
