[Scilab-users] large csv text file management issue

David Chèze david.cheze at cea.fr
Tue Feb 12 16:01:57 CET 2013


Hi all!

I'm puzzling over a large CSV text file (900 MB, 2628001 rows, 11 columns) from
which I would like to extract some data. I tried csvRead() with the range
option, but it fails with error 999 even with the stacksize set to its maximum.
I opened the file with the LargeTextFileViewer software and checked that the
overall format is fine. Using that software I copied the first 100 lines into a
test file, which is read successfully by csvRead() with the range option, so
the command itself looks correct.

The problem with csvRead seems to occur before the data is converted into
Scilab memory, i.e. already while the huge text file is being opened.
Could csvRead (with the range option) work in a similar way to LTFviewer, so
that it can handle large text files?
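
In case it clarifies what I'm after, here is the kind of block-wise reading I
have in mind: an untested sketch that reads the file in chunks of lines with
mgetl() and converts each chunk with csvTextScan(), assuming a comma separator,
a dot decimal mark and no header line (the file name and the kept columns are
just placeholders):

fname = "huge.plt";      // placeholder path to the large file
blk   = 100000;          // lines per block, to be tuned to the available memory
fd    = mopen(fname, "r");
kept  = [];
while %t
    lines = mgetl(fd, blk);              // next block of raw text lines
    if isempty(lines) then break; end    // end of file reached
    M = csvTextScan(lines, ",", ".", "double");
    kept = [kept; M(:, [1 5])];          // keep only the columns of interest
end
mclose(fd);

That way the whole 900 MB would never have to be held in memory as text at
once, only one block at a time.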
To work around this limitation I tried to split my huge text file into smaller
pieces, but unfortunately I didn't find any free Windows software to do that.
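
Maybe the splitting could also be done from Scilab itself, without any external
tool? Another untested sketch, assuming a fixed number of lines per piece and
made-up output file names:

fname = "huge.plt";      // placeholder path to the large file
n     = 200000;          // lines per output piece
fd    = mopen(fname, "r");
k     = 0;
while %t
    lines = mgetl(fd, n);                // next piece of the file
    if isempty(lines) then break; end    // nothing left to read
    k = k + 1;
    mputl(lines, msprintf("part_%03d.txt", k));  // write piece k to its own file
end
mclose(fd);

Each part_xxx.txt piece could then be read with csvRead() as usual.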
Here's a test file with a few lines (the original is far larger): test.plt
<http://mailinglists.scilab.org/file/n4025917/test.plt>


Any ideas to help?

Thanks, David

Windows 7 32-bit, Scilab 5.4.1 branch


