reading or loading big files

Samuel Gougeon Samuel.Gougeon at univ-lemans.fr
Mon Jun 7 11:18:28 CEST 2010


Hello,

----- Original Message -----
From: Hsu (David)
Date: 05/06/2010 08:20:
> I am just learning SciLab.  I need to read enormous EEG files 
> (electroencephalograms), for example, 20 channels sampled at 32,000 Hz 
> for days on end.  This data may be saved as a matrix with 20 columns 
> but a huge number of rows, so big that I run up against size limits if 
> I try to load the whole thing as a matrix.  This data is usually in 
> binary format.  I tried using stacksize('max') to maximize these size 
> limits but am still running into size limits.
A simple calculation:
20 channels x 32000 measurements/s/channel x 1 byte/measurement
x 3600x24 s/day ~ 55 Gbytes/day
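For reference, the same back-of-the-envelope figure computed in Scilab:

nChannels = 20; sampleRate = 32000;                  // 1 byte per stored value
bytesPerDay = nChannels * sampleRate * 3600 * 24     // ~5.53e10 bytes, i.e. ~55 Gbytes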
If you want to load the whole data set as an int8() or uint8() matrix
(the most compact data type), you need at least ~ 60 Gbytes of RAM.
Do you have that much (assuming that Scilab or any other software could handle it)?
Loading and handling these data as a whole, in a single matrix, looks unrealistic.
You will more likely need to use the low-level binary commands available
after mopen(): mseek(), mtell(), mget()... to read the data piece by piece.
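For example, here is a minimal sketch of such chunked reading. The file name,
the interleaved channel layout and the 8-bit sample type are assumptions:
adapt them to the actual format of your EEG files.

// Sketch: read one second of a 20-channel, 8-bit recording stored as a flat
// binary file of interleaved samples (ch1..ch20, ch1..ch20, ...).
nChannels    = 20;
sampleRate   = 32000;
chunkSamples = sampleRate;                  // one second of data per chunk

fd = mopen("eeg_recording.bin", "rb");      // hypothetical file name

// Jump to the chunk of interest, e.g. the 101st second
// (offset given in bytes, 1 byte per stored value).
startSample = 100 * sampleRate;
mseek(startSample * nChannels, fd);

// Read one chunk of unsigned bytes and reshape it into
// a chunkSamples-by-nChannels matrix (one column per channel).
raw   = mget(chunkSamples * nChannels, "uc", fd);
chunk = matrix(raw, nChannels, chunkSamples)';

// ... process chunk, then repeat mseek()/mget() for the next chunks ...
mclose(fd);

Looping mseek()/mget() over successive chunks like this keeps the memory
footprint bounded by the chunk size rather than by the full recording.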

Samuel
