reading or loading big files

Hsu (David) hsu at neurology.wisc.edu
Tue Jun 22 04:00:55 CEST 2010


Thank you, Samuel Gougeon; these do appear to be the commands I need, and I will try them.

Dave

Subject: Re: reading or loading big files
From: Samuel Gougeon
Date: 7 Jun 2010 11:18:33 +0200
Message-Id: <4C0CB964.90207 at univ-lemans.fr>
________________________________

Hello,



----- Original message -----

From: Hsu (David)

Date: 05/06/2010 08:20:

> I am just learning Scilab.  I need to read enormous EEG files
> (electroencephalograms), for example, 20 channels sampled at 32,000 Hz
> for days on end.  This data may be saved as a matrix with 20 columns
> but a huge number of rows, so big that I run up against size limits if
> I try to load the whole thing as a matrix.  This data is usually in
> binary format.  I tried using stacksize('max') to raise these limits,
> but I am still running into them.

A simple calculation:

20 channels x 32,000 measurements/s/channel x 1 byte/measurement
x 3600 x 24 s/day ~ 55 Gbytes/day

If you want to load the whole data set as an int8() or uint8() matrix
(the most compact numeric format), you need at least ~60 Gbytes of RAM.
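
The same estimate as a quick Scilab check (plain arithmetic, using only the figures above):

    nchan = 20;                 // channels
    fs    = 32000;              // measurements per second per channel
    bytes = 1;                  // one byte per measurement (int8/uint8)
    GB_per_day = nchan * fs * bytes * 3600 * 24 / 1e9
    // --> 55.296, i.e. about 55 Gbytes per day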

Do you have that much (assuming that Scilab or any other software could handle it)?
It looks unrealistic to load and handle these data as a whole in a single matrix.

You will likely need to use the more detailed binary commands
after mopen(): mseek(), mtell(), mget()... to read the data piece by piece.
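
For example, here is a minimal chunked-reading sketch along those lines. The file name
"eeg.bin", the sample-interleaved int8 layout and the 10-second chunk size are assumptions;
adjust the mget() type code and the reshape to your actual recording format.

    // Sketch only: the file name, layout and chunk size are assumptions.
    nchan = 20;                               // channels
    fs    = 32000;                            // samples per second per channel
    nsec  = 10;                               // seconds of data per chunk
    fd = mopen("eeg.bin", "rb");
    while %t
        raw = mget(nchan*fs*nsec, "c", fd);   // read one chunk of int8 values
        if isempty(raw) then
            break;                            // end of file reached
        end
        ns = floor(length(raw)/nchan);        // complete frames in this chunk
        chunk = matrix(raw(1:nchan*ns), nchan, ns)';  // ns rows x 20 columns
        // ... process chunk here (filter, decimate, save results) ...
    end
    mclose(fd);

mseek(pos, fd) lets you jump back to an earlier byte offset (and mtell(fd) reports the
current one) if you need to revisit a chunk rather than only read forward.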



Samuel

________________________________
From: Hsu (David)
Sent: Saturday, June 05, 2010 1:21 AM
To: users at lists.scilab.org
Subject: reading or loading big files

I am just learning Scilab.  I need to read enormous EEG files (electroencephalograms), for example, 20 channels sampled at 32,000 Hz for days on end.  This data may be saved as a matrix with 20 columns but a huge number of rows, so big that I run up against size limits if I try to load the whole thing as a matrix.  This data is usually in binary format.  I tried using stacksize('max') to raise these limits, but I am still running into them.

Is there a way to load or read in just a part of this huge matrix, process that, then go back and load or read the next chunk?

Thank you.

