Hello,

----- Original Message -----
From: Hsu (David)
Date: 05/06/2010 08:20:
> I am just learning Scilab. I need to read enormous EEG files
> (electroencephalograms), for example 20 channels sampled at 32,000 Hz
> for days on end. This data may be saved as a matrix with 20 columns
> but a huge number of rows, so big that I run up against size limits
> if I try to load the whole thing as a matrix. This data is usually in
> binary format. I tried using stacksize('max') to maximize these size
> limits but am still running into them.
A simple calculation:
20 channels x 32,000 measurements/s/channel x 1 byte/measurement
x 3600 x 24 s/day ~ 55 Gbytes/day
If you want to load the whole data set as an int8() or uint8() matrix
(the most compact formats), you need at least ~60 Gbytes of RAM per day
of recording. Do you have that much (assuming that Scilab or any other
software could handle it)?
It looks unrealistic to load and handle these data as a whole in a
single matrix.
You will more likely need the lower-level binary commands after mopen():
mseek(), mtell(), mget(), ... to read the data piece by piece.
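
For instance, something like this could work (untested; the file name,
the chunk size, and the assumption that the samples are stored
interleaved, one signed byte per value, are guesses about your data):

  nChannels = 20;                            // assumed number of channels
  chunkSamples = 32000*60;                   // read one minute of data at a time
  startSample = 0;                           // first sample of the chunk (0-based)
  fd = mopen("eeg.bin", "rb");               // hypothetical file name, binary mode
  mseek(startSample*nChannels, fd, "set");   // 1 byte per value for int8 data
  v = mget(chunkSamples*nChannels, "c", fd); // "c" = signed 8-bit integers
  chunk = matrix(v, nChannels, -1)';         // reshape: one column per channel
  mclose(fd);

Note that mget() returns the values as doubles, so convert the chunk
with int8() if memory is tight. Process each chunk (filtering,
averaging, ...), then read the next one, either by calling mseek()
again or by simply letting mget() continue from the current file
position.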

Samuel