[Scilab-users] Saving and Loading big data
Matiasb
matiasb at gmail.com
Wed Oct 22 16:02:01 CEST 2014
Hi,
Just yesterday I solved some issues I was experiencing with the save
function by upgrading to Scilab 5.5.1, as suggested by Samuel in this thread
<http://mailinglists.scilab.org/Problems-using-save-td4031410.html>.
Now that I can save all the environment variables, I'm running into the
next problem when there is a lot of data to be saved and loaded.
The first function to fail is load. My script creates several (approx. 200)
big matrices, and saving them produces an 850 MB file. The save function
works well in this case, but when I try to load the variables back again,
Scilab runs out of memory, as shown below. If the script creates even
bigger matrices, Scilab can still handle them, but then calling save fails
(the problematic code is also shown below).
Scilab can handle these big matrices, but is there anything I can do to make
a reliable backup of them? For example, increasing Scilab's stacksize even
further (it reports a maximum of 268435455, but my PC has more RAM that
could be used), or using save/load in a more efficient way so that they
don't require so much extra memory on the stack.
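One idea I have considered but not yet tested is splitting the backup across
several files, so that a single save() call never has to serialize all 237
variables at once. A rough sketch, using some of the variable names from my
whos output below (the file names are made up):

```scilab
// Untested sketch: save variables in small groups to separate files,
// hoping the intermediate memory needed by each save() call stays small.
save('backup_part1.sod', 'CoreSwitchQueueR_t', 'CoreSwitchQueueR_discard');
save('backup_part2.sod', 'CoreSwitchQueueR_queueLe', 'CoreSwitchQueueR_waittim');
// ...and so on for the remaining variables.
```

I don't know whether this actually reduces the peak memory that %_save
needs, or whether the limit is per call or global.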
Any help would be appreciated.
Thank you very much!
Code samples
Load
I make the backup as follows:
--> stacksize('max') // increase scilab memory
--> save('bigFile.sod')
And when I load the backup again I get the following error:
--> stacksize('max') // increase scilab memory
--> load('bigFile.sod')
!--error 17
stack size exceeded!
Use stacksize function to increase it.
Memory used for variables: 90307998
Intermediate memory needed: 180528916
Total memory available: 268435455
at line 971 of function %_sodload called by :
load('ScilabLogs/local_39DCM_100HZ_20141022_142845.sod')
Just to get an idea of the number of variables and their sizes:
--> [names, typs, dims, vols] = listvarinfile('ScilabLogs/local_39DCM_100HZ_20141022_142845.sod');
--> size(names)   // 237
--> sum(vols)     // 8.920D+08
Some of the matrices are big:
Name Type Size Bytes
--------------------------------------------------------------------------------------------------------
CoreSwitchQueueR_discard constant 9 by 1136546 81831328
CoreSwitchQueueR_queueLe constant 9 by 1136546 81831328
CoreSwitchQueueR_t constant 9 by 1136546 81831328
CoreSwitchQueueR_waittim constant 9 by 1136546 81831328
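For the load side, a related idea I have not tried: use listvarinfile() to
get the variable names and then load them back one at a time, so the
intermediate memory is bounded by the largest single variable rather than
by the whole file. A sketch:

```scilab
// Untested sketch: load the variables from the backup one by one
// instead of in a single load() call.
[names, typs, dims, vols] = listvarinfile('bigFile.sod');
for i = 1:size(names, '*')
    load('bigFile.sod', names(i));  // restore just this variable
end
```

Again, I don't know whether load() re-reads the file for each call, or
whether this actually lowers the "Intermediate memory needed" figure.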
Save
When I make the backup I get the following error:
--> stacksize('max') // increase scilab memory
--> save('bigFile.sod')
stack size exceeded!
Use stacksize function to increase it.
Memory used for variables: 235237999
Intermediate memory needed: 44657871
Total memory available: 268435455
at line 71 of function evstr called by :
at line 1003 of function %_save called by :
save(backupDirectory + logFileName + '.sod'); // saving all variables
!--error 4
Undefined variable: %val
The output from 'whos' shows that there are huge matrices, but I can
manipulate them without problems as long as I don't call save.
--> whos
Name Type Size Bytes
----------------------------------------------------------------------------
.....
RejectedEv_arrived constant 15 by 29935 3592216
RejectedEv_evBuiltLatenc constant 15 by 29935 3592216
RejectedEv_L2Latency constant 15 by 29935 3592216
RejectedEv_packetLatency constant 15 by 29935 3592216
RejectedEv_roundTripLate constant 15 by 29935 3592216
RejectedEv_t constant 15 by 29935 3592216
HLTSV_idlePUs constant 15 by 59872 7184656
HLTSV_queuedEventsToSend constant 15 by 59872 7184656
HLTSV_sent constant 15 by 59872 7184656
HLTSV_t constant 15 by 59872 7184656
coreSwitchQueueCapacity constant 1 by 1 24
CoreSwitchQueueR_discard constant 8 by 5582233 3.573D+08
CoreSwitchQueueR_queueLe constant 8 by 5582233 3.573D+08
CoreSwitchQueueR_t constant 9 by 5582233 4.019D+08
CoreSwitchQueueR_waittim constant 8 by 5582233 3.573D+08
.....