[Scilab-users] Saving and Loading big data
Antoine Monmayrant
antoine.monmayrant at laas.fr
Thu Oct 23 12:29:33 CEST 2014
On 10/22/2014 04:02 PM, Matiasb wrote:
> Hi,
>
> Just yesterday I solved some issues I was experiencing with the save
> function by upgrading to scilab 5.5.1 as suggested by Samuel in this thread
> <http://mailinglists.scilab.org/Problems-using-save-td4031410.html> .
>
> Now, being able to save all the environment variables, I'm experiencing the
> next problem when there is a lot of data to be saved/loaded.
>
> The first one to show problems is the load function. My script creates
> several (approx. 200) big matrices, and saving them produces an
> 850MB file. The save function works well in this case, but when I try to
> load the variables back again Scilab runs out of memory, as shown below. If
> the script creates even bigger matrices, Scilab can handle them, but then
> calling save does not work (the problematic code is also below).
>
> Scilab can handle these big matrices, but is there anything I can do to make
> a reliable backup of them?
> For example, could I increase Scilab's stacksize even more? It states that
> its maximum is 268435455, but my PC has more RAM that could be used. Or
> could save/load be used in a more efficient way, so that they don't require
> so much extra memory in the stack?
>
> Any help would be appreciated.
> Thank you very much!
>
> Code samples
Hi,
I experienced similar issues when trying to save/load big data.
1) save/load do not seem to be very efficient: they appear to create
copies of the variables they save/load.
As a result, if a matrix uses more than 50% of the available stack,
save/load tend to fail:
%%%
-->n=2250,a=zeros(n,n);s=stacksize();s(2)/s(1)
n =
2250.
ans =
0.5071616 % a is using a bit more than 50% of the stack
-->save('/tmp/a.sod','a')
%val=[a;
!--error 17
stack size exceeded!
Use stacksize function to increase it.
Memory used for variables: 5094842
Intermediate memory needed: 5062522
Total memory available: 10000000
in execstr instruction called by :
at line 35 of function evstr called by :
at line 1003 of function %_save called by :
save('/tmp/a.sod','a')
%%%
If we increase the stack, it works:
-->stacksize('max')
-->save('/tmp/a.sod','a')
What's worse, load seems to try to allocate memory without checking
that the allocation was successful:
%%% scilab restarted with the default stacksize
-->load('/tmp/a.sod')
!--error 42
A fatal error has been detected by Scilab.
Your instance will probably quit unexpectedly soon.
%%%
2) When trying to load a huge .sod file, you can try to load the saved
variables one by one instead of all of them at once.
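For point 2), a minimal sketch of what "load variables one by one" can
look like (assuming Scilab 5.x; 'bigFile.sod' is a placeholder file name):
%%%
// List the variables stored in the file, then load them individually,
// so that only one variable at a time needs the intermediate memory:
stacksize('max');
[names, typs, dims, vols] = listvarinfile('bigFile.sod');
for k = 1:size(names, '*')
    load('bigFile.sod', names(k));  // load only the k-th saved variable
end
%%%
This way the intermediate copy made by load only has to fit the largest
single variable, not the whole file at once.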
Hope it helps,
Antoine
>
> *Load*
> I make the backup as follows:
> --> stacksize('max') // increase scilab memory
> --> save('bigFile.sod')
>
> And when I load the backup again I get the following error:
> --> stacksize('max') // increase scilab memory
> --> load('bigFile.sod')
> !--error 17
> stack size exceeded!
> Use stacksize function to increase it.
> Memory used for variables: 90307998
> Intermediate memory needed: 180528916
> Total memory available: 268435455
> at line 971 of function %_sodload called by :
> load('ScilabLogs/local_39DCM_100HZ_20141022_142845.sod')
>
> Just to get an idea of the number of variables and their sizes:
> --> [names, typs, dims, vols] =
> listvarinfile('ScilabLogs/local_39DCM_100HZ_20141022_142845.sod');
> --> size(names) --> 237
> --> sum(vols) --> 8.920D+08
>
> Some of the matrices are big:
> Name Type Size Bytes
> --------------------------------------------------------------------------------------------------------
> CoreSwitchQueueR_discard constant 9 by 1136546 81831328
> CoreSwitchQueueR_queueLe constant 9 by 1136546 81831328
> CoreSwitchQueueR_t constant 9 by 1136546 81831328
> CoreSwitchQueueR_waittim constant 9 by 1136546 81831328
>
>
> *save*
> When making the backup I get the following error:
> --> stacksize('max') // increase scilab memory
> --> save('bigFile.sod')
> stack size exceeded!
> Use stacksize function to increase it.
> Memory used for variables: 235237999
> Intermediate memory needed: 44657871
> Total memory available: 268435455
> at line 71 of function evstr called by :
> at line 1003 of function %_save called by :
> save(backupDirectory + logFileName + '.sod'); // saving all variables
>
> !--error 4
> Undefined variable: %val
>
> The output from 'whos' shows that there are huge matrices, but I can
> manipulate them without problems as long as I don't call save.
> --> whos
> Name Type Size Bytes
> ----------------------------------------------------------------------------
> .....
> RejectedEv_arrived constant 15 by 29935 3592216
> RejectedEv_evBuiltLatenc constant 15 by 29935 3592216
> RejectedEv_L2Latency constant 15 by 29935 3592216
> RejectedEv_packetLatency constant 15 by 29935 3592216
> RejectedEv_roundTripLate constant 15 by 29935 3592216
> RejectedEv_t constant 15 by 29935 3592216
> HLTSV_idlePUs constant 15 by 59872 7184656
> HLTSV_queuedEventsToSend constant 15 by 59872 7184656
> HLTSV_sent constant 15 by 59872 7184656
> HLTSV_t constant 15 by 59872 7184656
> coreSwitchQueueCapacity constant 1 by 1 24
> CoreSwitchQueueR_discard constant 8 by 5582233 3.573D+08
> CoreSwitchQueueR_queueLe constant 8 by 5582233 3.573D+08
> CoreSwitchQueueR_t constant 9 by 5582233 4.019D+08
> CoreSwitchQueueR_waittim constant 8 by 5582233 3.573D+08
> .....
>
>
>
> _______________________________________________
> users mailing list
> users at lists.scilab.org
> http://lists.scilab.org/mailman/listinfo/users
>