[scilab-Users] plot efficiency

Peng Du eddy.pdu at gmail.com
Sun Oct 24 22:38:50 CEST 2010


Thanks Mathieu,

Your elaborate explanation is very much appreciated.

Best regards,

Peng

On 24 October 2010 21:26, Mathieu Dubois <mathieu.dubois at limsi.fr> wrote:

>
>
> On 24/10/2010 22:07, Peng Du wrote:
>
>> Hi Mathieu,
>>
>> Yes, I think you are right. So I think I need to improve the way the
>> data is read, because simply changing the stacksize won't solve the
>> problem fundamentally.
>>
> Stack size is a recurrent problem under Scilab (in fact it can also happen
> in compiled languages, although I'm not sure it is the same problem). The
> point is that your data are stored on the stack, so changing the way you
> read the data won't change the problem (of course, if you use large
> intermediary variables you can free them, but if your data are really huge
> you will hit the limit some day).
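>
> For example, if a large intermediate result is only needed briefly (the
> names and sizes below are made up, just to illustrate the idea):
>
>     raw = rand(100000, 16);      // stand-in for a big intermediate matrix
>     needed = mean(raw, 'r');     // keep only the reduced result you use later
>     clear raw;                   // free the big intermediate from the stack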
>
> Increasing the stack size is really the solution. It's a common issue under
> Scilab (I think version 6 will solve this); Matlab has the same kind of
> limitation. Most of my scripts start with stacksize(2000000), and if they
> still don't work I simply add a 0. Of course at some point you reach a
> hardware/OS limit, but I have never hit it. There was a message a few months
> ago from someone who read so much data that Scilab could not allocate the
> memory; I think he/she ended up splitting the dataset.
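>
> For instance, a typical preamble looks like the sketch below (the figure is
> only an illustration; the argument is the number of double-precision 8-byte
> words to reserve, so adjust it to your data and your machine):
>
>     stacksize(20000000);    // reserve ~20e6 doubles, i.e. about 160 MB
>     // or let Scilab grab as much as the OS allows:
>     // stacksize('max');
>     disp(stacksize());      // check the current total/used stack size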
>
>
>> And talking about matrices, is the effect of using a 16*n matrix the same
>> as using 16 1*n arrays? How can I change my program so that it requires a
>> smaller stacksize and probably runs faster?
>>
> I think that an n*16 matrix (16 columns) consumes more or less the same
> amount of memory as 16 n*1 vectors (probably a bit less, since each
> separate variable carries a small overhead).
>
> But most Scilab functions can work directly with matrices, so a single
> matrix may be more convenient. In statistics it is common to put data in an
> m*n matrix where each column represents a variable (so there are n
> variables) and each row an observation (disclaimer: I'm not a statistician :)
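>
> As a toy example (names and sizes made up), with a single matrix you can
> still address each variable, and most operations apply to all columns at
> once:
>
>     data = rand(1000, 16);        // pretend n*16 dataset, one column per variable
>     col3 = data(:, 3);            // third variable as an n*1 vector
>     col_means = mean(data, 'r');  // 1*16 row vector: mean of every column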
>
> One function that is very helpful for fast I/O is mscanf. It's a bit hard
> to understand at first, but once you get it, it's very useful.
>
> Another function is fscanfMat, which reads a whole numeric matrix from a
> text file in one call.
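>
> Concretely, reading your 16-column text file could look like this sketch
> (the file name is a placeholder; mfscanf is the member of the scanf family
> that reads from a file descriptor):
>
>     // one call reads the whole numeric text file into an n*16 matrix
>     M = fscanfMat("mydata.txt");
>
>     // finer control with the scanf family (format shortened to 2 columns
>     // here; repeat the specifier for all 16): -1 means read until EOF
>     fd = mopen("mydata.txt", "r");
>     A = mfscanf(-1, fd, "%lg %lg\n");
>     mclose(fd);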
>
> HTH,
> Mathieu
>
>> Thanks.
>>
>> Peng
>>
>> On 24 October 2010 20:58, Mathieu Dubois <mathieu.dubois at limsi.fr
>> <mailto:mathieu.dubois at limsi.fr>> wrote:
>>
>>    Hi Peng
>>
>>    On 24/10/2010 21:12, Peng Du wrote:
>>
>>        Hi everyone.
>>
>>        I have a question about how to plot efficiently.
>>
>>        My Scilab program reads data from a file containing 16 columns.
>>        Each column represents the data for a polyline. So what I have
>>        been doing so far is to build 16 separate arrays and plot them in
>>        one graph. However, as the size of the file grows, sometimes up to
>>        several GBs, it started to take a very long time to finish, and
>>        finally I got an error message about exceeding the stack size.
>>        Does anyone know how I can work around this?
>>
>>    I don't think the problem comes from plot but rather from your
>>    data-reading function.
>>
>>    To increase the stack size, you can use stacksize (see
>>    help("stacksize")).
>>
>>    By the way, if you call plot2d with a matrix, it will plot a polyline
>>    for each column on the same graph.
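>>
>>    A minimal sketch (made-up data, just to show the call):
>>
>>        Y = rand(100, 16);      // 100*16 data, one column per polyline
>>        plot2d(Y);              // one curve per column, against the row index
>>        // or with explicit abscissae of the same size as Y:
>>        x = (1:100)' * ones(1, 16);
>>        plot2d(x, Y);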
>>
>>    HTH,
>>    Mathieu
>>
>>
>>