[Scilab-users] setmemory() <= Re: Reintroducing stacksize on Scilab 6 ? was (Re: multiple element by element between large matrix and vector)

Simon Marchetto simon.marchetto at scilab-enterprises.com
Wed Sep 30 11:15:16 CEST 2015


Hello,

You cannot protect users from OS memory allocation errors. There can be
other software running on the machine that consumes a large amount of
memory, like VirtualBox. You may also have low-end machines with little
RAM.

The only way is to implement a dedicated memory manager which allocates a
big memory buffer at startup and manages all the memory allocations inside
it... which is exactly what Scilab 5 does.
There are some use cases for which such a system can be useful, but
preventing coding errors is not one of them.
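
For illustration, this is roughly what the Scilab 5 scheme looks like from
the user's side (a sketch only; default sizes and exact messages depend on
the version):

    --> stacksize()              // [total, used] size of the preallocated buffer, in 8-byte words
    --> A = zeros(4000, 4000);   // 16e6 doubles (~128 MB): can exceed the default buffer
              !--error 17
        stack size exceeded! (Use stacksize function to increase it)
    --> stacksize(5e7);          // grow the preallocated buffer to 5e7 doubles (~400 MB)
    --> A = zeros(4000, 4000);   // now fits inside the managed buffer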

By the way, the goal of tuning the Java heap size is to optimize the 
time production servers spend doing garbage collection. I don't think we 
have this problem in Scilab.

Simon


On 30/09/2015 10:35, Samuel Gougeon wrote:
> Hello Clément,
>
> On 28/09/2015 12:01, Clément David wrote:
>> .../...
>> So I have a question about your needs for Scilab 6. There is currently no
>> more stacksize, as all the system's memory is available. To protect
>> users, I suggested re-introducing `stacksize` with a changed behavior:
>>
>>   * M=stacksize(N) : makes N * sizeof(double) bytes available for the
>> Scilab datatypes raw memory, and returns in M the previously set value
>>   * stacksize('max') : disables any memory restriction
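>>
>> For example, a sketch of the proposed calls (hypothetical, since this
>> behavior does not exist yet):
>>
>>     --> M = stacksize(1e8)   // allow 1e8 * 8 bytes (~800 MB) for Scilab data;
>>                              // M receives the previously set limit
>>     --> stacksize('max')     // disable any memory restriction again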
>>
>> ## Why re-introduce `stacksize`?
>>
>> On my Linux system (with 8 GB of RAM and some applications started),
>> allocating all my memory (like with `zeros(2**30,2**3)`) slows my
>> computer down and only succeeds after a long time. Reducing the memory
>> available to Scilab using `stacksize` would help users discover algorithmic
>> or memory issues more quickly, and without swapping out most of the other
>> applications *by default*.
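>>
>> (To make the numbers concrete: `zeros(2**30,2**3)` requests 2^33 doubles,
>> i.e. 2^33 * 8 bytes = 64 GiB, far beyond 8 GB of RAM, hence the swapping.)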
>>
>> My point is not to limit the available memory as such, but to ease language
>> usage for newcomers by protecting them against typos or mis-designed
>> algorithms.
>>
>> Awaiting your opinion,
>
> Since the 6.0.0-a1 release, several bugs somewhat linked to memory
> saturation have been reported.
> Following one of them, you are going to allow users to configure the
> maximal recursion depth.
> That will be a useful improvement against memory overflow, but it is not
> sufficient to avoid saturation. Indeed, not only the number of recursive
> calls is involved, but also the quantity of additional intermediate memory
> used by each call, which usually depends on the recursion level reached, etc.
> The fact that memory management has changed in Scilab 6 does not remove
> the need to control the maximal amount of memory ascribed to each Scilab
> session, because, unfortunately, operating systems do not always manage
> the computer's resources (processor and RAM load, interruptions) in a way
> relevant or efficient enough to avoid "burning" a processor.
> IMO, this is not a matter of Scilab 5=>6 backward compatibility. It is a
> defining characteristic of major releases that they are not necessarily
> backward-compatible with the previous ones.
> So, which function, with which features?
>
> * setmemory() would be a better name than stacksize():
>    it is the converse of getmemory(), and it is more explicit and user-oriented.
>
> * It should merge the local, intermediate and global memory domains
>    (corresponding to the former stacksize() and gstacksize()) into a
>    single whole.
>    This is another reason not to use "stacksize" as the function name.
>
> * It should work in bytes, not in doubles.
>    The documentation has always been somewhat confusing about the unit
>    used for amounts of memory, mixing "double" with "word" (a word = 8
>    bytes, which is not really familiar to users).
>    The standard unit is either the bit or the byte. There is no reason to
>    stress the 8-byte "double".
>    Scilab deals with int8, int16, int32, int64, double... by the way,
>    "double" as a unit is a relic.
>    Amounts of RAM, disk space, etc. are given in bytes, not in doubles.
>    So, let's make setmemory() work in bytes (see the sketch after this list).
>
> * The last question I have in mind is about the Java memory heap, which
>    can already be configured through the preferences, and is reserved for
>    Scilab's usage.
>    a) It should be decided whether the memory amount set with setmemory()
>       should take the Java heap into account or not. IMO, it should
>       include it, or an option should specify it.
>
>    b) Is the amount set for the Java heap reserved for a single Scilab
>       session, or is it shared among all running Scilab sessions? It may
>       be an obvious question for a developer, but it is not clear to me.
>       It would deserve to be documented.
>
>    c) I think that setmemory() should offer an option to set the Java heap
>       amount, as a shortcut to the preferences, so that it becomes a
>       one-stop shop for all memory settings in Scilab (see the sketch below).
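>
> As a purely hypothetical sketch of what such a one-stop shop could look
> like (the function and its "javaheap" option are only proposals, nothing
> of this exists yet):
>
>     --> setmemory(2 * 1024^3)        // allow 2 GiB, in bytes, for all Scilab data
>     --> setmemory(5e7 * 8)           // equivalent of the former stacksize(5e7), converting doubles to bytes
>     --> setmemory(2 * 1024^3, "javaheap", 256 * 1024^2)
>                                      // hypothetical option: 256 MiB reserved for the Java heap
>     --> setmemory('max')             // remove any restriction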
>
> To be discussed.
> Anyway, thanks for your proposal, since yes, a memory setting function 
> is still needed.
>
> Best regards
> Samuel Gougeon
>