[Scilab-users] Reintroducing stacksize on Scilab 6 ? was (Re: multiple element by element between large matrix and vector)

Antoine Monmayrant antoine.monmayrant at laas.fr
Mon Sep 28 15:47:44 CEST 2015


On 09/28/2015 02:37 PM, CHEZE David 227480 wrote:
> I agree with the proposal to reintroduce stacksize for the reasons explained.
> As Antoine suggests, it would make sense to have a safe "default" value for the "default" maximum allocatable memory, related to a percentage of the installed hardware memory (2, 4, 8... GB), with appropriate warnings when you go above it. The user could then set a desired maximum allocatable memory value themselves, and in that case, if you try to allocate more, Scilab issues an error saying the maximum memory has been reached.
> I think this could satisfy every user: for a first run on new data, you might be happy to have the maximum possible memory available to run your script and see the first results (the stacksize default), possibly with warnings about the risk of memory issues, and then you could improve the robustness of your script by tuning the code and sizing the memory as further steps.
> Thanks,
>
> David Chèze

Should this be turned into a bug report or a SEP?
What would make more sense?
Off topic, but what about the missing ";" of death?
I quite like what Julia does: it omits the central part of large arrays,
showing only the head and tail:

julia> rand(10000,1)
10000x1 Array{Float64,2}:
  0.492459
  0.865151
  0.269796
  0.409073
  0.642442
  0.468003
  0.450019
  0.766186
  0.419766
  0.885939
  0.423559
  0.416282
  0.644324
  0.0867144
  ⋮
  0.675171
  0.0554125
  0.159523
  0.520639
  0.479544
  0.694801
  0.421987
  0.57718
  0.770263
  0.397439
  0.392381
  0.698025
  0.31154
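
Something similar could even be prototyped in Scilab itself. Here is a minimal, purely illustrative sketch (the function name show_headtail and the cut-off k are mine, not an existing Scilab feature):

// Illustrative sketch: print only the head and tail of a column vector
function show_headtail(M, k)
    n = size(M, "r");
    if n <= 2*k then
        disp(M);
    else
        disp(M(1:k));                                   // head
        mprintf("  ... (%d rows omitted)\n", n - 2*k);  // middle elided
        disp(M($-k+1:$));                               // tail
    end
endfunction

show_headtail(rand(10000, 1), 5);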

What do you think?

Antoine

>   
>
> -----Original Message-----
> From: users [mailto:users-bounces at lists.scilab.org] On behalf of Antoine Monmayrant
> Sent: Monday, September 28, 2015 14:02
> To: users at lists.scilab.org
> Subject: Re: [Scilab-users] Reintroducing stacksize on Scilab 6 ? was (Re: multiple element by element between large matrix and vector)
>
> I second this proposal; I think it's not a good thing for Scilab to allow, by default, bringing someone's computer to a crawl.
> It's a good way to turn users away.
> For me, this proposal falls into the same category of improvements as any solution to the forgotten ";" of death (if M is a huge matrix and you type "M" instead of "M;", you are greeted with an endless printing of numbers that you cannot interrupt, even with Ctrl+C).
> Can't we at least get a warning when we are about to consume way too much memory (i.e. above a certain percentage of the total available memory)?
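
For what it's worth, such a warning can already be approximated in user code. A rough sketch, assuming getmemory() (which, on supported platforms, reports free and total system memory in kilobytes) is available; the 50% threshold is only illustrative:

// Rough sketch: warn before an allocation that would eat most of the free RAM
rows = 1e4; cols = 1e4;
needed_kb = rows * cols * 8 / 1024;   // doubles are 8 bytes each
[free_kb, total_kb] = getmemory();
if needed_kb > 0.5 * free_kb then
    warning("this allocation would use more than half of the free memory");
end
A = zeros(rows, cols);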
>
> Cheers,
>
> Antoine
>
> On 09/28/2015 12:01 PM, Clément David wrote:
>> Hello all,
>>
>>> Also, if you don't already know about it, stacksize is a handy Scilab
>>> function if you're working with large data arrays.  "stacksize max"
>>> will
>>> either give you the biggest Scilab stack that can be had or crash
>>> your machine, depending on your version (it appears to work in the
>>> current version).  "stacksize(nnn)" will set your stacksize to nnn,
>>> without crashing your machine (to my knowledge).  "stacksize" will
>>> report the current stacksize.
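
For readers on Scilab 5.x, where stacksize still exists, the calls mentioned above look roughly like this (sizes are counted in doubles; the exact return format may vary between versions):

// Scilab 5.x only -- stacksize was removed in Scilab 6
stacksize('max')     // grow the stack as much as the system allows
stacksize(5e7)       // request room for 5e7 doubles (~400 MB)
sz = stacksize()     // query the current stack size
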
>> As a reference, David posted a bug on that
>> http://bugzilla.scilab.org/show_bug.cgi?id=14176 and he targets Scilab
>> 6.
>>
>> So I have a question about your needs for Scilab 6. There is currently no
>> longer a stacksize, as all of the system's memory is available. To protect
>> users, I suggested re-introducing `stacksize` with a changed behavior:
>>
>>    * M = stacksize(N) : makes N * sizeof(double) bytes available for the
>> Scilab datatypes raw memory, and returns in M the previously set value
>>    * stacksize('max') : disables any memory restriction
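
To make the proposal concrete, a hypothetical session under the proposed behaviour could look like this (nothing below exists yet; the limit of 1e8 doubles is only illustrative):

// Hypothetical Scilab 6 session with the proposed stacksize behaviour
old = stacksize(1e8)   // cap Scilab data memory at 1e8 doubles (~800 MB),
                       // returning the previously set limit in old
// ... any allocation beyond the cap would now raise an error ...
stacksize('max')       // lift the restriction entirely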
>>
>> ## Why re-introduce `stacksize`?
>>
>> On my Linux system (with 8 GB of RAM and some applications already running),
>> allocating all of my memory (for instance with `zeros(2**30,2**3)`) slows my
>> computer down and only succeeds after a long time. Reducing the memory
>> available to Scilab using `stacksize` will help users discover
>> algorithmic or memory issues more quickly, and without swapping out most of
>> the other applications *by default*.
>>
>> My point is not to limit the available memory but to ease language
>> usage for newcomers by protecting them against typos or mis-designed
>> algorithms.
>>
>> Awaiting your opinion,
>>
>> --
>> Clément <david> David
>> _______________________________________________
>> users mailing list
>> users at lists.scilab.org
>> http://lists.scilab.org/mailman/listinfo/users
>>
> _______________________________________________
> users mailing list
> users at lists.scilab.org
> http://lists.scilab.org/mailman/listinfo/users
>


