bug-bash

Re: SEGFAULT if bash script make "source" for itself


From: bogun.dmitriy
Subject: Re: SEGFAULT if bash script make "source" for itself
Date: Thu, 28 Aug 2014 14:50:56 -0700

2014-08-28 13:59 GMT-07:00 Bob Proulx <bob@proulx.com>:

> bogun.dmitriy@gmail.com wrote:
> > Eric Blake wrote:
> > > bogun.dmitriy@gmail.com wrote:
> > > > IMHO no user action should lead to a SIGSEGV! I am not objecting to
> > > > recursive "source" itself. But when I got a SIGSEGV from bash, I had
> > > > no idea why it happened. I made the recursive "source" by mistake and
> > > > spent a lot of time tracking down what exactly led to the SIGSEGV.
>
> But you wrote the program that caused the SIGSEGV.  At that point you
> are no longer just a user but are now a programmer.  Technically
> speaking the SIGSEGV problem would be a bug in your bash script
> program.  You as the programmer of that script have the responsibility
> for it.
>
No action in my script should lead to a SIGSEGV in the interpreter! If I
write a program in some compiled language, for example C, compile it and get
a SIGSEGV - that is my problem. But in this case my program is executed by an
interpreter, and if the interpreter fails, whatever the reason, that is the
interpreter's problem.
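For reference, a minimal sketch of the kind of script this thread is about -
one that sources itself - might look like this (the filename is made up, and
the exact message from the parent shell may vary):

    $ cat self-source.sh
    #!/bin/bash
    # Sourcing the running script from inside itself recurses until bash
    # exhausts its stack; bash then typically dies with SIGSEGV.
    source "$0"

    $ ./self-source.sh
    Segmentation fault (core dumped)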

What would you say if gcc (or cc) went down with a SIGSEGV while compiling
your code? Is that a problem with gcc or with your code?


>
> > > SIGSEGV is what happens on stack overflow, unless you integrate a stack
> > > overflow detector like GNU libsigsegv with your sources to catch the
> > > segv and replace it with a nice error message.
> >
> > I know when and why a program can get a SIGSEGV.
>
> Then you already know that it is a recursion problem that has run out
> of stack space.  Any program that allows recursion might be programmed
> in error.  If this is not suitable then using a programming language
> that does not allow recursion might be the choice for you.  For
> example the old FORTRAN did not allow recursion and yet it enjoyed a
> lot of popularity at one time.
>
Are you making fun of me?

I got an error in the interpreter (we are talking about bash, if you forgot).
I made a proposal for how it could be solved... and you tell me to write in
something else. Hm... perhaps that is not such a bad idea, if the bash
developers prefer not to fix errors in their code.


> > So why should I get a SIGSEGV instead of a nice, detailed error message
> > on recursion? Can we detect it?
>
> Because in general it is a hard problem to solve.  And it isn't always
> about making it bigger.  It may simply be a very small environment
> without enough stack to complete.  The program may be completely fine if
> there were enough stack.  It is hard to tell if the program is in an
> infinite recursion or if it simply didn't have enough stack space to
> complete and would complete if there were more.  All that can be said
> is that it didn't have enough stack.
>

There is already a variable that limits function recursion depth (FUNCNEST).
Why should there not be a similar variable for "source" recursion depth?
Why do we have the FUNCNEST limit at all, if this is not a problem of the
interpreter but a problem of the end programmer?
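For comparison, the existing FUNCNEST limit already gives exactly the kind of
error message I am asking for; this is real bash behaviour, with an
illustrative limit value:

    $ FUNCNEST=100          # cap function call nesting at 100 levels
    $ f() { f; }            # a function that recurses forever
    $ f
    bash: f: maximum function nesting level exceeded (100)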

> > > The GNU Coding Standards state that GNU software cannot have arbitrary
> > > limits by default.  Any limit we pick, other than unlimited (your
> > > ...
> >
> > IBM produced a new CPU - it solves an infinite loop in 8 seconds.
> >
> > How does a bigger amount of memory save you from infinite recursion? It
> > leads to a longer delay before the SIGSEGV and nothing else.
>
> Haha.  Of course it is a hardware problem.  If we only had a machine
> with an infinitely large stack then we would never run out of stack
> space and could never SIGSEGV due to stack overflow.
>
And what then? Will the user wait forever? Until we get a CPU that solves
infinite loops?
There are not so many cases that can make use of infinite recursion.

> Of course then the program would simply run forever, since it would
> continue to do exactly what it had been programmed to do.  Which is
> one of the things that makes this so hard.  For automated detection,
> the program must decide that it should not do what the programmer
> told it to do.  That is where it runs into problems.  Similar to why
> auto-correcting spell checkers force so many spelling errors on humans.
>

Why can't I (or shouldn't I be able to) set a recursion limit? I don't want
my errors solved automatically. I just want to have some protection.
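Until something like that exists, the only protection lives in the script
itself. A guard at the top of a file that is meant to be sourced is one rough
sketch of a workaround; the variable name here is made up:

    # At the top of the file that might get sourced recursively.
    if [ -n "${_ALREADY_SOURCED:-}" ]; then
        echo "recursive source detected, refusing to continue" >&2
        return 1
    fi
    _ALREADY_SOURCED=1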


>
> >  $ ulimit -a
> > data seg size           (kbytes, -d) unlimited
> > ...
> >
> > so... in real life we have limits. Some of them are turned off, but they
> > exist and can be adjusted.
>
> Those are explicit process limits in addition to the actual limits.
> It is always easier to make them smaller.  But you can't make the
> actual limits larger.
>
> For example feel free to try to make use of that unlimited data set
> size.  Let's assume you have an X-sized memory machine.  Try to use a
> thousand times that amount of memory, regardless of the limit being set to
> unlimited, and see how well things work.
>
Again... we are not talking about hardware limits!
If my application tries to get more memory than is installed in my PC, and
more than the available swap space, it will be terminated by the kernel.
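For what it's worth, the limit actually being hit in this thread is the stack
limit; it can be lowered explicitly, which only makes the overflow show up
sooner (illustrative session - the numbers and the exact message will vary):

    $ ulimit -s            # current soft stack limit, in kilobytes
    8192
    $ ulimit -s 1024       # shrink the stack for this shell and its children
    $ ./self-source.sh     # the hypothetical self-sourcing script from above
    Segmentation fault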


> > And if I have an option which I can change to some value suitable for me,
> > and it can save me / show me a good error message in case of infinite
> > recursion - I will use it. Others can leave it at the unlimited setting.
> > We could have two options - one sets the recursion depth limit, the other
> > sets the action when this limit is reached - deny deeper recursion /
> > print a warning.
>
> There is the sharp kitchen knife thought-problem.  Have you ever cut
> yourself on a kitchen knife?  Of course most of us have at one time or
> another.  Can you envision a way to make kitchen knives safer?  A way
> to make it impossible for you to cut yourself on one?  Think about how
> you would make a kitchen knife safer.  Give it a serious think.  Then
> ask yourself this question.  Would you use such a knife that you
> designed yourself?  The answer is invariably no.  There has always
> been some inherent danger when using a kitchen knife.  We accept this
> because the alternatives are all worse.
>
> Bob
>
I am at a loss for polite words to answer you.

To be short - you (the community) don't want to add a limit, because its
default value should be infinite!

