Re: Bash high memory usage in simple for loop?


From: Chet Ramey
Subject: Re: Bash high memory usage in simple for loop?
Date: Mon, 27 Sep 2010 21:54:53 -0400
User-agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.8) Gecko/20100802 Lightning/1.0b2 Thunderbird/3.1.2

On 9/26/10 11:20 PM, Thomas Guyot-Sionnest wrote:

> For instance I have 68M of small numbers (1..256) separated by newlines
> and that makes bash grow over 1.6G, even when all it does inside the
> loop is calling true. The only way I can free up the memory is to leave
> the shell.
> 
> You can test easily with this command (you might want to limit your
> memory with ulimit first to avoid thrashing your system...):
> 
> $ for i in `seq 1 10000000`; do true; done
> 
> On my test system this requires 1G of memory, and memory climbs a bit
> higher on additional runs but settles at 1.1G (it doesn't seem to leak
> any memory past this point).
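(For anyone reproducing the quoted test, a bounded variant along these lines keeps the experiment from putting the whole machine under memory pressure; the 2 GB cap and the subshell wrapper are arbitrary choices for illustration, not part of the original report:)

    (
      # run in a throwaway subshell so the limit -- and the grown heap --
      # disappear when the subshell exits
      ulimit -v 2000000                      # cap virtual memory at ~2 GB (value in KB)
      for i in `seq 1 10000000`; do true; done
    )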

Greg's answer was correct.  This is not a leak.  A leak is when a
calling application loses all handles to a piece of allocated memory
before returning it to the allocator with free().

You have to remember the order in which expansions take place to figure
out what's happening.  The entire string is read into memory, then
split into individual words, after which the original string can be
freed.  At some point, bash will require all of that memory to be valid
at once, and malloc will have to request at least all of it, plus more
for overhead, from the kernel.
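One way to see the consequence of that ordering is to compare the original
loop with a form that streams its input instead of expanding it into a single
word list up front (a while-read pipeline is a common substitute here, not
something specific to this report):

    # expands the whole `seq` output into one word list before the body runs once
    for i in `seq 1 10000000`; do true; done

    # reads one line at a time, so memory stays flat however long the sequence is
    seq 1 10000000 | while read -r i; do true; done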

Under certain circumstances, the bash malloc will release large allocations
back to the system upon free() using sbrk or equivalent; the freed
memory has to be at the top of the process's data segment, though, and
I think there are enough intermediate allocations taking place that this
will not be true when the large string above is freed.
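A rough way to watch this from the shell, treating the resident set size
reported by ps as a proxy for what malloc is still holding (the smaller count
just makes the test quicker; nothing below is specific to bash's malloc):

    ps -o rss= -p $$                         # resident size before the expansion
    for i in `seq 1 1000000`; do true; done
    ps -o rss= -p $$                         # typically much larger afterwards, even
                                             # though the word list has been freed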

In most cases, the bash malloc will waste memory at large allocation
sizes due to its powers-of-two strategy, though it will use that memory
to satisfy subsequent allocations if it can, rather than request more
memory from the kernel.  I don't know how the libc malloc behaves when
presented with very large allocation requests and subsequent frees.
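As an illustration of what a powers-of-two sizing strategy implies for a
single big request (the exact bucket sizes and per-allocation overhead in
bash's malloc may differ; this is only the general arithmetic):

    # round a request up to the next power of two, as a simple
    # powers-of-two allocator would size its buckets
    next_pow2() {
      local n=$1 p=1
      while (( p < n )); do (( p *= 2 )); done
      echo "$p"
    }

    next_pow2 $((68 * 1024 * 1024))          # a ~68M request -> 134217728 (128M)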

Chet
-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
                 ``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, ITS, CWRU    chet@case.edu    http://cnswww.cns.cwru.edu/~chet/


