
Re: Error when compiling a large file

From: David Kastrup
Subject: Re: Error when compiling a large file
Date: Thu, 21 Feb 2013 09:32:50 +0100
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/24.3.50 (gnu/linux)

ArnoldTheresius <address@hidden> writes:

> Miguel Jesus wrote
>> I finally got the file to compile. I had to set the LILYPOND_GC_YIELD to
>> 100. Anyone knows why it worked that way and not the default one?
>> Anyway, it took 700 seconds to compile, which is a lot more that it took
>> you. As you said, I only saw 1 CPU being used. Can lilypond use more than
>> 1 CPU to make things faster?
>> ...
> Well, I did try to compile it on Win7/64 with several different settings of
> LILYPOND_GC_YIELD (100, 70, 50, 35, 25, 18, 13, 9, 7, 1). All trials failed
> on my setup. The "committed memory usage" for the 32-bit LilyPond process
> displayed in the Task Manager was approx. 1.3 GB in all processes at the
> time they failed, which is well below the 4 GB that 32-bit applications can
> reach under 64-bit Windows (3 GB under 32-bit Windows).
> I wonder if there is a software limit for the heap in Guile.

Depends on the size of the default cell, I guess.  I get

scheme@(guile-user)> (let loop ((x 0)) (if (not (eq? (1+ (1- x)) x)) x (loop (1+ x))))
$1 = 536870912

which means that integers fitting into a single cell (and not requiring
multi-word arithmetic) are about 30 bits wide (assuming a sign bit and a
symmetric number range).  That makes it likely that a comparable number
of (cell-aligned) pointers is available for all categories.

Now I am running on a 32-bit system.  You can check the above on your
system and see how far you get.  If you don't get further, it is
unlikely that you have more space addressable by Scheme cells.
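As a quicker cross-check (assuming your Guile exposes it, which recent
versions do), the same limit can be read off directly from
most-positive-fixnum; on a 32-bit system it should be one less than the
value the loop above returns:

```scheme
;; Guile REPL on a 32-bit system: most-positive-fixnum is the largest
;; integer that still fits in one cell without bignum arithmetic.
scheme@(guile-user)> most-positive-fixnum
$2 = 536870911
```

On a 64-bit Guile you would expect a much larger value here, which is
one way to tell whether your cells are 32 or 64 bits wide.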

David Kastrup
