lout-users

Re: Running header


From: Jeff Kingston
Subject: Re: Running header
Date: Mon, 13 Jun 2005 08:35:43 +1100

> When I encountered the problem for the first time
> I had written just one chapter; the images are spread
> all over that chapter, so I presume it would be the
> same for any other section or chapter.

I am treating the "need-to-run-n-times" problem
and the "large-ld-file" problem as separate
problems that just happen to both be occurring
in your document.  I don't think I can do
anything about the "need-to-run-n-times" problem,
given that I worked hard at it long ago and that's
the best that I could do.  However, to investigate
the large .ld problem I still think you should
do runs on subsets of your document to try to
find out which particular parts of it are
causing this problem.  If it's the first
chapter you wrote, fine, just comment out
the @Include files for the others, as
described in the User's Guide, and see
whether it is still happening on that chapter,
and if so try to find out which section(s).
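
For example, if the chapters come into your root file
via @Include lines, you can disable all but the suspect
one with Lout's "#" comment character.  The file names
below are only placeholders for whatever your @Include
lines actually name:

    @Include { chapter1 }      # the chapter being tested
    # @Include { chapter2 }    # temporarily disabled
    # @Include { chapter3 }    # temporarily disabled

If the large .ld file still appears with just that one
chapter included, the same trick can be used to narrow
it down to particular sections.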

> I'm using lout 3.29; do you think I should try with 3.30?

If you want me to do anything, testing with 3.30
would be an advantage because it would reduce the
number of variables in the problem by one.  However,
it's not likely that whatever is causing your
problem has been corrected in 3.30.

Jeff

ps It has just occurred to me that there might be
a connection between the two problems.  The
cross reference database for the nth running
page header might include a copy of the cross
reference database entry for the (n-1)th
running page header.  If that is the case,
then as n increases the cross reference database
entries would grow larger and larger, until they
hit the sanity limit that you have reached.  I
would need to look at the difference between
one version of a large .ld file and the next,
to verify that this is happening.  You might
be able to do that yourself, although the
format, while plain text, is rather cryptic.
If this is the problem, then I don't think
I am going to be able to do anything about it.
You could edit file z06.c, changing the line

    #define     MAX_STACK       100

to a larger number, e.g.

    #define     MAX_STACK       200

This will allow the parser to handle larger
expressions before complaining about being
too deeply nested.
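
Note that z06.c is part of the Lout source, so after
changing MAX_STACK you would need to recompile and
reinstall Lout (re-run make, or however you built it)
before the new limit takes effect.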

