
RE: Library dependency files


From: Hambridge, Philip J (ODP)
Subject: RE: Library dependency files
Date: Tue, 27 Apr 2010 08:15:21 +0100

Mike,

Your understanding is spot on. There are currently some 4500 source .d
files spread across 500 directories. So I would be including 500 files
rather than 4500 if I make the change.
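
For illustration, the change described above might look roughly like this in
a non-recursive makefile. The variable and path names below are made up for
the sketch, not taken from the actual system:

# Hypothetical: DIRS lists the ~500 functional directories.
DIRS := libs/foo libs/bar libs/baz    # ... and so on

# Old scheme: one -include per source .d file (~4500 files opened at start-up).
# -include $(ALL_SRCS:.c=.d)

# New scheme: one combined dependency file per directory (~500 files opened).
-include $(addsuffix /deps.d,$(DIRS))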

I see a reduction in make's start-up time of around 45%.

In my experience make seems very good at sorting out its DAG, and reducing
the number of files it has to open has more effect on performance, since
opening files requires I/O on the system rather than just processing
stuff in memory.

Regards,

Philip.

-----Original Message-----
From: Mike Shal [mailto:address@hidden]
Sent: 27 April 2010 04:45
To: Hambridge, Philip J (ODP)
Cc: address@hidden
Subject: Re: Library dependency files

On 4/26/10, Hambridge, Philip J (ODP) <address@hidden> wrote:
> We have a large non-recursive makefile system that works very well on the
> whole. Sources are grouped into functional directories and, following
> compilation, are linked into static archive libraries (one per directory)
> which get fully linked into one overall executable at the end.
>
> Source dependencies are held in .d files (one per source) and these are
> updated as a side-effect of compiling with gcc - we use Tom Tromey's method.
> The source dependencies are pulled into the build using a '-include <list of
> .d files>' for each directory.
>
> One disadvantage of the system is that the start-up time for make is seen as
> too long by some users when only one source actually needs to be
> re-compiled.
>
> I've started to wonder about reducing the start-up time by placing all
> source dependencies in a per-directory .d file generated at the archive link
> stage and including this instead of including all the source .d files. A
> quick test verifies that this significantly reduces the start-up time due to
> make having to open far fewer files. I've yet to figure out quite what the
> implications are for ensuring the per-directory .d file gets updated
> correctly.
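
For reference, a rough sketch of how the two pieces described above might fit
together: Tromey-style .d generation as a side effect of compilation, plus a
per-directory deps.d produced at the archive link stage. The gcc flags, file
names and variables here are illustrative guesses, not taken from the thread
(and recipe lines need a leading tab):

# Per-source dependency generation as a side effect of compilation.
# -MMD -MP is one common form; Tromey's original recipe used -M plus sed.
%.o: %.c
	$(CC) $(CFLAGS) -MMD -MP -c $< -o $@

# Per-directory dependency file produced when the archive is linked:
# concatenate the per-source .d files into a single deps.d.
libs/foo/libfoo.a: $(foo_OBJS)
	$(AR) rcs $@ $^
	cat $(foo_OBJS:.o=.d) > libs/foo/deps.d

# The top level then includes one file per directory instead of one per source.
-include libs/foo/deps.d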

Just curious, what kind of performance increase do you actually see
here? If I understand your solution correctly, instead of including N
dependency files with M dependencies each, you want to include 1
dependency file with N*M dependencies (per directory). I would've
thought most of make's time is spent in parsing out that info into the
DAG, rather than just in opening and closing the files. Is the time
difference all in the system time (I think user time should be about
the same)?
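
One way to check where the time actually goes (not something from the thread,
just a common approach) is to time a do-nothing run under both schemes, using
make's question mode so that no recipes execute:

time make -q all    # 'all' stands in for the real top-level target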

> Before I go any further, has anyone else done anything like this? If so,
> were there any pitfalls?

Sorry, I haven't actually tried it myself :)

-Mike



