
Re: Speeding up config and make


From: Bruce Korb
Subject: Re: Speeding up config and make
Date: Tue, 22 Apr 2003 08:56:46 -0700

Robert Boehne wrote:

> I thought I'd add another ... example to this thread. ...  The original
> source was built using C-shell scripts (!) and with ~180 MB of source,
> it takes anywhere from 6 hours (Linux dual PIII 700MHz)
> ... it now compiles from scratch on the dual P-3 Linux box in 4 hours,
> and only 10 minutes to run "make" on a freshly made tree (nothing gets
> built).

Since make fires up a shell process for each command, it's clear
that your shell script was sub-optimal.  :-)  I also wouldn't advocate
hand-crafting shell scripts to build products; we moved away from that
25 years ago for good reason.  _Deriving_ a shell script from
the project description (the Makefile.am's), however, might be useful.
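
A minimal sketch of one way to derive such a script, assuming an
already-configured tree where a plain `make -n' prints every command
(the name `build.sh' is invented for illustration):

  # Turn make's dry-run output into a flat build script.
  { echo '#!/bin/sh'
    echo 'set -e'                  # stop at the first failed command
    make -n | grep -v '^make\['   # drop "make[1]: Entering ..." noise
  } > build.sh
  chmod +x build.sh

The result rebuilds everything unconditionally, which is exactly the
one-time-build case; what it gives up is make's dependency tracking.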

> Bob Proulx wrote:
> > > [...compile everything as one single source file...]
> > > It has the following effects:
> > >
> > > 1.  The resulting makefiles have fewer dependencies.
> > >     They are smaller.  Faster to configure.
> >
> > I don't see how this would change configure speed.

I forgot the parenthetical comment:  (marginally)
configure is faster simply because the Makefile.in files it must
process are smaller.
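
If you want to check that piece on your own tree: the tail end of
configure is config.status rewriting each Makefile.in into a Makefile,
and that step can be timed by itself:

  # Re-run only the Makefile.in -> Makefile substitution step;
  # compare the times under the two source layouts.
  time ./config.status Makefile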

> > > 2.  There are fewer compilations.  They are much bigger, but
> > >     still take less time overall.
> >
> > It all depends upon what you are optimizing.  If you have a fast
> > compiler and a slow make then....

I confess I didn't measure a variety of compilers and builders:
just GCC's C compiler and GNU make on Linux.  However, since I'm not
playing with multi-million line projects and since I've carefully kept
name spaces *pretty* separate, the big compiles work for me.  They
ensure that the header files are parsed only once and that the
compiler process is created only once.  My guess (without profiling)
is that the header file savings are the biggest chunk of the compiler
savings.  I know some day we'll have pre-compiled headers; we're not
there yet.  Also, I use libtool, which adds the fixed overhead of a
150KB shell script to every compile and every link.
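
For concreteness, here is roughly how such a big compile can be
driven from automake (a sketch only; `prog', the source names, and
the generated `all-in-one.c' are all invented, and VPATH builds are
ignored for brevity):

  bin_PROGRAMS = prog
  real_sources = parse.c opts.c emit.c
  EXTRA_DIST   = $(real_sources)

  # Compile the whole program as one translation unit: generate a
  # source that #includes the real ones, so every header is parsed
  # once and the compiler is started once.
  nodist_prog_SOURCES = all-in-one.c
  all-in-one.c: $(real_sources)
	for f in $(real_sources); do \
	  echo "#include \"$$f\""; \
	done > $@
  CLEANFILES = all-in-one.c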

> > > 3.  Make runs *much* faster.

[[Maybe not.  I hadn't separated out "make" from "build".]]

Developers typically work on one or a few boxes.  The autotools
infrastructure distributes configurable make files for client
builds, though the real need (hopefully :) is only for a one-time
build.  So, would the package amalgamators prefer a build script
that (for me) goes 2x faster or the ability to tweak source and
do a partial rebuild?  Actually, the _real_ question is, "Why not both?"
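
A Makefile.am sketch of one way to get both (FAST_BUILD is an
invented conditional, declared in configure.ac with AM_CONDITIONAL;
`prog' and the file names are invented as above):

  if FAST_BUILD
  # one-shot package builds: a single big compilation unit
  nodist_prog_SOURCES = all-in-one.c
  else
  # development builds: per-file objects for quick partial rebuilds
  prog_SOURCES = parse.c opts.c emit.c
  endif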

> > How long does 'make' run on your system when everything is up to date?

On my system on my application?

step      | monolithic | separated | delta
==========+============+===========+===========
config    |    1:20    |   1:23    | +marginal
build     |    2:00    |   3:35    | +80%
noop bld  |    0:07    |   0:08    | +10%
small bld |    0:34    |   0:32    | -marginal

(times are minutes:seconds)

> > For any project that is small enough to compile successfully as a
> > single compilation unit then I postulate that make will not consume an
> > inordinate amount of time.  For those that are large enough that make
> > does consume significant time then I am sure that they are also too
> > large to be compiled as a single compilation unit.

Well, I wasn't really saying that single compiles were the solution
for all projects.  If I implied that, I didn't mean to.  I do consider
it highly likely, though, that projects ought to be compiled in larger
chunks.  If "all at once" is feasible, then that counts as a "larger
chunk", too.

> > > 4.  The executables are potentially faster because inter-procedural
> > >     optimizations are easier to perform.
> >
> > Agreed.  There is potential there.  How much of that is realized is up
> > for debate.

Few would argue that it is more than a marginal effect, though.
It might matter for a few rare, intensely used pieces of code.

> > > 5.  Downloads go faster.  The distribution is smaller (marginally).
> >
> > To what do you attribute the smaller size?

Individual build rules, one for each target:

> autogen-opts.o: opts.c
> @am__fastdepCC_TRUE@	if $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) $(autogen_CFLAGS) $(CFLAGS) -MT autogen-opts.o -MD -MP -MF "$(DEPDIR)/autogen-opts.Tpo" \
> @am__fastdepCC_TRUE@	  -c -o autogen-opts.o `test -f 'opts.c' || echo '$(srcdir)/'`opts.c; \
> @am__fastdepCC_TRUE@	then mv "$(DEPDIR)/autogen-opts.Tpo" "$(DEPDIR)/autogen-opts.Po"; \
> @am__fastdepCC_TRUE@	else rm -f "$(DEPDIR)/autogen-opts.Tpo"; exit 1; \
> @am__fastdepCC_TRUE@	fi
> @AMDEP_TRUE@@am__fastdepCC_FALSE@	source='opts.c' object='autogen-opts.o' libtool=no @AMDEPBACKSLASH@
> @AMDEP_TRUE@@am__fastdepCC_FALSE@	depfile='$(DEPDIR)/autogen-opts.Po' tmpdepfile='$(DEPDIR)/autogen-opts.TPo' @AMDEPBACKSLASH@
> @AMDEP_TRUE@@am__fastdepCC_FALSE@	$(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@
> @am__fastdepCC_FALSE@	$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) $(autogen_CFLAGS) $(CFLAGS) -c -o autogen-opts.o `test -f 'opts.c' || echo '$(srcdir)/'`opts.c

Here is a size comparison for agen5/Makefile.in:
131,379 bytes (per-file rules) vs. 37,717 bytes (combined).
That was an extreme case.

> > > 6.  Changing one file triggers a full recompile.  It's still faster.
> >
> > Possibly yes.  It all depends upon the particular mix of factors in
> > any given project.  But if I am compiling C++ instead of C the
> > compilers are much slower and I doubt I could get a faster result this
> > way with any mix.

You only parse headers once, so you might.  C++ is a different beast.

> > > 7.  It is aesthetically displeasing.
> >
> > Downright ugly.

:-)

> > You forgot about namespace issues.  Programs with static/private
> > variables and functions may not work.

I meant to mention it.  :-)  Certainly you wouldn't combine
compilations blindly, or push on in the face of compile errors.  I know
I suggested putting every source into a single compile, but one needs
to use reason, too.  A small demonstration of the hazard is below.
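
Here is a minimal shell sketch of the clash (the file contents are
invented; any two files that each define their own static function of
the same name behave this way):

  # Two modules, each with a private helper named 'init'.
  printf 'static int init(void) { return 1; }\nint a(void) { return init(); }\n' > a.c
  printf 'static int init(void) { return 2; }\nint b(void) { return init(); }\n' > b.c
  printf '#include "a.c"\n#include "b.c"\n' > all.c

  cc -c a.c b.c   # fine: two objects, two separate static inits
  cc -c all.c     # error: redefinition of 'init'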
Here's what I _actually_ did for my exercise:

> nodist_autogen_SOURCES = ag.c
> autogen_SOURCES = defParse.y opts.c cgi-fsm.c

"defParse.y" was forced because if it wasn't directly listed, automake
wouldn't generate the yacc rules.  The other two sources are generated
and conflict.  So, I have four objects to link.



