
Re: Speeding up config and make


From: Robert Boehne
Subject: Re: Speeding up config and make
Date: Tue, 22 Apr 2003 00:59:04 -0500

Bob:

I thought I'd add another, very different, example to this
thread.  I've been maintaining Auto-OpenCASCADE, a set of C++
libraries for solid modeling (CAD/CAE) applications.  The original
source was built with C-shell scripts (!), and with ~180 MB of source
a build took anywhere from 6 hours (Linux dual PIII 700MHz) to a
couple of days (an older hppa-risc box running HPUX).
In that state, porting to another platform meant generating ~140
csh scripts and then maintaining them.  I decided to switch to
Automake/Autoconf & Libtool instead, mostly to get the portability
benefits.
It's done (after adding features to Libtool to make it work), and
a from-scratch build on the dual P-3 Linux box now takes 4 hours,
while running "make" on a freshly built tree (where nothing needs to
be rebuilt) takes only 10 minutes.
Now I can track dependencies across the 10,000+ source files and
build only what needs to be rebuilt.  Before, a clean rebuild was
required any time a header changed, because there was no way to keep
track of which source files included any one of the thousands of
headers.
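For comparison, a per-toolkit Makefile.am under Automake/Libtool can
be as small as the sketch below (the library and file names here are
hypothetical, not the real OpenCASCADE ones); Automake generates the
header dependency tracking for every object on its own, which is what
makes incremental rebuilds like this possible:

    # Hypothetical toolkit fragment.  Automake records each object's
    # header dependencies (in .deps/*.Plo for libtool objects), so
    # touching one header rebuilds only the sources that include it.
    AM_CPPFLAGS = -I$(top_srcdir)/inc
    lib_LTLIBRARIES = libTKMath.la
    libTKMath_la_SOURCES = gp_Pnt.cxx gp_Vec.cxx gp_Trsf.cxx
    libTKMath_la_LDFLAGS = -version-info 1:0:0
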
  I would suggest that converting a project to use a single (and large)
source file with a one-line script to compile it will not "scale" well,
so you can add that to the arguments against it.
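In Makefile.am terms, that single-compilation-unit scheme (the one
discussed in the quoted mail below) would boil down to something like
this hypothetical fragment, where one hand-written file #includes every
other source file, so make sees exactly one object and one dependency
list:

    # Hypothetical single-translation-unit variant: everything.cxx
    # simply #includes all the other .cxx files, so a change anywhere
    # recompiles the whole program in one go.
    bin_PROGRAMS = frobnicate
    frobnicate_SOURCES = everything.cxx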

HTH,

Robert

Bob Proulx wrote:
> 
> Bruce Korb wrote:
> >
> > It recently dawned on me that there is a simple way to speed
> > up downloading, configuring and building packages by tweaking
> > Makefile.am.
> > [...compile everything as one single source file...]
> > It has the following effects:
> >
> > 1.  The resulting makefiles have fewer dependencies.
> >     They are smaller.  Faster to configure.
> 
> I don't see how this would change configure speed.  There are the same
> portability tests and the same number of Makefiles to process.  I
> admit to not having tried it, but the speed of configure should be
> virtually the same.
> 
> > 2.  There are fewer compilations.  They are much bigger, but
> >     still take less time overall.
> 
> It all depends upon what you are optimizing.  If you have a fast
> compiler and a slow make then this should work out to be faster.  But
> what if you have a fast make *and* a fast compiler?  Is there any
> savings in that case?
> 
> > 3.  Make runs *much* faster.  There are far fewer dependency
> >     computations to make.  This savings is *FAR* greater than the
> >     compile time wasted recompiling unmodified code.  Besides,
> >     most building is on a clean tree anyway.
> 
> Most building for whom?  The end user installing someone else's code?
> Or the developer recompiling and developing an application?
> 
> How long does 'make' run on your system when everything is up to date?
> Change three random files, how long does it take to incrementally
> rebuild the system?  How long to rebuild the entire system from
> scratch?  How long to rebuild the system as a single compilation unit?
> 
> For any project that is small enough to compile successfully as a
> single compilation unit, I postulate that make will not consume an
> inordinate amount of time.  For those that are large enough that make
> does consume significant time, I am sure they are also too large to be
> compiled as a single compilation unit.
> 
> > 4.  The executables are potentially faster because inter-procedural
> >     optimizations are easier to perform.
> 
> Agreed.  There is potential there.  How much of that is realized is up
> for debate.
> 
> > 5.  Downloads go faster.  The distribution is smaller (marginally).
> 
> To what do you attribute the smaller size?  What files are not
> distributed using your combined process that are distributed when
> building normally?
> 
> > 6.  Changing one file triggers a full recompile.  It's still faster.
> 
> Possibly yes.  It all depends upon the particular mix of factors in
> any given project.  But if I am compiling C++ instead of C the
> compilers are much slower and I doubt I could get a faster result this
> way with any mix.
> 
> > 7.  It is aesthetically displeasing.
> 
> Downright ugly.
> 
> You forgot about namespace issues.  Programs with static/private
> variables and functions may not work.  By mixing them all together you
> will break many correct programs.  You will introduce many namespace
> errors.
> 
> > 8.  It leads to the following question:
> >
> > What if "make" were dropped altogether and replaced with a simple
> > script that simply performed all the build steps all the time?
> > If it takes less time to recompile the world than to compute the
> > dependency tree, why bother with make at all?  To heck with dependencies.
> 
> If you have the right mix of files and content then, true, there is
> no benefit.  But that is not a general-purpose statement.
> 
> As a counter-statement, a large project to which I am a contributor
> takes around six minutes to recompile entirely from scratch using a
> shell script.  (I actually did that because the original build system
> is abhorrent.)  There is no make overhead in that time.  I created a
> make hierarchy using the autotools.  Using normal recursive make, an
> incremental compile changing any random three files out of the
> 2,500-odd C files in the project takes only 15 seconds to bring the
> system up to date.
> 
> Bob



