Re: Speeding up config and make
Mon, 21 Apr 2003 23:32:38 -0600
Bruce Korb wrote:
> It recently dawned on me that there is a simple way to speed
> up downloading, configuring and building packages by tweaking
> [...compile everything as one single source file...]
> It has the following effects:
> 1. The resulting makefiles have fewer dependencies.
> They are smaller. Faster to configure.
I don't see how this would change configure speed. There are the same
portability tests and the same number of Makefiles to process. I
admit to not trying it, but the speed of configure should be virtually
unchanged.
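For concreteness, the scheme being discussed amounts to something like
the following sketch; the file names and flags here are invented:

    # conventional build: one object per source file, linked at the end
    cc -O2 -c foo.c bar.c baz.c
    cc -o prog foo.o bar.o baz.o

    # single-compilation-unit build: concatenate and compile once
    cat foo.c bar.c baz.c > all.c
    cc -O2 -o prog all.c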
> 2. There are fewer compilations. They are much bigger, but
> still take less time overall.
It all depends upon what you are optimizing. If you have a fast
compiler and a slow make then this should work out to be faster. But
what if you have a fast make *and* a fast compiler? Is there any
savings in that case?
> 3. Make runs *much* faster. There are far fewer dependency
> computations to make. This savings is *FAR* greater than the
> compile time wasted recompiling unmodified code. Besides,
> most building is on a clean tree anyway.
Most building for whom? The end user installing someone else's code?
Or the developer recompiling and developing an application?
How long does 'make' run on your system when everything is up to date?
If you change three random files, how long does it take to incrementally
rebuild the system? How long to rebuild the entire system from
scratch? How long to rebuild the system as a single compilation unit?
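To make that comparison concrete, I mean measurements roughly like
these (the paths and the build-all.sh script are only illustrative):

    time make                        # everything up to date: pure make overhead
    touch src/a.c src/b.c src/c.c    # pretend three files changed
    time make                        # incremental rebuild
    make clean && time make          # full rebuild under make
    time sh build-all.sh             # full rebuild as a single compilation unit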
For any project that is small enough to compile successfully as a
single compilation unit, I postulate that make will not consume an
inordinate amount of time. For those large enough that make does
consume significant time, I am sure that they are also too large to be
compiled as a single compilation unit.
> 4. The executables are potentially faster because inter-procedural
> optimizations are easier to perform.
Agreed. There is potential there. How much of that is realized is up
to the compiler.
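To illustrate the point with a contrived pair of files (and absent any
link-time optimization), the compiler can only inline across what used
to be file boundaries when it sees everything in one translation unit:

    cat > lib.c <<'EOF'
    int add_one(int x) { return x + 1; }
    EOF
    cat > main.c <<'EOF'
    int add_one(int x);
    int main(void) { return add_one(41); }
    EOF
    cc -O2 -c lib.c main.c     # add_one() compiled blind, no inlining into main()
    cat lib.c main.c > all.c
    cc -O2 -c all.c            # the body is visible, so the call can be inlined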
> 5. Downloads go faster. The distribution is smaller (marginally).
To what do you attribute the smaller size? What files are not
distributed with your combined process that are distributed with the
normal one?
> 6. Changing one file triggers a full recompile. It's still faster.
Possibly yes. It all depends upon the particular mix of factors in
any given project. But if I am compiling C++ instead of C, the
compilers are much slower, and I doubt I could get a faster result this
way with any mix.
> 7. It is aesthetically displeasing.
You forgot about namespace issues. Programs with static/private
variables and functions may not work. By mixing them all together you
will break many correct programs. You will introduce many namespace
collisions.
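A minimal demonstration of the collision, using invented file names:

    cat > a.c <<'EOF'
    static int helper(void) { return 1; }   /* private to a.c */
    int a_value(void) { return helper(); }
    EOF
    cat > b.c <<'EOF'
    static int helper(void) { return 2; }   /* private to b.c, same name */
    int b_value(void) { return helper(); }
    EOF
    cc -c a.c b.c        # fine: each helper() has internal linkage
    cat a.c b.c > all.c
    cc -c all.c          # error: redefinition of 'helper'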
> 8. It leads to the following question:
> What if "make" were dropped altogether and replaced with a simple
> script that simply performed all the build steps all the time?
> If it takes less time to recompile the world than compute the
> dependency tree, why bother with make at all? To heck with dependencies.
If you have the right mix of files and content then, true, there is no
benefit. But that is not a general-purpose statement.
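A minimal sketch of such a rebuild-everything-every-time script, for an
invented layout of one program built from src/*.c:

    #!/bin/sh -e
    # no dependency tracking: compile every file on every run
    rm -rf build
    mkdir build
    for f in src/*.c; do
        cc -O2 -c "$f" -o "build/$(basename "$f" .c).o"
    done
    cc -o prog build/*.o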
As a counter statement, a large project to which I am a contributor
takes around six minutes to recompile entirely from scratch using a
shell script. (I actually did that because the original build system
is abhorrent.) There is no make overhead in that time. I also created
a make hierarchy using the autotools. Using normal recursive make, an
incremental compile changing any random three files out of the 2,500-odd
C files in the project takes only 15 seconds to bring the system up to
date.