
Re: Compiling large number of files

From: Bob Friesenhahn
Subject: Re: Compiling large number of files
Date: Wed, 6 Oct 2004 18:26:57 -0500 (CDT)

On Wed, 6 Oct 2004, v p wrote:

Hi Guys,
I have a question. I am using autotools to build my static library.
The way I do it is, I have a Makefile.am with:
lib_LIBRARIES = libmylib.a
libmylib_a_SOURCES = xyz.c abc.c etc.

The problem is that the number of source files is huge (1659 files, to be
precise), and that is why it takes so long to compile everything.
When I run the generated Makefile, I see 'gcc' being invoked once for each
source file. Loading 'gcc' into memory for every file is time-consuming,
and gcc can compile more than one file at a time. If I could make the
generated Makefile do this, my build would be much faster. Is there any way
to achieve this using automake?

I suggest that you do some testing by hand to see what performance benefit is really gained from passing multiple source files to gcc.

Most modern Unix systems are exceedingly fast at starting large programs like GCC, and once it has been executed, the executable and its libraries are cached in RAM, so subsequent executions are faster. Here is an example:

% time gcc --version > /dev/null
gcc --version > /dev/null  0.00s user 0.00s system 0% cpu 0.004 total

If builds are much slower under Automake, the problem is most likely that the scripting associated with automatic dependency generation is slowing down the build. You can configure with --disable-dependency-tracking, but then you won't have dependency information (which developers normally need).
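As a sketch, disabling dependency tracking is just an option passed at configure time (the exact invocation depends on how your project is set up):

```shell
# Reconfigure without automatic dependency tracking.
# One-shot builds get faster, but make will no longer notice header changes.
./configure --disable-dependency-tracking
make
```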

Bob Friesenhahn

