
Re: Compiling large number of files

From: Ralf Wildenhues
Subject: Re: Compiling large number of files
Date: Thu, 7 Oct 2004 07:33:37 +0200
User-agent: Mutt/1.4.1i

* v p wrote on Thu, Oct 07, 2004 at 01:08:54AM CEST:
> Hi Guys,
> I have a question. I am using autotools to build my static library.
> The way I do it is, I have a Makefile.am with
> lib_LIBRARIES = libmylib.a
> libmylib_a_SOURCES = xyz.c abc.c etc.
> The problem is the number of source files is huge (1659 files to be
> precise) and that's why it takes a lot of time to compile all this.
> When I run the generated makefile, I see `gcc' being invoked once for
> each source file.  Loading `gcc' into memory for every file is time
> consuming, and gcc can compile more than one file per invocation.  If
> I could make the generated makefile do this, my build process would be
> much faster.  Is there any way to achieve this with automake?
> Any inputs will be greatly appreciated.

In addition to the answers you already have:

Do you use multiple Makefile.am's (e.g., one per source directory) or
one single large Makefile.am?  Because depending on the dependency
structure of your source files, using one or the other might be faster
for a given `make' program (in case that is a limiting factor).  In any
case, try GNU Make if you are currently using a different `make'
implementation.
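
As a sketch of the per-directory layout (this assumes a switch to
libtool convenience libraries; the directory names are hypothetical,
only libmylib and the xyz.c/abc.c file names come from your mail):

```
# Top-level Makefile.am: build subdirectories first, then link the
# convenience libraries into the installed library.
SUBDIRS = sub1 sub2 .
lib_LTLIBRARIES = libmylib.la
libmylib_la_SOURCES =
libmylib_la_LIBADD = sub1/libsub1.la sub2/libsub2.la

# sub1/Makefile.am: a convenience library, never installed itself.
noinst_LTLIBRARIES = libsub1.la
libsub1_la_SOURCES = xyz.c abc.c
```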

To find out how much time is actually spent by the compiler, compare
builds done with CFLAGS=-O0 against your normal (optimized?) builds.
If the compiler turns out to be the limiting factor, and your header
file structure is amenable to it, maybe try a very new gcc (3.4 or
later) and use its precompiled header feature.

