
Re: dist-xz compression level


From: Reuben Thomas
Subject: Re: dist-xz compression level
Date: Mon, 12 Apr 2010 12:05:52 +0100

On 11 April 2010 23:37, Bob Friesenhahn <address@hidden> wrote:
> Yes, compression is useful.  However, pushing the algorithm close to
> its limit incurs costs as well.  For many packages, getting 99% of the
> maximum compression in half the time is a worthwhile tradeoff.  This
> is similar to the decision to make -O2 the default GCC optimization
> level rather than -O3.
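
(As an aside on the compression half of the analogy: whatever default
is chosen, the level is easy to override per release without touching
the rules, since xz respects the XZ_OPT environment variable. Assuming
a dist-xz rule that pipes the tarball through plain "xz", a rough
sketch:

    # xz's own default is -6; higher levels buy a few percent of
    # size for a large increase in CPU time and encoder memory.
    make dist-xz XZ_OPT=-6    # cheaper, slightly larger tarball
    make dist-xz XZ_OPT=-9e   # near the limit: slow, marginally smaller

so release managers who care about either extreme can tune it
themselves.)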

-O2 vs -O3 is a rather different case, as it's not a straightforward
time-space tradeoff. The last time I checked, -O3 still had significant
bugs, so it was worth enabling only for time-critical code, where the
extra testing needed to show that none of those bugs had been triggered
could be justified. Even assuming the bugs have since been fixed,
there's still the problem that -O3 often produces code that takes fewer
clock cycles to execute but is bigger than -O2's, so it remains worth
using only for critical regions. Indeed, when Apple switched to Intel
they went one step further: their system profiling showed that it was
better to use -O2 only for the kernel and system libraries, and -Os
(optimize for space) for everything else, because for application code
cache footprint mattered more than raw speed. I can find little
published research for GNU/Linux systems, beyond one supporting
instance in which someone tried -O2 vs -Os for the kernel and found -O2
to be slightly better.
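
Anyone wanting to repeat the experiment for their own code can do so
cheaply. A rough sketch, where tight_loop.c and bench.c are
hypothetical stand-ins for a profiled hot spot and a driver that
exercises it:

    # Space side: compare object sizes under the two flags.
    gcc -O2 -c tight_loop.c -o tight_loop-O2.o
    gcc -Os -c tight_loop.c -o tight_loop-Os.o
    size tight_loop-O2.o tight_loop-Os.o

    # Time side: a crude wall-clock comparison.
    gcc -O2 tight_loop.c bench.c -o bench-O2 && time ./bench-O2
    gcc -Os tight_loop.c bench.c -o bench-Os && time ./bench-Os

Of course, object size is only a proxy for cache behaviour, and a
microbenchmark is only a proxy for whole-system performance, which is
presumably why Apple profiled the whole system.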

-- 
http://rrt.sc3d.org



