bug-binutils

Re: BUG elf32-i386 R_386_PC32 done wrong


From: doctor electron
Subject: Re: BUG elf32-i386 R_386_PC32 done wrong
Date: Sat, 24 Jun 2006 12:31:13 -0500

Long, long ago, Eric Botcazou, a life form in far off space,
emitted:

>Interesting.  Then your next task is to convince the dumb guys at Sun too 
>because their toolchain behaves exactly like the Linux toolchain...

Thanks for this info, Eric.  

As you may have seen in Ian's thoughtful reply, I don't think he
gets the point (perhaps due to my failure to communicate it well):

ld can get the -4 on its own rather than read it from "typical"
input files, and thereby still conform to the rel reloc formula
*and* remove the requirement that .o files contain -4 at all
those locations -- a requirement which must be a continuing
source of shame and embarrassment to writers of existing Linux
compilers (nasm, C, etc.).  Note that if ld coughs up its own -4
(per the formula I posted), all the existing .o/.so files would
still link: the -4 is a *constant* in the hardware-based formula,
and its presence in those .o/.so files should properly be ignored
[so correct input files with any arbitrary value (e.g., 0) in
those locations would also link].
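
To be completely concrete, here is a rough sketch of my own --
plain C, not actual bfd/ld code, and the names are mine -- of the
two computations at issue; S, A and P are the usual ABI terms for
symbol value, addend and place:

    #include <stdint.h>

    /* Rough sketch only -- not actual bfd/ld code.
       S     = value of the symbol named in the rel reloc entry
       P     = address of the 4-byte field being patched
       field = whatever bytes the compiler left in that field     */

    /* What ld does today for a REL-style R_386_PC32: the addend A
       is read out of the object file's 4-byte field (typically -4). */
    uint32_t pc32_as_done_today(uint32_t S, uint32_t P, int32_t field)
    {
        int32_t A = field;   /* addend comes from the input file    */
        return S + A - P;    /* S + A - P, the i386 ELF computation */
    }

    /* What I am proposing: the -4 is a constant of the hardware --
       the CPU adds the 32-bit displacement to the address of the
       *next* instruction, i.e. P + 4 -- so the linker can supply it
       itself and ignore whatever sits in the place-holder field.   */
    uint32_t pc32_as_proposed(uint32_t S, uint32_t P)
    {
        return S - (P + 4);  /* identical result whenever field == -4 */
    }

Whenever the place-holder already holds -4, the two functions
return the same displacement -- which is exactly why all the
existing .o/.so files would still link.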

Think about it: this is not so much a matter of being right or
wrong, smart or dumb (ld covers a huge range of architectures and
formats; clearly its writers are very "smart").  ld's problem is
that it must get the constant -4 from input files when, in fact,
it does not need to be in the input files at all (the compiler
producing them only needs to allocate the "place-holder" 4 bytes
addressed in the rel reloc entry).
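
To illustrate at the byte level (my own example; these bytes are
typical of what gas emits today for a call to an external symbol,
not pulled from any particular file):

    e8 fc ff ff ff    call sym    <- today: the place-holder holds -4 (fc ff ff ff)
    e8 00 00 00 00    call sym    <- all a compiler should need to emit if ld supplied the -4

The rel reloc entry already tells ld exactly where that 4-byte
place-holder sits; nothing in the place-holder's *contents* should
be needed to do the job.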

Clearly, ld is using the correct formula; the issue is *where*
it gets the constant -4.  If it did not try to read this from
the input file, then all object files would be correctly
relocated -- not just the ones that have -4 in them.  I argue
that this would be beneficial to Linux (Sun would be forced to
follow suit with the same quality upgrade in its flagship OS
linker).

Does it not seem lame to require compiler writers to insert
gibberish -4s into their object file outputs, for the sole reason
that ld can't cough up a -4 all by itself?

If, as Ian pointed out in his second reply (thank you), this is
a very pervasive problem -- requiring all inputs to have -4
constants inserted by third-party (compiler) programs -- and now
you add Sun too, then this is a big, newsworthy story, involving
issues of the "image" or "appearance" of competence and
integrity in the Linux OS.

Notice that Ian gave no rationale beyond what I might call "this
is the way we do it" and "some ABI document covers us", as if any
document changes how existing processors work.  Nor did Ian even
challenge my formula (which is really the formula of the
microprocessor makers).  Rather, he seemed to me to rely on "we
all say this is it, so it must be it", instead of addressing the
absurdity of reading the constant -4 from input files in the
first place -- and thereby requiring third parties (compiler
writers) to, in essence, corrupt their output files by
mindlessly inserting this -4 constant at rel reloc addresses.

SUMMARY:  Your Sun data only confirms my analysis and enlarges
the scope of this big story.  When and if this becomes public,
so to speak, what are the proponents of requiring the -4
constants in object files going to say?

From what we see in this thread of posts, will they have anything
credible that makes sense -- other than "some ABI doc says it's
OK"?  The enquiring public will want more substance than that.

Me, I think we on this end can fix ld.  So if Linux developers
don't want to fix this, that's fine; the loss, I think, is
reduced Linux credibility.

Worst of all, your compiler developers are hung out to dry:  they
are "innocent", so to speak, but are forced by the ld developers
to take the hit, because they have no rational explanation for
the -4's in their output files other than that ld can't do rel
relocs all by itself.

The implications of an article -- "Scandal in Linux Cyberspace"
-- documenting this "big story" are many.  Why was this not
fixed years ago?  How could such an ill-conceived design (an
object file must carry a constant on behalf of the linker) ever
have become a "canon of Linux"?  Did these developers not realize
that the linker's outputs run only because compilers dutifully
supply the -4 that the linker, seemingly unaware of the correct
formula, ought to produce on its own rather than read from object
files?  It seems that on historic development day 2 this constant
would have been "moved" into the linker code -- but that did not
happen.  Why?

Why do Linux developers appear to want to keep correct .o files
(with no -4 constants in them) out of the Linux environment?  Is
the intent to keep quality software out of the Linux environment?

The larger computing world outside Linux/Sun, etc., also has
smart people producing useful object files -- in business, math,
science, you name it.  But ld does not know how to link them!

What message are you sending?  

Is the message that objcopy writers must now also go in and
insert the -4's into .text sections when converting from .obj to
.o?  All because ld can't find its own -4?  [Free gift: cut and
paste any -4 in this post and put it into the ld source code!]

In the absence of rational, credible answers to any of the above
questions, it's also an image problem:  is this prototypical of
Linux design practices -- requiring Peter (compiler writers) to
do the work of Paul (linker writers)?

Does Linux want a "first-class in quality" reputation?  If so,
let's see it.

Cordially, Jim




