texinfo-devel
From: Gavin Smith
Subject: Re: Using Perl's cc
Date: Sat, 4 Jul 2015 13:10:07 +0100

On 4 July 2015 at 09:03, Eli Zaretskii <address@hidden> wrote:
>> Why does the MSYS installation of Perl have to be used at all?
>
> Because when a Perl script invokes /usr/bin/program, or references
> /dev/tty or /dev/stdout, a native Windows build of Perl will not be
> able to DTRT.  And because shell scripts expect Perl to produce
> Unix-style EOLs in their output.  And for umpteen other subtleties
> that can make running a build a nightmare on Windows.  The MSYS
> toolchain makes all that magically work, which is its single most
> important raison d'ĂȘtre.  Its only problem is that its development
> stagnated, so even its Perl is not 100% up to the texi2any job (it
> spits out 2 non-fatal warnings while processing the Emacs Lisp
> Reference manual).

If the development headers for MSYS Perl are available, along with the
compiler that was used to compile it, then that compiler could be used
to build the extension. From what you say, though, this seems very
unlikely.

If MSYS Perl is not used when texi2any is run, then problems like
"/dev/tty" (and the umpteen other things I don't know about) would no
longer be taken care of, unless Strawberry Perl etc. can handle them.
Could they be fixed from within texi2any itself?

Regarding this:
>And because shell scripts expect Perl to produce Unix-style EOLs in their 
>output.

this would only be a problem if shell scripts were processing the
output of texi2any, which would mostly be error messages. MSYS Perl
could still be used in most other cases, just not when texi2any is
run.

>> It's like compiling against a shared library, and then trying to use
>> it against an incompatible version of it.
>
> Yes.  But normally, a build of a package doesn't need to build
> libraries that link against the tools used during the build.  You
> produce libraries that will need to link against the products of the
> build.  And that works, because the compiler and Binutils used during
> the build are MinGW (_not_ MSYS!) compiler/Binutils, and so they use
> the MinGW runtime.

I'm going to elaborate, because I'm not sure the analogy came across.
I'm sure you understand the functional aspects; it might help to view
it from a different angle. What made me doubt it came across is that
in the analogy we aren't building libraries at all: Perl itself is the
library.

Suppose you have a C program that does #include <stdio.h> and calls
"fprintf(stderr, "This is a message\n")". It's compiled against libc,
suppose a shared library whose loadable object is "libc.so.16". In the
next version of the library, whose loadable object is "libc.so.17",
there is an incompatible change: stderr is no longer defined as "2",
but as "99". Anyone wanting to use this C program on a system where
stderr has the new value would have to recompile it against the new
"stdio.h". Using a different compiler could go wrong as well, e.g. if
it used an exotic calling convention, so that some of the processor
registers used by our program didn't get saved when "fprintf" was
called.

This example has obvious flaws, but hopefully they won't get in the
way. Like the C program in the example, the extension module includes
development headers, like "XSUB.h". To be compatible, those headers
have to match the Perl executable that is running when the code in the
extension module is executed.

> Yes, with enough care (you need to carefully segregate your PATH, and
> have scripts to switch between the two).  But there will always be
> problems.  E.g., when I build Texinfo, I must have my "normal" MinGW
> on PATH, to have the compiler found by Make, so then the "alternative"
> MinGW will not be the first one on PATH, and I think you see the
> consequences.

The configure script should look for a Perl for which a compatible C
compiler can be found, and use that compiler only when building the
extension module.

I grant that there are many complications, and I am not going to be
able to solve them all on Windows: it would only get done if you (or
someone else) had the time and motivation to make it work.

> Are there ways to produce C or C++ code from Perl, or compile a Perl
> program into native code?  After all, the goal is speeding up
> makeinfo, not using external modules, right?

AFAIK, C code generated from Perl consists of many calls into the Perl
API, which has the same problem. The final option is to statically
link the extension into Perl and distribute that along with the rest
of Texinfo, but that's not a pleasant solution.


