guile-devel

Re: Verifying Toolchain Semantics


From: Taylan Ulrich Bayirli/Kammer
Subject: Re: Verifying Toolchain Semantics
Date: Fri, 03 Oct 2014 14:56:53 +0200
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/24.3 (gnu/linux)

William ML Leslie <address@hidden> writes:

> Oh, interesting point. Maybe we should define PDF as an abstract
> semantics that we can convert into a wide range of equivalent document
> layout languages? If the attacker can't tell exactly what xsl-fo or
> dsssl the tool will output, or what software you're using to render
> the result, it will magically make it more difficult to attack!

Come on, that snark is undue. :-)

It won't make it "magically" more difficult, it will make it actually
more difficult.  Though I think your analogy is off: PDF is already an
abstract semantics, like any formal language spec.  The question is
whether an implementation realizes precisely those semantics, or adds
"oh, and insert a backdoor" to some pieces of code in that language, "as
an addition to the language semantics."  This augmentation of the
semantics needn't be apparent in the source code of the implementation,
since the language it's written in might in turn be compiled by a
compiler that "augments" the semantics of the language it compiles, and
that in turn needn't be apparent in that compiler's source ... until you
arrive at the long-since vanished initial source code that planted the
virus.

That's just a reiteration of the Thompson hack.  It relies on us using a
single lineage of compilers/software.  (Or a couple of lineages could be
compromised separately; that's still plausible if it's plausible for
one.)  At every layer, the semantics need only be compromised enough to
propagate the compromise through one concrete code base.  In your PDF
analogy, the solution would be to write a copious number of PDF
implementations.  Or for C, to implement a copious number of C
compilers.  That is impractical because C is complex.  What might be
practical is to write one new C compiler (guaranteed clean, since it's
new), and use it to verify GCC[*].  What's more useful in the long term
is to define a much simpler language, and base everything on that
instead of on C.  Then we can verify our toolchain more often, since we
needn't write a whole freaking C compiler every time. :-)

[*] Compile version X of GCC with the Clean C Compiler™, compile version
X of GCC with some existing GCC binary, then compile some GCC with both
GCC-X binaries, one of which is guaranteed clean, and see whether they
behave the same while compiling GCC.  Thus we get a guaranteed-clean
GCC, and we learn whether the other one was compromised in the first
place, or whether we wasted a lot of time.
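The check in that footnote can be illustrated with a toy model (a sketch
in the spirit of Wheeler's "diverse double-compiling", not real build
machinery): compilers are modeled as functions from source text to a
"binary" string, the names `clean_cc`, `trojaned_cc` and the
`"gcc-source"` marker are all hypothetical, and the trojan is reduced to
a tag that a compromised compiler re-inserts when it recognizes the
compiler's own source:

```python
# Toy model of the footnote's double-compilation check. All names are
# hypothetical; a "binary" is just a string describing what was compiled.

def clean_cc(source):
    # A trusted, freshly written compiler: faithful translation.
    return "bin(" + source + ")"

def trojaned_cc(source):
    # A Thompson-style compromised compiler: behaves normally, except it
    # re-inserts the trojan whenever it compiles the compiler's own source.
    if source == "gcc-source":
        return "bin(gcc-source+trojan)"
    return "bin(" + source + ")"

def run_compiler(binary, source):
    # Running a compiled compiler binary on some source.  A binary carrying
    # the trojan propagates it when compiling the compiler source.
    if "trojan" in binary and source == "gcc-source":
        return "bin(gcc-source+trojan)"
    return "bin(" + source + ")"

def ddc_check(suspect_cc, trusted_cc, compiler_source):
    # Stage 1: compile the compiler source with both compilers.
    stage1_suspect = suspect_cc(compiler_source)
    stage1_trusted = trusted_cc(compiler_source)
    # Stage 2: use each stage-1 binary to compile the same source again.
    stage2_suspect = run_compiler(stage1_suspect, compiler_source)
    stage2_trusted = run_compiler(stage1_trusted, compiler_source)
    # If the compiler source is honest, the two stage-2 binaries must
    # behave identically; a mismatch exposes a self-propagating trojan.
    return stage2_suspect == stage2_trusted

print(ddc_check(trojaned_cc, clean_cc, "gcc-source"))  # False: compromised
print(ddc_check(clean_cc, clean_cc, "gcc-source"))     # True: both clean
```

The point of the comparison is that the trojan can only hide in
binaries, not in the honest source: forcing the suspect lineage through
a stage compiled by an independent compiler makes the hidden
augmentation visible as a behavioral difference.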

I think Ian is doing himself a disservice by using a lot of fancy
abstract terms and, if I may be blunt, acting in somewhat tinfoil-hat
ways, when in fact the underlying ideas are relatively sane and simple.
It's all there if you read a little between the lines, and a little
around the parts written in all-caps, as in:

http://lists.gnu.org/archive/html/guile-devel/2014-09/msg00035.html

Taylan


