Re: glibc segfault on "special" long double values is _ok_!?

From: James Youngman
Subject: Re: glibc segfault on "special" long double values is _ok_!?
Date: Wed, 6 Jun 2007 22:44:17 +0100

On 6/6/07, Andreas Schwab <address@hidden> wrote:
Jim Meyering <address@hidden> writes:

> I'm interested, because I don't want my applications to segfault on such
> inputs.  Sure it may look a little far-fetched, but I think it's not.
> Imagine such a bit pattern being injected into a network data stream
> that is then printed as a long double.  Just printing an arbitrary
> "long double" should not make a server vulnerable to a DoS attack.

In which way is this different from passing NULL to strlen?

If I read a stream of data (from a file, a network socket, etc.) which
is supposed to contain valid floating-point data, it is reasonable for
a program to print it, especially if the printing is for diagnostic
logging.  For example, when a program seems to behave oddly with a
certain input, it is reasonable for the invoking user to turn up the
verbosity of diagnostic messages.  So there will be some statements in
applications which preferentially invoke printf-family functions on
dubious data.

What the program is doing here is under the control of the program
author.  What the data stream contains is under the control of the
user, modulo lossage caused by user error and faulty hardware.  Faulty
hardware is perfectly capable of injecting random bit corruption into
(for example) a TCP data stream undetected -- that is, undetected by
both the TCP checksum and the CRCs on the Ethernet frames.

Summary - the application author can reasonably write their program
under the assumption that a data set is of a certain type, but the
world is a sufficiently complex place that sometimes the data can be
corrupted or of an unexpected nature.

Contrariwise, the same consideration doesn't really apply to passing
NULL to strlen.  For that to happen, the application essentially needs
to contain a logic error (for example a code path in which a pointer
is used before being assigned to the address of an allocated buffer).
That's a different type of failure; it's a problem within the program
itself, not with the environment in which the program is operating.

Robust programs need to be able to detect and report problems with
their environment and their input.

