l4-hurd
Re: awareness + flexibility + security


From: Bas Wijnen
Subject: Re: awareness + flexibility + security
Date: Mon, 14 Nov 2005 09:32:16 +0100
User-agent: Mutt/1.5.11

On Mon, Nov 14, 2005 at 10:28:16AM +0900, Andre Caldas wrote:
> Marcus Brinkmann wrote:
> >Here is now my idea how this loophole could effectively be closed: If
> >a web service provider wants to use GPL software with modifications,
> >they have to release the source code.  Once the source code is
> >released, the FSF can offer as a service to sign binaries of the
> >source code releases.  Using these signatures, and the remote
> >attestion features of TC, the _users_ of these remote web services
> >could _verify_ that the remote server only runs free software in
> >implementing this service.

This makes sense.  I would certainly consider this a good use of it.
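The scheme quoted above (signer vouches for binaries built from released source; users check a remote service's reported binary against that list) can be sketched in a few lines. This is only an illustration of the verification step, assuming Python; the hashes, build names, and the idea of a hard-coded approved set are all made up for the example, and a real system would fetch a cryptographically signed release file and rely on the TC hardware's attestation quote rather than plain hashes.

```python
# Hypothetical sketch of the verification step: a signer (standing in
# here for the FSF) publishes hashes of binaries built from released
# source, and a user checks a remote service's reported binary hash
# against that list.  All names and data below are illustrative.
import hashlib

# Hashes the signer has vouched for.  In practice these would come from
# a signed release file, not a hard-coded set.
approved_hashes = {
    hashlib.sha256(b"free-software-build-1.0").hexdigest(),
    hashlib.sha256(b"free-software-build-1.1").hexdigest(),
}

def is_approved(binary: bytes) -> bool:
    """Return True if this exact binary matches a vouched-for build."""
    return hashlib.sha256(binary).hexdigest() in approved_hashes

# A remote attestation quote would report the hash of what is actually
# running; here we simulate it with the binary contents themselves.
print(is_approved(b"free-software-build-1.0"))       # True
print(is_approved(b"modified-proprietary-build"))    # False
```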

> >Similar services could be provided for embedded devices.  We know as a
> >fact that today this is a market segment where the GPL is frequently
> >violated.

This may be even more significant.  On the other hand, I'm not so sure whether
they will actually use signed software.  If they want to violate the GPL, they
still can do so...

> >I would probably not go so far as requiring such a signing process in
> >the license.  But it would put an enormous public pressure on web
> >services.  If they can prove for a small sum that they run only free
> >software, why would they not do so?

Because there will be hardly any public pressure.  We all say that people
should use free software, yet even though free PDF readers are available,
the only widely known one is Acrobat Reader (which has, I believe, since been
renamed, but anyway).  Free software simply isn't a big issue for most people.

> Because they value their privacy!? Something like the way you say you 
> value your freedom. "If it is for free, and technologically better, 
> why don't you take it?"

This is not about things for customers, it's about things for service
providers.  I know in some cases the line between them is blurry.

Say I provide a service to people who don't know you, for example web
hosting.  Assume that you want to use my service for your company, and that
you want to put some internal secret company data in a protected section of
your site.  You will want to make sure that unauthorized people cannot access
that data, and that includes me.  So if I cannot prove to you that I am unable
to access it, you will not do business with me, or at least not put that data
on my servers.  Then we both lose.  In this case, the fact that my customers
know I run only trusted free software matters more than always having the
newest, technologically superior system.  Bug fixes must of course be
installed, but I'm sure that can be done as long as they are signed.
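The check the customer in this scenario might run can be sketched as follows. This is a toy model under stated assumptions: the attestation report is reduced to a plain list of component hashes, and the "signed free-software list" is a hard-coded set; the component names are invented for the example. It shows why signed bug fixes fit naturally into the scheme: adding an approved patch hash to the trusted set is all that is needed.

```python
# Toy version of the check a customer might run against a hosting
# provider: the attestation report lists the hashes of every component
# the server runs, and the customer accepts the service only if all of
# them appear in the trusted free-software list.  Names are made up.
import hashlib

def h(data: bytes) -> str:
    """SHA-256 hex digest, standing in for a real build hash."""
    return hashlib.sha256(data).hexdigest()

# Trusted list: base system plus a signed bug-fix release.
trusted = {h(b"kernel-2.6"), h(b"httpd-1.3"), h(b"bugfix-patch-7")}

def accept_provider(report: list[str]) -> bool:
    """Accept only if every reported component hash is trusted."""
    return all(component in trusted for component in report)

print(accept_provider([h(b"kernel-2.6"), h(b"httpd-1.3")]))         # True
print(accept_provider([h(b"kernel-2.6"), h(b"backdoored-httpd")]))  # False
```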

However, if we are friends and I happen to have a computer permanently on the
internet, while you don't, I can also host your website.  In that case, I may
just install every new feature that is released, or that I wrote myself.  No
need to do any verification in that case.

> >Also, I think it makes a lot of sense to look at TC and DRM
> >separately.  TC does effectively support as simple operations as
> >harddrive encryption on laptops.  Yes, these can be added in software
> >as well.  But I think this is undoubtly a positive use, not a negative
> >one.  Don't you agree?

I do.

> >Maybe I am making a basic mistake here, but it isn't _obvious_ to me
> >that the only uses are bad.

No, after your and others' responses I'm convinced that there are good things
about it, and that we can't stop DRM anyway, except by making our system
completely insecure.  That is not something I would want to do.

> I don't think this should be a matter of: 'Is there any good uses?'
> Probably people can come up with good uses for it easily. The problem 
> here is:
>       Is it possible to separate the good and bad uses?

I asked this question ("Are there any good uses?"), because I couldn't think
of any.  I changed my mind, mostly because we can't enforce that people don't
implement the bad stuff themselves.  So the bad uses _will_ be possible, no
matter what we do.  Therefore, since there are also good uses, I think it is a
good idea to support the technology.

> Is it a trade? Do I have to agree with the 'bad uses' in order to have 
> the 'good uses'? Here is an example:

Well, "agree" may be a strong word.  But if we want a secure system (and I at
least very much do), then those bad uses will be either possible, or easily
implementable.  So we cannot stop them.  Do we have to deprive ourselves of
the good uses just to ensure that the bad actors at least have to do some work
themselves?  I don't think so.  They will do that work, and then the bad uses
are possible while the good ones aren't.  That's the worst of both worlds.

> Thinking about the above, I just realized that there are much simpler 
> examples. The 'freedom to kill someone' and the 'right to live'. Is 
> this freedom really important?

I very much agree with your reasoning here.  However, the problem is that even
if we forbid people to "show their voting card", we cannot enforce it.

Well, perhaps we could if we put it in GPL v3, but it would be very tricky to
do so, because we would be claiming that the freedom to change and
redistribute the code is limited depending on what you write.  I think that
would be a bad direction for the GPL.

Thanks,
Bas

-- 
I encourage people to send encrypted e-mail (see http://www.gnupg.org).
If you have problems reading my e-mail, use a better reader.
Please send the central message of e-mails as plain text
   in the message body, not as HTML and definitely not as MS Word.
Please do not use the MS Word format for attachments either.
For more information, see http://129.125.47.90/e-mail.html

Attachment: signature.asc
Description: Digital signature

