
Re: [Bug-gne]External Servers and Illegal/Extreme Content


From: Imran Ghory
Subject: Re: [Bug-gne]External Servers and Illegal/Extreme Content
Date: Sun, 18 Feb 2001 12:07:20 -0000

On 18 Feb 2001, at 3:40, Tom Chance wrote:

> > Define how to moderate without causing the Encyclopedia to be
> > under central control?
> 
> Look, we can't just let every article submitted enter
> the resource, we've decided that already, so we don't
> get binary bombs and adverts. The most open moderation
> system proposed was mine and Rob's, which is where any
> article only needs a few "yes" votes to get in;
> there's no majority needed or anything similar... no
> "reject" votes. 

That's open to total abuse though; it would be trivial to bypass such a 
checking system. And it would still rely on the owner of the server not 
wiping anything they don't like.

> > > Just so long as nobody
> > > has control over that index (i.e. it just hosts a
> > > reference to every article that is stored on some
> > > GNE-related server) then it can remain completely
> > > "free", and is easy to maintain.
> > 
> > And how would you propose to do that?
> 
> Stick it on a computer and just leave the system
> automated; don't let somebody sneak through it
> deleting references.

Unless the system is physically secure (i.e. in a lead box under the 
Atlantic), that's going to be impossible to do. Game networking 
programmers have been trying for years to find a method of 
ensuring data integrity on insecure computers.

> > If the author wanted to submit such an article they would need to
> > find a server which would allow it; say they found such a server in
> > Portugal, that server would host it.
> 
> That would be an alternative to my solution of making
> the moderators designate certain articles as
> "controversial/illegal". You're just putting the honus
> on the user rather than the moderator. 

If we move it to the user, we (the main GNEP project) would no 
longer be legally responsible for accepting the data. (IANAL)

> > Now when someone uses a front end to perform a search, the front
> > end can query the main GNEP server (or mirrors of it) and, if it
> > wants to, it can also query the server in Portugal; it could collate
> > the results and pass them back to the user.
> 
> Don't you think that might be a bit slow though? 

Meta-search engines have been doing it successfully for years.
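
To make the collation concrete, here is a rough sketch of what the 
front end could do, assuming each index server answers a query with 
one "id|title|url" line per article (those URLs and that result format 
are invented here purely for illustration):

#!/usr/bin/perl -w
# Query several index servers and collate the results, meta-search style.
use strict;
use LWP::Simple qw(get);
use URI::Escape qw(uri_escape);

my @servers = (
    'http://gne.example.org/search.cgi',   # hypothetical main GNEP index
    'http://gne.example.pt/search.cgi',    # hypothetical external server
);

my $query = shift @ARGV or die "usage: $0 <search term>\n";

my %seen;    # drop duplicate article ids while collating
for my $server (@servers) {
    my $body = get($server . '?q=' . uri_escape($query));
    next unless defined $body;             # a dead server just drops out
    for my $line (split /\n/, $body) {
        my ($id, $title, $url) = split /\|/, $line, 3;
        next if !defined $url or $seen{$id}++;
        print "$title\t$url\n";
    }
}

The requests above run one after another to keep the sketch short; a 
real front end would fire them in parallel, so the slowest server, not 
the number of servers, sets the response time.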

> So
> when the user searches the frontend/classification he
> is in, he could select to just search his mirror, or
> all the mirrors (which would let him see more varied
> articles). So the perl script (or php or whatever)
> would parse the index on his mirror, return any
> results into a temporary file, then parse every other
> index (there could be quite a few) to find more
> results, 

How would parsing several indexes take any longer than parsing a 
single longer index?

> ensuring it doesn't double return the same
> article

Each server should keep its index of mirrored data and its 
index of original data separate; that way we could avoid duplication.
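
For instance, each server could keep two flat files, say 
index-original.txt and index-mirrored.txt (file names invented here), 
and only search the mirrored one when explicitly asked, so a front end 
collating answers from many servers never sees a mirrored copy twice:

#!/usr/bin/perl -w
# Search this server's indexes; by default only articles that
# originated here are returned.
use strict;

sub search_index {
    my ($file, $query) = @_;
    my @hits;
    open my $fh, '<', $file or return ();
    while (my $line = <$fh>) {
        chomp $line;
        push @hits, $line if $line =~ /\Q$query\E/i;
    }
    close $fh;
    return @hits;
}

my ($query, $scope) = @ARGV;
die "usage: $0 <search term> [all]\n" unless defined $query;
$scope ||= 'original';

my @results = search_index('index-original.txt', $query);
push @results, search_index('index-mirrored.txt', $query)
    if $scope eq 'all';

print "$_\n" for @results;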

> Why not just have the index on
> every mirror be identical, so when you search, you
> just search your local index,

Why not have the front end mirror all of the indexes locally, so that 
it can just search its local index?
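
As a sketch, a cron job on the front end could refresh local copies of 
every server's index (the URLs and file names are again only 
illustrative), and searches would then touch nothing but local files:

#!/usr/bin/perl -w
# Mirror every server's index to a local file.  LWP::Simple's mirror()
# only re-downloads an index when it has changed on the server.
use strict;
use LWP::Simple qw(mirror is_success);

my %indexes = (
    'gnep-main' => 'http://gne.example.org/index-original.txt',
    'portugal'  => 'http://gne.example.pt/index-original.txt',
);

while (my ($name, $url) = each %indexes) {
    my $status = mirror($url, "local-index-$name.txt");
    warn "could not mirror $name (HTTP $status)\n"
        unless is_success($status) or $status == 304;   # 304 = unchanged
}

That gives much the same effect as Tom's "identical index on every 
mirror", only built by each front end pulling the indexes rather than 
by a central server pushing them out.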

> not centralised despite there being a
> "central" server, and allows us to serve materials
> across the world.

But that also creates a single point of failure.

Imran


