From: Yavor Doganov
Subject: Re: [Savannah-hackers-public] robots.txt disallows all spiders for mailing lists
Date: Sun, 05 Jul 2009 22:47:23 +0300
User-agent: Wanderlust/2.15.5 (Almost Unreal) SEMI/1.14.6 (Maruoka) FLIM/1.14.9 (Gojō) APEL/10.7 Emacs/22.3 (i486-pc-linux-gnu) MULE/5.0 (SAKAKI)

Nicodemo Alvaro wrote:
> https://savannah.gnu.org/maintenance/MailSystem

Thanks, I was sure the reasons were purely technical, although I
didn't remember the details.

Noah Slater wrote:
> I'm not sure if this is a joke, but I struggle to take it seriously.

Not a joke.
I avoid using Google, Yahoo!, etc. search engines and TTBOMK the GNU
project does not have a mission to make their life easier.

If I have to read an old discussion from the time before I was
subscribed to that particular list, I download it in mbox format, grep
through it to find the right file, and then use my MUA's facilities to
search and read the interesting messages.  This has the extra
advantage of following the discussion in the natural order it
happened, and (if necessary) replying without breaking threads.

> Why don't we do the same for the main GNU site, 

The main GNU site is different -- we want to be as convenient as
possible for everyone to discover GNU and learn about the free
software movement.  Which is why some webmasters go through the pains
of supporting proprietary browsers.  But nobody is obliged to do so,
fortunately.
