From: Noah Slater
Subject: Re: [Savannah-hackers-public] robots.txt disallows all spiders for mailing lists
Date: Sun, 5 Jul 2009 23:23:28 +0100
User-agent: Mutt/1.5.18 (2008-05-17)

On Mon, Jul 06, 2009 at 12:14:47AM +0200, Sylvain Beucler wrote:
> Yes, we have enough experience fighting a monopoly (Microsoft)
> that encouraging One Unique Search Engine sounds like heresy.
> Especially that one.

Aye, I would expand it to at least a collection of well-known ones. Heh.

> IMHO this is a suboptimal solution, as our free software projects are
> losing a decent advertising and helpdesk vector that way.  Since we
> now have newer hardware for lists.gnu.org, it would make sense to open
> it up again to search engines, possibly with a crawl-delay parameter
> if there's a problem.

For reference, I added this in my email to address@hidden:

This makes sense, but I would like to ask that you add an exception for a few
of the big search engines like Google, Yahoo, and MSN. All three support the
crawl-delay parameter, which is mentioned here:

  http://en.wikipedia.org/wiki/Robots_exclusion_standard
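
For illustration, the exception could look something like the sketch below.
The user-agent tokens and the delay value are my assumptions, not tested
configuration, and Google reads its delay from Webmaster Tools rather than
from robots.txt (see the link below):

  # Named crawlers: allowed, throttled where Crawl-delay is honoured
  User-agent: Slurp
  Crawl-delay: 10
  Disallow:

  User-agent: msnbot
  Crawl-delay: 10
  Disallow:

  User-agent: Googlebot
  Disallow:

  # All other spiders stay blocked
  User-agent: *
  Disallow: /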

Google lets you specify this directly:

  http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=48620

Best,

-- 
Noah Slater, http://tumbolia.org/nslater



