bug-lilypond

Re: #852 - stop google from indexing old docs


From: Graham Percival
Subject: Re: #852 - stop google from indexing old docs
Date: Wed, 16 Dec 2009 08:59:47 +0000
User-agent: Mutt/1.5.18 (2008-05-17)

On Wed, Dec 16, 2009 at 12:48:03AM -0800, Mark Polesky wrote:
> This should be trivial to fix.  I would do it but I can't
> figure out from the sources how `robots.txt' is generated.

Huh, apparently it's not even in the web branch.  Sigh.  I can see
it here:
http://lilypond.org/robots.txt
and in retrospect it's obvious how to disallow the other versions.

> 1) The only valid locations for blank lines are *above* a
>    "User-agent" line and below the last "Disallow" line in a
>    single "User-agent" record.  Remove all other blank
>    lines.

I didn't know that; interesting!

> 2) Individually disallow *all* directories that are
>    immediately below the /doc/ directory EXCEPT the one for
>    the current stable release.  Ideally this would be
>    automated by a script.

To be fixed in the new website; for now we'll just band-aid the
old robots.txt.
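Mark's idea of automating the per-directory Disallow lines could look roughly like this sketch. The doc root path and the name of the current stable directory are assumptions for illustration, not the actual lilypond.org layout:

```python
#!/usr/bin/env python
# Sketch: emit robots.txt Disallow lines for every directory
# directly under doc/ EXCEPT the current stable release.
# DOC_ROOT and CURRENT_STABLE are hypothetical values.
import os

CURRENT_STABLE = "v2.12"   # assumed current stable docs dir
DOC_ROOT = "htdocs/doc"    # assumed server-side doc root

def disallow_lines(subdirs, current=CURRENT_STABLE):
    """Disallow lines for all doc subdirs except the current one."""
    return ["Disallow: /doc/%s/" % d
            for d in sorted(subdirs) if d != current]

def make_robots(doc_root=DOC_ROOT):
    # List only directories immediately below doc/.
    subdirs = [d for d in os.listdir(doc_root)
               if os.path.isdir(os.path.join(doc_root, d))]
    return "User-agent: *\n" + "\n".join(disallow_lines(subdirs)) + "\n"

if __name__ == "__main__":
    print(make_robots())
```

Run from a cron job or the site build, it would keep the file current as doc directories come and go.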

> User-agent: *
> Disallow: /doc/v1.6/
> Disallow: /doc/v1.8/
> Disallow: /doc/v1.9/

Huh, I didn't realize we kept the old unstable directories around;
they're not listed on
  http://lilypond.org/documentation
anybody mind if I delete the unstable doc dirs?

> There is an alternative, which may be easier to maintain
> (and thus safer).  Maybe there are reasons that this would
> be a bad idea (I don't know), but we could move the current
> stable docs into a new subdirectory of /doc/ (like
> /doc/current/) and move everything else to another
> subdirectory (like /doc/other).  Then the robots.txt file
> would only need to be:

or /doc/archive ?  I'll think about it.
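Under that move-everything scheme the whole file could shrink to something like the following (using the /doc/archive name floated above, which is only a possibility, not a decision):

```
User-agent: *
Disallow: /doc/archive/
```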

Once 2.12.3 is out, and if there are no other emergencies, I'll
start working on lilypond.org, the ajax searching, etc etc.  Stick
around; we'll talk much more about this in a week or so.  :)

Cheers,
- Graham



