
Re: Website translations with Haunt

From: sirgazil
Subject: Re: Website translations with Haunt
Date: Sat, 16 Dec 2017 10:26:12 -0500
User-agent: Mozilla/5.0 (X11; Linux i686; rv:52.0) Gecko/20100101 Thunderbird/52.4.0

Hi :)

On 09/12/17 13:06, pelzflorian (Florian Pelz) wrote:

> First of all, I want to say thank you to the Guile, Haunt, ffi-helper
> and related projects’ developers.
>
> I built my personal website [1] using David Thompson’s Haunt [2] and
> recently talked my university’s Islamic Students’ Association
> (Islamische Hochschulvereinigung) into using Haunt for their
> not-yet-finished website as well, because I think the concept of Haunt
> and SHTML is superior to the alternatives.  However, in order to make
> the website multilingual (the user can choose to view it in German or
> English), I have so far used an association list with assoc-ref, which
> is not very convenient since every string has to be added in two
> places, i.e. in the SHTML code and in the association list where the
> code looks for translations.
>
> I want to ask for your thoughts on my new solution, since translations
> are probably important to many Haunt users.  In particular, I believe
> there was some discussion on website translation on the Guile or Guix
> lists as well.

I'm very interested in this subject because I help with the Guile and Guix websites, and I usually work with multilingual websites. I have no idea what the right way to do i18n of websites written in Scheme would be, though. So I will just join this conversation as a potential user of your solutions :)

> I did not want to use the ordinary gettext functions, in order to
> avoid calling setlocale very often to switch languages.  It seems the
> gettext system is not designed for rapidly changing locales, but maybe
> I am wrong about this and many setlocale calls would not be that bad.

For what it's worth, I use ordinary gettext and `setlocale` in my website, which is not Haunt-based but is also written in Guile Scheme and statically generated. So far, it works fine.

For internationalization, I know the convention is to use _, but I don't like that, so I use the alias l10n instead.
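To make that concrete, here is a minimal sketch of the setup I mean, using Guile's built-in gettext bindings. The "site" text domain, the locale/ directory, the language list, and the `render-site` procedure are all assumptions for the example, not the exact code of my site:

  ;; Tell gettext where the compiled .mo catalogs live
  ;; (locale/LL/LC_MESSAGES/site.mo) and which domain to use.
  (bindtextdomain "site" "locale")
  (bind-textdomain-codeset "site" "UTF-8")
  (textdomain "site")

  ;; My alias for gettext, instead of the conventional _.
  (define l10n gettext)

  ;; Since the site is statically generated, rapid locale switching is
  ;; not an issue: render the whole site once per language, calling
  ;; setlocale only once per pass.  render-site is hypothetical.
  (for-each (lambda (locale)
              (setlocale LC_ALL locale)
              (render-site locale))
            '("en_US.utf8" "de_DE.utf8"))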

For internationalizing complex blocks whose text should not be translated as separate fragments, like:

`(p "Hi! I play "
    (a (@ (href ,sport-url)) ,(l10n "futsal"))
    " in "
    (a (@ (href ,place-url)) ,(l10n "Tokyo")))

I had to write a procedure I call `interleave` that I use like this:

  ,@(interleave (l10n "Hi! I play ~SPORT~ in ~PLACE~.")
                `(a (@ (href ,sport-url)) ,(l10n "futsal"))
                `(a (@ (href ,place-url)) ,(l10n "Tokyo")))
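In case someone wants to try this, a minimal version of such a procedure could look like the following. This is a sketch using (ice-9 regex), not necessarily my exact implementation; it assumes the placeholders match ~[A-Z]+~ and that one SHTML fragment is passed per placeholder, in order:

  (use-modules (ice-9 regex))

  ;; Split STR on ~NAME~ placeholders and splice FRAGMENTS in between
  ;; the literal pieces, returning a list suitable for ,@ in SHTML.
  ;; E.g. (interleave "I play ~A~ in ~B~." 'x 'y)
  ;;      => ("I play " x " in " y ".")
  (define (interleave str . fragments)
    (let loop ((start 0)
               (matches (list-matches "~[A-Z]+~" str))
               (frags fragments)
               (acc '()))
      (if (null? matches)
          (reverse (cons (substring str start) acc))
          (loop (match:end (car matches))
                (cdr matches)
                (cdr frags)
                (cons* (car frags)
                       (substring str start (match:start (car matches)))
                       acc)))))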

So, in the translation catalogs, translators will see strings like:

"Hi! I play ~SPORT~ in ~PLACE~."

Currently, I use xgettext manually and Poedit for working with the translation catalogs, but in the future I'd like to manage translations like this (replace `site` with `haunt`):

# Create new translation catalogs for Finnish and Japanese.
$ site catalog-new fi ja

# Update translation catalogs with new translation strings.
$ site catalog-update

# Compile translation catalogs (generate .mo files)
$ site catalog-compile
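Under the hood, such subcommands would mostly wrap the standard GNU gettext tools. Roughly like this — the "site" domain name and the po/ and locale/ layout are assumptions for the example:

# Extract translatable strings (xgettext supports Scheme;
# -kl10n adds my alias as a keyword).
$ xgettext --language=Scheme --from-code=UTF-8 -kl10n \
      -o po/site.pot $(find . -name '*.scm')

# Create a new catalog for Finnish from the template.
$ msginit --locale=fi --input=po/site.pot --output=po/fi.po

# Update an existing catalog with new strings.
$ msgmerge --update po/fi.po po/site.pot

# Compile to the .mo file that gettext reads at run time.
$ msgfmt po/fi.po -o locale/fi/LC_MESSAGES/site.mo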

To fully localize the site, I also have to pass IETF language tags around in the website code, so that I get the right content when rendering the templates for a given language.

My 2¢
