From: Federico Bruni
Subject: Re: source file ... .scm newer than compiled ... .go file
Date: Sun, 16 Oct 2022 20:41:54 +0200
On Sun, 16 Oct 2022 at 12:18:25 +0200, Jean Abou Samra <jean@abou-samra.fr> wrote:
> On 14/10/2022 at 19:55, Jonas Hahnfeld via Discussions on LilyPond development wrote:
>> For the record, another application using Guile (GNU Cash) had the same problem with Flatpak three years ago. Their workaround was disabling recompilation. Bad idea or good idea?
>> https://github.com/flathub/org.gnucash.GnuCash/blob/master/patches/0001-Never-recompile.patch
>
> Not really great. On the other hand, you only need a very targeted installation and don't expect a fully functional Guile...
>
> Try really hard to avoid this. Some LilyPond libraries are written in .scm files, like a lot of the code in openLilyLib. It is important to use compilation (by running with GUILE_AUTO_COMPILE=1) in order to get helpful error messages when something goes wrong there. If this patch is used, then as soon as one run with GUILE_AUTO_COMPILE=1 has been done, changes to the .scm files will have no effect, which is going to be very confusing for the user...
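For context on why the patch is risky: the warning in the subject line comes from a timestamp comparison between a .scm source and its compiled .go file. The following is only an illustrative Python model of that kind of staleness check, not Guile's actual implementation; the file names and the `is_stale` helper are made up for the example. With recompilation patched out, the second case (source edited after compilation) would silently keep using the old bytecode.

```python
import os
import tempfile

def is_stale(source_path, compiled_path):
    """Rough model of a 'source newer than compiled file' check:
    the cached bytecode is stale if it is missing or older than
    the source it was compiled from."""
    if not os.path.exists(compiled_path):
        return True
    return os.path.getmtime(source_path) > os.path.getmtime(compiled_path)

# Demonstration with temporary stand-ins for a .scm/.go pair.
with tempfile.TemporaryDirectory() as d:
    scm = os.path.join(d, "lib.scm")
    go = os.path.join(d, "lib.go")
    for path in (scm, go):
        open(path, "w").close()
    # Compiled after the source was written: cache is fresh.
    os.utime(scm, (1000, 1000))
    os.utime(go, (2000, 2000))
    print(is_stale(scm, go))   # → False
    # Source edited later: cache is stale, but with recompilation
    # disabled the stale .go would be loaded anyway.
    os.utime(scm, (3000, 3000))
    print(is_stale(scm, go))   # → True
```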
I'm afraid I have no alternative. The LilyPond 2.23.x version in Flatpak doesn't currently work. I don't know when it started breaking, as I usually tested only the stable version.
I opened a PR which makes 2.23.x work again: https://github.com/flathub/org.frescobaldi.Frescobaldi/pull/14

openLilyLib is used by a small number of users, I think, probably not even using Flatpak. In case of problems they can still download and use the official LilyPond binaries.
>> Open issue which did not receive any feedback from flatpak developers:
>> https://github.com/flatpak/flatpak/issues/3064
>
> Yes, this would need proper addressing for use cases such as Guile bytecode compilation. CPython has a system roughly similar to Guile's; how does packaging Python libraries work in Flatpak?
I don't know. Do you know some Python libraries which may need a similar feature?
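For readers unfamiliar with the CPython mechanism Jean is comparing to: CPython caches compiled bytecode in a `__pycache__` directory next to the source, loosely analogous to Guile's .scm → .go cache. A minimal sketch using only the standard library (the module name `mod.py` is just an example):

```python
import importlib.util
import os
import py_compile
import tempfile

with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "mod.py")
    with open(src, "w") as f:
        f.write("ANSWER = 42\n")
    # py_compile writes the bytecode to the standard cache location,
    # __pycache__/mod.cpython-XY.pyc, and returns that path.
    pyc = py_compile.compile(src)
    print(pyc)
    # importlib can compute the same cache path from the source path.
    print(importlib.util.cache_from_source(src) == pyc)   # → True
```

The same staleness question arises here: the interpreter normally recompiles when the .py source is newer than its cached .pyc, so a sandbox that blocks writes to the cache directory hits a similar problem.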