lilypond-devel


Re: Midi advice


From: Carl Sorensen
Subject: Re: Midi advice
Date: Tue, 29 Mar 2016 18:59:08 +0000
User-agent: Microsoft-MacOutlook/14.6.1.160122

On 3/29/16 9:45 AM, "Redwood" <address@hidden> wrote:

>David and Carl,
>
>Being the two that have responded and encouraged me, can I take your
>non-response to this query as meaning neither of you can help me here?
>Any 'meta-advice' about how to approach this?
>
>If I can't get help with these basic questions, then my take is that this
>is probably beyond me for now, and my best hope is to slowly take in
>Guile, understanding what expert lilypond users know, and hope that
>eventually I'll be able to answer these questions myself. If so, the midi
>project itself will be on hold for a while...

Daniel,

David is currently on vacation (climbing in Italy), so a delay of a day or
so is certainly not beyond expectations.

I have never looked at a single routine in the midi code, so I can't offer
you any knowledge based on experience.

I can tell you what I understand based on reading descriptions of it, so
here goes.  (It's also explained in Chapter 10 of the contributor's guide)
http://www.lilypond.org/doc/v2.19/Documentation/contributor/overview-of-lilypond-architecture

LilyPond music is parsed by the parser into Scheme music.

Scheme music is transformed by iterators, which assign the music to
contexts.  Once the iterators are done, the music is passed in the form of
stream events to engravers (to create printed output) and to performers
(to create midi output).  Collectively, engravers and performers are
called translators.
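A concrete way to watch the first step is \displayMusic, which prints the
Scheme representation of a music expression (this is described in the
Extending manual).  A minimal example:

```
\displayMusic { c'4 }
% Compiling this prints roughly the following to standard
% output (the exact argument lists vary between versions):
% (make-music
%   'SequentialMusic
%   'elements
%   (list (make-music
%           'NoteEvent
%           'duration (ly:make-duration 2)
%           'pitch (ly:make-pitch 0 0))))
```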

Section 10.11 of the CG talks about engravers.  Since engravers are
translators (and most of the macros described in section 10.11 use the
name TRANSLATOR), I believe the same things apply to performers, although
I have never programmed one.

I have never plowed through articulate.ly.  But I know that articulate.ly
was created entirely outside of lilypond development, and initially was
not part of lilypond.  It was eventually added.  In general, there is
more flexibility at the translator level than in GUILE music
transformations, although one can write a translator in GUILE.  I believe
that you would be well served to
start by trying to improve the performers, rather than by trying to
improve articulate.ly.  The performers are more central to the lilypond
workflow.
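The Extending manual shows the general shape of a translator written in
GUILE.  Here is a minimal sketch of a Scheme engraver that just reports
each note event it hears (I have not tried this on the performer side,
but since performers are also translators, I believe the listener
mechanism is analogous):

```
\layout {
  \context {
    \Voice
    \consists
    #(make-engraver
      (listeners
       ((note-event engraver event)
        ;; called once for every NoteEvent broadcast in this Voice
        (display (ly:event-property event 'pitch))
        (newline))))
  }
}
```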

>
>-d
>> Hi David and Carl and others,
>> 
>> It appears that the internal midi code is taking some set of objects,
>>and mapping them to midi events. No great insight here.
>> 
>> I have a source code question, and a related design question:
>> 
>> Source Code Question:
>> 
>> Where are the musical objects that are being formatted to midi: are
>>they Guile objects or C++ objects?

I have not delved into the code, but I believe they are both.  They are
C++ objects that have GUILE internals (at least that's what happens in the
engravers, and I have no reason to believe performers are any different).

>> What's the best way to see and understand these objects. From what I
>>can see, most objects in lilypond are format related (clefs, stems…),
>>not purely musical objects (this pitch for this duration at this time).
>>I tried the graphviz, and all those objects appear to be format related
>>(though I see one can tune this output).

The graphviz code is specifically aimed at displaying grob relationships.
Midi objects are not grobs, so graphviz won't be of much help.  I believe
that what you will need to capture in your MIDI work is EventChords
(simultaneous notes) and/or NoteEvents.  I seem to recall that at some
time in the past the code was changed so that all notes are wrapped in
EventChord, but I may be mistaken in this understanding; I haven't been
heavily involved in detailed coding for about 3-4 years.
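You can check the wrapping question directly with \displayMusic; when I
last looked, an explicit chord came through as an EventChord containing
NoteEvents:

```
\displayMusic { <c' e'>4 }
% prints, roughly:
% (make-music
%   'SequentialMusic
%   'elements
%   (list (make-music
%           'EventChord
%           'elements
%           (list (make-music 'NoteEvent ...)
%                 (make-music 'NoteEvent ...)))))
```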



>> 
>> Design Question:
>> 
>> It appears that the problem with the non-articulate midi output is that
>>it takes these events, and maps them one-to-one. What articulate does is
>>to generate a different set of objects from the originals. These new
>>objects are like a 'performance' of the original objects. I suspect that
>>was Jan's original intention (hence the names performer in the midi
>>source files), but never actually did the step of making a performance
>>from the original events.

I'm not sure what you mean by a 'performance'.  I think that there
currently is a rudimentary performance of the original events.  Certainly
when I look at the performers listed in the Internals Reference, I see
performers for notes, drum notes, dynamics, beams, control track, key,
lyric, midi control function, piano pedal, slur, staff, tempo, tie, and
time signature.  I don't see anything for articulations, although I could
imagine that the articulations could (and maybe should) be part of the
note performer.

Without having studied it carefully, I would not assume that there is no
performance.  I would just assume that the performance doesn't make good
use of all the capabilities MIDI has to offer, and that we should make
the performance much richer.
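Note that the performers are wired into the \midi block the same way
engravers are wired into \layout, so it is easy to experiment by adding
or removing them per context.  For example, to hear the output without
the dynamic performance:

```
\score {
  \new Staff { c'4\f d' e' f' }
  \midi {
    \context {
      \Voice
      % drop dynamics from the MIDI output
      \remove "Dynamic_performer"
    }
  }
}
```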

>> 
>> My assertion: articulate.ly is awkward and less than successful because
>>it doesn't have access to all the information that the C++ code has. But
>>perhaps I'm wrong: perhaps 100% of what the C++ code knows is available
>>to Guile code?

I don't know how to answer this authoritatively, but here is my
speculation.  It is possible to get everything the C++ code has access to
if one works hard enough, but it's much easier to work in the
engraver/performer level than trying to figure out how to do everything
with callbacks, which is what one often does when modifying LilyPond's
behavior in GUILE.
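As an example of the callback style I mean, here is a sketch at the grob
level: a property is overridden with a Scheme procedure, and LilyPond
calls it back during layout with the grob as its argument:

```
{
  \override NoteHead.color =
  #(lambda (grob)
     ;; called back once per note head during layout;
     ;; color notes below middle c red
     (let ((pitch (ly:event-property (event-cause grob) 'pitch)))
       (if (< (ly:pitch-steps pitch) 0) red black)))
  c'4 g4 b4 d'4
}
```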

HTH,

Carl



