fluid-dev

Re: [fluid-dev] Thread safety long-term thoughts


From: josh
Subject: Re: [fluid-dev] Thread safety long-term thoughts
Date: Thu, 26 Nov 2009 15:58:40 -0800
User-agent: Internet Messaging Program (IMP) H3 (4.1.6)

Quoting David Henningsson <address@hidden>:
address@hidden wrote:
I don't think there is an issue with this. Since the new value isn't
actually passed through the queue, the queued event is essentially just an update request. The latest value will always be the value assigned to the variable and the update event will ensure that the synthesis thread uses the latest value. Events are processed at whatever interval the fluid_synth_one_block() function is called in relation to the MIDI events. If more than one pitch bend event occurs within a given interval, the latest value will get used.
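For illustration, here is a minimal sketch of the update-request pattern being described; every name in it is hypothetical, not the actual FluidSynth internals:

    /* hypothetical queue primitives, declared only to make the sketch complete */
    enum { EVENT_UPDATE_PITCH_BEND };
    void queue_event (int type);
    int  dequeue_event (int *type);
    void apply_pitch_bend (int value);

    /* latest value, written directly by the MIDI thread; last writer wins */
    static volatile int channel_pitch_bend;

    /* MIDI thread: store the new value, then queue an update request.  The
     * event carries no value of its own; it only tells the synthesis thread
     * to re-read channel_pitch_bend. */
    void midi_pitch_bend (int value)
    {
      channel_pitch_bend = value;
      queue_event (EVENT_UPDATE_PITCH_BEND);
    }

    /* Synthesis thread, at the start of each rendered block: drain the queue
     * and apply whatever the variable holds right now. */
    void process_events (void)
    {
      int ev;
      while (dequeue_event (&ev))
        if (ev == EVENT_UPDATE_PITCH_BEND)
          apply_pitch_bend (channel_pitch_bend);
    }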

I can think of issues, but perhaps they can only happen in theory.

Imagine that we have a note sounding and a channel volume of 1, so the note is barely audible. Then we have a volume change to 2, then a note-off (and an instant release time), then a volume change to 127. Given a certain timing, it could happen that the volume change to 2 is never read and 127 is read instead, so the note will sound at volume 127 instead of 2 for a very short period of time.



You are right. My immediate thought was that note-on events get processed in sequential order, along with the other events from a given MIDI thread. But the actual voice processing is done after all MIDI events have been processed, so the note-off would occur with a volume of 127 already assigned (if all of the events occur within the same fluid_synth_one_block cycle). I'm not sure if this is really an issue, though. What if the note has a really long release, for example? You would expect channel volume changes to affect that voice.

The assumption I have made is that the rate at which fluid_synth_one_block() is called, relative to the injected MIDI events, defines the granularity of event processing. This rate is a function of the audio driver buffer size. For a given MIDI source (thread), the last controller event of a given type ends up being the value used for that rendered buffer.
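As a rough back-of-envelope illustration of that granularity (the buffer sizes and sample rate below are just examples, not FluidSynth defaults):

    /* Event granularity is roughly one rendered block:
     *   granularity = buffer_size / sample_rate */
    double block_ms (int buffer_frames, double sample_rate)
    {
      return 1000.0 * buffer_frames / sample_rate;
    }
    /* block_ms (64,   44100.0) is about  1.45 ms
     * block_ms (1024, 44100.0) is about 23.2  ms
     * All events from one MIDI source that arrive within such a window
     * collapse to the last value per controller. */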

Ordering of events from different threads is not guaranteed, though, since each thread has its own queue, and in practice all the events from one thread's queue will override those from another. This still falls under the event granularity rule, but it seems less than ideal, since an earlier control change could end up being the one used, and certain MIDI threads will effectively be given priority in a rather non-deterministic fashion. Fixing this would require something like event time stamps or unique auto-increment IDs, plus re-ordering in the synthesis thread.
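A minimal sketch of what the auto-increment ID idea might look like, assuming GLib's atomic integer helpers are available; the struct and function names are hypothetical, not existing FluidSynth code:

    #include <glib.h>

    static volatile gint event_serial = 0;   /* shared by all MIDI threads */

    typedef struct
    {
      guint serial;   /* global submission-order tag */
      int   type;     /* hypothetical event type */
      int   value;
    } ordered_event_t;

    /* Producer side: tag the event before pushing it onto the per-thread queue. */
    static void tag_event (ordered_event_t *ev)
    {
      ev->serial = (guint) g_atomic_int_exchange_and_add (&event_serial, 1);
    }

    /* Consumer side (synthesis thread): peek the head of every per-thread
     * queue and always pop the event with the smallest serial, so that
     * control changes from different MIDI threads are applied in submission
     * order. */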

Making things thread safe has really been a slippery slope!


Yeah, now that I think about it, it would. In the case of fast rendering, program changes need to occur synchronously as part of the synthesis process, but in the case of realtime playback they need to happen outside the synthesis thread. This seems to fall under the single- versus multi-thread enable/disable.

We have so many use cases that it is easy to fix one and break another. I started to write something at http://fluidsynth.resonance.org/trac/wiki/UseCases earlier today, but I'm not sure if it will be helpful; in any case it is not very complete yet.



It has definitely been difficult fixing the threading issues using an API which wasn't really designed for it from the beginning. It may be that we won't be able to completely fix all corner cases unless the API is adjusted accordingly. As far as I know, all crash cases related to thread safety should now be fixed, but there may be some issues related to event ordering which I still do not see.



I don't understand the difference between the two or what distinction should be made. Can you clarify this a little, and what it might look like in terms of API? It seems to me that there are really only two cases we care about: single-threaded (audio synthesis and MIDI events occur synchronously, from the same thread) and multi-threaded, where MIDI events may occur in the audio thread or in other threads.

In the wiki page I quoted above there are three properties and some short explanations; does that clarify things? I'm not sure how (and which of them) we should make configurable via the API, though.



That information does seem helpful. Thanks for creating that. At the moment I can't think of any functional change that would be affected by whether there is more than one MIDI thread or not, so I've left it out of this discussion.

The "Separate audio thread" case is the multi-thread enable/disable. If all is single threaded, then:
- Mutex locking is not required.
- Queuing events to synthesis context is not required. Events can be executed synchronously.
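A minimal sketch of that dispatch decision, assuming hypothetical helper names (only fluid_synth_t and fluid_synth_one_block() are real):

    #include <fluidsynth.h>

    /* hypothetical type and helpers, declared only to make the sketch complete */
    typedef struct _my_event_t my_event_t;
    int  synth_is_single_threaded (fluid_synth_t *synth);
    void execute_event (fluid_synth_t *synth, my_event_t *ev);
    void enqueue_event (fluid_synth_t *synth, my_event_t *ev);

    void synth_send_event (fluid_synth_t *synth, my_event_t *ev)
    {
      if (synth_is_single_threaded (synth))
      {
        /* Caller and synthesis share one thread: no mutex, no queue,
         * just execute the event synchronously. */
        execute_event (synth, ev);
      }
      else
      {
        /* Separate audio thread: hand the event to the synthesis context
         * through a queue and let fluid_synth_one_block() drain it later. */
        enqueue_event (synth, ev);
      }
    }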

The "Needs audio real-time response" affects whether real-time unsafe operations (garbage collection, mutex locking, etc) can be performed synchronously or not. If not real-time sensitive then:
- Return queue events not required and can be executed synchronously.

If it is real-time sensitive:
- Use of the return queue is needed; it can be processed either by an automatically created thread or by manually calling a return event queue processing function (sketched below).
- All MIDI events need to be real-time safe. Using the return queue for MIDI events is probably a bad idea, due to the difficulty of controlling re-ordering issues, and it completely fails in the case of fast rendering.
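A hedged sketch of what the two return-queue options could look like from an application's point of view; the processing function is hypothetical and does not exist as API today:

    #include <fluidsynth.h>

    /* hypothetical API: drain the real-time-unsafe work (garbage collection,
     * mutex-protected cleanup, etc.) that the synthesis thread has pushed
     * onto the return queue */
    void fluid_synth_process_return_queue (fluid_synth_t *synth);

    /* Option 1: the library creates a worker thread that calls the function
     * above periodically (e.g. controlled by the "synth.create-return-thread"
     * setting proposed below).
     *
     * Option 2: the application calls it itself from a non-realtime thread of
     * its own choosing: */
    void my_app_idle_handler (fluid_synth_t *synth)
    {
      fluid_synth_process_return_queue (synth);
    }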


I was thinking we could provide additional settings parameters for controlling this behavior, rather than API functions (especially since this functionality needs to be set prior to the fluid_synth_t being created).

We could add a boolean like "synth.multi-threaded", or perhaps a "synth.threading" string enum in anticipation of additional threading modes (beyond single/multi). The audio.realtime-prio setting could be used to determine whether realtime priority is desired. Another setting could be added for auto-creating a thread to process the return queue, say a boolean "synth.create-return-thread" which defaults to on.
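For concreteness, this is roughly how an application might configure that behavior, assuming the proposed names were adopted; "synth.threading" and "synth.create-return-thread" are proposals from this thread and do not exist yet, while audio.realtime-prio does:

    #include <fluidsynth.h>

    int main (void)
    {
      fluid_settings_t *settings = new_fluid_settings ();

      /* proposed settings (hypothetical, not yet implemented): */
      fluid_settings_setstr (settings, "synth.threading", "multi");
      fluid_settings_setint (settings, "synth.create-return-thread", 1);

      /* existing setting: request realtime priority for the audio thread */
      fluid_settings_setint (settings, "audio.realtime-prio", 60);

      /* the threading behavior has to be fixed before the synth is created */
      fluid_synth_t *synth = new_fluid_synth (settings);

      /* ... use the synth ... */

      delete_fluid_synth (synth);
      delete_fluid_settings (settings);
      return 0;
    }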

At any rate, I don't currently see how we can escape the requirement that all MIDI-generated events be real-time safe. I think currently only program changes and SYSEX tuning events don't comply, which might be solvable with the current API.


// David


Josh




