[gmpi] Re: Reqs 3.9. Time - opening arguments.1

  • From: "Michael Stauffer" <michael@xxxxxxxxxxxxxxxxxx>
  • To: <gmpi@xxxxxxxxxxxxx>
  • Date: Thu, 12 Feb 2004 14:54:30 -0500

>On Wed, Feb 11, 2004 at 12:16:44PM -0500, Ron Kuper wrote:
>> >>>
>> Personally, I would prefer to see tempo specified as a list of
>> timestamped "tempo events" for that buffer (the way MIDI events are
>> dealt with in VST). The plugin can then perform sample-accurate event
>> handling for that buffer.
>> <<<
>>
>> I don't think tempos can be events.  Here's why I believe this, please
>> explain why I'm off base about this.
>
>It's a real issue in concept, but I think it can be constrained..
>
>> Let's say for argument's sake our audio frame size is equal to 1 beat
>> = 960 ticks (assuming MIDI parlance for a second) at whatever the
>> current tempo is.  Every set of events that are passed in live within
>> the next beat, and can be safely rendered in the next beat-sized frame.
>>
>> Now I allow a tempo event to come in.  This event slows the tempo
>> down, so that one audio frame is now equal in duration to 1/2 a beat
>> or 480 ticks.
>>
>> If the host is feeding these events to the plugin, the host now needs
>> to be careful about what events get sent after the tempo change, to
>> ensure that we don't send more events than fit in a single frame.
>
>Right, the host needs to be aware of tempo changes, as does anything
>which uses musical timing internally.  If you think of the host's
>sequencer/automation engine as plugins (whether they are or not..) and
>tempo-masters as plugins, then it's a matter of dependencies.
>
>The sequencer plugins depend on the tempo plugin.  In a graph, that
>means the tempo plugin must be processed BEFORE anything that cares
>about tempo.
>
>> If a plugin is sending these events to other plugins, how do the other
>> events in the other plugins "know" that they are now out of range for
>> the audio frame?
>
>So if your plugin cares about tempo, you will be notified of a tempo
>change BEFORE you send any events.  When you see a tempo change, you
>know to not send events that would be out of range.
>
>If your plugin does not care about tempo, but sends events, then it
>must be presumed to want sample-based timing and not musical-based
>timing.
>
>Does that answer that?

Yes, I think it does. Don't hosts have to handle all this already when
they sync to an external master? If a host is sync'ed to MIDI clocks,
it's monitoring the tempo while it's preparing buffers and events for
plugs, and I imagine in the one-tempo-per-audio-frame model, it would
have to use whatever its current notion of tempo is to prepare the buffer
and associated events before it sends the buffer out.
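
To make that concrete, here is a minimal sketch (plain C, all names
hypothetical, nothing from an actual GMPI draft) of the
one-tempo-per-audio-frame idea: the host freezes whatever tempo it
currently believes in and stamps the whole frame with it:

/* Minimal sketch of the one-tempo-per-audio-frame model: the host
 * snapshots whatever it currently believes the tempo to be (e.g. from
 * incoming MIDI clocks) and stamps the whole frame with it. */
#include <stdio.h>

typedef struct {
    double   tempo_bpm;    /* single tempo valid for the whole frame */
    unsigned num_samples;  /* frame length in samples                */
} FrameInfo;

/* Host side: called once per frame, just before dispatching to plugs. */
static FrameInfo prepare_frame(double current_tempo_bpm, unsigned frame_len)
{
    FrameInfo f;
    f.tempo_bpm   = current_tempo_bpm;  /* frozen for this frame */
    f.num_samples = frame_len;
    return f;
}

int main(void)
{
    /* Tempo measured from external MIDI clocks just before this frame. */
    FrameInfo f = prepare_frame(120.0, 512);
    printf("frame: %u samples at %.1f BPM\n", f.num_samples, f.tempo_bpm);
    return 0;
}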

The multiple-tempo-event-list-per-audio-frame model would only make
sense if the tempo controller were working from a static tempo map, so
it could know ahead of time what the changes were going to be within an
audio frame. That is, unless the 1-frame latency that Koen mentioned is
used. Does that sound right?
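
For contrast, here is a sketch of the multiple-tempo-event-list model
(again with made-up types, not a real API): the host walks a tempo map
and emits timestamped tempo events inside the frame, and tick
arithmetic then has to be done segment by segment, which is exactly
what makes Ron's scenario above tricky:

/* Sketch of the multiple-tempo-events-per-frame model. */
#include <stdio.h>

typedef struct {
    unsigned sample_offset; /* position within the frame     */
    double   tempo_bpm;     /* tempo from this offset onward */
} TempoEvent;

/* Convert one frame's worth of samples to MIDI-style ticks (960 per
 * beat), honoring each tempo segment separately. */
static double samples_to_ticks(const TempoEvent *ev, int n_ev,
                               unsigned frame_len, double rate)
{
    double ticks = 0.0;
    for (int i = 0; i < n_ev; i++) {
        unsigned end = (i + 1 < n_ev) ? ev[i + 1].sample_offset : frame_len;
        double seconds = (end - ev[i].sample_offset) / rate;
        ticks += seconds * (ev[i].tempo_bpm / 60.0) * 960.0;
    }
    return ticks;
}

int main(void)
{
    /* Ron's scenario in miniature: the frame starts at one tempo, then
     * a tempo event halves it mid-frame, so fewer ticks fit in the
     * remainder of the frame. */
    TempoEvent ev[] = { { 0, 120.0 }, { 256, 60.0 } };
    printf("ticks in frame: %.1f\n", samples_to_ticks(ev, 2, 512, 44100.0));
    return 0;
}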

This makes me wonder about the audio-frame method for sending events to
plugs. Why are events not sent via a separate method with finer time
resolution? I figure it's much easier to just package all audio data and
events into one bundle and have a single callback/thread to handle the
data.
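
Something like the following is what I mean by one bundle (loosely in
the style of VST event lists; all structures here are hypothetical):
audio and timestamped events for a frame travel together through a
single callback:

/* One callback, one thread: the plug sees the whole frame at once. */
#include <stdio.h>

typedef struct {
    unsigned sample_offset;  /* when, within this frame */
    int      type, data1, data2;
} Event;

typedef struct {
    float  **inputs, **outputs;  /* one pointer per audio channel     */
    unsigned num_samples;
    Event   *events;             /* all events for this frame, sorted */
    int      num_events;
} ProcessBlock;

static void plugin_process(const ProcessBlock *b)
{
    for (int i = 0; i < b->num_events; i++)
        printf("event type %d at sample %u\n",
               b->events[i].type, b->events[i].sample_offset);
    /* ... render b->num_samples samples of audio here ... */
}

int main(void)
{
    Event ev[] = { { 0, 1, 60, 100 }, { 300, 1, 62, 100 } };
    ProcessBlock b = { 0, 0, 512, ev, 2 };
    plugin_process(&b);
    return 0;
}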

Are there other reasons? Because otherwise, plugs could receive all MIDI
and other control events at finer resolution in a separate callback
thread, and each plug could decide how carefully it wanted to process
them: whether it wanted to try to honor the events within the current
frame being processed, or ignore them until the next frame (at which
point timestamps would allow taking history into account). Would this
be worth the extra complication?
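
As a sketch of what that per-plug decision might look like
(hypothetical interface, not anything that has been proposed so far):
events arrive with absolute timestamps, and the plug checks them
against the frame it is currently rendering:

#include <stdbool.h>
#include <stdio.h>

typedef struct { double timestamp; int type, data1, data2; } TimedEvent;

typedef struct {
    double frame_start;  /* absolute time of the current frame's start */
    double frame_end;    /* ... and of its end                         */
} RenderWindow;

/* The plug's policy decision: honor the event in the frame being
 * rendered, or defer it to the next frame. */
static bool falls_in_current_frame(const RenderWindow *w, const TimedEvent *e)
{
    return e->timestamp >= w->frame_start && e->timestamp < w->frame_end;
}

int main(void)
{
    RenderWindow w = { 0.0, 512.0 / 44100.0 };  /* one 512-sample frame */
    TimedEvent   e = { 0.0105, 1, 60, 100 };    /* arrives mid-frame    */
    printf(falls_in_current_frame(&w, &e) ? "apply now\n" : "defer\n");
    return 0;
}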

The plug-in architecture could provide a default event queue object
that a simple plug could use at the beginning of each new frame cycle
to get a list of all events that happened while it was busy processing
the previous frame. This would be much like getting a list directly
associated with the incoming audio buffer. A more complicated plug
could bypass this and run its own callback thread for handling the
events as they come in.
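
A rough sketch of such a default queue, using a plain mutex for brevity
(a real implementation would probably want a lock-free ring buffer; all
names are made up): the host or an event thread pushes events as they
arrive, and a simple plug drains everything at the start of each frame:

/* Compile with -lpthread. */
#include <pthread.h>
#include <string.h>

#define QUEUE_CAP 256

typedef struct { double timestamp; int type, data1, data2; } TimedEvent;

typedef struct {
    TimedEvent      items[QUEUE_CAP];
    int             count;
    pthread_mutex_t lock;
} EventQueue;

/* Producer side: called from the event delivery thread. */
static void queue_push(EventQueue *q, const TimedEvent *e)
{
    pthread_mutex_lock(&q->lock);
    if (q->count < QUEUE_CAP)
        q->items[q->count++] = *e;
    pthread_mutex_unlock(&q->lock);
}

/* Consumer side: a simple plug calls this at the start of a frame to
 * get every event that arrived while it rendered the previous one. */
static int queue_drain(EventQueue *q, TimedEvent *out, int max)
{
    pthread_mutex_lock(&q->lock);
    int n = q->count < max ? q->count : max;
    memcpy(out, q->items, n * sizeof *out);
    q->count = 0;
    pthread_mutex_unlock(&q->lock);
    return n;
}

int main(void)
{
    static EventQueue q = { .lock = PTHREAD_MUTEX_INITIALIZER };
    TimedEvent e = { 0.0105, 1, 60, 100 }, buf[QUEUE_CAP];
    queue_push(&q, &e);                       /* event thread      */
    int n = queue_drain(&q, buf, QUEUE_CAP);  /* frame-cycle start */
    return n == 1 ? 0 : 1;
}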

Does anyone know what kinds of frame sizes users tend to be able to
operate with? E.g., how common is 512 samples at 44.1 kHz? That yields
~11.6 msec of frame latency, which isn't too horrible for tempo
processing, especially if it's only a small percentage of users, who
might not be as concerned about top performance anyway. I don't know.
I'm wondering how much of an issue frame latencies are going to be as
systems can handle smaller and smaller frames at higher resolutions. Or
have I got things mixed up somehow?
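
For what it's worth, the arithmetic is just frame_size / sample_rate; a
trivial check of a few common sizes:

#include <stdio.h>

int main(void)
{
    const double   rate    = 44100.0;
    const unsigned sizes[] = { 64, 128, 256, 512, 1024 };
    for (unsigned i = 0; i < sizeof sizes / sizeof sizes[0]; i++)
        printf("%4u samples -> %5.1f ms\n",
               sizes[i], 1000.0 * sizes[i] / rate);
    return 0;  /* 512 samples -> ~11.6 ms */
}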

Cheers,
Michael

