[gmpi] Re: Reqs 3.9. Time - opening arguments.1

  • From: "Jeff McClintock" <jeffmcc@xxxxxxxxxx>
  • To: <gmpi@xxxxxxxxxxxxx>
  • Date: Fri, 13 Feb 2004 11:11:27 +1300

Hi Michael,

easy one first...
> Does anyone know what kinds of frame sizes users tend to be able to
> operate with? e.g., how common is 512 samples at 44.1kHz? This yields
> ~10 msec frame latency, which isn't too horrible for tempo processing

A nice soundcard (like an M-Audio Delta) can happily run at less than 10ms
frame latency; some people report latency as low as 2ms.  From a musician's
perspective, that's pretty good.
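
For reference, the arithmetic (round numbers, my own):

  512 samples / 44100 samples per second ≈ 11.6 ms per frame
  128 samples / 44100 samples per second ≈  2.9 ms per frame

so a 2ms figure implies a buffer of only about 88 samples at 44.1kHz.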

> This makes me wonder about the audio-frame method for sending events to
> plugs. Why are events not sent via a separate method with smaller time
> resolution?
> ... Cuz otherwise, plugs could receive all midi and
> other control events at finer resolution in a separate callback thread,

If I understand you, you're suggesting MIDI gets sent to a plugin "as soon
as possible" to avoid the latency of pre-queuing events.

This has been suggested several times.  I naively used this approach once.
Sounded terrible.  I'll try to explain...

Say you are running 10 plugins, 10ms frame size.

Each plugin in turn processes 10ms worth of events and audio.  Each plugin
has only 1ms to do so (on average).  Each plugin is 'asleep' for 9 ms out
of 10.

If you attempt to send MIDI in 'real time', the events get delivered
'clumped' into 1ms lumps, 10ms apart.  It's like quantizing all events to
the frame size (which is never a 'musical' interval).  Any MIDI controller,
like pitch bend, becomes audibly 'stepped', and all note timing becomes
badly 'jittered'.
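
To put numbers on it, here's a throwaway C++ sketch (purely illustrative,
no GMPI API involved) that snaps each event's arrival time to the start of
the next 10ms frame - the best an 'as soon as possible' scheme can do:

  #include <cstdio>

  int main()
  {
      const double frameMs = 10.0;  // 10ms frame size
      // event arrival times in ms, relative to the start of frame 0
      const double arrivals[] = { 1.5, 4.2, 8.7, 13.0, 16.9 };

      for (double t : arrivals)
      {
          // earliest the plugin can act on it: start of the next frame
          double heard = (int(t / frameMs) + 1) * frameMs;
          std::printf("arrived %5.1f ms, heard %5.1f ms, jitter %4.1f ms\n",
                      t, heard, heard - t);
      }
      return 0;
  }

Every event lands exactly on a frame boundary, with anywhere from zero to
a full frame of jitter.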

The correct solution is to delay all real-time events by the frame latency
before injecting them into the graph.  Your soundcard imposes the same
latency on incoming audio, so you're not 'losing out'; in fact, this ensures
that recorded MIDI and audio stay perfectly in sync.
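
Concretely, the host side can look something like this (a minimal,
single-threaded sketch; the names and the fixed one-frame delay are my own
convention, not anything GMPI has agreed on).  MIDI is stamped on arrival
with the sample clock plus one frame, then handed to each frame as
sample-accurate offsets:

  #include <cstdint>
  #include <queue>
  #include <vector>

  // One incoming MIDI message, stamped in absolute samples.
  struct Event
  {
      int64_t sampleTime;
      uint8_t midi[3];
  };

  // The same message, expressed as an offset inside one audio frame.
  struct FrameEvent
  {
      int offset;        // samples from the start of the frame
      uint8_t midi[3];
  };

  class EventScheduler
  {
  public:
      explicit EventScheduler(int frameSize) : frameSize_(frameSize) {}

      // MIDI input side: stamp with 'now' plus one frame of delay.
      // (A real host would cross threads via a lock-free FIFO here.)
      void onMidiIn(int64_t nowSamples, const uint8_t (&msg)[3])
      {
          pending_.push(Event{ nowSamples + frameSize_,
                               { msg[0], msg[1], msg[2] } });
      }

      // Audio side, once per frame: pull out everything that falls in
      // [frameStart, frameStart + frameSize) with its exact offset.
      std::vector<FrameEvent> collect(int64_t frameStart)
      {
          std::vector<FrameEvent> out;
          while (!pending_.empty() &&
                 pending_.front().sampleTime < frameStart + frameSize_)
          {
              const Event& e = pending_.front();
              out.push_back(FrameEvent{
                  static_cast<int>(e.sampleTime - frameStart),
                  { e.midi[0], e.midi[1], e.midi[2] } });
              pending_.pop();
          }
          return out;
      }

  private:
      int frameSize_;
      std::queue<Event> pending_;
  };

The '+ frameSize_' when stamping is the whole trick: a live event is
always scheduled a full frame into the future, so it keeps its exact
sample offset instead of clumping at the frame boundary, and it stays
aligned with audio that came in through the soundcard's own input buffer.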

Apologies if I misunderstood your proposal.

Best Regards,
Jeff

----- Original Message ----- 
From: "Michael Stauffer" <michael@xxxxxxxxxxxxxxxxxx>
To: <gmpi@xxxxxxxxxxxxx>
Sent: Friday, February 13, 2004 8:54 AM
Subject: [gmpi] Re: Reqs 3.9. Time - opening arguments.1


> >If your plugin does not care about tempo, but sends events,
> >then it must be
> >presumed to want sample-based timing and not musical-based timing.
> >
> >Does that answer that?
>
> Yes, I think it does. Don't hosts have to handle all this already when
> they sync to an external master? If a host is sync'ed to midi clocks,
> it's monitoring the tempo while it's preparing buffers and events for
> plugs, and I imagine in the one-tempo-per-audio-frame model, it would
> have to use whatever its current notion of tempo is to prepare the buffer
> and associated events before it sends the buffer out.
>
> The multiple-tempo-event-list-per-audio-frame model would only make sense
> if the tempo controller were working from a static tempo map so it could
> know ahead of time what the changes were going to be within an audio
> frame. That is, unless the 1-frame latency is used that Koen mentioned.
> Does that sound right?
>
> This makes me wonder about the audio-frame method for sending events to
> plugs. Why are events not sent via a separate method with smaller time
> resolution? I figure it's much easier to just package all audio data and
> events into one bundle and have a single callback/thread to handle the
> data.
>
> Are there other reasons? Cuz otherwise, plugs could receive all midi and
> other control events at finer resolution in a separate callback thread,
> and the plug could decide how carefully they wanted to process them: if
> it wanted to try and honor the events within the current frame that's
> being processed, or if it wanted to ignore them until the next frame (at
> which point timestamps would allow taking history into account). Would
> this be worth the extra complication?
>
> The plug-in architecture could provide a default event queue object that
> a simple plug could use at the beginning of a new frame cycle to get a
> list of all events that happened while it was busy processing the
> previous frame. This would be much like getting a list directly
> associated with the incoming audio buffer. A more complicated plug could
> bypass this and make its own callback thread for handling the events as
> they come in.
>
> Does anyone know what kinds of frame sizes users tend to be able to
> operate with? e.g., how common is 512 samples at 44.1kHz? This yields
> ~10 msec frame latency, which isn't too horrible for tempo processing,
> especially
> if it's a small percentage of users who might not be as concerned about
> top performance anyway. I don't know. I'm wondering how much of an issue
> frame latencies are going to be as systems can handle smaller and smaller
> frames at higher resolutions. Or have I got things mixed up somehow?
>
> Cheers,
> Michael



