[gmpi] Re: 3.9 Time Formats

  • From: "Michael Stauffer" <michael@xxxxxxxxxxxxxxxxxx>
  • To: <gmpi@xxxxxxxxxxxxx>
  • Date: Tue, 17 Feb 2004 15:01:55 -0500

Ron wrote:

>If a host were designed such that tempo changes were first class events,
>rendered on the sequencer timeline like any other event, then it's
>trivial to allow a plugin to change the map on the fly.
>For our apps, tempo changes are pretty deep.  Anything that changes how
>musical time converts between sample time is a non trivial change.
>For example, we have a highly optimized disk I/O scheduling algorithm.
>It determines *precisely* how much memory is needed for reading all
>regions from all files in the project, without any waste.  This is key
>to maximizing our disk bandwidth and track count.  This algorithm needs
>to do calculation on sample start times and extents for the audio data
>in the project.  When you change the tempo, the memory requirements for
>the I/O schedule may change.
>Sure, we can deal with it if we need to... it's just code<g>.  Sure,
>it's host specific.  But it's just one issue in our app.  There are
>others, and I suspect other hosts have similar types of issues to deal

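[Editorial aside: the musical-time-to-sample-time mapping Ron describes, the thing his I/O scheduler has to recompute when the tempo map changes, can be sketched roughly as below. The struct layout and function names are purely illustrative, not GMPI or anyone's actual host code.]

```c
#include <stddef.h>

/* Hypothetical tempo-map entry: from `beat` onward the tempo is `bpm`.
 * Entries are assumed sorted by beat, with the first entry at beat 0. */
typedef struct {
    double beat;  /* musical position where this tempo takes effect */
    double bpm;   /* beats per minute from that point on */
} TempoEntry;

/* Convert a musical position (in beats) to a sample offset by walking
 * the tempo map and accumulating the duration of each constant-tempo
 * segment.  Everything in the host that caches sample positions or
 * extents (clip bounds, disk I/O schedules, ...) implicitly depends on
 * this mapping, which is why a tempo edit ripples so widely. */
static double beat_to_sample(const TempoEntry *map, size_t n,
                             double beat, double sample_rate)
{
    double samples = 0.0;
    for (size_t i = 0; i < n; i++) {
        /* segment ends at the next tempo change, or at `beat` itself */
        double seg_end = (i + 1 < n) ? map[i + 1].beat : beat;
        if (seg_end > beat)
            seg_end = beat;
        double seg_beats = seg_end - map[i].beat;
        if (seg_beats <= 0.0)
            break;
        /* seconds per beat = 60 / bpm; samples = seconds * rate */
        samples += seg_beats * (60.0 / map[i].bpm) * sample_rate;
    }
    return samples;
}
```

Inserting or deleting one TempoEntry changes the sample position of every event after it, which is the crux of Ron's concern.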
Ron, I hope I'm not being dense, but I don't understand why it would have
to be *so* complicated. Just to be clear, I'm talking about adjusting the
host tempo map in an *offline* manner, ie NOT while the transport is
running.

Currently in Sonar 3, I can load an audio clip, then insert tempo changes
at any point in the project via the GUI (and I think via CAL access to
menu commands?). If the clip is enabled for Groove-clip Looping, its
position within the M:B:T Time Ruler stays constant and the clip is
appropriately timestretched. If the clip is NOT enabled for groove-clip
looping, then the clip's duration within the Time Ruler changes
appropriately. If I delete a tempo event from the GUI, everything adjusts
accordingly.

I'm not understanding why it would be significantly different for Sonar
to receive tempo insert/delete events from an offline plug rather than
from the GUI or via CAL. When the user does tempo insertion/deletion via
the GUI, there's a mechanism for handling it, which I *imagine* could be
adapted to handle the same information coming from a plug event rather
than the GUI. In my own programming, I have functions that change the
program state, and these are separate from the GUI event-handling
functions. Sorry if I'm missing something obvious.

>If a tempo change is a first class event, why not an audio region?  Why
>not let plugins change that audio that's played on the timeline
>dynamically, by injecting new files to play into the host's sequence?
>Because (I think we all recognize) that this amount of realtime
>interaction crosses the line between what we understand to be "easy" on
>the host side, and what's "hard."  If you want a plugin to generate
>audio, write a sampler.

Don't offline plugs implicitly inject new files into the sequence when
they process the user's selection and write to disk, and the host
replaces the original audio file (or clip) with the processed audio file?

Now when you say "realtime interaction", if you're thinking that this is
all happening while the transport is running, I agree with you. Again,
I'm discussing *offline* processing, ie with transport stopped.

>By the same token, if you want a plugin to mess with tempo, maybe the
>solution is to put it downstream from whatever data it hopes to munge?

Except that in the example I've given, that would do no good. I'm
operating on the assumption that the host has sequencing capabilities of
its own that I'd like to influence.


Generalized Music Plugin Interface (GMPI) public discussion list
Participation in this list is contingent upon your abiding by the
following rules:  Please stay on topic.  You are responsible for your own
words.  Please respect your fellow subscribers.  Please do not
redistribute anyone else's words without their permission.

Archive: //www.freelists.org/archives/gmpi
Email gmpi-request@xxxxxxxxxxxxx w/ subject "unsubscribe" to unsubscribe
