On Thu, Feb 19, 2004 at 08:06:13AM -0500, Ron Kuper wrote:
> >>>
> 2. Tempo sync
> A plugin wants to do a tempo-synced delay. The plugin needs to find out
> when the time will be exactly 3 beats from "now" at the current tempo,
> and when that tempo changes.
> <<<
>
> I agree that this is a valid use case, but doesn't this put us straight
> into a requirement that plugins be allowed to look ahead to future tempo
> changes? It is possible to do tempo sync without tempo look-ahead, as
> long as every tempo change that occurs within an audio frame is given to
> the plugin in time. (Maybe just leave out the "3 beats from now" in the
> use case.)

Right. The wording needs to be fixed, then :)

What I am trying to convey is that the plugin needs to figure out when
some future "now" is 3 beats after some past "now". Perhaps:

A plugin wants to do a tempo-synced delay. The plugin needs to find out
exactly how many sample frames constitute a given number of beats at the
current tempo, and when the tempo changes.

> >>>
> 3. Quantizing
> <<<
>
> This poses an interesting problem which I'm not sure we've gotten to
> yet. A quantizer might need to snap events to a point in time located
> in the past. IOW, an event at 1:1:005 might need to be quantized to
> 1:1:000. What happens in the case where the audio processing frame
> spans musical time 1:1:005-1:1:010, that is, the audio processing
> frame for the desired quantized time has already passed?
>
> I guess I'm saying this suggests that plugins need to be able to report
> their processing delay in musical units as well as sample units?

So latency changes with the tempo? Eeek.

I think that if you need to quantize into the past, then you need to have
some amount of latency. Maybe there is a parameter which the user can
change to allow more backwards time? That doesn't require look-ahead, but
could benefit from it. Given the nature of quantizing, it would need to
look ahead at live input, so I think latency is the best answer.

> >>>
> 5.
> Automatic accompaniment
> <<<
>
> Again, can this be done without tempo map look-ahead?

Assuming you can get a smart enough algorithm and a fast enough system,
I don't see why not.

> >>>
> 6. Realtime tempo follower
> 7. Offline tempo analysis
> <<<
>
> I would think that 7 follows from 6, using the rule that offline is a
> special case of realtime?

7 wants to bundle up a map and hand it back to the host. 6 just wants to
send events just in time. That's the fundamental difference.

> >>>
> 8. Sequencer tracks
> 9. Audio tracks
> Is it notified of transport position in samples?
> <<<
>
> This would be the "score time" (as opposed to the "stream time"). Every
> processing frame would know the score time.

But each of those works better with a different unit base. An audio track
(without time stretching or beat marking) would not care about the
musical unit. It would have a hard time converting "Bar 40, Beat 3" into
a proper sample offset. A music-oriented sequencer track would have a
hard time converting "sample 123456" into a musical position.

> >>>
> What if meter changes dynamically, but the sequenced data no longer
> makes sense?
> <<<
>
> When does a sequencer plugin cease to be a plugin, and start becoming
> a host? <g>

A question I have been asking myself more and more. Having a dynamic
tempo is not too bad. Having a dynamic time signature is a LOT more
complicated. Maybe it is not worth it? Or maybe those FEW plugins that
care about meter will figure it out on their own?

----------------------------------------------------------------------
Generalized Music Plugin Interface (GMPI) public discussion list
Participation in this list is contingent upon your abiding by the
following rules:

Please stay on topic.  You are responsible for your own words.
Please respect your fellow subscribers.  Please do not redistribute
anyone else's words without their permission.

Archive: //www.freelists.org/archives/gmpi
Email gmpi-request@xxxxxxxxxxxxx w/ subject "unsubscribe" to unsubscribe
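P.S. To illustrate the beats-to-frames arithmetic behind the tempo-sync
use case above: given only the tempo changes the host has already
delivered (a piecewise-constant tempo map, no look-ahead), a plugin can
compute how many sample frames a given number of beats spans. A minimal
C sketch; all names here are hypothetical and not part of any proposed
GMPI API:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical tempo-map entry: at musical position `beat`, the tempo
 * becomes `bpm`.  Entries are sorted by beat; the first entry is at
 * beat 0, so every beat position is covered by some segment. */
typedef struct {
    double beat;  /* position (in beats) where this tempo takes effect */
    double bpm;   /* beats per minute from that position onward */
} TempoChange;

/* Convert a span of `beats` (measured from beat 0) into sample frames by
 * walking the tempo map segment by segment.  Each segment contributes
 * (beats in segment) * (seconds per beat) * (frames per second). */
double beats_to_frames(const TempoChange *map, size_t n,
                       double beats, double sample_rate)
{
    double frames = 0.0;
    for (size_t i = 0; i < n; i++) {
        double seg_start = map[i].beat;
        double seg_end = (i + 1 < n) ? map[i + 1].beat : beats;
        if (seg_end > beats)
            seg_end = beats;           /* clamp to the requested span */
        if (seg_end <= seg_start)
            continue;                  /* segment lies past the span */
        frames += (seg_end - seg_start) * (60.0 / map[i].bpm) * sample_rate;
    }
    return frames;
}
```

For example, with a constant 120 BPM map at 48 kHz, 3 beats is
3 * 0.5 s * 48000 = 72000 frames; if the tempo drops to 60 BPM at beat
2, the same 3 beats is 2 * 0.5 s + 1 * 1.0 s = 2 s, i.e. 96000 frames.
This is consistent with the point above that no tempo look-ahead is
required, as long as every change is handed to the plugin in time.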