>I think the point is that anything that is stored in the sequencer can be
>delivered timestamped for the future before its timeslice. Anything that
>is realtime (user inputs, UI, MIDI, outputs from an upstream processor, etc)
>would have to be latent.

the only category there that interests me is "outputs from an upstream
processor". as several of us have noted, we can't do anything about async
input from user devices. but the upstream processors might be
mini-sequencers in themselves. they expect to deliver events with N
timeslices of latency, where N has been assumed by several of us to be 1.

as soon as you require that some nodes can ask for a longer event delivery
latency, each of these nodes has to adjust what it's doing. that doesn't
seem very nice to me.

for a simple example, imagine something like a mini-pattern sequencer
running as a GMPI plugin, sending music data of some kind to other plugins.

--p

----------------------------------------------------------------------
Generalized Music Plugin Interface (GMPI) public discussion list
Participation in this list is contingent upon your abiding by the
following rules:

Please stay on topic.  You are responsible for your own words.
Please respect your fellow subscribers.  Please do not
redistribute anyone else's words without their permission.

Archive: //www.freelists.org/archives/gmpi
Email gmpi-request@xxxxxxxxxxxxx w/ subject "unsubscribe" to unsubscribe