Thanks for your response. Please focus on the musical requirements, the use cases, and musical examples, not on my own ideas for how to fulfill those requirements.

The musical requirements are to provide GMPI plugins with the data they require (a) to synchronize internal scores, score generators, and event processors with the host's performance, e.g. to generate an accompaniment to a stored sequence with "feel", and (b) to transmit performance gestures to plugins such that they can modify synthesis in accordance with the intention of the gesture.

It helps if the gestural data (turning a knob, for example) is tied to the note or notes (voice or voices) that knob is meant to affect (e.g., vibrato rate for strings playing a specific Ebm9). It would be unwise to assume that a knob controls a "timbre" or "instrument" as a whole; it might well control one specific note (Yamaha used to make a fancy electronic organ in which each key had a separate key pressure sensor and could apply vibrato to each note independently).

My experience as performer, composer, and above all listener has taught me that music software consistently falls short of these requirements -- FAR short. My experience as a developer has taught me that plugin specifications make oversimple assumptions, and these can be (somewhat) defended against by providing access to ALL data on the other side of the interface.

Original Message:
-----------------
From: Tim Hockin thockin@xxxxxxxxxx
Date: Tue, 23 Dec 2003 22:56:43 -0800
To: gmpi@xxxxxxxxxxxxx
Subject: [gmpi] Re: Reqs 3.8 Events - gesture start/end

On Tue, Dec 23, 2003 at 09:45:28PM -0500, Michael Gogins wrote:
> GMPI shall provide to plugins all information present in scores and
> sequences. This presupposes some agreement as to score and sequence
> representation (extended MIDI, NIFF, Music XML, just a list of possible
> data elements, etc.).

> GMPI shall provide to plugins all data used by hosts to control plugins.
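[As an illustration of the per-note gesture requirement above: one way the note-linked control idea could be sketched in C. Every name here is hypothetical -- none of this comes from any GMPI draft; it only shows a gesture event carrying a note ID so that a knob can affect one sounding note of a chord without touching the others.]

```c
/* Hedged sketch of a per-note control event (hypothetical names only). */
#include <assert.h>
#include <stddef.h>

typedef unsigned int NoteId;   /* host-assigned id for one sounding note */

typedef struct {
    NoteId note;               /* which note/voice the gesture targets */
    int    control;            /* hypothetical control number          */
    double value;              /* control value                        */
} PerNoteControlEvent;

typedef struct {
    NoteId note;
    double vibrato_rate;       /* Hz */
} Voice;

enum { CTL_VIBRATO_RATE = 1 };

/* Apply the event only to the voice whose note id matches; the other
 * notes of the same chord (say, an Ebm9) are left untouched. */
static int apply_control(Voice *voices, size_t n,
                         const PerNoteControlEvent *ev)
{
    for (size_t i = 0; i < n; ++i) {
        if (voices[i].note == ev->note) {
            if (ev->control == CTL_VIBRATO_RATE)
                voices[i].vibrato_rate = ev->value;
            return 1;          /* handled */
        }
    }
    return 0;                  /* unknown note id: event ignored */
}
```

[The point of the sketch is only the addressing: because the event names a note, not an instrument, a per-key pressure sensor or a GUI knob can drive vibrato on exactly one voice.]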
I am trying to follow your thinking, but I just keep getting confoozled. I *think* I get what you are getting at with this - can you give me some more concrete examples?

> GMPI shall link gestural data to the note or notes which the gesture
> controls, by note ID.

How does gesture data tie into notes? I thought gesture start/end was mostly to give the host some clues about associated events. If I click and drag a knob in the GUI, how does that tie to a note?

> Plugins shall be able to instruct the host to filter incoming events.
> Some plugins may request only note on-note off type data. Others may
> request page turns, full dynamic markings, and changes of meter.

This is the notion I have been describing as "subscribing to" or "asking for". I think that THIS is the fundamental point of the whole XAP design, and I think it makes a lot of sense for GMPI.

----------------------------------------------------------------------
Generalized Music Plugin Interface (GMPI) public discussion list
Participation in this list is contingent upon your abiding by the
following rules: Please stay on topic. You are responsible for your own
words. Please respect your fellow subscribers. Please do not
redistribute anyone else's words without their permission.

Archive: //www.freelists.org/archives/gmpi
Email gmpi-request@xxxxxxxxxxxxx w/ subject "unsubscribe" to unsubscribe