[gmpi] Re: 3.15 MIDI

  • From: "Martijn Sipkema" <m.j.w.sipkema@xxxxxxxxxxxxxxxxxx>
  • To: <gmpi@xxxxxxxxxxxxx>
  • Date: Wed, 16 Jun 2004 13:00:29 +0100

> >>>And I still haven't seen a good reason for only having a single control
> >>>protocol...
> >>>
> >>
> >>Let me try to give you a good reason. Let's say I am a synthesizer
> >>plugin, one sine wave oscillator, which takes a note on of a specific
> >>pitch. So let's say the host can pass GMPI events and MIDI mixed
> >>together.
> >>For GMPI events, the note on has a frequency attached which tells me
> >>what frequency to generate. For the MIDI, I get a note number, which I
> >>need to translate into a frequency.
> >>So how do I translate note numbers into frequency? Well, I use a
> >>lookup table. Uh-oh, that's the same way that the host translates note
> >>numbers into GMPI events with frequency. Presto, either there are two
> >>tables which the user has to set independently, or the plugin and host
> >>have to share a translation table. And this is a really simple
> >>situation.
> >
> >
> > Actually I don't think that is a valid reason. The table might just
> > as well belong to a patch, and thus one could argue that it should
> > be on the plugin side. Also, though it should probably be possible,
> > it is unlikely that one would use both MIDI and GMPI parameters to
> > control a synthesizer (note on/off) at the same time, and one would
> > probably only use a protocol other than MIDI if MIDI did not
> > suffice. And what is the problem with the conversion being done in
> > two places? The tables don't have to be shared. In the case of GMPI
> > parameters only, will the host ask the plugin for the note-to-pitch
> > conversion table that belongs to a patch and update it on patch
> > changes?
>
> I don't see how it is better for the user to need to set up a table for
> each synthesizer to handle MIDI->frequency translation, as opposed to
> doing it once in the host.

The table will in general be part of a patch, e.g. stretched tuning for
a solo piano patch. There is not necessarily a requirement for the user
to set up these tables; in fact, most users will not want to do so.

If the GMPI parameters do indeed pass absolute frequency, then the
result is _not_ compatible with MIDI. I believe such a system has its
uses and could be supported alongside MIDI.

> So let me try another tack. The GMPI event protocol will be a superset
> of MIDI. Every MIDI message will be able to be passed using a GMPI
> event. In the case of SysEx, it will be wrapped, not translated. So
> NOTHING IS LOST!

Then why not support MIDI as a byte stream instead of encapsulating it
in some other encoding? Also, I don't believe the encapsulation will be
lossless _unless_ it is transparent, i.e. byte exact.
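To make "byte exact" concrete, here is a hedged sketch of what
transparent encapsulation could mean: the event carries the raw MIDI
bytes unmodified, with only a timestamp added, so unwrapping yields
exactly the bytes that went in. The struct and function names are my
own illustration, not anything from a GMPI draft.

```c
#include <string.h>

/* a real design would size this dynamically, e.g. for long SysEx */
#define MAX_MIDI_BYTES 256

typedef struct {
    unsigned long long timestamp;        /* e.g. sample-accurate time */
    size_t len;                          /* number of raw MIDI bytes  */
    unsigned char bytes[MAX_MIDI_BYTES]; /* the untouched byte stream */
} MidiEvent;

/* Wrap raw MIDI bytes without reinterpreting them. */
MidiEvent midi_wrap(unsigned long long ts,
                    const unsigned char *data, size_t len)
{
    MidiEvent ev;
    ev.timestamp = ts;
    ev.len = len;
    memcpy(ev.bytes, data, len);
    return ev;
}

/* Unwrap: copy the bytes back out, byte for byte. */
size_t midi_unwrap(const MidiEvent *ev, unsigned char *out)
{
    memcpy(out, ev->bytes, ev->len);
    return ev->len;
}
```

Anything that instead parses the bytes into typed fields and
re-serializes them on output risks losing information the original
stream carried.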

> But because GMPI events are a superset of MIDI events, there are other
> possible messages besides the types represented using MIDI, and even the
> MIDI ones have added benefits like timestamps.

I'm also in favor of timestamps for the MIDI byte stream. I just want
to keep MIDI separate from other (control) protocols.

> All of this is managed by the host. MIDI comes into the host. The host
> applies whatever mapping the user requests, and sends GMPI events to the
> plugin. This way, it is also the host's responsibility when OSC messages
> come in - they are turned into GMPI events.
> The reverse happens for output. A plugin outputs GMPI events, which are
> mapped to MIDI by the host.

And I'm saying that I'd like the possibility to not have MIDI mapped
before input and after output. That way it is certain to contain all
the information that the MIDI stream contained.

> Where is the problem here? Why is there all of this yelling? Everyone
> who needs only MIDI is completely satisfied.

There is no yelling, at least not from me. And I'm telling you that I am
_not_ satisfied.

> Everyone who needs more
> than MIDI is satisfied to the extent to which GMPI events meet their
> needs. Before joining the corporate music community, I worked as a
> composer and programmer in the experimental music community, and I
> frequently needed more than MIDI to do the things that I wanted. So no
> one is saying throw out MIDI. We are simply saying that there are
> legitimate musical needs for which MIDI is ill-suited, so let's use a
> protocol that is the intersection of MIDI and those needs.

I still haven't seen a single argument as to why there can be only one
protocol. No single protocol will best suit all needs, so I think it
would be good to support more than one.

--ms



----------------------------------------------------------------------
Generalized Music Plugin Interface (GMPI) public discussion list
Participation in this list is contingent upon your abiding by the
following rules:  Please stay on topic.  You are responsible for your own
words.  Please respect your fellow subscribers.  Please do not
redistribute anyone else's words without their permission.

Archive: //www.freelists.org/archives/gmpi
Email gmpi-request@xxxxxxxxxxxxx w/ subject "unsubscribe" to unsubscribe
