[gmpi] Re: Generalized Music Plugin Interface list is now online

  • From: David Olofson <david@xxxxxxxxxxx>
  • To: gmpi@xxxxxxxxxxxxx
  • Date: Tue, 11 Feb 2003 15:51:55 +0100

On Tuesday 11 February 2003 15.15, RonKuper@xxxxxxxxxxxx wrote:
> > I agree. No midi communication between host and plugin unless
> > there's really no other way.
> I don't think it's prudent to toss aside MIDI as a communication
> mechanism.

It has served well, still works well for what it's meant for, and has
a good feature set. However, it's still a protocol designed for low
bandwidth wires, not for APIs that communicate through function calls
and timestamped events.
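To make the contrast concrete, here is a minimal sketch of what
"timestamped events delivered by function call" might look like. All
names here are hypothetical illustrations, not from any GMPI draft:

```c
#include <stdint.h>

/* A typed, sample-stamped event: the host hands the plugin ready-made
 * structures, so no byte-level MIDI decoding happens in the plugin. */
typedef enum { EV_NOTE_ON, EV_NOTE_OFF, EV_CONTROL } EventType;

typedef struct {
    uint32_t  frame;   /* timestamp in sample frames within the block */
    EventType type;
    int32_t   channel;
    float     arg1;    /* e.g. pitch in Hz - not limited to 12-TET */
    float     arg2;    /* e.g. velocity or control value, 0.0 .. 1.0 */
} Event;

/* A host would deliver a block's events through one call, e.g.
 * plugin->process(events, count, frames); the plugin just walks the
 * frame-ordered array. Here, a trivial dispatch helper: */
static const char *event_name(const Event *ev)
{
    switch (ev->type) {
    case EV_NOTE_ON:  return "note_on";
    case EV_NOTE_OFF: return "note_off";
    case EV_CONTROL:  return "control";
    }
    return "unknown";
}
```

Note that `arg1` as a float pitch (rather than a 7-bit note number) is
exactly what makes non-12-tone scales trivial in such a design.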

> First, defining an alternative is a huge exercise onto itself.
> It's tantamount to saying that GMPI = MIDI 2 + other stuff.  That's
> way too broad a scope, IMO.

Why? Is it an unrealistic effort to read the MIDI (2) spec and map its
features more or less directly onto a less cumbersome API, designed
for a plugin interface rather than a wire?

I think you're underestimating the cost of parsing MIDI inside every
plugin, in terms of extra work for plugin authors and increased risk
of bugs.
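For a sense of that cost, here is an illustrative, deliberately
simplified sketch of what every plugin would have to carry just to
reassemble complete messages from a raw MIDI byte stream (system and
SysEx messages omitted; names are mine, not from any spec or API):

```c
#include <stdint.h>

/* State a MIDI byte-stream parser must keep per input stream. */
typedef struct {
    uint8_t status;      /* last status byte seen (running status) */
    uint8_t data[2];     /* collected data bytes */
    int     need, have;  /* expected / collected data byte count */
} MidiParser;

/* Data-byte count for a channel message, by status nibble. */
static int data_bytes(uint8_t status)
{
    switch (status & 0xF0) {
    case 0xC0: case 0xD0: return 1;  /* program change, ch. pressure */
    default:              return 2;  /* note on/off, CC, bend, ... */
    }
}

/* Feed one byte; returns 1 when a complete message is assembled in
 * p->status / p->data. Running status means data bytes may arrive
 * with no preceding status byte - the parser must remember it. */
static int midi_feed(MidiParser *p, uint8_t byte)
{
    if (byte & 0x80) {               /* status byte */
        p->status = byte;
        p->need = data_bytes(byte);
        p->have = 0;
        return 0;
    }
    if (!p->status)                  /* stray data byte; drop it */
        return 0;
    p->data[p->have++] = byte;
    if (p->have < p->need)
        return 0;
    p->have = 0;                     /* keep p->status: running status */
    return 1;
}
```

With typed events, none of this state machine exists in the plugin at
all; with raw MIDI, every plugin author rewrites (and re-debugs) it.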

Looking at VST, I'm also worried - no, *convinced* - that most
plugins will support only the very basic MIDI real time messages,
causing severe headaches for any users who want to go beyond simple
12 tone scales and the note on/off paradigm. IMNSHO, VST 2.0 made a
serious design mistake in this area, and doing it all over again
seems rather silly.

> Furthermore, think about interop with hardware gear.  A host app or
> even a plugin may want to listen to data from external gear or send
> data to it.

Hosts can do whatever they like, of course. As to plugins, I refer to
that kind of plugin as a "driver plugin" - i.e. a special plugin used
to interface with the world outside the host. Driver plugins can be
used to wrap all audio and control I/O in a host, to avoid special
cases when making connections and running the processing graph. (The
only really significant exception is that an audio driver plugin
would block on audio blocks, and thus control the scheduling of the
audio thread.)

> The user of a sequencer may want to seamlessly swap
> hardware synths with software synths.

No problem, as long as the concepts expressed in one protocol can be
translated into the other at all.

> Finally, MIDI is thing that can help the hardware world migrate to
> software. I'm sure commercial hardware synth companies have an
> investment in embedded DSP and synthesis code.  This code is no
> doubt MIDI based.  By having a software based solution support
> MIDI, the hardware guys can move over a bit easier.

I don't think it makes much of a difference, really. Many older synths
even have a separate MCU just for the MIDI parsing, and considering
the way MIDI has to be parsed, I can hardly see how modern h/w synths
(or s/w synths for that matter) could use MIDI without a similar
separate layer.

//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`---------------------------> http://olofson.net/audiality -'
   --- http://olofson.net --- http://www.reologica.se ---

Generalized Music Plugin Interface (GMPI) public discussion list
Participation in this list is contingent upon your abiding by the
following rules:  Please stay on topic.  You are responsible for your own
words.  Please respect your fellow subscribers.  Please do not
redistribute anyone else's words without their permission.

Archive: //www.freelists.org/archives/gmpi
Email gmpi-request@xxxxxxxxxxxxx w/ subject "unsubscribe" to unsubscribe
