[gmpi] Re: Topic 6: Time representation

  • From: David Olofson <david@xxxxxxxxxxx>
  • To: gmpi@xxxxxxxxxxxxx
  • Date: Sun, 11 May 2003 20:36:48 +0200

On Saturday 10 May 2003 18.03, Vincent Burel wrote:
[...]
> > > > First, you can run the MIDI I/O thread at a higher priority
> > > > than the audio thread, and it'll give you pretty accurate
> > > > timing.
> > >
> > > Yes, but it's not usual, and it's hardly applicable under
> > > Windows, for example.
> >
> > Why not? I usually let background services in Windows 2000 have
> > priority over foreground stuff, since it gives ASIO and other
> > drivers a bit more life. So I'm pretty sure that thread
> > priorities are applicable in Windows.
>
> As far as I know, there is nothing precise below 1 ms under Windows.
> Even a multimedia timer cannot guarantee strictly one ms between two
> calls, because some system task (especially a driver) can lock the
> CPU for an indefinite time. Even the ASIO driver callback is an
> event which can be delayed by the system for any reason (displaying
> an info box, for example) and cut your sound,

Right; Windows is not a real-time OS. With some luck, it'll deliver 
with quite reasonable accuracy most of the time, but not always. Deal 
with it, or look for another OS.


> and just because a PC cannot provide a precise 1 ms timer as a
> built-in feature.

(I assume that by "PC" you mean "Windows PC". Although the ISA legacy 
stuff is indeed utter crap, it's not *that* bad. This is an OS 
problem.)


> In the years 1998 to 2000, I remember some audio manufacturers were
> using the multimedia timer, or whatever timer driven by the PIT
> (the programmable interval timer of the PC), to compute time and
> drive the MIDI sequencer. MIDI was synchronized to this PIT, while
> the audio stream depended, of course, on the audio board's clock.
> But the PIT, when programmed for 1000 calls per second, induces
> about 9 ms of drift per minute (due to the base frequency of the
> PIT). So many clients were complaining about audio and MIDI falling
> out of sync after 3 or 5 minutes... Why!? Just because there were
> two clocks. Now they just use a timer to push MIDI events, but they
> get the time from the audio stream or from an external clock.

No. This happened because there was no sync/lock between the clocks. 
Any two free running oscillators will drift apart at some rate, 
unless you sync them. If you can't sync them, you have to adjust the 
virtual clocks you generate from them.
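
To put a rough number on that drift (the arithmetic below is my own 
back-of-the-envelope reconstruction of the 9 ms/minute figure quoted 
above, not something stated in the thread): the PC's PIT runs from a 
1,193,182 Hz base clock and only takes integer divisors, so a nominal 
1 kHz tick is really 1193182 / 1193, or about 1000.15 Hz. A quick C 
sketch:

    #include <stdio.h>

    /* Assumed figures: drift of a PIT-driven "1 kHz" tick against an
     * ideal clock. The PIT base frequency is 1193182 Hz; the divisor
     * must be an integer, so 1000 Hz is approximated by 1193. */
    int main(void)
    {
        const double pit_base_hz = 1193182.0;
        const int divisor = 1193;                 /* closest to 1193.182 */
        double actual_hz = pit_base_hz / divisor; /* ~1000.15 Hz */

        /* Software that counts each tick as exactly 1 ms gains this
         * many milliseconds per minute against real time: */
        double drift_ms_per_min = (actual_hz - 1000.0) * 60.0;

        printf("actual tick rate: %.4f Hz\n", actual_hz);
        printf("drift: %.2f ms per minute\n", drift_ms_per_min);
        return 0;
    }

That prints roughly 9.2 ms per minute, which is in the same ballpark 
as the figure quoted above; the audio board's crystal has its own 
(smaller) error on top of that, so the two clocks inevitably walk 
apart unless one is corrected against the other.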

I believe what you've seen/heard was an example of what happens when 
audio card drivers don't implement all features properly. AFAIK, the 
most common problem with the pre-DirectSound drivers was that many 
drivers didn't implement the audio output position API properly. Some 
sequencers used the timing of the callbacks/messages instead, but 
that just trades a stable clock for a random factor that depends on 
scheduling latency.
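
A minimal sketch of the alternative, deriving the sequencer's time 
base from the audio stream itself instead of from callback timing 
(hw_frames_played() is a hypothetical placeholder; ASIO, DirectSound 
and friends each expose the playback position in their own way):

    #include <stdint.h>

    /* Hypothetical driver query: total frames the hardware has
     * actually played since the stream started. */
    extern uint64_t hw_frames_played(void);

    /* Sequencer time in seconds, derived from the audio clock.
     * Since it comes from the same clock that drives playback, it
     * cannot drift against the audio, no matter how late the OS
     * delivers callbacks or timer messages. */
    double audio_time_seconds(double sample_rate)
    {
        return (double)hw_frames_played() / sample_rate;
    }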


> > As MIDI data needs to be worked with more often than audio (MIDI
> > data needs to be sent at the time events should occur, while audio
> > is serviced in blocks), it makes perfect sense to have MIDI
> > processed in a higher-priority thread than the audio one.
>
> Perfect sense!? Personally, I have never seen that, and I cannot
> imagine an audio workstation where the audio stream might be cut by
> incoming MIDI data!

It makes perfect sense if it's the only way you can timestamp incoming 
MIDI events, or deliver MIDI events on time. If you have a serious 
MIDI API, you shouldn't need to do it.
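
A minimal sketch of that arrangement, under POSIX assumptions 
(pthreads and clock_gettime() are my choice here; midi_read() and 
queue_push() are hypothetical helpers, not any API from this thread): 
a high-priority MIDI thread stamps each event on arrival and hands it 
to the audio thread through a queue. The thread itself would be 
started with pthread_create() and given elevated priority (e.g. 
SCHED_FIFO), which I'm leaving out for brevity.

    #include <pthread.h>
    #include <time.h>
    #include <stdint.h>

    typedef struct {
        uint8_t  data[3];   /* raw MIDI message (status + data bytes) */
        uint64_t time_ns;   /* arrival time, monotonic nanoseconds    */
    } timestamped_event;

    /* Hypothetical helpers: blocking read from the MIDI port, and a
     * lock-free queue feeding the audio thread. */
    extern int  midi_read(uint8_t msg[3]);
    extern void queue_push(const timestamped_event *ev);

    void *midi_thread(void *arg)
    {
        (void)arg;
        for (;;) {
            timestamped_event ev;
            if (midi_read(ev.data) != 0)
                break;

            /* Stamp as close to arrival as possible; this is the
             * whole reason the thread outranks the audio thread. */
            struct timespec ts;
            clock_gettime(CLOCK_MONOTONIC, &ts);
            ev.time_ns = (uint64_t)ts.tv_sec * 1000000000ull
                       + (uint64_t)ts.tv_nsec;

            queue_push(&ev);
        }
        return NULL;
    }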

As to cutting up an audio stream as a result of an incoming MIDI 
event, well, it's completely pointless, because plugins do not execute 
all in parallel, one sample at a time, at the audio rate. The chance 
of you intercepting the right plugin at the right position in the 
buffer is microscopic, and it's not even theoretically possible if 
the plugin does multiple passes internally.

The whole point is to timestamp incoming MIDI events as accurately as 
possible, so you can turn random jitter of +/- half the audio buffer 
duration into a constant latency. If you need 2 fragments of 15 ms 
for reliable audio, you want 30 ms MIDI->audio latency (playable); 
not 22.5 +/- 7.5 ms (useless).
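
A sketch of how that constant latency falls out, continuing the 
assumptions above (the names are carried over from the earlier 
sketches and are not an existing API): in each audio callback, the 
event's arrival time plus a fixed latency is turned into a sample 
offset within the buffer being rendered, so events land a constant 
delay after arrival instead of snapping to buffer boundaries.

    #include <stdint.h>

    /*
     * event_time_ns   - when the event arrived (from the MIDI thread)
     * latency_ns      - fixed MIDI->audio latency, e.g. one buffer
     * buffer_start_ns - time the start of the current buffer maps to
     * sample_rate     - audio sample rate in Hz
     */
    int32_t event_sample_offset(uint64_t event_time_ns,
                                uint64_t latency_ns,
                                uint64_t buffer_start_ns,
                                double   sample_rate)
    {
        /* Target playback time = arrival + constant latency. */
        uint64_t play_ns = event_time_ns + latency_ns;

        /* Late events (target already past) are clamped to the start
         * of this buffer rather than dropped. */
        if (play_ns <= buffer_start_ns)
            return 0;

        /* Offset from the buffer start, in samples. */
        double offset_s = (double)(play_ns - buffer_start_ns) * 1e-9;
        return (int32_t)(offset_s * sample_rate);
    }

Events whose offset falls beyond the end of the current buffer would 
simply stay queued for a later callback.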


[...]
> Synchronisation usually means that there is one master clock and one
> slave. We cannot talk here about two different clocks.

Well, life is good if you have serious hardware. :-)

However, this doesn't connect MIDI timing to audio time in any way, 
unless you have an audio/MIDI interface that timestamps MIDI events 
with audio sample counts.


//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`-----------------------------------> http://audiality.org -'
   --- http://olofson.net --- http://www.reologica.se ---

