> The spec says that "when an MSB is received, a receiver should
> set its concept of the LSB to zero". No timeout is stated, or
> implied, nor should one be assumed, I think.

Right, and this is really a poor design (for s/w at least). I've
discussed this with about a dozen people over the last 3-4 years, and
nobody can figure out what the original spec team was imagining.

IIRC, there's at least 0.6 msec between the MSB and LSB messages, or
20-60 samples at typical sample rates. That's enough to cause really
glaring artifacts when combined with sample-accurate event delivery.

If you reset the LSB to zero the moment the MSB arrives, you get weird
zipper noise. If you wait to see whether the LSB arrives, you can avoid
the zipper noise, but things get tricky from a s/w implementation point
of view: timeouts, interpolated values, etc. etc.

--p

----------------------------------------------------------------------
Generalized Music Plugin Interface (GMPI) public discussion list
Participation in this list is contingent upon your abiding by the
following rules:

Please stay on topic. You are responsible for your own words. Please
respect your fellow subscribers. Please do not redistribute anyone
else's words without their permission.

Archive: //www.freelists.org/archives/gmpi
Email gmpi-request@xxxxxxxxxxxxx w/ subject "unsubscribe" to unsubscribe
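
The "wait to see if the LSB arrives" strategy discussed above could be
sketched roughly as follows. This is just an illustration of the idea,
not anything from the MIDI spec or GMPI drafts: the class name,
`timeout_samples` parameter, and the per-block `tick()` call are all
invented for the example. It holds an incoming MSB pending; if the
matching LSB arrives within the timeout, one clean 14-bit value is
emitted, otherwise the MSB is committed with the LSB forced to zero
(the behavior the spec mandates for an MSB arriving alone).

```python
class Cc14Receiver:
    """Hypothetical pairing of MSB/LSB for one 14-bit MIDI controller.

    Strategy sketched in the discussion above: hold a new MSB pending
    rather than zeroing the LSB immediately (which causes zipper noise);
    only fall back to "LSB = 0" if no LSB shows up within the timeout.
    """

    def __init__(self, timeout_samples=64):
        # ~0.6 msec between MSB and LSB is 20-60 samples at typical
        # sample rates, so the timeout is on that order.  Illustrative.
        self.timeout = timeout_samples
        self.msb = 0
        self.lsb = 0
        self.pending = None  # (msb_value, arrival_sample) or None

    def on_msb(self, value, now):
        """New MSB arrived; returns an emitted value or None."""
        out = self.tick(now)        # flush any stale pending MSB first
        self.pending = (value, now) # hold it; don't zero the LSB yet
        return out

    def on_lsb(self, value, now):
        """LSB arrived; completes a pending MSB or refines the old one."""
        if self.pending is not None:
            self.msb, _ = self.pending
            self.pending = None
        self.lsb = value
        return (self.msb << 7) | self.lsb

    def tick(self, now):
        """Call once per block: commits a timed-out MSB, LSB zeroed."""
        if self.pending is not None:
            msb, arrived = self.pending
            if now - arrived >= self.timeout:
                self.pending = None
                self.msb, self.lsb = msb, 0  # spec: lone MSB resets LSB
                return self.msb << 7
        return None
```

A real implementation would also have to decide what to do about the
jump when the timeout fires (e.g. interpolate toward the new value
rather than stepping), which is exactly the extra complexity complained
about above.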