Do you have any way of getting a real position from the layer below you? That would be much better than relying on KeQueryInterruptTime(). Or are you throwing the data on the floor like MSVad does?

Don't use my code; it's just a proof of concept to demonstrate that you can get much better than 0.29 ms of jitter by using sample accuracy instead of millisecond accuracy. In particular, I really don't like this line:

    ULONG ulByteDisplacement = (ULONG)((m_ulDmaMovementRate * ullElapsed + ullHnsInHalfASec) / ullHnsPerSec);

This line introduces slop into the byte position, which accumulates over time. I think that's why my skew actually regressed (but I'm not sure). I'm working on a better version which attempts to maintain sample granularity but eliminates the slop.