Make sure you implement IAudioProcessingObject::GetLatency(...) and report the delay between your input and output. If an app cares about its sound playing all the way through to the very last byte, the design is for the app to use IAudioClient to pad silence onto the end of the stream, and to keep pumping silence until IAudioClock indicates that the very last byte has made it all the way through the speaker.

From: wdmaudiodev-bounce@xxxxxxxxxxxxx [mailto:wdmaudiodev-bounce@xxxxxxxxxxxxx] On Behalf Of Abhinav Singh
Sent: Monday, December 15, 2014 10:12 AM
To: wdmaudiodev@xxxxxxxxxxxxx
Subject: [wdmaudiodev] buffering samples in an APO

I am trying to integrate a DSP algorithm with my GFX APO. The algorithm requires a fixed number of samples, which is greater than the number of samples I get each time the Audio Engine calls my APO's APOProcess() function. So I buffer the samples until I have the required number, and feed the Audio Engine zeros until I have valid output to return. This works, but a few samples are lost at the very end (some samples remain unprocessed in the input buffer, while others are left over in the output buffer). Is there a workaround for this problem?