[wdmaudiodev] Re: Retrieving WAVEFORMATEX from IAudioSessionControl

  • From: Dadi <dadiku@xxxxxxxxx>
  • To: wdmaudiodev@xxxxxxxxxxxxx
  • Date: Wed, 17 Dec 2014 22:57:19 +0200

Hi Matthew,

The upmix effect is integrated with the other surround effects we have, and we
cannot split it into two separate effects (SFX and MFX).
We will check the minimum and/or maximum number of channels and also
analyze the audio stream for our calculations.

In any case, my original question was not only about upmix but also about other
effects (upmix is only one example) for which we need the WAVEFORMATEX of the
active streams.
I thought maybe there is a runtime attribute we can read from
IAudioSessionControl to get the audio format? Or maybe we can QueryInterface
for some other interface that could assist?
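
(For reference, the only documented QueryInterface target we are aware of is
IAudioSessionControl2, which gives the owning process ID and session
identifiers but not the stream format. A rough sketch, with a hypothetical
helper and error handling trimmed:)

#include <audiopolicy.h>

HRESULT InspectSession(IAudioSessionControl *pCtrl)
{
    IAudioSessionControl2 *pCtrl2 = nullptr;
    HRESULT hr = pCtrl->QueryInterface(__uuidof(IAudioSessionControl2),
                                       reinterpret_cast<void **>(&pCtrl2));
    if (FAILED(hr)) return hr;

    DWORD pid = 0;
    hr = pCtrl2->GetProcessId(&pid);            // owning process, not the format
    LPWSTR instanceId = nullptr;
    if (SUCCEEDED(hr))
        hr = pCtrl2->GetSessionInstanceIdentifier(&instanceId);
    // ... correlate pid/instanceId with data gathered elsewhere ...
    if (instanceId) CoTaskMemFree(instanceId);
    pCtrl2->Release();
    return hr;
}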

So we need to find the best way to retrieve this information; we would prefer
to avoid writing orphan SFX APOs just for that purpose.

Thank you,
Dadi



2014-12-17 22:40 GMT+02:00 Matthew van Eerde <
Matthew.van.Eerde@xxxxxxxxxxxxx>:

>  > our effects must all reside in the same APO instance
>
>
>
> Can you elaborate on why?
>
>
>
> The approach we had in mind when designing the system is to do most
> processing in the MFX; processing where the channel count may change in the
> SFX; and hardware compensation in the EFX.
>
>
>
> For example, you could create an SFX that did both 2.0 => 5.1 upmix and
> 5.1 => 5.1 passthrough.
>
>
>
> It should be fine to have out-of-band communication between your SFX and
> your MFX. What would you do in your MFX if there were two active
> streams, one with a stereo WAVEFORMATEX, and one with a 5.1 WAVEFORMATEX?
>
>
>
> *From:* wdmaudiodev-bounce@xxxxxxxxxxxxx [mailto:
> wdmaudiodev-bounce@xxxxxxxxxxxxx] *On Behalf Of *Dadi
> *Sent:* Wednesday, December 17, 2014 12:30 PM
>
> *To:* wdmaudiodev@xxxxxxxxxxxxx
> *Subject:* [wdmaudiodev] Re: Retrieving WAVEFORMATEX from
> IAudioSessionControl
>
>
>
> Hi Matthew,
>
> Yes, we know that, but we process only the global stage (MFX) and our effects
> must all reside in the same APO instance, so in surround we only ever see 5.1 =>
> 5.1.
> In addition, we are also working on new effects (not upmix) and we must
> have the WAVEFORMATEX of the active streams.
>
> The possibilities we currently have are:
> - We can write SFX APOs that always do a full bypass, just for this purpose:
> gather the WAVEFORMATEX information and copy it to shared memory or some
> other IPC mechanism so the MFX APO can read it (see the sketch after this
> list).
>
> - We can also get the assistance of the codec vendor (with a private API) to
> give us this information and work around this limitation, but we would prefer
> a cleaner way.
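>
> Roughly what we have in mind for the shared-memory variant (the section name,
> struct layout, and the single-slot simplification are just for illustration;
> error handling omitted):
>
> #include <windows.h>
> #include <mmreg.h>
>
> struct SharedFormatSlot {
>     LONG                 inUse;  // set via InterlockedExchange when valid
>     WAVEFORMATEXTENSIBLE wfx;    // format the bypass SFX saw at initialization
> };
>
> // SFX side: publish the input format it was given.
> HANDLE hMap = CreateFileMappingW(INVALID_HANDLE_VALUE, nullptr, PAGE_READWRITE,
>                                  0, sizeof(SharedFormatSlot),
>                                  L"Local\\MyVendorSfxFormats");
> SharedFormatSlot *slot = static_cast<SharedFormatSlot *>(
>     MapViewOfFile(hMap, FILE_MAP_ALL_ACCESS, 0, 0, sizeof(SharedFormatSlot)));
> // slot->wfx = <the WAVEFORMATEXTENSIBLE passed to the SFX>;
> // InterlockedExchange(&slot->inUse, 1);
>
> // MFX side: OpenFileMappingW(FILE_MAP_READ, FALSE, L"Local\\MyVendorSfxFormats"),
> // then MapViewOfFile and read slot->wfx once slot->inUse is set.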
>
>
>
> Is there any way to resolve it gracefully?
>
> Thank you,
>
> Dadi
>
>
>
> 2014-12-17 20:08 GMT+02:00 Matthew van Eerde <
> Matthew.van.Eerde@xxxxxxxxxxxxx>:
>
> It’s too late to do any kind of stream-intelligent upmixing in a render
> GFX, MFX or EFX. Everything’s already been channel-converted and mixed
> together by the time it gets to you. If you want to do intelligent channel
> matrixing per stream, you need an SFX.
>
>
>
> Render SFXes (and only render SFXes) can support multiple input formats,
> even with differing channel counts. So on a 5.1 output you can do:
>
> 1. 2.0 => 5.1 (upmix)
>
> 2. 5.1 => 5.1
>
>
>
> And on a 2.0 output, you can do:
>
> 1. 2.0 => 2.0
>
> 2. 5.1 => 2.0 (downmix)
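>
> A very rough sketch of the 5.1-output case (the class name is hypothetical,
> and real code must also validate sample rate, bit depth, a possibly NULL
> opposite format, and the 2.0-output case):
>
> #include <audioenginebaseapo.h>
>
> STDMETHODIMP CMySfx::IsInputFormatSupported(
>     IAudioMediaType *pOppositeFormat,        // the format on the output side
>     IAudioMediaType *pRequestedInputFormat,
>     IAudioMediaType **ppSupportedInputFormat)
> {
>     const WAVEFORMATEX *pOut = pOppositeFormat->GetAudioFormat();
>     const WAVEFORMATEX *pIn  = pRequestedInputFormat->GetAudioFormat();
>
>     // Accept stereo (we will upmix it) or an input whose channel count
>     // already matches the output.
>     if (pIn->nChannels == 2 || pIn->nChannels == pOut->nChannels)
>     {
>         *ppSupportedInputFormat = pRequestedInputFormat;
>         (*ppSupportedInputFormat)->AddRef();
>         return S_OK;
>     }
>     return APOERR_FORMAT_NOT_SUPPORTED;
> }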
>
>
>
> *From:* wdmaudiodev-bounce@xxxxxxxxxxxxx [mailto:
> wdmaudiodev-bounce@xxxxxxxxxxxxx] *On Behalf Of *Dadi
> *Sent:* Tuesday, December 16, 2014 6:02 AM
> *To:* wdmaudiodev@xxxxxxxxxxxxx
> *Subject:* [wdmaudiodev] Re: Retrieving WAVEFORMATEX from
> IAudioSessionControl
>
>
>
> Hi Matthew,
>
> Our global APO needs that information for its processing logic.
>
> E.g. for surround upmix we need to know the channel count of all
> open sessions. That way we can tell whether an app opened a stereo stream to
> play audio (so the APO should upmix), or whether an app is playing full
> surround with intentionally only two channels active (e.g. this can happen
> while a movie is playing), in which case we should NOT do any upmix in the
> APO.
>
> Because there can be many simultaneous audio sessions, we will take the
> minimum or maximum values for reference.
>
> The above is only one example and there are other cases in which our APO
> needs to know the original format.
>
> The Windows API allows us to enumerate all the active audio sessions (we can
> filter per endpoint) and get an IAudioSessionControl for each session, but we
> still need a way to use it to get the active WAVEFORMATEX.
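>
> Roughly the enumeration we mean (error handling and interface cleanup
> omitted):
>
> #include <mmdeviceapi.h>
> #include <audiopolicy.h>
>
> IMMDeviceEnumerator *pEnum = nullptr;
> CoCreateInstance(__uuidof(MMDeviceEnumerator), nullptr, CLSCTX_ALL,
>                  __uuidof(IMMDeviceEnumerator), (void **)&pEnum);
>
> IMMDevice *pDevice = nullptr;
> pEnum->GetDefaultAudioEndpoint(eRender, eConsole, &pDevice);
>
> IAudioSessionManager2 *pMgr = nullptr;
> pDevice->Activate(__uuidof(IAudioSessionManager2), CLSCTX_ALL, nullptr,
>                   (void **)&pMgr);
>
> IAudioSessionEnumerator *pSessions = nullptr;
> pMgr->GetSessionEnumerator(&pSessions);
>
> int count = 0;
> pSessions->GetCount(&count);
> for (int i = 0; i < count; i++) {
>     IAudioSessionControl *pCtrl = nullptr;
>     pSessions->GetSession(i, &pCtrl);
>     AudioSessionState state = AudioSessionStateInactive;
>     pCtrl->GetState(&state);   // AudioSessionStateActive, etc.
>     // ... but nothing here exposes the stream's WAVEFORMATEX ...
>     pCtrl->Release();
> }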
>
>
>
> Note - we do not have any SFX APOs, only global.
>
> Is there any way to resolve this?
>
> Thank you,
>
> Dadi
>
>
>
>
>
>
>
> 2014-12-15 18:58 GMT+02:00 Matthew van Eerde <
> Matthew.van.Eerde@xxxxxxxxxxxxx>:
>
> Conceptually, an audio session (IAudioSessionControl) is an (app,
> endpoint) pair.
>
>
>
> An app can have zero or more active streams (IAudioClient) on a given
> endpoint at any given time.
>
>
>
> The stream has an associated WAVEFORMATEX (what was passed to
> IAudioClient::Initialize) but there’s no way to retrieve it from the
> IAudioClient.
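>
> For context, the format exists only at the point the app calls
> IAudioClient::Initialize; for a shared-mode stream that is typically the mix
> format. Sketch (error handling omitted):
>
> #include <audioclient.h>
>
> // pAudioClient obtained via IMMDevice::Activate(__uuidof(IAudioClient), ...)
> WAVEFORMATEX *pWfx = nullptr;
> pAudioClient->GetMixFormat(&pWfx);        // the app knows the format here...
> pAudioClient->Initialize(AUDCLNT_SHAREMODE_SHARED, 0,
>                          10000000 /* 1 second, in 100-ns units */, 0,
>                          pWfx, nullptr);  // ...but it cannot be read back later
> CoTaskMemFree(pWfx);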
>
>
>
> There is also no way to get the list of active IAudioClients from an
> IAudioSessionControl.
>
>
>
> What is the larger problem you’re trying to solve?
>
>
>
> *From:* wdmaudiodev-bounce@xxxxxxxxxxxxx [mailto:
> wdmaudiodev-bounce@xxxxxxxxxxxxx] *On Behalf Of *Dadi
> *Sent:* Sunday, December 14, 2014 11:21 AM
> *To:* wdmaudiodev@xxxxxxxxxxxxx
> *Subject:* [wdmaudiodev] Retrieving WAVEFORMATEX from IAudioSessionControl
>
>
>
> Hi,
>
> Is it possible to retrieve the audio session format (WAVEFORMATEX) from a
> given pointer to an active IAudioSessionControl?
>
> Thank you,
>
> Dadi
>
