[wdmaudiodev] Re: Change default sound format with an inf File in Win7

  • From: Ueli Giger <giger@xxxxxxxxxxx>
  • To: wdmaudiodev@xxxxxxxxxxxxx
  • Date: Tue, 23 Feb 2010 21:06:24 +0100

Frank Yerrace schrieb:
If the device is offering both mono and stereo 16-bit 44100Hz, Windows 7 should 
default to using the stereo format. Are you sure you see this behavior on a 
clean install of Windows 7?
Yes, I'm sure it is a clean install of Windows 7 Professional with the newest updates. When I plug in the PCM2904, the default audio format is Mono, 16-bit, 44100 Hz. In the attachment you can see a screenshot of the control panel after I plugged in the USB device and Windows had installed the standard driver with the inf file wdma_audio.inf.

The Vendor ID of the device is: 0x08BB and the Product ID: 0x2904
Regarding your observation of gain, first let me ask: does this USB device implement a 
Volume feature unit? I ask this because Windows 7 does have a known issue applying extra 
gain to audio capture devices that do not implement their own volume control [...]
This USB device does not implement a volume feature unit. But how can I remove this extra gain?
Frank Yerrace

This posting is provided "AS IS" with no warranties, and confers no rights.

-----Original Message-----
From: wdmaudiodev-bounce@xxxxxxxxxxxxx 
[mailto:wdmaudiodev-bounce@xxxxxxxxxxxxx] On Behalf Of Tim Roberts
Sent: Tuesday, February 23, 2010 10:20 AM
To: wdmaudiodev@xxxxxxxxxxxxx
Subject: [wdmaudiodev] Re: Change default sound format with an inf File in Win7

Ueli Giger wrote:
That's a good idea, because I only want to make the format Stereo, 16-bit, 44100 Hz, but how can I do this? My problem is that I have no experience in programming a lower filter driver. Can you give me any good links or examples?

Is this a device that your company is manufacturing?  If so, then just
change the configuration descriptors to advertise that one format only. Easy solution.

Filter drivers are pretty easy with KMDF.  There are good samples in the
DDK, in src\general\toaster\filter for one.  One way to do this is to
have your filter driver intercept the URB_GET_DESCRIPTOR_FROM_DEVICE
request, and rewrite it to have only the format you want.  That's not
rocket science, but there are a lot of details.  You have to allocate
memory to hold the revised descriptor, parse it well enough to find the
format descriptor, and copy it back into the URB.
No, it isn't a device that our company is manufacturing; it is a device from TI.
Problem 1:
When you plug in the PCM2904, Win7 finds the USB Audio Codec and installs it with the default format Mono, 16-bit, 44100 Hz. But we need the format Stereo, 16-bit, 44100 Hz. To solve this problem, my idea was to copy the standard .inf file, add the lines for the default format, and then install this driver over the standard USB audio driver.

I don't understand that.  Windows should choose the most capable format
the device offers, unless there is already a format override.  But
regardless of that, if the application needs a specific format, why
can't you have your LabView application override the format?  THAT'S the
right solution, not this ugly driver mucking.  Kernel code should always
be a last resort.
I also think that LabView should override the format, but it doesn't. I'm in contact with LabView support as well to clarify this issue.

Problem 2:
Under Win7 we see an additional gain of +40 dB compared with the measurements under Windows XP. When I change the format via the control panel while the LabView application is running, the gain disappears and the measurement is the same as under XP. What could be the reason for this gain? And how can I pass my signal through without any added gain?

How is that possible?  +40dB is a ratio of 10,000 to 1.  A 16-bit format
only has a range of +/- 32,768.  There must be something else going on.
I think it is for the reason that Frank Yerrace pointed out.

[Attachment: PNG image]
