Yes, the device code looks at the 16-bit value and tries to serve up 256
bytes. That doesn't seem to make the driver very happy.
I don't know why the driver requests 0x0100 bytes of frequency definition.
For flesh-and-blood beings, however, it would make sense to first ask for one
byte and use that to determine the number of triplets. Although the number
of triplets, too, is a 16-bit number.
Now, I'm not saying there is a bug in the driver. I'd just very much like
to know what I should give it when it asks for 0x0100 bytes. And I'd like
to know if it asks for that because of some other bug in my descriptors.
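For what it's worth, the usual way out of this on the device side is to clamp the reply to the shorter of wLength and the actual data size; a short packet on a control IN transfer legally ends the data stage. Here is a minimal sketch of that idea. The buffer layout follows the UAC2 RANGE convention (wNumSubRanges followed by MIN/MAX/RES triplets), but the names and the single 44.1 kHz subrange are illustrative assumptions, not taken from the firmware in question:

```c
#include <stdint.h>
#include <string.h>

/* Illustrative UAC2-style RANGE block for one sampling-frequency
 * subrange: a 16-bit wNumSubRanges count followed by one
 * (dMIN, dMAX, dRES) triplet in Hz, all little-endian. */
static const uint8_t freq_range[] = {
    0x01, 0x00,             /* wNumSubRanges = 1 */
    0x44, 0xAC, 0x00, 0x00, /* dMIN = 44100 Hz */
    0x44, 0xAC, 0x00, 0x00, /* dMAX = 44100 Hz */
    0x00, 0x00, 0x00, 0x00, /* dRES = 0 (discrete rate) */
};

/* Copy at most wLength bytes of the range block into the IN buffer
 * and return the number of bytes actually queued.  Sending fewer
 * bytes than wLength is legal on a control IN transfer; the short
 * packet terminates the data stage and the host uses what it got. */
size_t serve_range_request(uint8_t *buf, uint16_t wLength)
{
    size_t n = sizeof(freq_range);
    if (n > wLength)
        n = wLength;
    memcpy(buf, freq_range, n);
    return n;
}
```

With this shape, a host request for 0x0100 bytes gets the full 14-byte block back, while a probing request for just the 2-byte count gets exactly 2 bytes.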
On Tue, Jul 4, 2017 at 9:09 PM, Tim Roberts <timr@xxxxxxxxx> wrote:
On Jul 4, 2017, at 2:24 AM, Børge Strand-Bergesen <borge.strand@xxxxxxxxx> wrote:
I can now see all the class specific requests to my firmware in a debug
log. What it looks like now is that Win10 C.U. requests wLength=0x0100 bytes
when it requests the sample rate setup. My firmware is originally set up to
provide min(wLength, sizeof(Speedx)) bytes, which is shorter than 0x0100.
Interpreting only the low byte and sending 0x00 bytes makes Win10 C.U. halt.
Who is "interpreting only the low byte"? wLength is, by definition, a
16-bit value. All of the USB descriptor and request fields use Hungarian
notation in their names to embed the type, so even the name "wLength" tells
you that it is a 16-bit value.
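To make Tim's point concrete: wLength occupies bytes 6 and 7 of the standard 8-byte SETUP packet and must be assembled from both. This sketch (the struct and helper names are mine, not from any particular stack) shows why reading only the low byte turns a request for 0x0100 (256) bytes into a request for 0 bytes:

```c
#include <stdint.h>

/* Standard 8-byte USB SETUP packet layout; the "w" prefix marks the
 * 16-bit fields, per the USB spec's naming convention. */
typedef struct {
    uint8_t  bmRequestType;
    uint8_t  bRequest;
    uint16_t wValue;
    uint16_t wIndex;
    uint16_t wLength;
} usb_setup_packet_t;

/* Assemble wLength from the raw little-endian setup bytes.  Taking
 * only packet[6] would report 0 for a wLength of 0x0100 -- the
 * low-byte bug discussed above. */
uint16_t setup_wlength(const uint8_t packet[8])
{
    return (uint16_t)(packet[6] | ((uint16_t)packet[7] << 8));
}
```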
Tim Roberts, timr@xxxxxxxxx
Providenza & Boekelheide, Inc.