On Jul 4, 2017, at 12:24 PM, Børge Strand-Bergesen <borge.strand@xxxxxxxxx> wrote:
Yes, the Device code looks at the 16-bit value and tries to serve up 256
bytes. That doesn't seem to make the driver very happy.
I don't know why the driver requests 0x0100 bytes of frequency definition.
To flesh-and-blood beings it would, however, make sense to first ask for the
first couple of bytes and use those to determine the number of triplets,
since the number of triplets is itself a 16-bit number.
Now, I'm not saying there is a bug in the driver. I'd just very much like to
know what I should give it when it asks for 0x0100 bytes. And I'd like to
know if it asks for that because of some other bug in my descriptors.