Our project is a USB Audio Class 1.0 Full Speed device with playback and recording interfaces that can be used simultaneously, and it should work with each OS's built-in drivers.
Our CODEC supports only a single sample rate, shared by input and output (which is the most common way these parts are used).
There are only three alt settings for each interface: zero-bandwidth, 16-bit, and 24-bit. Both directions are asynchronous with explicit feedback.
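For concreteness, the layout is roughly this (a C sketch; the names are mine, not from our actual descriptors):

```c
/* Hypothetical names; alt 0 must be the zero-bandwidth setting per
 * the spec, and I'm assuming 16-bit and 24-bit occupy alts 1 and 2. */
enum alt_setting {
    ALT_ZERO_BW = 0,  /* no endpoints: interface closed          */
    ALT_PCM_16  = 1,  /* 16-bit PCM, async, explicit feedback EP */
    ALT_PCM_24  = 2,  /* 24-bit PCM, async, explicit feedback EP */
};
```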
In Mac OS X, changing the input sample rate also changes the output rate, so there is no problem; in Windows, however, the device can be set to record and play at different rates. Our hardware doesn't support that, and our firmware fails in this case.
The manufacturer's suggested solution is to ignore input alt setting changes while the output is playing, and vice versa. (The ignored endpoint just NAKs, if I understand correctly; the bus analyzer shows "periodic timeout", and recording software simply stalls, for instance.)
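If I've understood the suggestion correctly, it amounts to something like this in the SET_INTERFACE handler (a rough sketch; `streaming_active()`, `apply_alt_setting()`, and the interface numbers are placeholders for whatever the real stack provides):

```c
#include <stdbool.h>
#include <stdint.h>

#define IFACE_OUT 1  /* playback streaming interface  (hypothetical number) */
#define IFACE_IN  2  /* recording streaming interface (hypothetical number) */

extern bool streaming_active(uint8_t iface);               /* placeholder */
extern void apply_alt_setting(uint8_t iface, uint8_t alt); /* placeholder */

/* While one direction is streaming, don't reconfigure the other.
 * The SET_INTERFACE request itself still completes, but the new
 * endpoint is never started, so the host's transfers only ever see
 * NAKs (the "periodic timeout" on the analyzer). */
static void on_set_interface(uint8_t iface, uint8_t alt)
{
    uint8_t other = (iface == IFACE_OUT) ? IFACE_IN : IFACE_OUT;

    if (alt != 0 /* zero-bandwidth */ && streaming_active(other))
        return;  /* ignore the change entirely */

    apply_alt_setting(iface, alt);
}
```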
This doesn't seem like the best method, though: it also ignores a legitimate change by the user (correcting one interface's sample rate to match the other's during playback), and some OSes send all zeros on the endpoints continuously even when nothing is actively playing or recording, which prevents anything from ever being changed.
(Also, the alt setting change and the sample rate request are technically independent, right? Yet every OS seems to send both at the same time, even when the alt setting hasn't changed.)
Any suggestions on how to handle this better?
(I think UAC 2.0's clock descriptors, which would let both streaming interfaces advertise a single shared clock source, would prevent the computer from requesting this? But Windows doesn't support UAC 2.0 yet.)
Also, we aren't technically required to honor a Set Sampling Frequency request (UAC 1.0 §5.2.3.2.3.1, Sampling Frequency Control), but Windows and OS X both assume that we have honored it, and neither ever sends a Get request to confirm the change, so they send data at the wrong rate.
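To illustrate why that clause doesn't help us: even a handler that refuses the request, as the spec permits, changes nothing, because the host never checks the result (a sketch; the `codec_*` and `usb_*` names are placeholders):

```c
#include <stdbool.h>
#include <stdint.h>

extern uint32_t codec_current_rate(void);        /* placeholder */
extern void     codec_set_rate(uint32_t hz);     /* placeholder */
extern bool     other_dir_streaming(uint8_t ep); /* placeholder */
extern void     usb_ep0_stall(void);             /* placeholder */

/* SET_CUR on the Sampling Frequency Control (UAC 1.0 5.2.3.2.3.1).
 * We may STALL the request, but in practice that changes nothing:
 * neither Windows nor OS X follows up with a GET_CUR, so the host
 * keeps streaming at the rate it asked for regardless. */
static void on_set_sampling_freq(uint8_t ep, uint32_t hz)
{
    if (other_dir_streaming(ep) && hz != codec_current_rate()) {
        usb_ep0_stall();  /* refusal permitted by the spec, but the
                             host assumes success and never verifies */
        return;
    }
    codec_set_rate(hz);   /* safe: rates match, or nothing else runs */
}
```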