Hi folks,
Apologies if this is obvious, but how does the OSS emulation layer set hardware parameters at the codec level?
We are working in an embedded environment, using a custom codec driver. When we access it via the ALSA API, it works fine. We have a bf5xx_wm8990_hw_params() function which configures the critical hardware parameters of the codec, and I have diagnostic code in the driver that emits kernel prints whenever the driver changes key codec parameters (such as bit rate, timing, etc.).
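For reference, the callback is wired up through the standard ASoC hooks in our machine driver, roughly like this (a simplified sketch; everything except bf5xx_wm8990_hw_params() is a placeholder name):

/* Sketch of the machine-driver hookup; the dai_link fields
 * shown here are placeholders, not our actual board file. */
static struct snd_soc_ops bf5xx_wm8990_ops = {
        .hw_params = bf5xx_wm8990_hw_params,    /* invoked on ALSA hw_params */
};

static struct snd_soc_dai_link bf5xx_wm8990_dai = {
        .name           = "wm8990",             /* placeholder */
        .stream_name    = "WM8990 PCM",         /* placeholder */
        .ops            = &bf5xx_wm8990_ops,    /* hw_params hook */
        /* cpu_dai/codec_dai fields omitted */
};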
When my OSS application opens its device and changes the audio rate, I don't see that function getting called. In sound/core/oss/pcm_oss.c, I see the snd_pcm_oss_set_rate() function getting called, but I don't understand how it changes parameters in the codec. My bf5xx_wm8990_hw_params() never gets called.
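For concreteness, the application side is doing the usual OSS sequence, something like this (a minimal sketch; the device node and rate value are illustrative):

#include <fcntl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <sys/soundcard.h>
#include <unistd.h>

int main(void)
{
        int rate = 44100;                      /* illustrative rate */
        int fd = open("/dev/dsp", O_RDWR);     /* OSS emulation node */

        if (fd < 0) {
                perror("open /dev/dsp");
                return 1;
        }
        /* On the kernel side this ioctl enters snd_pcm_oss_set_rate() */
        if (ioctl(fd, SNDCTL_DSP_SPEED, &rate) < 0)
                perror("SNDCTL_DSP_SPEED");
        printf("granted rate: %d\n", rate);

        close(fd);
        return 0;
}

So the rate request clearly reaches the OSS emulation layer; I just can't see where (or whether) it propagates down to the codec.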
Can somebody please point me in the right direction? I have two applications that need to access the audio driver, and at different bit rates, so I have to make sure the rates are being set correctly.
Mike