Question about soc_pcm_apply_msb()
Kuninori Morimoto
kuninori.morimoto.gx at renesas.com
Fri Dec 17 00:18:58 CET 2021
Hi Lars-Peter
Thank you for your feedback.
>> I wonder: do we need both (X) and (Y)?
>> I think we can merge (A) and (B) (= find Codec/CPU max sig_bits),
>> and call soc_pcm_set_msb() once, or am I misunderstanding?
> We need both. Or alternatively you could write
> soc_pcm_set_msb(substream, min(bits, cpu_bits)).
>
> What this does is it computes the maximum msb bits from both the CPU
> side and the CODEC side and then sets the msb bits reported to userspace
> to the minimum of the two.
>
> The largest number of MSBs we'll see on the CODEC side is the max() and
> the largest number of MSBs we'll see on the CPU side is the max(). And
> the number of MSBs that the application will be able to see is the
> smaller of the two.
Oh, yes. Thank you for explaining the details.
I think snd_pcm_hw_rule_msbits() was the key point.
Best regards
---
Kuninori Morimoto