Question about soc_pcm_apply_msb()

Lars-Peter Clausen lars at metafoo.de
Thu Dec 16 10:48:56 CET 2021


On 12/16/21 7:53 AM, Kuninori Morimoto wrote:
> Hi ALSA ML
>
> At soc_pcm_apply_msb(),
> part (A) finds the max sig_bits for the Codec,
> part (B) finds the max sig_bits for the CPU,
> and each result is applied to the substream via soc_pcm_set_msb()
> at (X) and (Y).
>
> 	static void soc_pcm_apply_msb()
> 	{
> 		...
>   ^		for_each_rtd_codec_dais(rtd, i, codec_dai) {
> (A)			...
>   v			bits = max(pcm_codec->sig_bits, bits);
> 		}
>
>   ^		for_each_rtd_cpu_dais(rtd, i, cpu_dai) {
> (B)			...
>   v			cpu_bits = max(pcm_cpu->sig_bits, cpu_bits);
> 		}
>
> (X)		soc_pcm_set_msb(substream, bits);
> (Y)		soc_pcm_set_msb(substream, cpu_bits);
> 	}
>
> I wonder, do we need both (X) and (Y)?
> I think we could merge (A) and (B) (= find the Codec/CPU max sig_bits)
> and call soc_pcm_set_msb() once, but am I misunderstanding something?
We need both. Or alternatively you could write 
soc_pcm_set_msb(substream, min(bits, cpu_bits)).
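
Keeping the names from the pseudocode in the question, the merged
variant would look roughly like this (just an untested sketch, not the
actual code):

	static void soc_pcm_apply_msb()
	{
		...
		/* (A) largest sig_bits over all CODEC DAIs */
		for_each_rtd_codec_dais(rtd, i, codec_dai) {
			...
			bits = max(pcm_codec->sig_bits, bits);
		}

		/* (B) largest sig_bits over all CPU DAIs */
		for_each_rtd_cpu_dais(rtd, i, cpu_dai) {
			...
			cpu_bits = max(pcm_cpu->sig_bits, cpu_bits);
		}

		/* one call instead of (X) and (Y) */
		soc_pcm_set_msb(substream, min(bits, cpu_bits));
	}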

What the existing code does is compute the maximum number of MSBs on
the CPU side and on the CODEC side separately, and then set the number
of MSBs reported to userspace to the minimum of the two.

The max() over the CODEC DAIs is the largest number of MSBs we'll see
on the CODEC side, and the max() over the CPU DAIs is the largest
number we'll see on the CPU side. The number of MSBs that the
application will actually be able to see is the smaller of the two.
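
For reference (from memory, so double-check against the real
soc_pcm_set_msb()): the function only narrows the runtime's msbits
constraint, so calling it twice with bits and cpu_bits intersects the
two constraints and ends up at the same minimum as the single min()
call. Something along these lines:

	static void soc_pcm_set_msb(struct snd_pcm_substream *substream, int bits)
	{
		if (!bits)
			return;

		/* each call can only narrow the msbits constraint, so two
		 * successive calls effectively keep the smaller value */
		snd_pcm_hw_constraint_msbits(substream->runtime, 0, 0, bits);
	}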



