On 24.03.2021 17:28, Pierre-Louis Bossart wrote:
I am using hw_params_fixup, but it's not enough. The first thing I do is to not add the BE HW constraint rules to runtime->hw_constraints, because they would otherwise affect the FE HW params. If the FE does sampling rate conversion, for example, applying the sampling rate constraint rules from a BE codec to the FE is useless. The second thing I do is to apply these BE HW constraint rules to the BE HW params.

It's true that the BE HW params can be fine-tuned via hw_params_fixup (topology, device tree or even static parameters) and set in such a way that the BE HW constraints are already satisfied, so we could ignore the constraint rules added by the BE drivers. That's not very elegant, though, and running the BE HW constraint rules against the fixed-up BE HW params serves as a sanity check. I am also thinking that if the FE does no conversion (be_hw_params_fixup missing) and more than one BE is available, running each BE's HW constraint rules against the same set of BE HW params could rule out the incompatible BE DAI links and start only the compatible ones. I am not sure this is a real use case.
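To illustrate the speculative "rule out incompatible BE DAI links" idea, here is a minimal userspace sketch, not kernel code: the link names and the per-link check functions are made-up stand-ins for each BE driver's constraint rules, and compatible_bes() is a hypothetical helper, not an ASoC API.

```python
def compatible_bes(be_params, be_links):
    """Run each BE's constraint check against the same fixed-up BE HW
    params and keep only the DAI links that accept them."""
    return [name for name, accepts in be_links if accepts(be_params)]

# Illustrative checks standing in for each BE driver's constraint rules.
BE_LINKS = [
    ("codec-32bit", lambda p: p["sample_bits"] == 32),
    ("spdif-out",   lambda p: p["rate"] in (32000, 44100, 48000)),
    ("modem-8k",    lambda p: p["rate"] == 8000),
]

# With 32-bit samples at 48 kHz, only the first two links are
# compatible and would be started; the 8 kHz-only link is skipped.
started = compatible_bes({"sample_bits": 32, "rate": 48000}, BE_LINKS)
```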
Even after a second cup of coffee I was not able to follow why hw_params_fixup is not enough. That paragraph is rather dense.
Right, sorry about that. :)
And to be frank I don't fully understand the problem statement above: "separate the FE HW constraints from the BE HW constraints". We have existing solutions with a DSP-based SRC adjusting FE rates to what is required by the BE dailink.
Maybe it would help to show examples of what you can do today and what you cannot, so that we are on the same wavelength on what the limitations are and how to remove them?
For example, the ad1934 driver sets a constraint to always have 32 sample bits [1], and this rule is added to struct snd_pcm_runtime -> hw_constraints [2]. As you can see, this happens early and unconditionally, so the rule affects the HW parameters requested from user space [3]; in our example, only audio streams whose formats use 4-byte containers will be accepted (S24_LE, S32_LE, etc.). For playback, if the audio stream does not have such a format, it will need to be converted in software, using the plug ALSA plugin for example. This is OK for normal PCM:
             HW params
    user <-------------> CPU <---> AD1934
    space       ^        DAI          |
                |                     |
                |                     |
                -----------------------
          sample bits constraint rule (32b)
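The effect of that single constraint can be sketched in userspace terms. This is a toy model, not the ALSA API: the format table and helper below are illustrative assumptions that only mimic what refining the HW params against a SAMPLE_BITS rule does.

```python
# Each PCM format has a number of valid bits ("width") and a physical
# container width ("phys") in bits.
FORMATS = {
    "S16_LE": {"width": 16, "phys": 16},
    "S24_LE": {"width": 24, "phys": 32},  # 24 valid bits in a 4-byte container
    "S32_LE": {"width": 32, "phys": 32},
}

def apply_sample_bits_rule(formats, required_phys_bits):
    """Keep only the formats whose container width matches the rule,
    mimicking what the codec's 32-sample-bits constraint does during
    hw_params refinement."""
    return {name: f for name, f in formats.items()
            if f["phys"] == required_phys_bits}

# S16_LE is ruled out; only the 4-byte-container formats survive,
# so user space must convert 16-bit streams in software.
allowed = apply_sample_bits_rule(FORMATS, 32)
```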
For DPCM, we unfortunately have the same behavior. ad1934, as a BE codec, will add its rule in the same place, which means it will again affect the HW parameters set by user space. So user space will have to convert all audio streams to a 32b sample size. If the FE is capable of format conversion, this software conversion is useless:
            FE HW params              BE HW params
    user <---------------> FE <---------------------> BE CPU <----> BE AD1934
    space        ^         DAI                        DAI               |
                 |                                                      |
                 |                                                      |
                 --------------------------------------------------------
                        sample bits constraint rule (32b)
The format conversion will be done in software instead of hardware (FE).
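What I would like instead can be sketched as follows. Again a hypothetical userspace model, not kernel code: be_hw_params_fixup() and be_rule_ok() below are illustrative stand-ins for the driver's fixup callback and the codec's constraint rule.

```python
def be_hw_params_fixup(fe_params):
    """Hypothetical fixup: the FE converts in hardware, so the BE link
    is forced to a 4-byte format regardless of what user space
    negotiated on the FE."""
    be_params = dict(fe_params)
    be_params["format"] = "S32_LE"
    be_params["sample_bits"] = 32
    return be_params

def be_rule_ok(be_params):
    # The codec's 32-sample-bits rule, now checked against the BE
    # params only, as a sanity check after the fixup.
    return be_params["sample_bits"] == 32

# User space keeps S16_LE on the FE (no software conversion needed);
# the fixed-up BE params still satisfy the codec's rule.
fe_params = {"format": "S16_LE", "sample_bits": 16, "rate": 48000}
be_params = be_hw_params_fixup(fe_params)
```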
I hope I was able to be more clear now. Thanks for taking time to understand this. I owe you a coffee. :)
Best regards, Codrin
[1] https://elixir.bootlin.com/linux/v5.12-rc4/source/sound/soc/codecs/ad193x.c#...
[2] https://elixir.bootlin.com/linux/v5.12-rc4/source/sound/core/pcm_lib.c#L1141
[3] https://elixir.bootlin.com/linux/v5.12-rc4/source/sound/core/pcm_native.c#L6...