Hi Pierre-Louis,

Thank you for your review.
The max() used across all cpu or codec dais of the same dailink is largely defensive programming; in practice it's not clear to me that different codec or cpu dais would ever report different delays. I would expect all cpu dais in the same dailink to report the same delay, and likewise all codec dais in the same dailink to provide the same value.
Taking a max between cpu and codec dais, however, does not seem right to me. You may have a delay in a DSP and a delay in a codec, and in the worst case the total delay is the sum of the two. It wouldn't matter too much with a 'simple' codec with limited buffering, but the moment the codec itself has a DSP and internal buffering, this change in accounting would introduce a real offset.
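To make the worst case concrete, here is a minimal standalone sketch of the arithmetic (the frame counts and the dsp_delay_frames()/codec_delay_frames() helpers are made up for illustration; this is not kernel code):

#include <stdio.h>

/* Hypothetical per-side delay reports, in frames. */
static long dsp_delay_frames(void)   { return 240; } /* buffering inside a CPU-side DSP */
static long codec_delay_frames(void) { return 64;  } /* FIFO inside the codec */

#define max(a, b) ((a) > (b) ? (a) : (b))

int main(void)
{
	long cpu   = dsp_delay_frames();
	long codec = codec_delay_frames();

	/* Every sample passes through both stages, so the worst-case
	 * end-to-end latency is the sum of the two delays... */
	printf("sum: %ld frames\n", cpu + codec);     /* 304 */

	/* ...whereas max() silently drops the smaller stage and
	 * under-reports whenever both delays are non-zero. */
	printf("max: %ld frames\n", max(cpu, codec)); /* 240 */

	return 0;
}

With both stages buffering, max() reports 240 frames where the pipeline actually holds 304.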
(snip)
-	for_each_rtd_cpu_dais(rtd, i, cpu_dai) {
-		cpu_delay = max(cpu_delay,
-				snd_soc_dai_delay(cpu_dai, substream));
-	}
-	delay += cpu_delay;
-
-	for_each_rtd_codec_dais(rtd, i, codec_dai) {
-		codec_delay = max(codec_delay,
-				  snd_soc_dai_delay(codec_dai, substream));
-	}
-	delay += codec_delay;
+	for_each_rtd_dais(rtd, i, dai)
+		add_delay = max(add_delay,
+				snd_soc_dai_delay(dai, substream));
 
-	runtime->delay = delay;
+	/* base delay if assigned in pointer callback */
+	runtime->delay += add_delay;
 
 	return offset;
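In other words, the pre-patch logic that dropping this change restores looks roughly like this (a condensed sketch of the removed lines above, not a verbatim copy of soc-pcm.c): max() is only taken within one side of the dailink, and the two sides are then summed.

	snd_pcm_sframes_t delay = runtime->delay; /* base delay, if the
						   * pointer callback set one */

	/* defensive max() within each side of the link... */
	for_each_rtd_cpu_dais(rtd, i, cpu_dai)
		cpu_delay = max(cpu_delay,
				snd_soc_dai_delay(cpu_dai, substream));

	for_each_rtd_codec_dais(rtd, i, codec_dai)
		codec_delay = max(codec_delay,
				  snd_soc_dai_delay(codec_dai, substream));

	/* ...but the cpu side and the codec side are summed, since a
	 * sample traverses both stages */
	runtime->delay = delay + cpu_delay + codec_delay;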
Hmm... indeed, why did I merge these? Thank you for pointing it out. This patch is indeed wrong; I will remove it in v2.
Thank you for your help!!

Best regards
---
Kuninori Morimoto