Hi ALSA SoC
Cc: Jyri
For historical reasons, simple-card supports legacy DT daifmt settings, and I noticed some strange code there.
--- simple-card-utils.c ---

	asoc_simple_parse_daifmt(xxx)
	{
(A)		daifmt = snd_soc_of_parse_daifmt(...);
		daifmt &= ~SND_SOC_DAIFMT_MASTER_MASK;

		if (!bitclkmaster && !framemaster) {
			/*
			 * No dai-link level and master setting was not found from
			 * sound node level, revert back to legacy DT parsing and
			 * take the settings from codec node.
			 */
(B)			daifmt = snd_soc_of_parse_daifmt(codec, NULL, NULL, NULL) |
				(daifmt & ~SND_SOC_DAIFMT_CLOCK_MASK);
		}
		...
	}
It reverts to legacy DT parsing at (B) if (A) found no master settings. Here, (B) re-parses daifmt from the codec node and ORs the result with (daifmt & ~CLOCK_MASK). Why the CLOCK mask? And shouldn't the re-parsed value also be masked before the OR? Otherwise the FORMAT and INV parts will be duplicated, I think. For example:

	daifmt = (snd_soc_of_parse_daifmt(...) &  SND_SOC_DAIFMT_CLOCK_MASK) |
		 (daifmt                       & ~SND_SOC_DAIFMT_CLOCK_MASK);
I think using snd_soc_of_parse_daifmt() alone is enough at (B), but am I misunderstanding something?
The original code was from
b3ca11ff59bc5842b01f13421a17e6d9a8936784 ("ASoC: simple-card: Move dai-link level properties away from dai subnodes")
Thank you for your help !!
Best regards
---
Kuninori Morimoto