On Thu, Aug 29, 2013 at 09:58:19AM +0200, Lars-Peter Clausen wrote:
It would be nice if this could be handled by the core. But how? One option might be tracking the DAI format in the core and, if the DAI is in master mode, powering up the DAI widget in DAPM whenever the stream is active, regardless of whether there is an active path. That may power up more than is needed, though. E.g. for this CODEC the clock generator and the rest of the DAI can be powered up independently, and we only need to power up the clock generator.
Yes, that was what I was thinking of. We can provide a mechanism to allow the driver to specify a specific widget for the clocks if one exists and fall back on the DAI widgets otherwise.
+static int adau17x1_set_dai_clkdiv(struct snd_soc_dai *dai, int div_id, int div)
+{
What's this doing? It'd be better to have a specific ID and check that too.
Setting the relationship between the external clock rate and the internal base sample rate. There might be a better way to do this though.
If it's specifying two clock rates then specifying them both as actual rates might make sense. I take it it's not possible for the driver to automatically figure out what the most sensible rate is?
+int adau17x1_set_micbias_voltage(struct snd_soc_codec *codec,
+	enum adau17x1_micbias_voltage micbias)
When would a machine driver use this (as opposed to just letting it be set by platform data)?
It is not meant to be used by machine drivers, but by the adau1761 and adau1781 CODEC drivers, which set the micbias based on platform data.
Ah, in that case how about having generic platform data for the core and just passing that through to a platform data parsing function?