27 Apr 2010, 12:12 p.m.
On Fri, Apr 23, 2010 at 11:06 PM, Mark Brown <broonie@opensource.wolfsonmicro.com> wrote:

> On Wed, Apr 21, 2010 at 05:36:47PM +0800, Barry Song wrote:
>
> > Signed-off-by: Barry Song <21cnbao@gmail.com>
>
> > +static int ad193x_set_dai_pll(struct snd_soc_dai *codec_dai,
> > +		int pll_id, int source, unsigned int freq_in, unsigned int freq_out)
> > +{
> > +	struct snd_soc_codec *codec = codec_dai->codec;
> > +	int reg;
> > +
> > +	reg = snd_soc_read(codec, AD193X_PLL_CLK_CTRL0);
> > +	switch (freq_in) {
> > +	case 12288000:
> > +		reg = (reg & AD193X_PLL_INPUT_MASK) | AD193X_PLL_INPUT_256;
> > +		break;
>
> This all looks like you should just implement set_sysclk() not set_pll()
> - from the user point of view the fact that a PLL ends up getting tuned
> is immaterial here, they can't control the output frequency at all. From
> their point of view they just specify the clock going into the CODEC and
> the driver works out everything else for them.
Sorry, I had not noticed this point. I'll use set_sysclk() to set the MCLK instead of set_pll().