[alsa-devel] ASoC: hook for codec control updates and clock controls
Hi,
Codecs like the tlv320aic3x family have soft volume controls to smoothly ramp mixer elements up and down in order to prevent clicks and pops. In fact, most of the controls on this chip are implemented that way. For the mixers to work, however, the I2S input clock is needed, which is currently only present when the DAC/ADC elements are active but not when the chip is just operating as an analog mixer.
To allow this functionality, the I2S clock must be present for a short period after each volume control update to the codec and disabled again after some milliseconds. It should, however, not be enabled if it's already running and of course, it must not be disabled if the ADC/DAC is still running.
Adding a hook to struct snd_soc_codec which is called from snd_soc_update_bits() isn't a big deal, but the problem is the abstraction layers in this case - for good reasons, the cpu_dai is not accessible from the codec dai. Or did I miss a link?
Is there a sane way to address this problem?
Thanks and best regards, Daniel
On Sun, Dec 07, 2008 at 06:38:25PM +0100, Daniel Mack wrote:
mixers to work, however, the I2S input clock is needed, which is currently only present when the DAC/ADC elements are active but not when the chip is just operating as an analog mixer.
What exactly do you mean when you say "I2S clock input" here? An I2S link has two clocks, a bit clock and a frame clock and most codecs also use a third, master, clock which is not associated directly with an I2S link. I suspect that the requirement is for the master clock...
The clocking is the responsibility of the board driver - often the clocks are disabled while the link is inactive, but this is up to the board driver (and ultimately the hardware design).
Adding a hook to struct snd_soc_codec which is called from snd_soc_update_bits() isn't a big deal, but the problem is the abstraction layers in this case - for good reasons, the cpu_dai is not accessible from the codec dai. Or did I miss a link?
You'll be able to get there since the card is accessible but it's unacceptable for the codec driver to go fiddling with other parts of the link like you're suggesting here - it doesn't have enough information about the system to know what to do. The clocking for the bus could be shared with other devices, the clocks could be provided by other devices or the codec itself could be the clock master.
Is there a sane way to address this problem?
It does depend on what the actual requirement is here. Which clock or clocks are the issue?
Hi Mark,
On Sun, Dec 07, 2008 at 07:02:40PM +0000, Mark Brown wrote:
On Sun, Dec 07, 2008 at 06:38:25PM +0100, Daniel Mack wrote:
mixers to work, however, the I2S input clock is needed, which is currently only present when the DAC/ADC elements are active but not when the chip is just operating as an analog mixer.
What exactly do you mean when you say "I2S clock input" here? An I2S link has two clocks, a bit clock and a frame clock and most codecs also use a third, master, clock which is not associated directly with an I2S link. I suspect that the requirement is for the master clock...
Yes, correct. It's about the master clock. The internal mechanisms seem to count in multiples of Fs, but this one is also derived from the master clock (called MCLK here).
The clocking is the responsibility of the board driver - often the clocks are disabled while the link is inactive, but this is up to the board driver (and ultimately the hardware design).
That makes sense. However, there is this special case I'm facing here - the codec's volume settings are updated and the board driver doesn't notice that, so it can't react and switch on the clock for some time.
Adding a hook to struct snd_soc_codec which is called from snd_soc_update_bits() isn't a big deal, but the problem is the abstraction layers in this case - for good reasons, the cpu_dai is not accessible from the codec dai. Or did I miss a link?
You'll be able to get there since the card is accessible but it's unacceptable for the codec driver to go fiddling with other parts of the link like you're suggesting here - it doesn't have enough information about the system to know what to do. The clocking for the bus could be shared with other devices, the clocks could be provided by other devices or the codec itself could be the clock master.
Sorry for not being precise enough here. In our case, the codec is slave and all clocks are provided by the I2S master, which is a PXA processor.
The idea is to provide a way for codecs to signal the need for the master clock (or whatever clock) to be applied externally. And it's actually not important what it takes to actually do that - a generic way the board support code can use would be good.
Is such an approach feasible?
Best regards, Daniel
On Mon, Dec 08, 2008 at 01:04:50AM +0100, Daniel Mack wrote:
On Sun, Dec 07, 2008 at 07:02:40PM +0000, Mark Brown wrote:
The clocking is the responsibility of the board driver - often the clocks are disabled while the link is inactive, but this is up to the board driver (and ultimately the hardware design).
That makes sense. However, there is this special case I'm facing here - the codec's volume settings are updated and the board driver doesn't notice that, so it can't react and switch on the clock for some time.
The normal thing would be to never turn off the master clock. This is a very common setup - many systems will have a dedicated crystal for the codec which can never be disabled. Do you actually need to turn off the clock (and are you sure that the codec is happy about that)?
The idea is to provide a way for codecs to signal the need for the master clock (or whatever clock) to be applied externally. And it's actually not important what it takes to actually do that - a generic way the board support code can use would be good.
This would need to at least specify the clock that needs to be turned on and how many cycles it needs to be turned on for. I'm having a hard time getting enthusiastic about that, it seems like too much software especially since we can't use the clock API for any of this due to the inconsistent implementations.
In what circumstances does the codec need the clock? What are the negative consequences of it not being enabled? If it only needs the clock when bypass paths are active then we should just fix the bias level control to set bias on when only analogue paths are active (this ought to be done anyway) and the machine driver can use the bias level callback to enable the clocks. If it needs it at other times are you sure that the device is supposed to tolerate being used with the MCLK removed?
On Mon, Dec 08, 2008 at 09:48:43AM +0000, Mark Brown wrote:
That makes sense. However, there is this special case I'm facing here - the codec's volume settings are updated and the board driver doesn't notice that, so it can't react and switch on the clock for some time.
The normal thing would be to never turn off the master clock. This is a very common setup - many systems will have a dedicated crystal for the codec which can never be disabled. Do you actually need to turn off the clock (and are you sure that the codec is happy about that)?
The codec is happy about that, yes. And switching off the clock was planned to save some power, but the more I think about it, the more I tend to leave the clock running all the time.
In general, this chip has two purposes: one is an analog mixer and the other is a codec, and the clock is only needed when the DAC/ADC is in use or (for a short period) when new mixer values are written.
The idea is to provide a way for codecs to signal the need for the master clock (or whatever clock) to be applied externally. And it's actually not important what it takes to actually do that - a generic way the board support code can use would be good.
This would need to at least specify the clock that needs to be turned on and how many cycles it needs to be turned on for. I'm having a hard time getting enthusiastic about that, it seems like too much software especially since we can't use the clock API for any of this due to the inconsistent implementations.
That's why I was thinking about a general purpose callback mechanism like the one Jarkko suggested.
In what circumstances does the codec need the clock? What are the negative consequences of it not being enabled?
If the clock is not enabled, changes to the volume registers are not applied to the actual mixers in hardware, i.e., the gain change simply does not take effect.
We'll think about how much disadvantage it is to keep the clock running all the time.
Thanks, Daniel
On Mon, Dec 08, 2008 at 01:18:55PM +0100, Daniel Mack wrote:
On Mon, Dec 08, 2008 at 09:48:43AM +0000, Mark Brown wrote:
This would need to at least specify the clock that needs to be turned on and how many cycles it needs to be turned on for. I'm having a hard time getting enthusiastic about that, it seems like too much software especially since we can't use the clock API for any of this due to the inconsistent implementations.
That's why I was thinking about a general purpose callback mechanism like the one Jarkko suggested.
Hrm? I'm talking about the form such a mechanism would take. Like I say, I think it would be likely to be more trouble than it's worth.
In what circumstances does the codec need the clock? What are the negative consequences of it not being enabled?
If the clock is not enabled, changes to the volume registers are not applied to the actual mixers in hardware, i.e., the gain change simply does not take effect.
We'll think about how much disadvantage it is to keep the clock running all the time.
Note that I also suggested that leaving the clocks on while the analogue paths are enabled by ensuring that the DAPM bias configuration covers those and then using the machine bias callback; this would turn the clocks off when there's no audio which is likely to be all you need since the amplifiers will probably burn enough power to make the cost of the clock irrelevant.
We should be bringing the bias up to ON with analogue-only paths anyway, so this is a low-cost approach.
On Mon, Dec 8, 2008 at 2:04 AM, Daniel Mack daniel@caiaq.org wrote:
Sorry for not being precise enough here. In our case, the codec is slave and all clocks are provided by the I2S master, which is a PXA processor.
The idea is to provide a way for codecs to signal the need for the master clock (or whatever clock) to be applied externally. And it's actually not important what it takes to actually do that - a generic way the board support code can use would be good.
Is such an approach feasible?
My two cents. However, aic3x specific only.
Would it work if there is some set_mclk(int enable) callback in struct aic3x_setup_data, so that codec->write in aic3x_init points to a higher-level aic3x_write_xxx which takes care of calling set_mclk(1) when needed and then calls aic3x_write? A delayed set_mclk(0) could also be put into the aic3x driver.
The machine driver can then decide whether to provide the callback.
Jarkko
On Mon, Dec 08, 2008 at 11:49:46AM +0200, Jarkko Nikula wrote:
On Mon, Dec 8, 2008 at 2:04 AM, Daniel Mack daniel@caiaq.org wrote:
My two cents. However, aic3x specific only.
[You might want to check your MUA - most of your recent mails have done the above and marked the first line of your reply as part of the quote.]
Would it work if there is some set_mclk(int enable) callback in struct aic3x_setup_data, so that codec->write in aic3x_init points to a higher-level aic3x_write_xxx which takes care of calling set_mclk(1) when needed and then calls aic3x_write? A delayed set_mclk(0) could also be put into the aic3x driver.
TBH I don't really see any benefit of doing this in the codec driver over doing it in the core.
On Mon, Dec 8, 2008 at 1:11 PM, Mark Brown broonie@sirena.org.uk wrote:
[You might want to check your MUA - most of your recent mails have done the above and marked the first line of your reply as part of the quote.]
Grrr. It's gmail & Iceweasel.
Would it work if there is some set_mclk(int enable) callback in struct aic3x_setup_data, so that codec->write in aic3x_init points to a higher-level aic3x_write_xxx which takes care of calling set_mclk(1) when needed and then calls aic3x_write? A delayed set_mclk(0) could also be put into the aic3x driver.
TBH I don't really see any benefit of doing this in the codec driver over doing it in the core.
Agree. Also other codecs might use MCLK for their internal state machine. IIRC, the TSC2301 had some weird internal states and timeouts governing when MCLK was required to be on and when it could be shut down.
Jarkko
On Mon, Dec 08, 2008 at 01:40:28PM +0200, Jarkko Nikula wrote:
On Mon, Dec 8, 2008 at 1:11 PM, Mark Brown broonie@sirena.org.uk wrote:
TBH I don't really see any benefit from doing this in the codec driver to doing it in the core.
Agree. Also other codecs might use MCLK for their internal state machine. IIRC, the TSC2301 had some weird internal states and timeouts governing when MCLK was required to be on and when it could be shut down.
It's fairly common, but most devices either only really need the clock when they've got active paths or aren't supposed to have the clock removed at all so I'm not sure we need a specific mechanism for this.
participants (3)
- Daniel Mack
- Jarkko Nikula
- Mark Brown