On Thu, May 02, 2013 at 09:54:21AM +0200, Fabio Baltieri wrote:
On Tue, Apr 30, 2013 at 07:30:35PM +0100, Mark Brown wrote:
On Tue, Apr 30, 2013 at 04:09:53PM +0200, Fabio Baltieri wrote:
Move the ab8500 clock control definitions to the ab8500 codec driver, leaving only card-specific settings in mop500_ab8500_ctrls.
So, if this is some generic thing and not some weird quirk of the card, this really reopens the question of why this is done with user-visible controls...
 static struct snd_kcontrol_new ab8500_ctrls[] = {
-	/* Digital interface - Clocks */
-	SOC_SINGLE("Digital Interface Master Generator Switch",
-		   AB8500_DIGIFCONF1, AB8500_DIGIFCONF1_ENMASTGEN,
-		   1, 0),
-	SOC_SINGLE("Digital Interface 0 Bit-clock Switch",
-		   AB8500_DIGIFCONF1, AB8500_DIGIFCONF1_ENFSBITCLK0,
-		   1, 0),
-	SOC_SINGLE("Digital Interface 1 Bit-clock Switch",
-		   AB8500_DIGIFCONF1, AB8500_DIGIFCONF1_ENFSBITCLK1,
-		   1, 0),
...this is all stuff that is normally figured out automatically by the drivers, we know when the audio interface is in use and hence when it needs to be clocked.
That makes sense. I'll poke at the documentation I have and try to figure out how to control those bits from a more appropriate place.
Well, it looks like this is *already* handled automatically by the ab8500 codec driver in ab8500_codec_set_dai_clock_gate(), and these controls just allow some messy degree of overriding after the audio stream has started. At this point the only thing that comes to mind is that this is a debug leftover, so I'm replying with a v2 that just drops these three controls altogether.
In the meantime, I'm also seeing some odd behavior from other alsamixer controls, but I'll fix those in a separate series. This is enough to bring the driver back to a working state.
Thanks, Fabio