[alsa-devel] Disable DAPM for click avoidance ?
I'm having an issue with a codec where the normal DAPM procedure of disabling most of the codec when there is no stream ongoing results in large clicks when a playback stream starts and stops, because the output jumps from 0V when the output is disabled to over a volt when the output is enabled.
There appears to be a 'slow stop/start' feature in the codec; however, it takes about one second to ramp the output voltage up and down, and that is too long a delay when starting a stream.
In our use case the power consumption of the codec is negligible compared to the rest of the electronics, so leaving the codec running at all times is perfectly feasible.
Is there any way from a machine driver to disable the DAPM switching, i.e. leave parts of the codec running at all times? In this specific case, there are two internal PM controls which enable the output. One of them is controlled by an output mixer element (SND_SOC_DAPM_MIXER_E), and the other one from set_bias_level().
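(For illustration: the closest machine-driver mechanism I'm aware of is snd_soc_dapm_force_enable_pin(), which forces individual widgets on. A minimal sketch follows, with placeholder card and widget names; it would only cover the widget-controlled part, not the bit driven from set_bias_level().)

#include <sound/soc.h>

/* Hypothetical machine-driver fragment: force selected codec widgets on so
 * DAPM never powers them down.  "Output Mixer" and "HP Amp" are placeholder
 * names and must match the widget names the codec driver registers. */
static int my_card_late_probe(struct snd_soc_card *card)
{
        struct snd_soc_dapm_context *dapm = &card->dapm;

        snd_soc_dapm_force_enable_pin(dapm, "Output Mixer");
        snd_soc_dapm_force_enable_pin(dapm, "HP Amp");

        return snd_soc_dapm_sync(dapm);
}

static struct snd_soc_card my_card = {
        .name = "my-board",
        .late_probe = my_card_late_probe,
        /* .dai_link = ..., .num_links = ..., */
};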
I've had a look in the core code, but it seems that the whole business of changing the bias level when a stream is started is always performed and that there is no way to say "Set the bias level to ON and leave it there".
An alternative would be to augment the codec driver with an 'always on' feature, which effectively always enables PM for certain parts of the chip. But it seems like the wrong place to do this type of thing.
Another use for this feature is that we have had a lot of problems over the years with buggy I2S interfaces misbehaving when streams are started and stopped, and there would be an advantage to letting the AIF run the whole time after startup.
/Ricard
On Wed, Nov 22, 2017 at 01:33:37PM +0100, Ricard Wanderlof wrote:
Always CC maintainers on postings if you want them to be reliably seen...
I'm having an issue with a codec where the normal DAPM procedure of disabling most of the codec when there is no stream ongoing results in large clicks when a playback stream starts and stops, because the output jumps from 0V when the output is disabled to over a volt when the output is enabled.
There appears to be a 'slow stop/start' feature in the codec; however, it takes about one second to ramp the output voltage up and down, and that is too long a delay when starting a stream.
The CODEC driver is broken, it shouldn't be powering off on idle if it takes a second to ramp, it should be leaving VMID up while the system is running - this is what the _STANDBY bias level is all about. Look at what older CODEC drivers like the wm8731 do.
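(A rough sketch of that pattern, for the sake of discussion, using the codec driver API of this era and made-up MYCODEC_* register names rather than any real part:)

#include <linux/bits.h>
#include <linux/regmap.h>
#include <sound/soc.h>

#define MYCODEC_PWR       0x02          /* placeholder register address */
#define MYCODEC_PWR_VMID  BIT(0)        /* placeholder VMID enable bit */

struct mycodec_priv {
        struct regmap *regmap;
};

/* Sketch only: keep VMID up at _STANDBY so idle<->playing transitions never
 * ramp the outputs from 0 V; only drop VMID (and accept the slow ramp) at
 * _OFF, i.e. on suspend or removal. */
static int mycodec_set_bias_level(struct snd_soc_codec *codec,
                                  enum snd_soc_bias_level level)
{
        struct mycodec_priv *priv = snd_soc_codec_get_drvdata(codec);

        switch (level) {
        case SND_SOC_BIAS_ON:
        case SND_SOC_BIAS_PREPARE:
                break;                  /* VMID already up from _STANDBY */
        case SND_SOC_BIAS_STANDBY:
                regmap_update_bits(priv->regmap, MYCODEC_PWR,
                                   MYCODEC_PWR_VMID, MYCODEC_PWR_VMID);
                break;
        case SND_SOC_BIAS_OFF:
                regmap_update_bits(priv->regmap, MYCODEC_PWR,
                                   MYCODEC_PWR_VMID, 0);
                break;
        }

        return 0;
}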
Another use for this feature is that we have had a lot of problems over the years with buggy I2S interfaces misbehaving when streams are started and stopped, and there would be an advantage to letting the AIF run the whole time after startup.
This is why we've got the digital_mute() callback - the idea is that the output is just going to be discarded until the interface has had a chance to settle down.
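(For illustration, this is roughly what wiring the callback up looks like in a codec driver of this era; the register and bit names are placeholders, and mycodec_priv is as in the earlier sketch:)

#define MYCODEC_DAC_CTRL   0x07         /* placeholder register address */
#define MYCODEC_DAC_SMUTE  BIT(5)       /* placeholder soft-mute bit */

/* Sketch: soft-mute the DAC whenever the core asks for mute, so anything
 * the CPU DAI emits while it is settling is discarded. */
static int mycodec_digital_mute(struct snd_soc_dai *dai, int mute)
{
        struct mycodec_priv *priv = snd_soc_codec_get_drvdata(dai->codec);

        return regmap_update_bits(priv->regmap, MYCODEC_DAC_CTRL,
                                  MYCODEC_DAC_SMUTE,
                                  mute ? MYCODEC_DAC_SMUTE : 0);
}

static const struct snd_soc_dai_ops mycodec_dai_ops = {
        .digital_mute = mycodec_digital_mute,
        /* .hw_params = ..., .set_fmt = ..., */
};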
On Wed, 22 Nov 2017, Mark Brown wrote:
On Wed, Nov 22, 2017 at 01:33:37PM +0100, Ricard Wanderlof wrote:
Always CC maintainers on postings if you want them to be reliably seen...
Ok. I thought it was more of a general question than a maintainer question but I'll keep it in mind in future.
I'm having an issue with a codec where the normal DAPM procedure of disabling most of the codec when there is no stream ongoing results in large clicks when a playback stream starts and stops, because the output jumps from 0V when the output is disabled to over a volt when the output is enabled.
There appears to be a 'slow stop/start' feature in the codec; however, it takes about one second to ramp the output voltage up and down, and that is too long a delay when starting a stream.
The CODEC driver is broken, it shouldn't be powering off on idle if it takes a second to ramp, it should be leaving VMID up while the system is running - this is what the _STANDBY bias level is all about. Look at what older CODEC drivers like the wm8731 do.
It actually does leave VMID on as soon as the codec is not at the _OFF level; the problem is really that the output buffer is not enabled until a playback stream is set up, and it's not until the output buffer is enabled that the output attains its operating level.
Of course, one solution would be to change the driver to enable the output already at the _STANDBY level. It does seem to me that that would defeat part of the whole point of DAPM; on the other hand, I would think that at some point power management must take a back seat to actually getting a good user experience.
Another use for this feature is that we have had a lot of problems over the years with buggy I2S interfaces misbehaving when streams are started and stopped, and there would be an advantage to letting the AIF run the whole time after startup.
This is why we've got the digital_mute() callback - the idea is that the output is just going to be discarded until the interface has had a chance to settle down.
The problems we've had have been on the CPU side, with I2S interfaces that have problems synchronizing to an ongoing I2S stream when the CPU AIF is in slave mode, to the extent that in some cases it's impossible to determine whether the first sample received by or sent from the I2S interface is the left or right channel. If the DAIs had been set up once and been allowed to run continuously even when there is no actual audio to transfer, it would have helped this situation. Agreed, I'd rather have a DAI that worked properly, but sometimes one doesn't have that luxury. :-(
/Ricard
On Thu, Nov 23, 2017 at 10:41:33AM +0100, Ricard Wanderlof wrote:
On Wed, 22 Nov 2017, Mark Brown wrote:
On Wed, Nov 22, 2017 at 01:33:37PM +0100, Ricard Wanderlof wrote:
Always CC maintainers on postings if you want them to be reliably seen...
Ok. I thought it was more of a general question than a maintainer question but I'll keep it in mind in future.
If you don't copy any actual people the chances of getting a response are vastly reduced.
The CODEC driver is broken, it shouldn't be powering off on idle if it takes a second to ramp, it should be leaving VMID up while the system is running - this is what the _STANDBY bias level is all about. Look at what older CODEC drivers like the wm8731 do.
It actually does leave VMID on as soon as the codec is not at the _OFF level; the problem is really that the output buffer is not enabled until a playback stream is set up, and it's not until the output buffer is enabled that the output attains its operating level.
Of course, one solution would be to change the driver to enable the output already at the _STANDBY level. It does seem to me that that would defeat part of the whole point of DAPM; on the other hand, I would think that at some point power management must take a back seat to actually getting a good user experience.
Yes, you'll need to do something like leave the outputs enabled. Most devices have some facility to keep VMID switched to the outputs, though I did see a few where someone thought it was a good idea to always power on from cold. A device that old isn't going to be that competitive power-wise anyway, even if it were well implemented, which this one seems not to have been.
This is why we've got the digital_mute() callback - the idea is that the output is just going to be discarded until the interface has had a chance to settle down.
The problems we've had have been on the CPU side, with I2S interfaces that have problems synchronizing to an ongoing I2S stream when the CPU AIF is
That's what the digital mute handling is for - we're just throwing away any garbage the CPU puts out before it's stable. We're not expecting it to work around any CODEC side issues.
in slave mode, to the extent that in some cases it's impossible to determine whether the first sample received by or sent from the I2S interface is the left or right channel. If the DAIs had been set up once and been allowed to run continuously even when there is no actual audio to transfer, it would have helped this situation. Agreed, I'd rather have a DAI that worked properly, but sometimes one doesn't have that luxury. :-(
If you're getting L/R swap issues on some startups leaving things enabled all the time will mean that your random swap issue gets moved to boot.
On Thu, 23 Nov 2017, Mark Brown wrote:
If you don't copy any actual people the chances of getting a response are vastly reduced.
My point has been that I don't want to pester any specific people ... but point taken that otherwise it just ends up as an FYI.
Of course, one solution would be to change the driver to enable the output already at the _STANDBY level. It does seem to me that that would defeat part of the whole point of DAPM; on the other hand, I would think that at some point power management must take a back seat to actually getting a good user experience.
Yes, you'll need to do something like leave the outputs enabled. Most devices have some facility to keep VMID switched to the outputs, though I did see a few where someone thought it was a good idea to always power on from cold. A device that old isn't going to be that competitive power-wise anyway, even if it were well implemented, which this one seems not to have been.
In my case the codec is only a couple of years old, so I'd say it qualifies as 'modern'. There's a table in the data sheet describing the power consumption for various combinations of the PM control bits, but unfortunately the case of having the output buffer enabled but not the DAC is not listed. I could always ask the manufacturer directly of course, or simply measure it myself on an evaluation board to get an idea.
This is why we've got the digital_mute() callback - the idea is that the output is just going to be discarded until the interface has had a chance to settle down.
The problems we've had have been on the CPU side, with I2S interfaces that have problems synchronizing to an ongoing I2S stream when the CPU AIF is
That's what the digital mute handling is for - we're just throwing away any garbage the CPU puts out before it's stable. We're not expecting it to work around any CODEC side issues.
During which time period is the digital mute enabled? From the first stream startup until everything is up and running?
in slave mode, to the extent that in some cases it's impossible to determine whether the first sample received by or sent from the I2S interface is the left or right channel. If the DAIs had been set up once and been allowed to run continuously even when there is no actual audio to transfer, it would have helped this situation. Agreed, I'd rather have a DAI that worked properly, but sometimes one doesn't have that luxury. :-(
If you're getting L/R swap issues on some startups leaving things enabled all the time will mean that your random swap issue gets moved to boot.
Basically the situation I had was that it worked fine on first startup; it was on subsequent startups/shutdowns that the problems occurred. Shutting down the audio streams seemed to leave the CPU DAI in some intermediate state from which it could only recover if completely reset, which did not play well with, for instance, having a capture stream running continuously while a playback stream was started and stopped.
/Ricard
On Thu, Nov 23, 2017 at 03:32:26PM +0100, Ricard Wanderlof wrote:
On Thu, 23 Nov 2017, Mark Brown wrote:
Yes, you'll need to do something like leave the outputs enabled. Most devices have some facility to keep VMID switched to the outputs, though I did see a few where someone thought it was a good idea to always power on from cold. A device that old isn't going to be that competitive power-wise anyway, even if it were well implemented, which this one seems not to have been.
In my case the codec is only a couple of years old, so I'd say it qualifies as 'modern'. There's a table in the data sheet describing the
Wow, people are still designing new VMID based CODECs - what is this one?
That's what the digital mute handling is for - we're just throwing away any garbage the CPU puts out before it's stable. We're not expecting it to work around any CODEC side issues.
During which time period is the digital mute enabled? From the first stream startup until everything is up and running?
It should be enabled at all times that we're not actively playing anything. Though now that I think about it, we need to go through and mute everything during probe, otherwise it won't be muted on first power on.
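(Sketching that thought: a codec-level probe that asserts the soft-mute bit before any stream can start, with the same placeholder names as in the earlier sketches:)

/* Sketch: make sure the DAC comes up muted on first power-on by setting the
 * soft-mute bit from the codec probe callback. */
static int mycodec_probe(struct snd_soc_codec *codec)
{
        struct mycodec_priv *priv = snd_soc_codec_get_drvdata(codec);

        return regmap_update_bits(priv->regmap, MYCODEC_DAC_CTRL,
                                  MYCODEC_DAC_SMUTE, MYCODEC_DAC_SMUTE);
}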
If you're getting L/R swap issues on some startups leaving things enabled all the time will mean that your random swap issue gets moved to boot.
Basically the situation I had was that it worked fine on first startup; it was on subsequent startups/shutdowns that the problems occurred. Shutting down the audio streams seemed to leave the CPU DAI in some intermediate state from which it could only recover if completely reset, which did not play well with, for instance, having a capture stream running continuously while a playback stream was started and stopped.
That's a new (and buggy) one...
On Thu, 23 Nov 2017, Mark Brown wrote:
In my case the codec is only a couple of years old, so I'd say it qualifies as 'modern'. There's a table in the data sheet describing the
Wow, people are still designing new VMID based CODECs - what is this one?
In this case it's the AK4637. From the data sheet it looks to be from 2015.
I know there are codecs that do not need their inputs/outputs (internally) biased to a VMID, but have a 0V bias level, using an internal charge-pump design to generate the required internal negative voltages, if that is what you're thinking of. One disadvantage from a cost perspective is the external capacitors needed for the charge pump to operate, and the design in which this one is used is extremely cost optimized.
During which time period is the digital mute enabled? From the first stream startup until everything is up and running?
It should be enabled at all times that we're not actively playing anything.
Ok. So it really would be the last callback made (with mute = off) to the codec driver once everything else has been set up then?
/Ricard
On Fri, Nov 24, 2017 at 10:43:48AM +0100, Ricard Wanderlof wrote:
I know there are codecs that do not need their inputs/outputs (internally) biased to a VMID, but have a 0V bias level, using an internal charge-pump design to generate the required internal negative voltages, if that is what you're thinking of. One disadvantage from a cost perspective is the external capacitors needed for the charge pump to operate, and the design in which this one is used is extremely cost optimized.
Right, but it's not like any of that wasn't already covered by the many existing CODECs in that space, which paid back their development costs years ago; it's surprising that anyone would try to make something new there.
During which time period is the digital mute enabled? From the first stream startup until everything is up and running?
It should be enabled at all times that we're not actively playing anything.
Ok. So it really would be the last callback made (with mute = off) to the codec driver once everything else has been set up then?
That's the idea.
Hi Mark,
Picking up an old thread here: I'm working on a VMID-based codec with an associated driver from AKM, where VMID and the audio outputs are only switched on when the device is actually outputting audio data (in the playback case), causing loud clicks when a stream is started and stopped.
On Wed, 22 Nov 2017, Mark Brown wrote:
On Wed, Nov 22, 2017 at 01:33:37PM +0100, Ricard Wanderlof wrote:
I'm having an issue with a codec where the normal DAPM procedure of disabling most of the codec when there is no stream ongoing results in large clicks when a playback stream starts and stops, because the output jumps from 0V when the output is disabled to over a volt when the output is enabled.
There appears to be a 'slow stop/start' feature in the codec; however, it takes about one second to ramp the output voltage up and down, and that is too long a delay when starting a stream.
The CODEC driver is broken, it shouldn't be powering off on idle if it takes a second to ramp, it should be leaving VMID up while the system is running - this is what the _STANDBY bias level is all about. Look at what older CODEC drivers like the wm8731 do.
I was tracing through what actually gets called in the codec driver when a stream is started, and from what I can see ALSA sequences through the STANDBY, PREPARE and ON states at that point, with no other calls to set_bias_level prior to that. So it would seem that the STANDBY level is only reached during the process of starting the stream; in other words, VMID and the outputs would have to be enabled already in the OFF state.
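(The tracing itself was nothing fancy; roughly a debug print in the driver's set_bias_level callback along these lines, names as before being placeholders:)

/* Debug aid only: log each bias-level transition the core requests. */
static int mycodec_set_bias_level(struct snd_soc_codec *codec,
                                  enum snd_soc_bias_level level)
{
        dev_dbg(codec->dev, "set_bias_level: %d\n", level);
        /* ... actual register writes per level ... */
        return 0;
}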
In our system, the power consumed by the codec is negligible compared to the rest of the system at all times, so it's not a problem to leave basically the whole codec up, but I am trying to understand what is the Right Thing to do here.
/Ricard
On Mon, Mar 19, 2018 at 04:53:23PM +0100, Ricard Wanderlof wrote:
On Wed, 22 Nov 2017, Mark Brown wrote:
The CODEC driver is broken, it shouldn't be powering off on idle if it takes a second to ramp, it should be leaving VMID up while the system is running - this is what the _STANDBY bias level is all about. Look at what older CODEC drivers like the wm8731 do.
I was tracing through what actually gets called in the codec driver when a stream is started, and from what I can see ALSA sequences through the STANDBY, PREPARE and ON states at that point, with no other calls to set_bias_level prior to that. So it would seem that the STANDBY level is only reached during the process of starting the stream; in other words, VMID and the outputs would have to be enabled already in the OFF state.
To repeat: if you're setting idle_bias_off for a CODEC like this you're doing it wrong. The device shouldn't be going to a bias level lower than _STANDBY while the system is out of suspend.
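(In the snd_soc_codec_driver struct of this era that is just a flag; a sketch of how a VMID-based part would typically set it up, with the rest elided and the callback names taken from the earlier sketches:)

/* Sketch: do not set idle_bias_off, so the core keeps the codec at
 * _STANDBY (VMID up) whenever the system is running; only suspend drops
 * the bias to _OFF. */
static const struct snd_soc_codec_driver soc_codec_dev_mycodec = {
        .probe            = mycodec_probe,
        .set_bias_level   = mycodec_set_bias_level,
        .idle_bias_off    = false,      /* stay at _STANDBY when idle */
        .suspend_bias_off = true,       /* drop to _OFF only across suspend */
        /* .component_driver = { .dapm_widgets = ..., ... }, */
};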
In our system, the power consumed by the codec is negligible compared to the rest of the system at all times, so it's not a problem to leave basically the whole codec up, but I am trying to understand what is the Right Thing to do here.
Are you sure that the device doesn't have any facility for switching the outputs to a lower power maintenance state while things are idle? This sounds like an extremely basic device, even by the standards of a decade ago. The reason for powering down the outputs isn't just a power one: often, especially with these older devices, there's an audible noise floor even when playing silence, which is distracting for users in quiet environments. Powering down eliminates this noise floor.
The easiest thing to do if you just want to keep playing audio all the time is to handle this in the sound server and have it keep playing. If the hardware really is as limited as it seems there's going to be no good way to support it. Without knowing how flexible it is or what features it has it's difficult to be specific about the best way to handle it, but whatever gets done, having a stream coming from the CPU all the time will work. The more control there is the more likely it is that the driver is going to have to power it down on idle anyway, to allow configuration changes to be made without generating pops and clicks.
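(If going the "keep a stream running" route, the sound server side can be as simple as an always-on silence writer; a rough alsa-lib sketch, where the "hw:0,0" device name and the 48 kHz/S16_LE/stereo format are assumptions:)

#include <alsa/asoundlib.h>

int main(void)
{
        snd_pcm_t *pcm;
        short silence[2 * 480] = { 0 };         /* 10 ms of stereo silence */
        int err;

        err = snd_pcm_open(&pcm, "hw:0,0", SND_PCM_STREAM_PLAYBACK, 0);
        if (err < 0)
                return 1;

        err = snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE,
                                 SND_PCM_ACCESS_RW_INTERLEAVED,
                                 2, 48000, 1, 100000);
        if (err < 0)
                return 1;

        /* Keep the DAI and codec path active forever by feeding silence;
         * a real sound server would mix live audio into this stream. */
        for (;;) {
                snd_pcm_sframes_t n = snd_pcm_writei(pcm, silence, 480);
                if (n < 0)
                        snd_pcm_recover(pcm, n, 0);
        }
}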
On Wed, 21 Mar 2018, Mark Brown wrote:
The CODEC driver is broken, it shouldn't be powering off on idle if it takes a second to ramp, it should be leaving VMID up while the system is running - this is what the _STANDBY bias level is all about. Look at what older CODEC drivers like the wm8731 do.
I was tracing through what actually gets called in the codec driver when a stream is started, and from what I can see ALSA sequences through the STANDBY, PREPARE and ON states at that point, with no other calls to set_bias_level prior to that. So it would seem that the STANDBY level is only reached during the process of starting the stream; in other words, VMID and the outputs would have to be enabled already in the OFF state.
To repeat: if you're setting idle_bias_off for a CODEC like this you're doing it wrong. The device shouldn't be going to a bias level lower than _STANDBY while the system is out of suspend.
Thanks for the pointer. We got the driver from the codec manufacturer so I haven't been very familiar with it, and I wasn't even aware this configuration option existed. The driver does indeed set idle_bias_off, and clearing this bit was the first step to getting it to work. I'll spare you the gory details.
My biggest issue here has really been understanding how ALSA expects the device to behave, and which options there are. Once I've got that clear getting the codec to behave correspondingly is mostly a question of reading the data sheet and writing code.
Are you sure that the device doesn't have any facility for switching the outputs to a lower power maintenance state while things are idle? This sounds like an extremely basic device, even by the standards of a decade ago. The reason for powering down the outputs isn't just a power one: often, especially with these older devices, there's an audible noise floor even when playing silence, which is distracting for users in quiet environments. Powering down eliminates this noise floor.
Yes, there is a bit for putting the output amplifier in power-save mode, which is handled as a PM bit in the driver. The specs aren't that bad: the worst-case SNR is 80 dB, with 97 dB typical. The good thing about using this bit is that when the output amplifier is in power-save mode there is actually a high-impedance connection to VMID, so as long as the device is powered on properly the outputs take a while to settle towards VMID, depending on the value of the DC blocking cap on the output, and that slow ramp also avoids clicks when powering on the device (and we don't start an output stream until quite a while after powering on the system).
/Ricard
On Fri, Mar 23, 2018 at 05:23:23PM +0100, Ricard Wanderlof wrote:
Yes, there is a bit for putting the output amplifier in power-save mode, which is handled as a PM bit in the driver. The specs aren't that bad: the worst-case SNR is 80 dB, with 97 dB typical. The good thing about using this bit is that when the output amplifier is in power-save mode there is actually a high-impedance connection to VMID, so as long as the device is powered on properly the outputs take a while to settle towards VMID, depending on the value of the DC blocking cap on the output, and that slow ramp also avoids clicks when powering on the device (and we don't start an output stream until quite a while after powering on the system).
That feature sounds normal for such old devices; you should be using that rather than ramping VMID from ground every time you want to start and stop playback. There are a number of very old drivers doing something similar; grepping around will probably show something.
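(For completeness, exposing such a power-save bit to DAPM usually just means describing the output stage as a widget so the core toggles the bit around playback; a sketch with placeholder names, not the actual AK4637 register layout:)

static const struct snd_soc_dapm_widget mycodec_dapm_widgets[] = {
        /* Output-driver widget: bit cleared = power-save (high-impedance
         * pull to VMID), bit set = amplifier active. */
        SND_SOC_DAPM_OUT_DRV("Line Out Amp", MYCODEC_PWR, 3, 0, NULL, 0),
        SND_SOC_DAPM_OUTPUT("LOUT"),
};

static const struct snd_soc_dapm_route mycodec_dapm_routes[] = {
        { "LOUT", NULL, "Line Out Amp" },
};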