Re: [alsa-devel] [PATCH] ASoC: ZOOM2: Add support for DSP rendering.
On Thu, Nov 05, 2009 at 04:21:31PM -0600, Jane Wang wrote:
New mixer controls are added to send DAPM stream events that power up/down the TWL4030 DAPM widgets along the playback and capture paths. In the case of DSP rendering, the corresponding mixer control is used to power up the widgets before the stream starts and to power them down after the stream stops.
What does the hardware look like and what is the problem you're trying to solve here? Sending stream events into the core is a definite layering violation and is likely to fail with some use case transitions but without knowing what the patch is trying to achieve it's hard to offer much advice.
+static const char *path_control[] = {"Off", "On"};
+static const struct soc_enum zoom2_enum[] = {
+	SOC_ENUM_SINGLE_EXT(ARRAY_SIZE(path_control), path_control),
+};
Why is this an enumerated control and not a switch?
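For illustration, a switch-type control in the machine driver would look roughly like the sketch below. It is written against the 2009-era ASoC API and assumes SOC_SINGLE_BOOL_EXT is available (SOC_SINGLE_EXT with max = 1 works similarly); the control name and the zoom2_dsp_get()/zoom2_dsp_put() handlers are invented for the example, not taken from the patch.

#include <sound/control.h>
#include <sound/soc.h>

/* Sketch only: a boolean switch instead of an "Off"/"On" enum. */
static int zoom2_dsp_state;

static int zoom2_dsp_get(struct snd_kcontrol *kcontrol,
			 struct snd_ctl_elem_value *ucontrol)
{
	ucontrol->value.integer.value[0] = zoom2_dsp_state;
	return 0;
}

static int zoom2_dsp_put(struct snd_kcontrol *kcontrol,
			 struct snd_ctl_elem_value *ucontrol)
{
	int state = !!ucontrol->value.integer.value[0];

	if (state == zoom2_dsp_state)
		return 0;

	zoom2_dsp_state = state;
	/* power handling for the DSP path would go here */
	return 1;
}

static const struct snd_kcontrol_new zoom2_controls[] = {
	SOC_SINGLE_BOOL_EXT("DSP Playback Switch", 0,
			    zoom2_dsp_get, zoom2_dsp_put),
};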
-----Original Message-----
From: Mark Brown [mailto:broonie@opensource.wolfsonmicro.com]
Sent: Friday, November 06, 2009 8:03 AM
To: Wang, Jane
Cc: alsa-devel@alsa-project.org; peter.ujfalusi@nokia.com
Subject: Re: [alsa-devel] [PATCH] ASoC: ZOOM2: Add support for DSP rendering.
On Thu, Nov 05, 2009 at 04:21:31PM -0600, Jane Wang wrote:
New mixer controls are added to send DAPM stream events that power up/down the TWL4030 DAPM widgets along the playback and capture paths. In the case of DSP rendering, the corresponding mixer control is used to power up the widgets before the stream starts and to power them down after the stream stops.
What does the hardware look like and what is the problem you're trying to solve here? Sending stream events into the core is a definite layering violation and is likely to fail with some use case transitions but without knowing what the patch is trying to achieve it's hard to offer much advice.
The scenario is that data flows between DSP <-> McBSP <-> TWL4030. The DSP configures and controls the McBSP ports. Since the DSP does not have control over the TWL4030 codec, the role of ALSA here is to provide mixer controls for a user-space application to configure the codec in preparation for DSP rendering.
The TWL4030 driver does not provide explicit controls over the DAC/ADC, mic bias, APLL, headset pop-noise attenuation etc., and we need these to make playback/capture happen. I can think of two ways to handle this: one is to add controls in the codec to explicitly control these widgets, the other is to use DAPM stream events as a single control to power the path up/down. And I feel the second solution is cleaner...
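As a rough sketch of what the stream-event variant amounts to, the put handler of such a machine-level control could fire the DAPM stream events itself. This assumes the 2009-era snd_soc_dapm_stream_event(codec, stream, event) prototype and a TWL4030 playback stream name of "HiFi Playback"; both are assumptions that would need checking against the tree being patched, and the handler name is invented.

#include <sound/control.h>
#include <sound/soc.h>
#include <sound/soc-dapm.h>

/* Sketch only: power the codec path up/down from a machine control by
 * generating DAPM stream events, so the widgets are active before the
 * DSP starts streaming.  "HiFi Playback" is an assumed stream name and
 * must match the TWL4030 codec DAI's stream_name. */
static int zoom2_dsp_playback_put(struct snd_kcontrol *kcontrol,
				  struct snd_ctl_elem_value *ucontrol)
{
	struct snd_soc_codec *codec = snd_kcontrol_chip(kcontrol);
	int on = !!ucontrol->value.integer.value[0];

	snd_soc_dapm_stream_event(codec, "HiFi Playback",
				  on ? SND_SOC_DAPM_STREAM_START :
				       SND_SOC_DAPM_STREAM_STOP);
	return 1;
}

This is the layering concern raised above: the machine driver injects stream events for a stream that ASoC itself never sees.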
+static const char *path_control[] = {"Off", "On"};
+static const struct soc_enum zoom2_enum[] = {
+	SOC_ENUM_SINGLE_EXT(ARRAY_SIZE(path_control), path_control),
+};
Why is this an enumerated control and not a switch?
I can change this and resubmit the patch.
On Mon, Nov 09, 2009 at 11:22:47AM -0600, Wang, Jane wrote:
Please fix your MUA to word wrap your mails; it makes them very much more legible and easy to reply to.
The scenario is that data flows between DSP <-> McBSP <-> TWL4030. The DSP configures and controls the McBSP ports. Since the DSP does not have control over the TWL4030 codec, the role of ALSA here is to provide mixer controls for a user-space application to configure the codec in preparation for DSP rendering.
So in this setup the DSP is essentially a separate device on the audio bus; it's very similar to things like the Bluetooth chipset you can see in the OpenMoko devices and ought to be presented in the same way, as a separate DAI.
The TWL4030 driver does not provide explicit controls over the DAC/ADC, mic bias, APLL, headset pop-noise attenuation etc., and we need these to make playback/capture happen. I can think of two ways to handle this: one is to add controls in the codec to explicitly control these widgets, the other is to use DAPM stream events as a single control to power the path up/down. And I feel the second solution is cleaner...
Representing the DSP DAI to ASoC is what's needed here; check out the OpenMoko code for an example of solving a similar problem.
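For comparison, the OpenMoko machine drivers model their Bluetooth codec as a stub DAI and tie it to the WM8753 voice interface through a dai_link. A roughly analogous sketch for this board, against the 2009-era (pre-multi-component) ASoC structures, might look like the following; the DAI name, rates, formats and the twl4030_dai index are assumptions.

#include <sound/pcm.h>
#include <sound/soc.h>

#include "../codecs/twl4030.h"

/* Sketch only: expose the DSP/McBSP side as a stub DAI, the way the
 * OpenMoko machine drivers expose their Bluetooth chip, and wire it to
 * the TWL4030 codec DAI.  Names, rates and the twl4030_dai index are
 * assumptions to be adjusted for the real board. */
static struct snd_soc_dai dsp_dai = {
	.name = "DSP",
	.playback = {
		.channels_min = 2,
		.channels_max = 2,
		.rates = SNDRV_PCM_RATE_44100 | SNDRV_PCM_RATE_48000,
		.formats = SNDRV_PCM_FMTBIT_S16_LE,
	},
	.capture = {
		.channels_min = 2,
		.channels_max = 2,
		.rates = SNDRV_PCM_RATE_44100 | SNDRV_PCM_RATE_48000,
		.formats = SNDRV_PCM_FMTBIT_S16_LE,
	},
};

static struct snd_soc_dai_link zoom2_dsp_dai_link = {
	.name = "DSP",
	.stream_name = "DSP Audio",
	.cpu_dai = &dsp_dai,
	.codec_dai = &twl4030_dai[0],	/* assumed HiFi interface */
};

With the DSP visible as its own DAI, opening that stream drives the normal DAPM power-up/power-down sequencing, so no out-of-band mixer control is needed.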
On Tuesday 10 November 2009 15:03:34 ext Mark Brown wrote:
On Mon, Nov 09, 2009 at 11:22:47AM -0600, Wang, Jane wrote:
Representing the DSP DAI to ASoC is what's needed here; check out the OpenMoko code for an example of solving a similar problem.
I agree, the DSP should have its own platform driver (cpu_dai). I'm not really following the DSP development, so I'm not sure how it can be done, but if it is done like this, then you would be able to hook up any codec to the DSP, which would be really nice.
participants (3):
- Mark Brown
- Peter Ujfalusi
- Wang, Jane