[alsa-devel] ALSA and MediaController
During the ALSA/Embedded audio workshop two weeks ago we talked about using the MediaController API (merged in 2.6.39) to provide userspace with a graph-like representation of the transforms implemented in firmware or hardware. Two main applications were mentioned:
- volume control: at the moment in PulseAudio we get a flat list of volume controls, and we use convoluted mixer logic to figure out if 'pcm playback volume' is applied before or after 'master playback volume'.
- tuning applications: with literally dozens of controls exposed in embedded solutions, a flat list is going to be very hard to use. It would make sense during the device tuning steps to figure out which control is located where, to better understand its impact. This would help create UCM configuration files as well.
At the moment the Media Controller does not provide any means to set values/parameters in a media_entity. We would need to cross-reference each media_entity with ALSA controls and use the existing control get/set interfaces. This raises two main questions:
- Is the list of ALSA controls completely defined after the card initialization step? Or do we have cases where ALSA controls are created dynamically after init? I tend to believe the former is true, but want to verify this with others on the mailing list. It's my understanding that only connections may be changed in such a graph.
- Laurent suggested a new ioctl to export media_entity information to userspace, as a binary buffer containing a list of TLV (Type-Length-Value) fields. However, it looks like there's already an ELEM_BYTES control type. It might make more sense to expose this information using native ALSA controls, but again, feedback from smart people on this list would be welcome.
Thanks, -Pierre
On Tue, May 17, 2011 at 12:48:59PM -0500, pl bossart wrote:
- Is the list of ALSA controls completely defined after the card initialization step? Or do we have cases where ALSA controls are created dynamically after init? I tend to believe the former is true, but want to verify this with others on the mailing list. It's my understanding that only connections may be changed in such a graph.
Currently it is static for at least ASoC but I'd expect this will change as the DSP stuff gets more advanced.
At Tue, 17 May 2011 11:02:28 -0700, Mark Brown wrote:
On Tue, May 17, 2011 at 12:48:59PM -0500, pl bossart wrote:
- Is the list of ALSA controls completely defined after the card initialization step? Or do we have cases where ALSA controls are created dynamically after init? I tend to believe the former is true, but want to verify this with others on the mailing list. It's my understanding that only connections may be changed in such a graph.
Currently it is static for at least ASoC but I'd expect this will change as the DSP stuff gets more advanced.
There are actually cases where controls are dynamically created/removed. We had it in some drivers, but I don't remember whether we still keep it or changed it in a different way. In any case, the alsa-lib API can also deliver notification events when controls are added or removed.
Takashi
At Tue, 17 May 2011 12:48:59 -0500, pl bossart wrote:
During the ALSA/Embedded audio workshop two weeks ago we talked about using the MediaController API (merged in 2.6.39) to provide userspace with a graph-like representation of the transforms implemented in firmware or hardware. Two main applications were mentioned:
- volume control: at the moment in PulseAudio we get a flat list of volume controls, and we use convoluted mixer logic to figure out if 'pcm playback volume' is applied before or after 'master playback volume'.
- tuning applications: with literally dozens of controls exposed in embedded solutions, a flat list is going to be very hard to use. It would make sense during the device tuning steps to figure out which control is located where, to better understand its impact. This would help create UCM configuration files as well.
The current flat array implementation is the result of a hard time spent attempting to implement a graph in the mixer API back in the ALSA 0.5.x days; that ended up as a total mess of code.
Maybe now we'd implement it differently, but I believe the flat array itself is still the best primary data form. We can provide some extended attributes for each control if needed.
At the moment the Media Controller does not provide any means to set values/parameters in a media_entity. We would need to cross-reference each media_entity with ALSA controls and use the existing control get/set interfaces. This raises two main questions:
- Is the list of ALSA controls completely defined after the card initialization step? Or do we have cases where ALSA controls are created dynamically after init? I tend to believe the former is true, but want to verify this with others on the mailing list. It's my understanding that only connections may be changed in such a graph.
- Laurent suggested a new ioctl to export media_entity information to userspace, as a binary buffer containing a list of TLV (Type-Length-Value) fields. However, it looks like there's already an ELEM_BYTES control type. It might make more sense to expose this information using native ALSA controls, but again, feedback from smart people on this list would be welcome.
You may use the TLV information provided for each control element, too. So far this carries only the dB information, but it doesn't have to be restricted to that.
Takashi
On Tue, May 17, 2011 at 08:43:43PM +0200, Takashi Iwai wrote:
The current flat array implementation is the result of a hard time spent attempting to implement a graph in the mixer API back in the ALSA 0.5.x days; that ended up as a total mess of code.
Maybe now we'd implement it differently, but I believe the flat array itself is still the best primary data form. We can provide some extended attributes for each control if needed.
Yeah, the discussion was to initially keep the array of controls and associate it with a separate graph, so that userspace has a way to figure out how all the controls are plumbed together and what they actually mean. One of the main drivers for interest in the media controller API was that it already does the graph-drawing-to-userspace bit.
- Laurent suggested a new ioctl to export media_entity information to userspace, as a binary buffer containing a list of TLV (Type-Length-Value) fields. However, it looks like there's already an ELEM_BYTES control type. It might make more sense to expose this information using native ALSA controls, but again, feedback from smart people on this list would be welcome.
You may use the TLV information provided for each control element, too. So far this carries only the dB information, but it doesn't have to be restricted to that.
We also need to be able to add non-control nodes to what's visible to the user, showing fixed function connections.
2011/5/18 pl bossart bossart.nospam@gmail.com:
During the ALSA/Embedded audio workshop two weeks ago we talked about using the MediaController API (merged in 2.6.39) to provide userspace with a graph-like representation of the transforms implemented in firmware or hardware.
How do the Media Controller or PulseAudio handle HDA jack retasking on those three-jack motherboards (channel mode or smart 5.1 switch)?
On those mini-ITX motherboards without a front audio panel, there will be no capture source if the HDA codec does not support stereo mix, once the driver retasks those input jacks for 5.1 output.
Does this mean that those three-jack, five-jack and six-jack desktops need to add the rear panel jacks in order to use snd_jack_report?
Raymond, it's a fairly basic convention on Linux lists to reply to all. Please pay attention to this.
On Sat, May 21, 2011 at 10:43:55AM +0800, Raymond Yau wrote:
2011/5/18 pl bossart bossart.nospam@gmail.com:
During the ALSA/Embedded audio workshop two weeks ago we talked about using the MediaController API (merged in 2.6.39) to provide userspace with a graph-like representation of the transforms implemented in firmware or hardware.
How do the Media Controller or PulseAudio handle HDA jack retasking on those three-jack motherboards (channel mode or smart 5.1 switch)?
On those mini-ITX motherboards without a front audio panel, there will be no capture source if the HDA codec does not support stereo mix, once the driver retasks those input jacks for 5.1 output.
This just sounds like a fairly basic reconfiguration of the CODEC; it's no different from any other rerouting.
Does this mean that those three-jack, five-jack and six-jack desktops need to add the rear panel jacks in order to use snd_jack_report?
It seems like a good idea to report any jacks we can do detection on....
2011/5/21 Mark Brown broonie@opensource.wolfsonmicro.com:
How do the Media Controller or PulseAudio handle HDA jack retasking on those three-jack motherboards (channel mode or smart 5.1 switch)?
On those mini-ITX motherboards without a front audio panel, there will be no capture source if the HDA codec does not support stereo mix, once the driver retasks those input jacks for 5.1 output.
This just sounds like a fairly basic reconfiguration of the CODEC; it's no different from any other rerouting.
Does this mean that those three-jack, five-jack and six-jack desktops need to add the rear panel jacks in order to use snd_jack_report?
It seems like a good idea to report any jacks we can do detection on....
But the snd_jack API does not seem to have any definition for "Line In" (the blue jacks)?
The jack retasking is performed by snd_hda_sequence_write_cache(codec, chmode[mode].sequence) in snd_hda_ch_mode_put() for some HDA codecs
On Mon, May 23, 2011 at 01:39:08PM +0800, Raymond Yau wrote:
2011/5/21 Mark Brown broonie@opensource.wolfsonmicro.com:
I see you've ignored my remarks about CCing people and managed to cut that text without also deleting other unneeded context; please do CC people, it is actually important if you want people to engage in discussion.
But the snd_jack API does not seem to have any definition for "Line In" (the blue jacks)?
I don't see any substantial issue with adding such a definition if someone has a use for it?
The jack retasking is performed by snd_hda_sequence_write_cache(codec, chmode[mode].sequence) in snd_hda_ch_mode_put() for some HDA codecs
I'm not sure what point you're trying to make here; the internals of individual drivers shouldn't be too relevant to external interfaces?
participants (4)
- Mark Brown
- pl bossart
- Raymond Yau
- Takashi Iwai