Re: [alsa-devel] Jack event API - decision needed
Sorry about the top posting, but as I wasn't involved in any of the discussions, am on a mobile device right now, and your mail isn't directly legible, it would be enormously helpful if you could summarize the issues you're trying to address, your proposed solution, and the problems Kay had.
David Henningsson <david.henningsson@canonical.com> wrote:
Hi,
I'm still new to the community, in the sense that I'm not sure how decisions are made. But I could use the outcome of such a decision.
Background: I'm trying to pull together the missing pieces of jack detection across the kernel / plumbing / application layers, so that when you plug something in (headset, microphone, etc), userspace is notified and can take appropriate actions (e.g. routing decisions).
As part of that I wrote a udev patch a few days ago, which nobody commented on in alsa-devel [1], but which was somewhat disliked by at least Kay Sievers, who maintains udev [2] and would prefer that we rewrite our input-layer usage to do something else within ALSA.
So before I proceed further, I'd like to know whether:
1) We're continuing down the path with /dev/input devices
2a) We'll rewrite these devices to be read-only ALSA mixer controls
2b) We'll rewrite these devices to be something within ALSA, but not exactly mixer controls
For options 2a) and 2b) I guess the existing /dev/input thing should be deprecated and/or removed. So part of the decision should maybe be based on information about how widespread the usage of these devices currently is...?
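For reference, these /dev/input devices deliver ordinary evdev switch events today. A minimal sketch of a consumer, assuming a device node path that will differ per system:

#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

int main(void)
{
    /* The event node for a jack varies per system; this path is only
     * an assumption for illustration. */
    int fd = open("/dev/input/event5", O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    struct input_event ev;
    while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
        if (ev.type != EV_SW)   /* jack state arrives as switch events */
            continue;
        if (ev.code == SW_HEADPHONE_INSERT)
            printf("headphone %s\n", ev.value ? "plugged" : "unplugged");
        else if (ev.code == SW_MICROPHONE_INSERT)
            printf("microphone %s\n", ev.value ? "plugged" : "unplugged");
    }
    return 0;
}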
-- David Henningsson, Canonical Ltd. http://launchpad.net/~diwic
[1] http://mailman.alsa-project.org/pipermail/alsa-devel/2011-June/040916.html
On 2011-06-20 15:56, Mark Brown wrote:
Sorry about the top posting, but as I wasn't involved in any of the discussions, am on a mobile device right now, and your mail isn't directly legible, it would be enormously helpful if you could summarize the issues you're trying to address, your proposed solution, and the problems Kay had.
This particular issue and the udev patch were about letting the currently logged-in user access the input devices, which right now are only accessible by root.
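Conceptually the patch boils down to a rule of roughly this shape. This is a sketch only, not the literal patch; both the name match and the udev-acl tagging are assumptions here:

# Grant the active console user access to ALSA jack input devices
# through udev's ACL mechanism (sketch; the name match is illustrative).
SUBSYSTEM=="input", ATTRS{name}=="*Headphone Jack*", TAG+="udev-acl"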
I'm sure there will be quite a few more issues before something that "just works" for the vast majority of people!
I'll leave it to Kay to explain what problems he had with accepting the patch.
On Mon, Jun 20, 2011 at 15:56, Mark Brown <broonie@opensource.wolfsonmicro.com> wrote:
Sorry about the top posting, but as I wasn't involved in any of the discussions, am on a mobile device right now, and your mail isn't directly legible, it would be enormously helpful if you could summarize the issues you're trying to address, your proposed solution, and the problems Kay had.
Domain-specific events should not be 'faked' as 'input' devices. No non-input subsystem should do that if it can be avoided. The fact that 'input' supports nice and easy events to userspace should not be a reason to misuse it for pretty much unrelated things.
There are patches to have the ALSA control device emit these ALSA-related events natively. That would just be better, simpler and more correct than any additionally created input device.
If Takashi can make that possible in a reasonable time frame, we should not even start handling the (currently not handled) input devices in upstream projects like udev and PulseAudio, and focus right away on the native ALSA control events.
If we can't have the native ALSA events anytime soon for some reason, we might need to merge the input device support, but I would like to avoid that.
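To make that concrete, userspace would consume such native events through the existing alsa-lib control subscription API, roughly like this; the jack control name mentioned below is only a placeholder, since no such element exists yet:

#include <stdio.h>
#include <alsa/asoundlib.h>

int main(void)
{
    snd_ctl_t *ctl;
    snd_ctl_event_t *event;

    if (snd_ctl_open(&ctl, "hw:0", SND_CTL_READONLY) < 0)
        return 1;
    /* Ask the control device to deliver element change notifications. */
    snd_ctl_subscribe_events(ctl, 1);

    snd_ctl_event_alloca(&event);
    for (;;) {
        if (snd_ctl_read(ctl, event) < 0)   /* blocks until an event */
            break;
        if (snd_ctl_event_get_type(event) != SND_CTL_EVENT_ELEM)
            continue;
        /* A native jack element would show up here by name, e.g. a
         * hypothetical "Headphone Jack" control. */
        printf("control '%s' changed\n",
               snd_ctl_event_elem_get_name(event));
    }
    snd_ctl_close(ctl);
    return 0;
}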
Kay
At Mon, 20 Jun 2011 16:19:51 +0200, Kay Sievers wrote:
On Mon, Jun 20, 2011 at 15:56, Mark Brown <broonie@opensource.wolfsonmicro.com> wrote:
Sorry about the top posting, but as I wasn't involved in any of the discussions, am on a mobile device right now, and your mail isn't directly legible, it would be enormously helpful if you could summarize the issues you're trying to address, your proposed solution, and the problems Kay had.
Domain-specific events should not be 'faked' as 'input' devices. No non-input subsystem should do that if it can be avoided. The fact that 'input' supports nice and easy events to userspace should not be a reason to misuse it for pretty much unrelated things.
There are patches to have the ALSA control device emit these ALSA-related events natively. That would just be better, simpler and more correct than any additionally created input device.
If Takashi can make that possible in a reasonable time frame, we should not even start handling the (currently not handled) input devices in upstream projects like udev and PulseAudio, and focus right away on the native ALSA control events.
If we can't have the native ALSA events anytime soon for some reason, we might need to merge the input device support, but I would like to avoid that.
Well, the implementation would be relatively easy. There was already a patch, so it's not too hard to revisit.
But there are still some open questions. For example, what information about the pin is mandatory, and what is merely preferred? HD-audio provides the location, type, color, etc. We can encode these into a single name string, but is that the preferred way?
Alternatively, the control element can provide the HD-audio pin config via extra TLV data, so that apps can refer to it if needed in addition to the name string.
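As a sketch of that alternative, assuming a hypothetical TLV container whose first payload word carries the raw 32-bit HD-audio pin default config (the element name and the TLV layout are assumptions, not existing interfaces):

#include <stdio.h>
#include <alsa/asoundlib.h>

int main(void)
{
    snd_ctl_t *ctl;
    snd_ctl_elem_id_t *id;
    unsigned int tlv[16];

    if (snd_ctl_open(&ctl, "hw:0", SND_CTL_READONLY) < 0)
        return 1;

    snd_ctl_elem_id_alloca(&id);
    snd_ctl_elem_id_set_interface(id, SND_CTL_ELEM_IFACE_CARD);
    snd_ctl_elem_id_set_name(id, "Headphone Jack");  /* placeholder name */

    if (snd_ctl_elem_tlv_read(ctl, id, tlv, sizeof(tlv)) >= 0) {
        /* tlv[0] is the type and tlv[1] the payload length; we assume
         * tlv[2] holds the raw pin default.  The field positions follow
         * the HD-audio spec: device type bits 20-23, location bits
         * 24-29, color bits 12-15. */
        unsigned int cfg = tlv[2];
        printf("device=%u location=%u color=%u\n",
               (cfg >> 20) & 0xf, (cfg >> 24) & 0x3f, (cfg >> 12) & 0xf);
    }
    snd_ctl_close(ctl);
    return 0;
}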
Also, we may consider some way to expose the correlation between the jack control element and other mixer elements. Though, this sounds optional to me for the time being.
These questions are basically requirements from the apps, so I'd like to know the exact demands before going to implementation.
thanks,
Takashi
On Mon, Jun 20, 2011 at 05:35:00PM +0200, Takashi Iwai wrote:
Also, we may consider some way to expose the correlation between the jack control element and other mixer elements. Though, this sounds optional to me for the time being.
From that point of view it'd be much better to do this as part of exposing the full routing map, and the positioning of the controls within the map, to applications. There has been some discussion about using the media controller API to do that within the context of ASoC, though not much work on actually implementing it yet. This would solve a lot of the problems we currently have with figuring out how the card fits together.
At Mon, 20 Jun 2011 17:52:37 +0100, Mark Brown wrote:
On Mon, Jun 20, 2011 at 05:35:00PM +0200, Takashi Iwai wrote:
Also, we may consider some way to expose the correlation between the jack control element and other mixer elements. Though, this sounds optional to me for the time being.
From that point of view it'd be much better to do this as part of exposing the full routing map, and the positioning of the controls within the map, to applications. There has been some discussion about using the media controller API to do that within the context of ASoC, though not much work on actually implementing it yet. This would solve a lot of the problems we currently have with figuring out how the card fits together.
Yeah, that'd be another option. Though, I don't know what PA would actually need for jack detection in its current implementation...
Takashi
On 2011-06-20 17:35, Takashi Iwai wrote:
At Mon, 20 Jun 2011 16:19:51 +0200, Kay Sievers wrote:
On Mon, Jun 20, 2011 at 15:56, Mark Brown <broonie@opensource.wolfsonmicro.com> wrote:
Sorry about the top posting, but as I wasn't involved in any of the discussions, am on a mobile device right now, and your mail isn't directly legible, it would be enormously helpful if you could summarize the issues you're trying to address, your proposed solution, and the problems Kay had.
Domain-specific events should not be 'faked' as 'input' devices. No non-input subsystem should do that if it can be avoided. The fact that 'input' supports nice and easy events to userspace should not be a reason to misuse it for pretty much unrelated things.
There are patches to have the ALSA control device emit these ALSA-related events natively. That would just be better, simpler and more correct than any additionally created input device.
If Takashi can make that possible in a reasonable time frame, we should not even start handling the (currently not handled) input devices in upstream projects like udev and PulseAudio, and focus right away on the native ALSA control events.
If we can't have the native ALSA events anytime soon for some reason, we might need to merge the input device support, but I would like to avoid that.
Well, the implementation would be relatively easy. There was already a patch, so it's not too hard to revisit.
But there are still some open questions. For example, what information about the pin is mandatory, and what is merely preferred? HD-audio provides the location, type, color, etc. We can encode these into a single name string, but is that the preferred way?
I was thinking the same thing. We don't want another parser like the one we now have for the volume control names in PulseAudio. [1] Better to encode the information in binary.
Alternatively, the control element can provide the HD-audio pin config via extra TLV data, so that apps can refer to it if needed in addition to the name string.
Also, we may consider some way to expose the correlation between the jack control element and other mixer elements. Though, this sounds optional to me for the time being.
This correlation is what I'm missing the most, I think. We need, e.g., a way to figure out that if you plug something into HDMI port no. 2, we should output through PCM device 2 on that card. Exposing which mixers affect this port is highly desirable as well, to get more accurate results than the name-based algorithm PulseAudio currently uses.
For the jack itself, the most important info would be:
1) Type (Headphone / Line-out / etc)
2) Channel allocation (Front / Side / LFE / etc)
3) Location (Rear / Docking station / and can also be Front, just to add to the confusion)
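A fixed binary layout for those three items could be as small as a few one-byte fields. This is purely a hypothetical sketch; none of these enums exist in ALSA today:

/* Hypothetical per-jack info record; the fields mirror the list above. */
enum jack_type     { JACK_HEADPHONE, JACK_LINE_OUT, JACK_MIC };
enum jack_channel  { JACK_CH_FRONT, JACK_CH_SIDE, JACK_CH_LFE };
enum jack_location { JACK_LOC_REAR, JACK_LOC_FRONT, JACK_LOC_DOCK };

struct jack_info {
    unsigned char type;      /* enum jack_type */
    unsigned char channel;   /* enum jack_channel */
    unsigned char location;  /* enum jack_location */
    unsigned char reserved;  /* pad to 32 bits for future fields */
};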
These questions are basically requirements from the apps, so I'd like to know the exact demands before going to implementation.
Hopefully this mail gives a little more insight from the PulseAudio viewpoint, at least.
Should we design something new, I think we should start with a pin/port concept rather than a jack concept. "Internal speaker" would then be a port that does not have a jack, whereas "Headphone" would be a port that has a jack.
(Btw, I don't know much about DAPM and how well it scales to cope with these requirements; it looks very ASoC to me, but perhaps it's just the SND_SOC_DAPM_* naming that fools me. But can DAPM, e.g., send events to userspace?)
On Mon, Jun 20, 2011 at 08:24:14PM +0200, David Henningsson wrote:
Exposing which mixers affect this port is highly desirable as well, to get more accurate results than the name-based algorithm PulseAudio currently uses.
Of course, given the potential for internal routing within the card, you can only really say that about the very edge controls that are fully committed to a given path; hence my comment about exposing all the routing being much better.
Should we design something new, I think we should start with a pin/port concept rather than a jack concept. "Internal speaker" would then be a port that does not have a jack, whereas "Headphone" would be a port that has a jack.
I think that's too edge node focused, if we're going to define new interfaces we may as well cover everything rather than doing half the job.
(Btw, I don't know much about DAPM and how well it scales to cope with these requirements; it looks very ASoC to me, but perhaps it's just the SND_SOC_DAPM_* naming that fools me. But can DAPM, e.g., send events to userspace?)
No, there is no userspace visibility of the routing map. It's pretty much exactly equivalent to the data HDA CODECs expose, but shipped in source rather than parsed out of the device at runtime.
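For reference, in ASoC that map is literally tables in the driver source. A typical route table looks roughly like this (kernel-side code; the widget names are illustrative):

#include <sound/soc.h>

/* Each entry is { sink, control, source }: the jack widget is fed from
 * HPOUT, which in turn is fed from the DAC.  Names are illustrative. */
static const struct snd_soc_dapm_route card_routes[] = {
    { "Headphone Jack", NULL, "HPOUT" },
    { "HPOUT",          NULL, "DAC"   },
};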
On 2011-06-21 02:29, Mark Brown wrote:
On Mon, Jun 20, 2011 at 08:24:14PM +0200, David Henningsson wrote:
Exposing which mixers affect this port is highly desirable as well, to get more accurate results than the name-based algorithm PulseAudio currently uses.
Of course, given the potential for internal routing within the card, you can only really say that about the very edge controls that are fully committed to a given path; hence my comment about exposing all the routing being much better.
I'm OK with exposing all the routing. Perhaps we could then add some convenience functions in alsa-lib (and/or UCM?) that make it easier for applications to figure out what they need to know without having to parse the entire graph.
Should we design something new, I think we should start with a pin/port concept rather than a jack concept. "Internal speaker" would then be a port that does not have a jack, whereas "Headphone" would be a port that has a jack.
I think that's too edge node focused, if we're going to define new interfaces we may as well cover everything rather than doing half the job.
Rather, the ports are what we're missing. We already have PCM devices (which correspond to the DAC nodes) and mixer controls, so the ports are the only objects/nodes we're missing.
Maybe there are some other types of objects we're missing as well, but I don't think they're as common or important.
We're also missing the links between the nodes (they're partially exposed through the mixer control naming, but that's error prone).
(Btw, I don't know much about DAPM and how well it scales to cope with these requirements; it looks very ASoC to me, but perhaps it's just the SND_SOC_DAPM_* naming that fools me. But can DAPM, e.g., send events to userspace?)
No, there is no userspace visibility of the routing map. It's pretty much exactly equivalent to the data HDA CODECs expose, but shipped in source rather than parsed out of the device at runtime.
Sorry, I mixed up DAPM and the Media Controller API, but they seem partially overlapping to me (both are supposed to expose this codec graph).
So, if we were to use the new media controller API (which I think you have suggested?), could you explain a little where the boundaries/intersections between this API and the current objects (ALSA PCM objects, ALSA mixer control objects) would be, and how the media controller API would interact with the existing ALSA APIs?
On Tue, Jun 21, 2011 at 08:57:39AM +0200, David Henningsson wrote:
On 2011-06-21 02:29, Mark Brown wrote:
Of course, given the potential for internal routing within the card, you can only really say that about the very edge controls that are fully committed to a given path; hence my comment about exposing all the routing being much better.
I'm OK with exposing all the routing. Perhaps we could then add some convenience functions in alsa-lib (and/or UCM?) that make it easier for applications to figure out what they need to know without having to parse the entire graph.
I'd guess so, unless the raw interface is easy enough to use on its own. But first we need to get the data in place; then we can worry about convenience functions.
UCM wouldn't be an appropriate place to put this; it's all about applying configurations that something else has generated, and has no reason to care about routing itself.
I think that's too edge node focused, if we're going to define new interfaces we may as well cover everything rather than doing half the job.
Rather, the ports are what we're missing. We already have PCM devices (which correspond to the DAC nodes) and mixer controls, so the ports are the only objects/nodes we're missing.
PCM devices correspond to digital links to the CPU, not to DACs and ADCs. You can have digital routing between the two similar to what you can get in the analogue domain.
We're also missing the links between the nodes (they're partially exposed through the mixer control naming, but that's error prone).
It's not just error prone; it just plain doesn't exist in any real sense at the minute. The name-based approach can work only for the most basic sound cards with simple routes, and even then it can't tell applications how the controls are ordered in the audio paths.
(Btw, I don't know much about DAPM and how well it scales to cope with these requirements; it looks very ASoC to me, but perhaps it's just the SND_SOC_DAPM_* naming that fools me. But can DAPM, e.g., send events to userspace?)
No, there is no userspace visibility of the routing map. It's pretty much exactly equivalent to the data HDA CODECs expose, but shipped in source rather than parsed out of the device at runtime.
Sorry, I mixed up DAPM and the Media Controller API, but they seem partially overlapping to me (both are supposed to expose this codec graph).
No, DAPM is not supposed to expose anything to the application layer. It's purely an implementation detail of the drivers, and it is only concerned with power, so it has no information about anything that doesn't affect power.
So, if we were to use the new media controller API (which I think you have suggested?), could you explain a little where the boundaries/intersections between this API and the current objects (ALSA PCM objects, ALSA mixer control objects) would be, and how the media controller API would interact with the existing ALSA APIs?
My expectation would be that whatever interface we use for the graph would, in the first instance, just point from the nodes and edges of the graph at the existing user-visible objects where they exist. I'd not expect any interaction as such, otherwise we'd get an API break.
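For reference, today's media controller API enumerates the graph roughly like this; the device node here is an assumption, and entity types for ALSA objects would still have to be defined:

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/media.h>

int main(void)
{
    struct media_entity_desc entity;
    int fd = open("/dev/media0", O_RDONLY);  /* node path is an assumption */

    if (fd < 0) {
        perror("open");
        return 1;
    }

    memset(&entity, 0, sizeof(entity));
    entity.id = MEDIA_ENT_ID_FLAG_NEXT;  /* request the first entity */

    /* Each call returns the entity following entity.id in the graph. */
    while (ioctl(fd, MEDIA_IOC_ENUM_ENTITIES, &entity) == 0) {
        printf("entity %u: %s (%u pads, %u links)\n",
               entity.id, entity.name, entity.pads, entity.links);
        entity.id |= MEDIA_ENT_ID_FLAG_NEXT;
    }
    return 0;
}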
On Mon, Jun 20, 2011 at 04:19:51PM +0200, Kay Sievers wrote:
Domain-specific events should not be 'faked' as 'input' devices. No non-input subsystem should do that if it can be avoided. The fact that 'input' supports nice and easy events to userspace should not be a reason to misuse it for pretty much unrelated things.
OK, but since jacks aren't at all audio-specific (the most obvious additional thing that goes over them is video, for cases like composite out on the mic pin, and the modern digital standards like HDMI), that wouldn't address the issue. There are also things like docking stations, which can present very much like big fancy jacks.
There are patches to have the ALSA control device emit these ALSA-related events natively. That would just be better, simpler and more correct than any additionally created input device.
That's not really terribly clever for the non-audio users; they'd have to implement ALSA support.
If we can't have the native ALSA events anytime soon for some reason, we might need to merge the input device support, but I would like to avoid that.
The input device usage has been present and in use in a basic form since 2.6.18, and the ALSA integration of it since 2.6.27, so this isn't terribly new.
It might be nice to have the information available via ALSA but it doesn't seem reasonable to expect anything that might need the information to have to go through ALSA.
At Mon, 20 Jun 2011 17:47:14 +0100, Mark Brown wrote:
On Mon, Jun 20, 2011 at 04:19:51PM +0200, Kay Sievers wrote:
Domain-specific events should not be 'faked' as 'input' devices. No non-input subsystem should do that if it can be avoided. The fact that 'input' supports nice and easy events to userspace should not be a reason to misuse it for pretty much unrelated things.
OK, but since jacks aren't at all audio-specific (the most obvious additional thing that goes over them is video, for cases like composite out on the mic pin, and the modern digital standards like HDMI), that wouldn't address the issue. There are also things like docking stations, which can present very much like big fancy jacks.
There are patches to have the ALSA control device emit these ALSA-related events natively. That would just be better, simpler and more correct than any additionally created input device.
That's not really terribly clever for the non-audio users; they'd have to implement ALSA support.
If we can't have the native ALSA events anytime soon for some reason, we might need to merge the input device support, but I would like to avoid that.
The input device usage has been present and in use in a basic form since 2.6.18, and the ALSA integration of it since 2.6.27, so this isn't terribly new.
It might be nice to have the information available via ALSA but it doesn't seem reasonable to expect anything that might need the information to have to go through ALSA.
Note that the issue came up because David posted a patch for udev to change the device permissions of these input-jack devices via ACLs, together with the normal sound devices. Then the question arose whether this is really needed in udev for now.
If udev is being used by real users of such input devices, that'd be a good justification. In the previous thread, I also gave some examples where the input-jack device was used for HD-audio media-PC devices, but it's not clear whether udev is used there. And I'm not sure whether there are real users of embedded devices with the input-jack layer that require udev's ACL handling.
OTOH, if this is mainly targeted at the future extension of PulseAudio, the primary question would become different. Possibly the path via the ALSA control API might give more information than the current implementation with the input-jack layer.
thanks,
Takashi
On Mon, Jun 20, 2011 at 06:59:46PM +0200, Takashi Iwai wrote:
If udev is being used by real users of such input devices, that'd be a good justification. In the previous thread, I also gave some examples where the input-jack device was used for HD-audio media-PC devices, but it's not clear whether udev is used there. And I'm not sure whether there are real users of embedded devices with the input-jack layer that require udev's ACL handling.
I don't think the embedded space cares desperately about the udev stuff, since most of the time the system management daemon which owns this stuff is running as root anyway, PulseAudio or otherwise.
OTOH, if this is mainly targeted at the future extension of PulseAudio, the primary question would become different. Possibly the path via the ALSA control API might give more information than the current implementation with the input-jack layer.
I think they can both either give equivalent information or work in concert; I don't see a need to pick between the two. Like I say, the jacks aren't exclusively for audio use, so if we have to rely on the ALSA APIs to get information about them it seems like we're messing up. For things like working out where the jack fits into the audio routing map we'd certainly want to have ALSA API integration, but that doesn't mean everything about the jack has to go that way.
At Mon, 20 Jun 2011 18:17:45 +0100, Mark Brown wrote:
On Mon, Jun 20, 2011 at 06:59:46PM +0200, Takashi Iwai wrote:
If udev is being used by real users of such input devices, that'd be a good justification. In the previous thread, I also gave some examples where the input-jack device was used for HD-audio media-PC devices, but it's not clear whether udev is used there. And I'm not sure whether there are real users of embedded devices with the input-jack layer that require udev's ACL handling.
I don't think the embedded space cares desperately about the udev stuff, since most of the time the system management daemon which owns this stuff is running as root anyway, PulseAudio or otherwise.
OTOH, if this is mainly targeted at the future extension of PulseAudio, the primary question would become different. Possibly the path via the ALSA control API might give more information than the current implementation with the input-jack layer.
I think they can both either give equivalent information or work in concert; I don't see a need to pick between the two.
Yes, we can implement both concurrently. They don't conflict.
Like I say, the jacks aren't exclusively for audio use, so if we have to rely on the ALSA APIs to get information about them it seems like we're messing up. For things like working out where the jack fits into the audio routing map we'd certainly want to have ALSA API integration, but that doesn't mean everything about the jack has to go that way.
Agreed.
Takashi