[alsa-devel] Need your advice: Add a new communication interface between HD-Audio and Gfx drivers for hotplug notification/ELD update
Dear audio and gfx stakeholders,
We hope to add a new interface between the audio and gfx drivers, so the gfx driver can notify the audio driver about HDMI/DP hot-plug and ELD updates. Would you please share some comments on the proposal below?
Background of this issue: On Intel Haswell/Broadwell platforms, there is a HW restriction that once the display HD-Audio controller is in D3, it cannot be woken up by HDMI/DP hot-plug. Consequently, although the gfx driver can still detect the HDMI/DP hot-plug, the audio driver has no idea about it and cannot notify user space whether the external HDMI/DP monitor is available for audio playback, because the audio controller cannot wake up to D0 and receive the HW unsolicited event about hot-plug from the audio codec. This limitation prevents user space from deciding whether audio can be output over HDMI/DP.
To solve the above limitation, Takashi suggested adding a new communication interface between the audio and gfx drivers: create a common object containing the ops registered by both the graphics and audio drivers, then communicate through it, something like vga_switcheroo.
Is it okay to create this kernel object in i915 driver?
I915 can export an API like "display_register_audio_client" for the audio driver to register a client and hot-plug notification ops.
I915 can also call some API like "display_register_gfx_client" itself and register ops for the audio driver to query monitor presence and ELD info on a specific port. This would be faster for the audio driver than querying the ELD by command/response over the HD-A bus, thus avoiding delays in the i915 mode set. It would also avoid waking up the audio devices unnecessarily if user space does not really want to use HDMI/DP for audio playback.
Whenever i915 enables/disables audio on a port in a modeset, it can call some API like "display_set_audio_state()" on this kernel object and trigger notifications to the audio driver.
When the audio driver is probed (in the delayed probe stage), it can request the i915 API symbol to register the audio client for this communication kernel object. Since the first i915 mode set may happen before the audio driver registers the ops, we'll let the audio driver check the ELD once after registering the audio client ops. And for the platforms which use this communication interface, we can disable the unsolicited event for HDMI/DP hot-plug in the audio driver.
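To make the proposed flow concrete, here is a rough user-space-style sketch of the kind of kernel object we have in mind. Every struct and function name below is hypothetical, just mirroring the names suggested above; a real implementation would of course live in the kernel:

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical sketch of the proposed communication object; none of
 * these names exist in the kernel today, they mirror the proposal. */

struct audio_client_ops {
	/* gfx driver calls this on HDMI/DP hot-plug or ELD change */
	void (*hotplug_notify)(int port, bool audio_enabled);
};

struct gfx_client_ops {
	/* audio driver calls these to query state without waking the codec */
	bool (*monitor_present)(int port);
	int  (*get_eld)(int port, unsigned char *buf, size_t max_bytes);
};

static struct audio_client_ops *audio_client;
static struct gfx_client_ops *gfx_client;

/* called by the audio driver in its delayed probe stage */
int display_register_audio_client(struct audio_client_ops *ops)
{
	audio_client = ops;
	return 0;
}

/* called by i915 itself at init time */
int display_register_gfx_client(struct gfx_client_ops *ops)
{
	gfx_client = ops;
	return 0;
}

/* called from the i915 modeset path when audio on a port changes */
void display_set_audio_state(int port, bool enabled)
{
	if (audio_client && audio_client->hotplug_notify)
		audio_client->hotplug_notify(port, enabled);
}
```

Per the proposal, the audio driver would additionally call gfx_client->get_eld() once right after registering, to cover a mode set that happened before registration.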
We hope to hear your feedback and start to work out more details.
Thanks & Best Regards Mengdong
_Actually_ add dri-devel ...
On Tue, Jan 21, 2014 at 2:10 PM, Daniel Vetter daniel@ffwll.ch wrote:
Yeah, I've discussed this at KS with Takashi and we've agreed that some common object to facilitate driver interactions would be useful. A few things though:
- This should be common infrastructure useable by all alsa and drm
drivers, not just i915 and snd-hda. Especially on embedded platforms this issue is fairly rampant ...
- While at it it should also encompass power management handling of
the shared hw imo so that we can get rid of the hsw specific hacks for the power well code. Or at least we need to rework the power well code to reuse this new infrastructure, I don't really want to maintain a few copies of the lazy symbol_get logic this kind of stuff requires.
- I think the biggest problem is figuring out who should register
these device nodes. I think it makes the most sense if we do this in the gfx driver, but that requires some trickery on the alsa side (probably using -EPROBE_DEFER or something like that).
- I agree that passing ELD and all the other information through this
new structure makes a lot more sense than the current mess we have with passing the ELD through some hardware buffer.
- Finally I think we should assign some identifier to this link which
will get exposed both on the drm side and in alsa, so that userspace can figure out which display connects to which output. With that a media player could do the Right Thing and automatically place the audio stream on the right pin in alsa.
Adding dri-devel since that's where we imo need to have this discussion.
Cheers, Daniel
Daniel Vetter Software Engineer, Intel Corporation +41 (0) 79 365 57 48 - http://blog.ffwll.ch
-----Original Message-----
From: daniel.vetter@ffwll.ch [mailto:daniel.vetter@ffwll.ch] On Behalf Of Daniel Vetter
Sent: Tuesday, January 21, 2014 9:11 PM
To: Lin, Mengdong
Cc: Takashi Iwai (tiwai@suse.de); Barnes, Jesse; Zanoni, Paulo R; alsa-devel@alsa-project.org; intel-gfx@lists.freedesktop.org
Subject: Re: Need your advice: Add a new communication interface between HD-Audio and Gfx drivers for hotplug notification/ELD update
Thanks for your advice, Daniel!
Yeah, I've discussed this at KS with Takashi and we've agreed that some common object to facilitate driver interactions would be useful. A few things though:
- This should be common infrastructure useable by all alsa and drm
drivers, not just i915 and snd-hda. Especially on embedded platforms this issue is fairly rampant ...
Agree. Where should we put this common object? Is it okay to put it under drivers/gpu/drm, similar to vga_switcheroo? Shall we divide clients into audio and gfx categories, and define different ops for them, since info/requests flow in different directions between audio and gfx?
- While at it it should also encompass power management handling of the
shared hw imo so that we can get rid of the hsw specific hacks for the power well code. Or at least we need to rework the power well code to reuse this new infrastructure, I don't really want to maintain a few copies of the lazy symbol_get logic this kind of stuff requires.
Sounds good.
- I think the biggest problem is figuring out who should register these
device nodes. I think it makes the most sense if we do this in the gfx driver, but that requires some trickery on the alsa side (probably using -EPROBE_DEFER or something like that).
Can the new infrastructure allow the audio driver to query whether the gfx driver is ready? Maybe audio can wait until gfx is ready. For the HD-Audio driver, the most time-consuming part is deferred to after the probe stage, and we can actually wait in that delayed phase.
- I agree that passing ELD and all the other information through this new
structure makes a lot more sense than the current mess we have with passing the ELD through some hardware buffer.
- Finally I think we should assign some identifier to this link which will get
exposed both on the drm side and in alsa, so that userspace can figure out which display connects to which output. With that a media player could do the Right Thing and automatically place the audio stream on the right pin in alsa.
Is there something that blocks a media player from doing the right thing now? For HD-Audio, the eld entries under /proc/asound/cardX expose the ELD info and can help user space check whether a monitor is usable on a pin. The current limitation is that these eld entries cannot be updated while the audio controller is in D3, so we need the new infrastructure to notify the audio driver to update them. But I'm not sure about embedded audio; maybe Takashi would like to share more info.
Adding dri-devel since that's where we imo need to have this discussion.
Cheers, Daniel
Thanks Mengdong
On Wed, Jan 22, 2014 at 12:48:04PM +0000, Lin, Mengdong wrote:
Thanks for your advice, Daniel!
Yeah, I've discussed this at KS with Takashi and we've agreed that some common object to facilitate driver interactions would be useful. A few things though:
- This should be common infrastructure useable by all alsa and drm
drivers, not just i915 and snd-hda. Especially on embedded platforms this issue is fairly rampant ...
Agree. Where should we put this common object? Is it okay to put it under drivers/gpu/drm, similar to vga_switcheroo? Shall we divide clients into audio and gfx categories, and define different ops for them, since info/requests flow in different directions between audio and gfx?
I guess we could place them into drivers/gpu, yeah. For a name I'd suggest avsink or something like that, to make it clear that it's the combination of audio+video. For the actual interfaces I guess we just need one object in the device model, but the interface should be split into things called from the audio side only, functions called from the video driver side only, and stuff which can be called from both sides. This matters mostly just so we don't end up with deadlocks, since we need a lock to protect the avsink state itself (e.g. the ELD or the audio_output_connected state).
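A minimal user-space model of that split, with one lock protecting the shared state; "avsink" and all member and function names here are only illustrative, not an existing API:

```c
#include <pthread.h>
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical avsink state; the lock guards everything below it.
 * Audio-side and video-side entry points both take the same lock,
 * which is why the interface must be split by caller. */
struct avsink {
	pthread_mutex_t lock;
	bool audio_output_connected;
	unsigned char eld[128];
};

/* video-driver-side entry point: update state on modeset/hot-plug */
void avsink_set_connected(struct avsink *s, bool connected,
			  const unsigned char *eld, size_t len)
{
	pthread_mutex_lock(&s->lock);
	s->audio_output_connected = connected;
	if (connected && len <= sizeof(s->eld))
		memcpy(s->eld, eld, len);
	pthread_mutex_unlock(&s->lock);
}

/* audio-driver-side entry point: take a consistent snapshot */
bool avsink_get_connected(struct avsink *s, unsigned char *eld_out, size_t len)
{
	pthread_mutex_lock(&s->lock);
	bool connected = s->audio_output_connected;
	if (connected && len <= sizeof(s->eld))
		memcpy(eld_out, s->eld, len);
	pthread_mutex_unlock(&s->lock);
	return connected;
}
```

The point of the sketch is only the locking discipline: each side gets its own entry points, and neither side calls back into the other while holding the avsink lock.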
- While at it it should also encompass power management handling of the
shared hw imo so that we can get rid of the hsw specific hacks for the power well code. Or at least we need to rework the power well code to reuse this new infrastructure, I don't really want to maintain a few copies of the lazy symbol_get logic this kind of stuff requires.
Sounds good.
- I think the biggest problem is figuring out who should register these
device nodes. I think it makes the most sense if we do this in the gfx driver, but that requires some trickery on the alsa side (probably using -EPROBE_DEFER or something like that).
Can the new infrastructure allow the audio driver to query whether the gfx driver is ready? Maybe audio can wait until gfx is ready. For the HD-Audio driver, the most time-consuming part is deferred to after the probe stage, and we can actually wait in that delayed phase.
Tbh I haven't really thought about this yet. EPROBE_DEFER looks like the technique used by embedded platforms, but there's also the new aggregate device driver infrastructure that Russell King is working on for the imx driver. Or maybe we need to hand-roll our own notification scheme.
On a hunch it's probably best if the gfx side registers this device (since it also owns the output state in general) and that the audio side waits until the gfx side has registered everything if it's not there yet. I also haven't thought about how the audio side could probe for the right avsink node, really ...
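A toy model of that -EPROBE_DEFER-style ordering (not actual driver-core code; the helper names are made up, only the errno value is the kernel's):

```c
#include <stdbool.h>

#define EPROBE_DEFER 517	/* same numeric value the kernel uses */

static bool avsink_registered;	/* set once the gfx side has registered */

/* gfx driver registers the avsink device first, since it owns the output state */
void gfx_register_avsink(void)
{
	avsink_registered = true;
}

/* audio probe: bail out with -EPROBE_DEFER until the gfx side is ready;
 * the driver core would then retry this probe later */
int audio_probe(void)
{
	if (!avsink_registered)
		return -EPROBE_DEFER;
	/* ... register audio client ops, check the ELD once ... */
	return 0;
}
```

The driver core's deferred-probe list would keep retrying audio_probe() after other probes succeed, so the audio side naturally ends up waiting for the gfx side.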
- I agree that passing ELD and all the other information through this new
structure makes a lot more sense than the current mess we have with passing the ELD through some hardware buffer.
- Finally I think we should assign some identifier to this link which will get
exposed both on the drm side and in alsa, so that userspace can figure out which display connects to which output. With that a media player could do the Right Thing and automatically place the audio stream on the right pin in alsa.
Is there something that blocks a media player from doing the right thing now? For HD-Audio, the eld entries under /proc/asound/cardX expose the ELD info and can help user space check whether a monitor is usable on a pin. The current limitation is that these eld entries cannot be updated while the audio controller is in D3, so we need the new infrastructure to notify the audio driver to update them. But I'm not sure about embedded audio; maybe Takashi would like to share more info.
ELD doesn't contain the serial number from the EDID, so if you have two monitors of the same model userspace can't figure out which audio output is connected to which screen.
Cheers, Daniel
On Wed, Jan 22, 2014 at 9:18 AM, Daniel Vetter daniel@ffwll.ch wrote:
On Wed, Jan 22, 2014 at 12:48:04PM +0000, Lin, Mengdong wrote:
Thanks for your advice, Daniel!
Yeah, I've discussed this at KS with Takashi and we've agreed that some common object to facilitate driver interactions would be useful. A few things though:
- This should be common infrastructure useable by all alsa and drm
drivers, not just i915 and snd-hda. Especially on embedded platforms this issue is fairly rampant ...
Agree. Where should we put this common object? Is it okay to put it under drivers/gpu/drm, similar to vga_switcheroo? Shall we divide clients into audio and gfx categories, and define different ops for them, since info/requests flow in different directions between audio and gfx?
I guess we could place them into drivers/gpu, yeah. For a name I'd suggest avsink or something like that, to make it clear that it's the combination of audio+video. For the actual interfaces I guess we just need one object in the device model, but the interface should be split into things called from the audio side only, functions called from the video driver side only, and stuff which can be called from both sides. This matters mostly just so we don't end up with deadlocks, since we need a lock to protect the avsink state itself (e.g. the ELD or the audio_output_connected state).
sorry to jump into this a bit late, so maybe this was covered already earlier..
For drm/msm the hdmi code needs something along these lines from audio:
  int hdmi_audio_info_setup(struct platform_device *pdev, bool enabled,
                            uint32_t num_of_channels, uint32_t channel_allocation,
                            uint32_t level_shift, bool down_mix);
  void hdmi_audio_set_sample_rate(struct platform_device *pdev, int rate);
(former is mainly to setup avi audio infoframe, latter for clks)
in addition to hotplug and ELD stuff
BR, -R
On Wed, Jan 22, 2014 at 10:04:14AM -0500, Rob Clark wrote:
sorry to jump into this a bit late, so maybe this was covered already earlier..
It just started, I've quoted everything when cc'ing dri-devel. But good to have examples outside of x86 (where things are mostly standardized by the Eye of Redmond).
For drm/msm the hdmi code needs something along these lines from audio:
  int hdmi_audio_info_setup(struct platform_device *pdev, bool enabled,
                            uint32_t num_of_channels, uint32_t channel_allocation,
                            uint32_t level_shift, bool down_mix);
  void hdmi_audio_set_sample_rate(struct platform_device *pdev, int rate);
(former is mainly to setup avi audio infoframe, latter for clks)
in addition to hotplug and ELD stuff
Can you elaborate a bit on what you need for msm? On intel (and I think the other x86 platforms are similar) we have separate buffers in the hw for avi infoframes and audio infoframes, so the drm and alsa driver can bash the right stuff into the hardware on their own. I'm mostly confused since you say _AVI_ infoframe here, which is purely generated by the drm side of the code. Or at least that's been my understanding.
The clocks are also funny really, but I'm not sure whether this fits. I'd have expected some DT-based generic clock source which the audio driver just grabs as part of its multi-device beast (maybe using Russell's new driver core code) and uses directly. What's the reason that this has to go through the drm driver?
Cheers, Daniel
On Wed, Jan 22, 2014 at 10:20 AM, Daniel Vetter daniel@ffwll.ch wrote:
On Wed, Jan 22, 2014 at 10:04:14AM -0500, Rob Clark wrote:
sorry to jump into this a bit late, so maybe this was covered already earlier..
It just started, I've quoted everything when cc'ing dri-devel. But good to have examples outside of x86 (where things are mostly standardized by the Eye of Redmond).
perhaps the arm/SoC stuff was standardized by the Eye of Cthulhu
btw, added a few other SoC drm types who might be interested in the topic
For drm/msm the hdmi code needs something along these lines from audio:
int hdmi_audio_info_setup(struct platform_device *pdev, bool enabled, uint32_t num_of_channels, uint32_t channel_allocation, uint32_t level_shift, bool down_mix); void hdmi_audio_set_sample_rate(struct platform_device *pdev, int rate);
(former is mainly to setup avi audio infoframe, latter for clks)
in addition to hotplug and ELD stuff
Can you elaborate a bit on what you need for msm? On intel (and I think the other x86 platforms are similar) we have separate buffers in the hw for avi infoframes and audio infoframes, so the drm and alsa driver can bash the right stuff into the hardware on their own. I'm mostly confused since you say _AVI_ infoframe here, which is purely generated by the drm side of the code here. Or at least that's been my understanding.
Sorry, typo, meant audio infoframe
We could have the API such that the audio driver constructs the infoframe.. that probably makes more sense and simplifies things. But the drm driver is the one that needs to bash that constructed buffer into the hw. Or, well, either that or both drivers ioremap the same block of registers, but that seems somewhat lame.
But I do need to know some basic things, like # of channels.. and would kinda prefer not to have to parse the audio infoframe to get that info.
The clocks are also funny really, but I'm not sure whether this fits. I'd have expected some DT-based generic clock source which the audio driver just grabs as part of his multi-device beast (maybe using Russell's new driver core code) and uses directly. What's the reason that this has to go through the drm driver?
possibly it could be exposed to the audio driver as a 'struct clk' that is implemented/registered/exported by the drm driver, I guess?
fwiw, if curious, what I have on msm so far is at:
https://github.com/freedreno/kernel-msm/commit/6ffd278d39a3ff8712c70b5fd98dc...
It works on downstream qcom android kernel.. the API exported by drm driver, called by audio driver, is basically just a clone of the hack that was already there between fbdev and alsa. I haven't tried to clean that up at all yet. It was enough work just untangling ION (!!) from alsa in that kernel :-/
BR, -R
Cheers, Daniel
At Wed, 22 Jan 2014 10:45:26 -0500, Rob Clark wrote:
On Wed, Jan 22, 2014 at 10:20 AM, Daniel Vetter daniel@ffwll.ch wrote:
On Wed, Jan 22, 2014 at 10:04:14AM -0500, Rob Clark wrote:
sorry to jump into this a bit late, so maybe this was covered already earlier..
It just started, I've quoted everything when cc'ing dri-devel. But good to have examples outside of x86 (where things are mostly standardized by the Eye of Redmond).
perhaps the arm/SoC stuff was standardized by the Eye of Cthulhu
btw, added a few other SoC drm types who might be interested in the topic
For drm/msm the hdmi code needs something along these lines from audio:
int hdmi_audio_info_setup(struct platform_device *pdev, bool enabled, uint32_t num_of_channels, uint32_t channel_allocation, uint32_t level_shift, bool down_mix); void hdmi_audio_set_sample_rate(struct platform_device *pdev, int rate);
(former is mainly to setup avi audio infoframe, latter for clks)
in addition to hotplug and ELD stuff
Can you elaborate a bit on what you need for msm? On intel (and I think the other x86 platforms are similar) we have separate buffers in the hw for avi infoframes and audio infoframes, so the drm and alsa driver can bash the right stuff into the hardware on their own. I'm mostly confused since you say _AVI_ infoframe here, which is purely generated by the drm side of the code here. Or at least that's been my understanding.
Sorry, typo, meant audio infoframe
We could have the API such that the audio driver constructs the infoframe.. that probably makes more sense and simplifies things.
The HD-audio driver has code for doing that, so if needed, it can be copied from there. But even the AMD HD-audio codec doesn't use this scheme; it just sets up a few verbs for channel-allocations, etc., so the complete audio infoframe isn't necessary in most cases, it seems.
But the drm driver is the one that needs to bash that constructed buffer into the hw. Or, well, either that or both drivers ioremap the same block of registers, but that seems somewhat lame.
But I do need to know some basic things, like # of channels.. and would kinda prefer not to have to parse the audio infoframe to get that info.
Yes, it makes things complex. It's one of the reasons we'd like to have a more straightforward interface.
The clocks are also funny really, but I'm not sure whether this fits. I'd have expected some DT-based generic clock source which the audio driver just grabs as part of his multi-device beast (maybe using Russell's new driver core code) and uses directly. What's the reason that this has to go through the drm driver?
possibly it could be exposed to the audio driver as a 'struct clk' that is implemented/registered/exported by the drm driver, I guess?
Hrm, but I guess this is also purely optional? I'd like to start rather from a minimum set.
fwiw, if curious, what I have on msm so far is at:
https://github.com/freedreno/kernel-msm/commit/6ffd278d39a3ff8712c70b5fd98dc...
It works on downstream qcom android kernel.. the API exported by drm driver, called by audio driver, is basically just a clone of the hack that was already there between fbdev and alsa. I haven't tried to clean that up at all yet. It was enough work just untangling ION (!!) from alsa in that kernel :-/
Thanks for the pointer!
Takashi
On Wed, Jan 22, 2014 at 12:23 PM, Takashi Iwai tiwai@suse.de wrote:
At Wed, 22 Jan 2014 10:45:26 -0500, Rob Clark wrote:
On Wed, Jan 22, 2014 at 10:20 AM, Daniel Vetter daniel@ffwll.ch wrote:
On Wed, Jan 22, 2014 at 10:04:14AM -0500, Rob Clark wrote:
sorry to jump into this a bit late, so maybe this was covered already earlier..
It just started, I've quoted everything when cc'ing dri-devel. But good to have examples outside of x86 (where things are mostly standardized by the Eye of Redmond).
perhaps the arm/SoC stuff was standardized by the Eye of Cthulhu
btw, added a few other SoC drm types who might be interested in the topic
For drm/msm the hdmi code needs something along these lines from audio:
int hdmi_audio_info_setup(struct platform_device *pdev, bool enabled, uint32_t num_of_channels, uint32_t channel_allocation, uint32_t level_shift, bool down_mix); void hdmi_audio_set_sample_rate(struct platform_device *pdev, int rate);
(former is mainly to setup avi audio infoframe, latter for clks)
in addition to hotplug and ELD stuff
Can you elaborate a bit on what you need for msm? On intel (and I think the other x86 platforms are similar) we have separate buffers in the hw for avi infoframes and audio infoframes, so the drm and alsa driver can bash the right stuff into the hardware on their own. I'm mostly confused since you say _AVI_ infoframe here, which is purely generated by the drm side of the code here. Or at least that's been my understanding.
Sorry, typo, meant audio infoframe
We could have the API such that the audio driver constructs the infoframe.. that probably makes more sense and simplifies things.
The HD-audio driver has code for doing that, so if needed, it can be copied from there. But even the AMD HD-audio codec doesn't use this scheme; it just sets up a few verbs for channel-allocations, etc., so the complete audio infoframe isn't necessary in most cases, it seems.
But the drm driver is the one that needs to bash that constructed buffer into the hw. Or, well, either that or both drivers ioremap the same block of registers, but that seems somewhat lame.
But I do need to know some basic things, like # of channels.. and would kinda prefer not to have to parse the audio infoframe to get that info.
Yes, it makes things complex. It's one of the reasons we'd like to have a more straightforward interface.
The clocks are also funny really, but I'm not sure whether this fits. I'd have expected some DT-based generic clock source which the audio driver just grabs as part of his multi-device beast (maybe using Russell's new driver core code) and uses directly. What's the reason that this has to go through the drm driver?
possibly it could be exposed to the audio driver as a 'struct clk' that is implemented/registered/exported by the drm driver, I guess?
Hrm, but I guess this is also purely optional? I'd like to start rather from a minimum set.
well, anything where the video side of things does not need to know (which would incl the audio clock, if this doesn't need to be set up by the display on some hw) would be optional, I think..
if you did need to configure audio clock on the display side of things, it might be simpler just to have a set_rate() fxn ptr in the struct, but the 'struct clk' approach seems more proper. That said, I'm not really sure how it works for the audio driver to find the clk in a non-DT kernel.. if that was a problem, I'm not one to let perfection be the enemy of improvement, so would be ok with the simpler interface.
BR, -R
At Wed, 22 Jan 2014 15:18:21 +0100, Daniel Vetter wrote:
On Wed, Jan 22, 2014 at 12:48:04PM +0000, Lin, Mengdong wrote:
-----Original Message----- From: daniel.vetter@ffwll.ch [mailto:daniel.vetter@ffwll.ch] On Behalf Of Daniel Vetter Sent: Tuesday, January 21, 2014 9:11 PM To: Lin, Mengdong Cc: Takashi Iwai (tiwai@suse.de); Barnes, Jesse; Zanoni, Paulo R; alsa-devel@alsa-project.org; intel-gfx@lists.freedesktop.org Subject: Re: Need your advice: Add a new communication inteface between HD-Audio and Gfx drivers for hotplug notification/ELD update
On Tue, Jan 21, 2014 at 1:35 PM, Lin, Mengdong mengdong.lin@intel.com wrote:
Dear audio and gfx stakeholders,
We hope to add a new interface between audio and gfx driver, for gfx driver to notify audio about HDMI/DP hot-plug and ELD update.
Would you please share some comments on the proposal below?
Background of this issue: On Intel Haswell/Broadwell platforms, there is a HW restriction that after the display HD-Audio controller is in D3, it cannot be woken up by HDMI/DP hot-plug. Consequently, although the gfx driver can still detect the HDMI/DP hot-plug, the audio driver has no idea about this and cannot notify user space whether the external HDMI/DP monitor is available for audio playback, because the audio controller cannot wake up to D0 and receive the HW unsolicited event about hot-plug from the audio codec. This limitation affects user space's ability to decide whether we can output audio over HDMI/DP.
To solve the above limitation, Takashi suggested adding a new communication interface between the audio and gfx drivers: create a common object containing the ops registered by both graphics and audio drivers, then communicate through it, something like vga_switcheroo.
Is it okay to create this kernel object in the i915 driver?
I915 can export an API like "display_register_audio_client" for the audio driver to register a client and hot-plug notification ops.
I915 can also call some API like "display_register_gfx_client" itself and register ops for the audio driver to query monitor presence and ELD info on a specific port. It would be faster for the audio driver than querying ELD by command/response over the HD-A bus, thus avoiding delay in the i915 mode set. This will also avoid waking up the audio devices unnecessarily if user space does not really want to use HDMI/DP for audio playback.
Whenever i915 enables/disables audio on a port in a modeset, it can call some API like "display_set_audio_state()" on this kernel object and trigger notifications to the audio driver.
When the audio driver is probed (in the delayed probe stage), it can request the i915 API symbol to register the audio client for this communication kernel object.
Since the 1st i915 mode set may happen before the audio driver registers the ops, we'll let the audio driver check ELD once after registering the audio client ops.
And for the platforms which use this communication interface, we can disable the unsolicited event for HDMI/DP hot-plug in the audio driver.
We hope to hear your feedback and start to work out more details.
Thanks for your advice, Daniel!
Yeah, I've discussed this at KS with Takashi and we've agreed that we need some common object to facilitate driver interactions. A few things though:
- This should be common infrastructure usable by all alsa and drm drivers, not just i915 and snd-hda. Especially on embedded platforms this issue is fairly rampant ...
Agree. Where should we put this common object? Is it okay to put it under /drivers/gpu/drm, similar to vga_switcheroo? Shall we divide clients into audio and gfx categories, and define different ops for them, since different info/requests flow in different directions between audio and gfx?
I guess we could place them into drivers/gpu, yeah. For a name I'd suggest avsink or something like that, to make it clear that it's the combination of audio+video. For the actual interfaces I guess we just need one object in the device model, but the interface should be split into things called from the audio side only, functions for the video driver side only, and stuff which can be called from both sides. This matters mostly just so we don't end up with deadlocks, since we need a lock to protect the avsink state itself (e.g. the ELD or the audio_output_connected state).
- While at it, it should also encompass power management handling of the shared hw imo, so that we can get rid of the hsw specific hacks for the power well code. Or at least we need to rework the power well code to reuse this new infrastructure; I don't really want to maintain a few copies of the lazy symbol_get logic this kind of stuff requires.
Sounds good.
- I think the biggest problem is figuring out who should register these device nodes. I think it makes the most sense if we do this in the gfx driver, but that requires some trickery on the alsa side (probably using -EPROBE_DEFER or something like that).
Can the new infrastructure allow the audio driver to query whether the gfx driver is ready? Maybe audio can wait until gfx is ready. For the HD-Audio driver, the most time-consuming part is delayed until after the probe stage, and we can actually wait in that delayed phase.
Tbh I haven't really thought about this. EPROBE_DEFER looks like the technique used by embedded platforms, but there's also the new aggregate device driver infrastructure that Russell King is working on for the imx driver. Or maybe we need to hand-roll our own notification scheme.
On a hunch it's probably best if the gfx side registers this device (since it also owns the output state in general) and that the audio side waits until the gfx side has registered everything if it's not there yet. I also haven't thought about how the audio side could probe for the right avsink node really ...
Yes, I think doing it first in the gfx side makes sense, too.
- I agree that passing ELD and all the other information through this new structure makes a lot more sense than the current mess we have with passing the ELD through some hardware buffer.
- Finally I think we should assign some identifier to this link which will get exposed both on the drm side and in alsa, so that userspace can figure out which display connects to which output. With that a media player could do the Right Thing and automatically place the audio stream on the right pin in alsa.
Is there something that blocks a media player from doing the right thing now? For HD-Audio, the eld entries under /proc/asound/cardx expose the ELD info and can help user space to check if a monitor is usable on a pin. The current limitation is that these eld entries cannot be updated if the audio controller is in D3, so we need the new infrastructure to notify the audio driver to update them. But I'm not sure about embedded audio, maybe Takashi would like to share more info.
ELD doesn't contain the serial number from the EDID, so if you have two monitors of the same model userspace can't figure out which audio output is connected to which screen.
Right. And comparing two different pieces of data just to learn whether A and B are related is simply a pain.
thanks,
Takashi
-----Original Message----- From: Takashi Iwai [mailto:tiwai@suse.de] Sent: Thursday, January 23, 2014 1:19 AM To: Daniel Vetter Cc: Lin, Mengdong; Barnes, Jesse; Zanoni, Paulo R; alsa-devel@alsa-project.org; intel-gfx@lists.freedesktop.org; dri-devel Subject: Re: Need your advice: Add a new communication inteface between HD-Audio and Gfx drivers for hotplug notification/ELD update
Thanks for the clarification! Maybe we can add output info (e.g. display port number) to the eld entries under /proc/asound/cardx. Is that okay?
And I have a question: how do we ensure the audio/gfx client finds its right peer? On an x86 platform, there can be an integrated GPU and a discrete GPU, so there can be two audio controllers and two GPUs. We need to ensure the audio controller finds the proper GPU, and vice versa. Maybe we need the peer audio/gfx drivers to register with the same identifier (something like a vendor ID) for peering.
Regards Mengdong
At Thu, 23 Jan 2014 06:35:12 +0000, Lin, Mengdong wrote:
-----Original Message----- From: Takashi Iwai [mailto:tiwai@suse.de] Sent: Thursday, January 23, 2014 1:19 AM To: Daniel Vetter Cc: Lin, Mengdong; Barnes, Jesse; Zanoni, Paulo R; alsa-devel@alsa-project.org; intel-gfx@lists.freedesktop.org; dri-devel Subject: Re: Need your advice: Add a new communication inteface between HD-Audio and Gfx drivers for hotplug notification/ELD update
At Wed, 22 Jan 2014 15:18:21 +0100, Daniel Vetter wrote:
On Wed, Jan 22, 2014 at 12:48:04PM +0000, Lin, Mengdong wrote:
-----Original Message----- From: daniel.vetter@ffwll.ch [mailto:daniel.vetter@ffwll.ch] On Behalf Of Daniel Vetter Sent: Tuesday, January 21, 2014 9:11 PM To: Lin, Mengdong Cc: Takashi Iwai (tiwai@suse.de); Barnes, Jesse; Zanoni, Paulo R; alsa-devel@alsa-project.org; intel-gfx@lists.freedesktop.org Subject: Re: Need your advice: Add a new communication inteface between HD-Audio and Gfx drivers for hotplug notification/ELD update
On Tue, Jan 21, 2014 at 1:35 PM, Lin, Mengdong mengdong.lin@intel.com wrote:
Dear audio and gfx stakeholders,
We hope to add a new interface between audio and gfx driver, for gfx driver to notify audio about HDMI/DP hot-plug and ELD update.
Would you please share some comments on the proposal below?
Background of this issue: On Intel Haswell/Broadwell platforms, there is a HW restriction that after the display HD-Audio controller is in D3,
it cannot be waken up by HDMI/DP hot-plug. Consequently, although the gfx driver can still detect the HDMI/DP hot-plug,
audio driver has no idea about this and cannot notify user space whether the external HDMI/DP monitor is available for audio playback,
because the audio controller cannot wake up to D0 and receive HW unsolicited event about hot-plug from the audio codec.
This limitation will affect user space to decide whether we can output audio over HDMI/DP.
To solve the above limitation, Takashi suggested to add a new communication interface between audio and gfx driver: create a
common
object
containing the ops registered by both graphics and audio drivers, then communicate through it, something like
vga_switcheroo.
Is it okay to create this kernel object in i915 driver?
I915 can export an API like "display_register_audio_client" for audio driver to register a client and hot-plug notification ops.
I915 can also call some API like "display_register_gfx_client" itself and register ops for audio driver to query monitor presence and ELD info on a specific port.
It would be faster for audio driver than quering ELD by command/response over the HD-A bus, thus avoid delay in i915 mode
set.
This will also avoid waking up the audio devices unnecessarily if the user space does not really want to use HDMI/DP for audio
playback.
Whenever i195 enables/disables audio on a port in modeset, it can call some API like "display_set_audio_state()" on this kernel object and trigger notifications to the audio driver.
When the audio driver is probed (in the delayed probe stage), it can request i915 API symbol to register the audio client for this communication kernel object.
Since the 1st i915 mode set may happen before audio driver registers the ops, we'll let audio driver check ELD once after registering the audio client ops.
And for the platforms which uses this communication interface, we can disable unsolicited event for HDMI/DP hot-plug in the audio
driver.
We hope to hear your feedback and start to work out more details.
Thanks for your advice, Daniel!
Yeah, I've discussed this at KS with Takashi and we've agreed that some common object to facilitate driver interactions. A few things though:
- This should be common infrastructure useable by all alsa and drm
drivers, not just i915 and snd-hda. Especially on embedded platforms this issue is fairly rampant ...
Agree. Where to put this common object? Is it okay to put it under /driver/gpu/drm, similar to vga_switchroo? Shall we divide clients into audio and gfx categories, and define different ops for them? Since different info/request flow in different direction between audio and gfx.
I guess we could place them into drivers/gpu, yeah. For a name I'd suggest avsink or something like that, to make it clear that it's the combination of audio+video. For the actual interfaces I guess we just need one object in the device model, but the interface should be split into things called from the audio side only, functions for the video driver side only and stuff which can be called from both sides. This matters mostly just so we don't end up with deadlocks since we need a lock to protect the avsink state itself (e.g. the EDL or the
audio_output_connected state).
- While at it it should also encompass power management handling
of the shared hw imo so that we can get rid of the hsw specific hacks for the power well code. Or at least we need to rework the power well code to reuse this new infrastructure, I don't really want to maintain a few copies of the lazy symbol_get logic this kind
of stuff requires.
Sounds good.
- I think the biggest problem is figuring out who should register
these device nodes. I think it makes the most sense if we do this in the gfx driver, but that requires some trickery on the alsa side (probably with using -EPROBE_DEFER or something like that.
Can the new infrastructure allow audio driver to query whether gfx
driver is ready?
Maybe audio can wait until gfx is ready. For HD-Audio driver, the most time consuming part is delayed after the probe stage, and actually we can wait in the delayed phase.
Tbh I haven't really thought about this really. EPROBE_DEFER looks like the technique used by embedded platforms, but there's also the new aggregate device driver infrastructure that Russell King is working on for the imx driver. Or maybe we need to hand-roll our own
notification scheme.
On a hunch it's probably best if the gfx side registers this device (since it also owns the output state in general) and that the audio side waits until the gfx side has registered everything if it's not there yet. I also haven't though about how the audio side could probe for the right avsink node really ...
Yes, I think doing it first in the gfx side makes sense, too.
- I agree that passing ELD and all the other information through
this new structure makes lot more sense than the current mess we have with passing the ELD through some hardware buffer.
- Finally I think we should assign some identifier to this link
which will get exposed both on the drm side and in alsa, so that userspace can figure out which display connects to which output. With that media player could do the Right Thing and automatically place the audio stream on the right pin in alsa.
Is there something that blocks media players from doing the right thing now?
For HD-Audio, the eld entries under /proc/asound/cardx expose the ELD info and can help user space check whether a monitor is usable on a pin.
The current limitation is that these eld entries cannot be updated while the audio controller is in D3, so we need the new infrastructure to notify the audio driver to update them. But I'm not sure about embedded audio; maybe Takashi would like to share more info.
ELD doesn't contain the serial number from the EDID, so if you have two monitors of the same model userspace can't figure out which audio output is connected to which screen.
Right. And comparing two different sets of data just to know whether A and B are related is a pain.
Thanks for the clarification! Maybe we can add output info (e.g. the display port number) to the eld entries under /proc/asound/cardx. Is that okay?
It's possible, but the proc file is just a help; it can't be the API. For accessing the information, we'll need some new API, or we can expose the information via sysfs on the new device.
And I have a question: how do we ensure the audio/gfx clients find the right peer? On an x86 platform, there can be an integrated GPU and a discrete GPU, so there can be two audio controllers and two GPUs. We need to ensure the audio controller finds the proper GPU, and vice versa. Maybe we need the peer audio/gfx drivers to register with the same identifier (something like a vendor ID) for peering.
Yes, it's an open issue. So far, the binding with vga_switcheroo is done by a rough guess with PCI ID.
Takashi
On 23.1.2014 08:57, Takashi Iwai wrote:
At Thu, 23 Jan 2014 06:35:12 +0000, Lin, Mengdong wrote:
-----Original Message----- From: Takashi Iwai [mailto:tiwai@suse.de] Sent: Thursday, January 23, 2014 1:19 AM To: Daniel Vetter Cc: Lin, Mengdong; Barnes, Jesse; Zanoni, Paulo R; alsa-devel@alsa-project.org; intel-gfx@lists.freedesktop.org; dri-devel Subject: Re: Need your advice: Add a new communication inteface between HD-Audio and Gfx drivers for hotplug notification/ELD update
At Wed, 22 Jan 2014 15:18:21 +0100, Daniel Vetter wrote:
On Wed, Jan 22, 2014 at 12:48:04PM +0000, Lin, Mengdong wrote:
-----Original Message----- From: daniel.vetter@ffwll.ch [mailto:daniel.vetter@ffwll.ch] On Behalf Of Daniel Vetter Sent: Tuesday, January 21, 2014 9:11 PM To: Lin, Mengdong Cc: Takashi Iwai (tiwai@suse.de); Barnes, Jesse; Zanoni, Paulo R; alsa-devel@alsa-project.org; intel-gfx@lists.freedesktop.org Subject: Re: Need your advice: Add a new communication inteface between HD-Audio and Gfx drivers for hotplug notification/ELD update
On Tue, Jan 21, 2014 at 1:35 PM, Lin, Mengdong mengdong.lin@intel.com wrote: [...]
Thanks for your advice, Daniel!
Yeah, I've discussed this at KS with Takashi and we've agreed that we need some common object to facilitate driver interactions. A few things though:
- This should be common infrastructure usable by all alsa and drm drivers, not just i915 and snd-hda. Especially on embedded platforms this issue is fairly rampant ...
Agree. Where should we put this common object? Is it okay to put it under drivers/gpu/drm, similar to vga_switcheroo? Shall we divide clients into audio and gfx categories, and define different ops for them, since different info/requests flow in different directions between audio and gfx?
I guess we could place them into drivers/gpu, yeah. For a name I'd suggest avsink or something like that, to make it clear that it's the combination of audio+video. For the actual interfaces I guess we just need one object in the device model, but the interface should be split into things called from the audio side only, functions for the video driver side only and stuff which can be called from both sides. This matters mostly just so we don't end up with deadlocks, since we need a lock to protect the avsink state itself (e.g. the ELD or the audio_output_connected state).
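A minimal userspace model of such an avsink object might look like the sketch below (the `avsink` name follows Daniel's suggestion, but every identifier here is hypothetical, not an existing kernel API): one mutex protects the shared ELD and connected state, the audio side registers a hotplug callback, and the video side calls a setter on modeset/hotplug.

```c
#include <pthread.h>
#include <stdbool.h>
#include <string.h>

/* Hypothetical shared audio/video sink object (names are illustrative). */
struct avsink {
    pthread_mutex_t lock;          /* protects eld and connected state */
    unsigned char eld[128];
    size_t eld_size;
    bool audio_output_connected;
    /* Audio-side op: registered by the audio driver, called by gfx. */
    void (*hotplug_notify)(struct avsink *sink, bool connected);
};

/* Called from the video driver side on modeset/hotplug. */
static void avsink_set_audio_state(struct avsink *sink, bool connected,
                                   const unsigned char *eld, size_t size)
{
    pthread_mutex_lock(&sink->lock);
    sink->audio_output_connected = connected;
    if (connected && eld && size <= sizeof(sink->eld)) {
        memcpy(sink->eld, eld, size);
        sink->eld_size = size;
    } else {
        sink->eld_size = 0;
    }
    void (*notify)(struct avsink *, bool) = sink->hotplug_notify;
    pthread_mutex_unlock(&sink->lock);
    /* Call back outside the lock, so the audio side may safely query
     * the sink from its notification handler without deadlocking. */
    if (notify)
        notify(sink, connected);
}

/* Called from the audio driver side. */
static bool avsink_query_connected(struct avsink *sink)
{
    pthread_mutex_lock(&sink->lock);
    bool c = sink->audio_output_connected;
    pthread_mutex_unlock(&sink->lock);
    return c;
}
```

Dropping the lock before invoking the callback is one way to honor the "split the interface so we don't end up with deadlocks" point above.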
Thanks for clarification! Maybe we can add output info (eg. display port number) to the eld entries under /proc/asound/cardx. Is it okay?
It's possible, but the proc file is just a help. It can't be the API. For accessing the information, we'll need some new API, or let inform via sysfs of the new device.
I agree here. From my view, the cleanest solution is to create a universal API in the kernel for the shared state / binding information and inter-communication. I believe other drivers may use this API in the future, like the mentioned vga_switcheroo.
Jaroslav
Takashi
_______________________________________________
Alsa-devel mailing list Alsa-devel@alsa-project.org http://mailman.alsa-project.org/mailman/listinfo/alsa-devel
On Thu, Jan 23, 2014 at 8:57 AM, Takashi Iwai tiwai@suse.de wrote:
Thanks for clarification! Maybe we can add output info (eg. display port number) to the eld entries under /proc/asound/cardx. Is it okay?
It's possible, but the proc file is just a help. It can't be the API. For accessing the information, we'll need some new API, or let inform via sysfs of the new device.
Links in sysfs sound like the best approach. drm already has nodes for each connector, so on the gfx side there's a natural endpoint already. sysfs links also avoid any naming issues from the start, e.g. the above DP connector id might lead to clashes with multiple cards.
And I have a question: how to assure the audio/gfx client find its right peer? On a x86 platform, there can be an integrated GPU and an discrete GPU. So there can be two audio controllers and two GPUs. We need to assure audio controller find the proper GPU, and vice versa. Maybe we need the peer audio/gfx to register with a same identifier (something like vendor ID) for peering.
Yes, it's an open issue. So far, the binding with vga_switcheroo is done by a rough guess with PCI ID.
Yeah, I guess we need platform/bus specific hints. E.g. on x86 we could use the pci id of the gfx card + enumeration of all the audio input pins (and just hope that there's no one insane enough to route different codecs to the same gfx card or something like that). On ARM we could just fish device references out of DT. So probably we need some platform specific code to fish out the right avsink from the audio side. -Daniel
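Takashi's "rough guess with PCI ID" and Daniel's x86 hint could be modeled roughly like this (a sketch only; the registry, names, and matching policy are assumptions, not how vga_switcheroo actually binds): each registered avsink is tagged with the PCI location of its gfx device, and the audio side looks up the peer it derived a hint for.

```c
#include <stddef.h>

/* Hypothetical registry entry: each avsink is tagged with the PCI
 * location (bus, devfn) of the gfx device that registered it. */
struct avsink_entry {
    unsigned bus;
    unsigned devfn;
};

/* Audio side: find the avsink whose gfx peer matches the PCI hint
 * the audio driver derived for its controller.  Returns NULL if the
 * expected gfx peer has not registered (yet). */
static struct avsink_entry *avsink_find(struct avsink_entry *table, size_t n,
                                        unsigned bus, unsigned devfn)
{
    for (size_t i = 0; i < n; i++)
        if (table[i].bus == bus && table[i].devfn == devfn)
            return &table[i];
    return NULL;
}
```

A NULL result would feed back into the deferred-probe path discussed earlier, rather than binding to the wrong GPU on a dual-GPU system.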
http://www.displayport.org/cables/driving-multiple-displays-from-a-single-di...
How does the audio driver know which display it is addressing when a single DisplayPort output can drive multiple displays?
On Fri, Jan 24, 2014 at 3:23 AM, Raymond Yau superquad.vortex2@gmail.com wrote:
Generally, there is a back channel between the display hardware and the audio hardware.
Alex
Sorry to pick up this thread after a long time.
Links in sysfs sound like the best approach. drm already has nodes for each connector, so on the gfx side there's a natural endpoint already. sysfs links also avoids any naming issues from the start, e.g. the above DP connector id might lead to clashes with multiple cards.
Hi Daniel,
Is there a 1:1 mapping between these connector nodes and the ports of the Gfx display engine? E.g. for a Haswell Ultrabook, under /sys/devices/pci0000:00/0000:00:02.0/drm/card0/ there are four connector nodes:
card0-DP-1 -> DDI port B
card0-eDP-1 -> DDI port A
card0-HDMI-A-1 -> DDI port C
card0-HDMI-A-2 -> which DDI port?
Haswell-ULT does not support port D, and I think port E is for VGA.
Hi Takashi,
To let user space figure out which audio output is connected to which screen (connector), maybe we can define a new ALSA control for each HDMI/DP PCM device, e.g. numid=x,iface=PCM,name='Screen',device=3. Reading the control would return the name of the DRM connector node, like 'card0-DP-1'. The audio driver can get the connector name from the gfx driver.
For DP 1.2 multi-stream transport, it's not supported by the i915 and HD-A drivers now. But probably there will be sub-nodes for the DP connector node in the future, and an index in their names can be used to distinguish monitors connected to the same DP port, like card0-DP-1.1, card0-DP-1.2, card0-DP-1.3 ... These names can be used by the above ALSA PCM 'Screen' control, so we can still know which audio output goes to which monitor.
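The proposed 'Screen' control could be modeled as below (a sketch of the proposal only; the control does not exist in ALSA today, and every function name here is made up): the gfx driver pushes the connector name to the audio driver, which caches it per pin and simply reports the string when the control is read.

```c
#include <string.h>

#define MAX_PINS 4

/* Per-pin cache of the DRM connector name reported by the gfx driver. */
static char screen_name[MAX_PINS][32];

/* gfx -> audio notification: the connector bound to this pin changed. */
static void audio_set_screen(int pin, const char *connector)
{
    if (pin < 0 || pin >= MAX_PINS)
        return;
    strncpy(screen_name[pin], connector ? connector : "",
            sizeof(screen_name[pin]) - 1);
    screen_name[pin][sizeof(screen_name[pin]) - 1] = '\0';
}

/* Equivalent of the ALSA control's .get callback: report the name. */
static const char *screen_control_read(int pin)
{
    return (pin >= 0 && pin < MAX_PINS) ? screen_name[pin] : "";
}
```

Because the gfx driver updates the cache on every modeset, a DP++ port switching between HDMI and DP monitors would naturally be reflected in the reported name.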
Thanks Mengdong
On Tue, Feb 18, 2014 at 01:58:22PM +0000, Lin, Mengdong wrote:
There's no fixed mapping with the port and the connector name. The number in the connector name is basically just a running number per connector type. However I do believe we do register the connectors in the order of the ports more or less always, so you can *sometimes* deduce the port name from the connector.
I suppose in this example HDMI-A-1 is port B, HDMI-A-2 is port C, and DP-1 can be either port B or port C. DP++ is the reason why we have overlapping DP and HDMI connectors for the same port.
Intel-gfx mailing list Intel-gfx@lists.freedesktop.org http://lists.freedesktop.org/mailman/listinfo/intel-gfx
-----Original Message----- From: Ville Syrjälä [mailto:ville.syrjala@linux.intel.com] Sent: Tuesday, February 18, 2014 10:23 PM To: Lin, Mengdong Cc: Daniel Vetter; Takashi Iwai; alsa-devel@alsa-project.org; Barnes, Jesse; Zanoni, Paulo R; dri-devel; intel-gfx@lists.freedesktop.org Subject: Re: [Intel-gfx] Need your advice: Add a new communication inteface between HD-Audio and Gfx drivers for hotplug notification/ELD update
There's no fixed mapping with the port and the connector name. The number in the connector name is basically just a running number per connector type. However I do believe we do register the connectors in the order of the ports more or less always, so you can *sometimes* deduce the port name from the connector.
I suppose in this example HDMI-A-1 is port B, HDMI-A-2 is port C, and DP-1 can be either port B or port C. DP++ is the reason why we have overlapping DP and HDMI connectors for the same port.
Thanks for clarification, Ville!
Does Haswell support DP++ on ports B/C/D? And will the names of these connector nodes change after system boot, e.g. across S3/S4 cycles or hot-plug?
As long as the connector name is constant, it can help to find out the screen status no matter whether the port works in HDMI or DP mode.
There is a 1:1 mapping between audio output pins and DDI ports. And the new ALSA PCM 'Screen' control for a pin can reflect the connector name in the right mode. E.g. for pin/port B, it can return HDMI-A-1 or DP-1 depending on what kind of monitor is connected.
Thanks Mengdong
On Wed, Feb 19, 2014 at 09:08:39AM +0000, Lin, Mengdong wrote:
Does Haswell support DP++ on port B/C/D?
Yes.
And will the name of these connector node change after system boot, e.g. after S3/S4 cycles or hot-plug?
The connector names can't change, except if you unload+reload the driver. But I suppose DP MST might change this. Or maybe we'll just expose three new fixed DP connectors per DDI port. One for each potential stream.
As long as the connector name is constant, it can help to find out the screen status no matter the port works in HDMI or DP mode.
There is 1:1 mapping between audio output pins and DDI ports.
So I guess we need to expose the port->connector mapping somehow to the audio driver.
How does this work with DP MST? On the display side the audio stuff happens in the transcoder, not the DDI port, AFAICS. For DP MST each transcoder can provide a single stream.
And the new ALSA PCM 'screen' control for a pin can reflect the connector name in right mode. E.g. for pin/port B, it can return HDMI-A-1 or DP-1 depending on what kind of monitor is connected.
-- Ville Syrjälä Intel OTC
-----Original Message----- From: Ville Syrjälä [mailto:ville.syrjala@linux.intel.com] Sent: Wednesday, February 19, 2014 7:30 PM
And will the name of these connector node change after system boot,
e.g. after S3/S4 cycles or hot-plug?
The connector names can't change, except if you unload+reload the driver.
It seems okay. We could let the gfx driver notify the audio driver when the connector nodes are destroyed or recreated. Then the audio driver can always know the connector names, whether they change or not.
But I suppose DP MST might change this. Or maybe we'll just expose three new fixed DP connectors per DDI port. One for each potential stream.
It would be fine.
As long as the connector name is constant, it can help to find out the
screen status no matter the port works in HDMI or DP mode.
There is 1:1 mapping between audio output pins and DDI ports.
So I guess we need to expose the port->connector mapping somehow to the audio driver.
When the gfx driver enables audio, the mapping between transcoder/port/connector is established. The gfx driver can then notify the audio driver of the connector name, to help user space know which screen the audio is output to.
How does this work with DP MST? On the display side the audio stuff happens in the transcoder, not the DDI port, AFAICS. For DP MST each transcoder can provide a single stream.
DP MST is not supported in the audio driver atm, but it could be supported in the future. E.g. for Haswell, the audio driver can learn the device index of a monitor connected to a port through a HW unsolicited event, when the gfx driver establishes the mapping between a transcoder and a monitor and enables audio on that transcoder. Different monitors connected to the same port simultaneously have different device indices. The audio driver will then specify the device index for an audio stream so that it flows to the right transcoder. And we're planning to replace the HW unsolicited event notification with SW notification between the gfx and audio drivers.
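The device-index routing described above could be sketched like so (hypothetical names throughout; the sub-connector naming only follows the card0-DP-1.x scheme proposed earlier in the thread): each monitor behind an MST port is identified by (port, device index), and an audio stream carries the device index so it reaches the right transcoder/monitor.

```c
#include <stddef.h>

/* One monitor behind a DP MST port, identified by (port, device index). */
struct mst_monitor {
    int port;
    int dev_index;
    const char *connector;   /* proposed sub-connector node name */
};

/* Pick the connector an audio stream targets: the stream specifies the
 * device index in addition to the port, so two monitors on the same
 * port are distinguishable. */
static const char *connector_for_stream(const struct mst_monitor *mons,
                                        size_t n, int port, int dev_index)
{
    for (size_t i = 0; i < n; i++)
        if (mons[i].port == port && mons[i].dev_index == dev_index)
            return mons[i].connector;
    return NULL;
}
```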
Thanks Mengdong
participants (8)
- Alex Deucher
- Daniel Vetter
- Jaroslav Kysela
- Lin, Mengdong
- Raymond Yau
- Rob Clark
- Takashi Iwai
- Ville Syrjälä