[alsa-devel] Connecting two Codecs to the same CPU DAI
Hi Mark, Liam,
I'm trying to find a sane solution for systems that have multiple Codecs connected to the same CPU DAI. An example would be a codec that receives the same I2S stream as an S/PDIF transceiver, because both are in fact connected to the very same bus in hardware.
Currently, I'm setting up two DAI links in such cases, and userspace has to open both subdevices and issue hw_params calls to each of them in order to set up the hardware correctly.
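Roughly, the machine driver side of that workaround looks like the sketch below; all device, DAI and platform names there are placeholders I made up for illustration, not the real board's values.

#include <sound/soc.h>

/* Minimal sketch of the current two-link workaround. */
static struct snd_soc_dai_link board_dai_links[] = {
	{
		/* link #0: the analog codec */
		.name		= "Codec",
		.stream_name	= "Codec Playback",
		.cpu_dai_name	= "davinci-mcasp.0",
		.codec_name	= "tlv320aic3x-codec.1-001b",
		.codec_dai_name	= "tlv320aic3x-hifi",
		.platform_name	= "davinci-pcm-audio",
	},
	{
		/* link #1: the S/PDIF transmitter on the very same I2S lines */
		.name		= "S/PDIF",
		.stream_name	= "S/PDIF Playback",
		.cpu_dai_name	= "davinci-mcasp.0",
		.codec_name	= "spdif-dit",
		.codec_dai_name	= "dit-hifi",
		.platform_name	= "davinci-pcm-audio",
	},
};

static struct snd_soc_card board_card = {
	.name		= "two-codecs-one-dai",
	.dai_link	= board_dai_links,
	.num_links	= ARRAY_SIZE(board_dai_links),
};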
This is cumbersome of course, and I wonder how the core could be augmented in order to handle such cases correctly. Ideally, it should only expose one PCM interface and handle a list of Codecs in the back.
Or am I unaware of an existing feature?
Thanks, Daniel
On Tue, Jul 16, 2013 at 04:31:01PM +0200, Daniel Mack wrote:
This is cumbersome of course, and I wonder how the core could be augmented in order to handle such cases correctly. Ideally, it should only expose one PCM interface and handle a list of Codecs in the back.
Or am I unaware of an existing feature?
No, no feature here. Off the top of my head I'd suggest soc-pcm plus a virtual DAI for the second link.
On 16.07.2013 18:20, Mark Brown wrote:
On Tue, Jul 16, 2013 at 04:31:01PM +0200, Daniel Mack wrote:
This is cumbersome of course, and I wonder how the core could be augmented in order to handle such cases correctly. Ideally, it should only expose one PCM interface and handle a list of Codecs in the back.
Or am I unaware of an existing feature?
No, no feature here. Off the top of my head I'd suggest soc-pcm plus a virtual DAI for the second link.
That would still require userspace to open and configure both subdevices, right?
Do you think it's a good idea to teach DAI links to support more than one codec? At a glance, it seems quite possible: the modes and rates would be limited to the intersection of those supported by each codec, and the PCM callbacks would be relayed to all codecs in the list.
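To make that a bit more concrete, a hypothetical sketch of the relay part is below. The rtd->num_codecs and rtd->codec_dais[] members do not exist today and are pure assumptions about what such a runtime could look like; the rate/format intersection could presumably be computed at open time, much like soc-pcm already merges CPU and codec DAI capabilities.

#include <sound/soc.h>

/*
 * Hypothetical relay of hw_params to every codec DAI on a link.
 * num_codecs and codec_dais[] are assumed extensions, not current API.
 */
static int multi_codec_hw_params(struct snd_pcm_substream *substream,
				 struct snd_pcm_hw_params *params)
{
	struct snd_soc_pcm_runtime *rtd = substream->private_data;
	int i, ret;

	for (i = 0; i < rtd->num_codecs; i++) {
		struct snd_soc_dai *codec_dai = rtd->codec_dais[i];
		const struct snd_soc_dai_ops *ops = codec_dai->driver->ops;

		if (ops && ops->hw_params) {
			ret = ops->hw_params(substream, params, codec_dai);
			if (ret < 0)
				return ret;
		}
	}

	return 0;
}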
Thanks, Daniel
On Tue, Jul 16, 2013 at 06:33:39PM +0200, Daniel Mack wrote:
On 16.07.2013 18:20, Mark Brown wrote:
No, no feature here. Off the top of my head I'd suggest soc-pcm plus a virtual DAI for the second link.
That would still require userspace to open and configure both subdevices, right?
Shouldn't.
Do you think it's a good idea to teach DAI links to support more than one codec? At a glance, it seems quite possible: the modes and rates would be limited to the intersection of those supported by each codec, and the PCM callbacks would be relayed to all codecs in the list.
In principle yes, but we want better DAPM integration I think - having an unused device on the link probably ought not to have any impact on other devices, for example.
On 16.07.2013 18:57, Mark Brown wrote:
On Tue, Jul 16, 2013 at 06:33:39PM +0200, Daniel Mack wrote:
On 16.07.2013 18:20, Mark Brown wrote:
No, no feature here. Off the top of my head I'd suggest soc-pcm plus a virtual DAI for the second link.
That would still require userspace to open and configure both subdevices, right?
Shouldn't.
How so? As each link exposes its own PCM subdevice, how would the codec of link #1 get a hw_params call if an application just opens PCM subdevice #0 for playback? This is the case I want to address.
Do you think it's a good idea to teach DAI links to support more than one codec? At a glance, it seems quite possible: the modes and rates would be limited to the intersection of those supported by each codec, and the PCM callbacks would be relayed to all codecs in the list.
In principle yes, but we want better DAPM integration I think - having an unused device on the link probably ought not to have any impact on other devices, for example.
Well, if a link references two codecs, then both would be used in parallel, and neither of them is unused. But maybe I don't get your point :)
My naive approach atm would be to just add .codec_names, .codec_dai_names and .codec_of_node to snd_soc_dai_link, and make their use mutually exclusive to the existing ones. That way, we don't need to touch any existing user.
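Purely as a sketch, such a machine driver entry might then look like this; none of the plural members below (or the num_codecs count I added for the example) exist in mainline, and the device names are again placeholders.

#include <sound/soc.h>

/* Sketch of the proposed multi-codec fields; nothing here exists in mainline yet. */
static const char * const link_codec_names[] = {
	"tlv320aic3x-codec.1-001b",
	"spdif-dit",
};

static const char * const link_codec_dai_names[] = {
	"tlv320aic3x-hifi",
	"dit-hifi",
};

static struct snd_soc_dai_link board_dai_link = {
	.name			= "Multi",
	.stream_name		= "Playback",
	.cpu_dai_name		= "davinci-mcasp.0",
	.platform_name		= "davinci-pcm-audio",
	/* proposed fields, mutually exclusive with .codec_name/.codec_dai_name */
	.codec_names		= link_codec_names,
	.codec_dai_names	= link_codec_dai_names,
	.num_codecs		= ARRAY_SIZE(link_codec_names),
};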
Daniel
On Tue, Jul 16, 2013 at 07:07:33PM +0200, Daniel Mack wrote:
On 16.07.2013 18:57, Mark Brown wrote:
No, no feature here. Off the top of my head I'd suggest soc-pcm plus a virtual DAI for the second link.
That would still require userspace to open and configure both subdevices, right?
Shouldn't.
How so? As each link exposes its own PCM subdevice, how would the codec of link #1 get a hw_params call if an application just opens PCM subdevice #0 for playback? This is the case I want to address.
Half the point of soc-pcm is to decouple the streams seen by the application layer from the DAIs that the driver sees so I'd expect us to be able to map a single application layer/memory stream onto multiple CODEC DAIs. One front end connected to two back ends.
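For illustration only, the dai_link half of such a split might look roughly like the sketch below. The virtual front-end DAI, all names and the exact set of DPCM flags are assumptions, and the DAPM routes that actually tie the front-end stream to the two back ends are not shown.

#include <sound/soc.h>

/*
 * Rough DPCM-style sketch: one front end exposed to userspace, two back
 * ends for the codec and the S/PDIF transmitter.  Names are placeholders
 * and the required flags/routes may well differ in practice.
 */
static struct snd_soc_dai_link dpcm_links[] = {
	{
		/* front end: the only PCM userspace sees */
		.name		= "Media",
		.stream_name	= "Media Playback",
		.cpu_dai_name	= "media-fe-dai",	/* virtual DAI, assumption */
		.platform_name	= "davinci-pcm-audio",
		.codec_name	= "snd-soc-dummy",
		.codec_dai_name	= "snd-soc-dummy-dai",
		.dynamic	= 1,
	},
	{
		/* back end #1: the physical I2S link to the codec */
		.name		= "Codec BE",
		.cpu_dai_name	= "davinci-mcasp.0",
		.codec_name	= "tlv320aic3x-codec.1-001b",
		.codec_dai_name	= "tlv320aic3x-hifi",
		.no_pcm		= 1,
	},
	{
		/* back end #2: the S/PDIF transmitter on the same bus */
		.name		= "S/PDIF BE",
		.cpu_dai_name	= "davinci-mcasp.0",
		.codec_name	= "spdif-dit",
		.codec_dai_name	= "dit-hifi",
		.no_pcm		= 1,
	},
};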
In principle yes, but we want better DAPM integration I think - having an unused device on the link probably ought not to have any impact on other devices, for example.
Well, if a link references two codecs, then both would be used in parallel, and neither of them is unused. But maybe I don't get your point :)
If one of them has no active outputs or inputs then it shouldn't do anything.
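For example, with routes along the lines of the sketch below (all widget and control names invented), DAPM would only power the S/PDIF side when its output path is actually switched on.

#include <sound/soc.h>

/*
 * Illustrative DAPM routes only; widget and control names are made up.
 * If "S/PDIF Out" has no active path (the switch is off), DAPM should
 * keep that codec idle even though it sits on the shared link.
 */
static const struct snd_soc_dapm_route board_routes[] = {
	{ "Headphone Jack", NULL, "HPLOUT" },
	{ "Headphone Jack", NULL, "HPROUT" },
	{ "S/PDIF Out", "S/PDIF Switch", "Playback" },
};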
My naive approach atm would be to just add .codec_names, .codec_dai_names and .codec_of_node to snd_soc_dai_link, and make their use mutually exclusive to the existing ones. That way, we don't need to touch any existing user.
May as well bite the bullet and do the refactoring, it should be automatable. There's also the fun of handling what happens when you want to stream from one CODEC to another but I think that's second stage...
On 16.07.2013 19:47, Mark Brown wrote:
On Tue, Jul 16, 2013 at 07:07:33PM +0200, Daniel Mack wrote:
On 16.07.2013 18:57, Mark Brown wrote:
No, no feature here. Off the top of my head I'd suggest soc-pcm plus a virtual DAI for the second link.
That would still require userspace to open and configure both subdevices, right?
Shouldn't.
How so? As each link exposes its own PCM subdevice, how would the codec of link #1 get a hw_params call if an application just opens PCM subdevice #0 for playback? This is the case I want to address.
Half the point of soc-pcm is to decouple the streams seen by the application layer from the DAIs that the driver sees so I'd expect us to be able to map a single application layer/memory stream onto multiple CODEC DAIs. One front end connected to two back ends.
You're talking about DPCM, right? Unfortunately, there is no active user of this in mainline, and no documentation either. I'll try and find out how this is supposed to work first ...
Maybe Liam has some explanation ready?
My naive approach atm would be to just add .codec_names, .codec_dai_names and .codec_of_node to snd_soc_dai_link, and make their use mutually exclusive to the existing ones. That way, we don't need to touch any existing user.
May as well bite the bullet and do the refactoring, it should be automatable. There's also the fun of handling what happens when you want to stream from one CODEC to another but I think that's second stage...
Looking deeper into this, I'm not sure whether each of those codec instances inside one snd_soc_dai_link should have its own runtime, or if they should rather share one. The latter isn't straightforward, as current users simply access the codec from their callbacks as rtd->codec_dai. However, open-coding routines to access the correct codec from these callbacks is not nice at all either. Hmm.
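For reference, this is the single-codec assumption baked into today's callbacks, in a simplified form; the sysclk id and rate below are just placeholders.

#include <sound/soc.h>

/*
 * Simplified example of how machine driver callbacks assume exactly one
 * codec per runtime today; the clock id and rate are placeholders.
 */
static int board_hw_params(struct snd_pcm_substream *substream,
			   struct snd_pcm_hw_params *params)
{
	struct snd_soc_pcm_runtime *rtd = substream->private_data;
	struct snd_soc_dai *codec_dai = rtd->codec_dai;	/* single codec only */

	/* with several codecs per runtime there would be a list to walk here */
	return snd_soc_dai_set_sysclk(codec_dai, 0, 24576000, SND_SOC_CLOCK_IN);
}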
Let's see if DPCM can help here.
Daniel
On Wed, Jul 17, 2013 at 11:16:22AM +0200, Daniel Mack wrote:
On 16.07.2013 19:47, Mark Brown wrote:
Half the point of soc-pcm is to decouple the streams seen by the application layer from the DAIs that the driver sees so I'd expect us to be able to map a single application layer/memory stream onto multiple CODEC DAIs. One front end connected to two back ends.
You're talking about DPCM, right? Unfortunately, there is no active user of this in mainline, and no documentation either. I'll try and find out how this is supposed to work first ...
Yes, this is a constant complaint. Hopefully the OMAP4 code will be upstreamed at some point, but as you will be aware there are issues there.
Maybe Liam has some explanation ready?
The only thing I'm aware of is the out of tree OMAP4 and Qualcomm code neither of which I'd hold my breath over.
May as well bite the bullet and do the refactoring, it should be automatable. There's also the fun of handling what happens when you want to stream from one CODEC to another but I think that's second stage...
Looking deeper into this, I'm not sure whether each of those codec instances inside one snd_soc_dai_link should have its own runtime, or if they should rather share one. The latter isn't straightforward, as current users simply access the codec from their callbacks as rtd->codec_dai. However, open-coding routines to access the correct codec from these callbacks is not nice at all either. Hmm.
Yes, it's not an easy refactoring. But it's probably one that'll be needed for Slimbus at some point.
Participants (2):
- Daniel Mack
- Mark Brown