On Tue, 3 Feb 2015 16:47:48 +0000 Mark Brown <broonie@kernel.org> wrote:
On Sat, Jan 24, 2015 at 08:30:27AM +0100, Jean-Francois Moine wrote:
Mark Brown <broonie@kernel.org> wrote:
On Fri, Jan 23, 2015 at 07:34:56PM +0100, Jean-Francois Moine wrote:
The simple card builder, 'dt-card' (maybe a better name would have been 'graph-card'), acts just like the simple-card except that it does not appear in the DT. It is created by an audio controller.
Which audio controller? There may be several CPU side audio interfaces in the same card. For example people often want to have both low latency and high latency audio paths from the CPU into the hardware (low latency tends to increase power burn). SoC centric system designs do sometimes also have PDM I/O, expecting to be directly connected to DMICs and so on, which results in a relatively large number of CPU interfaces.
Which audio controller creates the card depends on the complexity of the card. When there are many controllers, it is up to the designer either to define a master audio controller or to instantiate a 'card' device in the DT to do the job.
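For example, an explicit card instantiated from the DT could use today's simple-audio-card binding, roughly like this (the controller and CODEC labels are made up for illustration):

	sound {
		compatible = "simple-audio-card";
		simple-audio-card,format = "i2s";

		simple-audio-card,cpu {
			sound-dai = <&audio_ctrl>;
		};

		simple-audio-card,codec {
			sound-dai = <&ext_codec>;
		};
	};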
So how does the simple controller interact with a more complex one given that it's somehow picking some controller node to start from?
One way to solve this problem would be to create only one card builder. Its creation could be explicit (triggered by the first active audio controller) or implicit (done by the audio subsystem when the first controller or CODEC is created).
Then, the card builder could scan the whole DT looking for the audio ports and create one or more cards according to the graph connectivity.
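As a rough sketch with the standard of-graph port/endpoint binding (the node labels are only illustrative), the builder would follow the remote-endpoint links to discover the DAI links:

	audio_ctrl: audio-controller {
		port {
			cpu_ep: endpoint {
				remote-endpoint = <&codec_ep>;
			};
		};
	};

	ext_codec: codec {
		port {
			codec_ep: endpoint {
				remote-endpoint = <&cpu_ep>;
			};
		};
	};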
Well, forget about this. I never clearly understood why some widgets and routes had to be defined at card level.
Please do try to understand the idea of representing simple components on the board and analogue interconnects between devices - it's really important and not something that can be neglected.
The problem is that this understanding would remain abstract: I have no such hardware. Anyway, if the representation can be done with the simple-card, it can also be done with a graph of ports.
If you have a device with any sort of speaker or microphone, or any sort of external connector for interfacing with an external device like a headphone jack, then you have something that could be a widget.
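For instance, with the current simple-card binding such a jack and a microphone end up declared at the card level, roughly like this (the widget and CODEC pin names are made up for illustration):

	simple-audio-card,widgets =
		"Headphone", "Headphone Jack",
		"Microphone", "Mic Jack";
	simple-audio-card,routing =
		"Headphone Jack", "HP_OUT",
		"MIC_IN", "Mic Jack";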
I know what widgets and routes are; I was just wondering why they (especially the widgets) need to appear at the card level instead of just being declared in the DAIs (from the platform or the DT). The same question may also be raised about the audio formats, clocks, TDM settings...
That DT binding was done entirely in the context of video applications IIRC, this is the first time it's been discussed in this context.
http://mailman.alsa-project.org/pipermail/alsa-devel/2014-January/070622.htm...
http://mailman.alsa-project.org/pipermail/alsa-devel/2015-January/086273.htm...
So there's been some in passing mentions, not really serious discussion though...
I may come back to the card builder, but Russell's idea of declaring the tda998x audio parameters on a port declared in a graph of ports seems fine to me. This declaration should be compatible with the use of the simple-card.
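Just to make the idea concrete, a purely hypothetical sketch (the audio property and endpoint labels are invented for illustration, they are not an existing binding):

	hdmi: hdmi-transmitter@70 {
		compatible = "nxp,tda998x";
		reg = <0x70>;

		port {
			tda998x_audio_ep: endpoint {
				remote-endpoint = <&audio_cpu_ep>;
				/* hypothetical: audio parameters attached to the endpoint */
				format = "i2s";
			};
		};
	};

The simple-card (or a card builder) could then pick up these parameters when it follows the graph.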