[alsa-devel] RFC for OMAP4 audio driver
Hardware Background
-------------------
The ABE (Audio Back End) IP on TI's OMAP4 SoC provides digital audio signal processing features (real-time mixing, equalization, etc.) on the media processor. The ABE allows flexible routing of the audio paths, with a dedicated equalizer and acoustics protection for the earphone/headphone, hands-free and vibra paths. Typically the ABE is connected to a companion analog audio chip via a dedicated McPDM (multi-channel pulse density modulation) interface, as shown at
http://dev.omapzoom.org/?p=misael/kernel-audio.git;a=blob;f=sound/soc/codecs... ;
Proposal
--------
On OMAP4, audio codec functionality is spread between the ABE on the OMAP4 and the companion analog audio chip. The interface between the ABE and the Phoenix audio chip is the proprietary McPDM interface. The proposal is to combine both the ABE and Phoenix into a single codec driver. The OMAP4 ABE also provides other links for routing digitally processed data to the McBSP and McASP interfaces, which sit at the same level as the McPDM interface.
               +---------+   +---------------------+
ALSA Library --|ASoC Core|---|OMAP4 Platform Driver|
               +---------+   +---------------------+
                                       |
                                   <DMA-DAI>
                                       |
                       +-----------------------+
                       |ABE--<McPDM>--Phoenix--|--o Speaker/MIC
                       +-----------------------+
The ABE provides a FIFO interface (similar to the McBSP FIFO), and audio data transfer between the audio buffers and the ABE FIFO is handled through the CPU DAI. The McPDM interface between the digital audio block (ABE) and the analog audio chip (Phoenix) is handled internally in the codec driver as part of the codec DAI operations.
This RFC addresses the basic audio use cases using the Phoenix companion chip with the OMAP4 audio IP block. Support and configuration for different codecs (connectivity with BT/FM etc.) over the I2S/PCM interfaces are targeted for future enhancements.
High Level Design
-----------------
The TI OMAP4 audio driver consists of the following files:
sound/soc/omap/omap-pcm.c: This is the platform driver for OMAP audio. It remains the same across all OMAP chip versions.
sound/soc/codecs/abe-twl6030.c: This is the codec driver interface file for OMAP4 audio. It defines the codec DAIs for the HiFi, voice and vibra subdevices, handles the configuration of the Phoenix companion chip over the I2C control interface, and handles the initialization and configuration of the ABE. All codec-related widget controls are also handled in this file; both digital (ABE) and analog (twl6030) widgets will be contained in this same driver.
sound/soc/codecs/abe/*: This folder contains the ABE-specific handlers, i.e. reset, clocks, and configuration of the ABE ports for the different stream inputs/outputs.
sound/soc/omap/omap-abe.c: This is the CPU DAI driver for the ABE.
sound/soc/omap/sdp4430.c: This is the machine driver for SDP4430 and defines the stream connections.
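For concreteness, a machine driver in the ASoC style of this era would wire a CPU DAI to a codec DAI with snd_soc_dai_link entries. The sketch below is hypothetical — the symbol names (omap_abe_dai, abe_twl6030_dai, sdp4430_ops, omap_soc_platform) are invented for illustration and are not taken from the actual patches:

```c
/* Hypothetical sketch of sound/soc/omap/sdp4430.c in the ASoC
 * machine-driver style of the time.  All symbol names are invented
 * for illustration.
 */
static struct snd_soc_dai_link sdp4430_dai[] = {
	{
		.name = "TWL6030",
		.stream_name = "TWL6030 HiFi",
		.cpu_dai = &omap_abe_dai,             /* DMA <-> ABE FIFO side */
		.codec_dai = &abe_twl6030_dai[HIFI],  /* ABE+Phoenix codec side */
		.ops = &sdp4430_ops,
	},
};

static struct snd_soc_card snd_soc_sdp4430 = {
	.name = "SDP4430",
	.platform = &omap_soc_platform,           /* omap-pcm.c */
	.dai_link = sdp4430_dai,
	.num_links = ARRAY_SIZE(sdp4430_dai),
};
```

The machine driver thus only declares which DAIs are connected; the data transfer and codec configuration stay in the platform and codec drivers listed above.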
Questions
---------
How do we handle routing of digital audio from the ABE to external devices like Bluetooth/FM connectivity chips, which are usually connected over McBSP interfaces? In these scenarios we need another DAI between the ABE (platform codec) and an external codec.
ABE (platform codec, digital) -----> Phoenix audio codec
             |
             +--> BT/FM codec
On Tue, Sep 01, 2009 at 09:27:30PM -0500, hari n wrote:
Proposal
On OMAP4, audio codec functionality is spread between ABE of OMAP4 and the companion analog audio chip. The interface between the ABE and Phoenix audio chip is proprietary McPDM interface. The proposal is to combine both ABE and Phoenix into a codec driver. OMAP4 ABE
Combining the two seems like it's asking for trouble further along the line. While it's likely that a lot of board designs will use the reference design combination of devices there will inevitably be some systems that want to do something different for whatever reason. If that affects the connections between the two parts of the system then there will be problems for a unified driver.
Otherwise the rest of your design looks good.
sound/soc/codecs/abe-twl6030.c: This is the codec driver interface file for OMAP4 audio. It defines the codec DAIs for the HIFI, voice and Vibra subdevices. Handles the configuration of Phoenix companion chip using i2c control interface. Handles the initialization and configuration of ABE. All codec related widget controls are also handled in this file. Both, digital (ABE) and analog (twl6030) widgets will be contained in this same driver.
Having the ABE as a CODEC does seem like a sensible approach - indeed, there are a number of audio hub CODECs on the market with a very similar feature set to the ABE.
Questions
How do we handle routing of digital audio from the ABE to external devices like Bluetooth/FM connectivity chips, which are usually connected over McBSP interfaces? In these scenarios we need another DAI between the ABE (platform codec) and an external codec.
ABE (platform codec, digital) -----> Phoenix audio codec
             |
             +--> BT/FM codec
This is the sort of issue I'm talking about above which is helped by explicitly representing all the DAIs in the system. With the DAIs visible it should just become a case of hooking up the links between the devices in the usual ASoC fashion.
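As a hypothetical sketch of what this would look like (all symbol names invented, same caveat as above), the BT path would simply be one more snd_soc_dai_link in the machine driver, pairing a McBSP-facing CPU DAI with a DAI exposed by a stub driver for the BT chip:

```c
/* Hypothetical: an extra link in the SDP4430 machine driver pairing
 * an ABE McBSP-facing DAI with a Bluetooth codec DAI.  All symbol
 * names are invented for illustration.
 */
static struct snd_soc_dai_link sdp4430_bt_dai = {
	.name = "Bluetooth",
	.stream_name = "BT SCO",
	.cpu_dai = &omap_mcbsp_dai[1],  /* McBSP port routed out of the ABE */
	.codec_dai = &bt_sco_dai,       /* stub DAI for the BT chip */
};
```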
Having the ABE as a CODEC does seem like a sensible approach - indeed, there are a number of audio hub CODECs on the market with a very similar feature set to the ABE.
Mark,
I'm not familiar with the concept of an audio hub, but is it something like a codec that can have 'slave' codecs connected to it (in cascade fashion)? Do we have any audio hub CODEC already in ASoC to use as a reference?
Questions
How to handle routing of digital audio from ABE to external devices like Bluetooth/FM connectivity devices which are usually connected using McBSP interfaces?. In these scenarios, we need another DAI between the ABE (platform codec) and an external codec.
ABE(platform Codec (Digital) -----> Phoenix audio codec | | +--> BT/FM codec
This is the sort of issue I'm talking about above which is helped by explicitly representing all the DAIs in the system. With the DAIs visible it should just become a case of hooking up the links between the devices in the usual ASoC fashion.
If the ABE is considered as a separate CODEC, then it should have its own DAIs for the connection with the processor, even if the codec itself resides in the processor. What about the client CODECs? They will also have their DAIs, which should be connected to the physical outputs of the ABE (McPDM, McBSP, ...), and that confuses me. AFAIK, ASoC allows having multiple CODECs attached to the same processor, but not a CODEC connected to the output of another CODEC.
-Misa
On Wed, Sep 02, 2009 at 08:55:16PM -0500, Lopez Cruz, Misael wrote:
I'm not familiar with the concept of an audio hub, but is it something like a codec that can have 'slave' codecs connected to it (like in cascade fashion)?
Not quite. The idea is that in a system like a phone where you have multiple sources and sinks for audio (the CPU, bluetooth, GSM, onboard mics and speaker...) an audio hub provides a central place to route and mix all the audio signals. This can be purely analogue, purely digital or a mix of both - generally for the full thing you need a mix of both. The OMAP4 ABE is an example of this in the digital domain.
The idea is that it becomes possible to do things like run the audio subsystem while the rest of the system is powered down (e.g. during a call) and that the CPU doesn't need to spend time on basic audio tasks.
Do we have any audio hub CODEC already in ASoC to use as a reference?
The OpenMoko phones are probably the most obvious example of this in mainline - they have GSM connected to the CODEC via a line input and bluetooth via a second DAI.
If the ABE is considered as a separate CODEC, then it should have its own DAIs for the connection with the processor, even if the codec itself resides in the processor. What about the client CODECs?
I believe this will make life easier with the current design. Looking at the system diagrams, the CPU core relates to the ABE as though it were an external device, and the proposed design with a unified ABE/TWL6030 driver is doing pretty much that. All I'm really saying here is that the links between the OMAP4 and the TWL6030 should be explicitly represented, so that if a board design hooks things up differently for some reason then the drivers can cope.
They will also have their DAIs, which should be connected to the physical outputs of the ABE (McPDM, McBSP, ...), and that confuses me. AFAIK, ASoC allows having multiple CODECs attached to the same processor, but not a CODEC connected to the output of another CODEC.
At the minute it's a bit tricky, but it's definitely something that ought to be supported - you may remember the refactoring that was done to unify the CODEC and CPU DAI structures; this sort of use case was one of the motivators for that. Where there are problems, the core isn't a fixed thing - we can change it if required.
Very interesting thread. My 2 cents: I find the notion of codec confusing. In most products from Wolfson and others, you have a digital/mixing part and a second one for digital/analog conversions. OMAP4 and others have the digital part on the application processor, but the partition is still the same, only implemented on two chips instead of one. It shouldn't really matter software-wise where the digital part is done, as long as you can express the connection between these two parts. A logical split would avoid having to describe a codec connected to another codec: you would only have a digital part and an analog part. With a CPU/digital/analog split, you could address all topologies without developers incurring any risk of brain damage.

-Pierre
On Thu, Sep 03, 2009 at 08:56:30AM -0500, pl bossart wrote:
My 2 cents: I find the notion of codec confusing. In most products from Wolfson and others, you have a digital/mixing part, and a second one for digital/analog conversions. OMAP4 and others have a digital part on the application processor, but the partition is still the same, only implemented on two chips instead of one. It shouldn't really matter software-wise where the digital part is done, as long as you can express the connection between these two parts. A logical split would avoid having a description of a codec
In this case "codec" is just what ASoC (for historical reasons) calls any block that isn't a DAI or DMA controller on the CPU. It doesn't actually need to be a CODEC. In many cases even the ADCs and DACs are integrated with the CPU, which means that package boundaries aren't always that helpful either.
participants (4)
- hari n
- Lopez Cruz, Misael
- Mark Brown
- pl bossart