Re: [alsa-devel] Reg: Implementation of GSM digital audio path
On Thursday 25 June 2009 14:11:04 ext shariff wrote:
Hi Peter,
Thanks for the quick response. I understood somewhat from your mail. I will explain how my hardware is connected: USB GSM chip -> McBSP3, McBSP2 -> TWL4030. Here, playback functionality will not be there in the GSM path; I need to take audio data from GSM and redirect it to the TWL4030 codec.
I see. Sorry, I have confused things a bit. In this case you need to do what Mark has suggested: a dummy DAI for the codec, connected to McBSP3 as if it were a normal codec. Then you will have two PCMs on the sound card:
0.0: TWL digital audio
0.1: Your codec
arecord -Dplughw:0,1 | aplay -Dplughw:0,0
or
arecord -Dhw:0,1 | aplay -Dhw:0,0
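Just to illustrate, a bare-bones dummy codec driver could look something like the sketch below. This is untested, the names are made up, 8 kHz mono S16_LE is only a guess for the voice format, and it uses today's ASoC component API rather than the 2009-era registration calls; the machine driver's dai_link then ties this DAI to McBSP3.

/*
 * Sketch of a dummy codec driver for the GSM chip.  All names are
 * illustrative; the direction is capture-only towards the SoC, as
 * described above.
 */
#include <linux/module.h>
#include <linux/platform_device.h>
#include <sound/soc.h>

static struct snd_soc_dai_driver gsm_dai = {
	.name = "gsm-voice",
	.capture = {
		.stream_name	= "GSM Capture",
		.channels_min	= 1,
		.channels_max	= 1,
		.rates		= SNDRV_PCM_RATE_8000,
		.formats	= SNDRV_PCM_FMTBIT_S16_LE,
	},
	/* no .playback: audio only flows from the GSM chip to the SoC here */
};

static const struct snd_soc_component_driver gsm_component = {
	.name = "gsm-dummy",
};

static int gsm_probe(struct platform_device *pdev)
{
	/* the machine driver's dai_link connects this DAI to McBSP3 */
	return devm_snd_soc_register_component(&pdev->dev, &gsm_component,
					       &gsm_dai, 1);
}

static struct platform_driver gsm_driver = {
	.driver = { .name = "gsm-dummy" },
	.probe  = gsm_probe,
};
module_platform_driver(gsm_driver);
MODULE_LICENSE("GPL");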
The thing is, how and where do I implement this GSM path without playback? I am new to this ASoC framework.
What I understood from your mail is: audio is being redirected from application space, i.e. arecord will record and store into a buffer, and this buffer is then played using aplay to the other hardware (twl4030). Can we handle the same thing in kernel space (i.e. in the driver itself)?
You have to copy from one buffer to another anyway, to be on the safe side (in theory you could use the same buffer for recording and playback, but it should be avoided). So you have to copy the samples from the capture buffer to the playback buffer. It really does not matter if it is done inside the kernel or in user space; this kind of 'routing' clearly belongs to user space.
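If you want it all in one place, the arecord | aplay pipe can of course also be a small alsa-lib program doing the same capture-to-playback copy. Again only a sketch: the device names (plughw:0,1 for the GSM capture, plughw:0,0 for TWL4030 playback) and the 8 kHz mono S16_LE format are assumptions, not details from this thread.

/* Copy samples from the GSM capture PCM to the TWL4030 playback PCM. */
#include <stdio.h>
#include <alsa/asoundlib.h>

int main(void)
{
	snd_pcm_t *cap, *play;
	short buf[160];			/* 20 ms of 8 kHz mono audio */
	snd_pcm_sframes_t n, w;

	if (snd_pcm_open(&cap, "plughw:0,1", SND_PCM_STREAM_CAPTURE, 0) < 0 ||
	    snd_pcm_open(&play, "plughw:0,0", SND_PCM_STREAM_PLAYBACK, 0) < 0) {
		fprintf(stderr, "cannot open PCM devices\n");
		return 1;
	}

	/* identical parameters on both ends so no conversion is needed */
	snd_pcm_set_params(cap, SND_PCM_FORMAT_S16_LE,
			   SND_PCM_ACCESS_RW_INTERLEAVED, 1, 8000, 1, 500000);
	snd_pcm_set_params(play, SND_PCM_FORMAT_S16_LE,
			   SND_PCM_ACCESS_RW_INTERLEAVED, 1, 8000, 1, 500000);

	for (;;) {
		n = snd_pcm_readi(cap, buf, 160);
		if (n < 0)
			n = snd_pcm_recover(cap, n, 0);
		if (n <= 0)
			continue;
		/* copy the capture buffer to the playback buffer */
		w = snd_pcm_writei(play, buf, n);
		if (w < 0)
			snd_pcm_recover(play, w, 0);
	}
	return 0;
}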
Regards, Shariff
Hi Peter,
Thanks for your suggestions.
I have one more question:
In GSM mode I also have to take audio data from the TWL4030 MIC input and redirect it to GSM.
From your explanation, if
0.0: TWL digital audio
0.1: My GSM codec
then GSM to TWL4030 is: arecord -Dplughw:0,1 | aplay -Dplughw:0,0
Can I specify, for TWL4030 to GSM: arecord -Dplughw:0,0 | aplay -Dplughw:0,1? Is this the correct thing? Can I do both things at the same time?
What if I want to record a GSM call? I would have to record both sides of the conversation and combine them into one file, since here the two buffers are independent.
How can this be done?
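To make it concrete, this is roughly what I imagine (just a sketch; the device names, the 8 kHz mono S16_LE format, the plain saturated-addition mixing and the blocking single-threaded reads are all my assumptions, and real code would need poll() or separate threads):

/* Capture both directions of the call and mix them into one raw file. */
#include <stdio.h>
#include <alsa/asoundlib.h>

#define FRAMES 160			/* 20 ms at 8 kHz */

static snd_pcm_t *open_capture(const char *dev)
{
	snd_pcm_t *pcm;

	if (snd_pcm_open(&pcm, dev, SND_PCM_STREAM_CAPTURE, 0) < 0)
		return NULL;
	snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE,
			   SND_PCM_ACCESS_RW_INTERLEAVED, 1, 8000, 1, 500000);
	return pcm;
}

int main(void)
{
	snd_pcm_t *gsm = open_capture("plughw:0,1");	/* far end  */
	snd_pcm_t *mic = open_capture("plughw:0,0");	/* near end */
	FILE *out = fopen("call.raw", "wb");
	short a[FRAMES], b[FRAMES], mix[FRAMES];
	int i;

	if (!gsm || !mic || !out)
		return 1;

	for (;;) {
		if (snd_pcm_readi(gsm, a, FRAMES) != FRAMES ||
		    snd_pcm_readi(mic, b, FRAMES) != FRAMES)
			continue;	/* real code would recover from xruns */
		for (i = 0; i < FRAMES; i++) {
			int s = a[i] + b[i];	/* mix with clipping */
			mix[i] = s > 32767 ? 32767 : (s < -32768 ? -32768 : s);
		}
		fwrite(mix, sizeof(short), FRAMES, out);
	}
	return 0;
}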
Thanks & Regards, Shariff