[alsa-devel] [PATCH 0/3] ASoC: Enable a new IC master mode: bcm2835<=>IC<=>cs42xx8
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default, unpatched SoC and codec drivers will return -EINVAL, as they are not enabled and tested for this mode.
soc-dai.h gains the new SND_SOC_DAIFMT_IBM_IFM definition:
#define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode, and so too is the BCM2835 SoC driver. This forms a chain, bcm2835<=>IC<=>cs42xx8, where the IC is bit and frame master.
Matt Flax (3):
  ASoC: Add an IC bit and frame master mode (SoC and Codec slave).
  ASoC: cs42xx8: allow IC master mode.
  ASoC: bcm2835: Add multichannel mode in DSP and IC master modes.

 include/sound/soc-dai.h     |  9 ++++++---
 sound/soc/bcm/bcm2835-i2s.c | 21 ++++++++++++++++++++-
 sound/soc/codecs/cs42xx8.c  |  1 +
 3 files changed, 27 insertions(+), 4 deletions(-)
This patch adds a new clock mode to the SoC system. The new mode allows for an IC (between the SoC and the Codec) to be bit and frame master.
As SoC and codec drivers default to returning -EINVAL, this new mode cannot set either of them as master.
Signed-off-by: Matt Flax <flatmax@flatmax.org>
---
 include/sound/soc-dai.h | 9 ++++++---
 1 file changed, 6 insertions(+), 3 deletions(-)

diff --git a/include/sound/soc-dai.h b/include/sound/soc-dai.h
index 200e1f0..22f516f 100644
--- a/include/sound/soc-dai.h
+++ b/include/sound/soc-dai.h
@@ -77,14 +77,17 @@ struct snd_compr_stream;
 /*
  * DAI hardware clock masters.
  *
- * This is wrt the codec, the inverse is true for the interface
- * i.e. if the codec is clk and FRM master then the interface is
- * clk and frame slave.
+ * SND_SOC_DAIFMT_C* definitions are wrt the codec, the inverse is true
+ * for the interface. i.e. if the codec is clk and FRM master then the
+ * interface is clk and frame slave.
+ * The SND_SOC_DAIFMT_IBM_IFM indicates that an IC inbetween the codec and SoC
+ * is master. In SND_SOC_DAIFMT_IBM_IFM mode both the codec and SoC are slaves.
  */
 #define SND_SOC_DAIFMT_CBM_CFM		(1 << 12) /* codec clk & FRM master */
 #define SND_SOC_DAIFMT_CBS_CFM		(2 << 12) /* codec clk slave & FRM master */
 #define SND_SOC_DAIFMT_CBM_CFS		(3 << 12) /* codec clk master & frame slave */
 #define SND_SOC_DAIFMT_CBS_CFS		(4 << 12) /* codec clk & FRM slave */
+#define SND_SOC_DAIFMT_IBM_IFM		(5 << 12) /* IC clk & FRM master */
 
 #define SND_SOC_DAIFMT_FORMAT_MASK	0x000f
 #define SND_SOC_DAIFMT_CLOCK_MASK	0x00f0
On Sat, Feb 25, 2017 at 04:03:12PM +1100, Matt Flax wrote:
This patch adds a new clock mode to the SoC system. The new mode allows for an IC (between the SoC and the Codec) to be bit and frame master.
As SoC and codec drivers default to return -EINVAL, this new mode can not set either as master.
I think I'd expect to see this device represented as a CODEC with two CODEC-CODEC links rather than transparently as here.
This patch allows the cs42xx8 to be put into slave mode when an IC (between this codec and the SoC) is master.
It sets slave mode by treating SND_SOC_DAIFMT_IBM_IFM in the same way it treats SND_SOC_DAIFMT_CBS_CFS.
Signed-off-by: Matt Flax <flatmax@flatmax.org>
---
 sound/soc/codecs/cs42xx8.c | 1 +
 1 file changed, 1 insertion(+)

diff --git a/sound/soc/codecs/cs42xx8.c b/sound/soc/codecs/cs42xx8.c
index c1785bd..34f9adb 100644
--- a/sound/soc/codecs/cs42xx8.c
+++ b/sound/soc/codecs/cs42xx8.c
@@ -235,6 +235,7 @@ static int cs42xx8_set_dai_fmt(struct snd_soc_dai *codec_dai,
 	/* Set master/slave audio interface */
 	switch (format & SND_SOC_DAIFMT_MASTER_MASK) {
 	case SND_SOC_DAIFMT_CBS_CFS:
+	case SND_SOC_DAIFMT_IBM_IFM:
 		cs42xx8->slave_mode = true;
 		break;
 	case SND_SOC_DAIFMT_CBM_CFM:
This patch adds multichannel capability when in DSP mode A and an IC (between the SoC and codec) is master.
In bcm2835_i2s_startup, snd_pcm_hw_constraint_single is used to set channels to 8 if both SND_SOC_DAIFMT_IBM_IFM and SND_SOC_DAIFMT_DSP_A are set. Otherwise, channels are set to 2. These settings are accomplished using the SNDRV_PCM_HW_PARAM_CHANNELS variable.
This patch protects against DSP mode misuse by failing if either the SoC or codec is master: if SND_SOC_DAIFMT_DSP_A is chosen but SND_SOC_DAIFMT_IBM_IFM mode is not, -EINVAL is returned.
In bcm2835_i2s_shutdown the channels are set to 2 by default.
In bcm2835_i2s_hw_params, DSP mode A format is now an option. Before replicating the format variable (from ch2 to ch1) for register loading, the requested channel count is checked to be either 2 or 8. This can be expanded later to accommodate other channel counts if supported by the sound card hardware.
It has been tested to work with both a regular stereo sound card and an 8-channel sound card.
Signed-off-by: Matt Flax <flatmax@flatmax.org>
---
 sound/soc/bcm/bcm2835-i2s.c | 21 ++++++++++++++++++++-
 1 file changed, 20 insertions(+), 1 deletion(-)

diff --git a/sound/soc/bcm/bcm2835-i2s.c b/sound/soc/bcm/bcm2835-i2s.c
index 6ba2049..dbfecb3 100644
--- a/sound/soc/bcm/bcm2835-i2s.c
+++ b/sound/soc/bcm/bcm2835-i2s.c
@@ -296,6 +296,7 @@ static int bcm2835_i2s_hw_params(struct snd_pcm_substream *substream,
 
 	switch (dev->fmt & SND_SOC_DAIFMT_FORMAT_MASK) {
 	case SND_SOC_DAIFMT_I2S:
+	case SND_SOC_DAIFMT_DSP_A:
 		data_delay = 1;
 		break;
 	default:
@@ -312,6 +313,7 @@ static int bcm2835_i2s_hw_params(struct snd_pcm_substream *substream,
 
 	switch (params_channels(params)) {
 	case 2:
+	case 8:
 		format = BCM2835_I2S_CH1(format) | BCM2835_I2S_CH2(format);
 		format |= BCM2835_I2S_CH1(BCM2835_I2S_CHPOS(ch1pos));
 		format |= BCM2835_I2S_CH2(BCM2835_I2S_CHPOS(ch2pos));
@@ -526,7 +528,20 @@ static int bcm2835_i2s_startup(struct snd_pcm_substream *substream,
 	regmap_update_bits(dev->i2s_regmap, BCM2835_I2S_CS_A_REG,
 			BCM2835_I2S_STBY, BCM2835_I2S_STBY);
 
-	return 0;
+	/* Only allow 2 channels, unless in DSP mode where an IC (between
+	 * the SoC and codec) is master.
+	 */
+	if ((dev->fmt & SND_SOC_DAIFMT_FORMAT_MASK)
+			== SND_SOC_DAIFMT_DSP_A)
+		if ((dev->fmt & SND_SOC_DAIFMT_MASTER_MASK)
+				!= SND_SOC_DAIFMT_IBM_IFM)
+			return -EINVAL;
+		else
+			return snd_pcm_hw_constraint_single(substream->runtime,
+					SNDRV_PCM_HW_PARAM_CHANNELS, 8);
+	else
+		return snd_pcm_hw_constraint_single(substream->runtime,
+				SNDRV_PCM_HW_PARAM_CHANNELS, 2);
 }
 
 static void bcm2835_i2s_shutdown(struct snd_pcm_substream *substream,
@@ -549,6 +564,10 @@ static void bcm2835_i2s_shutdown(struct snd_pcm_substream *substream,
 	 * not stop the clock when SND_SOC_DAIFMT_CONT
 	 */
 	bcm2835_i2s_stop_clock(dev);
+
+	/* Default to 2 channels */
+	snd_pcm_hw_constraint_single(substream->runtime,
+			SNDRV_PCM_HW_PARAM_CHANNELS, 2);
 }
 
 static const struct snd_soc_dai_ops bcm2835_i2s_dai_ops = {
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dai.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
so long,
Hias
Matt Flax (3):
  ASoC: Add an IC bit and frame master mode (SoC and Codec slave).
  ASoC: cs42xx8: allow IC master mode.
  ASoC: bcm2835: Add multichannel mode in DSP and IC master modes.

 include/sound/soc-dai.h     |  9 ++++++---
 sound/soc/bcm/bcm2835-i2s.c | 21 ++++++++++++++++++++-
 sound/soc/codecs/cs42xx8.c  |  1 +
 3 files changed, 27 insertions(+), 4 deletions(-)
-- 2.7.4
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dai.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
My last patch used the two-codec approach; however, it was pointed out that the bcm2835 was run in DSP mode with a codec master (rather than IC master) and that the patch doesn't work. That is clearly true and a problem: it can only work with an intermediate non-codec master.
I think you summed it up well with your statement:
On 25/02/17 Matthias Reichl wrote: If the clock timing adheres to DSP mode A timing and you add code to the CPU DAI driver so it can operate in DSP mode A then that should also work. If not, it's broken.
This patch set fixes the problem of a daisy chain of three possible masters (CPU <=> IC <=> codec) where only the IC can be master. In fact, when retrofitting DSP mode to old silicon, the CPU can specify which of the three can be masters and there is no chance that someone can fire the system up with the wrong master (which we know produces bit offset and random channel swapping when a codec is master).
thanks Matt
On Sun, Feb 26, 2017 at 09:13:09AM +1100, Matt Flax wrote:
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dai.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
My last patch, used the two codec approach, however it was pointed out that the bcm2835 was run in DSP mode with a codec master (rather then IC master) and that the patch doesn't work. Which is clearly true and a problem, it can only work with an intermediate non-codec master.
I think you summed it up well with your statement :
On 25/02/17 Matthias Reichl wrote: If the clock timing adheres to DSP mode A timing and you add code to the the CPU DAI driver so it can operate in DSP mode A then that should also work. If not, it's broken.
Your bcm2835 patch doesn't configure the bcm2835 for DSP mode A; it's still set up for I2S (slave) mode. You are just adding code to pretend it's running in DSP mode A. Don't do that, it's wrong.
This patch set fixes the problem of a daisy chain of three possible masters (CPU <=> IC <=> codec) where only the IC can be master. In fact, when retro fitting DSP mode to old silicon, the CPU can specify which of the three can be masters and there is no chance that someone can fire the system up with the wrong master (which we know produces bit offset and random channel swapping when a codec is master).
Please follow the advice I gave you about 3 weeks ago and model your setup properly.
| So you have bcm2835 I2S <-> FPGA <-> codec - IOW a standard codec<->codec
| link.
|
| What you seem to be missing is just a method to transfer your 8-channel
| data via a 2-channel link - userspace want's to see an 8-channel PCM,
| but the hardware link (bcm2835-i2s) is only 2-channel.
|
| And that's where IMO as userspace plugin looks like a very good solution.
| It's basically the counterpart of your FPGA and contains the code that's
| neccessary to encapsulate/pack/whatever the 8-channel data into a 2-channel
| stream so it can then be unpacked to 8-channel by the FPGA.
|
| If you go this route your hardware and machine driver will work with
| other I2S codecs as well, and IMO that's a far better solution than
| adding very ugly hacks to a single I2S driver.
If you add an active hardware component (your "IC"/FPGA) you also have to model that in software.
If that component is acting as a clock master it probably has some method to set up clocks. Even if you don't have that, e.g. if you are running at some fixed rate, you'll have to store that information somewhere.
The place to do that is in a codec driver. In your setup it'll look like this:
That "IC" codec has 2 DAIs and operates as a clock master on both. You link one DAI in I2S mode to the bcm2835 and the other DAI in DSP (or whatever mode you are using) to the cs42xx8.
If you model it this way you no longer work against ALSA and you can stop adding hacks to existing drivers.
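As a purely illustrative sketch of that layout (every name below is a placeholder invented for illustration, and the channel counts and rates are only example values, not something from the actual patches), the "IC"/FPGA codec driver would register two DAIs, one facing the bcm2835 in I2S mode and one facing the cs42xx8 in DSP mode, with the IC acting as clock master on both links:

#include <sound/pcm.h>
#include <sound/soc.h>

/* Hypothetical "IC" codec: two DAIs, both driven by the IC's own clocks. */
static struct snd_soc_dai_driver ic_dai_drv[] = {
	{
		.name = "ic-soc-link",		/* links to bcm2835-i2s, I2S mode */
		.playback = {
			.stream_name	= "IC SoC Playback",
			.channels_min	= 2,
			.channels_max	= 2,
			.rates		= SNDRV_PCM_RATE_8000_192000,
			.formats	= SNDRV_PCM_FMTBIT_S32_LE,
		},
		.capture = {
			.stream_name	= "IC SoC Capture",
			.channels_min	= 2,
			.channels_max	= 2,
			.rates		= SNDRV_PCM_RATE_8000_192000,
			.formats	= SNDRV_PCM_FMTBIT_S32_LE,
		},
	},
	{
		.name = "ic-codec-link",	/* links to cs42xx8, DSP mode */
		.playback = {
			.stream_name	= "IC Codec Playback",
			.channels_min	= 8,
			.channels_max	= 8,
			.rates		= SNDRV_PCM_RATE_48000,
			.formats	= SNDRV_PCM_FMTBIT_S32_LE,
		},
		.capture = {
			.stream_name	= "IC Codec Capture",
			.channels_min	= 6,
			.channels_max	= 6,
			.rates		= SNDRV_PCM_RATE_48000,
			.formats	= SNDRV_PCM_FMTBIT_S32_LE,
		},
	},
};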
so long,
Hias
On 27/02/17 01:49, Matthias Reichl wrote:
On Sun, Feb 26, 2017 at 09:13:09AM +1100, Matt Flax wrote:
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dai.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
My last patch, used the two codec approach, however it was pointed out that the bcm2835 was run in DSP mode with a codec master (rather then IC master) and that the patch doesn't work. Which is clearly true and a problem, it can only work with an intermediate non-codec master.
I think you summed it up well with your statement :
On 25/02/17 Matthias Reichl wrote: If the clock timing adheres to DSP mode A timing and you add code to the the CPU DAI driver so it can operate in DSP mode A then that should also work. If not, it's broken.
Your bcm2835 patch doesn't configure the bcm2835 to DSP mode A, it's still setup for I2S (slave) mode. You are just adding code to pretend it's running in DSP mode A. Don't do that, it's wrong.
All configuration is done in a machine driver, which I haven't submitted. The machine driver's dai_fmt is as follows:
.dai_fmt = SND_SOC_DAIFMT_IBM_IFM|SND_SOC_DAIFMT_DSP_A|SND_SOC_DAIFMT_NB_NF,
This patch set fixes the problem of a daisy chain of three possible masters (CPU <=> IC <=> codec) where only the IC can be master. In fact, when retro fitting DSP mode to old silicon, the CPU can specify which of the three can be masters and there is no chance that someone can fire the system up with the wrong master (which we know produces bit offset and random channel swapping when a codec is master).
Please follow the advice I gave you about 3 weeks ago and model your setup properly.
| So you have bcm2835 I2S <-> FPGA <-> codec - IOW a standard codec<->codec | link. | | What you seem to be missing is just a method to transfer your 8-channel | data via a 2-channel link - userspace want's to see an 8-channel PCM, | but the hardware link (bcm2835-i2s) is only 2-channel. | | And that's where IMO as userspace plugin looks like a very good solution. | It's basically the counterpart of your FPGA and contains the code that's | neccessary to encapsulate/pack/whatever the 8-channel data into a 2-channel | stream so it can then be unpacked to 8-channel by the FPGA. | | If you go this route your hardware and machine driver will work with | other I2S codecs as well, and IMO that's a far better solution than | adding very ugly hacks to a single I2S driver.
If you add an active hardware component (your "IC"/FPGA) you also have to model that in software.
If that component is acting as a clock master it probably has some method to setup clocks. Even if you don't have that, eg if you are running at some fixed rate you'll have to store that information somewhere.
Clocks are set up in the machine driver as well.
The place to do that is in a codec driver. In your setup it'll look like this:
That "IC" codec has 2 DAIs and operates as a clock master on both. You link one DAI in I2S mode to the bcm2835 and the other DAI in DSP (or whatever mode you are using) to the cs42xx8.
If you model it this way you no longer work against ALSA and you can stop adding hacks to existing drivers.
Thanks, this is actually very constructive advice. Let me try to understand what you are suggesting here... I am confused about how to set the bit depth correctly for the codec when I play to the IC chip in this setup... let's walk through this one...
I implement a new codec which matches this IC. The IC can be set up to be master in its dai fmt. The sound card shows up with two devices; only the IC device can be used to play and record.
Say I use aplay to play to the first device (the IC chip): it does hw_params and clocks are set. But the second device (the codec) never gets hw_params executed? If I select 16 or 24 bits, it never knows... is that right? If so, how do we solve this problem?
thanks Matt
On Mon, Feb 27, 2017 at 07:21:13AM +1100, Matt Flax wrote:
On 27/02/17 01:49, Matthias Reichl wrote:
On Sun, Feb 26, 2017 at 09:13:09AM +1100, Matt Flax wrote:
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dai.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
My last patch, used the two codec approach, however it was pointed out that the bcm2835 was run in DSP mode with a codec master (rather then IC master) and that the patch doesn't work. Which is clearly true and a problem, it can only work with an intermediate non-codec master.
I think you summed it up well with your statement :
On 25/02/17 Matthias Reichl wrote: If the clock timing adheres to DSP mode A timing and you add code to the the CPU DAI driver so it can operate in DSP mode A then that should also work. If not, it's broken.
Your bcm2835 patch doesn't configure the bcm2835 to DSP mode A, it's still setup for I2S (slave) mode. You are just adding code to pretend it's running in DSP mode A. Don't do that, it's wrong.
All configuration is done in a machine driver, which I haven't submitted. The machine driver's dai_fmt is as follows :
.dai_fmt = SND_SOC_DAIFMT_IBM_IFM|SND_SOC_DAIFMT_DSP_A|SND_SOC_DAIFMT_NB_NF,
This patch set fixes the problem of a daisy chain of three possible masters (CPU <=> IC <=> codec) where only the IC can be master. In fact, when retro fitting DSP mode to old silicon, the CPU can specify which of the three can be masters and there is no chance that someone can fire the system up with the wrong master (which we know produces bit offset and random channel swapping when a codec is master).
Please follow the advice I gave you about 3 weeks ago and model your setup properly.
| So you have bcm2835 I2S <-> FPGA <-> codec - IOW a standard codec<->codec | link. | | What you seem to be missing is just a method to transfer your 8-channel | data via a 2-channel link - userspace want's to see an 8-channel PCM, | but the hardware link (bcm2835-i2s) is only 2-channel. | | And that's where IMO as userspace plugin looks like a very good solution. | It's basically the counterpart of your FPGA and contains the code that's | neccessary to encapsulate/pack/whatever the 8-channel data into a 2-channel | stream so it can then be unpacked to 8-channel by the FPGA. | | If you go this route your hardware and machine driver will work with | other I2S codecs as well, and IMO that's a far better solution than | adding very ugly hacks to a single I2S driver.
If you add an active hardware component (your "IC"/FPGA) you also have to model that in software.
If that component is acting as a clock master it probably has some method to setup clocks. Even if you don't have that, eg if you are running at some fixed rate you'll have to store that information somewhere.
Clocks are set up in the machine driver as well.
The place to do that is in a codec driver. In your setup it'll look like this:
That "IC" codec has 2 DAIs and operates as a clock master on both. You link one DAI in I2S mode to the bcm2835 and the other DAI in DSP (or whatever mode you are using) to the cs42xx8.
If you model it this way you no longer work against ALSA and you can stop adding hacks to existing drivers.
Thanks, this is actually very constructive advice. Let me try to understand what you are suggesting here... I am confused about how to set bit depth correctly for the codec when I play to the IC chip in this setup... lets walk me through this one...
I implement a new codec which matches this IC. The IC can be setup to be master in its dai fmt. The sound card shows up with two devices, only the IC device can be used to play and record.
Actually you don't get 2 PCMs from that in userspace; you are only telling ALSA that your stream is routed through another codec in between.
Say I use aplay to play to the first device (the IC chip) it does hw_params and clocks are set. But the second device (the codec) never gets hw_params executed ? If I select 16 or 24 bits, it never knows ... is that right ? If so, how do we solve this problem ?
Have a look at the end of Documentation/sound/alsa/soc/DPCM.txt where codec<->codec links are explained.
You can setup the stream parameters for the "IC"<->"real codec" link with snd_soc_dai_link.params.
You can do that, for example, in your machine driver's hw_params: when hw_params is called, e.g. with 2 channels at 192kHz, just set the dai link parameters of the codec link to 8 channels at 48kHz.
Maybe using dynamic pcm (which offers a be_hw_params_fixup hook) is the more appropriate way to do that, but I can't tell for sure as I never used it and am not familiar with it.
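To make the above concrete, here is a rough sketch of how a machine driver could describe the two links, with snd_soc_dai_link.params fixing the parameters of the codec<->codec back end. All device/DAI names are placeholders carried over from the hypothetical "IC" codec sketch earlier, and the fixed 8-channel/48kHz figures are just the example values mentioned above:

#include <sound/pcm.h>
#include <sound/soc.h>

/* Fixed stream parameters for the "IC" <-> cs42xx8 codec-codec link. */
static const struct snd_soc_pcm_stream ic_codec_link_params = {
	.formats	= SNDRV_PCM_FMTBIT_S32_LE,
	.rate_min	= 48000,
	.rate_max	= 48000,
	.channels_min	= 8,
	.channels_max	= 8,
};

static struct snd_soc_dai_link my_card_dai_links[] = {
	{	/* bcm2835-i2s <-> "IC": plain I2S, the IC is clock master */
		.name		= "bcm2835-ic",
		.stream_name	= "bcm2835-ic",
		.cpu_dai_name	= "bcm2835-i2s",
		.codec_dai_name	= "ic-soc-link",
		.dai_fmt	= SND_SOC_DAIFMT_I2S | SND_SOC_DAIFMT_NB_NF |
				  SND_SOC_DAIFMT_CBM_CFM,
	},
	{	/* codec <-> codec: "IC" <-> cs42xx8, fixed 8ch/48kHz stream */
		.name		= "ic-cs42xx8",
		.stream_name	= "ic-cs42xx8",
		.cpu_dai_name	= "ic-codec-link",
		.codec_dai_name	= "cs42xx8",
		.dai_fmt	= SND_SOC_DAIFMT_DSP_A | SND_SOC_DAIFMT_NB_NF |
				  SND_SOC_DAIFMT_CBM_CFM,
		.params		= &ic_codec_link_params,
	},
};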
so long,
Hias
On 27/02/17 09:16, Matthias Reichl wrote:
On Mon, Feb 27, 2017 at 07:21:13AM +1100, Matt Flax wrote:
On 27/02/17 01:49, Matthias Reichl wrote:
On Sun, Feb 26, 2017 at 09:13:09AM +1100, Matt Flax wrote:
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dai.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
My last patch, used the two codec approach, however it was pointed out that the bcm2835 was run in DSP mode with a codec master (rather then IC master) and that the patch doesn't work. Which is clearly true and a problem, it can only work with an intermediate non-codec master.
I think you summed it up well with your statement :
On 25/02/17 Matthias Reichl wrote: If the clock timing adheres to DSP mode A timing and you add code to the the CPU DAI driver so it can operate in DSP mode A then that should also work. If not, it's broken.
Your bcm2835 patch doesn't configure the bcm2835 to DSP mode A, it's still setup for I2S (slave) mode. You are just adding code to pretend it's running in DSP mode A. Don't do that, it's wrong.
All configuration is done in a machine driver, which I haven't submitted. The machine driver's dai_fmt is as follows :
.dai_fmt = SND_SOC_DAIFMT_IBM_IFM|SND_SOC_DAIFMT_DSP_A|SND_SOC_DAIFMT_NB_NF,
This patch set fixes the problem of a daisy chain of three possible masters (CPU <=> IC <=> codec) where only the IC can be master. In fact, when retro fitting DSP mode to old silicon, the CPU can specify which of the three can be masters and there is no chance that someone can fire the system up with the wrong master (which we know produces bit offset and random channel swapping when a codec is master).
Please follow the advice I gave you about 3 weeks ago and model your setup properly.
| So you have bcm2835 I2S <-> FPGA <-> codec - IOW a standard codec<->codec | link. | | What you seem to be missing is just a method to transfer your 8-channel | data via a 2-channel link - userspace want's to see an 8-channel PCM, | but the hardware link (bcm2835-i2s) is only 2-channel. | | And that's where IMO as userspace plugin looks like a very good solution. | It's basically the counterpart of your FPGA and contains the code that's | neccessary to encapsulate/pack/whatever the 8-channel data into a 2-channel | stream so it can then be unpacked to 8-channel by the FPGA. | | If you go this route your hardware and machine driver will work with | other I2S codecs as well, and IMO that's a far better solution than | adding very ugly hacks to a single I2S driver.
If you add an active hardware component (your "IC"/FPGA) you also have to model that in software.
If that component is acting as a clock master it probably has some method to setup clocks. Even if you don't have that, eg if you are running at some fixed rate you'll have to store that information somewhere.
Clocks are set up in the machine driver as well.
The place to do that is in a codec driver. In your setup it'll look like this:
That "IC" codec has 2 DAIs and operates as a clock master on both. You link one DAI in I2S mode to the bcm2835 and the other DAI in DSP (or whatever mode you are using) to the cs42xx8.
If you model it this way you no longer work against ALSA and you can stop adding hacks to existing drivers.
Thanks, this is actually very constructive advice. Let me try to understand what you are suggesting here... I am confused about how to set bit depth correctly for the codec when I play to the IC chip in this setup... lets walk me through this one...
I implement a new codec which matches this IC. The IC can be setup to be master in its dai fmt. The sound card shows up with two devices, only the IC device can be used to play and record.
Actually you don't get 2 PCMs from that in userspace, you are only telling ALSA that your stream is routed through another codec inbetween.
Say I use aplay to play to the first device (the IC chip) it does hw_params and clocks are set. But the second device (the codec) never gets hw_params executed ? If I select 16 or 24 bits, it never knows ... is that right ? If so, how do we solve this problem ?
Have a look at the end of Documentation/sound/alsa/soc/DPCM.txt where codec<->codec links are explained.
You can setup the stream parameters for the "IC"<->"real codec" link with snd_soc_dai_link.params.
You can do that for example in your machine driver in hwparams: when hw_params is called eg with 2 channels 192kHz just set the dai link parameters of the codec to 8 channels 48kHz.
Maybe using dynamic pcm (which offers a be_hw_params_fixup hook) is the more appropriate way to do that, but I can't tell for sure as I never used it and am not familiar with it.
Thanks for this information, I will give this a go - it seems a better way now than introducing an IC master into the mix!
Do you agree that we still need the BCM2835 SoC CPU driver to be expanded to handle DSP mode and multichannel setups? I can't see any other way to enable the multichannel hardware setup without altering bcm2835_i2s.c to handle DSP mode and more than 2 channels in its _hw_params.
thanks Matt
On Mon, Feb 27, 2017 at 09:35:14AM +1100, Matt Flax wrote:
On 27/02/17 09:16, Matthias Reichl wrote:
On Mon, Feb 27, 2017 at 07:21:13AM +1100, Matt Flax wrote:
On 27/02/17 01:49, Matthias Reichl wrote:
On Sun, Feb 26, 2017 at 09:13:09AM +1100, Matt Flax wrote:
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dai.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
My last patch, used the two codec approach, however it was pointed out that the bcm2835 was run in DSP mode with a codec master (rather then IC master) and that the patch doesn't work. Which is clearly true and a problem, it can only work with an intermediate non-codec master.
I think you summed it up well with your statement :
On 25/02/17 Matthias Reichl wrote: If the clock timing adheres to DSP mode A timing and you add code to the the CPU DAI driver so it can operate in DSP mode A then that should also work. If not, it's broken.
Your bcm2835 patch doesn't configure the bcm2835 to DSP mode A, it's still setup for I2S (slave) mode. You are just adding code to pretend it's running in DSP mode A. Don't do that, it's wrong.
All configuration is done in a machine driver, which I haven't submitted. The machine driver's dai_fmt is as follows :
.dai_fmt = SND_SOC_DAIFMT_IBM_IFM|SND_SOC_DAIFMT_DSP_A|SND_SOC_DAIFMT_NB_NF,
This patch set fixes the problem of a daisy chain of three possible masters (CPU <=> IC <=> codec) where only the IC can be master. In fact, when retro fitting DSP mode to old silicon, the CPU can specify which of the three can be masters and there is no chance that someone can fire the system up with the wrong master (which we know produces bit offset and random channel swapping when a codec is master).
Please follow the advice I gave you about 3 weeks ago and model your setup properly.
| So you have bcm2835 I2S <-> FPGA <-> codec - IOW a standard codec<->codec | link. | | What you seem to be missing is just a method to transfer your 8-channel | data via a 2-channel link - userspace want's to see an 8-channel PCM, | but the hardware link (bcm2835-i2s) is only 2-channel. | | And that's where IMO as userspace plugin looks like a very good solution. | It's basically the counterpart of your FPGA and contains the code that's | neccessary to encapsulate/pack/whatever the 8-channel data into a 2-channel | stream so it can then be unpacked to 8-channel by the FPGA. | | If you go this route your hardware and machine driver will work with | other I2S codecs as well, and IMO that's a far better solution than | adding very ugly hacks to a single I2S driver.
If you add an active hardware component (your "IC"/FPGA) you also have to model that in software.
If that component is acting as a clock master it probably has some method to setup clocks. Even if you don't have that, eg if you are running at some fixed rate you'll have to store that information somewhere.
Clocks are set up in the machine driver as well.
The place to do that is in a codec driver. In your setup it'll look like this:
That "IC" codec has 2 DAIs and operates as a clock master on both. You link one DAI in I2S mode to the bcm2835 and the other DAI in DSP (or whatever mode you are using) to the cs42xx8.
If you model it this way you no longer work against ALSA and you can stop adding hacks to existing drivers.
Thanks, this is actually very constructive advice. Let me try to understand what you are suggesting here... I am confused about how to set bit depth correctly for the codec when I play to the IC chip in this setup... lets walk me through this one...
I implement a new codec which matches this IC. The IC can be setup to be master in its dai fmt. The sound card shows up with two devices, only the IC device can be used to play and record.
Actually you don't get 2 PCMs from that in userspace, you are only telling ALSA that your stream is routed through another codec inbetween.
Say I use aplay to play to the first device (the IC chip) it does hw_params and clocks are set. But the second device (the codec) never gets hw_params executed ? If I select 16 or 24 bits, it never knows ... is that right ? If so, how do we solve this problem ?
Have a look at the end of Documentation/sound/alsa/soc/DPCM.txt where codec<->codec links are explained.
You can setup the stream parameters for the "IC"<->"real codec" link with snd_soc_dai_link.params.
You can do that for example in your machine driver in hwparams: when hw_params is called eg with 2 channels 192kHz just set the dai link parameters of the codec to 8 channels 48kHz.
Maybe using dynamic pcm (which offers a be_hw_params_fixup hook) is the more appropriate way to do that, but I can't tell for sure as I never used it and am not familiar with it.
Thanks for this information, I will give this a go - seems a better way now then introducing an IC master into the mix !
Do you agree that we still need the BCM2835 SoC CPU driver to be expanded to handle DSP mode and multichannel setups ? I can't see any other way to enable the multichannel hardware setup without altering bcm2835_i2s.c to handle DSP mode and more then 2 channels in its _hw_params.
No, DSP mode and multichannel don't work on the BCM2835. Keep the driver as it is.
Use the ALSA plugin approach to tunnel multichannel data over a 2-channel PCM and you don't need to modify existing drivers.
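For illustration only: assuming a userspace I/O plugin has been written and installed (here given the made-up name "fpga_pack8", i.e. libasound_module_pcm_fpga_pack8.so) which packs/unpacks 8-channel audio into the 2-channel stream the FPGA expects, an ~/.asoundrc entry along these lines would expose it to applications as an ordinary multichannel PCM:

# "fpga_pack8" is a hypothetical external plugin name, not an existing one.
pcm.multi8 {
	type fpga_pack8		# external plugin doing the 8ch <-> 2ch packing
	slave.pcm "hw:0,0"	# the real 2-channel bcm2835-i2s device
}

Applications could then simply use e.g. "aplay -D multi8 eight_channel.wav".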
so long,
Hias
thanks Matt
On 27/02/17 19:04, Matthias Reichl wrote:
On Mon, Feb 27, 2017 at 09:35:14AM +1100, Matt Flax wrote:
On 27/02/17 09:16, Matthias Reichl wrote:
Have a look at the end of Documentation/sound/alsa/soc/DPCM.txt where codec<->codec links are explained.
You can setup the stream parameters for the "IC"<->"real codec" link with snd_soc_dai_link.params.
You can do that for example in your machine driver in hwparams: when hw_params is called eg with 2 channels 192kHz just set the dai link parameters of the codec to 8 channels 48kHz.
Maybe using dynamic pcm (which offers a be_hw_params_fixup hook) is the more appropriate way to do that, but I can't tell for sure as I never used it and am not familiar with it.
Thanks for this information, I will give this a go - seems a better way now then introducing an IC master into the mix !
Do you agree that we still need the BCM2835 SoC CPU driver to be expanded to handle DSP mode and multichannel setups ? I can't see any other way to enable the multichannel hardware setup without altering bcm2835_i2s.c to handle DSP mode and more then 2 channels in its _hw_params.
No, DSP mode and multichannel don't work on the BCM2835. Keep the driver as it is.
Use the alsa plugin approach to tunnel multichannel data over a 2-channel PCM and you don't need to modify existing drivers.
Thanks again for this suggestion. I have been playing around with userspace plugins lately; they are great. I plan to release some C++ classes to help trivialise the creation of such plugins.
However, for a product - a piece of robustly functioning hardware - I am inclined to take the route of introducing a DSP mode into the bcm2835_i2s.c driver.
From the previous patches, it is clear that people are willing to allow multichannel (DSP mode) into the bcm2835 CPU driver. It improves the usefulness of hardware which uses the BCM2835-derived chipset. From my point of view that is a major simplification in terms of releasing functioning multichannel hardware. I would imagine from the manufacturer's point of view that is a big bonus too!
Again, I intend to implement your concept of the snd_soc_dai_link for my machine driver; that is a fantastic lead you have given me... when I have questions at implementation time I will copy you in to seek more guidance.
thanks again, Matt
On Mon, Feb 27, 2017 at 09:08:14PM +1100, Matt Flax wrote:
On 27/02/17 19:04, Matthias Reichl wrote:
On Mon, Feb 27, 2017 at 09:35:14AM +1100, Matt Flax wrote:
On 27/02/17 09:16, Matthias Reichl wrote:
Have a look at the end of Documentation/sound/alsa/soc/DPCM.txt where codec<->codec links are explained.
You can setup the stream parameters for the "IC"<->"real codec" link with snd_soc_dai_link.params.
You can do that for example in your machine driver in hwparams: when hw_params is called eg with 2 channels 192kHz just set the dai link parameters of the codec to 8 channels 48kHz.
Maybe using dynamic pcm (which offers a be_hw_params_fixup hook) is the more appropriate way to do that, but I can't tell for sure as I never used it and am not familiar with it.
Thanks for this information, I will give this a go - seems a better way now then introducing an IC master into the mix !
Do you agree that we still need the BCM2835 SoC CPU driver to be expanded to handle DSP mode and multichannel setups ? I can't see any other way to enable the multichannel hardware setup without altering bcm2835_i2s.c to handle DSP mode and more then 2 channels in its _hw_params.
No, DSP mode and multichannel don't work on the BCM2835. Keep the driver as it is.
Use the alsa plugin approach to tunnel multichannel data over a 2-channel PCM and you don't need to modify existing drivers.
Thanks again for this suggestion. I have been playing around with user space plugins lately, they are great. I plan to release some C++ classes to help trivialise the creation of such plugins.
However for a product, a piece of robustly functioning hardware, I am inclined to take the route of introducing a DSP mode into the bcm2835_i2s.c driver.
You can add DSP mode once you have found which undocumented registers inside the bcm2835 are needed to enable DSP mode, set channel 3-8 positions, etc. I'm not sure they exist, so there's little point in doing that - as we both agreed, clocking the bcm2835 in DSP mode (while it's set up as an I2S slave) doesn't work.
From the previous patches, it is clear that people are willing to allow multichannel (DSP mode) into the bcm2835 CPU driver.
I don't see that. Some people might want to be able to use multichannel on RPi, but that's something very different from adding code that just lies about driver capabilities.
It improves the usefulness of hardware which use the BCM2835 derived chipset. From my point of view that is a major simplification in terms of releasing functioning multichannel hardware. I would imagine from the manufacturer's point of view that is a big bonus too !
Again I intend to implement your concept of the snd_soc_dai_link for my machine driver, that is a fantasic lead you have given me ... when I have questions @ implementation time I will copy you in seeking for more guidance.
thanks again, Matt
On 27/02/17 21:30, Matthias Reichl wrote:
I don't see that. Some people might want to be able to use multichannel on RPi, but that's something very different than adding code that just lies about driver capabilites.
I am going to invite you over for dinner to eat steamed vegetables :) If I make the steamed vegetables in the microwave and they taste like steamed vegetables, are they steamed vegetables ?
I have a bcm2835 (Pi 2 and 3) SoC here. It is producing multichannel (8 out, 6 in) audio. In ALSA we call that DSP mode - right ?!
If a chipset can do multichannel, do we need to invent a new DSP mode name to call it dsp mode ? Surely not, because steamed vegetables are steamed vegetables :)
Just because steamed vegetables are made in the microwave doesn't mean they don't belong in the Linux Kernel !
Matt
On Mon, Feb 27, 2017 at 10:21:25PM +1100, Matt Flax wrote:
On 27/02/17 21:30, Matthias Reichl wrote:
I don't see that. Some people might want to be able to use multichannel on RPi, but that's something very different than adding code that just lies about driver capabilites.
I am going to invite you over for dinner to eat steamed vegetables :) If I make the steamed vegetables in the microwave and they taste like steamed vegetables, are they steamed vegetables ?
If you connect an amplifier with a volume control to your soundcard does that mean that your soundcard now has a volume control? No, your amp has.
I have a bcm2835 (Pi 2 and 3) SoC here. It is producing multichannel (8 out, 6 in) audio. In ALSA we call that DSP mode - right ?!
No. DSP modes are protocol/timing specifications as I2S, PDP, S/PDIF, ... You can look these up in datasheets and if a chip implements such a protocol you can be sure that it adheres to that standard - i.e. it will sync the frames to the pulses on LRclk.
If a chipset can do multichannel, do we need to invent a new DSP mode name to call it dsp mode ? Surely not, because steamed vegetables are steamed vegetables :)
You can do multichannel audio via IEC958 AC3/DTS passthrough as well. This doesn't mean that each driver has to implement that as a separate mode.
In fact they are still operating in 2-channel PCM mode, only the data packed into the stream has to be interpreted differently. This even worked with standard CD players; you could have DTS on CD. Of course if you connected a standard analog amp to it you'd only get noise. But an amp with a digital S/PDIF input and an AC3/DTS decoder could decode the multichannel audio stream and output it to its 5.1 speakers.
so long,
Hias
On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote:
On Mon, Feb 27, 2017 at 10:21:25PM +1100, Matt Flax wrote:
On 27/02/17 21:30, Matthias Reichl wrote:
I don't see that. Some people might want to be able to use multichannel on RPi, but that's something very different than adding code that just lies about driver capabilites.
I am going to invite you over for dinner to eat steamed vegetables :) If I make the steamed vegetables in the microwave and they taste like steamed vegetables, are they steamed vegetables ?
If you connect an amplifier with a volume control to your soundcard does that mean that your soundcard now has a volume control? No, your amp has.
I have a bcm2835 (Pi 2 and 3) SoC here. It is producing multichannel (8 out, 6 in) audio. In ALSA we call that DSP mode - right ?!
No. DSP modes are protocol/timing specifications as I2S, PDP, S/PDIF, ... You can look these up in datasheets and if a chip implements such a protocol you can be sure that it adheres to that standard - i.e. it will sync the frames to the pulses on LRclk.
I agree with the thoughts in this thread - really, if the AP doesn't actually support DSP A mode we shouldn't add DSP A mode.
Thanks, Charles
On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote:
On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote:
I have a bcm2835 (Pi 2 and 3) SoC here. It is producing multichannel (8 out, 6 in) audio. In ALSA we call that DSP mode - right ?!
No. DSP modes are protocol/timing specifications as I2S, PDP, S/PDIF, ... You can look these up in datasheets and if a chip implements such a protocol you can be sure that it adheres to that standard - i.e. it will sync the frames to the pulses on LRclk.
I agree with the thoughts in this thread really if the AP doesn't actually support DSP A mode we shouldn't add DSP A mode.
The trouble here is that this isn't 100% clear, the specifications of the DSP modes are such that only one edge in the LRCLK matters and so providing you're doing mono or have exact clocking they interoperate perfectly well. We already frequently do similar things the other way, most of the programmable serial ports can't actually do I2S modes properly and rely on exact clocking to get things right when operating as I2S since they only sync on one edge (though they can generally generate the clocks correctly when operating as master, they just don't pay attention to the left/right switch edge).
That said unless we're doing something with the data layout or similar configuration there's a fairly strong case for putting the mangling for this in the core, something like just falling back to I2S mode if we set DSP A and so on. Which would be a lot nicer if we actually got round to putting mode capability information in the drivers.
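As a very rough illustration of the kind of fallback being suggested (shown at the driver level purely for readability; this is not existing kernel code and the function name is made up), a set_fmt implementation could simply treat a DSP A request as I2S and rely on exact clocking:

#include <sound/soc.h>

/* Sketch only: accept DSP A by falling back to the I2S register setup. */
static int example_set_dai_fmt(struct snd_soc_dai *dai, unsigned int fmt)
{
	switch (fmt & SND_SOC_DAIFMT_FORMAT_MASK) {
	case SND_SOC_DAIFMT_DSP_A:
		/* No native DSP A support: rely on exact clocking so the
		 * single LRCLK sync edge lines up, then fall through and
		 * program the I2S framing as usual.
		 */
	case SND_SOC_DAIFMT_I2S:
		/* program I2S framing registers here */
		break;
	default:
		return -EINVAL;
	}

	return 0;
}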
On 16/03/17 06:01, Mark Brown wrote:
On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote:
On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote:
I have a bcm2835 (Pi 2 and 3) SoC here. It is producing multichannel (8 out, 6 in) audio. In ALSA we call that DSP mode - right ?!
No. DSP modes are protocol/timing specifications as I2S, PDP, S/PDIF, ... You can look these up in datasheets and if a chip implements such a protocol you can be sure that it adheres to that standard - i.e. it will sync the frames to the pulses on LRclk.
I agree with the thoughts in this thread really if the AP doesn't actually support DSP A mode we shouldn't add DSP A mode.
The trouble here is that this isn't 100% clear, the specifications of the DSP modes are such that only one edge in the LRCLK matters and so providing you're doing mono or have exact clocking they interoperate perfectly well. We already frequently do similar things the other way, most of the programmable serial ports can't actually do I2S modes properly and rely on exact clocking to get things right when operating as I2S since they only sync on one edge (though they can generally generate the clocks correctly when operating as master, they just don't pay attention to the left/right switch edge).
That said unless we're doing something with the data layout or similar configuration there's a fairly strong case for putting the mangling for this in the core, something like just falling back to I2S mode if we set DSP A and so on. Which would be a lot nicer if we actually got round to putting mode capability information in the drivers.
I agree, the data layout is already configurable in the bcm2835_i2s.c platform driver. We can already use the "snd_soc_dai_set_bclk_ratio" function to manage word offsets in our machine drivers.
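For instance, a machine driver could pin the BCLK-per-frame count in its hw_params callback roughly like this (a sketch only; the callback name and the ratio of 64, i.e. 2 slots of 32 bits, are just example values):

#include <sound/pcm.h>
#include <sound/soc.h>

static int my_card_hw_params(struct snd_pcm_substream *substream,
			     struct snd_pcm_hw_params *params)
{
	struct snd_soc_pcm_runtime *rtd = substream->private_data;

	/* 64 BCLK cycles per LRCLK frame, so that word onsets line up
	 * with the external hardware.
	 */
	return snd_soc_dai_set_bclk_ratio(rtd->cpu_dai, 64);
}

static struct snd_soc_ops my_card_ops = {
	.hw_params = my_card_hw_params,
};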
There is nothing which says that the bcm2835 SoC is I2S restricted in any way. In fact, the reference document says quite the opposite.
In the reference "BCM2835 ARM Peripherals" pdf, they call the audio system an "APB peripheral". They are saying that it is reconfigurable and part of the AMBA family of interconnect schemes.
As far as the bcm2835_i2s platform driver goes, it has implemented an AMBA protocol, where audio words are counted from the LR clock onset - for some reason people are insisting this is an I2S bus. Really our implementation is not I2S at all, because word onsets are programmable and flexible in the bcm2835_i2s.c driver.
What I have been trying to make happen over the past few months is to enable a different capability of the AMBA audio core on the bcm2835. Can you suggest a realistic approach to realising this which is acceptable to the ALSA code base ?
Do you suggest I go back to the approach of implementing the DSP mode for this platform driver ?
thanks Matt
On 03/16/2017 09:51 PM, Matt Flax wrote:
On 16/03/17 06:01, Mark Brown wrote:
On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote:
On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote:
I have a bcm2835 (Pi 2 and 3) SoC here. It is producing multichannel (8 out, 6 in) audio. In ALSA we call that DSP mode - right ?!
No. DSP modes are protocol/timing specifications as I2S, PDP, S/PDIF, ... You can look these up in datasheets and if a chip implements such a protocol you can be sure that it adheres to that standard - i.e. it will sync the frames to the pulses on LRclk.
I agree with the thoughts in this thread really if the AP doesn't actually support DSP A mode we shouldn't add DSP A mode.
The trouble here is that this isn't 100% clear, the specifications of the DSP modes are such that only one edge in the LRCLK matters and so providing you're doing mono or have exact clocking they interoperate perfectly well. We already frequently do similar things the other way, most of the programmable serial ports can't actually do I2S modes properly and rely on exact clocking to get things right when operating as I2S since they only sync on one edge (though they can generally generate the clocks correctly when operating as master, they just don't pay attention to the left/right switch edge).
That said unless we're doing something with the data layout or similar configuration there's a fairly strong case for putting the mangling for this in the core, something like just falling back to I2S mode if we set DSP A and so on. Which would be a lot nicer if we actually got round to putting mode capability information in the drivers.
I agree, the data layout is already configurable in the bcm2835_i2s.c platform driver. We can already use the "snd_soc_dai_set_bclk_ratio" function to manage word offsets in our machine drivers.
There is nothing which says that the bcm2835 SoC is I2S restricted in any way. In fact, the reference document says quite the opposite.
In the reference "BCM2835 ARM Peripherals" pdf, they call the audio system an "APB peripheral". They are saying that it is reconfigurable and part of the AMBA family of interconnect schemes.
As far as the bcm2835_i2s platform driver goes, it has implemented an AMBA protocol, where audio words are counted from the LR clock onset - for some reason people are insisting this is an I2S bus. Really our implementation is not I2S at all, because word onsets are programmable and flexible in the bcm2835_i2s.c driver.
AMBA/APB is the interface which connects the peripheral to the system memory bus. It is the interface over which the CPU does configuration register writes. This has nothing and absolutely nothing to do with the I2S interface that is also implemented by the peripheral that is used to stream audio to and from external components.
On 17/03/17 08:27, Lars-Peter Clausen wrote:
On 03/16/2017 09:51 PM, Matt Flax wrote:
On 16/03/17 06:01, Mark Brown wrote:
On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote:
On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote:
I have a bcm2835 (Pi 2 and 3) SoC here. It is producing multichannel (8 out, 6 in) audio. In ALSA we call that DSP mode - right ?!
No. DSP modes are protocol/timing specifications as I2S, PDP, S/PDIF, ... You can look these up in datasheets and if a chip implements such a protocol you can be sure that it adheres to that standard - i.e. it will sync the frames to the pulses on LRclk.
I agree with the thoughts in this thread really if the AP doesn't actually support DSP A mode we shouldn't add DSP A mode.
The trouble here is that this isn't 100% clear, the specifications of the DSP modes are such that only one edge in the LRCLK matters and so providing you're doing mono or have exact clocking they interoperate perfectly well. We already frequently do similar things the other way, most of the programmable serial ports can't actually do I2S modes properly and rely on exact clocking to get things right when operating as I2S since they only sync on one edge (though they can generally generate the clocks correctly when operating as master, they just don't pay attention to the left/right switch edge).
That said unless we're doing something with the data layout or similar configuration there's a fairly strong case for putting the mangling for this in the core, something like just falling back to I2S mode if we set DSP A and so on. Which would be a lot nicer if we actually got round to putting mode capability information in the drivers.
I agree, the data layout is already configurable in the bcm2835_i2s.c platform driver. We can already use the "snd_soc_dai_set_bclk_ratio" function to manage word offsets in our machine drivers.
There is nothing which says that the bcm2835 SoC is I2S restricted in any way. In fact, the reference document says quite the opposite.
In the reference "BCM2835 ARM Peripherals" pdf, they call the audio system an "APB peripheral". They are saying that it is reconfigurable and part of the AMBA family of interconnect schemes.
As far as the bcm2835_i2s platform driver goes, it has implemented an AMBA protocol, where audio words are counted from the LR clock onset - for some reason people are insisting this is an I2S bus. Really our implementation is not I2S at all, because word onsets are programmable and flexible in the bcm2835_i2s.c driver.
AMBA/APB is the interface which connects the peripheral to the system memory bus. It is the interface over which the CPU does configuration register writes. This has nothing and absolutely nothing to do with the I2S interface that is also implemented by the peripheral that is used to stream audio to and from external components.
Their (BCM reference) document [1] specifically states "It supports many classic PCM formats including I2S".
Do you agree with Mark's statement "the specifications of the DSP modes are such that only one edge in the LRCLK matters"?
If we look at the bcm2835 platform driver setup, it is concerned with bit clock counting to specify the audio data for both of the AMBA/APB channels from the serial bitstream into memory. It has two channels into memory; however, "it supports many classic PCM formats"... my vote for one classic format is DSP mode!
Do you see a problem with that ?
thanks Matt [1] https://www.raspberrypi.org/wp-content/uploads/2012/02/BCM2835-ARM-Periphera...
On 16/03/2017 at 23:14, Matt Flax wrote:
On 17/03/17 08:27, Lars-Peter Clausen wrote:
On 03/16/2017 09:51 PM, Matt Flax wrote:
On 16/03/17 06:01, Mark Brown wrote:
On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote:
On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote:
I have a bcm2835 (Pi 2 and 3) SoC here. It is producing multichannel (8 out, 6 in) audio. In ALSA we call that DSP mode - right ?!
No. DSP modes are protocol/timing specifications as I2S, PDP, S/PDIF, ... You can look these up in datasheets and if a chip implements such a protocol you can be sure that it adheres to that standard - i.e. it will sync the frames to the pulses on LRclk.
I agree with the thoughts in this thread really if the AP doesn't actually support DSP A mode we shouldn't add DSP A mode.
The trouble here is that this isn't 100% clear, the specifications of the DSP modes are such that only one edge in the LRCLK matters and so providing you're doing mono or have exact clocking they interoperate perfectly well. We already frequently do similar things the other way, most of the programmable serial ports can't actually do I2S modes properly and rely on exact clocking to get things right when operating as I2S since they only sync on one edge (though they can generally generate the clocks correctly when operating as master, they just don't pay attention to the left/right switch edge).
That said unless we're doing something with the data layout or similar configuration there's a fairly strong case for putting the mangling for this in the core, something like just falling back to I2S mode if we set DSP A and so on. Which would be a lot nicer if we actually got round to putting mode capability information in the drivers.
I agree, the data layout is already configurable in the bcm2835_i2s.c platform driver. We can already use the "snd_soc_dai_set_bclk_ratio" function to manage word offsets in our machine drivers.
There is nothing which says that the bcm2835 SoC is I2S restricted in any way. In fact, the reference document says quite the opposite.
In the reference "BCM2835 ARM Peripherals" pdf, they call the audio system an "APB peripheral". They are saying that it is reconfigurable and part of the AMBA family of interconnect schemes.
As far as the bcm2835_i2s platform driver goes, it has implemented an AMBA protocol, where audio words are counted from the LR clock onset - for some reason people are insisting this is an I2S bus. Really our implementation is not I2S at all, because word onsets are programmable and flexible in the bcm2835_i2s.c driver.
AMBA/APB is the interface which connects the peripheral to the system memory bus. It is the interface over which the CPU does configuration register writes. This has nothing and absolutely nothing to do with the I2S interface that is also implemented by the peripheral that is used to stream audio to and from external components.
Their (BCM reference) document [1] specifically states "It supports many classic PCM formats including I2S".
Do agree with Mark's statement "the specifications of the DSP modes are such that only one edge in the LRCLK matters" ?
If we look at the bcm2835 platform driver setup, it is concerned with bit clock counting to specify the audio data for both of the AMBA/APB channels from serial bitstream into memory. It has two channels into memory, however "it supports many classic PCM formats" ... my vote for one classic format is DSP mode !
Do you see a problem with that ?
thanks Matt [1] https://www.raspberrypi.org/wp-content/uploads/2012/02/BCM2835-ARM-Periphera...
Re-reading this document, the bcm2835 PCM IP block SHOULD support real DSP mode, with a one-BCLK-pulse LRCLK, zero BCLK delay, etc... It just needs to be properly set up. According to the same document, you could program the bcm2835 with up to 16 32-bit channels when in master mode, so I suspect that you could go up to this limit in slave mode. But as it is designed, it could only use up to two of any channels among the 16.
Emmanuel.
On Tue, Mar 21, 2017 at 10:21:04PM +0100, Emmanuel Fusté wrote:
Le 16/03/2017 à 23:14, Matt Flax a écrit :
On 17/03/17 08:27, Lars-Peter Clausen wrote:
On 03/16/2017 09:51 PM, Matt Flax wrote:
On 16/03/17 06:01, Mark Brown wrote:
On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote:
On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote: >>I have a bcm2835 (Pi 2 and 3) SoC here. It is producing >>multichannel (8 >>out, >>6 in) audio. In ALSA we call that DSP mode - right ?! >No. DSP modes are protocol/timing specifications as I2S, PDP, >S/PDIF, ... >You can look these up in datasheets and if a chip implements such a >protocol you can be sure that it adheres to that standard - i.e. it >will sync the frames to the pulses on LRclk. I agree with the thoughts in this thread really if the AP doesn't actually support DSP A mode we shouldn't add DSP A mode.
The trouble here is that this isn't 100% clear, the specifications of the DSP modes are such that only one edge in the LRCLK matters and so providing you're doing mono or have exact clocking they interoperate perfectly well. We already frequently do similar things the other way, most of the programmable serial ports can't actually do I2S modes properly and rely on exact clocking to get things right when operating as I2S since they only sync on one edge (though they can generally generate the clocks correctly when operating as master, they just don't pay attention to the left/right switch edge).
That said unless we're doing something with the data layout or similar configuration there's a fairly strong case for putting the mangling for this in the core, something like just falling back to I2S mode if we set DSP A and so on. Which would be a lot nicer if we actually got round to putting mode capability information in the drivers.
I agree, the data layout is already configurable in the bcm2835_i2s.c platform driver. We can already use the "snd_soc_dai_set_bclk_ratio" function to manage word offsets in our machine drivers.
There is nothing which says that the bcm2835 SoC is I2S restricted in any way. In fact, the reference document says quite the opposite.
In the reference "BCM2835 ARM Peripherals" pdf, they call the audio system an "APB peripheral". They are saying that it is reconfigurable and part of the AMBA family of interconnect schemes.
As far as the bcm2835_i2s platform driver goes, it has implemented an AMBA protocol, where audio words are counted from the LR clock onset - for some reason people are insisting this is an I2S bus. Really our implementation is not I2S at all, because word onsets are programmable and flexible in the bcm2835_i2s.c driver.
AMBA/APB is the interface which connects the peripheral to the system memory bus. It is the interface over which the CPU does configuration register writes. This has nothing and absolutely nothing to do with the I2S interface that is also implemented by the peripheral that is used to stream audio to and from external components.
Their (BCM reference) document [1] specifically states "It supports many classic PCM formats including I2S".
Do agree with Mark's statement "the specifications of the DSP modes are such that only one edge in the LRCLK matters" ?
If we look at the bcm2835 platform driver setup, it is concerned with bit clock counting to specify the audio data for both of the AMBA/APB channels from serial bitstream into memory. It has two channels into memory, however "it supports many classic PCM formats" ... my vote for one classic format is DSP mode !
Do you see a problem with that ?
thanks Matt [1] https://www.raspberrypi.org/wp-content/uploads/2012/02/BCM2835-ARM-Periphera...
Re-reading this document, the bcm2835 PCM IP block SHOULD support real DSP mode, with one BCLK pulsed LRCLK, zero BCLK delay etc... It just need to be properly setup.
I've re-read the document, too, last week and noticed the framesync registers - sorry, I had completely forgotten about these. I guess it should be possible to configure the bcm2835 to DSP mode but it'd still be limited to 2 channel setups - the hardware only has 2 channel position registers for each direction.
According to the same document, you could program the bmc up to 16 32bits channels when in master mode, so I suspect that you could go up to this limit in slave mode. But as it is designed, it could only use up to two of any channels among the 16.
I'm not quite sure if I can follow you on this - how would you configure 16 channels when there are only 2 channel position registers?
With the bclk ratio set to e.g. 16*32=512, the BCM2835 will only transmit 2*32 bits of data (at configurable bit positions); the remaining 448 bits will be zero.
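To make that concrete, the machine driver side is just a bclk-ratio call. A rough sketch (snd_soc_dai_set_bclk_ratio() is the existing ASoC helper; the callback and struct names around it are made up for illustration):

#include <sound/soc.h>

/* Sketch only: request a 512-bclk frame (16 slots * 32 bits) even though
 * the bcm2835 will only place 2 * 32 bits of real data in it. */
static int example_hw_params(struct snd_pcm_substream *substream,
			     struct snd_pcm_hw_params *params)
{
	struct snd_soc_pcm_runtime *rtd = substream->private_data;

	return snd_soc_dai_set_bclk_ratio(rtd->cpu_dai, 16 * 32);
}

static struct snd_soc_ops example_ops = {
	.hw_params = example_hw_params,
};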
so long,
Hias
On 22/03/17 09:11, Matthias Reichl wrote:
On Tue, Mar 21, 2017 at 10:21:04PM +0100, Emmanuel Fusté wrote:
Le 16/03/2017 à 23:14, Matt Flax a écrit :
On 17/03/17 08:27, Lars-Peter Clausen wrote:
On 03/16/2017 09:51 PM, Matt Flax wrote:
On 16/03/17 06:01, Mark Brown wrote:
On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote: > On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote: >>> I have a bcm2835 (Pi 2 and 3) SoC here. It is producing >>> multichannel (8 >>> out, >>> 6 in) audio. In ALSA we call that DSP mode - right ?! >> No. DSP modes are protocol/timing specifications as I2S, PDP, >> S/PDIF, ... >> You can look these up in datasheets and if a chip implements such a >> protocol you can be sure that it adheres to that standard - i.e. it >> will sync the frames to the pulses on LRclk. > I agree with the thoughts in this thread really if the AP doesn't > actually support DSP A mode we shouldn't add DSP A mode. The trouble here is that this isn't 100% clear, the specifications of the DSP modes are such that only one edge in the LRCLK matters and so providing you're doing mono or have exact clocking they interoperate perfectly well. We already frequently do similar things the other way, most of the programmable serial ports can't actually do I2S modes properly and rely on exact clocking to get things right when operating as I2S since they only sync on one edge (though they can generally generate the clocks correctly when operating as master, they just don't pay attention to the left/right switch edge).
That said unless we're doing something with the data layout or similar configuration there's a fairly strong case for putting the mangling for this in the core, something like just falling back to I2S mode if we set DSP A and so on. Which would be a lot nicer if we actually got round to putting mode capability information in the drivers.
I agree, the data layout is already configurable in the bcm2835_i2s.c platform driver. We can already use the "snd_soc_dai_set_bclk_ratio" function to manage word offsets in our machine drivers.
There is nothing which says that the bcm2835 SoC is I2S restricted in any way. In fact, the reference document says quite the opposite.
In the reference "BCM2835 ARM Peripherals" pdf, they call the audio system an "APB peripheral". They are saying that it is reconfigurable and part of the AMBA family of interconnect schemes.
As far as the bcm2835_i2s platform driver goes, it has implemented an AMBA protocol, where audio words are counted from the LR clock onset - for some reason people are insisting this is an I2S bus. Really our implementation is not I2S at all, because word onsets are programmable and flexible in the bcm2835_i2s.c driver.
AMBA/APB is the interface which connects the peripheral to the system memory bus. It is the interface over which the CPU does configuration register writes. This has nothing and absolutely nothing to do with the I2S interface that is also implemented by the peripheral that is used to stream audio to and from external components.
Their (BCM reference) document [1] specifically states "It supports many classic PCM formats including I2S".
Do agree with Mark's statement "the specifications of the DSP modes are such that only one edge in the LRCLK matters" ?
If we look at the bcm2835 platform driver setup, it is concerned with bit clock counting to specify the audio data for both of the AMBA/APB channels from serial bitstream into memory. It has two channels into memory, however "it supports many classic PCM formats" ... my vote for one classic format is DSP mode !
Do you see a problem with that ?
thanks Matt [1] https://www.raspberrypi.org/wp-content/uploads/2012/02/BCM2835-ARM-Periphera...
Re-reading this document, the bcm2835 PCM IP block SHOULD support real DSP mode, with one BCLK pulsed LRCLK, zero BCLK delay etc... It just need to be properly setup.
I've re-read the document, too, last week and noticed the framesync registers - sorry, I had completely forgotten about these. I guess it should be possible to configure the bcm2835 to DSP mode but it'd still be limited to 2 channel setups - the hardware only has 2 channel position registers for each direction.
According to the same document, you could program the bmc up to 16 32bits channels when in master mode, so I suspect that you could go up to this limit in slave mode. But as it is designed, it could only use up to two of any channels among the 16.
I'm not quite sure if I can follow you on this - how would you configure 16 channels when there are only 2 channel position registers?
With bclk ratio eg set to 16*32=512 BCM2835 will only transmit 2*32 bits of data (at configurable bit positions), the remaining 448 bits will be zero.
The document seems to stipulate that the PCM audio device is an AMBA device with 2 APB data channels. The first sync edge marks the beginning of the two data words. Frames can be up to 1024+32 bits in length !
I think the point is that they intended their PCM audio interface to be configurable, they say in their document "It supports many classic PCM formats".
The important point here is that in ALSA we can only have I2S or DSP modes - right ? Unless we want to create a new ALSA mode (which clearly worries people) then we need to support the versatility of the bcm2835 PCM hardware using either DSP or I2S modes. Now, we have already implemented the I2S mode, so logically the only available mode left is the DSP mode. Using this mode, we can implement more features of this device.
People seem to want to reserve DSP and I2S modes for strictly I2S and DSP protocols. At the same time people don't want to allow a looser "APB" mode into ALSA. For that reason, we have a lack of functionality for perfectly versatile hardware - the bcm2835 hardware.
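For clarity, all a machine driver would then need to ask for is the usual dai_fmt selection. A sketch with placeholder device/DAI names (only the SND_SOC_DAIFMT_* flags and the dai_link fields are real ASoC interfaces):

#include <sound/soc.h>

/* Sketch only: the name strings below are placeholders, not real devices. */
static struct snd_soc_dai_link example_dai_link = {
	.name		= "example-link",
	.stream_name	= "example-pcm",
	.cpu_dai_name	= "bcm2835-i2s-placeholder",
	.platform_name	= "bcm2835-i2s-placeholder",
	.codec_name	= "cs42xx8-placeholder",
	.codec_dai_name	= "cs42xx8-dai-placeholder",
	/* DSP A, normal clocks, codec as bit/frame master - illustrative only */
	.dai_fmt	= SND_SOC_DAIFMT_DSP_A |
			  SND_SOC_DAIFMT_NB_NF |
			  SND_SOC_DAIFMT_CBM_CFM,
};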
Matt
On Wed, Mar 22, 2017 at 10:29:33AM +1100, Matt Flax wrote:
On 22/03/17 09:11, Matthias Reichl wrote:
On Tue, Mar 21, 2017 at 10:21:04PM +0100, Emmanuel Fusté wrote:
Le 16/03/2017 à 23:14, Matt Flax a écrit :
On 17/03/17 08:27, Lars-Peter Clausen wrote:
On 03/16/2017 09:51 PM, Matt Flax wrote:
On 16/03/17 06:01, Mark Brown wrote: >On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote: >>On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote:
Re-reading this document, the bcm2835 PCM IP block SHOULD support real DSP mode, with one BCLK pulsed LRCLK, zero BCLK delay etc... It just need to be properly setup.
I've re-read the document, too, last week and noticed the framesync registers - sorry, I had completely forgotten about these. I guess it should be possible to configure the bcm2835 to DSP mode but it'd still be limited to 2 channel setups - the hardware only has 2 channel position registers for each direction.
According to the same document, you could program the bmc up to 16 32bits channels when in master mode, so I suspect that you could go up to this limit in slave mode. But as it is designed, it could only use up to two of any channels among the 16.
I'm not quite sure if I can follow you on this - how would you configure 16 channels when there are only 2 channel position registers?
With bclk ratio eg set to 16*32=512 BCM2835 will only transmit 2*32 bits of data (at configurable bit positions), the remaining 448 bits will be zero.
The document seems to stipulate that the PCM audio device is an AMBA device with 2 APB data channels. The first sync edge marks the beginning of the two data words. Their frame lengths can be up to 1024+32 bits in length !
I think the point is that they intended their PCM audio interface to be configurable, they say in their document "It supports many classic PCM formats".
The important point here is that in ALSA we can only have I2S or DSP modes - right ? Unless we want to create a new ALSA mode (which clearly worries people) then we need to support the versatility of the bcm2835 PCM hardware using either DSP or I2S modes. Now, we have already implemented the I2S mode, so logically the only available mode left is the DSP mode. Using this mode, we can implement more features of this device.
People seem to want to reserve DSP and I2S modes for strictly I2S and DSP protocols. At the same time people don't want to allow a looser "APB" mode into ALSA. For that reason, we have a lack of functionality for perfectly versatile hardware - the bcm2835 hardware.
Apologies but I am still a little unclear as to what actually happens on the bus here.
Are we saying that what gets transmitted on the bus is neither valid I2S nor DSP mode data? But as you have your custom hardware block in the middle it interprets this data correctly and converts it to a regular bus format on the other side that goes to the CODEC?
Thanks, Charles
On 22/03/17 20:43, Charles Keepax wrote:
On Wed, Mar 22, 2017 at 10:29:33AM +1100, Matt Flax wrote:
On 22/03/17 09:11, Matthias Reichl wrote:
On Tue, Mar 21, 2017 at 10:21:04PM +0100, Emmanuel Fusté wrote:
Le 16/03/2017 à 23:14, Matt Flax a écrit :
On 17/03/17 08:27, Lars-Peter Clausen wrote:
On 03/16/2017 09:51 PM, Matt Flax wrote: > On 16/03/17 06:01, Mark Brown wrote: >> On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote: >>> On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote:
Re-reading this document, the bcm2835 PCM IP block SHOULD support real DSP mode, with one BCLK pulsed LRCLK, zero BCLK delay etc... It just need to be properly setup.
I've re-read the document, too, last week and noticed the framesync registers - sorry, I had completely forgotten about these. I guess it should be possible to configure the bcm2835 to DSP mode but it'd still be limited to 2 channel setups - the hardware only has 2 channel position registers for each direction.
According to the same document, you could program the bmc up to 16 32bits channels when in master mode, so I suspect that you could go up to this limit in slave mode. But as it is designed, it could only use up to two of any channels among the 16.
I'm not quite sure if I can follow you on this - how would you configure 16 channels when there are only 2 channel position registers?
With bclk ratio eg set to 16*32=512 BCM2835 will only transmit 2*32 bits of data (at configurable bit positions), the remaining 448 bits will be zero.
The document seems to stipulate that the PCM audio device is an AMBA device with 2 APB data channels. The first sync edge marks the beginning of the two data words. Their frame lengths can be up to 1024+32 bits in length !
I think the point is that they intended their PCM audio interface to be configurable, they say in their document "It supports many classic PCM formats".
The important point here is that in ALSA we can only have I2S or DSP modes - right ? Unless we want to create a new ALSA mode (which clearly worries people) then we need to support the versatility of the bcm2835 PCM hardware using either DSP or I2S modes. Now, we have already implemented the I2S mode, so logically the only available mode left is the DSP mode. Using this mode, we can implement more features of this device.
People seem to want to reserve DSP and I2S modes for strictly I2S and DSP protocols. At the same time people don't want to allow a looser "APB" mode into ALSA. For that reason, we have a lack of functionality for perfectly versatile hardware - the bcm2835 hardware.
Apologies but I am still a little unclear as to what actually happens on the bus here.
Are we saying that what gets transmitted on the bus is neither valid I2S or DSP mode data? But as you have your custom hardware block in the middle it interprets this data correctly and converts it to a regular bus format on the other side that goes to the CODEC?
Yes, essentially there is a translation between the two-data-word, edge-triggered APB side (the bcm2835's PCM block) and a Cirrus Logic TDM codec.
Matt
On Wed, Mar 22, 2017 at 11:04:34PM +1100, Matt Flax wrote:
Are we saying that what gets transmitted on the bus is neither valid I2S or DSP mode data? But as you have your custom hardware block in the middle it interprets this data correctly and converts it to a regular bus format on the other side that goes to the CODEC?
Yes, essentially there is translation between the two data word edge triggered ABP (the bcm2835's PCM block) and a Cirrus Logic TDM codec.
Hmm... I guess that is the root difficulty here. We probably don't want to call it DSP mode in the driver if connecting it to something that expects data in that format wouldn't work. Especially if the device could actually support DSP mode as well, since we would then be blocking someone from implementing legitimate DSP mode in the future.
I guess really the correct thing to do would be to add some define for the new format, but I can see that might be a tricky road. Would perhaps some sort of BESPOKE define or something be possible to indicate odd formats that don't fall into regular classifications?
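Roughly what I'm imagining is just one more value alongside the existing format defines - purely hypothetical, nothing like this exists in soc-dai.h today, and the value is picked only for illustration:

/* Hypothetical: a catch-all for links whose on-wire layout matches none of
 * the standard I2S/LJ/RJ/DSP modes and is configured by other means. */
#define SND_SOC_DAIFMT_BESPOKE		8	/* non-standard wire format */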
Thanks, Charles
On 03/22/2017 06:04 AM, Matt Flax wrote:
On 22/03/17 20:43, Charles Keepax wrote:
On Wed, Mar 22, 2017 at 10:29:33AM +1100, Matt Flax wrote:
On 22/03/17 09:11, Matthias Reichl wrote:
On Tue, Mar 21, 2017 at 10:21:04PM +0100, Emmanuel Fusté wrote:
Le 16/03/2017 à 23:14, Matt Flax a écrit :
On 17/03/17 08:27, Lars-Peter Clausen wrote: > On 03/16/2017 09:51 PM, Matt Flax wrote: >> On 16/03/17 06:01, Mark Brown wrote: >>> On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote: >>>> On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote:
Re-reading this document, the bcm2835 PCM IP block SHOULD support real DSP mode, with one BCLK pulsed LRCLK, zero BCLK delay etc... It just need to be properly setup.
I've re-read the document, too, last week and noticed the framesync registers - sorry, I had completely forgotten about these. I guess it should be possible to configure the bcm2835 to DSP mode but it'd still be limited to 2 channel setups - the hardware only has 2 channel position registers for each direction.
According to the same document, you could program the bmc up to 16 32bits channels when in master mode, so I suspect that you could go up to this limit in slave mode. But as it is designed, it could only use up to two of any channels among the 16.
I'm not quite sure if I can follow you on this - how would you configure 16 channels when there are only 2 channel position registers?
With bclk ratio eg set to 16*32=512 BCM2835 will only transmit 2*32 bits of data (at configurable bit positions), the remaining 448 bits will be zero.
The document seems to stipulate that the PCM audio device is an AMBA device with 2 APB data channels. The first sync edge marks the beginning of the two data words. Their frame lengths can be up to 1024+32 bits in length !
I think the point is that they intended their PCM audio interface to be configurable, they say in their document "It supports many classic PCM formats".
The important point here is that in ALSA we can only have I2S or DSP modes - right ? Unless we want to create a new ALSA mode (which clearly worries people) then we need to support the versatility of the bcm2835 PCM hardware using either DSP or I2S modes. Now, we have already implemented the I2S mode, so logically the only available mode left is the DSP mode. Using this mode, we can implement more features of this device.
People seem to want to reserve DSP and I2S modes for strictly I2S and DSP protocols. At the same time people don't want to allow a looser "APB" mode into ALSA. For that reason, we have a lack of functionality for perfectly versatile hardware - the bcm2835 hardware.
Apologies but I am still a little unclear as to what actually happens on the bus here.
Are we saying that what gets transmitted on the bus is neither valid I2S or DSP mode data? But as you have your custom hardware block in the middle it interprets this data correctly and converts it to a regular bus format on the other side that goes to the CODEC?
Yes, essentially there is translation between the two data word edge triggered ABP (the bcm2835's PCM block) and a Cirrus Logic TDM codec.
Are you really sure about this? I believe the BCM2835 I2S controller is a standard I2S controller.
Note that, as was pointed out earlier in this thread, "APB" has nothing to do with the audio side of the I2S controller's HW. Rather, the I2S controller connects to the SoC's APB bus so that the CPU can program the I2S controller's registers, and so that the CPU or DMA engine can read/write the audio data stream. The audio side of the I2S controller generates/consumes the standard I2S signals of clock, frame sync, data in, and data out. As such, I believe standard I2S and DSP modes are perfectly possible.
Now the BCM2835 I2S controller does appear to have a few features beyond plain I2S/DSP modes, that not all I2S controllers might have, such as:
- Frame length, FS length, channel position, and channel width are specified at bit resolution rather than byte/sample/...
- PDM mode (for digital mics).
- Sign extension of RX data.
So, the controller can generate/receive some formats beyond plain I2S/DSP, but to be honest I doubt that this level of detail needs to be exposed to the ALSA/ASoC core or user-space, nor is it required to interface to any typical HW, so it can be hidden in the driver's register programming implementation.
On Wed, Mar 22, 2017 at 09:38:28AM -0600, Stephen Warren wrote:
Now the BCM2835 I2S controller does appear to have a few features beyond plain I2S/DSP modes, that not all I2S controllers might have, such as:
- Frame length, FS length, channel position, and channel width are specified
at bit resolution rather than byte/sample/...
- PDM mode (for digital mics).
- Sign extension of RX data.
So, the controller can generate/receive some formats beyond plain I2S/DSP, but to be honest I doubt that this level of detail needs to be exposed to the ALSA/ASoC core or user-space, nor is it required to interface to any typical HW, so it can be hidden in the driver's register programming implementation.
At the minute the driver doesn't even support these features, it just supports plain stereo I2S mode and it *may* be that all that's needed here is the addition of the DSP modes.
On Wed, Mar 22, 2017 at 11:04:34PM +1100, Matt Flax wrote:
On 22/03/17 20:43, Charles Keepax wrote:
Are we saying that what gets transmitted on the bus is neither valid I2S or DSP mode data? But as you have your custom hardware block in the middle it interprets this data correctly and converts it to a regular bus format on the other side that goes to the CODEC?
Yes, essentially there is translation between the two data word edge triggered ABP (the bcm2835's PCM block) and a Cirrus Logic TDM codec.
Could you please be concrete about what the two formats you're talking about here are and how these differences are observable on the wire? I don't know what "two data word edge triggered ABP" means.
On 25/03/17 06:09, Mark Brown wrote:
On Wed, Mar 22, 2017 at 11:04:34PM +1100, Matt Flax wrote:
On 22/03/17 20:43, Charles Keepax wrote:
Are we saying that what gets transmitted on the bus is neither valid I2S or DSP mode data? But as you have your custom hardware block in the middle it interprets this data correctly and converts it to a regular bus format on the other side that goes to the CODEC?
Yes, essentially there is translation between the two data word edge triggered ABP (the bcm2835's PCM block) and a Cirrus Logic TDM codec.
Could you please be concrete about what the two formats you're talking about here are and how these differences are observable on the wire? I don't know what "two data word edge triggered ABP" means.
On the codec side it is a regular TDM stream. On the SoC side, two channels are arbitrarily offset from a PCM frame sync clock (PCM_FS) leading edge. I have chosen to have 64 bits per frame with 1 bit offset for the first word (from the leading edge) and 33 bits offset (from the leading edge) for the second word. This resembles I2S, but it doesn't have to for the bcm2835.
In more detail ... The bcm2835's flexible PCM hardware (serialiser + APB) is set up to handle two words at a time, in and out of 64-word FIFO buffers. The serialiser loads and unloads two registers according to offsets from the PCM_FS clock's leading edge. The APB interface is responsible for (un)loading the 64-word FIFO buffer to/from memory - with DMA, I guess.
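To pin down the numbers, the frame I am using looks like this (a plain illustration; these are not real driver register fields):

/* 64-bit frame as described above - illustrative values only */
struct example_frame_layout {
	unsigned int frame_bits;	/* bclk cycles per PCM_FS period */
	unsigned int ch1_pos;		/* first data bit of word 1      */
	unsigned int ch2_pos;		/* first data bit of word 2      */
	unsigned int slot_width;	/* bits per word                 */
};

static const struct example_frame_layout example_layout = {
	.frame_bits	= 64,
	.ch1_pos	= 1,	/* 1 bit after the PCM_FS leading edge   */
	.ch2_pos	= 33,	/* 33 bits after the PCM_FS leading edge */
	.slot_width	= 32,
};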
thanks Matt
On Sat, Mar 25, 2017 at 04:45:46PM +1100, Matt Flax wrote:
On 25/03/17 06:09, Mark Brown wrote:
Could you please be concrete about what the two formats you're talking about here are and how these differences are observable on the wire? I don't know what "two data word edge triggered ABP" means.
On the codec side it is a regular TDM stream. On the SoC side, two channels are arbitrarily offset from a PCM frame sync clock (PCM_FS) leading edge. I have chosen to have 64 bits per frame with 1 bit offset for the first word (from the leading edge) and 33 bits offset (from the leading edge) for the second word. This resembles I2S, but it doesn't have to for the bcm2835.
What's internal to the SoC is not relevant here, what matters is what's externally visible. The formats on the DAI are how the SoC interfaces with the outside world.
On 27/03/17 21:01, Mark Brown wrote:
On Sat, Mar 25, 2017 at 04:45:46PM +1100, Matt Flax wrote:
On 25/03/17 06:09, Mark Brown wrote:
Could you please be concrete about what the two formats you're talking about here are and how these differences are observable on the wire? I don't know what "two data word edge triggered ABP" means.
On the codec side it is a regular TDM stream. On the SoC side, two channels are arbitrarily offset from a PCM frame sync clock (PCM_FS) leading edge. I have chosen to have 64 bits per frame with 1 bit offset for the first word (from the leading edge) and 33 bits offset (from the leading edge) for the second word. This resembles I2S, but it doesn't have to for the bcm2835.
What's internal to the SoC is not relevant here, what matters is what's externally visible. The formats on the DAI are how the SoC interfaces with the outside world.
In this case, there is the TDM (DSP mode) protocol on the Codec. The SoC, however, is communicating in channel pairs {{0, 1}, {2, 3}, {4, 5}, {6, 7}}, each pair separated by a PCM_FS clk leading edge.
As far as the codec is concerned it is DSP mode. As far as the SoC is concerned, it is not I2S, nor is it strictly DSP mode, because there is more than one PCM_FS per frame !
If we look at this from the perspective of the Codec, then it is DSP mode. I am just not sure what to call the SoC's protocol, other than multi-paired PCM.
Matt
On Mon, Mar 27, 2017 at 09:35:33PM +1100, Matt Flax wrote:
In this case, there is the TDM (DSP mode) protocol on the Codec. The SoC however is communicating in channel pairs {{0, 1}, {2, 3}, {4, 5}, {6, 7}}. Each channel separated by the PCM_FS clk leading edge.
As far as the codec is concerned it is DSP mode. As far as the SoC is concerned, it is not I2S, nor is it strictly DSP mode because there is more then one PCM_FS per frame !
If we look at this from the perspective of the Codec, then it is DSP mode. I am just not sure what to call the SoC's protocol, other then multi-paired PCM.
This isn't DSP mode from anything's point of view; the extra frame syncs mean it's a whole new format. Off the top of my head I'd suggest just running it as stereo DSP mode from a kernel point of view and fixing things up in userspace.
Le 21/03/2017 à 23:11, Matthias Reichl a écrit :
On Tue, Mar 21, 2017 at 10:21:04PM +0100, Emmanuel Fusté wrote:
Le 16/03/2017 à 23:14, Matt Flax a écrit :
On 17/03/17 08:27, Lars-Peter Clausen wrote:
On 03/16/2017 09:51 PM, Matt Flax wrote:
On 16/03/17 06:01, Mark Brown wrote:
On Tue, Feb 28, 2017 at 09:59:29AM +0000, Charles Keepax wrote: > On Mon, Feb 27, 2017 at 12:51:08PM +0100, Matthias Reichl wrote: >>> I have a bcm2835 (Pi 2 and 3) SoC here. It is producing >>> multichannel (8 >>> out, >>> 6 in) audio. In ALSA we call that DSP mode - right ?! >> No. DSP modes are protocol/timing specifications as I2S, PDP, >> S/PDIF, ... >> You can look these up in datasheets and if a chip implements such a >> protocol you can be sure that it adheres to that standard - i.e. it >> will sync the frames to the pulses on LRclk. > I agree with the thoughts in this thread really if the AP doesn't > actually support DSP A mode we shouldn't add DSP A mode. The trouble here is that this isn't 100% clear, the specifications of the DSP modes are such that only one edge in the LRCLK matters and so providing you're doing mono or have exact clocking they interoperate perfectly well. We already frequently do similar things the other way, most of the programmable serial ports can't actually do I2S modes properly and rely on exact clocking to get things right when operating as I2S since they only sync on one edge (though they can generally generate the clocks correctly when operating as master, they just don't pay attention to the left/right switch edge).
That said unless we're doing something with the data layout or similar configuration there's a fairly strong case for putting the mangling for this in the core, something like just falling back to I2S mode if we set DSP A and so on. Which would be a lot nicer if we actually got round to putting mode capability information in the drivers.
I agree, the data layout is already configurable in the bcm2835_i2s.c platform driver. We can already use the "snd_soc_dai_set_bclk_ratio" function to manage word offsets in our machine drivers.
There is nothing which says that the bcm2835 SoC is I2S restricted in any way. In fact, the reference document says quite the opposite.
In the reference "BCM2835 ARM Peripherals" pdf, they call the audio system an "APB peripheral". They are saying that it is reconfigurable and part of the AMBA family of interconnect schemes.
As far as the bcm2835_i2s platform driver goes, it has implemented an AMBA protocol, where audio words are counted from the LR clock onset - for some reason people are insisting this is an I2S bus. Really our implementation is not I2S at all, because word onsets are programmable and flexible in the bcm2835_i2s.c driver.
AMBA/APB is the interface which connects the peripheral to the system memory bus. It is the interface over which the CPU does configuration register writes. This has nothing and absolutely nothing to do with the I2S interface that is also implemented by the peripheral that is used to stream audio to and from external components.
Their (BCM reference) document [1] specifically states "It supports many classic PCM formats including I2S".
Do agree with Mark's statement "the specifications of the DSP modes are such that only one edge in the LRCLK matters" ?
If we look at the bcm2835 platform driver setup, it is concerned with bit clock counting to specify the audio data for both of the AMBA/APB channels from serial bitstream into memory. It has two channels into memory, however "it supports many classic PCM formats" ... my vote for one classic format is DSP mode !
Do you see a problem with that ?
thanks Matt [1] https://www.raspberrypi.org/wp-content/uploads/2012/02/BCM2835-ARM-Periphera...
Re-reading this document, the bcm2835 PCM IP block SHOULD support real DSP mode, with one BCLK pulsed LRCLK, zero BCLK delay etc... It just need to be properly setup.
I've re-read the document, too, last week and noticed the framesync registers - sorry, I had completely forgotten about these. I guess it should be possible to configure the bcm2835 to DSP mode but it'd still be limited to 2 channel setups - the hardware only has 2 channel position registers for each direction.
According to the same document, you could program the bmc up to 16 32bits channels when in master mode, so I suspect that you could go up to this limit in slave mode. But as it is designed, it could only use up to two of any channels among the 16.
I'm not quite sure if I can follow you on this - how would you configure 16 channels when there are only 2 channel position registers?
With bclk ratio eg set to 16*32=512 BCM2835 will only transmit 2*32 bits of data (at configurable bit positions), the remaining 448 bits will be zero.
Yes, two arbitrary TDM channels out of 16. And as we both said, the others are not usable. It could be useful in a TDM-chained scenario, but not for what we are talking about here.
Emmanuel.
On Mon, Feb 27, 2017 at 07:21:13AM +1100, Matt Flax wrote:
On 27/02/17 01:49, Matthias Reichl wrote:
On Sun, Feb 26, 2017 at 09:13:09AM +1100, Matt Flax wrote:
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
The place to do that is in a codec driver. In your setup it'll look like this:
That "IC" codec has 2 DAIs and operates as a clock master on both. You link one DAI in I2S mode to the bcm2835 and the other DAI in DSP (or whatever mode you are using) to the cs42xx8.
If you model it this way you no longer work against ALSA and you can stop adding hacks to existing drivers.
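A minimal sketch of what that two-DAI "IC" codec could register (struct snd_soc_dai_driver is the real ASoC type; the names, rates and channel counts below are assumptions for illustration):

#include <sound/soc.h>
#include <sound/pcm.h>

static struct snd_soc_dai_driver example_ic_dais[] = {
	{
		.name = "ic-i2s",	/* 2-channel side, linked to the bcm2835 */
		.playback = {
			.stream_name	= "IC I2S Playback",
			.channels_min	= 2,
			.channels_max	= 2,
			.rates		= SNDRV_PCM_RATE_8000_192000,
			.formats	= SNDRV_PCM_FMTBIT_S32_LE,
		},
	},
	{
		.name = "ic-tdm",	/* multichannel side, linked to the cs42xx8 */
		.playback = {
			.stream_name	= "IC TDM Playback",
			.channels_min	= 2,
			.channels_max	= 8,
			.rates		= SNDRV_PCM_RATE_8000_48000,
			.formats	= SNDRV_PCM_FMTBIT_S32_LE,
		},
	},
};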
Thanks, this is actually very constructive advice. Let me try to understand what you are suggesting here... I am confused about how to set the bit depth correctly for the codec when I play to the IC chip in this setup... let's walk through this one...
I implement a new codec which matches this IC. The IC can be setup to be master in its dai fmt. The sound card shows up with two devices, only the IC device can be used to play and record.
Say I use aplay to play to the first device (the IC chip) it does hw_params and clocks are set. But the second device (the codec) never gets hw_params executed ? If I select 16 or 24 bits, it never knows ... is that right ? If so, how do we solve this problem ?
Yeah I think there are basically two ways you could model this sort of setup, either with a CODEC to CODEC link as suggested here or a DPCM setup.
The CODEC to CODEC link is probably the better model for the system, as that is really the direction people are trying to take things within ASoC. However, at the moment a CODEC to CODEC link does impose some restrictions: you basically have to specify fixed parameters for the link in the machine driver. You can specify multiple parameter sets and an ALSA control will be created to switch between them, however this does require user involvement.
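For reference, the fixed-parameter part of such a CODEC to CODEC link would look roughly like this (dai_link .params/.num_params is the real mechanism; the names and values are assumptions):

#include <sound/soc.h>
#include <sound/pcm.h>

static const struct snd_soc_pcm_stream example_link_params = {
	.formats	= SNDRV_PCM_FMTBIT_S32_LE,
	.rate_min	= 48000,
	.rate_max	= 48000,
	.channels_min	= 8,
	.channels_max	= 8,
};

static struct snd_soc_dai_link example_codec_link = {
	.name		= "ic-tdm-cs42xx8",
	.stream_name	= "IC-cs42xx8",
	.cpu_dai_name	= "ic-tdm",		/* placeholder */
	.codec_dai_name	= "cs42xx8-placeholder",
	.dai_fmt	= SND_SOC_DAIFMT_DSP_A |
			  SND_SOC_DAIFMT_NB_NF |
			  SND_SOC_DAIFMT_CBS_CFS,
	.params		= &example_link_params,
	.num_params	= 1,
};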
I am interested that you mention the bit depth but not the sample rate, which is often the bigger problem. Probably the easiest solution here, assuming you don't have sample rate issues, is just to have the link between your FPGA and the CODEC always be 24-bit and have the FPGA insert padding if the link on the other side is 16-bit.
Alternatively you could look at implementing this as DPCM, although I am not the greatest fan of it. That would let you fix up the settings before they are passed to the CODEC, but leaves the FPGA part relatively unmodelled and would likely be implicitly supported in the machine driver. Really DPCM is more normally used to model hardware that is an actual part of the SoC so this case does feel slightly abusive.
Thanks, Charles
Le 26/02/2017 à 15:49, Matthias Reichl a écrit :
On Sun, Feb 26, 2017 at 09:13:09AM +1100, Matt Flax wrote:
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dia.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
My last patch used the two-codec approach, however it was pointed out that the bcm2835 was run in DSP mode with a codec master (rather than IC master) and that the patch doesn't work. Which is clearly true and a problem; it can only work with an intermediate non-codec master.
I think you summed it up well with your statement :
On 25/02/17 Matthias Reichl wrote: If the clock timing adheres to DSP mode A timing and you add code to the CPU DAI driver so it can operate in DSP mode A then that should also work. If not, it's broken.
Your bcm2835 patch doesn't configure the bcm2835 to DSP mode A, it's still setup for I2S (slave) mode. You are just adding code to pretend it's running in DSP mode A. Don't do that, it's wrong.
This patch set fixes the problem of a daisy chain of three possible masters (CPU <=> IC <=> codec) where only the IC can be master. In fact, when retro fitting DSP mode to old silicon, the CPU can specify which of the three can be masters and there is no chance that someone can fire the system up with the wrong master (which we know produces bit offset and random channel swapping when a codec is master).
Please follow the advice I gave you about 3 weeks ago and model your setup properly.
| So you have bcm2835 I2S <-> FPGA <-> codec - IOW a standard codec<->codec
| link.
|
| What you seem to be missing is just a method to transfer your 8-channel
| data via a 2-channel link - userspace want's to see an 8-channel PCM,
| but the hardware link (bcm2835-i2s) is only 2-channel.
|
| And that's where IMO as userspace plugin looks like a very good solution.
| It's basically the counterpart of your FPGA and contains the code that's
| neccessary to encapsulate/pack/whatever the 8-channel data into a 2-channel
| stream so it can then be unpacked to 8-channel by the FPGA.
|
| If you go this route your hardware and machine driver will work with
| other I2S codecs as well, and IMO that's a far better solution than
| adding very ugly hacks to a single I2S driver.
If you add an active hardware component (your "IC"/FPGA) you also have to model that in software.
If that component is acting as a clock master it probably has some method to setup clocks. Even if you don't have that, eg if you are running at some fixed rate you'll have to store that information somewhere.
The place to do that is in a codec driver. In your setup it'll look like this:
That "IC" codec has 2 DAIs and operates as a clock master on both. You link one DAI in I2S mode to the bcm2835 and the other DAI in DSP (or whatever mode you are using) to the cs42xx8.
If you model it this way you no longer work against ALSA and you can stop adding hacks to existing drivers.
From the beginning, I completely agree with you when you take the two problems apart:
- for the timing problem: properly model the converter as a codec
- for the encapsulation problem: do the encapsulation/packing with a userspace plugin
But when you take it all together, the plugin part seems completely overcomplicated. As the whole is properly modeled, if we had a simple way to relax the channel-count constraint on the I2S link in the higher part of the stack, everything would "magically" work with little effort/complexity.

Otherwise, the user-space packer would have to do a lot more than packing: interact with private machine driver controls to manage all the channels, and the machine driver would need to forward/translate some parts to directly drive the final (cs42xx8) codec. And potentially, if such hardware continues to pop up, each machine driver will need its own user-space counterpart.

Packing DSD in PCM (DoP) requires zero kernel knowledge and can be fully implemented in user space, as we don't change the channel-count assumption for the data part of the format. But here, we simply want the DSP A data semantics with the I2S hardware bus timing.

I'm a complete newbie to ASoC, but I take part in this thread to learn, as I hate to see how badly all DIY and amateur audio hardware is integrated with ALSA/ASoC/Linux and so never goes upstream. On a professional/commercial development, you would never take this ... convoluted I2S multichannel path.
Emmanuel.
On 27/02/17 07:41, Emmanuel Fusté wrote:
I'm a complete newbie to ASoC but I take part to this tread to learn as I hate to see how badly all diy and amateur audio hw are integrated with Alsa/ASoC/Linux and so never go upstream. On a professional/commercial dev, you would never take this ... convoluted I2S multi channel path.
Hey Emmanuel,
We now know that we can retrofit legacy I2S silicon for multichannel, in a very direct manner.
If you were a chip manufacturer and you could potentially increase your market take up by hundreds of millions of new potential low power devices, wouldn't you manufacture ALL new chips with these minor modifications ?
Matt
Le 26/02/2017 à 22:44, Matt Flax a écrit :
On 27/02/17 07:41, Emmanuel Fusté wrote:
I'm a complete newbie to ASoC but I take part to this tread to learn as I hate to see how badly all diy and amateur audio hw are integrated with Alsa/ASoC/Linux and so never go upstream. On a professional/commercial dev, you would never take this ... convoluted I2S multi channel path.
Hey Emmanuel,
We now know that we can retrofit legacy I2S silicon for multichannel, in a very direct manner.
If you were a chip manufacturer and you could potentially increase your market take up by hundreds of millions of new potential low power devices, wouldn't you manufacture ALL new chips with these minor modifications ?
Matt
Hi,
No, because this is a hack. There are established and, more importantly, interoperable bus standards for multichannel digital audio: TDM / DSP A / DSP B. These being today's standards, even ultra-low-power and modest ICs can do standard multichannel serial audio: Atmel AT and ARM Cortex-M microcontrollers, the STMicro STM32 family, etc. I2S signalling is only part of the equation; the FIFOs, DMA handling, etc. of the multichannel part can matter a lot depending on the implementation. And the FPGA glue logic is way more costly than choosing a SoC/microcontroller with a properly designed multichannel serial audio IP. The real question is why this SoC, so poorly endowed with bad device IP, is so popular... but that is another story...
Don't get me wrong. I'm nevertheless a real supporter of this "hack". But it should be done the "right" way.
Emmanuel.
On Sun, Feb 26, 2017 at 09:41:11PM +0100, Emmanuel Fusté wrote:
Le 26/02/2017 à 15:49, Matthias Reichl a écrit :
On Sun, Feb 26, 2017 at 09:13:09AM +1100, Matt Flax wrote:
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dia.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
My last patch, used the two codec approach, however it was pointed out that the bcm2835 was run in DSP mode with a codec master (rather then IC master) and that the patch doesn't work. Which is clearly true and a problem, it can only work with an intermediate non-codec master.
I think you summed it up well with your statement :
On 25/02/17 Matthias Reichl wrote: If the clock timing adheres to DSP mode A timing and you add code to the the CPU DAI driver so it can operate in DSP mode A then that should also work. If not, it's broken.
Your bcm2835 patch doesn't configure the bcm2835 to DSP mode A, it's still setup for I2S (slave) mode. You are just adding code to pretend it's running in DSP mode A. Don't do that, it's wrong.
This patch set fixes the problem of a daisy chain of three possible masters (CPU <=> IC <=> codec) where only the IC can be master. In fact, when retro fitting DSP mode to old silicon, the CPU can specify which of the three can be masters and there is no chance that someone can fire the system up with the wrong master (which we know produces bit offset and random channel swapping when a codec is master).
Please follow the advice I gave you about 3 weeks ago and model your setup properly.
| So you have bcm2835 I2S <-> FPGA <-> codec - IOW a standard codec<->codec | link. | | What you seem to be missing is just a method to transfer your 8-channel | data via a 2-channel link - userspace want's to see an 8-channel PCM, | but the hardware link (bcm2835-i2s) is only 2-channel. | | And that's where IMO as userspace plugin looks like a very good solution. | It's basically the counterpart of your FPGA and contains the code that's | neccessary to encapsulate/pack/whatever the 8-channel data into a 2-channel | stream so it can then be unpacked to 8-channel by the FPGA. | | If you go this route your hardware and machine driver will work with | other I2S codecs as well, and IMO that's a far better solution than | adding very ugly hacks to a single I2S driver.
If you add an active hardware component (your "IC"/FPGA) you also have to model that in software.
If that component is acting as a clock master it probably has some method to setup clocks. Even if you don't have that, eg if you are running at some fixed rate you'll have to store that information somewhere.
The place to do that is in a codec driver. In your setup it'll look like this:
That "IC" codec has 2 DAIs and operates as a clock master on both. You link one DAI in I2S mode to the bcm2835 and the other DAI in DSP (or whatever mode you are using) to the cs42xx8.
If you model it this way you no longer work against ALSA and you can stop adding hacks to existing drivers.
From the beginning, I completely agree with you when you take the two problems apart:
- for the timing problem : model properly the converter as a codec
- for the encapsulation problem : do the encapsulation / packing with a
userspace plugin
But when you take the whole together, the plugin part seems completely overcomplicated. As the whole is properly modeled, if we could have a simple solution to relax the channel numbers constraint on the I2S on the higher part of the stack all will "magically" work with little effort/complexity.
Otherwise, the user-space packer would have to do a lot of more than packing : interact with private machine driver controls to manage all the channels and the machine driver will need to forward /translate some parts to directly drive the final (cs42xx8) codec. An potentially if such hardware continue to pop-up, each machine driver will need it's own user-space counterpart.
Quite on the contrary. You'd need to add the channel relaxing constraint to all existing I2S drivers to get this working. And I don't see a need to artificially restrict that to a specific codec (cs42xx8) and a specific number of channels. Why only 8 channels, why not 4 or 384?
Doing it as an ALSA plugin makes it reusable with existing drivers and hardware.
The idea behind these modular components (plugins, codecs, dais) is to create reusable components and you don't have to add identical code to all other drivers when you add some new functionality.
Actually, if you add a new feature, you need to have very good reasons to restrict it to a specific driver or change all existing drivers. If you can implement that feature in a generic way without touching existing code it's often the better solution.
Matt is trying to tunnel multichannel PCM over a 2-channel PCM link running at a higher samplerate. I've described a way how this is possible without modifying current code. There are certainly other, probably better, ways to do that. This was a first quick idea how it could be done and I still think it's not too bad.
All the plugin has to do is expose a multi-channel PCM and configure the hw/backend PCM to 2 channels at a higher samplerate (all of which current drivers are already capable of). The plugin settings determine the number of channels, channel map, samplerate factor etc. That same plugin can also be used with other "unpacking codecs" and other channel numbers - you just need to change the plugin configuration in your alsa card conf and tell it you have 4 (or whatever) channels.
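As a rough picture of the packing step itself (plain C, nothing ALSA-specific; the interleaved S32 buffer layout is an assumption):

#include <stddef.h>
#include <stdint.h>

/* Fold N-channel frames into a 2-channel stream running at (N/2) times
 * the rate, so the FPGA/"IC" can unpack them on the far side. */
static void pack_nch_to_2ch(const int32_t *src, int32_t *dst,
			    size_t frames, unsigned int channels)
{
	size_t f;
	unsigned int c;

	for (f = 0; f < frames; f++)
		for (c = 0; c < channels; c += 2) {
			*dst++ = src[f * channels + c];
			*dst++ = src[f * channels + c + 1];
		}
}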
If you need interaction with the backend codec (Matt's "IC"/FPGA) that does PCM unpacking to multichannel you can do that for example in the plugin or in the alsa card.conf via the hooks plugin and alsa controls.
Packing DSD in PCM (DoP) require zero kernel knowledge and could be fully implemented in user-space as we don't change the number of channel assumption of the data part of the format. But here, we simply want the DSP A data semantic with the I2S hardware bus timing.
Yes, and this is what Matt's codec (FPGA) is doing. It's creating that semantic, together with the machine driver and the plugin to set it all up.
I'm a complete newbie to ASoC but I take part to this tread to learn as I hate to see how badly all diy and amateur audio hw are integrated with Alsa/ASoC/Linux and so never go upstream. On a professional/commercial dev, you would never take this ... convoluted I2S multi channel path.
Emmanuel.
Le 27/02/2017 à 10:14, Matthias Reichl a écrit :
On Sun, Feb 26, 2017 at 09:41:11PM +0100, Emmanuel Fusté wrote:
Le 26/02/2017 à 15:49, Matthias Reichl a écrit :
On Sun, Feb 26, 2017 at 09:13:09AM +1100, Matt Flax wrote:
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dia.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
On 27/02/2017 at 10:14, Matthias Reichl wrote:
On Sun, Feb 26, 2017 at 09:41:11PM +0100, Emmanuel Fusté wrote:
On 26/02/2017 at 15:49, Matthias Reichl wrote:
On Sun, Feb 26, 2017 at 09:13:09AM +1100, Matt Flax wrote:
On 26/02/17 00:39, Matthias Reichl wrote:
On Sat, Feb 25, 2017 at 04:03:11PM +1100, Matt Flax wrote:
This patch set lets the ASoC system specify that an IC between the SoC and codec is master. This is intended to put both the SoC and Codec into slave modes.
By default un-patched SoC and Codec drivers will return -EINVAL if they aren't enabled and tested for this mode.
soc-dia.h has the new SND_SOC_DAIFMT_IBM_IFM definition set as : #define SND_SOC_DAIFMT_IBM_IFM (5 << 12) /* IC clk & FRM master */
The cs42xx8 codec driver is enabled for this mode and so too is the BCM2835 SoC driver. This forms a chain : bcm2835<=>IC<=>cs42xx8 where the IC is bit and frame master.
Model your IC as a codec. No need to add patches to random drivers and add a flag with the rather meaningless semantics "someone else is automagically setting up clocks for me".
My last patch used the two-codec approach; however, it was pointed out that the bcm2835 was run in DSP mode with a codec master (rather than IC master) and that the patch doesn't work. That is clearly true and a problem: it can only work with an intermediate non-codec master.
I think you summed it up well with your statement:
On 25/02/17 Matthias Reichl wrote: If the clock timing adheres to DSP mode A timing and you add code to the CPU DAI driver so it can operate in DSP mode A then that should also work. If not, it's broken.
Your bcm2835 patch doesn't configure the bcm2835 for DSP mode A; it's still set up for I2S (slave) mode. You are just adding code to pretend it's running in DSP mode A. Don't do that, it's wrong.
This patch set fixes the problem of a daisy chain of three possible masters (CPU <=> IC <=> codec) where only the IC can be master. In fact, when retrofitting DSP mode to old silicon, the CPU can specify which of the three may be master, and there is no chance that someone fires the system up with the wrong master (which we know produces bit offsets and random channel swapping when a codec is master).
Please follow the advice I gave you about 3 weeks ago and model your setup properly.
| So you have bcm2835 I2S <-> FPGA <-> codec - IOW a standard codec<->codec
| link.
|
| What you seem to be missing is just a method to transfer your 8-channel
| data via a 2-channel link - userspace wants to see an 8-channel PCM,
| but the hardware link (bcm2835-i2s) is only 2-channel.
|
| And that's where IMO a userspace plugin looks like a very good solution.
| It's basically the counterpart of your FPGA and contains the code that's
| necessary to encapsulate/pack/whatever the 8-channel data into a 2-channel
| stream so it can then be unpacked to 8-channel by the FPGA.
|
| If you go this route your hardware and machine driver will work with
| other I2S codecs as well, and IMO that's a far better solution than
| adding very ugly hacks to a single I2S driver.
If you add an active hardware component (your "IC"/FPGA) you also have to model that in software.
If that component is acting as a clock master it probably has some method to set up clocks. Even if you don't have that, e.g. if you are running at some fixed rate, you'll have to store that information somewhere.
The place to do that is in a codec driver. In your setup it'll look like this:
That "IC" codec has 2 DAIs and operates as a clock master on both. You link one DAI in I2S mode to the bcm2835 and the other DAI in DSP (or whatever mode you are using) to the cs42xx8.
If you model it this way you no longer work against ALSA and you can stop adding hacks to existing drivers.
From the beginning, I completely agree with you when the two problems are taken apart:
- for the timing problem: properly model the converter as a codec
- for the encapsulation problem: do the encapsulation/packing with a userspace plugin
But when you put the whole thing together, the plugin part seems completely overcomplicated. As the whole is properly modelled, if we could have a simple way to relax the channel-count constraint on the I2S link at the higher part of the stack, everything would "magically" work with little effort/complexity.
Otherwise, the user-space packer would have to do a lot more than packing: interact with private machine-driver controls to manage all the channels, and the machine driver would need to forward/translate some parts to directly drive the final (cs42xx8) codec. And potentially, if such hardware continues to pop up, each machine driver will need its own user-space counterpart.
Quite the contrary. You'd need to add the relaxed channel constraint to all existing I2S drivers to get this working. And I don't see a need to artificially restrict that to a specific codec (cs42xx8) and a specific number of channels. Why only 8 channels, why not 4 or 384?
Doing it as an ALSA plugin makes it reusable with existing drivers and hardware.
The idea behind these modular components (plugins, codecs, DAIs) is reusability: you don't have to add identical code to all the other drivers when you add some new functionality.
Actually, if you add a new feature, you need to have very good reasons to restrict it to a specific driver or change all existing drivers. If you can implement that feature in a generic way without touching existing code it's often the better solution.
Matt is trying to tunnel multichannel PCM over a 2-channel PCM link running at a higher samplerate. I've described a way this is possible without modifying current code. There are certainly other, probably better, ways to do it. This was a first quick idea of how it could be done, and I still think it's not too bad.
All the plugin has to do is expose a multi-channel PCM and configure the hw/backend PCM to 2 channels at a higher samplerate (all of which current drivers are already capable of). The plugin settings determine the number of channels, channel map, samplerate factor etc. That same plugin can also be used with other "unpacking codecs" and other channel numbers - you just need to change the plugin configuration in your alsa card conf and tell it you have 4 (or whatever) channels.
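To make the data side of that concrete, a rough sketch of the per-period packing step could look like this (plain C, not tied to any particular plugin API; pack_multich_to_stereo is a made-up name, and it assumes interleaved 32-bit samples and an even channel count):

#include <stdint.h>

/*
 * Repack one period of interleaved multichannel frames into 2-channel
 * frames running at (in_channels / 2) times the sample rate.
 * in_channels is assumed to be even.
 */
static void pack_multich_to_stereo(const int32_t *in, int32_t *out,
				   unsigned int frames,
				   unsigned int in_channels)
{
	unsigned int f, c;

	for (f = 0; f < frames; f++) {
		for (c = 0; c < in_channels; c++) {
			/*
			 * Sample c of input frame f becomes channel (c % 2)
			 * of output stereo frame (f * in_channels / 2 + c / 2).
			 * For interleaved buffers this index works out to the
			 * same position as the input sample, so the copy is
			 * effectively a pass-through - the plugin's real work
			 * is renegotiating channels and rate on the slave PCM.
			 */
			out[(f * (in_channels / 2) + c / 2) * 2 + (c % 2)] =
				in[f * in_channels + c];
		}
	}
}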
If you need interaction with the backend codec (Matt's "IC"/FPGA) that does PCM unpacking to multichannel you can do that for example in the plugin or in the alsa card.conf via the hooks plugin and alsa controls.
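For example, a card.conf fragment along these lines could set the relevant codec controls whenever the PCM is opened (the device name "hw:sndic" and the control name are placeholders, not taken from any real driver):

# Hypothetical fragment: "hw:sndic" and "IC Unpack Mode" stand in for the
# real card and whatever control the "IC" codec driver actually exposes.
pcm.multich_backend {
	type hooks
	slave.pcm "hw:sndic"
	hooks.0 {
		type ctl_elems
		hook_args [
			{
				name "IC Unpack Mode"
				preserve true
				value 1
			}
		]
	}
}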
Packing DSD in PCM (DoP) requires zero kernel knowledge and could be fully implemented in user space, as we don't change the channel-count assumption of the data part of the format. But here we simply want the DSP A data semantics with the I2S hardware bus timing.
Yes, and this is what Matt's codec (FPGA) is doing. It's creating those semantics, together with the machine driver and the plugin to set it all up.
Ok I'm convinced.
Thank you for your detailed explanation.
Emmanuel.
PS: 2nd try, sorry for the html email.
participants (7)
- Charles Keepax
- Emmanuel Fusté
- Lars-Peter Clausen
- Mark Brown
- Matt Flax
- Matthias Reichl
- Stephen Warren