[alsa-devel] [PATCH 1/2] ASoC: Allow drivers to specify how many bits are significant on a DAI

Mark Brown broonie at opensource.wolfsonmicro.com
Tue Jan 17 14:19:50 CET 2012


On Tue, Jan 17, 2012 at 02:06:44PM +0100, Peter Ujfalusi wrote:

> I can only speak on behalf of OMAP, twl4030 and tlv320dac33 here, but
> the 24-msbit limit only applies to 32-bit samples.  16-bit samples are
> not converted in any way; they are processed as 16-bit data.
> So saying 24 msbits for a 16-bit sample would be incorrect; it is only
> correct for a 32-bit sample.
> I would think most of the codecs/CPUs work in the same way.
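
For illustration (this is a sketch, not part of the patch): the
conditional behaviour Peter describes is something the existing ALSA
constraint API can already express, since the width argument of
snd_pcm_hw_constraint_msbits() restricts the rule to one physical
sample size.  A minimal sketch with made-up names:

#include <sound/pcm.h>
#include <sound/soc.h>

/* Limit 32-bit samples to 24 significant bits; 16-bit streams are
 * completely unaffected because the rule only fires when the
 * physical sample width is 32. */
static int example_dai_startup(struct snd_pcm_substream *substream,
			       struct snd_soc_dai *dai)
{
	return snd_pcm_hw_constraint_msbits(substream->runtime,
					    0,  /* no extra condition */
					    32, /* physical width */
					    24);/* significant bits */
}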

For the CODECs, if you look at what they're doing you'll probably find
that the device is actually operating at a fixed sample size internally
and converting the data somehow at the interface (zero extension being
one option when converting up, but fancier approaches are also
possible).  This is fairly obvious when you think about how things are
likely to be implemented in hardware: being able to genuinely switch
the entire chip from one sample size to another would increase the
complexity a lot.
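
A sketch of how a CODEC driver might describe that with the sig_bits
field this patch adds (all names here are made up for illustration):
the part accepts 16- and 32-bit slots on the wire, but its internal
datapath is fixed at 24 bits.

static struct snd_soc_dai_driver example_codec_dai = {
	.name = "example-hifi",
	.playback = {
		.stream_name	= "Playback",
		.channels_min	= 2,
		.channels_max	= 2,
		.rates		= SNDRV_PCM_RATE_8000_96000,
		.formats	= SNDRV_PCM_FMTBIT_S16_LE |
				  SNDRV_PCM_FMTBIT_S32_LE,
		/* internal datapath width, regardless of wire format */
		.sig_bits	= 24,
	},
};

Per Peter's point above, the core then has to apply the 24-bit limit
only when the 32-bit format is actually in use.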

On the CPU side specifying significant bits would normally only be
appropriate on PDM interfaces, as they have most of a DAC or ADC in
them to move between the sampled and PDM formats.  I'd be surprised to
see anything else setting these flags; most of the hardware is just
passing the data straight through.
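
One plausible core-side policy (again a sketch under assumptions, not
the patch itself) would be to take the smaller of the CPU and CODEC
declared values, treat zero as "no limit declared", and apply the
result only to 32-bit samples so that plain pass-through paths stay
untouched:

#include <linux/kernel.h>
#include <sound/pcm.h>

static void example_apply_msb(struct snd_pcm_substream *substream,
			      unsigned int cpu_bits,
			      unsigned int codec_bits)
{
	unsigned int bits;

	/* Zero means the DAI declared no limit. */
	if (!cpu_bits || !codec_bits)
		bits = max(cpu_bits, codec_bits);
	else
		bits = min(cpu_bits, codec_bits);

	if (bits)
		snd_pcm_hw_constraint_msbits(substream->runtime,
					     0, 32, bits);
}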

