[alsa-devel] [PATCH 1/2] ASoC: Allow drivers to specify how many bits are significant on a DAI
broonie at opensource.wolfsonmicro.com
Tue Jan 17 15:56:47 CET 2012
On Tue, Jan 17, 2012 at 03:18:35PM +0100, Peter Ujfalusi wrote:
> It is mostly true. The DAC33 can be configured to operate internally at
> 16 bits or at 24 significant bits in a 32-bit slot, for example.
Well, if it's doing something more complicated that doesn't fit in the
framework then it shouldn't be doing that.
> It does not give any useful information to applications that the codec
> will upsample the 16-bit data internally to 24 bits. It does not really
> matter to them, since all of the 16-bit data will be used by the codec.
Oh, I dunno - I'm sure someone could think of a use for it.
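For reference, with the interface this series proposes a driver would state this directly in its DAI definition. A sketch of what a DAC33-like codec might declare, assuming the significant-bits field added by this patch; the names and rate/format values here are illustrative, not taken from the real tlv320dac33 driver, and this is not a buildable fragment:

```c
/* Sketch only: assumes the significant-bits field this series adds to
 * struct snd_soc_pcm_stream; values are hypothetical. */
static struct snd_soc_dai_driver dac33_dai = {
	.name = "dac33-hifi",
	.playback = {
		.stream_name  = "Playback",
		.channels_min = 2,
		.channels_max = 2,
		.rates        = SNDRV_PCM_RATE_44100 | SNDRV_PCM_RATE_48000,
		.formats      = SNDRV_PCM_FMTBIT_S16_LE | SNDRV_PCM_FMTBIT_S32_LE,
		.sig_bits     = 24,	/* only the 24 MSBs are significant */
	},
};
```

With that declared, the core can constrain the stream's msbits so applications see the real precision instead of the 32-bit container width.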
> > On the CPU side specifying significant bits would normally only be
> > appropriate on PDM interfaces as they have most of a DAC or ADC in them
> > to move between the sampled and PDM formats. I'd be surprised to see
> > anything else setting these flags, most of the hardware is just passing
> > the data straight through.
> True, the CPU side mostly passes the data through as-is; it does not care
> about msbits. For McPDM it is different, since the internal FIFO uses
> 24-bit word lines, so if an application were to use all 32 bits it would lose the
Right, like I say that's because it's got most of a DAC in it.
> 8 LSBs. This can make a difference for the PA when applying digital gain
> in SW.
Well, it saves it a bit of effort but that's about it.