[alsa-devel] [PATCH 1/2] ASoC: Allow drivers to specify how many bits are significant on a DAI

Peter Ujfalusi peter.ujfalusi at ti.com
Tue Jan 17 19:51:29 CET 2012


On 01/17/2012 07:17 PM, Mark Brown wrote:
> Off the top of my head it may decide that if it happens to have 24 bit
> data then passing it straight through was sensible.  But frankly I'm
> just working on the basis that if it's more effort to hide information
> than to provide it then providing it seems like the way forwards.  I've
> certainly got no intention of writing any code here myself unless
> there's some issue found.

We have the sample formats (S16_LE, S32_LE, etc.) to tell applications
about the supported bit depths.
They can choose 16-bit if that is better for them, or 32-bit; they make
the decision based on the sample being played, the configuration,
whatever.
It does not help them if, when they use 16-bit audio, we tell them:
hey, you could use 24-bit.
When they use 32-bit, on the other hand, it makes sense to let them
know that out of the 32 bits only the 24 MSBs will be used, so they can
align their processing accordingly (if they care).
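
For what it's worth, alsa-lib already exposes this to applications via
snd_pcm_hw_params_get_sbits(); whether it reports anything useful of
course depends on the driver actually applying the msbits constraint.
A minimal sketch (the "default" device and S32_LE support are assumed,
error handling mostly omitted):

#include <alsa/asoundlib.h>
#include <stdio.h>

int main(void)
{
	snd_pcm_t *pcm;
	snd_pcm_hw_params_t *params;

	if (snd_pcm_open(&pcm, "default", SND_PCM_STREAM_PLAYBACK, 0) < 0)
		return 1;

	snd_pcm_hw_params_alloca(&params);
	snd_pcm_hw_params_any(pcm, params);
	/* assume the device accepts 32-bit samples */
	snd_pcm_hw_params_set_format(pcm, params, SND_PCM_FORMAT_S32_LE);
	snd_pcm_hw_params(pcm, params);

	/* reports e.g. 24 when only the 24 MSBs are significant */
	printf("significant bits: %d\n",
	       snd_pcm_hw_params_get_sbits(params));

	snd_pcm_close(pcm);
	return 0;
}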

>> Yeah, but it is not correct. If it does not know this, we have 8 bits
>> of 'latency' in gain control. Pulse can change the gain and it will
>> have no effect.
> 
> Which will have the same overall effect as if it doesn't do anything
> based on knowing the resolution.

But we still 'lose' 8 bits. It might not be a big loss in the end when
PA, for example, applies the gain, but we will certainly lose
resolution (8 bits' worth).
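
To put rough numbers on those 8 bits, a toy sketch (illustrative values
only): attenuate a full-scale sample by ~48 dB (divide by 256) in a
16-bit path versus a 32-bit path feeding a DAC that uses the 24 MSBs:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
	int16_t s16 = INT16_MAX;	/* full scale, 16-bit path       */
	int32_t s32 = INT32_MAX;	/* full scale, 24-in-32-bit path */

	int16_t a16 = s16 / 256;	/* 0x7f: ~7 usable bits left     */
	int32_t a32 = s32 / 256;	/* 0x7fffff: the DAC still sees  */
					/* ~15 usable bits in its MSBs   */

	printf("16-bit path: %#x, 32-bit path: %#x (DAC sees %#x)\n",
	       a16, a32, a32 >> 8);
	return 0;
}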

My only problem is with saying this to the application for 8/16-bit
samples: "out of 8/16 bits you should use the 24 MSBs". AFAIK that is
the meaning of the constraint. The constraint makes sense for 32-bit
samples: "out of 32 bits you should use the 24 MSBs".

-- 
Péter

