At Mon, 19 Jul 2010 18:09:47 +0100, Mark Brown wrote:
On Mon, Jul 19, 2010 at 06:44:31PM +0200, Takashi Iwai wrote:
The guesswork isn't always trivial. If the hardware switches the curve between log and linear across a hole, which of the two should be assumed for a value inside the hole? When a TLV is given for the hole, at least it's a good hint.
I don't see what difference the curves make here? However the gap is expressed, only the points at either end are selectable (and there's always the chance that this is part of a series of random individual points with no particular curve through them).
Right, disregard my comment.
Personally I'd rather the kernel just provided a mapping between control values and dB scales, and then the application layer can decide what it thinks an appropriate match is. Decisions about how accurate a match needs to be seem very policy-like and therefore userspaceish, and it seems wrong to be coding the data tables in the kernel in a way that works around the particular match algorithm in the application layer.
User space can always check correctness by doing the reverse conversion between dB and value again.
It still seems silly to have to paper over this for every single dB interval - if we need to link up the ranges it seems like we ought to be fixing this in one place.
Well, what doesn't feel right to me is that this breaks the simplicity of DB_RANGE handling. It's just damn simple, and works as long as the target point is covered by the defined ranges. If something is wrong, it's the definition, not the parser. That's why I wondered first whether it couldn't be fixed on the driver side. I want the parser stuff as simple as possible.
So, practically speaking, how many drivers are buggy in this regard? And for how many of them would a new TLV entry need to be added? The only really bad case is the discontinuous lines you mentioned. Is there a real device that behaves that way (and is implemented that way)?
If fixing in alsa-lib is a better compromise, I'd take it. But I'd like to hear numbers first.
thanks,
Takashi