At Tue, 06 Nov 2007 14:47:38 +0000, James Courtier-Dutton wrote:
> Takashi Iwai wrote:
> > removing the hack there. Maybe not. But, I feel it's a barren
> > discussion. It's really a design problem. Sigh.
> Are any steps being taken to change the design so that sample rate
> conversion works better? The options are:
> - get the sound card hardware to produce time-based interrupts for the
>   rate plugin to use as its period trigger. This is how the old OSS
>   drivers did it.
It doesn't necessarily have to be the sound card that generates the interrupts. It can be simply any timer. We have hrtimer now, so the material to cook with is there. We'd need an exact synchronization, though, for example with a timing correction a la PLL.
> - allow the user application to use different buffer/period sizes than
>   the hardware itself.
This is the best option, together with an arbitrary interrupt source.
> - Try to encourage applications not to use the pcm_rate plugin at
>   all!!! Instead force each application to do its own sample rate
>   conversion to match what the hardware can do.
No, this cannot be accepted in the current situation. It's 100 steps backward. If we really did this, we should get rid of the whole plugin layer and provide only very bare stuff. That is, it'd be better to re-design the whole ALSA API from scratch.
Takashi