At Mon, 24 Aug 2009 20:38:15 +0200, Manuel Jander wrote:
Hi Clemens,
The question is, who decides those default values?
snd_pcm_hw_params_choose() in alsa-lib/src/pcm/pcm_params.c:

/* Choose one configuration from configuration space defined by PARAMS
   The configuration chosen is that obtained fixing in this order:
   first access
   first format
   first subformat
   min channels
   min rate
   min period time
   max buffer size
   min tick time */
Thanks, that information is really helpful. But why would it be a good idea to pick the lowest period time?
Traditionally, the best way to avoid buffer underruns has been to use the smallest period size together with the largest buffer size: the small period wakes the application up as often as possible, while the large buffer allows it to sleep as long as possible without running dry.
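As a rough illustration, here is a minimal alsa-lib sketch (error checking omitted; the "default" device and the S16/48kHz/stereo values are only examples) that pins down the stream parameters in the same order as snd_pcm_hw_params_choose(), but then explicitly asks for the smallest period time and the largest buffer size instead of leaving the choice to the defaults:

    #include <alsa/asoundlib.h>

    int main(void)
    {
        snd_pcm_t *pcm;
        snd_pcm_hw_params_t *hw;
        unsigned int rate = 48000, period_time;
        snd_pcm_uframes_t buffer_size;
        int dir;

        snd_pcm_open(&pcm, "default", SND_PCM_STREAM_PLAYBACK, 0);
        snd_pcm_hw_params_alloca(&hw);
        snd_pcm_hw_params_any(pcm, hw);

        /* fix access, format, channels and rate first, like choose() does */
        snd_pcm_hw_params_set_access(pcm, hw, SND_PCM_ACCESS_RW_INTERLEAVED);
        snd_pcm_hw_params_set_format(pcm, hw, SND_PCM_FORMAT_S16_LE);
        snd_pcm_hw_params_set_channels(pcm, hw, 2);
        snd_pcm_hw_params_set_rate_near(pcm, hw, &rate, NULL);

        /* smallest period time still in the configuration space ... */
        snd_pcm_hw_params_set_period_time_first(pcm, hw, &period_time, &dir);
        /* ... and the largest buffer size that goes with it */
        snd_pcm_hw_params_set_buffer_size_last(pcm, hw, &buffer_size);

        snd_pcm_hw_params(pcm, hw);   /* install the configuration */
        snd_pcm_close(pcm);
        return 0;
    }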
Maybe a given sound card is able to do 4-byte periods, and all the low-delay freaks will start drooling over that fact, but obviously it comes with a high interrupt rate and thus more CPU consumption. But OK, if you want low delay, you gotta pay for it.
But 4-byte periods are still unrealistic; no Linux machine could keep up with that in practical use.
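To put a rough number on it (assuming 16-bit stereo, where one frame is exactly 4 bytes): a 4-byte period at a 48 kHz rate would mean

    48000 frames/s * 1 period/frame = 48000 interrupts/s,

i.e. one period interrupt about every 21 us. The interrupt handler and the scheduler wakeup alone already cost time on that order, so the machine would do little but service the sound card.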
Takashi