[alsa-devel] How does ALSA detect underrun in OSS emulation?
timur at freescale.com
Mon Feb 25 16:22:41 CET 2008
Can someone explain the exact method that ALSA uses to detect an underrun
with OSS emulation? I have a driver that reports an underrun on almost every
period when I use OSS emulation and play audio at an unsupported sample
rate. So I'm assuming that my hardware is playing the audio too fast or
too slow, and the driver is returning periods before ALSA expects them. What I
don't understand is: how does ALSA know that a period finished too early?
Does it use a timer, or is it purely application driven?
Linux kernel developer at Freescale