Trent Piepho wrote:
Say an application does not write a full multiple of the period size with snd_pcm_writei(). It is playing a clip, and the clip doesn't happen to end on a multiple of 6000 frames (or whatever the period size is).
Now the application calls snd_pcm_drain() to wait for the clip to finish.
How does the driver know to stop when it gets to the end of the supplied data?
It doesn't.
It seems like the driver will keep playing until it calls snd_pcm_period_elapsed() at the end of the final period. Then the ALSA PCM layer will call the ->trigger() method and stop the stream.
Yes.
Hopefully before the driver has started the next DMA for the next period!
Let's hope that the FIFOs are large enough so that the actual playback hasn't yet reached the period boundary ...
I don't see any documentation for snd_pcm_drain() that says it will keep playing until it finishes a period, even if the data runs out beforehand. "For playback wait for all pending frames to be played and then stop the PCM." That seems pretty clear that only the pending frames are played, not the rest of a period.
ALSA will stop the device when it realizes that it has run out of data (just like an underrun). This might happen before the period boundary if there is some call to snd_pcm_status() or _delay() that reads the current position (and if the hardware also supports reading the current position).
It's racy and undocumented, but there's nothing a driver can do about this.
Regards, Clemens