Suppose an application does not write a full multiple of the period size with snd_pcm_writei(): it's playing a clip, and the clip doesn't happen to end on a multiple of 6000 frames or whatever the period size is.
Now the application calls snd_pcm_drain() to wait for the clip to finish.
How does the driver know to stop when it reaches the end of the supplied data? I don't see where this is handled. It seems like the driver will keep playing until it calls snd_pcm_period_elapsed() at the end of the final period. Then the ALSA PCM layer will call the ->trigger() method and stop the stream, hopefully before the driver has started the DMA for the next period! Since the application only wrote data for part of the final period, the driver will have played past the end of the supplied data.
I don't see any documentation saying you must always supply a multiple of the period size to snd_pcm_write[in](), nor anything about how to deal with non-blocking mode, where only part of the data may get written.
I don't see any documentation for snd_pcm_drain() saying it will keep playing until it finishes a period even if the data runs out beforehand. "For playback wait for all pending frames to be played and then stop the PCM." That seems pretty clear: only the pending frames are played, not the rest of a period.
Yet the driver I'm working on definitely keeps playing to the end of the period on drain. It could be a driver problem, but I don't see how any other drivers do anything differently here.