[alsa-devel] DMA buffer gets played only once

Trent Piepho xyzzy at speakeasy.org
Thu Sep 6 07:14:17 CEST 2007


On Thu, 6 Sep 2007, Markus Franke wrote:
> Takashi Iwai wrote:
> >> At least this is the behaviour I experienced.
> >
> > It's your misinterpretation of the START/STOP concept in the ALSA
> > framework.  The trigger START and STOP mean the start/stop of the
> > whole streaming operation.  It's basically called from the outside,
> > i.e. the application starts/stops the stream.  If you need to keep
> > some DMA start/stop operations internally, do it in the driver
> > lowlevel side internally.
>
> thanks for the reply, but as already stated, this is the behaviour I
> _experienced_. I don't know why ALSA continuously calls trigger START,
> plays one period, then calls trigger STOP. Here is some pseudo code:
>
>
> vi1888_pcm_dma_userCallback()   // gets called upon completion of DMA transfer
> {
>          prtd->period_ptr += prtd->period_size;
>          if (prtd->period_ptr >= prtd->dma_buffer_end) {
>                  prtd->period_ptr = prtd->dma_buffer;
>          }

You need to make sure the buffer size is an integer multiple of the period
size if you wrap like this.
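
If your hardware setup doesn't guarantee that on its own, you can ask ALSA to
enforce it in the open callback with snd_pcm_hw_constraint_integer().  A
minimal sketch (the callback name is just made up to match your driver's
prefix):

#include <sound/pcm.h>

static int vi1888_pcm_open(struct snd_pcm_substream *substream)
{
        struct snd_pcm_runtime *runtime = substream->runtime;
        int err;

        /* force buffer_size to be an integer multiple of period_size */
        err = snd_pcm_hw_constraint_integer(runtime,
                                            SNDRV_PCM_HW_PARAM_PERIODS);
        if (err < 0)
                return err;

        return 0;
}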

> After each call to snd_pcm_period_elapsed() ALSA calls
> pcm_trigger(TRIGGER_STOP).
>
> Please tell me what I am doing wrong in these callbacks?
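
What Takashi described means the trigger callback should only start or stop
the whole stream; queueing the next period belongs in your DMA completion
callback, not in trigger.  Roughly something like this (just a sketch: the
prtd fields are taken from your pseudo code, and the vi1888_dma_* helpers are
placeholders for whatever your hardware layer provides):

static int vi1888_pcm_trigger(struct snd_pcm_substream *substream, int cmd)
{
        struct vi1888_runtime_data *prtd = substream->runtime->private_data;

        switch (cmd) {
        case SNDRV_PCM_TRIGGER_START:
                /* start the first period; the DMA completion callback
                 * queues every following period by itself */
                prtd->period_ptr = prtd->dma_buffer;
                vi1888_dma_start(prtd, prtd->period_ptr, prtd->period_size);
                break;
        case SNDRV_PCM_TRIGGER_STOP:
                vi1888_dma_stop(prtd);
                break;
        default:
                return -EINVAL;
        }

        return 0;
}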

I found that ALSA would call TRIGGER_STOP if it detected an xrun (for
playback, an underrun).

Check the value returned by every call to your pointer callback.  I found
that ALSA was calling the pointer callback before the first IRQ, which was
something I hadn't planned on, and the values it got back confused ALSA.
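
Something along these lines worked for me; the names again come from your
pseudo code, so treat them as placeholders.  The important part is that it
returns a valid offset inside the buffer even before the first DMA
completion:

static snd_pcm_uframes_t vi1888_pcm_pointer(struct snd_pcm_substream *substream)
{
        struct snd_pcm_runtime *runtime = substream->runtime;
        struct vi1888_runtime_data *prtd = runtime->private_data;
        ssize_t bytes;

        /* byte offset of the period the DMA is currently working on;
         * before the first completion this is simply 0 */
        bytes = prtd->period_ptr - prtd->dma_buffer;

        return bytes_to_frames(runtime, bytes);
}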
