Hi Marcin,
Thanks for your answer. A few comments inline.
What is the exact usage of local buffer in codec_adapter component?
[Marcin Rajwa] With the current implementation, the local buffer plays a crucial role when the codec input buffer size is different from the pipeline period. In that case we need to store enough samples before we actually start producing samples downstream; otherwise we are prone to glitches.
The commit below says:

commit 44137fad6c4a8e5dbf4ae8a959885086f3a81748
codec_adapter: add local buffer
This patch adds a local buffer to codec adapter. The aim of this buffer is to damp any irregularity between the pipeline and the codec. The difference in buffer sizes is one example: thanks to the additional local buffer we can call codec processing several times during one period, or we can store several pipeline periods in the local buffer and process them at once.
In the case of decoder algorithms, is this used as an input or an output buffer?
[Marcin Rajwa] Well, as you know, no decoding/encoding functionality is officially supported yet. However, this buffer is supposed to be an output buffer. Your input buffer is really the source buffer of codec_adapter defined in the topology; make sure it is big enough to store all the samples needed to start decoding. Now, imagine you have decoded four periods of samples (this being the starting threshold) and produced them all to the DAI. What happens next? You will produce nothing for the following four periods, so the DAI will report underruns; the issue is apparent. This is where the local buffer, together with the deep buffering feature of codec_adapter, comes in as a cure: you decode samples into the local buffer but only produce them out once a sufficient amount has been collected, so we can ensure each pipeline period gets its period of data.
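The accumulate-then-produce behaviour described above can be sketched roughly as follows. This is a simplified illustration, not the actual codec_adapter code: the constants, struct fields, and function names here are hypothetical, and the real implementation operates on audio_stream objects rather than flat arrays.

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

/* Illustrative numbers only, not the real SOF values. */
#define PERIOD_BYTES      192	/* one pipeline period of PCM data */
#define DEEP_BUFF_PERIODS 4	/* periods gathered before any output */

struct local_buff {
	unsigned char data[PERIOD_BYTES * 8];
	size_t avail;		/* bytes of processed data stored */
	size_t deep_buff_bytes;	/* threshold; 0 once buffering ends */
};

/* Store freshly decoded samples into the local buffer. */
static void store_decoded(struct local_buff *b, const unsigned char *src,
			  size_t bytes)
{
	memcpy(b->data + b->avail, src, bytes);
	b->avail += bytes;
}

/* Copy one period to the sink, but only once enough data was gathered. */
static bool copy_period_out(struct local_buff *b, unsigned char *sink)
{
	if (b->deep_buff_bytes && b->avail < b->deep_buff_bytes)
		return false;	/* still deep buffering: produce nothing */
	b->deep_buff_bytes = 0;	/* buffering ends once threshold is met */
	if (b->avail < PERIOD_BYTES)
		return false;	/* not enough processed data this period */
	memcpy(sink, b->data, PERIOD_BYTES);
	memmove(b->data, b->data + PERIOD_BYTES, b->avail - PERIOD_BYTES);
	b->avail -= PERIOD_BYTES;
	return true;
}
```

With DEEP_BUFF_PERIODS of headroom stored up front, the decoder can fall behind by up to that many periods before the DAI would see an underrun.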
DB: I am trying to add decoding functionality on top of your codec_adapter code. The difference is that (just as an example) for X bytes of compressed input data I get 10X bytes of PCM data at the output.
So, I need to find a good balance between the source buffer size, the local output buffer size, and the moment when I should call the process function.
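As a rough illustration of that sizing trade-off: with an expansion ratio R (PCM bytes out per compressed byte in), one process call covers R * in_bytes / PERIOD_BYTES pipeline periods, and the local output buffer must hold at least one call's worth of output plus any leftover. The ratio, period size, and helper names below are hypothetical, not taken from the SOF sources.

```c
#include <assert.h>

/* Illustrative numbers only, not taken from the SOF sources. */
#define EXPANSION_RATIO 10	/* 1 byte of compressed input -> 10 bytes PCM */
#define PERIOD_BYTES    192	/* pipeline period size */

/* PCM bytes produced by one process call on in_bytes of input. */
static unsigned out_bytes_per_call(unsigned in_bytes)
{
	return in_bytes * EXPANSION_RATIO;
}

/* Pipeline periods covered by one process call (rounded down). */
static unsigned periods_per_call(unsigned in_bytes)
{
	return out_bytes_per_call(in_bytes) / PERIOD_BYTES;
}

/* Minimum local output buffer: one call's worth of PCM plus up to one
 * period of leftover carried over from the previous call. */
static unsigned min_local_buff_bytes(unsigned in_bytes)
{
	return out_bytes_per_call(in_bytes) + PERIOD_BYTES;
}
```

For example, feeding one period (192 bytes) of compressed data per call would cover ten pipeline periods of output, so process would only need to be called once every ten periods.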
It looks like deep buffering only happens at the start of the stream; then cd->deep_buff_bytes is set to 0:

	comp_dbg(dev, "codec_adapter_copy(): deep buffering has ended after gathering %d bytes of processed data",
		 audio_stream_get_avail_bytes(&local_buff->stream));
	cd->deep_buff_bytes = 0;

and it is never changed afterwards, so the stream is never deep-buffered again.
I will add a topic about compress decoding for our next week meeting.