On Mon, Apr 8, 2013 at 4:37 PM, Pierre-Louis Bossart <pierre-louis.bossart@linux.intel.com> wrote:
Of course you're right. That's annoying; it will have to convert from ms to frames (in get_delay) and then to ns here. Converting from frames to ns here is a better idea, but for this particular codec, all I have is ms resolution.
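Concretely, I was thinking of something like this for the ms -> frames step in get_delay() (a rough sketch; the function and parameter names are made up for illustration, not from the patch):

#include <linux/types.h>

/*
 * ms -> frames at the current sample rate. delay_ms is whatever
 * the codec reports; rate is the stream's sample rate.
 */
static inline u64 codec_delay_ms_to_frames(u32 delay_ms, u32 rate)
{
	return (u64)delay_ms * rate / 1000;
}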
> This is odd, looks completely arbitrary...
It does. I'll change this patch to convert from frames to ns, and check whether the codec vendor can supply more accurate DSP latency numbers.
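For the frames -> ns side, something along these lines (again just a sketch with invented names; div_u64() keeps the division safe on 32-bit platforms):

#include <linux/types.h>
#include <linux/math64.h>
#include <linux/time.h>	/* NSEC_PER_SEC */

static inline u64 codec_delay_frames_to_ns(u64 delay_frames, u32 rate)
{
	/* frames * 1e9 / rate; fits in u64 for any realistic delay */
	return div_u64(delay_frames * NSEC_PER_SEC, rate);
}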
>> Pierre, what should the capture timestamp represent? When the sample hits the A-to-D, or when it is read out of the buffer?
> When the samples hit the A-to-D, as close as possible to the input (or the serial link if the codec doesn't report delay).
OK, the codec latency will be subtracted from both the playback and capture timestamps.
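On the capture path that would look roughly like this (sketch only, names invented): report the read-out time minus the total codec latency, which puts the timestamp at the A-to-D as you suggest.

#include <linux/types.h>

static inline u64 capture_tstamp_ns(u64 readout_ns, u64 codec_delay_ns)
{
	/* the sample entered the A-to-D codec_delay_ns before read-out */
	return readout_ns - codec_delay_ns;
}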
Thanks for the clarifications!
-dg
> -Pierre