On Mon, Apr 8, 2013 at 9:34 AM, Takashi Iwai <tiwai@suse.de> wrote:
> At Mon, 08 Apr 2013 11:31:12 -0500, Pierre-Louis Bossart wrote:
> > > 	if (hinfo->ops.get_delay) {
> > > 		codec_nsec =
> > > 			hinfo->ops.get_delay(hinfo, codec, substream) * 1000000;
> > > 		if (stream == SNDRV_PCM_STREAM_CAPTURE)
> > > 			nsec = (nsec > codec_nsec) ? nsec - codec_nsec : 0;
> > > 		else if (stream == SNDRV_PCM_STREAM_PLAYBACK)
> > > 			nsec += codec_nsec;
> >
> > Can the .get_delay callback be modified to provide better resolution than a millisecond? If you already convert to time, microseconds would seem like a better fit. Your codec seems to report frames (i.e. 20.83 us).
>
> Actually the patch must be wrong -- get_delay returns the delay in frame (sample) units.
Of course, you're right. That's annoying: the codec's get_delay will have to convert from ms to frames, and this code will then convert frames to ns. Converting from frames to ns here is a better idea, but for this particular codec, all I have is ms resolution.
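
Just so we're talking about the same conversion, here is a rough sketch of what I mean by converting frames to ns here; codec_delay_to_nsec is a made-up helper name, and using div_u64 and the runtime rate is only an assumption on my side, not something from the posted patch:

/*
 * Sketch only (not the posted patch): convert a delay reported in
 * frames by .get_delay into nanoseconds using the stream rate,
 * instead of treating the value as milliseconds.
 */
#include <linux/math64.h>

static u64 codec_delay_to_nsec(unsigned int delay_frames, unsigned int rate)
{
	if (!rate)
		return 0;
	/* frames -> ns: frames * 1e9 / rate, done in 64 bits to avoid overflow */
	return div_u64((u64)delay_frames * 1000000000ULL, rate);
}

With something like that in place, and assuming substream->runtime->rate is valid at this point, the hunk above would become:

	codec_nsec = codec_delay_to_nsec(hinfo->ops.get_delay(hinfo, codec, substream),
					 substream->runtime->rate);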
Pierre, what should the capture timestamp represent? The point when the sample hits the A-to-D converter, or when it is read out of the buffer?
Thanks,
-dg