
On Tue, Oct 23, 2012 at 11:00 AM, Jamey Drennan jamey.drennan@gmail.com wrote:
On Tue, Oct 23, 2012 at 2:35 AM, Clemens Ladisch clemens@ladisch.de wrote:
(quoting fixed; please don't top-post) Jamey Drennan wrote:
On Mon, Oct 22, 2012 at 4:45 PM, Florian Faber faber@faberman.de wrote:
How do you synchronize the streams? You will either have to synchronize the media clocks or do some sort of SRC.
Maybe I don't quite understand your question, but I use RTP (the oRTP library) to manage the sending and receiving of the audio streams.
So the RTP stream is synchronized to the sender's clock. How do you handle the differences between the stream's clock and the playback device's clock?
Regards, Clemens
The client and server negotiate the connection parameters, including packet interval, size, audio format, and rate. The RTP library ensures that the packets arrive on time and accounts for the initial time difference. Maybe the timestamps of the stream packets aren't enough to keep the two clocks synchronized (or perhaps a frame only counts the same on both ends if the two devices are set up the same)? In testing, the two clocks are one and the same, since I am running the client and server on the same device.
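As a rough illustration of what Clemens is getting at, here is a minimal sketch (not from this thread's code) of how one could watch the two clocks drift apart: compare how many frames the sender's RTP timestamps say have been produced against how many frames the local playback device has actually consumed. All names here (drift_estimator, drift_frames) are made up for the example.

/*
 * Hedged sketch: estimate drift between the sender's RTP media clock and
 * the local playback clock.  RTP timestamps for raw audio payloads advance
 * in frames, so the sender's frame count can be compared directly against
 * the frames the sound card has consumed.
 */
#include <stdio.h>
#include <stdint.h>

struct drift_estimator {
    uint32_t first_rtp_ts;   /* RTP timestamp of the first packet seen     */
    int      initialized;
};

/* Returns (frames the sender has produced) - (frames we have played).
 * A steadily growing value means the sender's clock runs faster than the
 * playback device's clock; a shrinking value means the opposite.          */
static int64_t drift_frames(struct drift_estimator *d,
                            uint32_t rtp_ts, uint64_t frames_played)
{
    if (!d->initialized) {
        d->first_rtp_ts = rtp_ts;
        d->initialized = 1;
    }
    uint32_t sender_frames = rtp_ts - d->first_rtp_ts;  /* wrap-safe */
    return (int64_t)sender_frames - (int64_t)frames_played;
}

int main(void)
{
    struct drift_estimator d = { 0 };
    /* Fake values just to show the call pattern: initialize at t = 0,
     * then check again after roughly one second of 48 kHz playback.   */
    drift_frames(&d, 1000, 0);
    printf("drift after 1s: %lld frames\n",
           (long long)drift_frames(&d, 1000 + 48000, 47995));
    return 0;
}

If this value trends away from zero even on the same machine, the streams are being paced by something other than the playback device's clock, which is the usual recipe for underruns.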
So the problem shows up even when testing?
Yes, the problem occurs in testing as well. An interesting fix of sorts that I just found: if I adjust my playback function to play slightly less than a whole packet, the playback function gets called twice and the underrun goes away. This makes sense in a way, because now I'm essentially giving the ring buffer twice as much data as it was getting before. The odd thing is that there are no noticeable sound artifacts like echoing or fuzz, as I would expect. I agree that it is definitely a timing/clock issue, and I appreciate the help.
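One way to read that workaround is that writing in smaller, more frequent chunks keeps more data queued ahead of the playback pointer. A hedged ALSA sketch of that idea follows; PACKET_FRAMES, the channel count, and play_packet are assumptions for illustration, not the poster's actual playback function.

/*
 * Hedged sketch: feed one received packet to ALSA in two half-packet
 * writes instead of a single write.  More frequent writes keep the ring
 * buffer fuller, which can mask an underrun caused by clock drift
 * (it does not fix the drift itself).
 */
#include <stdint.h>
#include <alsa/asoundlib.h>

#define PACKET_FRAMES 960   /* e.g. 20 ms at 48 kHz, assumed packet size */

static int play_packet(snd_pcm_t *pcm, const int16_t *samples,
                       unsigned channels)
{
    const snd_pcm_uframes_t chunk = PACKET_FRAMES / 2;
    snd_pcm_uframes_t written = 0;

    while (written < PACKET_FRAMES) {
        snd_pcm_uframes_t todo = PACKET_FRAMES - written;
        if (todo > chunk)
            todo = chunk;

        snd_pcm_sframes_t n =
            snd_pcm_writei(pcm, samples + written * channels, todo);
        if (n == -EPIPE) {            /* underrun: recover and retry */
            snd_pcm_prepare(pcm);
            continue;
        }
        if (n < 0)
            return (int)n;            /* other error */
        written += n;
    }
    return 0;
}

Note that this only buys headroom; if the sender's clock and the playback clock really differ, the buffer will eventually run dry (or overflow) again unless the rates are matched with adaptive resampling or by slaving one clock to the other.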