Re: [Ekiga-list] PTLIB alsa plugin status
- From: Derek Smithies <derek indranet co nz>
- To: Ekiga mailing list <ekiga-list gnome org>
- Subject: Re: [Ekiga-list] PTLIB alsa plugin status
- Date: Fri, 27 Feb 2009 10:24:48 +1300 (NZDT)
Hi,
On Thu, 26 Feb 2009, Alec Leamas wrote:
Just to sort this out, trying to understand the overall requirements.
Which is a very reasonable thing to do, and it is a good question.
And with the idea that using threads is perfectly reasonable in this context :-)
Excellent. This idea is very reasonable.
- Let's focus on the playback case, leaving any read aside (which refers to a
different alsa device).
Good idea.
- This should mean that while one thread (A) is closing its playback
device, another thread (B) starts writing to it.
Yes, thread B is writing audio to the playback device. Thread B is
collecting the data off the jitter buffer,
decoding the audio using the specified codec, and
sending the data to the playback device.
Thread B is stopped by a bad write to the playback device. Typically, a
bad write to the playback device is caused by the playback device being
closed.
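Roughly, the shape of thread B's loop is like this (a minimal sketch with
made-up names, not the actual Opal code; JitterBuffer, Decoder and Player
below are hypothetical stand-ins for the real classes):

  // Sketch only, NOT the real Opal code.  Hypothetical stand-in interfaces
  // for the jitter buffer, the codec and the playback device.
  struct Frame { unsigned char data[960]; int size; };
  struct JitterBuffer { virtual bool Read(Frame & out) = 0; };
  struct Decoder      { virtual void Decode(const Frame & in, Frame & pcm) = 0; };
  struct Player       { virtual bool Write(const void * buf, int len) = 0; };

  // Thread B: pull audio off the jitter buffer, decode it with the
  // negotiated codec, write the PCM to the playback device.  A failed
  // Write (typically because the device was closed) ends the loop,
  // which is how thread B gets stopped on hangup.
  void PlaybackLoop(JitterBuffer & jb, Decoder & codec, Player & dev)
  {
      Frame rtp, pcm;
      while (jb.Read(rtp)) {
          codec.Decode(rtp, pcm);
          if (!dev.Write(pcm.data, pcm.size))
              break;
      }
  }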
In the GUI thread, the user clicks hangup, and a close-down thread is
created. This close-down thread runs independently of the GUI (so it
does not hold up the GUI, and responses stay snappy) and makes a call to
OpalCall::Clear() (which is a structure inside Opal), which then goes
through all the RTP threads (including audio) and closes the devices.
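In shape (again only a sketch, not the actual Ekiga code; ClearCurrentCall()
is a hypothetical stand-in for the code path that ends up in
OpalCall::Clear()):

  #include <thread>

  void ClearCurrentCall();   // hypothetical: ends up in OpalCall::Clear()

  // What the "hangup" click does, in shape: spawn a detached close-down
  // thread so the GUI thread keeps handling events while Opal walks the
  // RTP threads (including audio) and closes their devices.
  void OnHangupClicked()
  {
      std::thread closer([] { ClearCurrentCall(); });
      closer.detach();
  }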
Since the Open, Close and Read/Write operations are atomic, there is no
possibility of one happening while another happens and breaking things.
The Opal thread which does the call to device Open then goes on to launch
the read/write threads. So read/writes don't run before the device is
open.
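One way to picture that "atomic" guarantee (a sketch under the assumption of
one lock per channel; the actual PTLib locking may be organised differently)
is that Open, Close and Write all take the same mutex, so a write can never
overlap a close; it simply fails once the device is closed:

  #include <mutex>

  // Sketch: serialising Open/Close/Write on one mutex so that none of them
  // can run in the middle of another (not the real PTLib code).
  struct Channel {
      std::mutex guard;
      bool open = false;

      bool Open()  { std::lock_guard<std::mutex> lk(guard); open = true;  return true; }
      void Close() { std::lock_guard<std::mutex> lk(guard); open = false; }
      bool Write(const void *, int)
      {
          std::lock_guard<std::mutex> lk(guard);
          return open;   // a Write after Close just returns false, it cannot race it
      }
  };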
I don't think this aspect of the Opal design is a problem. The problem
we are trying to address is the reason for the buffering - why is
there a 100ms delay???
Answer:
There are two entities that I have seen which can "store" audio to give
you a delay:

1) The jitter buffer, which can store seconds of audio. There are status
   variables in the jitter buffer which indicate how long it is buffering
   audio for.

2) The sound device. Opal sets the sound device (on linux) to store 2
   buffers of audio, which is (at most) 2 x 30ms. One of the 30ms buffers
   is the buffer currently being written to the sound device. The second
   30ms buffer is the next buffer to be written.
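To put my own rough numbers on it (assuming the 30ms buffers above):

  sound device : 2 buffers x 30ms      = 60ms at most
  jitter buffer: 100ms observed - 60ms = ~40ms would have to be sitting there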
The buffering depth is set by the call to
devices/audiodev.cpp: bool PSoundChannel_EKIGA::SetBuffers (PINDEX size, PINDEX count)
where size is <= 480 (480 is for a 30ms long buffer; GSM uses 20ms)
and count is typically 2 (windows uses 3 or more).
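For example (a sketch only; "player" is a hypothetical PSoundChannel_EKIGA
instance, and I am assuming 8kHz, 16 bit samples), the 480 byte figure falls
straight out of the sample rate:

  // 8000 samples/s * 0.030s = 240 samples, 240 samples * 2 bytes = 480 bytes,
  // so size = 480 gives a 30ms buffer and count = 2 caps the device at ~60ms.
  player.SetBuffers(480, 2);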
It "is" possible that this call is not happening at the right time. I
doubt this, but you could verify it with a review of the logs.
If this call were being missed, the sound device would get whatever
value it defaults to.
Derek.
--
Derek Smithies Ph.D.
IndraNet Technologies Ltd.
Email: derek indranet co nz
ph +64 3 365 6485
Web: http://www.indranet-technologies.com/