IOChannel latency?


I have a little problem in my gtk application.

I have an IOChannel using a socket (stream mode).
I've connected the signal_io signal to a callback that simply reads a
specific data size (let's say N bytes) and exits.

Sometimes, when my client socket (a Java application) sends 2 messages of
size N, a latency occurs:
1/  The callback is entered, N bytes are read, and the callback exits
2/  Latency (~20 seconds); I don't know what the application is doing (not
frozen, no CPU usage)
3/  The callback is entered, N bytes are read, and the callback exits

I don't understand step 2... I don't think it's an error in the size I
read, because I'm not even in the callback... It seems the IO signal has
not been received yet! (I'm on a single computer, so I don't think network
latency can be the cause.)
I've checked my Java application's logs, and the write timestamps are fine:
the 2 messages are sent back to back, with no latency on that side... It
really seems that the Gtk/Glib main loop does not detect the IO signal
right away, but I don't understand why...

I tried flushing the socket right after sending the 1st message on the Java
client side, but it changed nothing. I also tried flushing my IOChannel just
before exiting the callback, but that changed nothing either...

If somebody has an idea,
Thanks a lot :)
(and sorry for my English)
