g_io_channel_read_chars vs. g_io_channel_read




Point of reference:

$ pkg-config --modversion gtk+-2.0
2.2.1

g_io_channel_read_chars() is the designated replacement for the
deprecated g_io_channel_read(). However, the two functions behave
differently and I'm having trouble reconciling those differences.
Specifically, the symptom is that g_io_channel_read_chars() blocks
whereas g_io_channel_read() does not.

My goal is to allow clients to communicate with my application via a
socket interface. I also want to use glib so that a callback function
is invoked whenever the client sends data. The callback checks the
condition it was invoked with and reads the data if any is available.
To this end, I've created a GIOChannel to hold the socket and have set
the watch's GIOCondition to invoke a handler when the socket has data
available. The callback function reads the channel to fetch the data.

So far though, I've only been able to read binary data with the deprecated
g_io_channel_read() function. I've also had success reading '\n'-terminated
ASCII strings with g_io_channel_read_line().
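
In case it helps, the line-based case that works looks roughly like
this (the variable names and result handling here are illustrative,
not my exact code):

    gchar *line = NULL;
    gsize length = 0, terminator_pos = 0;
    GError *err = NULL;
    GIOStatus status;

    status = g_io_channel_read_line(channel, &line, &length,
                                    &terminator_pos, &err);
    if (status == G_IO_STATUS_NORMAL) {
        /* 'line' is allocated by glib and includes the terminator */
        g_free(line);
    }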

Here are the details on what I've done:

-- I've created a socket:

   my_socket = socket(AF_INET, SOCK_STREAM, 0);
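
   (For completeness, the surrounding setup is the usual bind/listen
   sequence; the port number and backlog in this sketch are placeholders
   rather than my real values:)

    #include <string.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>

    int my_socket = socket(AF_INET, SOCK_STREAM, 0);

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port        = htons(5000);   /* placeholder port */

    bind(my_socket, (struct sockaddr *)&addr, sizeof(addr));
    listen(my_socket, 5);                 /* placeholder backlog */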

-- I've created a GIOChannel for that socket:

   channel = g_io_channel_unix_new(my_socket);

-- I've created a watch for this channel:

    GIOCondition condition = (GIOCondition)(G_IO_IN | G_IO_HUP | G_IO_ERR);

    g_io_add_watch(channel,
                   condition,
                   server_accept,
                   (void*)&s);

Where 'server_accept' is the callback function that will get invoked
and 's' is some object whose address will get passed to
'server_accept'.

Now, in the server_accept callback function,

gboolean server_accept(GIOChannel *channel, GIOCondition condition, void *d)
{
...
}

I examine the condition and, when it is G_IO_IN, I read the data:

  g_io_channel_read(channel, buffer, BUFSIZE, &bytes_read);
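
In context that looks roughly like this (the result check shown is a
sketch rather than my exact code; BUFSIZE is my own constant):

  gchar buffer[BUFSIZE];
  gsize bytes_read = 0;
  GIOError e;

  /* The deprecated call reports its result as a GIOError. */
  e = g_io_channel_read(channel, buffer, BUFSIZE, &bytes_read);
  if (e == G_IO_ERROR_NONE && bytes_read > 0) {
      /* process bytes_read bytes of (possibly binary) data in buffer */
  }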

However if I use:

  GError *err = NULL;
  GIOStatus stat;

  stat = g_io_channel_read_chars(channel, s->buffer, BUFSIZE, &bytes_read, &err);

then the call blocks.
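
For reference, g_io_channel_read_chars() reports its result as a
GIOStatus; a minimal sketch of checking it (the handling shown here
is illustrative) would be:

  switch (stat) {
  case G_IO_STATUS_NORMAL:
      /* bytes_read bytes are now in s->buffer */
      break;
  case G_IO_STATUS_AGAIN:
      /* no data available right now (non-blocking channel) */
      break;
  case G_IO_STATUS_EOF:
      /* the peer closed the connection */
      break;
  case G_IO_STATUS_ERROR:
      /* err->message describes the failure */
      break;
  }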

I've experimented with changing the channel encoding by calling:

    g_io_channel_set_encoding (channel, NULL, &err);

just after the channel has been created.

I've also tried setting the channel to non-blocking with:

    g_io_channel_set_flags (channel, G_IO_FLAG_SET_MASK, &err);

But neither of these had any effect on the described symptom.
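
As I understand the API, explicitly requesting non-blocking mode would
look something like the sketch below, using G_IO_FLAG_NONBLOCK and
preserving the channel's existing flags, though I may be misreading
the docs:

    /* Sketch: request non-blocking I/O while keeping the channel's
       existing flags; G_IO_FLAG_NONBLOCK is, as far as I can tell,
       the flag intended for this. */
    GIOFlags flags = g_io_channel_get_flags(channel);
    g_io_channel_set_flags(channel, flags | G_IO_FLAG_NONBLOCK, &err);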


Note that I set up the channel after the socket has been created, but
before the client has connected.

Does anyone have an idea why g_io_channel_read_chars() behaves
differently from g_io_channel_read()?

Scott


