RE: poll with timeout 0 in main loop

Hi, this is the code that creates the server socket:

    // create the server socket and return the file descriptor
    fd = tcp_server_listerner_new(new_server->configurations,&error);
    if(fd < 0)
    {
        g_error("Could not create listener, server creation aborted: %s",error->message);
        return NULL;
    }

    // create the GIOChannel for the server socket
    new_server->socket = g_io_channel_unix_new(fd);
    // set the encoding to NULL so it is safe to read binary data on the server GIOChannel
    g_io_channel_set_encoding(new_server->socket,NULL,NULL);
    // create a source for connection events on the server socket
    new_server->event_source = g_io_create_watch(new_server->socket,G_IO_IN);
    // set the callback function to handle connection events ("on_server_connection" is a placeholder name)
    g_source_set_callback(new_server->event_source,(GSourceFunc)on_server_connection,new_server,NULL);
    // attach the server socket connection event source to the server context
    g_source_attach(new_server->event_source,new_server->context);

    new_server->main_loop = g_main_loop_new(new_server->context,FALSE);
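
The callback wired to this source is not shown above; for reference, a connection watch callback for this kind of setup usually looks roughly like the sketch below (the function name and the hand-off to a connection thread are placeholders, not the real code):

// sketch of a G_IO_IN callback for the listening socket; "on_server_connection" is a placeholder name
// (GIOFunc signature, cast to GSourceFunc when passed to g_source_set_callback; needs <glib.h> and <sys/socket.h>)
static gboolean on_server_connection(GIOChannel *source,GIOCondition condition,gpointer data)
{
    GMTCPServer *server = (GMTCPServer*)data; // the data passed to g_source_set_callback
    int client_fd = accept(g_io_channel_unix_get_fd(source),NULL,NULL);

    if(client_fd < 0)
        return TRUE; // nothing to accept right now (EAGAIN on the non-blocking socket), keep watching

    // hand client_fd over to a per-connection thread here (application specific)
    (void)server;(void)condition;
    return TRUE; // TRUE keeps the watch source installed, FALSE would remove it after this call
}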

and this is how I run the main loop:

void tcp_server_run(GMTCPServer *p_server)
{
    // run the main loop now
    g_main_loop_run(p_server->main_loop);
}


This function is started from the main thread like this:

gm_tcp_server->server_thread = g_thread_create((GThreadFunc)tcp_server_run,gm_tcp_server,TRUE,/*&error*/NULL);
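
Since the loop runs in its own thread, a clean shutdown would normally mean quitting the loop from another thread and then joining it; a minimal sketch, assuming a tcp_server_stop() that is not part of the code above:

// sketch: stop the server loop and wait for the server thread to finish ("tcp_server_stop" is a placeholder)
void tcp_server_stop(GMTCPServer *p_server)
{
    g_main_loop_quit(p_server->main_loop);  // wakes the context's poll and makes g_main_loop_run() return
    g_thread_join(p_server->server_thread); // tcp_server_run() returns and the thread ends
    g_main_loop_unref(p_server->main_loop);
}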

and this is how I create the server socket:

#include <string.h>
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <glib.h>

int tcp_server_listerner_new(const Configurations *p_configurations,GError **p_error)
{
    int fd;
    struct sockaddr_in servAddr;
    int set_option = 1;

    // create the socket
    fd = socket(AF_INET,SOCK_STREAM,0);
    if(fd < 0)
    {
        *p_error = g_error_new(G_IO_CHANNEL_ERROR,G_IO_CHANNEL_ERROR_FAILED,"Could not create server socket: %s",strerror(errno));
        return -1;
    }
    // set some options on the socket
    fcntl(fd,F_SETFL,O_NONBLOCK);// set the socket non-blocking
    setsockopt(fd,SOL_SOCKET,SO_REUSEADDR,&set_option,sizeof(set_option));// reuse address
    setsockopt(fd,SOL_SOCKET,SO_KEEPALIVE,&set_option,sizeof(set_option));// send keep alive messages

    // construct local address structure
    memset(&servAddr,0x00,sizeof(servAddr));
    servAddr.sin_family = AF_INET;
    servAddr.sin_addr.s_addr = inet_addr(p_configurations->tcp_server.interface);
    servAddr.sin_port = htons(p_configurations->tcp_server.port);
    // bind to the local address
    if(bind(fd,(struct sockaddr*)&servAddr,sizeof(servAddr)) < 0)
    {
        *p_error = g_error_new(G_IO_CHANNEL_ERROR,G_IO_CHANNEL_ERROR_FAILED,"Could not bind server socket: %s",strerror(errno));
        // close the socket
        close(fd);
        return -1;
    }
    if(listen(fd,10) < 0)
    {
        *p_error = g_error_new(G_IO_CHANNEL_ERROR,G_IO_CHANNEL_ERROR_FAILED,"Could not start listening on the server socket: %s",strerror(errno));
        // close the socket
        close(fd);
        return -1;
    }

    return fd;
}

Thanks!

Date: Fri, 22 Oct 2010 00:06:08 -0200
Subject: Re: poll with timeout 0 in main loop
From: maginot junior gmail com
To: jpablolorenzetti hotmail com
CC: gtk-app-devel-list gnome org

On Thu, Oct 21, 2010 at 11:27 PM, Juan Pablo L. <jpablolorenzetti hotmail com> wrote:

Hi, I have a problem with an application I'm building with glib-2.24.2 on Linux. I create a server socket which I use to create a GIOChannel and add it to a main loop. Everything was fine until I found that the application consumes 99.8% of the CPU while waiting for connections (doing nothing else, just sitting there waiting for connections), and it gets even worse as new connections come in, because I create a new loop for each connection (I create a thread for each connection) to deal with the incoming packets. Each connection has N handlers to process the requests, and each handler has its own main loop for asynchronous communication between the handlers and the connection thread that owns them.

So I did a strace and found out that the timeout being passed to poll is 0, so on each iteration the poll returns immediately and I'm stuck with a busy wait in all main loops. I have read the documentation trying to find out how to modify this behaviour, but I could find nothing about it. Can you please tell me why this is happening? Thanks!
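
For reference, with GLib 2.24 the usual way to push work into one of those per-handler loops from another thread is a one-shot idle source attached to that loop's context; the names in this sketch are illustrative, not from the code above:

// sketch: wake a handler's main loop from another thread (names are illustrative)
static gboolean handle_message(gpointer data)
{
    // process the message inside the handler's own loop/thread
    return FALSE; // one-shot: the source is removed after it runs
}

static void post_to_handler(GMainContext *handler_context,gpointer message)
{
    GSource *source = g_idle_source_new();
    g_source_set_callback(source,handle_message,message,NULL);
    g_source_attach(source,handler_context); // wakes that context's poll
    g_source_unref(source);
}

Returning FALSE matters here: a source that stays attached and always reports itself ready is exactly the kind of thing that forces poll() to be called with a 0 timeout.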


Can you post a snippet of your code?

And if you started a thread with a loop running inside it, that loop will keep iterating if you don't stop it, and that will certainly consume CPU. I think you need to work with signals or interrupts; I'm not sure about GIOChannel, maybe a snippet would clear things up a bit.
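
For what it's worth, the timeout GLib passes to poll() comes from the attached sources: a context with only fd watches sleeps in poll() with timeout -1, but any source whose prepare() reports it ready, or asks for a 0 timeout, on every iteration (a permanently attached idle source is the classic case) makes poll() be called with timeout 0, and the loop spins. A minimal sketch that reproduces that busy wait, as an illustration of the mechanism rather than a claim about the code above:

// an always-ready source: its prepare() reports "ready" on every iteration,
// so the context calls poll() with a 0 timeout instead of sleeping on the fds
static gboolean spin_forever(gpointer data)
{
    return TRUE; // stays attached and is immediately ready again
}

static void attach_busy_source(GMainContext *context)
{
    GSource *idle = g_idle_source_new();
    g_source_set_callback(idle,spin_forever,NULL,NULL);
    g_source_attach(idle,context);
    g_source_unref(idle);
}

If any of the per-handler loops has a source like this attached (or a 0-interval timeout), that would explain the strace output.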

[ ]'s

Maginot Júnior
LPIC 1 - LPIC 2 - LPIC 3 -  CCNA - CLA - Forensics Analyst

