CPU load becomes high when using signal_timeout and thread_init on Windows.



Hi,
I have a problem that I have not been able to solve.

On Windows, CPU load becomes high when signal_timeout and thread_init are used together.
The same program shows no problem on Linux or BSD.

Could you please tell me how to avoid this problem while staying compatible with Linux?

Windows XP SP3
g++.exe (GCC) 3.4.5 (mingw-vista special r3)
gtkmm-win32-devel-2.16.0-4.exe

-----------------
#include <gtkmm.h>

// Timeout handler; returning true keeps the timeout connected.
bool slot_timeout( int timer_number ) {
    return true;
}

int main( int argc, char **argv ) {
    // If the line below is commented out, there is no CPU load.
    if( !Glib::thread_supported() ) Glib::thread_init();

    sigc::slot< bool > timeout =
        sigc::bind( sigc::ptr_fun( &slot_timeout ), 0 );
    // CPU usage becomes about 15% (Athlon 2500)
    Glib::signal_timeout().connect( timeout, 50 );
    // If your PC is powerful, uncomment the line below instead.
//    Glib::signal_timeout().connect( timeout, 25 );

    Gtk::Main m( &argc, &argv );
    Gtk::Window window;
    m.run( window );

    return 0;
}
-----------------
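
For comparison, here is the same program with the Glib::thread_init() call removed. As the comment in the code above notes, this variant shows no noticeable CPU load on my machine; it is only a reference point, not a fix for code that actually needs thread support.

-----------------
#include <gtkmm.h>

// Timeout handler; returning true keeps the timeout connected.
bool slot_timeout( int timer_number ) {
    return true;
}

int main( int argc, char **argv ) {
    // No Glib::thread_init() call here: the timeout still fires
    // every 50 ms, but the CPU load stays near zero.
    sigc::slot< bool > timeout =
        sigc::bind( sigc::ptr_fun( &slot_timeout ), 0 );
    Glib::signal_timeout().connect( timeout, 50 );

    Gtk::Main m( &argc, &argv );
    Gtk::Window window;
    m.run( window );

    return 0;
}
-----------------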

Thanks and regards,
--
weise

