On Tue, Apr 13, 2004 at 11:17:21PM -0400, muppet wrote:
> On Tuesday, April 13, 2004, at 05:23 PM, Bob Wilkinson wrote:
>
> > I have read the FAQ Q: "How do i keep my gui updating while doing a long file read?" and added code to the process_infile routine.
> > <snip>
> > However, it only updated my TextView at the end of the processing.
>
> Some sanity-check questions:
>
> - Are you returning control to the event loop?
Implicitly. Once the processing routine is finished, control returns. How would I do this explicitly, once the processing routine has started?
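Do you mean calling the FAQ idiom from inside the processing loop, i.e. something like this between files (a minimal sketch, assuming the loop over @filenames lives in my callback)?

    foreach my $filename (@filenames) {
        process_file($filename);    # the long XS call, one file at a time

        # give the main loop a chance to handle redraws, clicks, etc.
        # note: this only helps between files, not during one.
        Gtk2->main_iteration while Gtk2->events_pending;
    }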
> - Does the child print anything out before the end of processing?
No :-(
> - Is the child doing buffered writes which would result in a chunk at the end even though it should print out other stuff in the middle?
Not that I am aware of. The processing routine is an XS wrapper around a C library. The C library "printf"s the percentage of progress completed. If I run it from a terminal it appears unbuffered. Before I started trying to show this progress in a TextView pane, it appeared, apparently unbuffered, in the terminal from which the GUI was started.
> - Would a terminal emulator widget, e.g. Gnome2::Vte, actually make more sense for this?
Probably! I will look into that this evening. I knew asking a question might help me with my problem.
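From a quick skim, I would guess at something like the following, though the fork_command arguments here are just copied from the C vte_terminal_fork_command() call, and it assumes the processing can be run as a standalone command (the process_files program below is made up), so the details need checking against the Gnome2::Vte docs:

    use Glib qw(TRUE FALSE);
    use Gnome2::Vte;

    my $term = Gnome2::Vte::Terminal->new;
    $vbox->pack_start($term, TRUE, TRUE, 0);   # $vbox from the existing GUI

    # Run the processing inside the terminal widget; its printf progress
    # output should then show up without any extra plumbing.  Argument
    # order mirrors vte_terminal_fork_command(); verify before relying on it.
    $term->fork_command('process_files', ['process_files', $filename],
                        undef, undef, FALSE, FALSE, FALSE);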
> > I also discovered, via ps ax, that each file was being processed simultaneously. I was testing with 2 files in the directory - though could imagine problems were I to try to process many files, due to the processor-intensive nature of the processing. The solution of forking n subprocesses isn't scalable.
>
> I presume this is because the parent forks and goes on; from the looks of your click handler, you spawn a child for all the entries in the dir before returning from the click handler.
Initially, this is what I did. I then noticed that the children were being spawned simultaneously, and, wanting to avoid this, I collected the filenames and passed them to the processing routine. In both cases, however, the output in the TextView was only updated at the end of processing.
> You need to implement some form of throttling mechanism that allows at most $n_active_children at a time; then sequential processing becomes $n_active_children = 1. There are tons of ways to do it, using semaphores, file locking, a counting reaper, chaining handlers, etc etc.
I tried setting a global lock variable, and running the children only when the lock was unset. Unfortunately they ignored me and ran anyway. Within the directory-reading loop, I used something like:

    my $d = new DirHandle $::settings->{dirname};
    if (defined $d) {
        my $filename = join ("/", $::settings->{dirname}, $_);
        while ($::settings->{locate_lock}) {
            warn "LOCK IS SET for ", $filename;
            sleep 1;
        }
        $::settings->{locate_lock} = 1;
        process_file($filename);
        $::settings->{locate_lock} = 0;
    }
    undef $d;

I think though that, since I was only testing with 2 files, I may have been the victim of a race condition.
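Is something along these lines what you mean by a throttled queue? (An untested sketch; @queue, %kids and start_next are names I have just made up.)

    use Glib qw(TRUE FALSE);
    use POSIX qw(:sys_wait_h);

    my @queue;           # filenames waiting to be processed
    my %kids;            # pid => filename for the children currently running
    my $MAX_KIDS = 1;    # 1 gives strictly sequential processing

    sub start_next {
        while (keys %kids < $MAX_KIDS and @queue) {
            my $filename = shift @queue;
            my $pid = fork;
            die "fork failed: $!" unless defined $pid;
            if ($pid == 0) {             # child: do the long call and exit
                process_file($filename);
                exit 0;
            }
            $kids{$pid} = $filename;     # parent: remember the child
        }
    }

    # Reap finished children from the main loop (polled twice a second)
    # and start the next queued file, so nothing blocks the GUI.
    Glib::Timeout->add(500, sub {
        while ((my $pid = waitpid(-1, WNOHANG)) > 0) {
            delete $kids{$pid};
            start_next();
        }
        return TRUE;                     # keep this timeout installed
    });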
An "interesting" side effect of such a mechanism is that the button click will actually start the processing, not contain all of the processing. You'll want something like on click scan dir to create list of files to process add these files to the processing queue start the queue desensitize portions of the ui that the user shouldn't bother while you're busy sensitize a "cancel" button And then the queue handler is some object that knows about process throttling and synchronization and all that.
OK. But all of the GUI seemed insensitive while processing the files; it typically takes about 20-30 seconds to process each file.
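Just so I am sure I follow: the click handler itself would then shrink to something like this (a sketch only; @queue and start_next are from the snippet above, and the widget variable names are made up)?

    use Glib qw(TRUE FALSE);
    use DirHandle;

    $button->signal_connect(clicked => sub {
        # build the list of files to process
        my $d = new DirHandle $::settings->{dirname};
        return unless defined $d;
        while (defined(my $entry = $d->read)) {
            next if $entry =~ /^\./;        # skip . and .. (and dotfiles)
            push @queue, join '/', $::settings->{dirname}, $entry;
        }
        undef $d;

        # hand over to the queue and give the user a way out
        $button->set_sensitive(FALSE);
        $cancel_button->set_sensitive(TRUE);
        start_next();
    });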
> > So I would like to be able to process the files sequentially - I have found that if I build a list of files in the callback and pass this to the processing routine, I can effect this, though with the side-effect that the GUI is unresponsive, and the TextView still only updates when all of the files are processed.
>
> That sounds a lot like control is not returning to the main loop until the job is finished. What other stuff is happening in process_file()?
It is just a wrapper around C code which does signal processing on the supplied file.
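If process_file() really is one long C call per file, then I suppose the most a single-process approach can do is return control between files. For the record, a sketch of driving the queue from an idle callback (reusing the @queue above), which would still freeze the GUI for the 20-30 seconds each file takes:

    use Glib qw(TRUE FALSE);

    # Process one queued file per idle callback: the main loop runs
    # between files, but the GUI still freezes for the length of each call.
    Glib::Idle->add(sub {
        return FALSE unless @queue;   # queue empty: remove the idle handler
        my $filename = shift @queue;
        process_file($filename);      # blocks for one whole file
        return TRUE;                  # run again for the next file
    });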
> > I am thinking that I need a more sophisticated approach. I could implement this via the "threads" and "threads::shared" modules. I have also found POE::Loop::Gtk2; I am not, however, sure of the best approach, so I am asking for guidance from anyone with experience of using either of these approaches.
>
> Even if you used POE, you'd still have to return control to the main loop in order for the UI to be responsive, and that will require some form of asynchronous processing.
As I implied above, perhaps the problem is that I am not correctly returning control to the main loop.
> I can't recommend threads in good conscience, for several reasons:
>
> - A fork/pipe system will be a little more stable and robust (on unix, at least).
>
> - Using threads requires a threaded perl, which is not available everywhere, whereas perl's forking open() is.
>
> - You can't share widgets with threads::shared. In fact, even with the Gtk2::Gdk::Threads functions (which don't appear in the FAQ, i notice), there are "interesting" issues involving Glib::Objects being destroyed out from under the main thread when a worker thread exits, for which there are no clean solutions.
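So, if I skip the terminal widget, the fork/pipe route would presumably look something like this? (An untested sketch; it also assumes the child's progress output actually reaches the pipe, which may need unbuffering on the C side, since stdio usually switches to block buffering when stdout is not a tty.)

    use Glib qw(TRUE FALSE);

    # perl's forking open: the child's STDOUT is connected to $fh in the parent
    my $pid = open my $fh, '-|';
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {
        $| = 1;                       # unbuffer perl's STDOUT in the child
        process_file($filename);      # the long XS call; its printf goes to the pipe
        exit 0;
    }

    # parent: append whatever arrives on the pipe to the TextView,
    # driven from the main loop, so the GUI stays responsive.
    my $buffer = $textview->get_buffer;
    Glib::IO->add_watch(fileno($fh), [qw/in hup err/], sub {
        if (sysread $fh, my $chunk, 4096) {
            $buffer->insert($buffer->get_end_iter, $chunk);
            return TRUE;              # keep watching
        }
        close $fh;                    # EOF: the child has finished
        return FALSE;                 # remove the watch
    });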
OK - I will ignore threads. I will go and look at Gnome2::Vte::Terminal. If it does what I want, I will inform the list (and publish the code).

Thanks for your advice.

Bob

-- 
I never failed to convince an audience that the best thing they could do was to go away.