Re: How best to fetch a web page...

My reasoning for using wget in the first place, instead of a raw
TCP connection, was to have at least some sense of robustness...

wget is, of course, open-source and GPL'd, so it's quite possible to
extract the minimal code from wget to accomplish what you want, and
incorporate it in your own app.  This is what I have done.

And I suspect that would mean either switching to threads (as I'd have to do
for that libcurl example), or blending it into the g_main_loop somehow; a
rough sketch of the latter is below.  I think I'll stick with the two
existing options.  ;)
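
For what it's worth, here's roughly what I imagine the g_main_loop blend
would look like: spawn wget with g_spawn_async_with_pipes() and watch its
stdout through a GIOChannel, so the fetch runs inside the existing GLib
loop without any threads.  The URL is made up, and the standalone
g_main_loop here stands in for whatever loop the real app already runs:

#include <glib.h>

static GMainLoop *loop;

static gboolean
on_wget_output (GIOChannel *source, GIOCondition cond, gpointer data)
{
    GString *page = data;
    gchar buf[4096];
    gsize nread = 0;

    if ((cond & G_IO_IN) &&
        g_io_channel_read_chars (source, buf, sizeof buf,
                                 &nread, NULL) == G_IO_STATUS_NORMAL)
        g_string_append_len (page, buf, nread);

    if (cond & (G_IO_HUP | G_IO_ERR)) {
        g_print ("fetched %lu bytes\n", (gulong) page->len);
        g_main_loop_quit (loop);    /* a real app would just carry on */
        return FALSE;               /* remove this watch */
    }
    return TRUE;                    /* keep watching */
}

int
main (void)
{
    /* "-O -" sends the page to wget's stdout; "-q" silences progress */
    gchar *argv[] = { "wget", "-q", "-O", "-",
                      "http://www.example.com/", NULL };
    gint out_fd;
    GString *page = g_string_new (NULL);
    GIOChannel *chan;

    loop = g_main_loop_new (NULL, FALSE);

    if (!g_spawn_async_with_pipes (NULL, argv, NULL, G_SPAWN_SEARCH_PATH,
                                   NULL, NULL, NULL,
                                   NULL, &out_fd, NULL, NULL))
        return 1;

    chan = g_io_channel_unix_new (out_fd);
    g_io_channel_set_encoding (chan, NULL, NULL);   /* raw bytes */
    g_io_add_watch (chan, G_IO_IN | G_IO_HUP | G_IO_ERR,
                    on_wget_output, page);

    g_main_loop_run (loop);
    return 0;
}

No extra threads needed; the callback just fires whenever wget produces
output.  You'd probably also want g_child_watch_add() to collect the exit
status.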

Using wget in its entirety also means that if they decide to ask for HTTPS
authentication instead of a username/password in the query string, as they
do now, I could just modify the wget command-line options (something like
the line below).
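
I'm assuming it'd boil down to something like this; note the password
option is spelled --http-passwd on older wget releases and --http-password
on newer ones, and the URL is made up:

wget -q -O - --http-user=USER --http-password=SECRET \
     'https://www.example.com/cgi-bin/query'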

As far as I can tell from my quick peek, libcurl ought to be able to handle
that situation if it happens...  Haven't looked at GnetHttp at all yet,
because I'm sticking to packages available through standard Debian/unstable.
Though I suspect (at least I hope ;) ) it will be there by the time such a
change is needed.  But then, that's why the whole of the fetching code is
going into its own little source file.  ;)
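
From that same quick peek, the authenticated-HTTPS case in libcurl looks
like it'd only be a couple of curl_easy_setopt() calls.  A minimal sketch,
with made-up URL and credentials and the error handling trimmed down:

#include <stdio.h>
#include <curl/curl.h>

int
main (void)
{
    CURL *curl;
    CURLcode res;

    curl_global_init (CURL_GLOBAL_DEFAULT);
    curl = curl_easy_init ();
    if (!curl)
        return 1;

    /* made-up URL and credentials, just to show the shape of it */
    curl_easy_setopt (curl, CURLOPT_URL,
                      "https://www.example.com/cgi-bin/query");
    curl_easy_setopt (curl, CURLOPT_USERPWD, "user:secret");

    /* with no CURLOPT_WRITEFUNCTION set, the body lands on stdout */
    res = curl_easy_perform (curl);
    if (res != CURLE_OK)
        fprintf (stderr, "fetch failed: %s\n", curl_easy_strerror (res));

    curl_easy_cleanup (curl);
    curl_global_cleanup ();
    return 0;
}

So switching over later shouldn't disturb anything outside that one little
source file.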


Fredderic
