Hi Santtu,

On 2014-03-06 14:48, Santtu Lakkala <inz inz fi> wrote:
> On 06/03/14 13:59, David King wrote:
>> At a glance, it looks much better with the regexes! If the performance is similar, I think it should be fine. The many man-hours saved during maintenance will be worth any slowdown. ;-)
>>
>> Of course, I then read the regexes a bit more and realised that I had to look at a reference to decipher some of them. Ah well!
> … These are results for 100 000 repetitions for the string "Jethro Tull" (first artist that happened to come to mind =). So I cannot say that the performance was "similar", but I'd still say that even 2 seconds for 100 000 iterations is very little compared to other slowdowns. With the simplest kind of caching (put the created GRegex objects in a GHashTable, with the regex itself as the key) and using the G_REGEX_OPTIMIZE flag, the results are noticeably better, but still about an order of magnitude slower: … These results are a bit skewed against the current implementation due to the wrapper mechanism (and hence a "forced" g_strdup), but as it's anyway kicking major regex arse, I'd say it's no biggie. Now the question remains whether this is acceptable performance.
Considering the code which calls these functions, it might be OK. On the other hand, for the trivial replacements the current functions are readable and seem fine, while for the more complex regex-based functions it takes me a bit too long to decipher the intent of the expressions for my liking.
I think that the best solution is to write tests for the current functions, and to fix up any problems and refactor as necessary. Abhinav Jangda is a potential Google Summer of Code student who has been making some good contributions to EasyTAG recently. I suggested to him that some tests for the scanner string-manipulation functions would be useful, as this discussion came up on the mailing list, so he is working on that now and should have some results in the next few days.
-- http://amigadave.com/