On Profiling and Optimizing Open Source Software
A couple of weeks ago I read an entry on
Optimizing GNU ld by a Mozilla developer. Very interesting reading; don't miss it. It came shortly after I started working on
optimizing the Pango OpenType code. We both used
OProfile, which I hereby declare a quite handy tool. It profiles anything and everything.
From my own experience and what I've read here and there, including the report mentioned above, it seems that most Open Source software spends most of its time in things like strcmp, a handwritten sort or search, and the like, not in the algorithms or other places we expect it to. Let me rearrange my argument: if you want to optimize a piece of software, 95% of the time you do not need to find a way to reduce the number of times the fast path is taken; you simply need to move heavy operations (by C people's definition of heavy, like strcmp) out of the way, just that. Yes, by replacing an O(n^2) algorithm in a fast path with an O(n log n) one you gain a lot, but if you are working on that kind of code, you most probably do not need profiling!