Does anybody know of any statistics (unix) tools well suited to
working on the access_log file that NCSA's httpd creates?
/Kurt
_______________________
Kurt Westh Nielsen email: kwn@ingenioeren.dk
-------------------------
Roy Fielding's wwwstat program is in use at a lot of sites, including
here; see
http://www.ics.uci.edu/WebSoft/wwwstat/
The only problem I've had with it is that it produces too much
information to peruse on a daily basis (wwwstat summaries are still
pretty big).
I finally wrote a program of my own that takes the wwwstat summary
and produces a two-page metasummary whose major content is a breakdown
of traffic by directory (cumulative over subdirectories), with image
traffic broken out from other files (to give some idea how much
traffic comes from inline images), and with low-traffic directories
suppressed. This is a perl5 script; see
http://www.ai.mit.edu/tools/usum/usum.html
for pointers to a sample of the output, and a copy of the script, if
you're interested.
rst
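
For anyone who wants to roll their own rather than post-process wwwstat
output, the core of such a per-directory breakdown is straightforward.
The sketch below (in Python rather than perl, and not based on the
actual usum script; the regex, image extension list, and hit threshold
are all assumptions) parses NCSA common-log-format lines, credits each
request to every directory on its path so parents accumulate their
subdirectories' traffic, tallies image traffic separately, and drops
low-traffic directories:

```python
import re
from collections import defaultdict

# Matches the request and byte-count fields of an NCSA common-log line,
# e.g.:  host - - [date] "GET /pics/a.gif HTTP/1.0" 200 1000
LOG_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+)[^"]*" \d{3} (\d+|-)')

# Assumed set of inline-image suffixes.
IMAGE_EXTS = ('.gif', '.jpg', '.jpeg', '.xbm')

def summarize(lines, min_hits=2):
    # stats[directory] = [hits, bytes, image_hits, image_bytes]
    stats = defaultdict(lambda: [0, 0, 0, 0])
    for line in lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        path, size = m.group(1), m.group(2)
        nbytes = 0 if size == '-' else int(size)
        is_image = path.lower().endswith(IMAGE_EXTS)
        # Credit the request to every directory on the path, so that
        # parent directories are cumulative over their subdirectories.
        parts = path.split('/')[:-1] or ['']
        prefix = ''
        for part in parts:
            prefix = prefix + part + '/'
            s = stats[prefix]
            s[0] += 1
            s[1] += nbytes
            if is_image:
                s[2] += 1
                s[3] += nbytes
    # Suppress low-traffic directories.
    return {d: s for d, s in stats.items() if s[0] >= min_hits}
```

A real report would then sort the surviving directories by bytes or
hits and print the image columns alongside the totals.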