Logfile Download from Strato

I am looking for a new maintainer for this project. If you are interested, please contact me!

This is a small Perl script for downloading logfiles from web accounts of Strato-hosted sites.

The problem is that the normal download interface is interactive, and that only about the last six weeks of logs are kept on the server. So without scripting, you're likely to lose valuable log information (at least I was unable and unwilling to remember to download the logfiles every six weeks).

This script logs in via the SSL-encrypted web-interface and downloads the files without needing a web-browser (proxies are configurable via the https_proxy environment variable).
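For example, to route the download through a proxy, set https_proxy before invoking the script (the proxy hostname and port below are placeholders, not values from this project):

```shell
# Hypothetical proxy address -- replace with your own proxy host and port.
export https_proxy=http://proxy.example.com:3128/
get-logfile -p mypassword -u www.mysite.de > logfiles.gz
```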

Note: For SSL support, you may need to install additional Perl modules, since some distributions do not ship the crypto libraries. Either IO::Socket::SSL or Net::SSL (from Crypt::SSLeay) will do. If neither SSL module is available, https access will not work.
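To check which SSL backend your Perl installation provides, you can try to load each module from the command line (a quick sketch; each one-liner exits with an error message if the module is missing):

```shell
# Prints a confirmation if the module loads; otherwise Perl reports
# "Can't locate ..." and exits non-zero.
perl -MIO::Socket::SSL -e 'print "IO::Socket::SSL available\n"'
perl -MNet::SSL -e 'print "Net::SSL available\n"'
```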

Usage is simple:

$ get-logfile -p mypassword -u www.mysite.de > logfile.gz


Download logs:

$ get-logfile -p mypassword -u www.mysite.de > logfiles.gz

Or set the data in environment variables (bash syntax):

$ export LOGFILE_SITE=www.mysite.de
$ export LOGFILE_PWD=mypassword
$ get-logfile > logfiles.gz

Same as before, but with a reasonable filename (note that in a crontab, % must be escaped as \%):

$ get-logfile > "`date +%Y-%m-%d`".gz
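In a crontab, the same invocation might look like this (the schedule and paths are placeholders for illustration; note the escaped \%):

```shell
# Hypothetical crontab entry: run nightly at 03:00.
# In crontabs, % starts a new line unless escaped, hence \%.
0 3 * * * /usr/local/bin/get-logfile > /var/log/strato/"`date +\%Y-\%m-\%d`".gz
```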

Same as before, but only download the last 8..15 days:

$ get-logfile -start=32 > "`date +%Y-%m-%d`".gz

Same as before, but output only the difference with respect to files that already exist (new feature in v1.1):

$ get-logfile -start=32 -D *-*-*.gz > "`date +%Y-%m-%d`".gz

Missing Features / Future Work

Source Code

October 28th, 2007
Comments? Suggestions? Corrections? You can drop me a line.