If you manage several sites on a server and want to track down the missing files, here is a quick, hacked-up script to do so. Put it into a weekly cron job and chmod the file so it is executable.
#!/bin/bash
# /etc/cron.weekly/404.sh
# Report of the top 20 missing files in each error_log.

# Array of error_log file paths for the different domains.
log_file=("/path/to/domain1/error_log" "/path/to/domain2/error_log" "/path/to/domain3/error_log")

(for ((i=0; i<${#log_file[@]}; i++))
do
    echo "Report of top 20 'missing' files in ${log_file[$i]}."
    # Pull each unique missing path out of the "File does not exist:" lines.
    for x in `grep "File does not exist:" "${log_file[$i]}" | awk '{print $13}' | sort | uniq`
    do
        # Count how many times this path was requested, then print "count : path".
        grep "$x" "${log_file[$i]}" | wc -l | tr -d '\n'
        echo " : $x"
    # Change the head value to the number of missing files to report.
    done | sort -rn | head -20
    echo
done) | mail -s "Missing File Report" username@yourdomain.com
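The awk '{print $13}' assumes the classic Apache error_log format, where the requested path is the 13th whitespace-separated field of a "File does not exist:" line; newer Apache versions add extra fields (pid, client port), so the field number may need adjusting for your setup. A quick way to sanity-check it against one of your logs (the log path below is just a placeholder):

# Print the field the script would treat as the missing file path.
grep "File does not exist:" /path/to/domain1/error_log | head -1 | awk '{print $13}'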
Change "username@yourdomain.com" to the email address of the person who should receive the reports.
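To install it as described above, something along these lines should work (the /etc/cron.weekly path assumes a Debian/Red Hat-style cron setup; adjust for your distribution):

# Copy the script into the weekly cron directory and make it executable.
cp 404.sh /etc/cron.weekly/404.sh
chmod +x /etc/cron.weekly/404.sh
# Optionally run it once by hand to confirm the report gets mailed out.
/etc/cron.weekly/404.sh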