Website monitoring

Hello,

I’m trying to find a way for Nagios to monitor sites a bit more in depth. Currently we just have it set up with an HTTP ping check, and that works fine for most situations. Just today, though, we had a problem where our wildcard SSL key wasn’t applied to certain sites and broke them, but our redirect page still came up, so when Nagios pinged the site it saw it as up, and I wasn’t notified until a customer called in.

Would there be a way to scan the HTML page for, say, the title, and if that title appears, have it send me a warning that the site is down?

thank you
-Crips

anyone?

This website has nice explanations for a lot of commonly used check commands. Take a look at the description of “check_http”. Looks to me like you could use the -s flag to check for a string in the content of your expected webpage. The -C flag, which checks certificate expiry, is also pretty handy for your SSL sites.
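For example, here is roughly how those flags get used on the command line. The plugin path, hostname, URL, and title string below are just placeholders for whatever your setup actually uses:

# Look for an expected string in the page content (placeholder host/path/string)
/usr/lib/nagios/plugins/check_http -H www.example.com -u /mypage.html -s "Expected Page Title"

# Check the SSL certificate on the same host; alerts if it expires within 14 days
/usr/lib/nagios/plugins/check_http -H www.example.com -S -C 14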

You could also write yourself a little script that does a wget of whatever.com/lala.htm and then does some greps or regex matching on the downloaded file. This is a rough example, but it should give you some ideas:

#!/bin/bash
# Fetch the page quietly and write it to a known filename
wget -q -O mypage.html somesite.com/mypage.html

# Grep for the title you only expect to see on the broken/redirect page
result=$(grep "website title" mypage.html)

if [ -z "$result" ]
then
    echo "OK - redirect title not found, page looks good"
    exit 0
else
    echo "CRITICAL - redirect page is being served!"
    exit 2
fi

With an exit code of 2, Nagios will throw a critical alert; 0 is OK (and 1 would be a warning). You could of course make this way more robust.
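To actually get Nagios running that script, you’d wire it in with a command and service definition. This is just a rough sketch: it assumes you drop the script into your plugins directory (the standard $USER1$ macro), and the script name, host name, and generic-service template are placeholders for your own config:

# Hypothetical object definitions -- adjust names and paths to your setup
define command {
    command_name    check_site_content
    command_line    $USER1$/check_site_content.sh
}

define service {
    use                     generic-service
    host_name               webserver1
    service_description     Site Content
    check_command           check_site_content
}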

Thanks for the help guys, much appreciated!