11 years 7 months ago
Problem: I need to know whether a given website is up and responding to requests - not just that the server is up and answering pings, but that a specific page is available - and to have a robot do this for me, on a schedule I can define, and alert me if it's down. Let's pretend we are interested in
www.mysite.org/mypages/mypage.shtml
It turns out that the common command-line (and therefore scriptable) tool 'wget' can do the heavy lifting for us. We'll use the "--spider" mode switch, which requests the page without downloading it, and then check for "200 OK" vs. "404 Not Found" in the reply (wget's exit status reflects this, so no output parsing is needed). wget is built in on most Linux distros, and is available as a free add-on for the Windows shell (www.gnu.org/software/wget/).
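The idea above can be sketched as a small shell script. This is a minimal sketch, not a finished monitor: the URL is the example page from the post, and the commented-out mail alert assumes a working local mailer.

```shell
#!/bin/sh
# Check whether a specific page is reachable using wget's spider mode.
# --spider  : request the page but do not download the body
# --tries=1 : fail fast instead of retrying
# wget exits 0 on "200 OK" and non-zero otherwise (e.g. "404 Not Found").

check_page() {
    url="$1"
    if wget --spider --quiet --tries=1 --timeout=10 "$url"; then
        echo "UP: $url"
    else
        echo "DOWN: $url"
        # Alert however you like, e.g. (assumes a local mailer is configured):
        # echo "$url is down" | mail -s "Site down" you@example.com
    fi
}

check_page "http://www.mysite.org/mypages/mypage.shtml"
```

Scheduling is then just a cron entry, e.g. `*/5 * * * * /path/to/check_page.sh` to run the check every five minutes.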
Got an idea or a direction that I can follow?
--
Thanks,
Ray