monitoring HTTP GET and HTTPS - 200

SOLVED
Omar Alvi_1
Super Advisor

monitoring HTTP GET and HTTPS - 200

Hi,

I want to create a monitoring script for website up/down status, which will eventually be run from OVOW as a scheduled action, reporting via opcmsg.

I'm looking to use perl for this purpose but have not been able to find anything straightforward as to how such monitoring can be done through scripting.

Any ideas - what commands will "ping" a website over http/https so that I can evaluate the 200 OK response (or otherwise)?

Really appreciate any assistance and support.

Thanks and Regards,

-Alvi
15 REPLIES
James R. Ferguson
Acclaimed Contributor

Re: monitoring HTTP GET and HTTPS - 200

Hi:

You could do something like this:

# cat ./probeurl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Request;
use LWP::UserAgent;
my $url = shift or die "URL expected\n";
my $useragent = LWP::UserAgent->new;
my $request = HTTP::Request->new( HEAD => $url );
my $response = $useragent->request($request);
if ( $response->is_success ) {
print $response->status_line, "\n";
}
else {
print "Failed: ", $response->status_line, "\n";
}
1;

...run as:

# ./probeurl http://www.somesite.com

# ./probeurl http://www.hp.com

Regards!

...JRF...
Steven E. Protter
Exalted Contributor
Solution

Re: monitoring HTTP GET and HTTPS - 200

Shalom,

wget is the tool for this.

Available for 11.23
http://h20392.www2.hp.com/portal/swdepot/displayProductInfo.do?productNumber=HPUXIEXP1123

For 11.31
http://h20392.www2.hp.com/portal/swdepot/displayProductInfo.do?productNumber=HPUXIEXP1131

It is part of a multi-tool set; you will need to pick it out individually or download the entire package.

Another source, which also includes 11.11:

http://hpux.connect.org.uk/hppd/hpux/Gnu/wget-1.11.4/

wget http://www.yoursite.com

Will get index.html

A script example:

wget http://www.yoursite.com
rc=$?
if [ $rc -eq 0 ]
then
echo "site is up"
else
echo "site is down"
fi

You can work the script from there.

SEP
Steven E Protter
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
Steven Schweda
Honored Contributor

Re: monitoring HTTP GET and HTTPS - 200

> wget is the tool for this.

It's certainly _a_ tool for this, and it's
the one I use.

> wget http://www.yoursite.com
> rc=$?
> if [ $rc -eq 0 ]

I would not rely on the exit status from
wget. It's often very happy with results I
find unsatisfactory. I use a command like
"wget -O <file>", and then scan wget's own
output for "200".
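One way to sketch that approach (a sketch only: it uses wget's `-S/--server-response` option to dump the response headers on stderr, and the exact status-line format can vary by server):

```shell
#!/bin/sh
# has_200: scan wget --server-response output (read from stdin)
# for an HTTP 200 status line, e.g. "HTTP/1.1 200 OK".
has_200() {
    grep -q 'HTTP/[0-9.]* *200'
}

# probe: fetch the URL, discard the body, and test the status line
# instead of trusting wget's exit status.
# Usage: probe http://www.example.com
probe() {
    if wget -q -S -O /dev/null "$1" 2>&1 | has_200; then
        echo "site is up"
    else
        echo "site is down"
    fi
}
```

Scanning the status line this way distinguishes a real 200 from redirects or error pages that wget may still exit 0 on.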
James R. Ferguson
Acclaimed Contributor

Re: monitoring HTTP GET and HTTPS - 200

Hi (again):

Actually, I can simplify the Perl you are looking to use, since I gutted some features that were not germane to your need:

# cat ./probeurl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
my $url = shift or die "URL expected\n";
my $useragent = LWP::UserAgent->new;
my $request = HTTP::Request->new( HEAD => $url );
my $response = $useragent->request($request);
print $response->status_line, "\n";
1;

# ./probeurl http://www.hp.com
200 OK
# ./probeurl https://www.hp.com
404 Not Found
# ./probeurl ftp://www.hp.com
200 OK

Regards!

...JRF...
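Since the eventual goal is an OVOW scheduled action feeding opcmsg, a wrapper along these lines might tie the probe to the agent (a rough sketch: `./probeurl` is the script above, while the `urlmon` application name and the severity mapping are illustrative placeholders; check the opcmsg attribute syntax against your OVO agent documentation):

```shell
#!/bin/sh
# severity_for: map an HTTP status line to an opcmsg severity.
# Assumption: 2xx means up (normal); anything else, including the
# "500 Can't connect" lines LWP produces on timeouts, is major.
severity_for() {
    case "$1" in
        2[0-9][0-9]*) echo "normal" ;;
        *)            echo "major"  ;;
    esac
}

# check_url: probe a URL and forward the result to the OVO agent.
# Assumes ./probeurl prints a status line such as "200 OK".
check_url() {
    url="$1"
    status=$(./probeurl "$url")
    sev=$(severity_for "$status")
    opcmsg severity="$sev" application=urlmon object="$url" \
        msg_text="$url returned: $status"
}
```

Run `check_url http://www.hp.com` from the scheduled action; only non-normal messages need forwarding if you set the policy up that way.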
Omar Alvi_1
Super Advisor

Re: monitoring HTTP GET and HTTPS - 200

Thanks a lot,

I've been trying out the perl script option, but it seems I'm getting timeouts for websites I'm accessing normally using my browser.

I haven't tried wget yet on my windows.

Will get back with the results - after my weekend.

Really appreciate the great inputs.

Thanks and Regards,

-Alvi
Steven Schweda
Honored Contributor

Re: monitoring HTTP GET and HTTPS - 200

> I haven't tried wget yet on my windows.

Windows? Now you're scaring me.
James R. Ferguson
Acclaimed Contributor

Re: monitoring HTTP GET and HTTPS - 200

Hi (again):

> I've been trying out the perl script option, but it seems I'm getting timeouts for websites I'm accessing normally using my browser

I'd be interested to know a few of the URLs that give you problems, including the HTTP code the script returns.

Regards!

...JRF...
Omar Alvi_1
Super Advisor

Re: monitoring HTTP GET and HTTPS - 200

Hi JRF,

Following are some of my trials with this script.

E:\dump\scripts\elm-urlmon\Test>URLTEST.pl http://www.google.com
500 Can't connect to www.google.com:80 (connect: timeout)

E:\dump\scripts\elm-urlmon\Test>URLTEST.pl http://elm.com.sa
500 Can't connect to elm.com.sa:80 (connect: timeout)

E:\dump\scripts\elm-urlmon\Test>URLTEST.pl http://www.hp.com
500 Can't connect to www.hp.com:80 (connect: timeout)

E:\dump\scripts\elm-urlmon\Test>URLTEST.pl http://www.hp.com:8080
500 Can't connect to www.hp.com:8080 (connect: timeout)

There doesn't seem to be any delay in accessing the internet from this particular server. I tried 8080, as this is our proxy's port.

The following, a local site with no security restrictions, came back successfully.

E:\dump\scripts\elm-urlmon\Test>URLTEST.pl http://elm-as-01
200 OK

Really appreciate your support.

I am working with perl on windows, as this will eventually be used on my OVOW server.

Thanks and Regards,

-Alvi
Steven Schweda
Honored Contributor

Re: monitoring HTTP GET and HTTPS - 200

> I am working with perl on windows, [...]

Asking about Windows problems in an HP-UX
forum may not be the most efficient way to
get useful answers.

When you find an appropriate (Microsoft)
forum, you might consider mentioning whether
you can reach these servers from this system
using a Web browser, so that someone could
tell whether you have a Perl problem or a
general network problem.
James R. Ferguson
Acclaimed Contributor

Re: monitoring HTTP GET and HTTPS - 200

Hi (again) Alvi:

> The following, a local site with no security restrictions, came back successfully.

So it appears that you may need to consider firewall settings. My Perl script has no problems in UNIX or Windows (as one might expect).

You might need to set your 'http_proxy' environment variable to enable the Perl script to get out through the proxy.

Regards!

...JRF...
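For what it's worth, LWP can also pick the proxy up from the environment if the script calls `$useragent->env_proxy;` after creating the agent. A sketch of the environment side, using the proxy address from your output (on Windows cmd.exe, `set` replaces `export`):

```shell
# Unix shell: export the proxy before running the probe;
# the script must call $useragent->env_proxy; for LWP to honor it.
export http_proxy=http://192.168.5.39:8080/
# ./probeurl http://www.hp.com

# Windows cmd.exe equivalent:
#   set http_proxy=http://192.168.5.39:8080/
#   URLTEST.pl http://www.hp.com
```

This keeps the proxy address out of the script itself, which helps when the OVOW server and test machines sit behind different proxies.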
Omar Alvi_1
Super Advisor

Re: monitoring HTTP GET and HTTPS - 200

Hi,

Found LWP quite interesting.

Following is the script, modified to include the proxy setting, as adding the http_proxy environment variable had no impact on this script.

---------------

# cat ./probeurl
use strict;
use warnings;
use LWP::UserAgent;

my $wsr = LWP::UserAgent -> new;
$wsr -> timeout( 20 );

$wsr->proxy(['http', 'ftp','https'], 'http://192.168.5.39:8080/');

my $url = shift or die "URL expected\n";
my $useragent = LWP::UserAgent->new;

my $request = HTTP::Request->new( HEAD => $url );
my $response = $useragent->request($request);

print $response->status_line, "\n";
print $response->is_success, "\n";

-------------

Had some limited success, but I am still getting timeout errors for most websites. The internal websites succeeded as before.

The thing is that the external websites (behind the proxy) are giving timeouts, not authentication or forbidden messages.

C:\SCRIPTING>urltestv4.pl http://www.google.com
500 Can't connect to www.google.com:80 (connect: timeout)

Looking forward to your support.

Thanks and regards,

-Alvi

Omar Alvi_1
Super Advisor

Re: monitoring HTTP GET and HTTPS - 200

Additionally, after setting the http_proxy variable, wget worked fine for a while. But now it returns proxy authentication errors.

Using options without quotes:

C:\Program Files\GnuWin32\bin>wget http://www.google.com --proxy-user=oalvi --proxy-passwd=xxxxxx
SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
syswgetrc = C:\Program Files\GnuWin32/etc/wgetrc
--2009-07-08 16:14:00-- http://www.google.com/
Connecting to 192.168.5.39:8080... connected.
Proxy request sent, awaiting response... 407 Proxy Authentication Required
2009-07-08 16:14:00 ERROR 407: Proxy Authentication Required.

Using options with quotes:

C:\Program Files\GnuWin32\bin>wget http://www.google.com --proxy-user="domain\oalvi" --proxy-passwd="xxxxxx"
SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
syswgetrc = C:\Program Files\GnuWin32/etc/wgetrc
--2009-07-08 16:15:22-- http://www.google.com/
Connecting to 192.168.5.39:8080... connected.
Proxy request sent, awaiting response... 403 Forbidden
2009-07-08 16:15:22 ERROR 403: Forbidden.



Some success with LWP, in that the following was successful. Here, I have success whether the proxy is set or not.

C:\SCRIPTING>urltestv4.pl ftp://www.hp.com
200 OK
1
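One hedged suggestion for the 407: wget can read proxy credentials from a wgetrc file instead of the command line, which sidesteps the shell-quoting trouble with `domain\oalvi`. A sketch (the keys below are standard wgetrc commands; whether your proxy accepts the `domain\user` form is a separate question for the proxy's own documentation):

```shell
# Put these lines in the wgetrc named in the output above
# (SYSTEM_WGETRC / syswgetrc), or in a per-user wgetrc, so the
# credentials never pass through the shell's quoting rules:
#
#   http_proxy = http://192.168.5.39:8080/
#   proxy_user = oalvi
#   proxy_password = xxxxxx
```

With those set, a bare `wget http://www.google.com` should use the proxy with no extra options.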

James R. Ferguson
Acclaimed Contributor

Re: monitoring HTTP GET and HTTPS - 200

Hi (again) Alvi:

> Had some limited success but still getting the timeout errors for most websites. There was success as before for the internal websites.

The default timeout is 180 (seconds). You might try increasing your value of 20 and see if you have better success.

Regards!

...JRF...



Omar Alvi_1
Super Advisor

Re: monitoring HTTP GET and HTTPS - 200

Hi JRF ...

I appreciate your persistent assistance.

I ran it mostly with the 180-second default timeout - it was still timing out. I set it to 20 only to reduce my troubleshooting time while I was trying out different things.

I've tried the proxy, by hardcoding as well as by passing from the environment - but no success.

Regards,

-Alvi
Omar Alvi_1
Super Advisor

Re: monitoring HTTP GET and HTTPS - 200

Well, it turned out to be an issue with the code.

The proxy was needed, but I had one variable too many and wasn't using the variable I had defined with the proxy.

Below is the correct code, with the proxy:

---------
#!C:\Perl\bin\perl.exe

use strict;
use warnings;
use LWP::UserAgent;

my $url = shift or die "URL expected\n";
my $useragent = LWP::UserAgent->new;

$useragent->proxy(['http', 'ftp','https'], 'http://192.168.5.39:8080/');
$useragent -> timeout( 20 );

my $request = HTTP::Request->new( HEAD => $url );
my $response = $useragent->request($request);

print $response->status_line, "\n";
print $response->is_success, "\n";
print $useragent->proxy('http'),"\n";
---------

Regards,

-Alvi