monitoring HTTP GET and HTTPS - 200
06-30-2009 07:28 AM
I want to create a monitoring script for website up/down status, which will eventually run as a scheduled action in OVOW and report via opcmsg.
I'm looking to use Perl for this purpose, but I haven't been able to find anything straightforward on how such monitoring can be done through scripting.
Any idea what commands will "ping" a website using HTTP/HTTPS, so that I can then evaluate the 200 OK response (or otherwise)?
Really appreciate any assistance and support.
Thanks and Regards,
-Alvi
06-30-2009 08:10 AM
Re: monitoring HTTP GET and HTTPS - 200
You could do something like this:
# cat ./probeurl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Request;
use LWP::UserAgent;
my $url = shift or die "URL expected\n";
my $useragent = LWP::UserAgent->new;
my $request  = HTTP::Request->new( HEAD => $url );
my $response = $useragent->request($request);
if ( $response->is_success ) {
    print $response->status_line, "\n";
}
else {
    print "Failed: ", $response->status_line, "\n";
}
...run as:
# ./probeurl http://www.somesite.com
# ./probeurl http://www.hp.com
Regards!
...JRF...
06-30-2009 08:11 AM
Solution
wget is the tool for this.
Available for 11.23:
http://h20392.www2.hp.com/portal/swdepot/displayProductInfo.do?productNumber=HPUXIEXP1123
For 11.31:
http://h20392.www2.hp.com/portal/swdepot/displayProductInfo.do?productNumber=HPUXIEXP1131
It's part of a multi-tool bundle, so you will need to select it there or download the entire package.
Another source, which also covers 11.11:
http://hpux.connect.org.uk/hppd/hpux/Gnu/wget-1.11.4/
wget http://www.yoursite.com
will fetch index.html.
Script example:
wget http://www.yoursite.com
rc=$?
if [ $rc -eq 0 ]
then
    echo "site is up"
else
    echo "site is down"
fi
You can work the script from there.
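As a small, reusable sketch of the exit-status check above (the message strings match the script; the probe command itself is only shown in a comment so the sketch stands alone):

```shell
# report: turn an exit status (0 = success) into the up/down message
# used in the script above. GNU wget documents its exit codes; 8 means
# the server issued an error response.
report() {
    if [ "$1" -eq 0 ]; then
        echo "site is up"
    else
        echo "site is down"
    fi
}

# In practice the status would come from a real probe, e.g.:
#   wget -q -O /dev/null "http://www.yoursite.com"; report $?
report 0   # prints: site is up
report 8   # prints: site is down
```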
SEP
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
06-30-2009 09:56 AM
Re: monitoring HTTP GET and HTTPS - 200
It's certainly _a_ tool for this, and it's the one I use.
> wget http://www.yoursite.com
> rc=$?
> if [ $rc -eq 0 ]
I would not rely on the exit status from wget. It's often very happy with results I find unsatisfactory. I use a command like "wget -O ..." and check the output for "200 ".
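The "check the output for 200" idea can be sketched roughly as below. The classification is done on the HTTP status line that `wget -S` prints to stderr; the exact wget flags in the comment are an assumption, not quoted from the post:

```shell
# classify: decide up/down from an HTTP status line such as
# "  HTTP/1.1 200 OK" -- the kind of line that
# "wget -O /dev/null -S url" writes to stderr.
classify() {
    case "$1" in
        *" 200 "*|*" 200") echo "up" ;;
        *)                 echo "down" ;;
    esac
}

# In practice the status line could be captured with something like:
#   status=$(wget -O /dev/null -S "$url" 2>&1 | grep "HTTP/" | head -n 1)
classify "  HTTP/1.1 200 OK"        # prints: up
classify "  HTTP/1.1 302 Found"     # prints: down
```

This is stricter than the exit-status check: a redirect or an error page served with a non-200 code is reported as "down".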
06-30-2009 10:15 AM
Re: monitoring HTTP GET and HTTPS - 200
Actually, I could simplify the Perl you are looking to use (since I gutted some features I had that were not germane to your need):
# cat ./probeurl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Request;
use LWP::UserAgent;
my $url = shift or die "URL expected\n";
my $useragent = LWP::UserAgent->new;
my $request  = HTTP::Request->new( HEAD => $url );
my $response = $useragent->request($request);
print $response->status_line, "\n";
# ./probeurl http://www.hp.com
200 OK
# ./probeurl https://www.hp.com
404 Not Found
# ./probeurl ftp://www.hp.com
200 OK
Regards!
...JRF...
07-01-2009 03:43 AM
Re: monitoring HTTP GET and HTTPS - 200
I've been trying out the Perl script option, but it seems I'm getting timeouts for websites I can access normally from my browser.
I haven't tried wget on my Windows machine yet.
I will get back with the results after my weekend.
Really appreciate the great inputs.
Thanks and Regards,
-Alvi
07-01-2009 04:30 AM
Re: monitoring HTTP GET and HTTPS - 200
Windows? Now you're scaring me.
07-01-2009 04:44 AM
Re: monitoring HTTP GET and HTTPS - 200
> I've been trying out the Perl script option, but it seems I'm getting timeouts for websites I'm accessing normally using my browser
I'd be interested to know a few of the URLs that give you problems, including the HTTP code the script returns.
Regards!
...JRF...
07-05-2009 04:38 AM
Re: monitoring HTTP GET and HTTPS - 200
Following are some of my trials with this script.
E:\dump\scripts\elm-urlmon\Test>URLTEST.pl http://www.google.com
500 Can't connect to www.google.com:80 (connect: timeout)
E:\dump\scripts\elm-urlmon\Test>URLTEST.pl http://elm.com.sa
500 Can't connect to elm.com.sa:80 (connect: timeout)
E:\dump\scripts\elm-urlmon\Test>URLTEST.pl http://www.hp.com
500 Can't connect to www.hp.com:80 (connect: timeout)
E:\dump\scripts\elm-urlmon\Test>URLTEST.pl http://www.hp.com:8080
500 Can't connect to www.hp.com:8080 (connect: timeout)
There doesn't seem to be any delay in accessing the internet from this particular server. I tried port 8080, as that is the setting of our proxy.
The following, a local site with no security restrictions, came back successfully.
E:\dump\scripts\elm-urlmon\Test>URLTEST.pl http://elm-as-01
200 OK
Really appreciate your support.
I am working with Perl on Windows, as this will eventually be used on my OVOW server.
Thanks and Regards,
-Alvi
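One likely cause of these timeouts, given the proxy mentioned above: neither LWP nor wget will use a proxy unless told to. LWP::UserAgent can pick one up from the environment by calling `$useragent->env_proxy;` after `new`, and wget honors the standard proxy environment variables. A minimal sketch, with a placeholder proxy host (the real host and port must come from your site's proxy settings):

```shell
# Point command-line HTTP clients (wget, and LWP scripts that call
# env_proxy) at the site proxy. "your-proxy-host" is a placeholder.
http_proxy="http://your-proxy-host:8080"
https_proxy="$http_proxy"
export http_proxy https_proxy

# Then probe as before, e.g.:
#   wget -O /dev/null "http://www.hp.com"
echo "proxy set to $http_proxy"
```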
07-05-2009 07:37 AM
Re: monitoring HTTP GET and HTTPS - 200
Asking about Windows problems in an HP-UX forum may not be the most efficient way to get useful answers.
When you find an appropriate (Microsoft) forum, you might consider mentioning whether you can reach these servers from this system using a web browser, so that someone could tell whether you have a Perl problem or a general network problem.