Retrieve a webpage from a shell script
11-09-2004 09:12 PM
I'm looking for a way to retrieve a webpage from a shell script or the command line. The webpage contains basic system info, like output from 'bdf', but I do not have a login on that machine, as it is fully serviced by others.
What is the best way to get hold of that system-info page on a daily basis? I tried Perl, but we don't have the LWP module to do something like:
#!/usr/contrib/bin/perl
use LWP::Simple;
$AS=get('http://somelanhost/cgi-bin/solaris_cmd?df');
print $AS;
I'm not allowed to install tools like 'lynx' et al. to retrieve the info.
Does HP-UX 11.11 have any standard tools to retrieve a webpage from the command line?
Thanks in advance, John
Solved! Go to Solution.
11-09-2004 09:15 PM
Re: Retrieve a webpage from a shell script
Try 'wget' from here:
http://hpux.cs.utah.edu/hppd/hpux/Gnu/wget-1.9.1/
There are a few more tools available on the same site; go to its home page and search for 'web'.
-Sri
- Tags:
- wget
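Once wget is installed, a daily pull could look like the sketch below. The hostname and CGI path are the hypothetical ones from the original question, and the date-stamped output filename is my own invention; the actual fetch is left commented out since it needs the LAN host:

```shell
# Hypothetical URL from the question; adjust for the real host.
url="http://somelanhost/cgi-bin/solaris_cmd?df"
# Date-stamped output file, e.g. sysinfo.20041110.txt
out="sysinfo.$(date +%Y%m%d).txt"
# Uncomment once wget is installed and the host is reachable:
# wget -q -O "$out" "$url"
echo "$out"
```

Run from cron, this keeps one snapshot per day without needing an interactive login.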
11-09-2004 09:47 PM
Re: Retrieve a webpage from a shell script
Regards,
Fred
"Reality is just a point of view." (P. K. D.)
11-09-2004 10:15 PM
Re: Retrieve a webpage from a shell script
You can, however, use Perl, as your post shows; since you do not have LWP::Simple, just fake it using core modules:
--8<---
#!/pro/bin/perl -w
use strict;
use IO::Socket::INET;
my $itrcweb = "http://forums1.itrc.hp.com/service/forums";
my $server = "forums1.itrc.hp.com";
my $ux_url = "service/forums";
sub webget ($;$)
{
my ($url, $F) = @_;
local $/;
my $peer = $url =~ s{^(\w+)://}{} ? $1 : "http(80)";
my ($server, $page) = $url =~ m{^([^/]+)(?:/(.*))?};
defined $page or $page = "";
(my $f = $page) =~ s:.*/::;
$F ||= $f;
unlink $F;
printf STDERR "get %-38s\t", $F;
my $socket = IO::Socket::INET->new (
PeerHost => $server,
PeerPort => $peer,
Proto => "tcp",
Timeout => 90) or die "connect: $!\n";
print $socket
"GET /$page HTTP/1.0\r\n",
"Host: $server\r\n",
"User-Agent: Internet Explorer\r\n",
"\r\n";
my $t = 0;
my $data = <$socket>;
if (ref $F) {
$$F = $data;
return length $data;
}
if ($data) {
$t = length $data;
open my $p, ">", $F or die "$F: $!\n";
print $p $data;
close $p;
}
$t;
} # webget
webget ("http://somehost.lan/cgi-bin/solaris_cmd?df", "solaris.df");
my $sol_df;
webget ("http://somehost.lan/cgi-bin/solaris_cmd?df", \$sol_df);
-->8---
(Not tested)
Enjoy, Have FUN! H.Merijn
- Tags:
- Perl
11-09-2004 10:18 PM
Re: Retrieve a webpage from a shell script
Regards,
Fred
"Reality is just a point of view." (P. K. D.)
11-09-2004 11:56 PM
Re: Retrieve a webpage from a shell script
http://www.gnu.org/software/wget/wget.html
http://curl.haxx.se/
11-10-2004 02:27 AM
Solution

#!/sbin/sh
if [[ $# -lt 1 ]] ;then
print "supply stock symbol on command line"
exit 1
fi
for sym in "$@"
do
(telnet quote.yahoo.com 80 << eof
GET http://quote.yahoo.com/download/quotes.csv?symbols=${sym}&format=sl1d1t1
eof
) 2>/dev/null | awk '$0 ~ /\"/ {print}'
done
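The same trick applies to the system-info page in the question: pipe a raw HTTP request into telnet (or any TCP client). A minimal sketch of the request being built, using the hypothetical host and CGI path from the original post; the actual telnet fetch is left commented out since it needs the LAN host:

```shell
# Hypothetical host and path from the original question.
host=somelanhost
path="/cgi-bin/solaris_cmd?df"
# First line of a minimal HTTP/1.0 request.
request="GET $path HTTP/1.0"
printf '%s\n' "$request"
# To actually fetch (CRLF line endings, blank line ends the headers):
# printf '%s\r\nHost: %s\r\n\r\n' "$request" "$host" | telnet "$host" 80
```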
11-10-2004 07:35 AM
Re: Retrieve a webpage from a shell script
wget http://hostname.ext/page.html
If you need it to run on a schedule, set up a cron job to do it.
You may also want to Google "hcat" and "hgrepurl"; these are little Perl scripts that O'Reilly has on their FTP server. If you can't find them there, I am attaching a tarball.
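A cron entry for such a daily pull might look like the fragment below; the time, the wget path, the output file, and the hostname are all placeholders of my own, not details from the thread:

```shell
# Hypothetical crontab entry: fetch the page every day at 06:00.
# min hour dom mon dow  command
0 6 * * * /usr/local/bin/wget -q -O /var/tmp/sysinfo.txt "http://somelanhost/cgi-bin/solaris_cmd?df"
```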
11-10-2004 12:46 PM
Re: Retrieve a webpage from a shell script
Cheers! John