Newbie question: remote files via http?

SOLVED
Go to solution
eriera1
Occasional Visitor

Newbie question: remote files via http?

I have several files on an external web server. From a Unix shell, is it possible to access these files as if they were local files?

Something like this:
more http://mydomain/test.txt

and the contents of the remote test.txt file would be displayed...

Thanks,
5 REPLIES
Torsten.
Acclaimed Contributor
Solution

Re: Newbie question: remote files via http?

For scripting you need something like

http://hpux.connect.org.uk/hppd/hpux/Networking/WWW/curl-7.20.1/

or a perl or tcl script.
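For instance, a minimal sketch of the scripting approach with curl (view_url is a hypothetical helper, and the URL below is the placeholder from the question):

```shell
# Sketch: fetch a remote file over HTTP with curl and page through it,
# which gives roughly the effect the original "more http://..." hoped for.
view_url() {
  curl -s "$1" | more
}

# usage (placeholder URL):
# view_url http://mydomain/test.txt
```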

Hope this helps!
Regards
Torsten.

__________________________________________________
There are only 10 types of people in the world -
those who understand binary, and those who don't.

__________________________________________________
No support by private messages. Please ask the forum!

Matti_Kurkela
Honored Contributor

Re: Newbie question: remote files via http?

In your example command:
> more http://mydomain/test.txt

the shell will just relay the string "http://mydomain/test.txt" as a parameter to /usr/bin/more.

Because HP-UX /usr/bin/more does not support HTTP, you will get an error message:

http://mydomain/test.txt: No such file or directory

Your desired functionality cannot be easily achieved by just adding features to the Unix shell. To get what you want, you would need one of the following:

A) support for HTTP protocol added to all commands you wish to use with an external web server

or

B) an "HTTP filesystem" which would intercept all file requests within its mountpoint and download the appropriate file from the web server.
(e.g.:
mount -t httpfs dummy_parameter /web
more /web/mydomain/test.txt
or something like that.)

Unfortunately, as far as I know, B) is not available for HP-UX and A) would require replacing standard system commands with customized versions.

In Linux, a sub-system called FUSE allows developers an easy way to create pseudo-filesystems like this, so someone has already implemented it:
http://httpfs.sourceforge.net/

If someone would first implement a FUSE-like interface for HP-UX kernel, then it might be relatively simple to port this httpfs to HP-UX too.

MK
James R. Ferguson
Acclaimed Contributor

Re: Newbie question: remote files via http?

Hi:

And exactly what do you want to do with these files? You could obtain a textual representation with 'lynx' as for example:

# lynx -dump http://forums.itrc.hp.com/service/forums/questionanswer.do?threadId=1432950

This utility is available for HP-UX from the HP-UX Porting Centre if you are interested.

Regards!

...JRF...
Steven Schweda
Honored Contributor

Re: Newbie question: remote files via http?

> For scripting you need something like [...]

Or you could use wget.

http://www.gnu.org/software/wget/

> [...] have access [...]

Details depend on exactly what "have access" means.
eriera1
Occasional Visitor

Re: Newbie question: remote files via http?

Thanks everybody!

I'm using software called Adobe Output Designer to generate PDFs. This software has a built-in instruction called ^graph that receives as a parameter the full path to an image file and renders the image in the generated PDF. Like ^graph \abc\myimage.jpg

I was wondering if it was possible to use HTTP-stored images, like option B (FUSE). :) I mean, the ACO software would still think the file is a local file, but instead a wrapper would get the file via HTTP.

Anyway, since an external process would need to retrieve the list of URLs where the images exist beforehand, it can do the work of downloading and copying them into local files too (using wget or curl or whatever).
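That pre-processing step could look something like the sketch below (fetch_images is a hypothetical helper; the URL list file and destination directory are assumptions):

```shell
# Hypothetical pre-processing step: download every image listed in a
# URL file into a local directory, so ^graph can use ordinary local paths.
fetch_images() {
  urllist=$1
  destdir=$2
  mkdir -p "$destdir"
  while read -r url; do
    # Save each file under its basename in the destination directory.
    curl -s -o "$destdir/$(basename "$url")" "$url"
  done < "$urllist"
}

# usage:
# fetch_images urls.txt /var/tmp/images
# ^graph would then reference e.g. /var/tmp/images/myimage.jpg
```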