gzcat limits?
10-07-2004 10:43 PM
Looking at extracting single files from ignite make_net_recovery archives and I am having problems.
It appears that gzcat fails (with 'unknown error') when the archive is > 2Gb; it's fine below that.
Is this a problem with gzcat generally, or with the version I have -- gzcat 1.2.4 (18 Aug 93)?
Any help appreciated!
Tony.
Solved!
10-07-2004 11:01 PM
Solution: http://forums1.itrc.hp.com/service/forums/questionanswer.do?threadId=397889
The default version of the gzip tools supplied with HP-UX doesn't support >2Gb files. Update your gzip and this should sort out your problem.
10-07-2004 11:14 PM
Re: gzcat limits?
Thanks - I feel a bit stupid because I had actually scanned that posting! Obviously I should pay more attention.
I'll upgrade to the later gzip packages.
Cheers,
Tony.
10-11-2004 05:33 AM
Re: gzcat limits?
Before replacing gzip, please read what one Ignite-UX guru once told me:
----------------------------------------------------------------------
8. Ignite-UX, gzip, and large files
There seem to have been some questions recently about Ignite-UX,
gzip, and large files. There is a lot of misinformation and
customer confusion around this issue as well.
Because the version of gzip/gunzip shipped with HP-UX doesn't
directly support large files, people tend to assume that any
large file issue must be caused by gzip (or that gzip is at least
the first thing to look at). This is not necessarily true.
The way that Ignite-UX uses gzip/gunzip ensures that the version
shipped with HP-UX works perfectly. There was a problem back in 1999,
but it was resolved then and has worked since. Let's look at an
example:
If you use gzip as follows you can create a large gzipped file (this
is similar to the way that make_sys_image uses gzip):
# tar cvf - . | gzip > /var/largefiles/a.tgz
The gzip command will cope with whatever amount of data is being piped
through it by tar. To unzip the data you would do something like:
# cat /var/largefiles/a.tgz | gzip -d | tar tvf -
This works because the shells on HP-UX support large files. The gzip
command does not have to directly support large files to make this
work. Of course the gzip command still won't directly operate on a
large file but it is usually possible to do things in a pipeline
instead of operating directly on a file.
Please understand that one doesn't necessarily need to update
gzip when encountering large file errors from Ignite-UX; there
has been a lot of misinformation and myth propagated around this
issue.
The only known source of large file errors in Ignite-UX is
make_net_recovery (except when make_sys_image is used directly).
Typically, these errors are caused by:
1) NFSv2 is being used instead of NFSv3. NFSv2 only supports
files up to 2Gb and does not support large files at all.
2) The local or remote file system the archive or golden image
is being written to does not have large file support enabled.
When the file system does not support large files, it is gzip (from
within make_sys_image and make_net_recovery) that will print an error
about the file. Although gzip itself could successfully write a large
file via a shell pipe, an error can occur when a write is attempted
to a file system which does not support large files. This is not a
problem with gzip but rather a problem with the file system being
written to.
----------------------------------------------------------------------
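The file system check described above can be sketched as a quick probe: try to write one byte just past the 2 GiB boundary (a portable illustration, not an HP-UX-specific command; `lf-probe` is just a hypothetical scratch-file name, and the file is sparse so it consumes almost no disk space):

```shell
# Probe whether the current file system accepts files larger than
# 2 GiB (2,147,483,647 bytes): seek past the boundary and write one
# byte.  On success the (sparse) file is 2 GiB + 1 byte long.
if dd if=/dev/zero of=lf-probe bs=1 count=1 seek=2147483647 2>/dev/null
then
    echo "large files supported on this file system"
else
    echo "large files NOT supported on this file system"
fi
rm -f lf-probe
```

On HP-UX the authoritative check would be the file system's own largefiles mount option; the probe above only demonstrates the symptom.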
- Mark as New
- Bookmark
- Subscribe
- Mute
- Subscribe to RSS Feed
- Permalink
- Report Inappropriate Content
10-11-2004 08:14 PM
Re: gzcat limits?
I appreciate the information, but I believe in this case the issue was the version of the gzip utils that was at fault. make_net_recovery had created an archive of over 2Gb in size. The filesystems on both the host being archived and the NFS-mounted ignite fs have largefiles on. The archive had been created OK, but when I tried to use gzcat to extract a single file (or list of files) I got 'unspecified error'.
Updating gzip allowed me to extract a file -- I have only updated gzip on the ignite host so far, but will be doing it everywhere in case I need to use the archive 'in anger'. Thanks once again for the info.
Tony.
10-12-2004 07:07 AM
Re: gzcat limits?
You are using gzcat, and it has the same limitation as gzip and gunzip in that it cannot "directly" open a LARGEFILE.
I have worked up some notes... I'll post them in a separate reply.
Regards,
jpkole
10-12-2004 07:15 AM
Re: gzcat limits?
01) The gzcat utility does not handle files > 2gb:
-----------------------------------------------------------------
The gzcat utility is provided by the HP product called Software
Distributor (SW-DIST). It does not handle large files as we can
see from the tusc output below:
# which gzcat
/usr/contrib/bin/gzcat
# /usr/contrib/bin/gzcat -V
gzcat 1.2.4 (18 Aug 93)
Compilation options:
DIRENT UTIME STDC_HEADERS HAVE_UNISTD_H
# ls -al OSarch.gz
-rw-rw-rw- 1 root sys 2278914411 Oct 11 16:26 OSarch.gz
# (tusc -o2 /usr/contrib/bin/gzcat OSarch.gz) 2>&1 | head | tail -1
stat("OSarch.gz", 0x400345a0) .................. ERR#72 EOVERFLOW
As you can see above, gzcat attempted to issue a "stat" system
call on the large file, which (predictably) failed with errno=72
(EOVERFLOW), meaning "Value too large to be stored in
data type".
02) The cat utility will handle files > 2gb:
------------------------------------------------------
The cat utility is provided by the HP product called Core
Operating System (OS-Core). It will handle files > 2gb as
can be seen from the following tusc output:
# which cat
/usr/bin/cat
# ident /usr/bin/cat
/usr/bin/cat:
$Revision: 1.13 $
$Revision: 1.13 $
# (tusc -o2 /usr/bin/cat OSarch.gz) 2>&1 | head -55 | tail -2
open("OSarch.gz", O_RDONLY|O_LARGEFILE, 0) ..... = 3
fstat64(3, 0x7a002fa8) ......................... = 0
As you can see from the above output, the cat utility first
opens the file with the "O_LARGEFILE" flag (thus enabling large
file access) and then issues an fstat64 system call successfully.
03) The gzip utility works on a file > 2gb when used in a pipe:
-----------------------------------------------------------------------------------
Create a file bigger than the 2gb barrier (2,147,483,647 bytes):
# rm -f BigFile ; touch BigFile
# perl -e '$f="BigFile";syscall(129,$f,0x7fffffff)'
# ls / >> BigFile
# ls -la BigFile
-rwxrwxrwx 1 kole users 2147484058 Oct 11 14:47 BigFile
And we have the standard HP-UX gzip:
# which gzip
/usr/contrib/bin/gzip
# /usr/contrib/bin/gzip -V
gzip 1.2.4 (18 Aug 93)
Compilation options:
DIRENT UTIME STDC_HEADERS HAVE_UNISTD_H
And we try to gzip it from the command line:
# /usr/contrib/bin/gzip BigFile > /dev/null
BigFile: Unknown error
FAILURE!
But if we give gzip the same amount of data via a Unix pipe:
# /usr/bin/cat BigFile | gzip --fast > /dev/null
#
SUCCESS!
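As an aside, the perl syscall trick above is tied to HP-UX (the syscall number is platform-specific). On systems with GNU coreutils, a sparse file of the same size can be created more simply (a sketch; the `truncate` utility is assumed to be available, and `BigFile` is the same illustrative name as above):

```shell
# Create a sparse file just under the 2 GiB barrier without the perl
# syscall trick (GNU coreutils assumed; the file uses almost no disk).
truncate -s 2147483647 BigFile
ls / >> BigFile           # append a little, pushing it past 2 GiB
ls -la BigFile
rm -f BigFile
```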
04) The gunzip utility works on a OS archive > 2gb when used in a pipe:
-------------------------------------------------------------------------------------------------
If we have an existing OS archive > 2gb:
# ls -al OSarch.gz
-rw-rw-rw- 1 root sys 2278914411 Oct 11 16:26 OSarch.gz
Then the gunzip utility command line option fails:
# /usr/contrib/bin/gunzip -c OSarch.gz | pax -vf -
OSarch.gz: Unknown error
pax: The archive is empty.
FAILURE!
But if we include gunzip as part of a Unix pipe, all is OK (and
this is how it is done within Ignite's archive extract):
# /usr/bin/cat OSarch.gz | /usr/contrib/bin/gunzip | pax -vf - | head -4
USTAR format archive
drwxr-xr-x 0 root root Oct 11 15:51 ./
drwxr-xr-x 0 root root Oct 5 09:36 lost+found/
dr-xr-xr-x 0 bin bin Oct 11 15:51 etc/
-r--r--r-- 0 bin bin 184 Mar 26 2004 etc/group
#
SUCCESS!
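The streaming behaviour shown above can be reproduced on any system with a tiny file, since all that matters is that gzip itself never opens the archive file (file names here are illustrative):

```shell
# Round-trip data through gzip purely via pipes: gzip reads stdin and
# writes stdout, so it never stat()s or open()s the archive file, and
# any per-file size limit in the binary is bypassed.
printf 'sample payload\n' > demo.txt
cat demo.txt | gzip -c > demo.txt.gz
cat demo.txt.gz | gzip -dc > demo.out
cmp demo.txt demo.out && echo "pipeline round-trip OK"
rm -f demo.txt demo.txt.gz demo.out
```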
05) On a side note, gzip, gunzip and gzcat are identical:
---------------------------------------------------------------------------
By looking at the inode number and the link count for the 3
utilities (gzip, gunzip and gzcat) we can see they are all
actually the same executable (delivered by SW-DIST):
# ls -ali $(which gzip gunzip gzcat)
33844 -r-xr-xr-x 3 bin bin 139264 Aug 5 2003 /usr/contrib/bin/gunzip
33844 -r-xr-xr-x 3 bin bin 139264 Aug 5 2003 /usr/contrib/bin/gzcat
33844 -r-xr-xr-x 3 bin bin 139264 Aug 5 2003 /usr/contrib/bin/gzip
So I am not surprised they have common limitations.
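The inode/link-count check generalizes to any set of hard links; a minimal sketch with made-up file names:

```shell
# Hard links share a single inode: ls -i prints the same inode number
# for both names, and the link count (second column of ls -l) is 2.
echo 'payload' > gztool
ln gztool gztool-alias
ls -ali gztool gztool-alias
rm -f gztool gztool-alias
```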
06) The pax utility cannot handle a file > 2gb without a patch:
-----------------------------------------------------------------------------------
First, be sure patch PHCO_30150 is NOT on the system:
# swremove PHCO_30150
Create a file bigger than the 2gb barrier (2,147,483,647 bytes):
# rm -f BigFile ; touch BigFile
# perl -e '$f="BigFile";syscall(129,$f,0x7fffffff)'
# ls / >> BigFile
# ls -la BigFile
-rwxrwxrwx 1 kole users 2147484058 Oct 11 14:47 BigFile
Try to pax the file which is now > 2gb:
# pax -wvf /dev/null BigFile
pax: BigFile : > 2GB. Not Dumped.
FAILURE!
Now install patch PHCO_30150 (this patch can be obtained from
ftp://hpatlse.atl.hp.com/hp-ux_patches/s700_800/11.X/PHCO_30150 ):
# vpatch -a PHCO_30150
Retry the pax command:
# pax -wvf /dev/null BigFile
BigFile
#
SUCCESS!
07) And please remember:
-------------------------------------
Please remove the BigFile (your sysadmin will thank you):
# rm BigFile
[end]
10-12-2004 08:17 PM
Re: gzcat limits?
Thanks for the info.
With the upgraded gzip tools (v 1.3.5) I can use gzcat to extract files, which is what I'm used to doing.
I have applied the latest pax patch prior to installing Ignite 5.4.0.
Thanks again for the info.
My bigger concern is why the rootdg should be more than 2Gb anyway!
Tony.