Operating System - HP-UX

make_net_recovery and "gzip: stdout: File too large"

 
Charles Holland
Trusted Contributor

make_net_recovery and "gzip: stdout: File too large"

Well here goes.... In the last 2 days I have upgraded Ignite to version B.5.3.35 and gzip to version 1.3.5 on both the archive server and the client. The client is 11i and the server is 11.0.

In addition, I made the /ignite directory on the server largefiles-capable and changed the /etc/fstab file to reflect this change.
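
For reference, the change amounted to something like this (the logical volume name is just an illustration of my layout, not necessarily the same on your server):

# enable largefiles on the mounted filesystem that holds /ignite
fsadm -F vxfs -o largefiles /ignite
# matching /etc/fstab entry, e.g.
/dev/vg01/lvol1  /ignite  vxfs  delaylog,largefiles  0 2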

Unfortunately it still bombs out with the error "gzip: stdout: File too large". I thought I had it fixed, because the while/do loop I was running against the output directory showed the following...

-rw------- 1 bin sys 1210143384 Feb 10 13:53 2004-02-10,13:40
-rw------- 1 bin sys 1996434537 Feb 11 08:03 2004-02-11,07:46
-rw------- 1 bin sys 2045313024 Apr 30 13:31 2004-04-30,13:15
total 6262882
-rw------- 1 bin sys 1210143384 Feb 10 13:53 2004-02-10,13:40
-rw------- 1 bin sys 1996434537 Feb 11 08:03 2004-02-11,07:46

But then the one I was building went away..

Attached is the output of the following...
make_net_recovery command used and the results.
output of fsadm and fstyp
output of gzip -V
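
For anyone without the attachment, a typical invocation of each looks roughly like this (the server name, mount point and device file are placeholders, not necessarily my exact settings):

make_net_recovery -s igniteserver -x inc_entire=vg00 -v
fsadm -F vxfs /ignite          # reports largefiles or nolargefiles
fstyp -v /dev/vg01/lvol1       # filesystem type and superblock details
gzip -V                        # gzip version string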

I am at a loss now and welcome all suggestions.

Regards
"Not everything that can be counted counts, and not everything that counts can be counted" A. Einstein
6 REPLIES
Patrick Wallek
Honored Contributor

Re: make_net_recovery and "gzip: stdout: File too large"

Make sure that make_net_recovery is using the correct version of gzip. The problem most likely is that the new version installed to a different directory than the old one, and the old one is still being picked up because of the way your $PATH is defined.

In the past I have had to mv the old version aside and make a link from the old location to the new version of gzip, gunzip, gzcat, etc.
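
Something along these lines, adjusting the paths to wherever your old and new copies actually live:

which gzip                                        # see which copy is first in $PATH
mv /usr/contrib/bin/gzip /usr/contrib/bin/gzip.old
ln -s /usr/local/bin/gzip /usr/contrib/bin/gzip
# repeat for gunzip, gzcat, etc.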

Dave Hutton
Honored Contributor

Re: make_net_recovery and "gzip: stdout: File too large"

I ran into this also. I figured I needed to do fsadm -F vxfs -o largefiles on my filesystems... After looking closer into it, I think the limit for Ignite is 2 GB unless patched, and then it's 8 GB?
Needless to say, try excluding some filesystems to get your image under 2 GB.
Also do an itrc forum search. I know I've read about this before.
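
On the exclusion idea, something like this would do it (the directory name is just an example):

make_net_recovery -s igniteserver -x inc_entire=vg00 -x exclude=/somebigdir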

Dave

Nicolas Dumeige
Esteemed Contributor

Re: make_net_recovery and "gzip: stdout: File too large"

Hello,

A workaround could be to use a pipe and a background dd.
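
Something in the spirit of this, purely a sketch of the idea rather than a tested recipe (device and paths are illustrative):

mknod /tmp/archive.fifo p                          # named pipe
dd if=/tmp/archive.fifo of=/dev/rmt/0m bs=64k &    # background dd drains the pipe
# then point whatever writes the archive at /tmp/archive.fifo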

A 2 cents solution !

Nicolas
All different, all Unix
Charles Holland
Trusted Contributor

Re: make_net_recovery and "gzip: stdout: File too large"

Patrick
Yesterday, when I installed the newer gzip, I moved it from /usr/local/bin to /usr/contrib/bin, and the output of a find from the root directory confirms the existence of only the one gzip.
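
The find was essentially:

find / -name gzip -type f 2>/dev/null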

Hari,
While I couldn't follow your instructions to the letter, I was able to re-export the mount point on the server. For the client I simply rebooted the machine, as it is a test box at this time. There is not a mount point on the client that corresponds to the mount point on the server. I attempted the make_net_recovery command and the output on the server is as follows...
total 11016834
-rw------- 1 bin sys 1210143384 Feb 10 13:53 2004-02-10,13:40
-rw------- 1 bin sys 1996434537 Feb 11 08:03 2004-02-11,07:46
-rw------- 1 bin sys 2404712448 Apr 30 15:10 2004-04-30,14:50
total 11278978
-rw------- 1 bin sys 1210143384 Feb 10 13:53 2004-02-10,13:40
-rw------- 1 bin sys 1996434537 Feb 11 08:03 2004-02-11,07:46
-rw------- 1 bin sys 2504622080 Apr 30 15:11 2004-04-30,14:50
total 8915400
-rw------- 1 bin sys 1996434537 Feb 11 08:03 2004-02-11,07:46
-rw------- 1 bin sys 2530359558 Apr 30 15:11 2004-04-30,14:50

As you can see, it successfully created the file, and it is > 2 GB.

Dave,
I don't know about the Ignite limitation. In all my work I probably could have stayed on my previous version of Ignite, upgraded my gzip to 1.3.5, changed the directory on the server to be largefiles-capable, and re-exported the directory.

Nicolas,
Never had much luck with dd's. But I appreciate the .02.
"Not everything that can be counted counts, and not everything that counts can be counted" A. Einstein
skp
Occasional Contributor

Re: make_net_recovery and "gzip: stdout: File too large"

Hi Charles ,
Thanks a lot for mentioning the re-export as a solution to overcome that filesystem limit.
In my case I have two HP-UX 11.11 clients and an HP-UX 11.11 Ignite server. make_net_recovery worked on the first client but fails on the second with the "gzip: stdout: File too large" error. Everything was the same on the two clients and the server, i.e. the gzip version, product, patches, etc. Also, the archive created by the first client was even larger than that of the second, so why was the second failing?
Then I enabled largefiles support for the exported filesystem and re-exported it with the exportfs command, and make_net_recovery worked on the second client.
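
For the record, that amounted to something like this (the mount point is just a placeholder for the exported archive directory):

fsadm -F vxfs -o largefiles /export/ignite_archives    # enable largefiles
exportfs -u /export/ignite_archives                    # unexport
exportfs /export/ignite_archives                       # re-export with the options in /etc/exports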

Thanks
SKP
Charles Holland
Trusted Contributor

Re: make_net_recovery and "gzip: stdout: File too large"

This thread should have been closed a long time ago.
"Not everything that can be counted counts, and not everything that counts can be counted" A. Einstein