Operating System - HP-UX

Best way to copy directories with big files (>8GB)

 
SOLVED
noldi
Valued Contributor

Best way to copy directories with big files (>8GB)

Hello,

What is the best way to copy directories around
with large files (bigger than 8GB) on HP-UX 11.00
onwards on vxfs with largefiles support?
I would like to preserve
devs/links/perms/owner/times, etc.
All of the tar/pax/cpio utilities seem to have
file size restrictions that prevent out-of-the-box
operation and leave the big files behind.

Thanks & best regards,
Arnold Sutter
9 REPLIES
Prashant Zanwar_4
Respected Contributor

Re: Best way to copy directories with big files (>8GB)

I would say cpio or pax.

pax -rw olddir newdir

find . -xdev -print | cpio -pdmuv /targetdir

This should take care of your query.
You can also use the fbackup/frecover combination for a faster copy.

Thanks
Prashant

"Intellect distinguishes between the possible and the impossible; reason distinguishes between the sensible and the senseless. Even the possible can be senseless."
A. Clay Stephenson
Acclaimed Contributor

Re: Best way to copy directories with big files (>8GB)

Fbackup/frecover is probably your best bet although it is HP-UX specific.
If it ain't broke, I can fix that.
Steven E. Protter
Exalted Contributor

Re: Best way to copy directories with big files (>8GB)

If you have network connectivity between the boxes, openssh/secureshell can do the job without size limitations.

http://software.hp.com/portal/swdepot/displayProductInfo.do?productNumber=T1471AA

scp -pr source destination

That will try to preserve permissions and do a recursive copy.

SEP
Steven E Protter
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
noldi
Valued Contributor

Re: Best way to copy directories with big files (>8GB)

pax seems to have two personalities,
cpio and ustar. Both seem to leave
big files behind (similar to vanilla
tar); in particular, files larger than
8GB were not copied.

On the other hand, fbackup/frecover
could be an alternative.

I usually use this command:
"cd /from; tar cf - . | (cd /to; tar xpf -)"

How would this command be translated
to fbackup/frecover?

Thanks again & regards,

Arnold
Rick Garland
Honored Contributor

Re: Best way to copy directories with big files (>8GB)

If you are going to use cpio, get the GNU version. You won't have to worry about files greater than 2GB.

Geoff Wild
Honored Contributor

Re: Best way to copy directories with big files (>8GB)

How about with vxdump/vxrestore?

If between servers, then you would have to nfs mount.



Example:

vxdump -0 -f - -s 1000000 -b 16 /export/usr/sap/trans | (cd /mnt/usr/sap/trans ; vxrestore rf -)&

Rgds...Geoff
Proverbs 3:5,6 Trust in the Lord with all your heart and lean not on your own understanding; in all your ways acknowledge him, and he will make all your paths straight.
lawrenzo
Trusted Contributor

Re: Best way to copy directories with big files (>8GB)

I looked into this and the following could be used:

fbackup -f /dev/rmt/x

There are several options that can be used, so check out the man pages.

Use frecover to retrieve the data.

Also, SAM has a backup utility which uses fbackup.

hope this helps
hello
Sundar_7
Honored Contributor
Solution

Re: Best way to copy directories with big files (>8GB)

Try this Arnold,

# cd /target
# fbackup -vf - -i /source1 | frecover -rXvf -
Learn What to do, How to do and more importantly When to do?
noldi
Valued Contributor

Re: Best way to copy directories with big files (>8GB)

Thanks to all who replied.

The following command worked for us with
reasonable performance, regardless of file
size:

# cd /source
# fbackup -f - -i . | (cd /dest; frecover -rXf -)
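As a quick follow-up, a portable sanity check after any of these copies might look like the sketch below (/source and /dest stand in for the actual directories):

```shell
# Compare file counts between source and destination.
src_count=$(find /source -type f | wc -l)
dst_count=$(find /dest -type f | wc -l)
[ "$src_count" -eq "$dst_count" ] || echo "WARNING: file counts differ"

# diff -r exits non-zero if any file differs in content.
if diff -r /source /dest >/dev/null; then
    echo "copy verified: contents identical"
fi
```

This only checks counts and contents, not ownership or device files, but it catches the "big files left behind" problem immediately.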

Best Regards,

Arnold Sutter