
data copy

 
SOLVED
Elena Leontieva
Esteemed Contributor

data copy

Hi,

I need to copy a large volume (140 GB), NFS-mounted on one HP-UX server, to a newly created volume on an XP256 that will be mounted on a different HP-UX server. We could export the NFS volume to the second machine and run cp. What else could be done? I am looking for the fastest solution with the least impact on the environment.
Elena.
8 REPLIES
Pete Randall
Outstanding Contributor

Re: data copy

Elena,

I would lean toward restoring a tape backup onto the new system.

Pete

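A rough sketch of that route using fbackup/frecover, assuming a tape drive at /dev/rmt/0m (the device file and paths are only placeholders, and check frecover(1M) on your release for the exact -X behaviour):

# On the server where the old volume is mounted: back it up to tape
fbackup -f /dev/rmt/0m -i /oldvol -v

# Move the tape to the target server, then restore into the new volume
# (-X restores the files relative to the current directory rather than the original path)
cd /newvol
frecover -f /dev/rmt/0m -x -i /oldvol -X -v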
Paul Sperry
Honored Contributor

Re: data copy

cd /oldvol

find . -print | cpio -pcxvdmu /newvol

is how I would do it.
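For reference, a commented version of the same command; this is my reading of the flags from cpio(1), so verify them on your release:

cd /oldvol
# -p  pass mode: copy the selected files straight into the target directory
# -d  create intermediate directories as needed
# -m  preserve modification times
# -u  overwrite existing files in the target unconditionally
# -v  list each file as it is copied
# -x  save/restore device special files
find . -print | cpio -pcxvdmu /newvol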
Chris Vail
Honored Contributor

Re: data copy

If you do this routinely, the fastest way is to give each machine a dedicated NIC and wire them together back to back. You can then NFS-mount across that link rather than using the bandwidth of your regular LAN.
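A sketch of that setup, where lan1 is the spare NIC on both boxes and the 192.168.10.x addresses, mount point and volume names are only examples:

# On the server that holds the old volume: bring the spare NIC up on a private subnet
ifconfig lan1 192.168.10.1 netmask 255.255.255.0 up
# add the volume to /etc/exports (restricted to the other server), then
exportfs -a

# On the server with the new XP256 volume: the other end of the crossover cable
ifconfig lan1 192.168.10.2 netmask 255.255.255.0 up
mkdir -p /mnt/oldvol
mount -F nfs 192.168.10.1:/oldvol /mnt/oldvol
cp -Rp /mnt/oldvol/. /newvol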



Chris
Hai Nguyen_1
Honored Contributor

Re: data copy

Elena,

If the two servers are networked, you can use rcp to copy the data remotely.
NFS goes through the remote procedure call (RPC) layer for every operation, while rcp simply streams the data over the connection.
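For example (serverB and the paths are placeholders; -p keeps modes and timestamps, and it needs .rhosts or hosts.equiv trust between the two servers):

# copies the tree to /newvol/oldvol on the remote side
rcp -r -p /oldvol serverB:/newvol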

Hai
Helen French
Honored Contributor

Re: data copy

I would use NFS only as a last resort for this type of file transfer. I would try any of the following, in this order:

1) ftp (I think this is the fastest)
2) rcp
3) fbackup with remsh (you can use compression if needed)
4) cpio with remsh (see the sketch after this list)
5) rdump or vxdump
6) tar backup with remsh
7) NFS
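A sketch of option 4 (placeholders in angle brackets), compressing on the wire to save LAN bandwidth at the cost of CPU on both ends; check compress(1) on your release:

cd /oldvol
find . -print | cpio -ocx | compress | remsh <remote_host> "cd /newvol; uncompress | cpio -icxvdmu"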
Life is a promise, fulfill it!
Elena Leontieva
Esteemed Contributor

Re: data copy

r-commands are disabled in our environment.

Elena.
Paul Sperry
Honored Contributor

Re: data copy

This would require .rhosts to be set up:


cd /oldvol
find . -print | cpio -ocxa | remsh <remote_host> "cd /newvol; cpio -icxvdmu"
Caesar_3
Esteemed Contributor
Solution

Re: data copy

Hello Elena!

Do you have SSH? Then you could use the
s-commands instead of the r-commands.

One of the fastest ways to copy a lot of files
taking up a lot of space is to pipe tar
to the remote host:
tar cf - <dir> | ssh <remote_host> "(cd <target_dir>; tar xf -)"

or use dump/restore (you may not have these tools and might need to install them):
dump -0 -f - <filesystem> | ssh <remote_host> "(cd <target_dir>; restore -rf -)"

If you don't have any remote copy tools,
you will need to export the volume and copy
over NFS, but that is not the best way;
NFS has its problems.

If you have the SSH package installed,
you can also use scp:
scp -r -p <dir> <remote_host>:<target_dir>
But tar is the best for this job; a few months
ago I ran a test to see which command is the
fastest for copying files over the network, and it was tar.

The only problem you can run into is that
if anything fails partway through the copy,
you either start again from the beginning
or sort through the mess to work out what has already been copied.

But the first thing is to make sure the network connection between the two servers is fast and reliable.
That will help you.

(140 GB at a good speed is not that much!)

Good luck!

Caesar