
Michael Ernstoff
Frequent Advisor

Strange NFS Problem

I have a problem copying some data from an archive system to an HP-UX 11 system.
The archive data is on a Windows 2000 server, with several directories, each of which is on the order of 100 GB in size and contains 3,000,000 files in a completely flat structure (i.e. 3,000,000 files in one folder).
I need to copy the data in the same structure to my HP server.
I know this is highly inefficient, but I have no control over this.
The data is NFS exported from W2K (using Windows Services for Unix), and mounted on the HP server.
I have tried both "cp -r" and "find | cpio -d".
In both cases it works for a while, then the problem occurs.
After around 300,000 files (between 8 and 9 GB of data) the copy starts re-copying the same files again and again.
There are no error messages on either the Unix or the Windows side. All processes appear to be working on both, but the same files are copied over and over, so the total number of files copied remains at the 300,000-odd figure.
At the moment I have no idea if the problem is in the HP-UX or Windows environment.
Any suggestions on how to track this down?
16 REPLIES
RAC_1
Honored Contributor

Re: Strange NFS Problem

Are the files on Windows being changed while you are copying them?

Also, it would be wiser to copy the data in chunks, e.g. copy files a-e, then f-m, and so on.

Anil
There is no substitute to HARDWORK
Michael Ernstoff
Frequent Advisor

Re: Strange NFS Problem

The files are not being updated on Windows. The NFS mount is read only.
While it would be possible to break the copy down into smaller chunks, it would be difficult, as commands like "ls" tend to fail with that number of files.
RAC_1
Honored Contributor

Re: Strange NFS Problem

ls -d [a-eA-E]* | xargs -I{} cp -r {} /dest_dir/

You may want to split it further,
e.g. a-b, c-d and so on.

Anil
There is no substitute to HARDWORK
Peter Godron
Honored Contributor

Re: Strange NFS Problem

Michael,
from what I can find, there is no hard limit on the number of files in a single directory under VxFS.
Problems may come from the inode table growing too big for the copy to use; it may hit the end of its buffer and then start copying the last files over and over again.

Can you let us know if it is the last files it tries to copy?

I would try the approach of copying chunks of files.

Also, have you considered the impact on file searches and backup strategy with this approach?
Regards
Steven E. Protter
Exalted Contributor

Re: Strange NFS Problem

inode issue sounds promising.

Try making a tar file with WinZip, transferring the single file, and seeing what the results are.

They should be similar if lack of inodes is the issue.

To accommodate this structure on HP-UX, kernel modification is probably required.

SEP
Steven E Protter
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
Michael Ernstoff
Frequent Advisor

Re: Strange NFS Problem

Looking in Glance, the inode table has only reached around 1/3 full.

It is not just one file being repeatedly copied, and it is not all 300,000.
Looking at the timestamps, it seems to be re-copying from file number 236658 to file number 319134 of 319134.
As I said, there should be over 3 million, so it gets around 10% of the way through, then repeats the last 25% of those already copied.
Michael Ernstoff
Frequent Advisor

Re: Strange NFS Problem

Can tar read WinZip files?
This may be a possibility, provided there is sufficient space on the Windows box to store the zip file.
What HP-UX kernel modifications do you think would be required?
Steven E. Protter
Exalted Contributor

Re: Strange NFS Problem

Yes, tar can read WinZip files.

Or use the unzip command.

I know that these files are cross-platform compatible.

I also commonly create tarballs on HP-UX and read them with recent versions of WinZip, no problem.

SEP
Steven E Protter
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
Michael Ernstoff
Frequent Advisor

Re: Strange NFS Problem

I've just been checking the WinZip website: WinZip can read tar files, but not write them.

The chunks-of-files idea is looking promising. At the moment the Wintel administrator is generating a list of files for me.
Raynald Boucher
Super Advisor

Re: Strange NFS Problem

Just a thought...

You said the directory structure is flat but...
Could there be a symbolic link or its equivalent buried in there somewhere?
Michael Ernstoff
Frequent Advisor

Re: Strange NFS Problem

There are no symbolic links in there.
Steve Lewis
Honored Contributor

Re: Strange NFS Problem

Actually, there is a limit to the number of files in a single VxFS directory, set by the kernel parameter vx_maxlink. We hit that limit when trying to create 50,000 files in an archive directory on 11.11 last month. I am surprised you got to 300,000.

This means that you must create multiple directories in which to store all the W2K data, then copy it over in chunks. I suggest you remove each file from W2K after you have copied it over.
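For example, something like this fan-out: bucket each file into a subdirectory named after its first character, so no single target directory grows unbounded. All paths are illustrative (demoed on scratch directories; substitute the real mount and target):

```shell
# Fan a flat source directory out into per-letter bucket directories.
# src/dst are scratch stand-ins for the NFS mount and the local target.
src=$(mktemp -d)
dst=$(mktemp -d)
touch "$src/alpha" "$src/beta" "$src/bravo"
( cd "$src" && find . -type f -print ) | while read -r f; do
    name=${f#./}
    bucket=$(printf '%s' "$name" | cut -c1)   # first character of the name
    mkdir -p "$dst/$bucket"
    cp "$src/$name" "$dst/$bucket/"
done
ls "$dst"
```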



Michael Ernstoff
Frequent Advisor

Re: Strange NFS Problem

vx_maxlink does not control the number of files in a directory.
I have exceeded 300,000, and the maximum allowable for that parameter is 65,534
(that is according to an HP-UX 11.11 document on docs.hp.com; my server is actually running 11.00).
Oliver Stoklossa
Frequent Advisor

Re: Strange NFS Problem

Maybe you can try the tar command to copy your files since, as you say, the Windows drive is mounted (the directory names below are placeholders for your source and destination):

( cd /source_dir ; tar cvf - * ) | ( cd /dest_dir ; tar xvf - )
Stephen Keane
Honored Contributor

Re: Strange NFS Problem

vx_maxlink is the maximum number of links to a file.

vx_ninode is the number of inode entries in the inode table.

There will be an upper limit to the number of files present on the same device because of the format of the NFS file handle passed between the Windows 2000 server and the HP server. I suspect the limit is around the 0x40000 mark, i.e. 262,144 in decimal.
Michael Ernstoff
Frequent Advisor

Re: Strange NFS Problem

I have already tried using tar, with no success.
Just back from a week away and will be looking at this again tomorrow.
The limits on NFS could be significant.