Operating System - HP-UX

file copies from one location to a second

 
SOLVED
rmueller58
Valued Contributor


I am trying to reduce some network overhead in a script. I have a working draft below, but I'd like to know whether someone sees another approach that may work better.

As it stands, we have "web users" that post files to $p1. I want to copy files that are updated on the web to $distpath$appid; files may already exist in that location, so I iterate $docID. What I am curious about is my "find" command:

with "-mtime 0", can I narrow the selection down to an hourly increment?

Any thoughts appreciated.


#!/bin/bash
# Copy today's web uploads into the applicant document tree.
p1=/upload_resumes/
p2=/upload_resumes/backup/
distpath=/mps-docs/mps-docs/applicant/
LOG=/tmp/app-staged-pre-move-post-copy.log

cd "$p1" || exit 1
: > "$LOG"                     # truncate (and create) the log

# Files modified within the last day (-mtime 0)
for fn1 in $(find . -type f -mtime 0 -print)
do
    fn=${fn1#./}               # strip find's leading "./"
    echo "FILENAME: $fn"

    # APPID-SPECIFIC INFORMATION -- filename layout: distID_appID_docID.ext
    distID=$(echo "$fn" | awk -F_ '{print $1}')
    appID=$(echo "$fn" | awk -F_ '{print $2}')
    docID=$(echo "$fn" | awk -F_ '{print $3}' | cut -d. -f1)
    ext=$(echo "$fn" | awk -F_ '{print $3}' | cut -d. -f2)

    source="${p1}${fn}"
    destination="${distpath}${appID}/${fn}"
    echo "$source -> $destination"

    # Bump docID until the destination filename is free
    while [[ -f ${distpath}${distID}_${appID}_${docID}.${ext} ]]
    do
        (( docID += 1 ))
        if (( docID > 99 )); then
            echo "ERROR: docID too high!"
            exit 2
        fi
    done

    cp -f "$fn" "${distpath}mpsplus_${appID}_${docID}.${ext}"
    ls -la "${distpath}mpsplus_${appID}_${docID}.${ext}" >> "$LOG"
done
Steven E. Protter
Exalted Contributor

Re: file copies from one location to a second

Shalom,

I would reduce network overhead by using rsync for HP-UX.

What you seem to be trying to do is duplicate functionality built into that utility.

http://hpux.connect.org.uk/hppd/hpux/Networking/Admin/rsync-3.0.4pre2/

You may be better off compiling that one. If rsync has been added to the Internet Express package at http://software.hp.com, use that release.

rsync will only transmit modified files, substantially lowering overhead on the network.

SEP
Steven E Protter
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
rmueller58
Valued Contributor

Re: file copies from one location to a second

Steven,

These are CIFS-mounted directories on Linux.

What rsync command would you use to replace
cp -f $fn ${distpath}mpsplus_${appID}_${docID}.${ext}?

rsync -f?
rmueller58
Valued Contributor

Re: file copies from one location to a second

Can I somehow tell my find command to use an interval of one hour rather than one day for "-mtime"?
James R. Ferguson
Acclaimed Contributor
Solution

Re: file copies from one location to a second

Hi Rex:

You can do this a couple of ways. If you want to remain "pure shell" then create a reference point and 'find' files newer than that:

...
touch -cmt 12111400 /tmp/reffile
find /path -type f -newer /tmp/reffile -print

...Now this leaves you having to calculate one-hour ago (or whatever)...[messy]
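If a GNU-style date(1) is available (it is on Linux, though stock HP-UX date has no -d option), the one-hour-ago arithmetic can be delegated to date itself; a hedged sketch using a temporary stand-in directory:

```shell
DIR=$(mktemp -d)                 # stand-in for the real upload tree
touch "$DIR/new_upload.doc"      # modified "now", i.e. within the hour

# Stamp the reference file one hour in the past, then find anything newer.
touch -t "$(date -d '1 hour ago' +%Y%m%d%H%M)" /tmp/reffile
find "$DIR" -type f -newer /tmp/reffile -print
```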

THUS, I'd take a pure Perl approach:

# cat ./findago
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
my @path = @ARGV ? @ARGV : ('.');
find( sub { print "$File::Find::name\n" if -f $_ && -M _ <= ( 1 / 24 ) },
@path );
1;

...run as:

# ./findago /path

This finds files updated in the last hour (1/24 of a day) in the '/path' or paths passed as arguments. The current directory is assumed if no argument is given.

Regards!

...JRF...

Steven E. Protter
Exalted Contributor

Re: file copies from one location to a second

rsync samba to samba?

No I don't think that will work. rsync is designed to copy files between two distinct systems.

If the destination is some kind of shared storage, rsync is your best option. Otherwise use JRF's find command.

SEP
James R. Ferguson
Acclaimed Contributor

Re: file copies from one location to a second

Hi (again):

By the way, even though I said "pure Perl", I only meant that you need Perl for the 'find'. Call the small Perl script and redirect its output to a file, or collect its output in a shell variable:

...
./findago /somepath > /tmp/files.$$
...

or:

FILES=$(./findago /somepath)
...

Regards!

...JRF...
rmueller58
Valued Contributor

Re: file copies from one location to a second

Thanks guys, this helps..


rmueller58
Valued Contributor

Re: file copies from one location to a second

I used "-mmin -60" in my find command to restrict it to files modified within the last hour, and am running the cron script hourly.

James R. Ferguson
Acclaimed Contributor

Re: file copies from one location to a second

Hi (again) Rex:

> I used "-mmin -60" in my find command

Ah, then you are using the GNU 'find' and not the standard HP-UX one!

Yes, the GNU variants are far, far more feature-rich than the standard versions that vendors like HP and IBM provide!

Yes, this is certainly an alternative to the Perl script I suggested. If portability of your script ever becomes an issue, though, I'd bet you'd find Perl already available far more often than the GNU 'findutils' package [unless, of course, the port was to Linux].

Regards!

...JRF...
OldSchool
Honored Contributor

Re: file copies from one location to a second

Actually, rsync should do nicely. The notes at
http://rsync.samba.org/ftp/rsync/rsync.html
state:

Rsync copies files either to or from a remote host, or locally on the current host (it does not support copying files between two remote hosts).

*and*

As expected, if neither the source or destination path specify a remote host, the copy occurs locally (see also the --list-only option).
rmueller58
Valued Contributor

Re: file copies from one location to a second

James,

Yes, I am using the GNU find. The network operations manager asked this morning whether the job had stopped, because he wasn't seeing tons of generated traffic!

rmueller58
Valued Contributor

Re: file copies from one location to a second

I have to clean it up because of some path issues, but I am pretty close to having it working.

One other quick question: I want to add a small section at the bottom of the script that checks the log for a string value and, if the string exists, sends an email.


I thought I could do the following:

export jactrue=`grep mpsplus /tmp/move-jac.log`
echo $jactrue
if [ -n "$jactrue" ]
then
mail -s "JAC hourly Move Log" emailaddress@domain.TLD< /tmp/move-jac.log
else
echo > /dev/null
fi

Any ideas?
James R. Ferguson
Acclaimed Contributor

Re: file copies from one location to a second

Hi (again) Rex:

> I thought I could do the following:

export jactrue=`grep mpsplus /tmp/move-jac.log`
echo $jactrue
if [ -n "$jactrue" ]
then
mail -s "JAC hourly Move Log" emailaddress@domain.TLD< /tmp/move-jac.log
else
echo > /dev/null
fi

...several comments:

1. You don't need 'export' for this.
2. Define your log file name as a variable so it is easy to change in one place later.
3. Eliminate the useless "echo > /dev/null".

Thus:

MYLOG=/tmp/move-jac.log
jactrue=$(grep mpsplus ${MYLOG})
if [ -n "${jactrue}" ]; then
    mail -s "JAC hourly Move Log" emailaddress@domain.TLD < ${MYLOG}
fi

...notice too that I "improved" the variable interpolation with curly braces (a good, safe habit), and that I substituted the POSIX $(...) notation for the back-tick syntax :-)

Regards!

...JRF...
rmueller58
Valued Contributor

Re: file copies from one location to a second

Thanks James,

And thanks for the suggestions and reminders.

rmueller58
Valued Contributor

Re: file copies from one location to a second

Thanks James and all.