
Re: perl and ssh

SOLVED
uk1
Frequent Advisor

perl and ssh

Is there any way to keep an ssh session open through Perl to run commands back and forth? I want to compare a set of files between servers (around 3000 files), and running an ssh $_ on every line of a file (containing the files that need to be checked) can take 5+ hours. Another thought would be to write file sizes and date stamps to a file and compare those, but that doesn't seem too efficient.
8 REPLIES
RAC_1
Honored Contributor

Re: perl and ssh

1. It can be done without keeping a session open. I think rdist/rsync should have options for this.

2. Another way to do it:

NFS-mount from the other host and run diff/uniq/comm on the files.

3. Run cksum on all files and compare the checksums.

4. Line-by-line comparison.
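Option 3 could be sketched like this. It is a local-only demonstration of the idea; in practice the second cksum would run on the remote host (e.g. via remsh $remote -n "cksum $f"), and the demo file names here are made up.

```shell
# Checksum comparison (option 3). Both sides are local files here so
# the idea can be run anywhere; swap one side for a remsh/ssh call
# against the remote host in real use.
echo "same data" > /tmp/cksum_demo_a
echo "same data" > /tmp/cksum_demo_b

# Read via stdin so cksum's output doesn't include the filename,
# making the two results directly comparable.
sum_a=$(cksum < /tmp/cksum_demo_a)
sum_b=$(cksum < /tmp/cksum_demo_b)

if [ "$sum_a" = "$sum_b" ]; then
    echo "checksums match"
else
    echo "checksums differ"
fi
```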

The old and crude way of doing it would be something like this:

ls -1 > file_list_local
remsh $remote -n 'ls -1' > file_list_remote

if diff file_list_local file_list_remote > /dev/null; then
    echo "file lists are the same on local and remote host"
fi

while read -r i
do
    line_count_local=$(wc -l < "$i")
    line_count_remote=$(remsh $remote -n "wc -l < $i")
    if [[ $line_count_local -eq $line_count_remote ]]; then
        echo "same line count for file $i on local and remote"
    else
        echo "file $i does not match on remote"
    fi
done < file_list_local

SCRIPT not tested.

Anil
There is no substitute to HARDWORK
uk1
Frequent Advisor

Re: perl and ssh

That is some food for thought. However, we are not running NFS, mainly because we are in a Serviceguard cluster and we/they (I just started here) have not yet implemented HA NFS, and there are no real plans to do so. If we could get HA NFS working, this would be a lot easier. If it weren't for the number of files I have to check, it would be easier too. I am thinking of offering a full, partial, or single-file check for ease of use, though. Actually, that gives me something to think about! Thanks for your input!
uk1
Frequent Advisor

Re: perl and ssh

Actually, rdist is looking promising as well. Looking into that now.
RAC_1
Honored Contributor

Re: perl and ssh

Method 4 is also possible; it just requires additional scripting and testing. It will work best if the files are ASCII files.

Anil
There is no substitute to HARDWORK
Tom Schroll
Frequent Advisor

Re: perl and ssh


David,

I have no perl solution. :-(

*However*, I do know that rsync does a very good job of comparing files, even if you don't actually want to copy them.

If you can install rsync on all of your servers, it is a tool with many uses beyond this one, and it can use ssh as its connection protocol. Rsync is available on the Software and Porting Archive.

For your purpose, you would use rsync with the -n option (or --dry-run).

Here is a very quick and crude example of how rsync can tell you whether files are different. This example uses local directories, but you can also use the form hostname:/path/directory to compare remote directories, and the -e option (--rsh=) to specify ssh.

[storm@thunder storm]$ ls -al 9c1
total 44
drwxrwxrwx 2 storm users 43 Jul 25 15:16 .
drwx--x--x 39 storm users 8192 Jul 25 15:14 ..
-rw------- 1 storm users 23142 Mar 30 1999 exhaust
-rw-r--r-- 1 storm users 738 Sep 14 1999 parts
-rw------- 1 storm users 2410 Apr 26 1999 wing
[storm@thunder storm]$ ls -al 9c1a
total 12
drwxrwxrwx 2 storm users 6 Jul 25 15:16 .
drwx--x--x 39 storm users 8192 Jul 25 15:14 ..
[storm@thunder storm]$ rsync -av 9c1/ 9c1a/
building file list ... done
./
exhaust
parts
wing
wrote 26532 bytes read 68 bytes 53200.00 bytes/sec
total size is 26290 speedup is 0.99
[storm@thunder storm]$ ls -al 9c1a
total 44
drwxrwxrwx 2 storm users 43 Jul 25 15:16 .
drwx--x--x 39 storm users 8192 Jul 25 15:14 ..
-rw------- 1 storm users 23142 Mar 30 1999 exhaust
-rw-r--r-- 1 storm users 738 Sep 14 1999 parts
-rw------- 1 storm users 2410 Apr 26 1999 wing
[storm@thunder storm]$ rsync -av 9c1/ 9c1a
building file list ... done
wrote 122 bytes read 20 bytes 284.00 bytes/sec
total size is 26290 speedup is 185.14
[storm@thunder storm]$
[storm@thunder storm]$ echo "different data now" > 9c1a/parts
[storm@thunder storm]$ rsync -nav 9c1/ 9c1a/
building file list ... done
parts
wrote 126 bytes read 24 bytes 300.00 bytes/sec
total size is 26290 speedup is 175.27
[storm@thunder storm]$

You will see in this example that I have two directories: 9c1 and 9c1a. I start with 9c1a empty, then rsync 9c1 into it to make them identical. Then I change a file and run rsync -n again to show that it detects and reports the changed file without actually copying anything.

Rsync is the most efficient and reliable freeware tool of its kind that I've seen, and it uses checksums and sizes to determine whether a file is different. It is also efficient at sending only the differences across the network when you use it to actually update files.

Hope this helps.

-- Tom
If it ain't broke, it needs optimized.
uk1
Frequent Advisor

Re: perl and ssh

The only problem with method 4 is the amount of time it would take to process over 3000 files. I am going to try rdist and see how that goes. Thanks, all, for the help.
Carles Viaplana
Valued Contributor

Re: perl and ssh

Hi David,

I don't know of any way to keep an ssh connection open using Perl, so you can:

- execute a script on the remote host and then fetch the results
- fetch the files you need to compare using the rcp command

Regards,

Carles
Stuart Browne
Honored Contributor
Solution

Re: perl and ssh

Well, you can either use Perl SSH modules (they exist) to open a session and do what you want, or you could open 'ssh' as a piped file-handle and keep it open that way *shrug*

Be sure to check that the file handle hasn't died on you, though, given the extended period.
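The piped-filehandle idea could look something like this minimal sketch. Here /bin/sh stands in for an ssh child (e.g. "ssh user@host /bin/sh", with a hypothetical host) so the approach can be run locally; the point is that one long-lived session handles many commands instead of forking a fresh ssh per file.

```perl
use strict;
use warnings;
use IPC::Open2;
use IO::Handle;

# Open one long-lived bidirectional session. Replace '/bin/sh' with
# something like 'ssh user@host /bin/sh' to talk to a real remote.
my $pid = open2(my $from_sh, my $to_sh, '/bin/sh');
$to_sh->autoflush(1);

# Run one command over the persistent session and collect its output,
# using a sentinel line to find the end of each reply.
sub remote_cmd {
    my ($cmd) = @_;
    print $to_sh "$cmd; echo __EOC__\n";
    my @out;
    while (my $line = <$from_sh>) {
        last if $line =~ /^__EOC__/;
        chomp $line;
        push @out, $line;
    }
    return @out;
}

# Many commands, one session -- this is where the 5+ hours gets saved.
my ($hello) = remote_cmd('echo hello');
my ($count) = remote_cmd('printf "a\nb\n" | wc -l');

close $to_sh;
waitpid $pid, 0;
```

You would also want to check the child is still alive (e.g. kill 0 => $pid) before each command, as suggested above.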
One long-haired git at your service...