
Background connections

 
Viliam Kocinsky
Occasional Contributor

Background connections

Hi,
there are some tasks that need to be done on about 500 servers. Done one by one, this takes considerable time. I tried starting each connection in the background and waiting for the results with 'wait pid'. That works well. But if I also want to catch the output of each connection and display it in the order the connections were started, it doesn't work. I wrote this script:
------------------------------------------
#!/bin/sh

pipedir=~/myscript$$
mkdir $pipedir
processes_max=200
processes_toread=100
servers_started=0
servers_read=0
processes_running=0

# Get servers
i=0
for srv in $(get_servers for_testing); do
    server_list[$i]=$srv
    let i++
done
servers_count=${#server_list[*]}

# Start processes in background
while [ $servers_read -lt $servers_count ]; do
    server_actual=$servers_started
    while [ $processes_running -lt $processes_max ] && [ $server_actual -lt $servers_count ]; do
        mkfifo $pipedir/$server_actual
        ssh -q ${server_list[$server_actual]} "PATH=\$PATH:/usr/bin:/usr/sbin:/usr/local/bin; sudo uname" > $pipedir/$server_actual &
        process_list[$server_actual]=$!
        let server_actual++
        let processes_running++
        let servers_started++
    done
    # Show results
    server_actual=$servers_read
    i=0
    while [ $i -lt $processes_toread ] && [ $server_actual -lt $servers_started ]; do
        if [ $server_actual -eq $servers_count ]; then
            break 2
        fi
        echo -n ${server_list[$server_actual]}:
        cat <$pipedir/$server_actual
        wait ${process_list[$server_actual]}
        rm $pipedir/$server_actual
        let i++
        let processes_running--
        let server_actual++
        let servers_read++
    done
done
rmdir $pipedir
exit 0
-----------------------------------------

It uses a named pipe to carry the output of each connection. The problem is that 'ssh' is blocked until the other side of the pipe is opened ('cat pipe'), so in practice it still runs almost one by one.
I also tried to store the result of a background process in a variable, but it either stores an empty string (when read immediately after the process goes to the background) or it waits for the process to finish before assigning the output. It would be simpler to just let each process write its output to stdout as it finishes, but that would be a mess. I need the output arranged in the form:
servername1:output1
servername2:output2
Does somebody have a solution?
Thanks.
Viliam Kocinsky
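
One way around the writer-side blocking is to redirect each ssh to a regular temporary file instead of a FIFO: a plain file never blocks the writer, so all connections really do run in parallel, and the files can still be printed in start order afterwards. A minimal sketch of this pattern (the `run_remote` function and server names are hypothetical stand-ins; in real use the function body would be the ssh command):

```shell
# Write each job's output to a regular file (unlike a FIFO, a plain
# file never blocks the writer), wait for all jobs, then print the
# results in the order the jobs were started.
outdir=$(mktemp -d)

run_remote() {
    # local stand-in for: ssh -q "$1" "sudo uname"
    echo "output-from-$1"
}

servers="server1 server2 server3"

i=0
for srv in $servers; do
    run_remote "$srv" > "$outdir/$i" &    # all jobs start immediately
    i=$((i+1))
done
wait                                      # jobs ran concurrently

results=""
i=0
for srv in $servers; do
    results="$results$srv:$(cat "$outdir/$i")
"
    i=$((i+1))
done
rm -rf "$outdir"
printf '%s' "$results"
```

The ordering comes from the file names, not from job completion, so a slow server only delays the printing step, not the other connections.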
2 REPLIES
Steven E. Protter
Exalted Contributor

Re: Background connections

Shalom,

Lets say I had a list of servers called list

while read -r hostname
do
    echo "run command"
    ssh -f $hostname "command"
done < list

This would of course require a public ssh key be placed on the server.

I've used -f to run ssh commands in batch mode with success.
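
The read loop over a server list can be exercised locally like this (a throwaway `list` file, with a comment marking where the ssh invocation would go):

```shell
# Build a temporary server list and walk it with the same read loop;
# the loop body is where 'ssh -f "$hostname" "command"' would run.
list=$(mktemp)
printf 'hostA\nhostB\n' > "$list"

visited=""
while read -r hostname; do
    # real use: ssh -f "$hostname" "command"
    visited="$visited$hostname "
done < "$list"
rm -f "$list"
printf '%s\n' "$visited"
```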

SEP
Steven E Protter
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
James R. Ferguson
Acclaimed Contributor

Re: Background connections

Hi Viliam:

Here's a prototype that collects the output from a background process into a file. Each line of the file begins with an ordinal process number (1..n) which allows you to sort the results back into the order in which the processes were started.

# cat ./probe
#!/usr/bin/sh
typeset -i i=0
typeset SERVERS=/tmp/servers
typeset RESULTS=/tmp/results.log
for SERVER in $(< ${SERVERS})
do
    i=$((i+1))
    ssh -n ${SERVER} "{ echo ${i}; echo ${SERVER}; sleep 10; date; uname -a; }|xargs" >> ${RESULTS} &
done

...In this prototype, the one-line result per server consists of the server name; the date; and the server's 'uname' information --- all collected into one line of the output file and beginning with an ordinal process number.

The ${SERVERS} input file simply has a server name; one per line.
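
Because each line begins with its ordinal, the finished log can be restored to start order with a numeric sort. A small illustration, with made-up sample lines standing in for the real results file:

```shell
# Sample results, listed in the order the background jobs happened
# to finish rather than the order they were started.
results_file=$(mktemp)
cat > "$results_file" <<'EOF'
3 serverC HP-UX
1 serverA HP-UX
2 serverB HP-UX
EOF

sorted=$(sort -n "$results_file")   # numeric sort on the leading ordinal
rm -f "$results_file"
printf '%s\n' "$sorted"
```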

Regards!

...JRF...