Operating System - HP-UX

script that compares two files and removes duplicates.

 
SOLVED
trpjr1
Advisor

script that compares two files and removes duplicates.

Hi everyone,

I have two files that list the disks on my server. The first file lists the disks that are part of vg00, and the second lists all PVs on the server. I'm looking for a way to create a third file that shows only the disks that are not part of vg00.

Example:

file1.txt disks that are part of vg00
c0t2d0s2
c1t2d0s2

file2.txt all disks on server
c0t2d0
c0t2d0s1
c0t2d0s2
c1t2d0
c1t2d0s1
c1t2d0s3
c8t14d5
c8t14d6
c8t14d7
c8t15d0
...

file3.txt should look like this
c8t14d5
c8t14d6
c8t14d7
c8t15d0

file3.txt should not have any disks that are part of vg00.

Thanks for any input you can provide.
Tommy P.
4 REPLIES
Jeff_Traigle
Honored Contributor
Solution

Re: script that compares two files and removes duplicates.

sort file1.txt file2.txt | sed 's/s[0-9]//' | uniq -u - file3.txt
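For reference, the pipeline can be checked against the sample data from the question (the printf lines below just recreate file1.txt and file2.txt):

```shell
# Recreate the sample lists from the question.
printf '%s\n' c0t2d0s2 c1t2d0s2 > file1.txt
printf '%s\n' c0t2d0 c0t2d0s1 c0t2d0s2 c1t2d0 c1t2d0s1 c1t2d0s3 \
    c8t14d5 c8t14d6 c8t14d7 c8t15d0 > file2.txt

# Sort both lists together, strip the slice suffix, then keep only
# the lines that occur exactly once (uniq -u), i.e. disks that are
# not in vg00.  uniq takes optional input and output operands, so
# "-" reads the pipe and file3.txt receives the result.
sort file1.txt file2.txt | sed 's/s[0-9]//' | uniq -u - file3.txt
cat file3.txt
```

Note one caveat: uniq -u drops every repeated line, so a non-vg00 disk that appears more than once in file2.txt (for example with several slice suffixes) would be dropped too.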
--
Jeff Traigle
James R. Ferguson
Acclaimed Contributor

Re: script that compares two files and removes duplicates.

Hi:

Have a look at 'comm(1)'.
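For example, a sketch of that approach (the .list file names here are made up; comm(1) requires sorted input):

```shell
# Sample lists from the question.
printf '%s\n' c0t2d0s2 c1t2d0s2 > file1.txt
printf '%s\n' c0t2d0 c0t2d0s1 c0t2d0s2 c1t2d0 c1t2d0s1 c1t2d0s3 \
    c8t14d5 c8t14d6 c8t14d7 c8t15d0 > file2.txt

# Reduce both lists to bare disk names (strip any trailing slice
# suffix) and sort them uniquely, since comm(1) needs sorted input.
sed 's/s[0-9]*$//' file1.txt | sort -u > vg00.list
sed 's/s[0-9]*$//' file2.txt | sort -u > all.list

# comm -23 suppresses column 2 (lines only in vg00.list) and
# column 3 (lines in both), leaving only disks not in vg00.
comm -23 all.list vg00.list > file3.txt
cat file3.txt
```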

Regards!

...JRF...
Bob_Vance
Esteemed Contributor

Re: script that compares two files and removes duplicates.

This scriptlet creates temp file "/tmp/f1s"

## ( sed < /tmp/f1 >/tmp/f1s \
         -e 's/s.*//'
     fgrep < /tmp/f2 >/tmp/f3 \
         -v -f /tmp/f1s
     cat /tmp/f3
   )

c8t14d5
c8t14d6
c8t14d7
c8t15d0



or, if you don't want a temp file

## ( sed < /tmp/f1 \
         -e 's/s.*//' \
     | while read D
       do e="$e -e $D"
       done

     grep < /tmp/f2 >/tmp/f3 \
         -v $e
     cat /tmp/f3
   )

c8t14d5
c8t14d6
c8t14d7
c8t15d0


Note: the second scriptlet above is not portable to Linux, because there the while loop at the end of the pipeline runs in a child shell, so the variable e is empty again by the time grep runs.
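A minimal illustration of the issue, assuming bash as the Linux shell:

```shell
bash -c '
    e=""
    printf "c0t2d0\nc1t2d0\n" | while read D
    do e="$e -e $D"
    done
    # Under bash the loop ran in a subshell, so e is still empty here;
    # under HP-UX ksh/sh it would contain "-e c0t2d0 -e c1t2d0".
    echo "after pipeline: e=[$e]"
'
```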

Portable scriptlet:

## ( sed < /tmp/f1 \
         -e 's/s.*//' \
     | ( while read D
         do e="$e -e $D"
         done
         echo " $e"
       ) \
     | ( read e
         grep < /tmp/f2 >/tmp/f3 \
             -v $e
       )
     cat /tmp/f3
   )




bv
"The lyf so short, the craft so long to lerne." - Chaucer
trpjr1
Advisor

Re: script that compares two files and removes duplicates.

Thanks everyone!