
remove duplicate entries

 
SOLVED
kholikt
Super Advisor

remove duplicate entries

Hi,

I am doing some batch processing of a file that I dump from NNM. I only want to keep one unique entry per IP address. For example, I need to turn the following

www05_bck.abc.com ; 1.23.32.35 ; Critical
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical

into

www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
5 REPLIES
MarkSyder
Honored Contributor
Solution

Re: remove duplicate entries

uniq old_file new_file

Mark Syder (like the drink but spelt different)
The triumph of evil requires only that good men do nothing
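One caveat worth noting: uniq only collapses *adjacent* duplicate lines, so this works on the dump above because its duplicates come in consecutive pairs. A minimal sketch (the filenames old_file and new_file are placeholders from the reply above):

```shell
# build a small grouped sample: duplicates sit next to each other
printf '%s\n' \
  'a.abc.com ; 1.1.1.1 ; Critical' \
  'a.abc.com ; 1.1.1.1 ; Critical' \
  'b.abc.com ; 2.2.2.2 ; Critical' > old_file

# uniq writes one copy of each run of identical adjacent lines
uniq old_file new_file
cat new_file
```

If the dump were ever unsorted, the duplicates would have to be brought together first (e.g. with sort) before uniq could remove them.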
Muthukumar_5
Honored Contributor

Re: remove duplicate entries

You can use something like:

sort -u

hth.
Easy to suggest when don't know about the problem!
Yogeeraj_1
Honored Contributor

Re: remove duplicate entries

hi,

i tried this:
==============
$ more 1.txt
www05_bck.abc.com ; 1.23.32.35 ; Critical
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
$ uniq 1.txt 2.txt
$ more 2.txt
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
$


regards
yogeeraj
No person was ever honoured for what he received. Honour has been the reward for what he gave (Calvin Coolidge)
Shyjith P K
Frequent Advisor

Re: remove duplicate entries

Hi

Use the "-d" option with "uniq" to print one copy of each line that is repeated in the input file; lines that appear only once are omitted.

$ more test
www05_bck.abc.com ; 1.23.32.35 ; Critical
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
www05_bck.abc.com ; 1.23.32.35 ; Critical

$ uniq test
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
www05_bck.abc.com ; 1.23.32.35 ; Critical

Note: there are still 2 entries for "www05_bck.abc.com" because uniq only collapses adjacent duplicates, and the third occurrence is not next to the first two.

$ uniq -d test
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical



Rgds
Shyjith
Muthukumar_5
Honored Contributor

Re: remove duplicate entries

You can also try:

-- testfile --
(the input lines from the question)

# sort -mu testfile
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical

hth.
Easy to suggest when don't know about the problem!
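Since the requirement is one entry per IP address, an awk one-liner keyed on the second field is another option; unlike uniq, it does not need the duplicates to be adjacent. A sketch (the filename testfile is an assumption, reusing the name from the reply above):

```shell
# sample dump with a non-adjacent duplicate IP
printf '%s\n' \
  'www05_bck.abc.com ; 1.23.32.35 ; Critical' \
  'uat01.abc.com ; 1.23.4.36 ; Critical' \
  'www05_bck.abc.com ; 1.23.32.35 ; Critical' > testfile

# keep only the first line seen for each IP address (field 2,
# with " ; " as the field separator), preserving original order
awk -F' ; ' '!seen[$2]++' testfile
```

The pattern '!seen[$2]++' is true only the first time a given IP is encountered, so awk prints that line and silently skips every later line with the same IP.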