remove duplicate entries
06-28-2005 08:55 PM
I am doing some batch processing of a file that I dump from NNM. I only want to keep one unique entry per IP address. For example, I need to turn the following
www05_bck.abc.com ; 1.23.32.35 ; Critical
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
into
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
06-28-2005 09:12 PM
Re: remove duplicate entries
sort -u
hth.
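A minimal sketch of that approach, assuming the NNM dump is saved as nodes.txt (a hypothetical name). Note that sort -u sorts its output, so the lines will not stay in the original order; the awk one-liner below keeps the first occurrence of each line in place instead:
$ sort -u nodes.txt > nodes.uniq
$ awk '!seen[$0]++' nodes.txt > nodes.uniq
The awk version prints a line only when seen[$0]++ evaluates to 0, i.e. the first time that exact line appears.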
06-28-2005 09:15 PM
Re: remove duplicate entries
I tried this:
==============
$ more 1.txt
www05_bck.abc.com ; 1.23.32.35 ; Critical
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
$ uniq 1.txt 2.txt
$ more 2.txt
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
$
regards
yogeeraj
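One caveat: uniq only removes adjacent duplicates, so this works on 1.txt because each duplicate sits right next to its twin. If duplicates can be scattered through the file, sort it first (same 1.txt and 2.txt names as in the example above):
$ sort 1.txt | uniq > 2.txt
which gives the same result as sort -u 1.txt > 2.txt.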
06-28-2005 10:43 PM
Re: remove duplicate entries
Use the "-d" option with "uniq" to list one copy only of each repeated line in the input file.
$ more test
www05_bck.abc.com ; 1.23.32.35 ; Critical
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
www05_bck.abc.com ; 1.23.32.35 ; Critical
$ uniq test
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
www05_bck.abc.com ; 1.23.32.35 ; Critical
Note: there are still 2 entries for "www05_bck.abc.com" because plain uniq only removes adjacent duplicates, and the third copy is not next to the first two.
$ uniq -d test
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
Rgds
Shyjith
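Worth noting: -d filters rather than deduplicates. It prints one copy of each line that appears more than once and drops lines that appear only once, which is why uat01_ap2 vanished above. For the original goal of keeping one copy of everything, run uniq without options on sorted input. A quick sketch on the same test file:
$ sort test | uniq     (one copy of every line)
$ sort test | uniq -d  (one copy of each repeated line only)
$ sort test | uniq -u  (only the lines that were never repeated)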
06-29-2005 01:31 AM
Re: remove duplicate entries
-- testfile: your input lines from above --
# sort -mu testfile
www05_bck.abc.com ; 1.23.32.35 ; Critical
uat01.abc.com ; 1.23.4.36 ; Critical
uat01_ap1.abc.com ; 1.23.5.36 ; Critical
uat01_ap2.abc.com ; 1.23.6.36 ; Critical
uat01_bck.abc.com ; 1.23.32.36 ; Critical
hissta01.abc.com ; 1.23.4.37 ; Critical
hissta01_ap1.abc.com ; 1.23.5.37 ; Critical
hth.
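For context on why this keeps the original order: -m tells sort to merge input it assumes is already sorted, and -u suppresses any line that compares equal to the one just written. Run over a single file, this in effect streams the file and drops adjacent duplicates, much like uniq, without reordering anything. A sketch writing to a hypothetical output file:
# sort -mu testfile > testfile.uniq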