Operating System - HP-UX

Here is a tough one for the language gurus

 
SOLVED
Belinda Dermody
Super Advisor

Here is a tough one for the language gurus

I have a stripped-down log file that looks like this, and now I am stuck:

Gffff Rrrrrr gredwds@zzzzioon.com
fzzzzzan@zzz.zzzz.edu
JJJJ CCCCCC (E-mail) abcdefr99@prodigy.net
afjdl@turkey.com bcdade@aol.com

What I need as the final result is to extract just the email addresses from each line; some lines may have more than one address, and all the extra stuff can be trashed.

So my final output file will look like this:
gredwds@zzzzioon.com
fzzzzzan@zzz.zzzz.edu
abcdefr99@prodigy.net
afjdl@turkey.com
bcdade@aol.com
14 REPLIES
Sridhar Bhaskarla
Honored Contributor

Re: Here is a tough one for the language gurus

Hi,

Try this:


for i in $(cat your_logfile)
do
  echo $i | awk '/@/ {print $0}'
done

The downside is that it is going to print every word that has an @ in it, if that is OK with you.

-Sri
You may be disappointed if you fail, but you are doomed if you don't try
S.K. Chan
Honored Contributor

Re: Here is a tough one for the language gurus

Something simple like this may be just what you need:
$ cat logfile | tr " " '\012' | grep @ > newlogfile
Assuming the "extra stuff" does not contain any "@" characters.
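As a quick check, the pipeline applied to two of the sample lines behaves like the sketch below (printf stands in here for the real log file):

```shell
# Split each line into one word per line (octal \012 is a newline),
# then keep only the words that contain an @.
printf 'JJJJ CCCCCC (E-mail) abcdefr99@prodigy.net\nafjdl@turkey.com bcdade@aol.com\n' \
  | tr ' ' '\012' | grep @
# prints:
#   abcdefr99@prodigy.net
#   afjdl@turkey.com
#   bcdade@aol.com
```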
RAC_1
Honored Contributor

Re: Here is a tough one for the language gurus

Assuming the email address is at the end of each line and the words on a line are separated by spaces:

awk -F " " '{print $NF}' logfile > /tmp/new_file
There is no substitute to HARDWORK
Belinda Dermody
Super Advisor

Re: Here is a tough one for the language gurus

Nice tries, guys, but the trash is still in the output file. The only thing I want left is the email addresses.
Sridhar Bhaskarla
Honored Contributor
Solution

Re: Here is a tough one for the language gurus

I cut and pasted your example data, called it your_logfile, and applied my script. It does print out what you wanted: it takes each word, checks whether it has an @ character, and prints it if so.

What other kind of data do you have?

-Sri
You may be disappointed if you fail, but you are doomed if you don't try
Pete Randall
Outstanding Contributor

Re: Here is a tough one for the language gurus

James,

I beg to differ, but Sri's answer (with only a minor modification) works perfectly.

for i in `cat yourlogfile`
> do
> echo $i |awk '/\@/ {print $0}'
> done

Pete
Hai Nguyen_1
Honored Contributor

Re: Here is a tough one for the language gurus

You can try this script:

#!/bin/ksh

while read LINE
do
  for WORD in $LINE
  do
    echo $WORD | grep -q "@"
    if [ $? -eq 0 ]
    then
      echo $WORD
    fi
  done
done < YOUR_FILE_HERE
James R. Ferguson
Acclaimed Contributor

Re: Here is a tough one for the language gurus

Hi James:

# awk '{for (i=1; i<=NF; i++) if ($i ~/@/) {print $i}}' filename

Regards!

...JRF...
Hai Nguyen_1
Honored Contributor

Re: Here is a tough one for the language gurus

James,

This modified script should work as well.

#!/bin/ksh

while read LINE
do
  for WORD in $LINE
  do
    if [[ $WORD = *@* ]]
    then
      echo $WORD
    fi
  done
done < YOUR_FILE_HERE
Sridhar Bhaskarla
Honored Contributor

Re: Here is a tough one for the language gurus

Hi,

OK, let me make it more complicated, so that if it finds a word like "alksdjf@asdkj" (an @ with no dot after it), it ignores it. Try this:

#!/usr/bin/ksh
DATA=j
for i in $(cat $DATA)
do
  echo $i | awk '$0 ~ /.*@.*\../ {print $0}'
done

-Sri
You may be disappointed if you fail, but you are doomed if you don't try
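The same stricter filter (an @ followed somewhere by a dot) can also be done in a single awk pass, in the spirit of JRF's one-liner; a sketch, assuming the data is in a file named logfile:

```shell
# Keep only the words that contain an @ followed somewhere by a dot,
# so bare strings like "alksdjf@asdkj" are ignored.
awk '{for (i = 1; i <= NF; i++) if ($i ~ /@.*\./) print $i}' logfile
```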
Belinda Dermody
Super Advisor

Re: Here is a tough one for the language gurus

Thanks, guys, for all the great responses. I knew it wouldn't be that tough; I just wanted to get some blood flowing. That is why these forums are the best around. I told my manager that even if he wanted to go to third-party support for the HP equipment, I still needed access to the forums; they are so much better than calling in.
Sridhar Bhaskarla
Honored Contributor

Re: Here is a tough one for the language gurus

James,

I am being honest here for future reference.

You should be using JRF's script. I used a for loop in the shell and called awk for each value; JRF ran the for loop inside awk itself. That will be much faster and more efficient.

-Sri
You may be disappointed if you fail, but you are doomed if you don't try
James R. Ferguson
Acclaimed Contributor

Re: Here is a tough one for the language gurus

Hi (again) James:

I think if you time the solutions you will find that the pure 'awk' performs better than the 'cat' loop.

Regards!

...JRF...
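A sketch of that comparison (a file named logfile is assumed): both commands print the same words, but the shell loop forks one awk process per word, while the pure awk scans the whole file in a single process.

```shell
# Shell loop: one awk process per word (slow on big files).
time for i in $(cat logfile); do echo $i | awk '/@/'; done > /dev/null

# Pure awk: a single process scans the whole file.
time awk '{for (i = 1; i <= NF; i++) if ($i ~ /@/) print $i}' logfile > /dev/null
```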

Belinda Dermody
Super Advisor

Re: Here is a tough one for the language gurus

Jim, thanks, but I always use awk over the cat approach, and that is why I am testing yours out now. I assigned the points as the answers came in and worked, though. Sorry about short-changing you.