awk help please
06-20-2006 10:28 PM
I am attempting to use awk to get and manipulate data but cannot get the syntax right.
Here is what I want: in a logfile I have many summary lines for the different SQLs that run.
I am running:
TODAY=$(date "+%a %d %b")
awk '
/SQL1 Summary Started/,/SQL1 Summary Finished/ {print}
$2 == "unloaded." {print $1}
' dailystockdownload.log
SQL1 Summary Started Mon 19 Jun 05:31:27 2006
Isolation level set.
1067 row(s) unloaded.
SQL1 Summary Finished Mon 19 Jun 07:57:32 2006
I get all entries from the logfile, but I only want today's output, in the following format on one line:
SQL1 Summary Started Mon 19 Jun 05:31,1067 row(s),Finished Mon 19 Jun 07:57
Does this make sense? I'd prefer to use awk and sed rather than Perl, but I am open to suggestions.
A million thanks!
Lawrenzo
Solved!
06-20-2006 11:02 PM
try:
awk -v today="$(date '+%a %d %b')" '$0 ~ today && /SQL1 Summary Started/ { start=$0 }
$2 == "unloaded." {rows=$1" "$2 }
start && /SQL1 Summary Finished/ { print start,rows,$0 ; start="" }' dailystockdownload.log
Note:
- I'm not inserting additional commas (see the sketch below for a comma-separated variant).
- Blocks starting before midnight but ending after it are reported as well.
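If you do want the comma-separated, truncated format from your first post, a variant like this should come close (an untested sketch, assuming the timestamps are always zero-padded HH:MM:SS and the lines look exactly like your sample; the unloaded line is matched with a trailing-anchor regex instead of a field test):
awk -v today="$(date '+%a %d %b')" '
$0 ~ today && /SQL1 Summary Started/ {
    # fields 1-6 plus HH:MM from field 7; drops the seconds and the year
    start = $1 " " $2 " " $3 " " $4 " " $5 " " $6 " " substr($7, 1, 5)
}
/unloaded\.$/ { rows = $1 " " $2 }
start && /SQL1 Summary Finished/ {
    # "Finished <day> <dd> <mon> HH:MM" rebuilt from fields 3-7
    print start "," rows "," $3 " " $4 " " $5 " " $6 " " substr($7, 1, 5)
    start = ""
}' dailystockdownload.log
On your sample block that should print:
SQL1 Summary Started Mon 19 Jun 05:31,1067 row(s),Finished Mon 19 Jun 07:57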
Regards, Peter
06-20-2006 11:03 PM
Re: awk help please
# echo "$(awk '/R1 Stock Started /,/R1 Stock Finished/' dailystockdownload.log | awk '{print}')"
R1 Stock Finished Sat 17 Jun 00:30:19 2006
R1 Stock Started Sun 18 Jun 01:00:01 2006
Isolation level set.
7645 row(s) unloaded.
R1 Stock Finished Sun 18 Jun 01:00:18 2006
R1 Stock Started Mon 19 Jun 00:30:01 2006
Isolation level set.
7645 row(s) unloaded.
R1 Stock Finished Mon 19 Jun 00:30:18 2006
R1 Stock Started Tue 20 Jun 00:30:01 2006
Isolation level set.
7662 row(s) unloaded.
R1 Stock Finished Tue 20 Jun 00:30:17 2006
R1 Stock Started Wed 21 Jun 00:30:01 2006
Isolation level set.
7699 row(s) unloaded.
R1 Stock Finished Wed 21 Jun 00:30:46 2006
All I want is the information for today, but I cannot figure out how to print only the lines between "R1 Stock Started Wed 21 Jun 00:30:01 2006" and "R1 Stock Finished Wed 21 Jun 00:30:46 2006".
Thanks again
06-20-2006 11:05 PM
Re: awk help please
Just reviewed the output format and found that 'unloaded.' is in column 3, so better to use:
awk -v today="$(date '+%a %d %b')" '$0 ~ today && /SQL1 Summary Started/ { start=$0 }
$NF == "unloaded." {rows=$1" "$2 }
start && /SQL1 Summary Finished/ { print start,rows,$0 ; start="" }' dailystockdownload.log
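And for the follow-up question, to print all the lines between today's Started and Finished markers, you can fold the date into both halves of a range pattern. A sketch, assuming today's date string appears on both boundary lines exactly as in the log:
# %a %d %b yields e.g. "Wed 21 Jun", matching the date layout in the log
awk -v today="$(date '+%a %d %b')" '
$0 ~ ("R1 Stock Started " today), $0 ~ ("R1 Stock Finished " today)
' dailystockdownload.log
The default action is print, so the range pattern alone emits every line from the matching Started line through the matching Finished line, inclusive.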
Regards, Peter
06-20-2006 11:13 PM
Re: awk help please
You're the greatest!
Thanks.