Scripting: Parsing HUGE text/log files
05-24-2004 07:49 AM
I am trying to parse data out of some HUGE log files. Normally I would just cat the file, grep'ping or awk'ing for the data. In this case, I only want to see the lines that are actually related to a file (the output is basically "ls -la" output of backed-up data), so I would normally do something along the lines of...
cat logfile | while read LINE
do
  if [ `echo $line | cut -c1` = "-" ]
  then
    echo $line
  fi
done
BUT the logfile is so big (200+ MB) that the "cat" chokes and never drops any data to the while statement. I have tried using "more" instead of "cat", with better results on some of the smaller log files, but it still can't handle the big ones.
I have no control over how the log files are output...they are being generated by a different process, beyond my modification.
Any suggestions??
Thanks!
Mike
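For reference, the filter being described ("only the lines that look like regular files in ls -la output") can be done as a single streaming command in awk, which the post already alludes to. A minimal sketch, assuming the log is in a file named logfile (the name is hypothetical):

# print only lines whose first character is "-",
# i.e. the regular-file entries in "ls -la"-style output
awk '/^-/' logfile

awk reads one line at a time, so it never needs to hold the whole 200+ MB file in memory.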
05-24-2004 07:59 AM
Re: Scripting: Parsing HUGE text/log files
sed -n '/^-/p' input_file
Anil
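A note on the one-liner above: the -n flag suppresses sed's default printing of every input line, so only the lines matched by /^-/ are emitted, and like grep, sed streams the file one line at a time rather than loading it into memory. A usage sketch, with a hypothetical output file name:

sed -n '/^-/p' logfile > files_only.txt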
05-24-2004 08:08 AM
Re: Scripting: Parsing HUGE text/log files
One way to do it with Perl:
perl -ne 'print if /^-/' file
JP
05-24-2004 08:09 AM
Re: Scripting: Parsing HUGE text/log files
grep ^-
In your example script, you've misspelled LINE, so a working version would be:
set -u
cat logfile | while read LINE
do
  if [ $(echo "$LINE" | cut -c1) = "-" ]
  then
    echo "$LINE"
  fi
done
To prevent similar spelling errors, always use set -u. Also, use "$LINE" to preserve embedded spaces. The use of grave accents has been deprecated for almost a decade now; use $(...) rather than `...` (see the man pages for sh-posix and ksh). The above script works but is about 1/10 the speed of the grep line. If you only want the first occurrence of "-" in the file, use this:
grep ^-
Bill Hassell, sysadmin
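As a quick illustration of the set -u behavior recommended above (the variable names here are made up):

set -u
LINE="-rw-r--r--   1 mike  users  1024 May 24  2004 foo"
echo "$LNE"    # deliberately misspelled: with set -u the shell stops here
               # with an error like "LNE: parameter not set" instead of
               # silently expanding the typo to an empty string

Without set -u, $LNE would quietly expand to nothing and the script would keep running while producing wrong results.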
05-24-2004 08:11 AM
Re: Scripting: Parsing HUGE text/log files
Thanks!
Mike
05-24-2004 08:36 AM
Re: Scripting: Parsing HUGE text/log files
Thanks for the spellcheck :-) I am not sure what was happening with "cat" and "more" either. Very simply, if I ran a "cat" against the 200 MB file (just "cat" piped into the "while read" as in the script above), nothing ever came out of the loop.
If I did a "ps" for the "while" or the "read" (to see if "cat" had started piping data to those commands), I got nothing when running the script against the larger log file, whereas a "while read" process did show up when running it against smaller logfiles.
It was as if the "cat"/"more" commands had to read ALL the data into memory before they started feeding it to the "while read" loop.
Not sure why...but the "sed | while read" is working where the "cat | while read" and "more | while read" were choking. Must be the way in which the two commands handle the data...?
Thanks again!
Mike
P.S. Confession time... this script will be running on both HP-UX and Solaris, and it is on the Solaris box where the script is choking (where I am testing it)... but SUN's support forums are so crappy that I posted it here first in order to actually get a response. I wonder if SUN's and HP's more/cat commands handle their memory differently.
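A closing note on the cat | while pipeline itself: a common way to sidestep it entirely is to redirect the file into the loop and test the first character with the shell's built-in case, so nothing is forked per line. A minimal sketch, with a hypothetical logfile name:

while read LINE
do
  case "$LINE" in
    -*) echo "$LINE" ;;    # first character is "-": a regular-file entry
  esac
done < logfile

This avoids both the pipe and the per-line echo | cut subprocesses, though for a 200+ MB file the grep, sed, and perl one-liners above will still be considerably faster.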