05-13-2002 12:58 PM
Compressing files
I have an Oracle 8.1.7.3 database that generates a lot of archive logs; each one is 200 MB. I have a cron job that runs every 15 minutes, checks the directory, and compresses any new files it finds.
Now, here is the question:
If the Oracle archiver process happens to be writing an archive log and has not yet finished writing the entire file to disk when my cron compress job kicks in, will the job be able to compress that file even though Oracle is not done with it? I am a little paranoid: if the system lets the compress go through, I may end up with corrupted archive logs.
In other words, does HP-UX prevent other processes from accessing the file? Bear in mind that compress reads the file, writes a new *.Z file, and removes the original.
Any thoughts would be helpful.
Thank you.-
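The setup above can be sketched as a minimal cron-driven script. All paths and names here are hypothetical, and gzip stands in for compress so the sketch runs on systems where compress(1) is absent:

```shell
#!/bin/sh
# Hypothetical sketch of the croned job: compress any uncompressed
# archive logs found in the destination directory.
ARCDIR=/tmp/arc_demo            # stand-in for the real archive destination
mkdir -p "$ARCDIR"
touch "$ARCDIR/arch_0001.arc"   # simulate a finished archive log
for f in "$ARCDIR"/*.arc; do
    [ -f "$f" ] && gzip "$f"    # the original job used compress, producing *.Z
done
```

As the replies below discuss, this naive version can catch a file the archiver is still writing.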
05-13-2002 01:00 PM
Re: Compressing files
05-13-2002 01:04 PM
Re: Compressing files
HTH
Duncan
I am an HPE Employee

05-13-2002 01:08 PM
Re: Compressing files
FNAME="myfile"
SECONDS=600
fileage.pl -m -s ${SECONDS} ${FNAME}
STAT=$?
if [ ${STAT} -eq 0 ]
then
echo "File: ${FNAME} has not been modified in the last ${SECONDS} seconds; it is safe to compress."
else
echo "File ${FNAME} has been modified; not safe to compress."
fi
I've attached fileage.pl; invoke fileage.pl without arguments for full usage.
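For anyone without the attached Perl script, a rough stand-in for the same age check can be done with find(1). Note that -mmin is a GNU extension; classic HP-UX find would need the reference-file trick shown later in this thread. File name and threshold here are hypothetical:

```shell
#!/bin/sh
FNAME=/tmp/agecheck_demo.arc
MINUTES=10
touch "$FNAME"                          # freshly modified, so NOT safe yet
# GNU find prints the file only if it was modified within the last N minutes
if [ -z "$(find "$FNAME" -mmin -"$MINUTES" 2>/dev/null)" ]; then
    STATUS="safe to compress"
else
    STATUS="recently modified; skipping"
fi
echo "$STATUS"
```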
05-13-2002 01:45 PM
Re: Compressing files
I have just read the man page for compress, and it says this:
Access Control Lists
compress retains a file's access control list when compressing and expanding data.
I think this is saying it basically locks the file. Could one of the gurus comment on this?
cheers
John.
05-13-2002 01:53 PM
Re: Compressing files
05-13-2002 01:56 PM
Re: Compressing files
If you are compressing FILEA and some process is still writing to it, and the compress completes, then the original file is DELETED, meaning the process still writing to FILEA won't be writing to squat (actually never-never-land).
live free or die
harry
05-13-2002 02:33 PM
Re: Compressing files
I appreciate all the input I am getting.
I like the fuser suggestion. I will have the job run fuser before compressing each file; if fuser lists any processes next to the file, the job will either sleep for a while and try again, or skip to the next file and come back to the busy one later.
Please if there is a better suggestion, do not hesitate to update the forum.
Thank you for the feedback.
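A minimal sketch of that fuser check (file name hypothetical; gzip is used as a fallback where compress(1) is not installed, and the fuser call is guarded in case the command is absent):

```shell
#!/bin/sh
FNAME=/tmp/fuser_demo.arc
touch "$FNAME"
# fuser prints the PIDs of processes that have the file open;
# empty output means nobody is using it right now
BUSY=$(command -v fuser >/dev/null 2>&1 && fuser "$FNAME" 2>/dev/null)
if [ -n "$BUSY" ]; then
    echo "$FNAME in use by:$BUSY -- will retry later"
else
    compress "$FNAME" 2>/dev/null || gzip "$FNAME"
fi
```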
05-13-2002 06:02 PM
Re: Compressing files
I think the question boils down to...Why are you compressing them?
If it's disk space... then, as Clay puts it, "That dog's hunting way too much!"
In that case you really need to extend the VG to hold proper log count/length/size.
If it's to backup & save tape space then I'd recommend you mirror the VG & lvsplit the mirror out prior to compressing & archiving then lvmerge it back in. If redundancy is a major concern & you're nervous about having just one copy for *any* time period then keep 2 mirror copies & you'll always have at least 2 copies on-line.
If you just want to "roll" the logs, I believe Oracle handles that as well. I think you can mv the file to a relevant, time-stamped filename prior to compressing it & Oracle will start a new log...mind you I'm not an Oracle expert but I'm sure one will correct me if I'm mistaken.
Rgds,
Jeff
05-13-2002 10:49 PM
Re: Compressing files
As A. Clay and Harry make clear, the file is not locked. Presumably, even if you run fuser before starting the compress, another process can still attach to the file while the compress is running and believe it is writing to it.
I would be tempted to rename the file before compression begins, to ensure this cannot happen.
As Jeff points out the real solution is to pick another alternative.
cheers
John.
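A sketch of the rename-first idea (names hypothetical). One caveat worth noting: the rename only stops new opens by name; a process that already has the file open keeps writing to the same inode under its new name, so checking that Oracle is finished with the file remains the safer gate:

```shell
#!/bin/sh
F=/tmp/rename_demo.arc
touch "$F"
TS=$(date +%Y%m%d%H%M%S)
# mv within one filesystem is atomic; a writer opening the OLD name
# afterward would create a brand-new file instead of touching this one
mv "$F" "$F.$TS"
gzip "$F.$TS"
```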
05-14-2002 12:22 AM
Re: Compressing files
Bruno Cunha
05-14-2002 01:46 AM
Re: Compressing files
In case of recovery, you would have to uncompress the file(s) as they're requested.
This might be a bit tricky if you don't have the space for it !
My opinion would be that you should have the space for the archive log files.
05-14-2002 06:00 AM
Re: Compressing files
Use a timestamp file: touch a marker file, wait briefly, then compress all uncompressed files older than the marker.
Something like:
touch compress.$$
sleep 30
find . -name arch\*.arc ! -newer compress.$$ -exec compress {} \;
rm compress.$$
05-14-2002 06:38 AM
Re: Compressing files
Well, I found a much safer solution; I tested it and it works great.
First, the cron job gets a list of all the files to compress, then checks each file against the V$ARCHIVED_LOG table in Oracle.
Once a file is fully archived and Oracle is done with it, the archiver process makes an entry in this table, so it is then safe to move or compress the file.
I tested this by issuing an ALTER SYSTEM SWITCH LOGFILE and then watching the directory as the file arrived in the archive destination, while repeatedly querying V$ARCHIVED_LOG for it. The table never showed the file until Oracle finished archiving; only then did an entry pop up.
So the cron job never touches a file unless it appears in V$ARCHIVED_LOG. If anyone is interested in the script, I will be glad to send it.
By the way, to answer the question of why I don't just allocate enough space: this database generates close to 50 GB of archive logs a day.
The directory that holds them is 40 GB. Compressing the files and shipping them to a standby database on another server, and eventually to tape, is the best way to have space available for the next day.
Again, many thanks to everyone who contributed their time and intellect to help solve this problem.
Thank you.
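A sketch of that gate in shell. In practice the list of completed logs would come from sqlplus querying V$ARCHIVED_LOG; here a here-document stands in for that query output so the example is self-contained, and all names are hypothetical:

```shell
#!/bin/sh
# Stand-in for the output of a sqlplus query like
#   select name from v$archived_log;
ARCHIVED=$(cat <<'EOF'
/tmp/vlog_demo/arch_0001.arc
/tmp/vlog_demo/arch_0002.arc
EOF
)
mkdir -p /tmp/vlog_demo
touch /tmp/vlog_demo/arch_0001.arc /tmp/vlog_demo/arch_0002.arc \
      /tmp/vlog_demo/arch_0003.arc   # 0003: archiver still writing it
for f in /tmp/vlog_demo/*.arc; do
    case "$ARCHIVED" in
        *"$f"*) gzip "$f" ;;         # Oracle is done with it: safe
        *)      echo "skipping $f (not yet in V\$ARCHIVED_LOG)" ;;
    esac
done
```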
05-15-2002 05:08 AM
Re: Compressing files
itax_dev /tmp/algo # cp mail.log mail.log.preZ.txt
itax_dev /tmp/algo # cp mail.log mail.log.pregz.txt
itax_dev /tmp/algo # ls -arltp
total 7698
drwxrwxrwx 19 bin bin 6144 May 15 09:07 ../
-r--r----- 1 ty hpadmin 1310919 May 15 09:07 mail.log
-r--r----- 1 ty hpadmin 1310919 May 15 09:07 mail.log.preZ.txt
drwxr-x--- 2 ty hpadmin 96 May 15 09:07 ./
-r--r----- 1 ty hpadmin 1310919 May 15 09:07 mail.log.pregz.txt
itax_dev /tmp/algo # gzip mail.log.pregz.txt
itax_dev /tmp/algo # compress mail.log.preZ.txt
itax_dev /tmp/algo # lt
total 3168
drwxrwxrwx 19 bin bin 6144 May 15 09:07 ../
-r--r----- 1 ty hpadmin 1310919 May 15 09:07 mail.log
-r--r----- 1 ty hpadmin 203755 May 15 09:07 mail.log.preZ.txt.Z
-r--r----- 1 ty hpadmin 98570 May 15 09:07 mail.log.pregz.txt.gz
drwxr-x--- 2 ty hpadmin 1024 May 15 09:08 ./
itax_dev /tmp/algo
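To reproduce a similar comparison on a modern box (compress(1) is often absent there, so this only shows gzip at its two extreme levels; the log contents are synthetic):

```shell
#!/bin/sh
# Build a repetitive log-like file, then compress at fastest and best levels
yes "May 15 09:07:01 itax_dev sendmail[1234]: stat=Sent" | head -20000 > /tmp/mail.log
gzip -c -1 /tmp/mail.log > /tmp/mail.log.fast.gz    # fastest, least compression
gzip -c -9 /tmp/mail.log > /tmp/mail.log.best.gz    # slowest, best compression
ls -l /tmp/mail.log /tmp/mail.log.fast.gz /tmp/mail.log.best.gz
```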
05-15-2002 05:40 AM
Re: Compressing files
Great info. You mean gzip compresses data much more than compress does?
I didn't know that; I thought they were almost the same. But clearly, from your example, gzip condensed the data more than compress did. I will definitely switch from compress to gzip.
Great input.