02-22-2002 02:21 AM
Continuously archiving directory contents
I am wondering if there is a cleverer, more efficient method of doing what I am doing now. I have directories that are continuously growing, with thousands of small files being created at a rate of over 100 every 5 minutes. This is a headache to deal with, and it results in fbackup taking hours to run, as some directories on this machine have over 1 million files in them by now.
My first idea was to tar each directory, delete it with rm -r and then recreate it, but that would mean files being created during the window between deletion and recreation would fail. My next idea was to run a "for i in `ls -1` ..." loop in each directory to move the files to a temporary location and then delete them; the temporary location would then be archived and deleted itself. The problem with this is that I have to check the size of each file before moving it to prevent partial copies, and with the number of files involved, and the number of directories this needs to be done for, the overhead would be too much.
I am sure there is a much cleverer way of archiving files to keep the count down and take the pressure off fbackup, and that plenty of sysadmins have had to deal with this and figured it out. I want to avoid "find", as it is too heavy on the processor; there could be 20 of them running at once on this box. Any help would be greatly appreciated.
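For illustration, a rough sketch of that size-check loop (the paths are placeholders, not from the original post); the per-file pause is exactly where the overhead piles up:
#!/usr/bin/sh
# Move files whose size is stable across a short pause, treating them
# as complete.  Directory and staging paths are hypothetical examples.
cd /path/to/growing/dir || exit 1
for f in `ls -1`
do
        s1=`wc -c < "$f"`
        sleep 2                     # per-file wait: this is where the time goes
        s2=`wc -c < "$f"`
        if [ "$s1" = "$s2" ]; then
                mv "$f" /var/tmp/staging/
        fi
done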
Dermot.
02-22-2002 03:44 AM
Re: Continuously archiving directory contents
Wow, that looks like fun :-).
I use the "for loop" approach you describe. However, on my system I can be sure that all but the most recently written file are "complete", and using ls -rt1 makes it very easy to exclude that last file.
It may not be the solution you are looking for, but if your situation is similar, it would be a very easy one.
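A minimal sketch of that loop, with placeholder paths (ls -rt1 lists oldest first, so dropping the last line skips the most recently written, possibly incomplete, file):
#!/usr/bin/sh
# Move everything except the newest (possibly still-growing) file.
# /path/to/growing/dir and /path/to/staging are placeholder paths.
cd /path/to/growing/dir || exit 1
ls -rt1 | sed '$d' | while read f
do
        mv "$f" /path/to/staging/
done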
Regards,
Tom Geudens
02-22-2002 04:47 AM
Re: Continuously archiving directory contents
1- Create several directories:
mkdir 0 1 2 3 4 5 6 7 8 ....
2- Now use a symbolic link and rotate it between them:
#!/usr/bin/ksh
# Every 10 minutes, repoint the symlink YOURDIR at the next numbered
# directory.  Writers always use the path YOURDIR, the real directories
# never move, and files already open stay open.
maxd=9
dir=0
while true
do
        dir=$(( (dir + 1) % maxd ))
        rm -f YOURDIR            # brief window with no YOURDIR before relinking
        ln -s $dir YOURDIR
        sleep 600
done
-----
This way the live directory changes every 10 minutes, which gives you some benefits:
1- open files remain open
2- shorter directory structures
3- you can test, back up, and remove one directory at a time (see the sketch below)
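For example, a rough sketch of benefit 3, archiving one rotated directory once the symlink has moved on (paths and archive format are placeholders):
#!/usr/bin/sh
# Archive one rotated directory (here "3") once the YOURDIR symlink no
# longer points at it, then empty it for reuse.  Paths are examples only.
cd /path/to/parent || exit 1
d=3
stamp=`date +%Y%m%d%H%M`
tar cf /archive/dir_$d.$stamp.tar $d && rm -rf $d && mkdir $d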
:-))
02-22-2002 06:34 AM
Re: Continuously archiving directory contents
# Move every file in $olddir that no process currently holds open.
olddir=/abc
newdir=/xyz
cd $olddir || exit 1
ls -rt | while read f
do
        open=`fuser "$f" 2>/dev/null`
        if [ "$open" = "" ] ; then
                mv "$f" $newdir
        else
                echo "file $f is held open by processes ${open}"
        fi
done
This is untested, so you might want to try it on some test data before using it for real.
Also, it won't work if there are subdirectories in the directory being archived.
HTH
Duncan
I am an HPE Employee

02-25-2002 06:26 AM
Re: Continuously archiving directory contents
Dermot
02-25-2002 10:18 AM
Re: Continuously archiving directory contents
So where do I get my FOD ;o)
Cheers
Duncan
I am an HPE Employee

02-25-2002 10:30 AM
Re: Continuously archiving directory contents
# Note: hard links to directories need superuser privileges (and a
# filesystem that allows them).
ln dirname dirname.        # give the existing directory a second name
mkdir newdirname           # create an empty replacement
# now force dirname to refer to the new directory...
ln -f newdirname dirname
unlink newdirname          # drop the spare name; the new directory lives on as dirname
This moves the existing directory aside without affecting processes that already have files open, or processes that want to create new files, because dirname always exists. A fresh, empty directory is now in place and you can do what you like with the old one.
This may or may not work in your case, depending on your application, but it was quite effective in mine. I had the script running every 5 minutes; you could tweak that depending on the rate at which new files are created.
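For reference, a rough sketch of the whole rotate-and-archive cycle using placeholder paths (the directory hard links above still require superuser privileges):
#!/usr/bin/sh
# Rotate /data/spool aside under a timestamped name, put an empty
# directory in its place, then archive and remove the old contents.
# All paths here are hypothetical examples.
stamp=`date +%Y%m%d%H%M`
cd /data || exit 1
ln spool spool.$stamp            # second name for the current directory
mkdir spool.new
ln -f spool.new spool            # force "spool" to refer to the new, empty directory
unlink spool.new
# the old contents are now reachable only as spool.$stamp
tar cf /archive/spool.$stamp.tar spool.$stamp && rm -rf spool.$stamp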
Regards,
Steve