Some directories are hardly accessible.
(Community forum: Operating System - OpenVMS)
08-29-2008 05:06 AM
I have a disk where, in some directories, thousands and at times tens of thousands of small files are created. Naturally the INDEXF.SYS for the disk is huge, about 112,000 blocks. The directory files (*.DIR) where the small files are created are big too, nearly 2000 blocks each. I ran into a problem where jobs that do I/O on the disk can get almost nothing done (but there is no error message). I suspected fragmentation, so I started to clean out those directories, but that goes too slowly. The interesting point to note here is that if no process is deleting files in the directories, the DIR command works quite well, but if I run my cleaning job, the response to the DIR command becomes so slow that it's impractical. Another thing: I can still create new files in the directory.
It's a VAX with OpenVMS V7.1.
What can be the cause of this? What is the best remedy besides initializing the disk and reloading from tapes?
Thanks.
08-29-2008 05:11 AM
Re: Some directories are hardly accessible.
DFU can be found at www.digiater.nl
Oswald
08-29-2008 05:42 AM
Solution
The design is the cause. VMS directories are not designed for tens of thousands of files.
RE:"What the best remedy beside initialize the disk and load from tapes?"
If at all possible, spread the files into more directories.
Background: VMS directories are ordered lists of filenames, and for each unique filename, a list of version numbers with the associated file id. When names are inserted or deleted from the beginning of the directory, once there is either no room, or an empty block, the blocks of the directory must be moved to make room. So if you have a 2000 block directory file, it is possible that many of these blocks will need to be copied. While that is in progress, no other operations can make changes to the directory.
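Directory files are ordinary files, so their sizes can be checked directly. A quick sketch for spotting oversized ones (DISK: is a placeholder device name):

```
$ ! List every directory file on the volume with its size in blocks;
$ ! anything in the high hundreds or thousands is a candidate problem.
$ DIRECTORY/SIZE=ALL DISK:[000000...]*.DIR;1
```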
Sorry for the bad news, but I know of no plans to change the design.
The only effect a fragmented disk will have is to prevent a directory from expanding, as when that occurs a new contiguous piece must be located, and the contents copied to the new location. If there isn't enough contiguous space, you will get an ACP file create failed message.
Reinitializing the disk or compressing the directories won't help much. If you have 2000+ block directories, file insertions and deletions are going to be slow.
Jon
08-29-2008 05:53 AM
Re: Some directories are hardly accessible.
I suppose that your application puts files in a directory named disk$appli, and that your application is started every morning and shut down every evening.
Maybe you should do the following: define disk$appli as a search list. The following will automagically roll:

$ def disk$appli disk:<.monday>, disk:<.tuesday>, disk:<.wednesday>, -
  disk:<.thursday>, disk:<.friday>, disk:<.saturday>, disk:<.sunday>

Of course you have to create your directories <.monday> and so on. You can of course use a little DCL to have a search list with many more elements (day of month comes to mind), and make it roll. On Monday you can quietly move the files in <.tuesday> and the other directories.
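A minimal sketch of making such a list roll daily, assuming the application opens files via the DISK$APPLI logical (the device and directory names here are illustrative):

```
$ ! Run daily from a batch job: create today's directory if needed
$ ! and put it first in the search list so new files land there.
$ day = F$EDIT(F$CVTIME(,,"WEEKDAY"),"TRIM,LOWERCASE")
$ newdir = "DISK:[APPLI.''day']"
$ IF F$PARSE(newdir) .EQS. "" THEN CREATE/DIRECTORY 'newdir'
$ prev = F$TRNLNM("DISK$APPLI")    ! first translation only
$ DEFINE/SYSTEM DISK$APPLI 'newdir', 'prev'
```

Older entries drop off the tail over time; a fuller version would rebuild the whole list and archive the oldest element.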
08-29-2008 05:56 AM
Re: Some directories are hardly accessible.
Really, the thing to do is redesign the application(s).
Such big directories are inherently VERY inefficient, because the design never considered their being used this way.
MANY operations which add or delete something near the beginning of the (alphabetically ordered) directory cause ALL the rest of the directory to be re-written.
And if the DIR file has to be extended, first a new, bigger file has to be allocated contiguously, then the whole content has to be copied, and the original DIR file has to be deleted. Very I/O intensive, i.e., very time consuming. And NO way to use caching!
SO, it is MUCH better to devise SOME way to split up the directory into (several, maybe many) (sub-)directories.
One rather easy way is to make the DIR a search list, and every time a "reasonable" number of files has been created (monthly? daily? perhaps hourly?) add a new one as the first translation of the list.
New files will always be created there, and existing files will be found from anywhere in the list.
(Of course, some cleanup or consolidation will also be needed to keep the list itself reasonably small. I seem to remember that I once ran into a limit of 80 translations, but maybe that was because the total translated string length exceeded some value. No way to trace that back now.)
hth
Proost.
Have one on me.
jpe
08-29-2008 06:19 AM
Re: Some directories are hardly accessible.
I didn't design or develop this app; that was done in the '80s. And I'm still new to this app.
Jon wrote << The only effect a fragmented disk will have is to prevent a directory from expanding, as when that occurs a new contiguous piece must be located, and the contents copied to the new location. If there isn't enough contiguous space, you will get an ACP file create failed message>>
That was the reason I wrote <<I can create new files in the directory>>, so fragmentation is not the problem here; I had that problem once (although many small files can contribute to fragmentation).
Assuming that I delete all the files, I guess it would be a good thing to delete the directory files and recreate brand new ones with the CREATE/DIRECTORY command. I wonder if I should use the /ALLOCATION qualifier.
Thanks.
08-29-2008 06:56 AM
Re: Some directories are hardly accessible.
OK, you did not design it, but I conclude that you are somehow responsible for keeping it running.
First, find out HOW and WHERE the files are created.
If it is in an embedded image, using a hard-coded reference to some physical location, and you have (for whatever reason) no possibility of changing the source, then I would say you are out of luck.
OTOH, if the code in any way uses a LOGICAL file location, then just a little DCL will solve things for you!
e.g., assume the program code uses an external file assignment: OK, spoof that.
e.g., assume the program creates its files in APPDIR; you are home again.
Even if the files go to APPDISK:[TODIR.DATA], you can DCL around that.
If the applic does SET DEFAULT and uses the default location; again all set to go.
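For the APPDIR case, the spoofing can be as simple as repointing the logical name at a search list. A sketch (the directory names below are invented for illustration):

```
$ ! New files are created in the first element of the search list;
$ ! lookups search both elements in order.
$ DEFINE/SYSTEM APPDIR DISK:[APPLI.CURRENT], DISK:[APPLI.ARCHIVE]
```

The application keeps opening "APPDIR:file.dat" and never notices the split.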
So, tell us which description applies to you, and we will probably be able to bend it into better shape.
Just tell us.
PS: it might be useful to disclose your architecture and VMS version, and any 3rd-party tooling used.
Proost.
Have one on me.
jpe
08-29-2008 06:59 AM
Re: Some directories are hardly accessible.
I would have to check if 7.1 had the /ALLOCATION qualifier, I do not have a 7.1 system accessible from where I am at this instant.
As labadie noted, search lists, when used correctly, do have the potential to make this a more manageable application. If one is careful, it can even be transparent to the application. Consider moving archival data to a secondary point in the search list.
Also, a collateral question is whether this system is using disk caching. Disk caching can significantly improve performance in these situations, for a variety of reasons (it also helps other applications by removing their traffic from the disk).
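A quick way to look at file caching on the system (a sketch; the exact statistics displayed depend on the VMS version and cache in use):

```
$ ! Display file-cache statistics: cache size, hit rate, etc.
$ SHOW MEMORY/CACHE
```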
Also consider whether outside performance expertise can be of assistance. While the directory churn is a good candidate for a performance issue, there may be other issues [Disclosure: My firm provides such services, as do other regular contributors to this forum].
- Bob Gezelter, http://www.rlgsc.com
08-29-2008 08:18 AM
Re: Some directories are hardly accessible.
References to the directory locations are both in images and in DCL command files. Long-term solutions are not the priority now; the concern now is to get out of this situation. You are absolutely right that a long-term solution is needed here. After I deleted files in the directory very slowly, the server admin used DFU to compress the directory, with good results. The cleaning is going much faster now. I intend to finally delete the directory and CREATE it again.
I am only the poor guy who supports the app, not the server admin and not a manager who decides anything. So getting external expertise would be great, but that is not in my hands. With these VAX apps, owners just want to keep them running somehow.
Thanks for your replies.
08-29-2008 01:31 PM
Re: Some directories are hardly accessible.
When cleaning up large directories, if you're removing a directory entry by deleting a file or renaming it into another directory, try to work backwards from lexically higher file names down (i.e., Z to A rather than A to Z). This is significantly faster than natural order, because removing entries from the end of the directory avoids shuffling all the following blocks down each time.
$ DIRECTORY/NOHEAD/NOTRAIL
can be used to generate lists of filenames. PIPE the output into SORT to invert the order.
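A sketch of the whole reverse-order cleanup, using a temporary file rather than PIPE so it works on older DCL too ([DATA] and the scratch file names are placeholders):

```
$ ! 1. List full file specs, one per line.
$ DIRECTORY/NOHEAD/NOTRAIL/OUTPUT=FILES.TMP [DATA]
$ ! 2. Sort descending so Z comes before A.
$ SORT/KEY=(POSITION:1,SIZE:80,DESCENDING) FILES.TMP FILES.REV
$ ! 3. Delete in that order, skipping any blank lines.
$ OPEN/READ lst FILES.REV
$ loop:
$ READ/END_OF_FILE=done lst f
$ IF f .EQS. "" THEN GOTO loop
$ DELETE 'f'
$ GOTO loop
$ done:
$ CLOSE lst
$ DELETE FILES.TMP;*, FILES.REV;*
```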
>recreate brand new ones with the
>CREATE/DIRECTORY command. I wonder if I
>should use the qualifier /ALLOCATION
>qualifier.
This won't help. The problem is simply too many files. If you reduce the number of files, the directory will perform well again. /ALLOCATION is unnecessary, as directories are never shrunk; they will retain the maximum allocation. DFU has a "compact" operation, but you DON'T want to use it, as you'll get better performance by having files distributed across the allocated space.
> Reference to the directory locations are
>both in images and in DCL command files.
Maybe now you see the value in always using logical names to reference directories? Even if the directory names are hard coded, it may still be possible to create a logical name search list using the device name and multiple concealed devices, which will appear to the application as a single directory but to the file system as many. If you want to pursue this path, please post an example of the file specification used by the application.
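As a sketch of that idea (every name below is invented for illustration; the real device and directory names would come from the application's file specification):

```
$ ! Two rooted, concealed directory trees presented as one "device".
$ DEFINE/SYSTEM/TRANSLATION=(CONCEALED,TERMINAL) APPDISK -
        DSA0:[APPDATA.SET1.], DSA0:[APPDATA.SET2.]
$ ! The application keeps using APPDISK:[TODIR.DATA]; new files go
$ ! under [APPDATA.SET1.TODIR.DATA], and lookups search both trees.
```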