08-08-2006 09:32 AM
Here is an example of our problem on one of our nodes.
08-08-2006 09:49 AM
08-08-2006 10:05 AM
Re: Help
Again, the title "Help" is not very descriptive. It certainly won't help anyone trying to find this topic in the future. Please change it to something like "How to fix file fragmentation".
The files you describe are fragmented, but not too severely. You can't do anything about it while they're open. Assuming you have some time window when they're closed, I'd suggest you CONVERT each file to clean it up both inside and outside.
$ CONVERT [RMS]EDOCHIST.DAT [RMS]EDOCHIST.DAT
Given the level of fragmentation, this is NOT urgent. After the CONVERT, if the file is still fragmented, try:
$ COPY/CONTIGUOUS [RMS]EDOCHIST.DAT [RMS]EDOCHIST.DAT
To avoid the file fragmenting in the future, work out its likely growth pattern and give it an appropriate EXTEND quantity. If I'm reading your report correctly, the file is around 450K blocks, so 50K blocks wouldn't be unreasonable:
$ SET FILE/EXTEND=50000 [RMS]EDOCHIST.DAT
You need to know the data to decide on the best value. How much do you expect the file to grow? How much disk space are you prepared to allocate to it now?
Of course, this will only help if you have sufficient contiguous free space.
If you have it, run DEFRAG against the disk to consolidate the free space.
It's not clear from the display what your "free space issues" are. If you're running low, the most cost effective solution is often "buy more"!
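For anyone finding this later, the CONVERT-based cleanup can be sketched end to end like this (file names follow the example above; the FDL file name is illustrative):

```
$ ! Generate an FDL describing the file's current attributes
$ ANALYZE/RMS_FILE/FDL=EDOCHIST.FDL [RMS]EDOCHIST.DAT
$ ! Optionally adjust ALLOCATION and EXTENSION in the FDL first
$ EDIT/FDL EDOCHIST.FDL
$ ! Rebuild the file (creates a new, higher version)
$ CONVERT/FDL=EDOCHIST.FDL [RMS]EDOCHIST.DAT [RMS]EDOCHIST.DAT
$ ! Once the new version checks out, purge the old one
$ PURGE [RMS]EDOCHIST.DAT
```

The file must be closed for the duration, as noted above.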
08-08-2006 11:12 AM
Re: Help
For that number of files, my first attempt would be to do CONVERT with an appropriate FDL (as John said).
I would also consider what the expansion requirements of these files are, and allocate the new files accordingly.
If the disk has a big enough block of free space, this should work. If the entire disk is fragmented, I would consider using a defragmenter.
- Bob Gezelter, http://www.rlgsc.com
P.S. I second John's comments about titling the post in a clearer fashion.
08-08-2006 01:15 PM
Re: Help
In addition to the other replies you have, I'll chime in with the budget answer.
BACKUP can be used as a defragmentation utility and it's included with the operating system. If you have a spare disk, make sure all files are closed and create a disk-to-disk /IMAGE copy. Either update the logicals used for data access or change the disk identifier.
If you don't have a spare device, you can do a tape backup followed by an image restore. Be very sure your backup device is reliable before trying this; the /VERIFY qualifier is recommended.
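As a sketch of both variants (device names here are examples only, not from the original post):

```
$ ! Disk-to-disk image copy; files land defragmented on the output disk
$ BACKUP/IMAGE/VERIFY DKA100: DKA200:
$
$ ! Or via tape: image save, then image restore onto the source disk
$ BACKUP/IMAGE/VERIFY DKA100: MKA500:FULL.BCK/SAVE_SET
$ BACKUP/IMAGE/VERIFY MKA500:FULL.BCK/SAVE_SET DKA100:
```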
As mentioned above, set a reasonable file extent to help avoid fragmentation issues in the future.
Andy
08-08-2006 06:10 PM
Re: Help
BTW: maybe the files didn't get defragmented because they were open during the defragmentation run (if there was one).
Wim
08-08-2006 06:27 PM
Re: Help
If you haven't already, I'd get a copy (for free) of DFU from here
http://h71000.www7.hp.com/freeware/freeware70/dfu/
...It's very quick at a whole bunch of useful tasks (it has online help) and can defrag individual files just fine.
If you want something that'll defrag more than an individual file, HP's Disk File Optimizer (DFO) offers several levels of defragmentation. You can download and use the reporting version for free here:
http://h71000.www7.hp.com/openvms/storage/dfopage.html
It's been a while, but at the time we felt the cost was reasonable and justified for the volumes we can't easily backup/image and restore, so we use it regularly to consolidate free space.
One final point: if any of these files are created regularly (new ;1 versions from a month-end batch run, say) and you know how big they get, you could investigate changing the application to preallocate the required space when it creates them and/or to apply an extension quantity too. If they're created via .FDL files then you can also specify those values within the FDL.
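To illustrate that last point, an FDL fragment along these lines preallocates the file and sets a generous extension quantity (the organization and values shown are hypothetical; take yours from ANALYZE/RMS_FILE/FDL on the real file):

```
FILE
        ORGANIZATION            sequential
        ALLOCATION              450000
        EXTENSION               50000
        BEST_TRY_CONTIGUOUS     yes
```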
Kind Regards
John
08-08-2006 08:53 PM
Re: Help
Is the performance of your application currently acceptable and are these files frequently accessed?
Look at the window turn rate (MONITOR FCP) and the split I/O rate (MONITOR IO). Unless these are a significant percentage of your I/O, the fragmentation is not a problem. Don't fix problems that are having no significant impact.
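A sketch of that check (the recording file name and interval are just examples):

```
$ ! Watch the Window Turn Rate field
$ MONITOR FCP
$ ! Watch the Split Transfer Rate field
$ MONITOR IO
$ ! Or record both classes over a period for later analysis
$ MONITOR FCP,IO /RECORD=FRAG.DAT /INTERVAL=60
```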
Purely Personal Opinion
08-09-2006 05:54 AM
Re: Help
Even if you DO have a relatively high window turn rate, this CAN be overcome fairly easily.
You can enable "cathedral windows", i.e., always load ALL of a file's header mapping pointers into memory.
You achieve this by setting the SYSGEN parameter ACP_WINDOW to 255 (the maximum, which has the special meaning "ALL").
Unfortunately this requires a reboot, and it is better left to AUTOGEN to calculate some related parameters.
This completely eliminates window turns.
There are those who say it is as good as the best defragger.
I have some doubts about that, but it certainly eliminates all defragger overhead. Note, though, that it does NOT consolidate free space!
(You DO need a bit more physical memory, on the order of the total size of the oversized file headers: several KB per very fragmented file, probably less than 1 MB total if you have fewer than 100 VERY fragmented files.)
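The usual way to make that change, letting AUTOGEN recompute the related parameters, is via MODPARAMS.DAT rather than SYSGEN directly (a sketch; run from a suitably privileged account):

```
$ ! Add the override to MODPARAMS.DAT ...
$ OPEN/APPEND MP SYS$SYSTEM:MODPARAMS.DAT
$ WRITE MP "ACP_WINDOW = 255"
$ CLOSE MP
$ ! ... then let AUTOGEN recalculate dependent parameters and reboot
$ @SYS$UPDATE:AUTOGEN GETDATA REBOOT FEEDBACK
```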
fwiw
Proost.
Have one on me.
jpe
08-09-2006 06:38 AM
Re: Help
It depends on what your problem (if any) is.
Purely Personal Opinion