DFU

 
SOLVED
Mulder_1
Frequent Advisor

DFU

How do I defragment a disk using DFU?

Is there any documentation available on the procedure?

Thanks
4 REPLIES
marsh_1
Honored Contributor

Re: DFU

Hi,

See this previous post for links to the docs.

http://forums11.itrc.hp.com/service/forums/questionanswer.do?threadId=1169587

Hope this helps

Jon Pinkley
Honored Contributor

Re: DFU

FOX2,

The current version of DFU, V3.2, does not defragment a whole disk the way products like DFO (aka DFG), PerfectDisk, or Diskeeper do. DFU will let you defragment a specific file or a list of files. DFU can also defragment the INDEXF.SYS file (in offline mode).

Jon
it depends
John Gillings
Honored Contributor
Solution

Re: DFU

FOX2, you can use DFU SEARCH/FRAG to determine which files are fragmented. The output can then be fed back into DFU to defrag them.

The attached command procedure uses this mechanism, via pipes, sorting the output so that the most heavily fragmented files are defragged first.

Parameters are the disk to be defragmented and the fragmentation threshold (i.e. any file with more than the threshold number of fragments is defragged; the default is 1).
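The attachment itself is not reproduced in this thread, but the mechanism John describes can be sketched in DCL roughly as follows. This is a hypothetical illustration, not the actual attached procedure: the DFU DEFRAG command name, the /FRAG=MIN and /OUTPUT qualifiers, and the output format of DFU SEARCH are assumptions -- check HELP inside DFU V3.2 before relying on any of them, and note the real procedure also sorts by fragment count.

```
$ ! Hypothetical sketch only: find fragmented files, then feed each
$ ! file name back into DFU to defragment it.
$ ! P1 = disk to process, P2 = fragment threshold (default 1)
$ if p2 .eqs. "" then p2 = 1
$ mcr dfu search /frag=min='p2' /output=frag_tmp.lis 'p1'
$ open /read lst frag_tmp.lis
$ loop:
$   read /end_of_file=done lst line
$   file = f$element(0, " ", f$edit(line, "TRIM,COMPRESS"))
$   mcr dfu defrag 'file'
$   goto loop
$ done:
$ close lst
$ delete frag_tmp.lis;*
```

The point of the pipe/sort arrangement in the real procedure is simply ordering: defragging the worst files first gives the biggest payoff if the run is interrupted.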
A crucible of informative mistakes
Jon Pinkley
Honored Contributor

Re: DFU

FOX2,

What problem are you trying to solve by defragmenting the disk? Is some operation failing? If so, what operation fails, and what error message are you getting?

My point is that even if every file on a disk is contiguous, you can still get file creation failures if the free space is fragmented: when a directory file needs to grow and there is insufficient contiguous space on the volume, the extension fails (directory files must be contiguous on ODS-2). For example, free space may be 200,000 blocks, but the largest free extent may be only 5,000 blocks. If your existing directory is already larger than 5,000 blocks, the directory expansion will fail, and as a result creation of the file will fail, even though there is sufficient free space on the volume.

If increasing the largest contiguous space is the problem you are attempting to fix, then defragmenting the files will probably make the problem worse, unless the defragmentation procedure also consolidates the free space. DFU V3.2 does not do free space consolidation.
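To check whether free-space fragmentation, rather than file fragmentation, is the real constraint, DFU's report facility can show volume statistics such as the largest contiguous free extent. The command name below is taken from the DFU documentation, but the device name is an example and the exact output varies by version -- verify against HELP in your copy of DFU:

```
$ ! Show volume statistics, including free-space fragmentation
$ ! (total free blocks vs. largest contiguous free extent).
$ mcr dfu report dka100:
```

If the largest free extent is much smaller than the total free space (e.g. 5,000 of 200,000 blocks, as above), defragmenting individual files will not help by itself.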

If the problem you are attempting to fix is insufficient contiguous space, then, counterintuitively, increasing the fragmentation of an existing file may be the short-term answer. You can use DFU to search for large files that are stored in a single piece with the command:

$ define dfu$nosmg 1
$ mcr dfu search /size=min=10000/allocated/sort/frag=max=1 disk:

That will give an alphabetically sorted list of files that have at least 10,000 blocks allocated in a single extent. Choose one that is used infrequently, and then make a copy of it to the next highest version. The copy will be fragmented, but if the file isn't used much, the performance penalty will be insignificant. After the copy is made, you can delete the original file that was in one piece, and you now have a contiguous chunk of free space that can be used to extend the directory.

You may want to wait until you are ready to do the operation that needs the contiguous space before deleting the original contiguous file. Otherwise some other file creation or extension may use blocks from the contiguous chunk.
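Putting those steps together, the workaround might look like this in DCL. The device and file names are examples only, and the DFU qualifiers are as quoted earlier in this reply:

```
$ ! 1. Find large single-extent files (command from above)
$ define dfu$nosmg 1
$ mcr dfu search /size=min=10000/allocated/sort/frag=max=1 dka100:
$ ! 2. Copy an infrequently used one to a new (fragmented) version
$ copy /log dka100:[data]archive.dat;1 dka100:[data]archive.dat
$ ! 3. Just before the operation that needs the contiguous space,
$ !    delete the original contiguous version to free the extent
$ delete dka100:[data]archive.dat;1
```

Deleting at the last moment (step 3) matters because any intervening file creation or extension could otherwise grab blocks out of the freed extent.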

Jon
it depends