Operating System - OpenVMS

ZIP limitation?

Art Wiens
Respected Contributor

ZIP limitation?

I am using Zip 2.3 (November 29th 1999) on Alpha v7.2-2. I have been busy zipping unmaintained directories chock full of 3- to 10-block text files. Several directories have had 10,000+ files and went without issue. Next in line is a particular directory with over 45,000 files in it!!

I start ZIP and get no error message or indication that anything bad is happening ... the process is doing "lots" of I/O, consuming CPU, etc., but is now taking over 3 hours to even start adding files to a new zip file. It has no files open, and no ZIP file has been created yet.

Is 45,000 too many files for ZIP?

Archunan
Honored Contributor

Re: ZIP limitation?

Please make sure you have the latest version of Zip from the following HP link, where you can find all the latest Zip and UnZip packages.

Ian Miller.
Honored Contributor

Re: ZIP limitation?

Zip has a 2 GB file size limit - perhaps you are running into that?

Accessing a directory of that size is going to be slow whatever you are doing.
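The 2 GB concern can be pre-checked before zipping. As a sketch only (the names `directory_bytes` and `fits_in_zip2` are hypothetical helpers, not part of Zip), here is a rough size tally of one flat directory in Python:

```python
import os

# Classic Info-ZIP 2.x limit: no archive over 2 GiB.
ZIP2_LIMIT = 2 * 1024 ** 3

def directory_bytes(directory):
    """Sum the sizes of the plain files in one directory (no recursion)."""
    total = 0
    with os.scandir(directory) as entries:
        for entry in entries:
            if entry.is_file():
                total += entry.stat().st_size
    return total

def fits_in_zip2(directory):
    """True if the uncompressed input is under the 2 GB limit."""
    return directory_bytes(directory) < ZIP2_LIMIT
```

Compressed output is smaller still, so passing this check leaves comfortable headroom.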
Purely Personal Opinion
Art Wiens
Respected Contributor

Re: ZIP limitation?

Archunan: The version I'm using is the same as what's on the Freeware v2.3.

Ian: No, it shouldn't be close to 2 GB - tons of files, but they're all small.

It finally did "start" after about 4 hours. What's it doing during this initial period? Obviously it must be "reading" the source files, but I don't see any open files for this process other than the executable.

I guess if I had to deal with 45,000 things, I might take a bit of time upfront to plan what I was going to do first ;-)

Ian McKerracher_1
Trusted Contributor

Re: ZIP limitation?

Hello Art,

Have a look at this link. It probably isn't the cause of your problem but it may be of interest.




Art Wiens
Respected Contributor

Re: ZIP limitation?

Thanks Ian. I did notice that it's creating the temporary ZI*.* file in my local directory - once it gets going. In my case though, several hours went by before it even opened the ZI file. Another 1.5 hours to add all 45,763 files into it.

Steven Schweda
Honored Contributor

Re: ZIP limitation?

> Please make sure your have the latest
> version of zip

This is good advice.

> from the following HP link, where you
> can find all the latest ZIP, UNZIP
> packages.

This is no better advice today than it was a
while ago. As I enjoy reminding people,
repeatedly in some cases, the latest
released versions are Zip 2.31 and UnZip
5.52, and the source kits are normally
available at or near:


Zip 2.31, for example, puts the "ZI*."
temporary file in the right place. If your
actual archive will be on a different device,
Zip 2.3 will need to copy the whole thing
after it's done, while Zip 2.31 will simply
(and quickly) rename it.
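The rename-versus-copy distinction is easy to see outside of Zip. A minimal sketch (the helper name `finish_archive` is hypothetical, not Zip's code) of the same trade-off in Python:

```python
import errno
import os
import shutil

def finish_archive(temp_path, final_path):
    """Move a finished temporary archive into place.
    Same device: a rename, effectively instant (the Zip 2.31 behavior).
    Different device: a full copy plus delete (what Zip 2.3 is stuck with)."""
    try:
        os.rename(temp_path, final_path)
    except OSError as e:
        if e.errno != errno.EXDEV:
            raise
        shutil.copy2(temp_path, final_path)
        os.remove(temp_path)
```

For a 200 MB archive, the copy path rewrites every block; the rename path touches only directory entries.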

I wouldn't bet that a newer Zip would work
any faster on a large number of files, but
I'd be interested to learn whether it does.
I doubt that anyone has tried a test like
this, so there could easily be some
previously unnoticed slow code in there.

What's the Zip command used? There could,
for example, be a problem in VMS wildcard
processing. I assume that you're not
explicitly specifying all 45000 file names.
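For a sense of what wildcard expansion involves, here is a rough model in Python (fnmatch semantics, not VMS wildcard rules, and `expand_pattern` is a hypothetical helper): every directory entry must be examined before the first file can be added, so the scan alone scales with directory size.

```python
import fnmatch
import os

def expand_pattern(directory, pattern):
    """Expand one wildcard pattern against one flat directory, roughly
    what Zip must do before it opens its output file: every entry is
    tested, so the scan cost grows with the number of entries."""
    with os.scandir(directory) as entries:
        return sorted(entry.name for entry in entries
                      if entry.is_file()
                      and fnmatch.fnmatch(entry.name, pattern))
```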

I've forgotten. Is VMS V7.2-2 too old for
the latest directory caching improvement?

It shouldn't be '"reading" the source files',
but it should be looking for them, as it
does need a list. If the directory look-ups
are slow, then Zip may be doomed to a
certain amount of undesirable sloth.
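One plausible explanation for hours of list-building, as a toy cost model only (this is arithmetic, not Zip's actual code): if each per-file lookup rescans the directory linearly, total work grows as the square of the file count.

```python
def list_build_cost(n_files, cached=False):
    """Toy model of building Zip's input list from an n_files-entry
    directory. Without caching, each lookup rescans all n_files entries,
    so total work is n_files squared; with caching it stays linear."""
    per_lookup = 1 if cached else n_files
    return n_files * per_lookup
```

By this model, going from 10,000 to 45,000 files costs about 20x more work, not 4.5x - consistent with hours rather than minutes.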

P.S. If you hit the 2GB limit, let me know.
Art Wiens
Respected Contributor

Re: ZIP limitation?

The command was:

$ zip "-Vwj" zipfile.zip $1$dga77:[dir.subdir]*.*;*

I was not in the source directory when I issued the command, the source directory contained 45,763 files. The resulting zip file ended up being 458,398 blocks.
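As a rough cross-check of what that command asks for (a sketch of the "junk the paths, one flat directory" shape only - the VMS-specific -V attribute saving and -w version-number handling have no analogue here, and `zip_flat` is a hypothetical name):

```python
import os
import zipfile

def zip_flat(directory, archive_path):
    """Archive every plain file in `directory` under its bare name
    (paths junked, as with Zip's -j), no recursion into subdirectories."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        with os.scandir(directory) as entries:
            for entry in sorted(entries, key=lambda e: e.name):
                if entry.is_file():
                    zf.write(entry.path, arcname=entry.name)
```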

Steven Schweda
Honored Contributor

Re: ZIP limitation?

Ok. It seems that VMS V7.2 was where the
directory cache improvement was made, so I
probably can't blame that. Of course, and I quote:

[T]he OpenVMS Wizard cannot and does not
recommend storing large numbers of files
in a single (very large) directory.


If you don't specify a non-default device
for the archive file itself
("dev:[dir]zipfile.zip"), then that
particular fix in Zip 2.31 won't affect you.
Other I/O speed improvements might, however,
so I'd still suggest using the newer version.
200 MB is not 2 GB, but it does take some
little while to write it. But in your case,
the prep time seems to swamp the I/O time.

When I get really bored, I may have a look at
the wildcard code to see if it does something
especially lame.

Wake me if things get any worse.
Steven Schweda
Honored Contributor

Re: ZIP limitation?

For a good time, I made a short command
procedure to create N similarly named files
("A_nnnnnn.dat", nnnnnn = "000001", ...) in
a directory. On an otherwise idle XP1000:
N = 10000, t = 0:22:41
N = 50000, t = 2:56:04

Looks non-linear to me.
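That experiment is easy to reproduce in miniature. A sketch (hypothetical helper names; on a modern extent- or B-tree-based file system the timings will look close to linear, so the superlinear growth above would be characteristic of large ODS-2 directory files, not of file creation in general):

```python
import os
import tempfile
import time

def create_files(directory, n, size=512):
    """Create n small files named A_000001.dat ... A_nnnnnn.dat,
    mimicking the command procedure described above."""
    payload = b"x" * size
    for i in range(1, n + 1):
        with open(os.path.join(directory, "A_%06d.dat" % i), "wb") as f:
            f.write(payload)

def time_creation(n):
    """Wall-clock seconds to create n files in a fresh directory."""
    with tempfile.TemporaryDirectory() as d:
        start = time.perf_counter()
        create_files(d, n)
        return time.perf_counter() - start
```

Comparing time_creation(N) against time_creation(5 * N) shows whether the cost is linear: 5x the files should take about 5x the time, not the ~7.8x seen in the figures above.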

Given that it takes three hours just to
create 50000 such small files, I'd bet that
doing any significant directory work on that
mess would be comparably slow, and thus that
there's not much hope of speeding up Zip in
this situation.

See also: