06-21-2006 06:53 PM
UNIX tar limitation
I ran into a scenario where about 10,000 files needed to be tarred, but the tar utility failed to archive them.
I'm running tar from a bash shell.
Could anyone tell me whether there is a limit on the number of files that tar can handle? And is any tweaking of the shell needed for tar to cope with that many files?
Thanks in advance.
Danny
06-21-2006 06:59 PM
Re: UNIX tar limitation
I have already used tar for more than 10,000 files.
Did you try to tar more than 2 GB of data? There is a 2 GB limitation.
Or did you use wildcards in your tar command? With that many files, using wildcards will get you an error from the shell, which tries to expand the wildcard into a list of files. In that case, try running tar on a directory rather than a file list:
instead of tar cf mytar.tar /your/dir/* use tar cf mytar.tar /your/dir
Regards Stefan
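The difference can be sketched like this (paths and filenames here are placeholders; a throwaway directory stands in for /your/dir):

```shell
# Stand-in for /your/dir: a throwaway directory with a few files.
dir=$(mktemp -d)
touch "$dir/a.txt" "$dir/b.txt"

# Glob form: the SHELL expands the * into an argument list before
# tar ever runs, and fails once that list exceeds the kernel limit:
#   tar cf mytar.tar "$dir"/*
# Directory form: tar walks the tree itself, so the argument list
# stays tiny no matter how many files the directory holds:
tar cf mytar.tar "$dir"
tar tf mytar.tar
```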
06-21-2006 07:01 PM
Re: UNIX tar limitation
The default tar has a limitation. You can install PHCO_28992 and check whether it works. Otherwise, get GNU tar:
http://hpux.connect.org.uk/hppd/hpux/Gnu/tar-1.15.1/
http://www2.itrc.hp.com/service/patch/patchDetail.do?patchid=PHCO_28992&sel={hpux:11.11,}&BC=main|search|
-Arun
06-21-2006 07:03 PM
Re: UNIX tar limitation
With the tar patch you can tar files up to 8 GB, but I don't know whether there is a limit on the number of files.
06-21-2006 07:26 PM
Re: UNIX tar limitation
"Because of industry standards and interoperability goals, tar does not
support the archival of files larger than 8GB or files that have
user/group IDs greater than 2048k. Files with user/group IDs greater
than 2048k are archived and restored under the user/group ID of the
current process, unless the uname/gname exists (see tar(4))."
man tar
06-21-2006 07:36 PM
Re: UNIX tar limitation
You can use the GNU version of tar to overcome 2GB file size limitation.
Here it is:
http://hpux.cs.utah.edu/hppd/hpux/Gnu/tar-1.15.1/
[ Although tar is supplied with HP-UX, gtar has more options: it handles absolute paths as relative paths, restores the original dates of directories when extracting, and supports large files. The gettext and libiconv packages should be installed before installing tar. ]
Cheers,
Raj.
06-21-2006 08:37 PM
Re: UNIX tar limitation
Thanks to all who have shared their knowledge in this thread.
However, the bigger issue is the number of files that tar can handle, not just their size.
The tar operation is done as:
/usr/bin/tar -cEf "$GROUPID.tar" *"$GROUPID"*"statsfile.GPEH.xml"
so it is not run on the entire directory.
Any ideas anyone?
Thanks
Danny
06-21-2006 08:46 PM
Re: UNIX tar limitation
The problem is the use of wildcards. Your shell tries to expand those wildcards and creates an awfully long command line.
So you have to work around this. Use find to identify your files and hand the list over to tar via xargs, or move the files to a separate directory and then tar the whole directory.
Regards Stefan
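A minimal sketch of the find/xargs route (the directory layout and filename pattern here are made up for illustration). One detail worth noting: xargs may invoke tar more than once for a very long list, so the append flag (r) is used rather than c, which would overwrite the archive on every batch:

```shell
# Made-up pattern and filenames for illustration. find emits the
# list on stdout; xargs batches it into tar calls of a safe length.
# tar r APPENDS to the archive, so later xargs batches add to it
# instead of overwriting what earlier batches wrote.
dir=$(mktemp -d)
cd "$dir"
for i in 1 2 3; do touch "group${i}_statsfile.GPEH.xml"; done

find . -name "*statsfile.GPEH.xml" -print | xargs tar rf stats.tar
tar tf stats.tar
```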
06-21-2006 11:06 PM
Re: UNIX tar limitation
When you mentioned that the shell attempts to expand those wildcards and creates an awfully long command line, may I know what error message you got when you experimented with this?
May I know how using "find" coupled with "xargs", or moving the files to a separate directory and then tar-ing the whole directory, would help with this issue?
I'd greatly appreciate it if you could shed some light on these questions.
Thanks
Danny
06-21-2006 11:18 PM
Re: UNIX tar limitation
echo *"$GROUPID"*"statsfile.GPEH.xml"
The result is what tar will see on the command line. The command line cannot be infinitely long, which is why long lists of filenames must be fed to tar through utilities like find and xargs.
Bill Hassell, sysadmin
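The limit in question is the kernel's cap on the combined size of a command's arguments and environment, which getconf can report (the value varies by system; on older HP-UX releases it was far smaller than on a modern Linux box). A quick way to inspect both the limit and the size of a glob expansion:

```shell
# The kernel limits the combined byte size of a command's argument
# list plus environment; getconf reports that limit.
getconf ARG_MAX

# To see how big a glob expansion really gets, count its bytes.
# echo is a shell builtin here, so it is not subject to the kernel
# limit the way an external command like tar is.
cd /tmp
echo * | wc -c
```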
06-21-2006 11:26 PM
Re: UNIX tar limitation
A wildcard in a command is not interpreted by the command itself but by the shell, which then passes the expanded list to the command.
So if you do something like tar cf xxx.tar *.txt and you have 1.txt, 2.txt ... 100.txt, the shell will expand the wildcard and invoke tar cf xxx.tar 1.txt 2.txt 3.txt 4.txt ... 100.txt.
Unfortunately, a command line cannot be infinitely long.
If you move all files to another directory, you can just tar that directory without using wildcards. Once you have found and moved all the desired files, there is no need to call tar with wildcards (e.g. tar cf xxx.tar ./my_tar_directory).
Or use find with xargs to pass this long list of files to tar. Something like:
find . -name "yourexpression" | xargs tar rf yourtarfile.tar
(note the r flag: xargs may run tar several times for a long list, and c would overwrite the archive on each run, while r appends)
See man xargs for more information.
Regards Stefan
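If GNU tar is installed (as suggested earlier in the thread), the xargs batching issue can be sidestepped entirely: GNU tar's -T option reads the member list from a file, and "-" means stdin, so the whole find output goes to a single tar invocation. A self-contained sketch with made-up filenames:

```shell
# Made-up filenames for illustration. GNU tar's -T ("--files-from")
# reads member names from a file; "-" means stdin, so the entire
# find output reaches ONE tar invocation, however long the list is.
dir=$(mktemp -d)
cd "$dir"
for i in 1 2 3 4 5; do touch "file$i.dat"; done

find . -name "*.dat" -print | tar cf all.tar -T -
tar tf all.tar
```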