millions of files per directory
04-19-2007 11:04 PM
04-19-2007 11:11 PM
Re: millions of files per directory
Other tasks, like backups or filesystem synchronisation, may also be prolonged because of the sheer number of files that have to be opened and written.
As far as I am aware, the inode limit is the setting that determines how many files a filesystem can hold, since each file is added to the inode table.
04-19-2007 11:12 PM
Re: millions of files per directory
04-19-2007 11:16 PM
Re: millions of files per directory
04-19-2007 11:25 PM
Re: millions of files per directory
The best thing to do (if possible) is to use some sort of hashing algorithm to spread these millions of files across a tree of subdirectories.
For example, given a bunch of files ranging from 'a000000' to 'c999999', you could start with subdirectories 'a', 'b', and 'c' (each good for one million files). Within these directories you could then have subdirectories '000' to '999', each holding one thousand files.
A simplified example, of course, but I'd try implementing something like this in order to avoid the complications described above.
Cheers,
Wout
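The scheme described above can be sketched in a few lines of Python. The filename range ('a000000' to 'c999999') and the two-level split come from the example in the post; the function name and the root directory are illustrative.

```python
import os

def hashed_path(root, name):
    """Map a filename like 'a123456' to <root>/a/123/a123456,
    following the two-level split described above."""
    top = name[0]       # 'a'..'c': one million files per top-level bucket
    mid = name[1:4]     # '000'..'999': one thousand files per leaf directory
    return os.path.join(root, top, mid, name)

print(hashed_path("/data", "a123456"))  # -> /data/a/123/a123456
```

Any split that derives the bucket deterministically from the name works; the point is to keep each directory down to a size the filesystem handles comfortably.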
04-19-2007 11:34 PM
Re: millions of files per directory
Set up a script, as mentioned, to move the files into subdirectories, or the environment will become unmanageable.
HTH
Chris
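A minimal sketch of such a move script, assuming the first-character/next-three-digits bucket layout suggested earlier in the thread; the directory names are illustrative.

```python
import os

def migrate(src_dir, dest_root):
    """Move every file in a flat directory into
    <dest_root>/<first char>/<chars 2-4>/ buckets."""
    for name in os.listdir(src_dir):
        bucket = os.path.join(dest_root, name[0], name[1:4])
        os.makedirs(bucket, exist_ok=True)        # create the bucket on demand
        os.rename(os.path.join(src_dir, name),    # same-filesystem move
                  os.path.join(bucket, name))
```

Note that os.rename only works within a single filesystem; across mount points you would use shutil.move instead.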
04-19-2007 11:50 PM
Divide and conquer. That said, by using a current VxFS (JFS) release (e.g. 4.1 or later) with the latest version layout, and mount options that meet your needs while offering the best performance, you can probably achieve some gains.
Have a look at this white paper on JFS performance and tuning:
http://docs.hp.com/en/5576/JFS_Tuning.pdf
Another good source of mount options as they relate to performance for VxFS filesystems is the manpage for 'mount_vxfs'. You might find, for instance, that mounting with 'noatime' helps speed up your filesystem searches if that is their predominant activity.
http://docs.hp.com/en/B2355-60105/mount_vxfs.1M.html
Regards!
...JRF...
04-20-2007 01:14 AM
Re: millions of files per directory
04-20-2007 02:37 AM
Re: millions of files per directory
You are essentially using the directory as a database, something it was never intended to be.
04-22-2007 01:36 AM
Re: millions of files per directory
Could I ask why? Could you work out a front-end access script/program that would sort and store programs in separate directories instead?
-tjh
04-22-2007 01:56 PM
Re: millions of files per directory
The product is called Centera, and it is the first successful commercial product using CAS (content addressable storage). Millions of files are trivial: we have dozens of terabytes of small files on Centeras. NOTE: there is no directory structure at all, so you need a database to keep track of the special name for each file. Also, performance is limited; it is really designed for low-volume access such as data archiving, somewhere between disk arrays and tape silos.
As Clay mentioned, you have a developer problem that sounds suspiciously like a way to avoid buying a real database program (i.e., every part number in the company has a small file with data in it). VxFS does not have any practical limitation on inodes since they are built dynamically. As long as you have space, you can add more files.
But don't run searches (i.e., ls *, find, etc.) and expect anything faster than responses measured in minutes. Once the developer tries to make this work, you'll probably get Version 2 of the software, where a small database tracks all the files with binary searches, or maybe a hash algorithm... hmmm, who knows, you may end up with Version 3, which is a real database.
Note: if you want the best possible performance with millions of files, you MUST install 11.31, as there are specific enhancements for massive directories. And no, it still won't perform like a real database.
Bill Hassell, sysadmin
04-22-2007 06:40 PM
Re: millions of files per directory
Perhaps ReiserFS could handle these things (but it is not available on HP-UX); besides, it is not only the filesystem itself that has to be efficient with it.
Try as hard as you can to get this data into a database; that is what databases are made for and what they do best. They will outperform any filesystem for sure.
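The database advice can be sketched with SQLite (purely illustrative; nothing in the thread names a specific product): index the name-to-path mapping once, then look files up through the index instead of scanning a huge directory.

```python
import sqlite3

# Build an index of (name, path) pairs in memory; a real deployment
# would populate it from the filesystem and keep it on disk.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (name TEXT PRIMARY KEY, path TEXT)")
db.executemany(
    "INSERT INTO files VALUES (?, ?)",
    ((f"a{i:06d}", f"/data/a{i:06d}") for i in range(100_000)),
)

# The PRIMARY KEY gives a B-tree lookup: a handful of page reads
# instead of a linear scan over a multi-million-entry directory.
row = db.execute("SELECT path FROM files WHERE name = ?", ("a054321",)).fetchone()
print(row[0])  # -> /data/a054321
```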
04-22-2007 09:49 PM
Re: millions of files per directory
Bill, I assume you are referring to layout 5 of VxFS 3.5? We are installing it for testing now.
I have also found XFS, developed by SGI and ported to SUSE Linux. XFS can also support a large number of files in a directory. We will be testing this as well.
04-23-2007 01:36 AM
Re: millions of files per directory
-tjh
04-24-2007 12:17 PM
Re: millions of files per directory
Bill Hassell, sysadmin