- millions of files per directory
04-19-2007 11:04 PM
04-19-2007 11:11 PM
Re: millions of files per directory
Other operations, such as backups or filesystem synchronisation, may also be prolonged because of the sheer number of files that have to be opened and written.
As far as I am aware, the inode limit determines how many files a filesystem can hold, since each file that is added consumes an entry in the inode table.
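If you want to see how close a filesystem is to that limit, df on HP-UX can report inode usage (the mount point below is a stand-in for your own; note that VxFS allocates inodes dynamically, whereas HFS fixes the count when the filesystem is created):

```shell
# Show used and free inode counts for the filesystem holding /data
# (/data is a hypothetical mount point -- substitute your own).
df -i /data
```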
04-19-2007 11:12 PM
Re: millions of files per directory
04-19-2007 11:16 PM
Re: millions of files per directory
04-19-2007 11:25 PM
Re: millions of files per directory
The best thing to do (if possible) is to use some sort of hashing algorithm to spread these millions of files across a tree of subdirectories.
For example, given a bunch of files ranging from 'a000000' to 'c999999', you could start with subdirectories 'a', 'b', and 'c' (each good for one million files). Within those you could then have subdirectories '000' to '999', each holding one thousand files.
A simplified example, of course, but I'd try implementing something like this in order to avoid the complications described above.
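A minimal sketch of that scheme in shell (it assumes the 'a000000'..'c999999' naming from the example above; `hash_path` and `migrate` are made-up names, and you would adjust the character slicing to your real filenames):

```shell
#!/bin/sh
# Hash filenames like 'a000123' into subdirectories: the first
# character gives the top-level dir, the next three digits give
# the second level, so 'a000123' lands in a/000/.
hash_path() {
    top=$(echo "$1" | cut -c1)       # 'a' .. 'c'
    mid=$(echo "$1" | cut -c2-4)     # '000' .. '999'
    echo "$top/$mid"
}

# Move everything in a flat source directory into the hashed tree.
migrate() {
    src=$1
    dst=$2
    for f in "$src"/*; do
        name=$(basename "$f")
        d=$dst/$(hash_path "$name")
        mkdir -p "$d" && mv "$f" "$d/"
    done
}
```

One caveat: with millions of entries the `"$src"/*` glob holds every name in memory at once; an `ls "$src" | while read name` loop is gentler on a huge directory, at the cost of mishandling any names with embedded newlines.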
Cheers,
Wout
04-19-2007 11:34 PM
Re: millions of files per directory
Set up a script, as mentioned, to move the files into subdirectories, or the environment will become unmanageable.
HTH
Chris
04-19-2007 11:50 PM
Divide and conquer. That said, by using a current VxFS (JFS) release (e.g. 4.1 or later) with the latest version layout, and choosing the mount options that meet your needs while offering the best performance, you can probably achieve some gains.
Have a look at this white paper on JFS performance and tuning:
http://docs.hp.com/en/5576/JFS_Tuning.pdf
Another good source of mount options as they relate to VxFS filesystem performance is the 'mount_vxfs' manpage. You might find, for instance, that mounting with 'noatime' helps speed up your filesystem searches, if searching is their predominant activity.
http://docs.hp.com/en/B2355-60105/mount_vxfs.1M.html
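For instance, a noatime mount might look like the line below. This is a sketch only: the volume and mount-point names are made up, and the exact options supported depend on your VxFS version, so check mount_vxfs(1M) first:

```shell
# -F vxfs selects the VxFS filesystem type on HP-UX;
# noatime suppresses the access-time update on every read.
# /dev/vg01/lvol_data and /data are hypothetical names.
mount -F vxfs -o noatime /dev/vg01/lvol_data /data
```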
Regards!
...JRF...
04-20-2007 01:14 AM
Re: millions of files per directory
04-20-2007 02:37 AM
Re: millions of files per directory
You are essentially using the directory as a database - something that it was never intended to do.
04-22-2007 01:36 AM
Re: millions of files per directory
Could I ask why? Could you work out a front-end access script/program that would sort and store the files in separate directories instead?
-tjh