Operating System - HP-UX
If there are a limitation of the file numbers in a directory?
05-23-2002 07:00 PM
Solved! Go to Solution.
05-23-2002 07:09 PM
Re: If there are a limitation of the file numbers in a directory?
See man newfs_hfs; there is an option -i number_of_bytes_per_inode.
Decreasing this value increases the number of inodes in the file system, which is what you need when it must hold many small files.
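As a sketch of what the advice above looks like in practice (the volume and mount point below are hypothetical; check the newfs_hfs(1M) man page on your system before running anything):

```shell
# Create an HFS file system with one inode per 2048 bytes instead of the
# default, leaving room for many small files. /dev/vg01/rlvol_data is a
# made-up device name for illustration only.
newfs -F hfs -i 2048 /dev/vg01/rlvol_data

# On HP-UX, bdf -i reports inode usage, so you can check whether an
# existing file system is running out of inodes rather than blocks.
bdf -i /data
```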
05-23-2002 07:10 PM
Re: If there are a limitation of the file numbers in a directory?
http://forums.itrc.hp.com/cm/QuestionAnswer/1,,0xbacd91ccb36bd611abdb0090277
Hope it helps.
05-23-2002 07:12 PM
Re: If there are a limitation of the file numbers in a directory?
And another one
http://forums.itrc.hp.com/cm/QuestionAnswer/1,,0xcd745220af9bd5118ff10090279cd0f9,00.html
Regards
Steve
05-23-2002 07:15 PM
Re: If there are a limitation of the file numbers in a directory?
http://forums.itrc.hp.com/cm/QuestionAnswer/1,,0xbacd91ccb36bd611abdb0090277a778c,00.html
05-23-2002 10:17 PM
Re: If there are a limitation of the file numbers in a directory?
grep LINK_MAX /usr/include/limits.h
=> 32767
I think it's good to know (we learned it the hard way!).
So if you want to limit the number of files in one directory by branching, branch fast enough (e.g. 0000/0000/0000/0001.txt).
Using this scheme, we're managing some 20,000,000 files.
Hein Coulier
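The branching scheme above (fixed-width numeric path components, so no single directory grows large) can be sketched as a small shell function. The 4-digit grouping and .txt suffix follow the example path in the post; the function name and the 16-digit width are my own choices for illustration:

```shell
# Map a numeric file id to a nested path like 0000/0000/0000/0001.txt,
# so each directory level holds at most 10000 entries.
branch_path() {
  p=$(printf '%016d' "$1")            # zero-pad the id to 16 digits
  printf '%s/%s/%s/%s.txt' \
    "$(echo "$p" | cut -c1-4)"  "$(echo "$p" | cut -c5-8)" \
    "$(echo "$p" | cut -c9-12)" "$(echo "$p" | cut -c13-16)"
}

branch_path 1           # -> 0000/0000/0000/0001.txt
echo
branch_path 123456789   # -> 0000/0001/2345/6789.txt
```

cut is used instead of bash-only substring expansion so the sketch stays portable to older POSIX shells.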
05-24-2002 12:29 AM
Solution
Having more than (say) 2000 files directly in one directory will produce considerable performance problems when accessing that directory. I experienced this in a production environment, when a runaway process had created thousands of /tmp files.
An innocent command like "rm *.old" might not be executed, because shell command-line expansion produces more than 20k.
Thanks, Klaus
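The "rm *.old" failure Klaus describes happens because the shell expands the glob into one huge argument list before rm ever runs. A common workaround (sketched here with mktemp and find's -exec ... + batching, which may not be available on an HP-UX box of that era) is to let find hand the names to rm itself:

```shell
# Create a few throwaway *.old files in a scratch directory ...
demo=$(mktemp -d)
for i in 1 2 3 4 5; do : > "$demo/file$i.old"; done

# ... then delete them without the shell ever expanding a glob:
# find invokes rm in batches, so the argument-list limit is never hit.
find "$demo" -name '*.old' -exec rm {} +
```

The same pattern works for any command that would otherwise choke on a very long expansion.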