Operating System - HP-UX
09-20-2005 04:07 AM
Two Itanium systems: SSH works one place, fails another
We have two near-identical Itanium boxes. On one, OpenSSH 3.9 works like a charm; on the other, connections are continually dropped and the daemon dies. After moving to 4.2, connections still drop but the daemons survive. I diffed the kernel parameters on both boxes; the working box is on the left, the broken box on the right. Any ideas?
< maxfiles_lim 4096 4096 Immed
---
> maxfiles_lim 2048 2048 Immed
62c62
< maxuprc 3360 ((nproc*8)/10) Immed
---
> maxuprc 1654 ((nproc*8)/10) Immed
64c64
< msgmap 4202 (msgtql+2)
---
> msgmap 2070 (msgtql+2)
67,68c67,68
< msgmni 4200 (nproc)
< msgseg 16800 (msgtql*4)
---
> msgmni 2068 (nproc)
> msgseg 8272 (msgtql*4)
70c70
< msgtql 4200 (nproc)
---
> msgtql 2068 (nproc)
74,78c74,78
< nfile 65536 65536 Imm (auto disabled)
< nflocks 4200 (nproc) Imm (auto disabled)
< ninode 35648 (8*nproc+2048)
< nkthread 8416 8416 Immed
< nproc 4200 4200 Immed
---
> nfile 30000 30000 Imm (auto disabled)
> nflocks 2068 (nproc) Imm (auto disabled)
> ninode 18592 (8*nproc+2048)
> nkthread 6000 6000 Immed
> nproc 2068 2068 Immed
91c91
< physical_io_buffers 1280 Default Auto
---
> physical_io_buffers 640 Default Auto
97c97
< scroll_lines 100 Default Immed
---
> scroll_lines 100 Default
104c104
< semmnu 4196 (nproc-4)
---
> semmnu 2064 (nproc-4)
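Note that most of the gaps between the two boxes are not independent settings: they cascade from nproc (4200 on the working box, 2068 on the broken one) through the formulas shown in the diff. A quick POSIX-shell sanity check, using only the nproc values taken from the diff above:

```shell
#!/bin/sh
# Re-derive the formula-based kernel parameters from nproc, as the
# kernel build does, and compare against the values in the diff.
check() {
    nproc=$1
    echo "nproc=$nproc"
    echo "  maxuprc = $(( nproc * 8 / 10 ))"   # ((nproc*8)/10)
    echo "  msgtql  = $nproc"                  # (nproc)
    echo "  msgmap  = $(( nproc + 2 ))"        # (msgtql+2)
    echo "  msgseg  = $(( nproc * 4 ))"        # (msgtql*4)
    echo "  ninode  = $(( 8 * nproc + 2048 ))" # (8*nproc+2048)
    echo "  semmnu  = $(( nproc - 4 ))"        # (nproc-4)
}
check 4200   # working box
check 2068   # broken box
```

Every formula-derived line in the diff matches its nproc, so the real deltas reduce to nproc, maxfiles_lim, nfile, nkthread, and physical_io_buffers.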
Replaces my unrecoverable 2001 profile.
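To see why the daemon dies, it may help to run a second sshd in the foreground in debug mode on an alternate port; the last lines before the drop usually name the failing operation. (The path below is an assumption; depending on how OpenSSH was installed on HP-UX it may live under /opt/ssh/sbin instead.)

```shell
# Run an extra sshd in the foreground with maximum debug output,
# logging to stderr (-e), on port 2222 so the production daemon
# is untouched.
/usr/sbin/sshd -d -d -d -p 2222 -e
# then, from a client:
#   ssh -v -p 2222 user@brokenbox
```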
3 REPLIES
09-20-2005 05:14 AM
Re: Two Itanium systems: SSH works one place, fails another
I am not an expert on kernel tuning, but your problematic server's kernel parameters look a little too low. If the two servers are identical hardware-wise, why not bring them up to the same values?
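For reference, and assuming an 11i release (the post doesn't say which version these boxes run), the usual way to raise these values is kmtune plus a kernel rebuild on 11i v1, or kctune on 11i v2 and later:

```shell
# 11i v1: stage the new value, rebuild the kernel, then reboot
kmtune -s nproc=4200
mk_kernel
# ...reboot to activate the new kernel

# 11i v2+: kctune applies dynamic tunables immediately and tells
# you when a change still requires a reboot
kctune nproc=4200
```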
________________________________
UNIX because I majored in cryptology...
09-20-2005 05:24 AM
Re: Two Itanium systems: SSH works one place, fails another
I agree with you, and we are bringing them into line. But when I look at what the parameters actually are, I don't see how any of them could cause trouble on what is basically an idle server.
Replaces my unrecoverable 2001 profile.
09-20-2005 05:29 AM
Re: Two Itanium systems: SSH works one place, fails another
If you have glance installed, did you try peeking at the CPU/memory usage when firing up sshd, and then over time as people come and go over ssh, to see how it performs? As I said, I cannot relate any of the parameters to any specific activity on a server which is otherwise idle, but in some cases smaller-than-expected values cause unforeseen ill effects. I have lived through quite a few of those, though not necessarily ssh-related. If you have a full glance/perfview license, you can analyze the historical data from the start of sshd to its demise and see whether any part of the system chokes.
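Even without a full license, a crude way to watch sshd while clients connect is to poll its process-table entry; setting UNIX95 enables the XPG4 options of HP-UX ps, including -C and -o. (A sketch, assuming standard HP-UX tooling; glance -p <pid> gives the interactive per-process drill-down.)

```shell
# Sample sshd's size and CPU every 5 seconds for a minute while
# test connections come and go; UNIX95= turns on XPG4 ps behavior.
i=0
while [ $i -lt 12 ]; do
    UNIX95= ps -C sshd -o pid,vsz,pcpu,args
    sleep 5
    i=$(( i + 1 ))
done
```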
________________________________
UNIX because I majored in cryptology...
UNIX because I majored in cryptology...