1U Server Packing density
07-22-2010 08:15 AM
I am now putting my DL380G4ps (x3) in a rack and have a question on the packing density/heat dissipation with these servers.
I had planned on installing them in adjacent slots, i.e., one on top of the other, but have noticed that there is hardly any clearance between them and that they have ventilation holes in the top of the cases.
I have enough room to leave a whole "U" slot between them, but that won't look too neat.
Alternatively, I could leave one hole (1/3 U) between them to allow ventilation and look at least a little better.
The rack is not really configured for optimum ventilation, with blanking plates missing etc. (I know, I really should fix it), but I wondered if anyone had any thoughts on mounting the servers?
I guess they are designed to be mounted with no empty space between them, but they do generate a lot of heat and I thought maybe leaving some extra space might take some load off the fans and/or let them run a little cooler.
Any thoughts?
regards
Dave
Solved!
07-22-2010 09:58 AM
I think HP did mean for them to be packed in like sardines; those fans are a bit loud, but they move a lot of air.
I say you can go either way, either far apart or right on top of each other. The most important thing is to get those blanking plates in! (Even plastic or foam boards will work for large gaps.)
07-22-2010 10:29 AM
Re: 1U Server Packing density
The important thing is providing sufficient cool air to the servers. Are you using ambient, rack-top, or subfloor cooling? The main problem with space between servers is that some of the cold air will dissipate into that space (when uncovered). This can be mitigated with cold air forced up from the floor and down from the top of the rack, but both coolers will still work harder to maintain the temperature than they would if the blank spots weren't there...
The airflow is designed to be lateral only -- warm air should not leak out of the top, but rather be forced out the back. Leaving empty Us or even empty holes is wasting space. Bottom line, rack them together to decrease the overall cooling requirements!
good luck!
07-22-2010 04:25 PM
Re: 1U Server Packing density
If you pack the servers in adjacent slots, you should be very diligent in your cable management practices, to enable good airflow and easy maintenance. If the cabling you're installing now is intended to stay the same during the entire lifetime of the server, go ahead and pack them tightly.
On the other hand, if these servers are going to require frequent cabling or other hardware configuration changes (we have a few test/lab servers like that), and/or the cable management practices at your site tend to be on the looser side, or the servers just have to have a lot of cables plugged in, then leaving 1U of space between the servers might save you from some frustration in the future. But if you do that, you really should put blanking plates on the empty slots.
Trying to leave just 1/3 U of space between the servers may or may not work, depending on your rack type. But I think it's easier to find 1 U blanking plates than 1/3 U ones.
You should also consider the cooling budget at your site: what is the total amount of heat per rack (in watts or BTUs) the site's cooling systems can handle? If a fully populated rack would produce more heat in a single location than the cooling system can handle, and there are no plans to expand the cooling system, putting the servers in adjacent slots could create a hot spot.
If your site's cooling system is working at close to 100% capacity, it might be necessary to make avoidance of hot spots the primary guideline for server placement.
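That heat-budget check is just arithmetic: essentially every watt a server draws comes back out as heat, at roughly 3.412 BTU/hr per watt. A quick sketch, with hypothetical server wattages (not measured DL380 G4p figures):

```python
# Back-of-the-envelope rack heat budget: watts in = heat out.
# The per-server wattages below are hypothetical placeholders.

BTU_HR_PER_WATT = 3.412  # 1 W of electrical draw ~= 3.412 BTU/hr of heat

def rack_heat_btu_hr(server_watts):
    """Total heat load of one rack in BTU/hr, given per-server draw in watts."""
    return sum(server_watts) * BTU_HR_PER_WATT

# Three 1U servers drawing ~400 W each (assumed figure):
print(round(rack_heat_btu_hr([400, 400, 400])))  # -> 4094 BTU/hr
```

Compare that number against the cooling capacity available at the rack's location before deciding how tightly to pack.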
MK
07-25-2010 10:56 PM
Re: 1U Server Packing density
Thanks a lot for the very useful and comprehensive answers.
It sounds like there is not too much of an issue with mounting the servers in adjacent slots then, provided the rack cooling and air flow is set up correctly of course. As I said, the rack is missing a few blanking plates, but I will correct that this week before I power the servers up in the rack.
Saberus, I know what you mean about the fan noise! They are LOUD - I set the servers up on the desk in my office and had to stick some ear plugs in if I was going to preserve my hearing. Those fans certainly move the air through the box, but what a racket!
Unfortunately WFH-WI, I am stuck with ambient cooling. Forced cooling would obviously be preferable, but it is not an option for me, I'm afraid. As you say, running the servers on my desk, there was no appreciable leakage out of the ventilation holes in the top of the case, so I am less worried than I was about blocking them off. There is only about 3-5 mm of clearance between the servers when they are racked in adjacent slots, and those top holes seem to serve very little purpose.
Matti, thanks - yes, cable management is a concern. Hopefully, though, once the servers are up, they should not be subject to much disturbance so I will concentrate on the air flow, rather than the maintenance aspects. Worst case, I'd just have to disconnect the cables before sliding the servers out. I can't really envisage a scenario where I'd want to keep them powered up/connected when slid out on the rails.
If this becomes a problem, then I will go back to the idea of leaving 1U (with blanking plates) between them. You were right about the 1/3U gaps too - I was really just thinking out loud when I mentioned it. I had not realised until I looked closer that the rack holes are not uniform over the 42U height: they come in groups of three per U, with the spacing between some of the holes differing, so leaving a 1/3U gap would not work - the rails need to be fitted into the right three holes of a U slot.
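That non-uniform spacing is what a standard EIA-310 rack specifies: within each 1.75-inch U the three hole centers sit 0.625 in apart, but only 0.5 in separates the last hole of one U from the first hole of the next, so rails shifted by one hole no longer line up. A minimal sketch, assuming an EIA-310 rack:

```python
# EIA-310 rack hole geometry: each 1U (1.75 in) has three holes, centered
# 0.25 in, 0.875 in, and 1.5 in from the bottom edge of the U. Successive
# gaps are therefore 0.625, 0.625, then 0.5 in across the U boundary.

def hole_centers(units):
    """Hole-center positions (inches) for the first `units` rack units."""
    centers = []
    for u in range(units):
        base = u * 1.75
        centers.extend([base + 0.25, base + 0.875, base + 1.5])
    return centers

cs = hole_centers(2)
gaps = [b - a for a, b in zip(cs, cs[1:])]
print(gaps)  # -> [0.625, 0.625, 0.5, 0.625, 0.625]
```

The repeating 0.625/0.625/0.5 pattern is why a "1/3 U" offset can't work: the rail hardware only fits when aligned to the three holes of a whole U.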
Anyway, thanks again for the help - off to find those blanking plates now!
Regards
Dave