- Nimble Network - 2 isolated switches
01-23-2014 09:13 AM
Is anyone running a configuration where each Nimble 10G connection goes into an isolated switch?
Most hosts are ESX.
Would I have a software initiator for each "fabric", or a single software initiator with a vmkernel interface on each? (Each interface would end up on a separate subnet/network.)
Brian
Solved!
01-24-2014 02:10 PM
Solution: Yes, we have two isolated switches and networks for the iSCSI traffic.
Your ESXi hosts can have only a single software initiator, but you need to create two separate virtual networks (unless you're splitting things up with VLANs).
Here is a screen shot from vCenter.
Two separate virtual switches (note that I set up the vSwitches for local VM iSCSI traffic too).
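The single-initiator, two-vmkernel layout described above can be sketched with esxcli from the ESXi shell. This is a rough sketch, not a full procedure: the adapter name (vmhba64), vmkernel port names (vmk1/vmk2), and the discovery address are placeholders — substitute the values from your own environment.

```shell
# Enable the software iSCSI initiator (only one exists per host)
esxcli iscsi software set --enabled=true

# Find the software adapter's name (assumed to be vmhba64 below)
esxcli iscsi adapter list

# Bind one vmkernel port per fabric to the single initiator:
# vmk1 lives on the vSwitch uplinked to switch 1,
# vmk2 on the vSwitch uplinked to switch 2
esxcli iscsi networkportal add --adapter=vmhba64 --nic=vmk1
esxcli iscsi networkportal add --adapter=vmhba64 --nic=vmk2

# Point dynamic discovery at the array's iSCSI discovery IP
# (placeholder address)
esxcli iscsi adapter discovery sendtarget add --adapter=vmhba64 --address=192.168.1.50:3260
```

With both vmkernel ports bound, the one software initiator sees paths through both fabrics and multipathing handles failover between them.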
Here is a rough logical wiring diagram I made up for our networks way back.
Let me know if you have any other questions.
- Shawn
01-10-2020 12:27 AM
Re: Nimble Network - 2 isolated switches
I have experimented with 2 isolated switches and discovered issues. For example:
ControllerA, eth1a to switch 1
ControllerA, eth1b to switch 2
ControllerB, eth1a to switch 1
ControllerB, eth1b to switch 2
I tried both same-subnet and separate-subnet configurations.
If I connect a single PC on the same subnet to switch 1, I can ping the active controller's management IP and reach the web admin.
If I connect it to switch 2, I can neither ping nor access the web admin.
Is this normal behaviour? If so, at what point or in what scenario would access be available via switch 2?
I appreciate that under normal circumstances two NICs would be used on the ESXi host (which I will be adding after failover testing), but that still does not explain the loss of communication. I was under the impression that load balancing would occur over both controller Ethernet management ports. The same behaviour occurs with the iSCSI data configuration.
Any ideas?
Many thanks
Chris
01-10-2020 05:51 AM
Re: Nimble Network - 2 isolated switches
Very normal.
The Nimble storage runs in active-standby mode: one controller is active, the other on standby. The management IP address is virtual and stays with the currently active controller. If you initiate a switchover, or if something happens to switch 1 and connectivity is lost, the other controller takes over and the management IP moves with it.
As for iSCSI traffic, the data IP addresses are also virtual, again staying with the currently active controller. The host's multipathing will use iSCSI paths as needed. As a best practice, get the Nimble Connection Manager (NCM) for your OS and install it on every host; expect a reboot to be needed after NCM is installed. NCM sets host parameters correctly and monitors the host's traffic, directing paths to the active Nimble controller as needed.
Each time you go to the Hardware page, the array runs a system check to see whether the other controller can be made active. If there is a problem, a banner will appear at the top of the page; in that case, contact Nimble Support to resolve the issue.
Note: While I am an HPE Employee, all of my comments (whether noted or not), are my own and are not any official representation of the company
01-10-2020 06:43 AM - edited 01-10-2020 06:45 AM
Re: Nimble Network - 2 isolated switches
I understand that the SAN controllers are in active/standby configuration and only one is active at any one time.
My confusion is over the active controller and the 2 Ethernet management ports connected to 2 separate switches. The management IP can only be pinged from 1 isolated switch.
If I turn off the switch that allows pings to the management IP and connect to the other switch, it's now able to ping. This tells me that management isn't running in the load-balanced/mirrored fashion the documentation describes?
Many thanks
Chris
01-10-2020 07:57 AM
Re: Nimble Network - 2 isolated switches
Hello,
This doesn't sound right, especially for the iSCSI data, as you say.
If your switches are not stacked or otherwise interconnected, you should not operate in a single-subnet environment. You must create separate subnets, one per switch, so that each fabric's IP addresses are reachable within its own switch.
For example:
10.10.10.x - NIC 1 (server) - Switch 1 - eth0a (on both controller A & B)
10.10.20.x - NIC 2 (server) - Switch 2 - eth0b (on both controller A & B).
You must also have Nimble Connection Manager installed on your hosts.
In the above configuration, you will be able to ping the 10.10.10.x subnet and its iSCSI discovery address only through NIC 1; NIC 2 will only be able to ping and connect to addresses on 10.10.20.x.
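That per-subnet reachability can be checked from the ESXi host with vmkping, which forces each probe out through a specific vmkernel interface. The interface names and target addresses below are assumptions matching the example subnets above, not values from the original post.

```shell
# From the ESXi shell: test each fabric independently.
# Assumed: vmk1 sits on 10.10.10.x (switch 1), vmk2 on 10.10.20.x (switch 2),
# and the array answers on .100 in each subnet.

# Expected to succeed: vmk1 reaching eth0a on its own subnet
vmkping -I vmk1 10.10.10.100

# Expected to succeed: vmk2 reaching eth0b on its own subnet
vmkping -I vmk2 10.10.20.100

# Expected to FAIL: a cross-fabric probe; the switches are isolated,
# so there is no path between the two subnets
vmkping -I vmk1 10.10.20.100
```

If the cross-fabric probe succeeds, the switches are not actually isolated; if a same-subnet probe fails, check the vmkernel port's subnet assignment and cabling before looking at the array.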
If you get stuck, I highly recommend you engage with Nimble Support.
twitter: @nick_dyer_