HP C7000 Recommended 10G Switches
11-09-2014 09:20 AM
Hi Folks,
I have a customer who currently has an HP C7000. Each blade within the chassis has an FCoE card, and we wanted to use these cards to connect to the 10GbE infrastructure for the SAN.
My questions are the following:
1. Can we connect the FCoE cards directly to the 10GbE switches, or do the switches require DCB capabilities if we are only using 10Gb functionality?
2. My suggestion was to purchase 10GbE switches rather than pass-through modules or Virtual Connect. Are there any recommendations for 10GbE switches on the HP side that are compatible with Nimble and don't have issues?
Output from the blade server:
HP 1Gb Ethernet Pass-Thru Module for c-Class BladeSystem
406740-B21
ProLiant BL685c G7
CPU and Memory Information
CPU 1: AMD Opteron(TM) Processor 6238 (12 Cores)
CPU 2: AMD Opteron(TM) Processor 6238 (12 Cores)
CPU 3: AMD Opteron(TM) Processor 6238 (12 Cores)
CPU 4: AMD Opteron(TM) Processor 6238 (12 Cores)
Memory: 131072 MB
FlexFabric Embedded Ethernet
Ethernet (NIC 1) LOM:1-a 28:92:4A:2C:A4:7A
iSCSI HBA (iSCSI 1) LOM:1-b 28:92:4A:2C:A4:7B
FCoE HBA LOM:1-b 10:00:28:92:4A:2C:A4:7B
Ethernet (NIC 2) LOM:2-a 28:92:4A:2C:A4:7E
iSCSI HBA (iSCSI 2) LOM:2-b 28:92:4A:2C:A4:7F
Regards
Varghese
11-10-2014 12:21 PM
Solution
FlexFabric Embedded Ethernet (LOM) is mapped to Interconnect Bays 1 & 2. I assume you are using GbE2c Ethernet Blade Switches, which are 1Gb switches.
1. Nimble doesn't support FCoE.
2. You would need to buy NC542m/554m mezzanine cards with Virtual Connect Flex-10/10D modules in Interconnect Bays 3 & 4, and connect those Flex-10 modules to a 10Gb switch. We use Brocade VDX 6740 switches for our iSCSI network.
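Once the Flex-10 uplinks are cabled to the 10Gb switches, a quick sanity check is to confirm the blade can reach the Nimble discovery address over the iSCSI subnet on TCP port 3260 (the standard iSCSI port). Below is a minimal sketch of such a check; the portal IP is a hypothetical placeholder, not a value from this thread.

```python
#!/usr/bin/env python3
"""Quick reachability check for an iSCSI discovery portal.

Minimal sketch: the portal address below is a hypothetical placeholder;
substitute the Nimble discovery IP configured on your iSCSI subnet.
"""
import socket

PORTAL_IP = "192.168.50.10"   # hypothetical Nimble discovery IP (placeholder)
ISCSI_PORT = 3260             # standard iSCSI TCP port
TIMEOUT_S = 5.0

def portal_reachable(ip: str, port: int = ISCSI_PORT, timeout: float = TIMEOUT_S) -> bool:
    """Return True if a TCP connection to the portal succeeds within the timeout."""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError as exc:
        print(f"Cannot reach {ip}:{port} - {exc}")
        return False

if __name__ == "__main__":
    if portal_reachable(PORTAL_IP):
        print(f"{PORTAL_IP}:{ISCSI_PORT} is reachable; proceed with iSCSI discovery and login.")
    else:
        print("Check the iSCSI VLAN, Flex-10 uplinks, and switch port configuration.")
```

If the port is unreachable, the usual suspects are the VLAN tagging on the Flex-10 uplinks and the switch ports facing the array, rather than the blade itself.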