06-09-2022 11:12 AM - last edited on 06-09-2022 08:43 PM by support_s
vVol-based VMs' disks are connected simultaneously from 2 different hosts
Hello Everyone,
This is my very first post about Nimble storage arrays on this forum, so apologies if I posted it in the wrong place.
We have a 4-node cluster:
- ESXi 7U2
- iSCSI-based connection to vVols
- All best practices are followed per VMware documentation for iSCSI connections and port binding
Storage:
- HPE HF20 on 5.2.1.700
- It hosts a mixture of VMFS and vVol datastores
- The array is running Active/Standby
Issue we are currently facing:
Some VMs' VMDKs are connected (not accessed) from 2 or sometimes 3 hosts simultaneously (this is visible only from the SAN web GUI). Following advice from HPE Nimble support and VMware, we turned off "iSCSIunsupportedblockandpages", which helped fix some issues to a certain extent, but did not resolve the problem mentioned in the subject.
(What we originally faced was that no storage APIs, such as vMotion or snapshots, could be executed against some VMs; those VMs showed 0 B of space. That issue is gone after the change above, but some VMs are still being accessed by 2 or more hosts.)
We have already opened a couple of tickets for 2 locations showing similar symptoms, but to this date the problem is not fixed.
06-09-2022 02:11 PM
Re: vVol-based VMs' disks are connected simultaneously from 2 different hosts
You mentioned port binding. Do you have active port binding? How many iSCSI vmknics do you have, and are they on the same L3 network?
06-09-2022 02:15 PM
Re: vVol-based VMs' disks are connected simultaneously from 2 different hosts
Regarding the subject of this post: when a VM is live-migrated from one host to another, you will see its disks have connections from both hosts for a brief period of time.
Is the vVol datastore still reporting its size as zero bytes in the vCenter UI?
Do you have an active case open with Nimble Support?
06-09-2022 02:21 PM
Re: vVol-based VMs' disks are connected simultaneously from 2 different hosts
- The vCenter UI size reporting is fixed by implementing the unofficial change for ESXi 7 (disabling "iSCSIUnsupportedBlockandPages")
- The connection from 2 or more hosts is not brief but continuous
- Binding is done following best practices and it is on L2; no routing happens in between
06-09-2022 02:29 PM
Re: vVol-based VMs' disks are connected simultaneously from 2 different hosts
So there are multiple (more than 1) vmknics, they are on different L3 networks, and iSCSI port binding is enabled?
06-09-2022 02:52 PM
Re: vVol-based VMs' disks are connected simultaneously from 2 different hosts
iSCSI network:
- 2 vmnics (physical ports)
- 2 VMKs, each under a separate PG
- Each PG has a static failover order (Active/Unused) mapping to vmnicN & vmnicN+1
Port binding:
- Port binding is set to use both iSCSI VMKs
- Both the ESXi iSCSI initiators and the iSCSI traffic on the SAN are on Layer 2 (no Layer 3)
06-09-2022 09:22 PM
Re: vVol-based VMs' disks are connected simultaneously from 2 different hosts
The IP subnets of the bound vmknics: are they the same or different?
I'm very sorry for asking this question again, but I still cannot tell from your answers.
06-10-2022 07:13 AM
Re: vVol-based VMs' disks are connected simultaneously from 2 different hosts
No problem.
The IP subnet of all VMNICs associated with the iSCSI VMKs, and of the SAN iSCSI traffic itself, is the same (which means everything works on Layer 2).
No Layer 3 routing is involved; when I said Layer 2, I meant all subnets are the same.
06-13-2022 12:12 PM
Re: vVol-based VMs' disks are connected simultaneously from 2 different hosts
What we need to know is the IPs of the vmknics and their netmasks => are they on different L3 networks?
You do not have to share your exact IPs, but please let me know. For example:
* vmk1 (iSCSI1): A.B.1.2/24
* vmk2 (iSCSI2): A.B.3.4/24
Here, vmk1 and vmk2 are on DIFFERENT L3 networks (with a /16 mask, the same addresses would be on the same one).
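The same-or-different-L3 question above can be checked mechanically with Python's standard ipaddress module. This is a minimal sketch; the 10.20.x.y addresses are placeholders standing in for the redacted A.B values, not the poster's real network:

```python
import ipaddress

def same_l3(ip_a: str, ip_b: str) -> bool:
    """True if both interface addresses fall in the same L3 network."""
    return ipaddress.ip_interface(ip_a).network == ipaddress.ip_interface(ip_b).network

# With a /24 mask, .1.2 and .3.4 land in DIFFERENT networks:
print(same_l3("10.20.1.2/24", "10.20.3.4/24"))   # False -> different L3
# With a /16 mask, the very same addresses share one network (10.20.0.0/16):
print(same_l3("10.20.1.2/16", "10.20.3.4/16"))   # True -> same L3
```

This is why the netmask matters as much as the address: the third octet only separates the vmknics into different L3 networks when the prefix is /24 or longer.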
06-13-2022 01:21 PM
Re: vVol-based VMs' disks are connected simultaneously from 2 different hosts
That is exactly what I said, I think.
All VMKs share the same IP subnet:
VMK1 (iSCSI-1 PG) = A.B.C.15/24
VMK2 (iSCSI-2 PG) = A.B.C.16/24
SAN IP range (Nimble managed) = A.B.C.20/24 - A.B.C.28/24
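The layout above (both vmknics and the whole Nimble-managed range inside one /24) can be verified with a short check. This is a sketch using hypothetical 192.168.50.x placeholders for the redacted A.B.C values:

```python
import ipaddress

def all_same_subnet(interfaces):
    """True if every address/prefix pair belongs to one L3 network."""
    networks = {ipaddress.ip_interface(i).network for i in interfaces}
    return len(networks) == 1

# Placeholder addresses mirroring the layout described in this post
iscsi_setup = [
    "192.168.50.15/24",  # VMK1 (iSCSI-1 PG)
    "192.168.50.16/24",  # VMK2 (iSCSI-2 PG)
    "192.168.50.20/24",  # first Nimble-managed SAN IP
    "192.168.50.28/24",  # last Nimble-managed SAN IP
]
print(all_same_subnet(iscsi_setup))  # True -> single L3 network
```

A single-subnet result is the configuration where iSCSI port binding is the recommended setup per VMware's multipathing guidance; split subnets would point toward dropping port binding instead.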
06-13-2022 02:03 PM
Re: vVol-based VMs' disks are connected simultaneously from 2 different hosts
Thanks for confirming. Let's do a little more debugging. Please let me know which steps fail.
* Run "esxcli storage vvol storagecontainer list" on all ESXi hosts and make sure Size(MB) shows the expected number and "Accessible: true" for all vVol datastores.
* Run "esxcli storage vvol protocolendpoint list" and make sure a device is listed as "eui.something" and shows "Accessible: true" and "Configured: true".
* Note the eui ID of the device and run "esxcfg-mpath -b -d eui.something" => note how many paths you see. If you have NCM (Nimble Connection Manager) installed, you should see 2 paths per vmknic.
* If you are able to SSH into your Nimble array (user: admin), we can try a few more:
* "vm --list": this shows all your vVol VMs. Pick one that has this problem with multiple connections.
* "vm --info <vvol-vm-name>": this shows all volumes for that VM. Pick the volume name that ends with VMDK.
* "vol --info <vvol-vmdk>": check "Access Control List:"; only one "Initiator Group:" should be listed in this section. Then check "Connected Initiators:" and make sure only IQNs of the same host show up here.
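The last check above boils down to: collect the connected initiator IQNs for a VMDK and count how many distinct hosts they belong to. A minimal sketch of that logic, assuming ESXi-style software-iSCSI IQNs of the form iqn.1998-01.com.vmware:hostname-suffix (the hostnames and IQN suffixes below are hypothetical, and the exact Nimble CLI output format is not reproduced here, so the input is just a list of IQN strings):

```python
def distinct_hosts(iqns):
    """Return the set of hosts the given initiator IQNs belong to.

    Assumes ESXi software-iSCSI naming, e.g.
    iqn.1998-01.com.vmware:esx01-12ab34cd, where the text after the
    last ':' and before the final '-' identifies the host.
    """
    hosts = set()
    for iqn in iqns:
        name = iqn.rsplit(":", 1)[-1]      # e.g. "esx01-12ab34cd"
        hosts.add(name.rsplit("-", 1)[0])  # e.g. "esx01"
    return hosts

# Two IQNs from one host is normal multipathing; two hosts is the problem
# described in this thread.
print(distinct_hosts([
    "iqn.1998-01.com.vmware:esx01-12ab34cd",
    "iqn.1998-01.com.vmware:esx01-56ef78ab",
]))  # {'esx01'} -> single host, as expected
```

If the returned set contains more than one host for a vVol VMDK outside of an active vMotion, that matches the symptom the original poster describes.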