Friday - last edited 2 hours ago by support_s
GFS2 and Failover of a VM if one Host of the 3 Hosts is lost.
Hi All,
Currently testing a 3-node hpe-vm version 8.10 cluster with iSCSI storage - for full transparency, using a hyperconverged TrueNAS VM on Node3.
I am able to mount the GFS2 datastore:
Node1 - runs the Morpheus VM on local storage
Node2 - runs a test Windows 11 VM
Node3 - runs the TrueNAS VM providing iSCSI storage to all 3 nodes for the GFS2 datastore
I am able to move the Windows 11 VM from node to node.
However, when I hard power off my ProLiant DL360 Gen10 (Node2), the VM does not fail over and the following exception is thrown on the Morpheus node.
Question: Can an hpe-vm cluster with 3 nodes using a GFS2 datastore survive the loss of one node?
2025-11-28_11:47:53.31429 'Exception in thread "Thread-7773" org.apache.guacamole.GuacamoleUpstreamTimeoutException: Connection to guacd timed out.
2025-11-28_11:48:08.33097 at org.apache.guacamole.io.ReaderGuacamoleReader.read(ReaderGuacamoleReader.java:180)
2025-11-28_11:48:08.33097 at org.apache.guacamole.io.ReaderGuacamoleReader.readInstruction(ReaderGuacamoleReader.java:195)
2025-11-28_11:48:08.33098 at org.apache.guacamole.protocol.FilteredGuacamoleReader.readInstruction(FilteredGuacamoleReader.java:80)
2025-11-28_11:48:08.33098 at org.apache.guacamole.protocol.FilteredGuacamoleReader.readInstruction(FilteredGuacamoleReader.java:80)
2025-11-28_11:48:08.33098 at org.apache.guacamole.protocol.FilteredGuacamoleReader.read(FilteredGuacamoleReader.java:63)
2025-11-28_11:48:08.33099 at com.morpheus.remote.MorpheusGuacamoleWebsocketHandler.relayGuacamoleMessages(MorpheusGuacamoleWebsocketHandler.groovy:175)
2025-11-28_11:48:08.33099 at jdk.internal.reflect.GeneratedMethodAccessor9266.invoke(Unknown Source)
2025-11-28_11:48:08.33100 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-11-28_11:48:08.33100 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-11-28_11:48:08.33100 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-11-28_11:48:08.33100 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-11-28_11:48:08.33100 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:362)
2025-11-28_11:48:08.33101 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:61)
2025-11-28_11:48:08.33102 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:194)
2025-11-28_11:48:08.33102 at com.morpheus.remote.MorpheusGuacamoleWebsocketHandler$_afterConnectionEstablished_closure1.doCall(MorpheusGuacamoleWebsocketHandler.groovy:111)
2025-11-28_11:48:08.33102 at com.morpheus.remote.MorpheusGuacamoleWebsocketHandler$_afterConnectionEstablished_closure1.doCall(MorpheusGuacamoleWebsocketHandler.groovy)
2025-11-28_11:48:08.33102 at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2025-11-28_11:48:08.33102 at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
2025-11-28_11:48:08.33103 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-11-28_11:48:08.33103 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-11-28_11:48:08.33103 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-11-28_11:48:08.33103 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-11-28_11:48:08.33103 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:274)
2025-11-28_11:48:08.33104 at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1030)
2025-11-28_11:48:08.33105 at groovy.lang.Closure.call(Closure.java:427)
2025-11-28_11:48:08.33105 at groovy.lang.Closure.call(Closure.java:406)
2025-11-28_11:48:08.33105 at groovy.lang.Closure.run(Closure.java:498)
2025-11-28_11:48:08.33105 at java.base/java.lang.Thread.run(Unknown Source)
2025-11-28_11:48:08.33105 Caused by: java.net.SocketTimeoutException: Read timed out
2025-11-28_11:48:08.33105 at java.base/sun.nio.ch.NioSocketImpl.timedRead(Unknown Source)
2025-11-28_11:48:08.33106 at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
2025-11-28_11:48:08.33106 at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
2025-11-28_11:48:08.33106 at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
2025-11-28_11:48:08.33106 at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
2025-11-28_11:48:08.33106 at java.base/sun.nio.cs.StreamDecoder.readBytes(Unknown Source)
2025-11-28_11:48:08.33107 at java.base/sun.nio.cs.StreamDecoder.implRead(Unknown Source)
2025-11-28_11:48:08.33107 at java.base/sun.nio.cs.StreamDecoder.read(Unknown Source)
2025-11-28_11:48:08.33107 at java.base/java.io.InputStreamReader.read(Unknown Source)
2025-11-28_11:48:08.33107 at org.apache.guacamole.io.ReaderGuacamoleReader.read(ReaderGuacamoleReader.java:169)
2025-11-28_11:48:08.33108 ... 27 more
2025-11-28_11:48:09.66597 '[2025-11-28 11:48:09,665] [http-nio-127.0.0.1-8080-exec-60] INFO c.m.ApplianceInterceptor - ERROR /remote/socket org.springframework.dao.DataAccessResourceFailureException: Could not obtain current Hibernate Session; nested exception is org.hibernate.HibernateException: No Session found for current thread
2025-11-28_11:48:09.66599 '[leaseToken=, clientOs=Windows, containerId=, consoleMode=hypervisor, consoleKeymap=en-us-qwerty, serverId=15, allocationId=, remoteApp=, GUAC_ID=56767fcf-4d72-4f9e-9dcc-0bc6cc048042, GUAC_AUDIO=audio/L8, GUAC_AUDIO=audio/L16, GUAC_WIDTH=1024, GUAC_HEIGHT=768, GUAC_IMAGE=image/jpeg, GUAC_IMAGE=image/png, GUAC_IMAGE=image/webp]
2025-11-28_11:48:09.66761 '[2025-11-28 11:48:09,667] [http-nio-127.0.0.1-8080-exec-60] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Connection established: StandardWebSocketSession[id=7024e529-41c1-c6ff-3abe-3989c9bb423d, uri=ws://10.10.80.37/remote/socket?leaseToken=&clientOs=Windows&containerId=&consoleMode=hypervisor&consoleKeymap=en-us-qwerty&serverId=15&allocationId=&remoteApp=&GUAC_ID=56767fcf-4d72-4f9e-9dcc-0bc6cc048042&GUAC_AUDIO=audio%2FL8&GUAC_AUDIO=audio%2FL16&GUAC_WIDTH=1024&GUAC_HEIGHT=768&GUAC_IMAGE=image%2Fjpeg&GUAC_IMAGE=image%2Fpng&GUAC_IMAGE=image%2Fwebp]: 1
2025-11-28_11:48:09.66763 ''[2025-11-28 11:48:09,667] [http-nio-127.0.0.1-8080-exec-60] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Instantiating a Guacamole Remote Tunnel [requestQuery:leaseToken=&clientOs=Windows&containerId=&consoleMode=hypervisor&consoleKeymap=en-us-qwerty&serverId=15&allocationId=&remoteApp=&GUAC_ID=56767fcf-4d72-4f9e-9dcc-0bc6cc048042&GUAC_AUDIO=audio/L8&GUAC_AUDIO=audio/L16&GUAC_WIDTH=1024&GUAC_HEIGHT=768&GUAC_IMAGE=image/jpeg&GUAC_IMAGE=image/png&GUAC_IMAGE=image/webp, GUAC_ID:56767fcf-4d72-4f9e-9dcc-0bc6cc048042, consoleMode:hypervisor, GUAC_IMAGE:[image/jpeg, image/png, image/webp], clientOs:Windows, GUAC_AUDIO:[audio/L8, audio/L16], GUAC_WIDTH:1024, consoleKeymap:en-us-qwerty, GUAC_HEIGHT:768, serverId:15] - localhost - 4822
2025-11-28_11:48:09.66769 ''[2025-11-28 11:48:09,669] [RxCachedThreadScheduler-11] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Checking zone list user can view
2025-11-28_11:48:09.66965 ''[2025-11-28 11:48:09,683] [RxCachedThreadScheduler-11] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Console Mode: hypervisor null
2025-11-28_11:48:09.68318 'SSL Tunnel Established: 24269
2025-11-28_11:48:12.76103 Hostname: 127.0.0.1
2025-11-28_11:48:12.76104 '[2025-11-28 11:48:12,760] [RxCachedThreadScheduler-11] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - setting keymap to: en-us-qwerty
2025-11-28_11:48:12.76104 ''[2025-11-28 11:48:12,761] [http-nio-127.0.0.1-8080-exec-60] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Setting up tunnel
2025-11-28_11:48:12.76128 ''[2025-11-28 11:48:12,768] [http-nio-127.0.0.1-8080-exec-60] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Returning tunnel!
2025-11-28_11:48:12.76855 ''[2025-11-28 11:48:12,774] [Thread-7776] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Guacamole Session Closed.
(the same connect / tunnel / session-closed sequence repeats at 11:48:33 and 11:49:02 with new session IDs)
Nodes 1 and 3 are still able to browse and access the storage throughout this period.
When I power Node2 back on, the cluster recovers, but the VM is gone - no record of it on the server:
Nov 28 10:28:41 ih-hpenode2 libvirtd[3097]: Unable to get XATTR trusted.libvirt.security.ref_dac on /mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/ih-wintest/hvm_8-disk-0: No such file or directory
Nov 28 10:28:41 ih-hpenode2 libvirtd[3097]: Unable to remove disk metadata on vm ih-wintest from /mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/ih-wintest/hvm_8-disk-0 (disk target vda)
Nov 28 11:15:33 ih-hpenode2 libvirtd[2990]: Domain id=1 name='ih-wintest' uuid=5568a3d8-dc41-4fa8-97f3-364cc5ab22b1 is tainted: custom-argv
Nov 28 11:31:08 ih-hpenode2 libvirtd[2990]: Domain id=2 name='ih-wintest' uuid=5568a3d8-dc41-4fa8-97f3-364cc5ab22b1 is tainted: custom-argv
Nov 28 11:31:08 ih-hpenode2 libvirtd[2990]: Domain id=2 name='ih-wintest' uuid=5568a3d8-dc41-4fa8-97f3-364cc5ab22b1 is tainted: host-cpu
root@ih-hpenode2:/home/smadmin# systemctl status libvirtd
● libvirtd.service - libvirt legacy monolithic daemon
Loaded: loaded (/usr/lib/systemd/system/libvirtd.service; enabled; preset: enabled)
Active: active (running) since Fri 2025-11-28 12:26:48 UTC; 54min ago
TriggeredBy: ● libvirtd-ro.socket
● libvirtd-admin.socket
● libvirtd.socket
Docs: man:libvirtd(8)
https://libvirt.org/
Main PID: 3313 (libvirtd)
Tasks: 20 (limit: 32768)
Memory: 36.0M (peak: 38.7M)
CPU: 5.773s
CGroup: /system.slice/libvirtd.service
└─3313 /usr/sbin/libvirtd
Nov 28 12:26:48 ih-hpenode2 systemd[1]: Starting libvirtd.service - libvirt legacy monolithic daemon...
Nov 28 12:26:48 ih-hpenode2 systemd[1]: Started libvirtd.service - libvirt legacy monolithic daemon.
Nov 28 12:26:48 ih-hpenode2 libvirtd[3313]: libvirt version: 10.0.0, package: 10.0.0-2ubuntu8.9 (Ubuntu)
Nov 28 12:26:48 ih-hpenode2 libvirtd[3313]: hostname: ih-hpenode2
Nov 28 12:26:48 ih-hpenode2 libvirtd[3313]: internal error: Cannot find start time for pid 3133
Nov 28 12:26:48 ih-hpenode2 libvirtd[3313]: internal error: Cannot find start time for pid 3162
Unable to move or start up the VM:
2025-11-28_13:16:22.27235 , error:error: failed to get domain 'ih-wintest'
2025-11-28_13:16:22.27235 , exitValue:1, errorData:error: failed to get domain 'ih-wintest'
2025-11-28_13:16:22.27236 ]
2025-11-28_13:16:22.27236 ''[2025-11-28 13:16:22,272] [RxCachedThreadScheduler-8] INFO c.m.p.KvmProvisionService - Move Results! [success:false, error:error: failed to get domain 'ih-wintest'
2025-11-28_13:16:22.27245 ]
2025-11-28_13:16:22.27246 ''[2025-11-28 13:16:22,275] [RxCachedThreadScheduler-8] INFO c.m.p.KvmProvisionService - Applying Placement Strategy: ih-wintest - auto
2025-11-28_13:16:22.27522 ''[2025-11-28 13:16:22,310] [RxCachedThreadScheduler-8] INFO c.m.c.KvmComputeUtility - Metadata Results: [success:false, command:sudo virsh metadata "ih-wintest" "https://www.morpheusdata.com" --config --key "morpheus" --set "<metadata><placement>auto</placement><local-storage>false</local-storage><system-vm>false</system-vm></metadata>", data:
2025-11-28_13:16:22.31018 , error:error: failed to get domain 'ih-wintest'
2025-11-28_13:16:22.31018 , exitValue:1, errorData:error: failed to get domain 'ih-wintest'
2025-11-28_13:16:22.31018 ]
2025-11-28_13:16:22.31018 ''[2025-11-28 13:16:22,340] [RxCachedThreadScheduler-8] INFO c.m.p.KvmProvisionService - Set Metadata Results: [success:false]
2025-11-28_13:16:22.34024 ''[2025-11-28 13:16:22,351] [RxCachedThreadScheduler-8] INFO c.m.ProcessService - updateProcess: 65 ih-wintest - running - Move Server
2025-11-28_13:16:22.35190 ''[2025-11-28 13:16:22,353] [RxCachedThreadScheduler-8] INFO c.m.ProcessService - Process Updated: 65 - ih-wintest - running - Move Server
2025-11-28_13:16:22.35373 ''[2025-11-28 13:16:22,354] [RxCachedThreadScheduler-8] INFO c.m.ProcessService - Process Updated: 65 - ih-wintest - running - Move Server
2025-11-28_13:16:22.35479 ''[2025-11-28 13:16:32,236] [appJobLow-16] INFO c.m.h.KvmBaseHostService - Cached Hypervisor Devices for ih-hpenode3 in 24939ms
2025-11-28_13:16:32.23663 ''[2025-11-28 13:16:32,642] [appJobLow-16] INFO c.m.h.KvmBaseHostService - Cached Datastores for ih-hpenode3 in 406ms
2025-11-28_13:16:32.64290 ''[2025-11-28 13:16:33,142] [appJobLow-16] INFO c.m.h.KvmBaseHostService - Cached Routers for ih-hpenode3 in 500ms
2025-11-28_13:16:33.14301 ''[2025-11-28 13:16:33,643] [appJobLow-16] INFO c.m.h.KvmBaseHostService - Cached Networks for ih-hpenode3 in 501ms
2025-11-28_13:16:33.64391 ''[2025-11-28 13:16:33,646] [appJobLow-16] INFO c.m.h.KvmBaseHostService - Cached Hypervisor ih-hpenode3 in 27359ms
2025-11-28_13:16:33.64628 ''[2025-11-28 13:16:34,475] [appJobLow-16] INFO c.m.h.KvmBaseHostService - Cached Cluster Virtual Machines in 829ms
2025-11-28_13:16:34.47649 ''[2025-11-28 13:17:05,100] [http-nio-127.0.0.1-8080-exec-38] INFO c.m.a.ServersController - servicePlans: [zoneId:1, serverId:15, _:1764333863445, controller:servers, action:servicePlans, id:49
The disks are still accessible, but I cannot power the VM on in virsh (it has disappeared):
root@ih-hpenode2:/home/smadmin# ls -l /mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/ih-wintest/
total 28021672
-rw-r--r-- 1 libvirt-qemu kvm 31919177728 Nov 28 11:36 hvm_8-disk-0
root@ih-hpenode2:/home/smadmin#
root@ih-hpenode2:/home/smadmin# virsh list --all
Id Name State
-----------------------------
- ih-hpevsa2 shut off
no ih-wintest - GONE!
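On the cluster layer itself, a 3-node cluster should stay quorate with one node down (2 of 3 votes is still a majority), so the question is really about the VM layer. A minimal check I would run on a surviving node, assuming the standard pcs/corosync tooling this stack uses:
# confirm the remaining partition still has quorum after Node2 is pulled
root@ih-hpenode1:/home/smadmin# pcs quorum status
# the same view straight from corosync
root@ih-hpenode1:/home/smadmin# corosync-quorumtool -s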
Friday
Re: GFS2 and Failover of a VM if one Host of the 3 Hosts is lost.
Did I miss something in the documentation? When deploying the VM to the GFS2 volume, it seems that only the disk is stored on the GFS2 volume, while the .xml file is stored on the local server:
root@ih-hpenode2:/etc/libvirt/qemu# ls -l
total 36
drwxr-xr-x 2 root root 4096 Nov 26 15:05 autostart
-rw------- 1 root root 12754 Nov 13 18:17 ih-hpevsa2.xml
-rw------- 1 root root 11909 Nov 28 14:39 ih-test.xml
drwxr-xr-x 3 root root 4096 Nov 13 16:20 networks
root@ih-hpenode2:/etc/libvirt/qemu#
How is this expected to work in a failover scenario if this host is suddenly powered off? Is there some setting I am missing where the .xml file should be stored on the GFS2 volume? This does not seem to happen automatically when deploying the instance and selecting the GFS2 datastore.
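In the meantime, libvirt itself can re-create a domain from a saved copy of its XML - a minimal manual sketch (keeping a copy of the definition on the GFS2 mount is my own workaround idea, not something hpe-vm does automatically):
# while the VM is still defined, save its definition onto the shared GFS2 datastore
root@ih-hpenode2:/home/smadmin# virsh dumpxml ih-test > /mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/ih-test/ih-test.xml
# after losing the host, re-register and start it on a surviving node
root@ih-hpenode1:/home/smadmin# virsh define /mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/ih-test/ih-test.xml
root@ih-hpenode1:/home/smadmin# virsh start ih-test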
Sunday - last edited Sunday
Re: GFS2 and Failover of a VM if one Host of the 3 Hosts is lost.
Hello I123Habc,
Have you enabled the HA heartbeat on the GFS2 datastore? Otherwise nothing is going to happen. Once it is enabled (it may take up to 1 minute), you should see inside the GFS2 datastore a subfolder called mvm-hb. In it there are subfolders named after the hosts, and inside those you will see the XML files of the VMs that are powered on. These are taken over in case of failover, and yes, it should handle a one-node failure.
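A quick way to verify it is active - a minimal check, using the datastore mount path from earlier in this thread:
# one subfolder per host under mvm-hb, each holding the XML of the VMs running on that host
root@ih-hpenode1:/home/smadmin# ls -R /mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/mvm-hb/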
With regards
I work at HPE
HPE Support Center offers support for your HPE services and products when and how you need it. Get started with HPE Support Center today.
[Any personal opinions expressed are mine, and not official statements on behalf of Hewlett Packard Enterprise]
yesterday
Re: GFS2 and Failover of a VM if one Host of the 3 Hosts is lost.
Hi Peter,
I am embarrassed to say that I had glossed over this glaringly obvious setting! Thank you for the response - I will test this, and I can confirm that, as you stated, the folder has been created.
Many Thanks,
Ian
yesterday
Re: GFS2 and Failover of a VM if one Host of the 3 Hosts is lost.
Hi Peter,
Not sure if you can point me in the right direction - maybe I have missed something - but the failover still fails, even with the heartbeat enabled, the two remaining hosts able to access the storage, the agent installed on the Windows 11 VM, and the VM able to migrate between the hosts:
root@ih-hpenode1:/etc/libvirt/qemu# ls -l /mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/
total 8
drwxr-xr-x 2 root root 3864 Nov 28 15:36 ih-test
drwxr-xr-x 3 root root 3864 Dec 1 10:40 mvm-hb
root@ih-hpenode1:/etc/libvirt/qemu# pcs resource
* Clone Set: dlm-clone [dlm]:
* Started: [ ih-hpenode1 ih-hpenode3 ]
* Stopped: [ ih-hpenode2 ]
* Clone Set: storage_c7d58-clone [storage_c7d58]:
* Started: [ ih-hpenode1 ih-hpenode3 ]
* Stopped: [ ih-hpenode2 ]
This is the error that appeared in the Morpheus VM when I powered off Node2:
2025-12-01_15:17:04.10782 ''[2025-12-01 15:17:04,749] [RxCachedThreadScheduler-3] INFO c.m.h.KvmBaseHostService - Undefining VM: ih-test - bd98171a-f630-45d1-a901-8c26cd49d52f due to duplicate detection -- [[id:13, name:ih-test, state:running, uuid:bd98171a-f630-45d1-a901-8c26cd49d52f, metadata:[metadata:[placement:auto, local-storage:true, system-vm:false]], ... disk source: /mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/ih-test/hvm_9-disk-0 ... (remainder of the domain definition and stats dump trimmed), serverId:4, parentServerId:4]]
2025-12-01_15:17:04.75268 ''[2025-12-01 15:17:20,075] [RxCachedThreadScheduler-3] INFO c.m.c.KvmComputeUtility - Redefine Command Results: [success:true, command:sudo virsh undefine --nvram "ih-test", data:Domain 'ih-test' has been undefined
2025-12-01_15:17:20.07692
2025-12-01_15:17:20.07692 , error:, exitValue:0, errorData:]
2025-12-01_15:17:20.07693 ''[2025-12-01 15:17:20,098] [RxCachedThreadScheduler-3] INFO c.m.c.KvmComputeUtility - file remove results: [success:true, command:sudo rm -rf "/var/morpheus/kvm/vms/ih-test", data:, error:, exitValue:0, errorData:]
2025-12-01_15:17:20.09890 ''[2025-12-01 15:17:21,320] [Thread-18588] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Guacamole Session Closed.
2025-12-01_15:17:21.32120 ''[2025-12-01 15:17:21,581] [RxCachedThreadScheduler-11] INFO c.m.c.KvmComputeUtility - Migration Results: [success:true, command:sudo virsh migrate --live --persistent --p2p --tunnelled --undefinesource "ih-test" qemu+ssh://10.10.80.35/system, data:
2025-12-01_15:17:21.58335 , error:, exitValue:0, errorData:]
2025-12-01_15:17:21.58336 ''[2025-12-01 15:17:21,581] [RxCachedThreadScheduler-11] INFO c.m.p.KvmProvisionService - Move Results! [success:true, error:]
2025-12-01_15:17:21.58336 ''[2025-12-01 15:17:21,623] [appJobLow-13] INFO c.m.h.KvmBaseHostService - Cached Cluster Virtual Machines in 17516ms
2025-12-01_15:17:21.62373 ''[2025-12-01 15:17:21,874] [RxCachedThreadScheduler-11] INFO c.m.p.KvmProvisionService - Applying Placement Strategy: ih-test - auto
2025-12-01_15:17:21.87494 ''[2025-12-01 15:17:21,912] [RxCachedThreadScheduler-11] INFO c.m.c.KvmComputeUtility - Metadata Results: [success:false, command:sudo virsh metadata "ih-test" "https://www.morpheusdata.com" --config --live --key "morpheus" --set "<metadata><placement>auto</placement><local-storage>false</local-storage><system-vm>false</system-vm></metadata>", data:
2025-12-01_15:17:21.91233 , error:error: Requested operation is not valid: transient domains do not have any persistent config
2025-12-01_15:17:21.91235 , exitValue:1, errorData:error: Requested operation is not valid: transient domains do not have any persistent config
2025-12-01_15:17:21.91235 ]
2025-12-01_15:17:21.91235 ''[2025-12-01 15:17:21,943] [RxCachedThreadScheduler-11] INFO c.m.p.KvmProvisionService - Set Metadata Results: [success:false]
2025-12-01_15:17:21.94341 ''[2025-12-01 15:17:21,954] [RxCachedThreadScheduler-11] INFO c.m.ProcessService - updateProcess: 99 ih-test - running - Move Server
2025-12-01_15:17:21.95450 ''[2025-12-01 15:17:21,955] [RxCachedThreadScheduler-11] INFO c.m.ProcessService - Process Updated: 99 - ih-test - running - Move Server
2025-12-01_15:17:21.95557 ''[2025-12-01 15:17:22,865] [http-nio-127.0.0.1-8080-exec-65] INFO c.m.ApplianceInterceptor - ERROR /remote/socket org.springframework.dao.DataAccessResourceFailureException: Could not obtain current Hibernate Session; nested exception is org.hibernate.HibernateException: No Session found for current thread
2025-12-01_15:17:22.86515 '[leaseToken=, clientOs=Windows, containerId=, consoleKeymap=en-us-qwerty, consoleMode=hypervisor, serverId=16, allocationId=, remoteApp=, GUAC_ID=bc235ae1-be88-4130-b06c-af85a6426098, GUAC_AUDIO=audio/L8, GUAC_AUDIO=audio/L16, GUAC_WIDTH=1024, GUAC_HEIGHT=768, GUAC_IMAGE=image/jpeg, GUAC_IMAGE=image/png, GUAC_IMAGE=image/webp]
2025-12-01_15:17:22.86702 '[2025-12-01 15:17:22,866] [http-nio-127.0.0.1-8080-exec-65] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Connection established: StandardWebSocketSession[id=3c0f1dbe-c686-a6b7-1ab2-86d6ef5630f3, uri=ws://10.10.80.37/remote/socket?leaseToken=&clientOs=Windows&containerId=&consoleKeymap=en-us-qwerty&consoleMode=hypervisor&serverId=16&allocationId=&remoteApp=&GUAC_ID=bc235ae1-be88-4130-b06c-af85a6426098&GUAC_AUDIO=audio%2FL8&GUAC_AUDIO=audio%2FL16&GUAC_WIDTH=1024&GUAC_HEIGHT=768&GUAC_IMAGE=image%2Fjpeg&GUAC_IMAGE=image%2Fpng&GUAC_IMAGE=image%2Fwebp]: 1
2025-12-01_15:17:22.86709 ''[2025-12-01 15:17:22,867] [http-nio-127.0.0.1-8080-exec-65] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Instantiating a Guacamole Remote Tunnel [requestQuery:leaseToken=&clientOs=Windows&containerId=&consoleKeymap=en-us-qwerty&consoleMode=hypervisor&serverId=16&allocationId=&remoteApp=&GUAC_ID=bc235ae1-be88-4130-b06c-af85a6426098&GUAC_AUDIO=audio/L8&GUAC_AUDIO=audio/L16&GUAC_WIDTH=1024&GUAC_HEIGHT=768&GUAC_IMAGE=image/jpeg&GUAC_IMAGE=image/png&GUAC_IMAGE=image/webp, GUAC_ID:bc235ae1-be88-4130-b06c-af85a6426098, consoleMode:hypervisor, GUAC_IMAGE:[image/jpeg, image/png, image/webp], clientOs:Windows, GUAC_AUDIO:[audio/L8, audio/L16], GUAC_WIDTH:1024, consoleKeymap:en-us-qwerty, GUAC_HEIGHT:768, serverId:16] - localhost - 4822
2025-12-01_15:17:22.86775 ''[2025-12-01 15:17:22,869] [RxCachedThreadScheduler-4] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Checking zone list user can view
2025-12-01_15:17:22.86957 ''[2025-12-01 15:17:22,883] [RxCachedThreadScheduler-4] INFO c.m.r.MorpheusGuacamoleWebsocketHandler - Console Mode: hypervisor null
2025-12-01_15:17:22.88390 'SSL Tunnel Established: 29288
2025-12-01_15:17:22.93349 Hostname: 127.0.0.1
After this it seems to just keep looping, until this point:
2025-12-01_15:29:48.07778 ''[2025-12-01 15:29:48,964] [appJobLow-18] ERROR c.m.p.KvmProvisionService - executeComputeServerCommand error: java.lang.NullPointerException: Cannot invoke method getConfigMap() on null object
2025-12-01_15:29:48.96430 'java.lang.NullPointerException: Cannot invoke method getConfigMap() on null object
2025-12-01_15:29:48.96430 at org.codehaus.groovy.runtime.NullObject.invokeMethod(NullObject.java:110)
2025-12-01_15:29:48.96431 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:44)
2025-12-01_15:29:48.96431 at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
2025-12-01_15:29:48.96431 at org.codehaus.groovy.runtime.callsite.NullCallSite.call(NullCallSite.java:34)
2025-12-01_15:29:48.96431 at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
2025-12-01_15:29:48.96431 at com.morpheus.ComputeServer$getConfigMap$15.call(Unknown Source)
2025-12-01_15:29:48.96432 at com.morpheus.provision.KvmProvisionService.getKvmHypervisorOpts(KvmProvisionService.groovy:4797)
2025-12-01_15:29:48.96432 at com.morpheus.provision.KvmProvisionService$getKvmHypervisorOpts$17.callCurrent(Unknown Source)
2025-12-01_15:29:48.96432 at com.morpheus.provision.KvmProvisionService.executeComputeServerCommand(KvmProvisionService.groovy:4633)
2025-12-01_15:29:48.96433 at com.morpheus.provision.KvmProvisionService$executeComputeServerCommand$33.callCurrent(Unknown Source)
2025-12-01_15:29:48.96433 at com.morpheus.provision.KvmProvisionService.executeComputeServerCommand(KvmProvisionService.groovy:4616)
2025-12-01_15:29:48.96433 at com.morpheus.provision.KvmProvisionService.executeComputeServerCommand(KvmProvisionService.groovy)
2025-12-01_15:29:48.96434 at com.morpheus.provision.KvmProvisionService$executeComputeServerCommand$1.call(Unknown Source)
2025-12-01_15:29:48.96434 at com.morpheus.host.MvmHostService.cachePacemakerStatus(MvmHostService.groovy:40)
2025-12-01_15:29:48.96434 at com.morpheus.host.MvmHostService$cachePacemakerStatus$2.callCurrent(Unknown Source)
2025-12-01_15:29:48.96434 at com.morpheus.host.MvmHostService.refreshServerGroup(MvmHostService.groovy:25)
2025-12-01_15:29:48.96434 at com.morpheus.host.MvmHostService.refreshServerGroup(MvmHostService.groovy)
2025-12-01_15:29:48.96435 at com.morpheus.host.MvmHostService$refreshServerGroup.call(Unknown Source)
2025-12-01_15:29:48.96435 at com.morpheus.ComputeService.refreshServerGroup(ComputeService.groovy:1084)
2025-12-01_15:29:48.96435 at com.morpheus.ComputeService$refreshServerGroup$2.call(Unknown Source)
2025-12-01_15:29:48.96436 at com.morpheus.ApplianceJobService.executeApplianceJob(ApplianceJobService.groovy:457)
2025-12-01_15:29:48.96436 at jdk.internal.reflect.GeneratedMethodAccessor1017.invoke(Unknown Source)
2025-12-01_15:29:48.96436 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-12-01_15:29:48.96436 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-12-01_15:29:48.96437 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-12-01_15:29:48.96437 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-12-01_15:29:48.96438 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:362)
2025-12-01_15:29:48.96438 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:61)
2025-12-01_15:29:48.96439 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:203)
2025-12-01_15:29:48.96439 at com.morpheus.ApplianceJobService$_onApplianceJob_closure2.doCall(ApplianceJobService.groovy:305)
2025-12-01_15:29:48.96440 at com.morpheus.ApplianceJobService$_onApplianceJob_closure2.doCall(ApplianceJobService.groovy)
2025-12-01_15:29:48.96440 at jdk.internal.reflect.GeneratedMethodAccessor995.invoke(Unknown Source)
2025-12-01_15:29:48.96441 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-12-01_15:29:48.96441 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-12-01_15:29:48.96441 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-12-01_15:29:48.96442 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-12-01_15:29:48.96442 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:274)
2025-12-01_15:29:48.96442 at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1030)
2025-12-01_15:29:48.96442 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:38)
2025-12-01_15:29:48.96442 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:130)
2025-12-01_15:29:48.96442 at com.morpheus.ApplianceJobService$_onApplianceJob_closure3.doCall(ApplianceJobService.groovy:372)
2025-12-01_15:29:48.96443 at jdk.internal.reflect.GeneratedMethodAccessor994.invoke(Unknown Source)
2025-12-01_15:29:48.96443 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-12-01_15:29:48.96443 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-12-01_15:29:48.96443 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-12-01_15:29:48.96444 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-12-01_15:29:48.96444 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:274)
2025-12-01_15:29:48.96445 at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1030)
2025-12-01_15:29:48.96445 at groovy.lang.Closure.call(Closure.java:427)
2025-12-01_15:29:48.96445 at groovy.lang.Closure.call(Closure.java:406)
2025-12-01_15:29:48.96445 at com.morpheus.util.BoundedExecutor$2.run(BoundedExecutor.java:47)
2025-12-01_15:29:48.96445 at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
2025-12-01_15:29:48.96446 at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
2025-12-01_15:29:48.96446 at java.base/java.lang.Thread.run(Unknown Source)
2025-12-01_15:29:49.19188 '[2025-12-01 15:29:49,191] [appJobLow-18] [1;31mERROR[0;39m [36mc.m.p.KvmProvisionService[0;39m - executeComputeServerCommand error: java.lang.NullPointerException: Cannot invoke method getConfigMap() on null object
2025-12-01_15:29:49.19190 'java.lang.NullPointerException: Cannot invoke method getConfigMap() on null object
2025-12-01_15:29:49.19191 at org.codehaus.groovy.runtime.NullObject.invokeMethod(NullObject.java:110)
2025-12-01_15:29:49.19191 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:44)
2025-12-01_15:29:49.19191 at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
2025-12-01_15:29:49.19191 at org.codehaus.groovy.runtime.callsite.NullCallSite.call(NullCallSite.java:34)
2025-12-01_15:29:49.19191 at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
2025-12-01_15:29:49.19192 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:53)
2025-12-01_15:29:49.19192 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:130)
2025-12-01_15:29:49.19193 at com.morpheus.provision.KvmProvisionService.getKvmHypervisorOpts(KvmProvisionService.groovy:4797)
2025-12-01_15:29:49.19193 at com.morpheus.provision.KvmProvisionService$getKvmHypervisorOpts$17.callCurrent(Unknown Source)
2025-12-01_15:29:49.19194 at com.morpheus.provision.KvmProvisionService.executeComputeServerCommand(KvmProvisionService.groovy:4633)
2025-12-01_15:29:49.19195 at com.morpheus.provision.KvmProvisionService.executeComputeServerCommand(KvmProvisionService.groovy)
2025-12-01_15:29:49.19195 at com.morpheus.provision.KvmProvisionService$executeComputeServerCommand$1.call(Unknown Source)
2025-12-01_15:29:49.19195 at com.morpheus.host.MvmHostService.cachePacemakerStatus(MvmHostService.groovy:40)
2025-12-01_15:29:49.19195 at com.morpheus.host.MvmHostService$cachePacemakerStatus$2.callCurrent(Unknown Source)
2025-12-01_15:29:49.19195 at com.morpheus.host.MvmHostService.refreshServerGroup(MvmHostService.groovy:25)
2025-12-01_15:29:49.19196 at com.morpheus.host.MvmHostService.refreshServerGroup(MvmHostService.groovy)
2025-12-01_15:29:49.19196 at com.morpheus.host.MvmHostService$refreshServerGroup.call(Unknown Source)
2025-12-01_15:29:49.19196 at com.morpheus.ComputeService.refreshServerGroup(ComputeService.groovy:1084)
2025-12-01_15:29:49.19196 at com.morpheus.ComputeService$refreshServerGroup$2.call(Unknown Source)
2025-12-01_15:29:49.19196 at com.morpheus.ApplianceJobService.executeApplianceJob(ApplianceJobService.groovy:457)
2025-12-01_15:29:49.19198 at jdk.internal.reflect.GeneratedMethodAccessor1017.invoke(Unknown Source)
2025-12-01_15:29:49.19198 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-12-01_15:29:49.19198 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-12-01_15:29:49.19199 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-12-01_15:29:49.19199 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-12-01_15:29:49.19199 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:362)
2025-12-01_15:29:49.19199 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:61)
2025-12-01_15:29:49.19199 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:203)
2025-12-01_15:29:49.19200 at com.morpheus.ApplianceJobService$_onApplianceJob_closure2.doCall(ApplianceJobService.groovy:305)
2025-12-01_15:29:49.19200 at com.morpheus.ApplianceJobService$_onApplianceJob_closure2.doCall(ApplianceJobService.groovy)
2025-12-01_15:29:49.19200 at jdk.internal.reflect.GeneratedMethodAccessor995.invoke(Unknown Source)
2025-12-01_15:29:49.19200 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-12-01_15:29:49.19201 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-12-01_15:29:49.19202 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-12-01_15:29:49.19202 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-12-01_15:29:49.19202 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:274)
2025-12-01_15:29:49.19202 at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1030)
2025-12-01_15:29:49.19202 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:38)
2025-12-01_15:29:49.19203 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:130)
2025-12-01_15:29:49.19203 at com.morpheus.ApplianceJobService$_onApplianceJob_closure3.doCall(ApplianceJobService.groovy:372)
2025-12-01_15:29:49.19203 at jdk.internal.reflect.GeneratedMethodAccessor994.invoke(Unknown Source)
2025-12-01_15:29:49.19203 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-12-01_15:29:49.19203 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-12-01_15:29:49.19204 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-12-01_15:29:49.19204 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-12-01_15:29:49.19205 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:274)
2025-12-01_15:29:49.19206 at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1030)
2025-12-01_15:29:49.19206 at groovy.lang.Closure.call(Closure.java:427)
2025-12-01_15:29:49.19206 at groovy.lang.Closure.call(Closure.java:406)
2025-12-01_15:29:49.19206 at com.morpheus.util.BoundedExecutor$2.run(BoundedExecutor.java:47)
2025-12-01_15:29:49.19206 at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
2025-12-01_15:29:49.19207 at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
2025-12-01_15:29:49.19207 at java.base/java.lang.Thread.run(Unknown Source)
2025-12-01_15:29:49.46290 '[2025-12-01 15:29:49,462] [appJobLow-18] ERROR c.m.p.KvmProvisionService - executeComputeServerCommand error: java.lang.NullPointerException: Cannot invoke method getConfigMap() on null object
2025-12-01_15:29:49.46292 'java.lang.NullPointerException: Cannot invoke method getConfigMap() on null object
2025-12-01_15:29:49.46293 at org.codehaus.groovy.runtime.NullObject.invokeMethod(NullObject.java:110)
2025-12-01_15:29:49.46293 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:44)
2025-12-01_15:29:49.46293 at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
2025-12-01_15:29:49.46293 at org.codehaus.groovy.runtime.callsite.NullCallSite.call(NullCallSite.java:34)
2025-12-01_15:29:49.46293 at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
2025-12-01_15:29:49.46294 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:53)
2025-12-01_15:29:49.46294 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:130)
2025-12-01_15:29:49.46294 at com.morpheus.provision.KvmProvisionService.getKvmHypervisorOpts(KvmProvisionService.groovy:4797)
2025-12-01_15:29:49.46294 at com.morpheus.provision.KvmProvisionService$getKvmHypervisorOpts$17.callCurrent(Unknown Source)
2025-12-01_15:29:49.46295 at com.morpheus.provision.KvmProvisionService.executeComputeServerCommand(KvmProvisionService.groovy:4633)
2025-12-01_15:29:49.46296 at com.morpheus.provision.KvmProvisionService$executeComputeServerCommand$33.callCurrent(Unknown Source)
2025-12-01_15:29:49.46296 at com.morpheus.provision.KvmProvisionService.executeComputeServerCommand(KvmProvisionService.groovy:4616)
2025-12-01_15:29:49.46296 at com.morpheus.provision.KvmProvisionService.executeComputeServerCommand(KvmProvisionService.groovy)
2025-12-01_15:29:49.46296 at com.morpheus.provision.KvmProvisionService$executeComputeServerCommand$1.call(Unknown Source)
2025-12-01_15:29:49.46296 at com.morpheus.host.MvmHostService.cachePacemakerStatus(MvmHostService.groovy:117)
2025-12-01_15:29:49.46296 at com.morpheus.host.MvmHostService$cachePacemakerStatus$2.callCurrent(Unknown Source)
2025-12-01_15:29:49.46297 at com.morpheus.host.MvmHostService.refreshServerGroup(MvmHostService.groovy:25)
2025-12-01_15:29:49.46297 at com.morpheus.host.MvmHostService.refreshServerGroup(MvmHostService.groovy)
2025-12-01_15:29:49.46297 at com.morpheus.host.MvmHostService$refreshServerGroup.call(Unknown Source)
2025-12-01_15:29:49.46298 at com.morpheus.ComputeService.refreshServerGroup(ComputeService.groovy:1084)
2025-12-01_15:29:49.46298 at com.morpheus.ComputeService$refreshServerGroup$2.call(Unknown Source)
2025-12-01_15:29:49.46298 at com.morpheus.ApplianceJobService.executeApplianceJob(ApplianceJobService.groovy:457)
2025-12-01_15:29:49.46298 at jdk.internal.reflect.GeneratedMethodAccessor1017.invoke(Unknown Source)
2025-12-01_15:29:49.46299 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-12-01_15:29:49.46299 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-12-01_15:29:49.46300 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-12-01_15:29:49.46300 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-12-01_15:29:49.46300 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:362)
2025-12-01_15:29:49.46300 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:61)
2025-12-01_15:29:49.46301 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:203)
2025-12-01_15:29:49.46301 at com.morpheus.ApplianceJobService$_onApplianceJob_closure2.doCall(ApplianceJobService.groovy:305)
2025-12-01_15:29:49.46301 at com.morpheus.ApplianceJobService$_onApplianceJob_closure2.doCall(ApplianceJobService.groovy)
2025-12-01_15:29:49.46302 at jdk.internal.reflect.GeneratedMethodAccessor995.invoke(Unknown Source)
2025-12-01_15:29:49.46302 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-12-01_15:29:49.46302 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-12-01_15:29:49.46303 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-12-01_15:29:49.46303 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-12-01_15:29:49.46303 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:274)
2025-12-01_15:29:49.46303 at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1030)
2025-12-01_15:29:49.46303 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:38)
2025-12-01_15:29:49.46303 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:130)
2025-12-01_15:29:49.46304 at com.morpheus.ApplianceJobService$_onApplianceJob_closure3.doCall(ApplianceJobService.groovy:372)
2025-12-01_15:29:49.46304 at jdk.internal.reflect.GeneratedMethodAccessor994.invoke(Unknown Source)
2025-12-01_15:29:49.46304 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2025-12-01_15:29:49.46305 at java.base/java.lang.reflect.Method.invoke(Unknown Source)
2025-12-01_15:29:49.46306 at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107)
2025-12-01_15:29:49.46306 at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
2025-12-01_15:29:49.46306 at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:274)
2025-12-01_15:29:49.46306 at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1030)
2025-12-01_15:29:49.46306 at groovy.lang.Closure.call(Closure.java:427)
2025-12-01_15:29:49.46307 at groovy.lang.Closure.call(Closure.java:406)
2025-12-01_15:29:49.46307 at com.morpheus.util.BoundedExecutor$2.run(BoundedExecutor.java:47)
2025-12-01_15:29:49.46307 at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
2025-12-01_15:29:49.46307 at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
2025-12-01_15:29:49.46307 at java.base/java.lang.Thread.run(Unknown Source)
2025-12-01_15:29:49.55677 '[2025-12-01 15:29:49,556] [appJobLow-18] ERROR c.m.p.KvmProvisionService - executeComputeServerCommand error: java.lang.NullPointerException: Cannot invoke method getConfigMap() on null object
(the same NullPointerException and stack trace then repeat in the log)
After powering Host2 back on, the cluster is restored, but the ih-test VM is lost. I still have access to its disk on the GFS2 volume and the cluster resources look healthy again, but the VM's XML config is gone:
root@ih-hpenode2:/etc/libvirt/qemu# ls -l
total 24
drwxr-xr-x 2 root root 4096 Nov 26 15:05 autostart
-rw------- 1 root root 12754 Nov 13 18:17 ih-hpevsa2.xml
drwxr-xr-x 3 root root 4096 Nov 13 16:20 networks
root@ih-hpenode2:/etc/libvirt/qemu# pcs resource
* Clone Set: dlm-clone [dlm]:
* Started: [ ih-hpenode1 ih-hpenode2 ih-hpenode3 ]
* Clone Set: storage_c7d58-clone [storage_c7d58]:
* Started: [ ih-hpenode1 ih-hpenode2 ih-hpenode3 ]
root@ih-hpenode2:/etc/libvirt/qemu# ls -l /mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/ih-test/
total 28876420
-rw-r--r-- 1 libvirt-qemu kvm 32181977088 Dec 1 15:19 hvm_9-disk-0
root@ih-hpenode2:/etc/libvirt/qemu#
root@ih-hpenode2:/etc/libvirt/qemu# virsh list --all
Id Name State
-----------------------------
- ih-hpevsa2 shut off
root@ih-hpenode2:/etc/libvirt/qemu#
So unfortunately, even with heartbeating enabled, something else is still blocking failover of the test VM.
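In case it helps narrow this down, these are the checks I plan to run next on the surviving nodes. I am assuming (I have not found this confirmed in the HPE docs) that VM failover here depends on Pacemaker fencing being configured and operational:
# Overall cluster and resource state, including any failed actions
pcs status --full
# Is fencing (STONITH) enabled, and is a fence device actually running?
pcs property show stonith-enabled   # 'pcs property config' on newer pcs versions
pcs stonith status
# One-shot snapshot of resource state (useful while a node is powered off)
crm_mon -1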
Another question: how straightforward is it to recover this VM, given that I still have access to the disks?
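I assume recovery would look roughly like the sketch below — redefining the domain from a minimal XML that points at the surviving disk — but please correct me if Morpheus needs to do this itself to keep its inventory in sync. Only the disk path is from my setup; the RAM, vCPU count, disk format and bus are guesses, and a Windows 11 guest would also need its original UEFI/NVRAM and TPM definitions, which I no longer have:
# Check the actual image format first (raw vs qcow2 is a guess otherwise)
qemu-img info /mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/ih-test/hvm_9-disk-0
# Minimal placeholder domain XML (values below are assumptions, not my original config)
cat > /tmp/ih-test.xml <<'EOF'
<domain type='kvm'>
  <name>ih-test</name>
  <memory unit='GiB'>8</memory>
  <vcpu>4</vcpu>
  <os>
    <type arch='x86_64'>hvm</type>
    <boot dev='hd'/>
  </os>
  <devices>
    <disk type='file' device='disk'>
      <driver name='qemu' type='raw'/>
      <source file='/mnt/51abed95-7904-4c18-9c5c-d4a424ac7d58/ih-test/hvm_9-disk-0'/>
      <!-- bus='sata' instead if the guest lacks virtio drivers -->
      <target dev='vda' bus='virtio'/>
    </disk>
    <graphics type='vnc' autoport='yes'/>
  </devices>
</domain>
EOF
# Register the recovered guest with libvirt and boot it
virsh define /tmp/ih-test.xml
virsh start ih-test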
Many Thanks,
Ian