rp7410 va7100 array high i/o wait and general question about va7100
06-03-2009 08:07 AM
A couple of questions.
First, each of our VA7100 arrays is full with 15 hard drives, and we have a dual-controller configuration. When I do an ioscan, one controller address shows 12 disks and the other 13 disks for one array. I would think that each controller would show all 15 disks, right? Here's the output from ioscan:
# ioscan -funC disk |grep "0/0/6/0/0.8"
disk 1 0/0/6/0/0.8.0.110.0.0.0 sdisk CLAIMED DEVICE HP A6188A
disk 29 0/0/6/0/0.8.0.110.0.0.1 sdisk CLAIMED DEVICE HP A6188A
disk 31 0/0/6/0/0.8.0.110.0.0.2 sdisk CLAIMED DEVICE HP A6188A
disk 32 0/0/6/0/0.8.0.110.0.0.3 sdisk CLAIMED DEVICE HP A6188A
disk 33 0/0/6/0/0.8.0.110.0.0.4 sdisk CLAIMED DEVICE HP A6188A
disk 34 0/0/6/0/0.8.0.110.0.0.5 sdisk CLAIMED DEVICE HP A6188A
disk 35 0/0/6/0/0.8.0.110.0.0.6 sdisk CLAIMED DEVICE HP A6188A
disk 36 0/0/6/0/0.8.0.110.0.0.7 sdisk CLAIMED DEVICE HP A6188A
disk 37 0/0/6/0/0.8.0.110.0.1.0 sdisk CLAIMED DEVICE HP A6188A
disk 40 0/0/6/0/0.8.0.110.0.1.1 sdisk CLAIMED DEVICE HP A6188A
disk 42 0/0/6/0/0.8.0.110.0.1.2 sdisk CLAIMED DEVICE HP A6188A
disk 44 0/0/6/0/0.8.0.110.0.1.3 sdisk CLAIMED DEVICE HP A6188A
# ioscan -funC disk |grep "1/0/6/0/0.8"
disk 5 1/0/6/0/0.8.0.108.0.0.0 sdisk CLAIMED DEVICE HP A6188A
disk 30 1/0/6/0/0.8.0.108.0.0.1 sdisk CLAIMED DEVICE HP A6188A
disk 38 1/0/6/0/0.8.0.108.0.0.2 sdisk CLAIMED DEVICE HP A6188A
disk 39 1/0/6/0/0.8.0.108.0.0.3 sdisk CLAIMED DEVICE HP A6188A
disk 41 1/0/6/0/0.8.0.108.0.0.4 sdisk CLAIMED DEVICE HP A6188A
disk 43 1/0/6/0/0.8.0.108.0.0.5 sdisk CLAIMED DEVICE HP A6188A
disk 45 1/0/6/0/0.8.0.108.0.0.6 sdisk CLAIMED DEVICE HP A6188A
disk 46 1/0/6/0/0.8.0.108.0.0.7 sdisk CLAIMED DEVICE HP A6188A
disk 47 1/0/6/0/0.8.0.108.0.1.0 sdisk CLAIMED DEVICE HP A6188A
disk 48 1/0/6/0/0.8.0.108.0.1.1 sdisk CLAIMED DEVICE HP A6188A
disk 49 1/0/6/0/0.8.0.108.0.1.2 sdisk CLAIMED DEVICE HP A6188A
disk 50 1/0/6/0/0.8.0.108.0.1.3 sdisk CLAIMED DEVICE HP A6188A
The same is true for the other array. Why am I not seeing all 15 disks per array? CommandView SDM shows all 15 disks in the GUI. Thinking that maybe the other disks are on a different I/O address, I did an ioscan for disks and grep'd for the disk model number (A6188A). It returned exactly 50 entries. One controller showing 12 and the other 13, for two arrays, is 50. But shouldn't I get back 30, since that's how many disks actually exist?
Second, how do I go about fixing these SCSI errors? I believe understanding the first question is needed before I can figure out the second.
06-03-2009 08:12 AM
Re: rp7410 va7100 array high i/o wait and general question about va7100
# ioscan -funC disk |grep "0/0/6/0/0.8"
disk 1 0/0/6/0/0.8.0.110.0.0.0 sdisk CLAIMED DEVICE HP A6188A
disk 29 0/0/6/0/0.8.0.110.0.0.1 sdisk CLAIMED DEVICE HP A6188A
disk 31 0/0/6/0/0.8.0.110.0.0.2 sdisk CLAIMED DEVICE HP A6188A
disk 32 0/0/6/0/0.8.0.110.0.0.3 sdisk CLAIMED DEVICE HP A6188A
disk 33 0/0/6/0/0.8.0.110.0.0.4 sdisk CLAIMED DEVICE HP A6188A
disk 34 0/0/6/0/0.8.0.110.0.0.5 sdisk CLAIMED DEVICE HP A6188A
disk 35 0/0/6/0/0.8.0.110.0.0.6 sdisk CLAIMED DEVICE HP A6188A
disk 36 0/0/6/0/0.8.0.110.0.0.7 sdisk CLAIMED DEVICE HP A6188A
disk 37 0/0/6/0/0.8.0.110.0.1.0 sdisk CLAIMED DEVICE HP A6188A
disk 40 0/0/6/0/0.8.0.110.0.1.1 sdisk CLAIMED DEVICE HP A6188A
disk 42 0/0/6/0/0.8.0.110.0.1.2 sdisk CLAIMED DEVICE HP A6188A
disk 44 0/0/6/0/0.8.0.110.0.1.3 sdisk CLAIMED DEVICE HP A6188A
# ioscan -funC disk |grep "1/0/6/0/0.8"
disk 5 1/0/6/0/0.8.0.108.0.0.0 sdisk CLAIMED DEVICE HP A6188A
disk 30 1/0/6/0/0.8.0.108.0.0.1 sdisk CLAIMED DEVICE HP A6188A
disk 38 1/0/6/0/0.8.0.108.0.0.2 sdisk CLAIMED DEVICE HP A6188A
disk 39 1/0/6/0/0.8.0.108.0.0.3 sdisk CLAIMED DEVICE HP A6188A
disk 41 1/0/6/0/0.8.0.108.0.0.4 sdisk CLAIMED DEVICE HP A6188A
disk 43 1/0/6/0/0.8.0.108.0.0.5 sdisk CLAIMED DEVICE HP A6188A
disk 45 1/0/6/0/0.8.0.108.0.0.6 sdisk CLAIMED DEVICE HP A6188A
disk 46 1/0/6/0/0.8.0.108.0.0.7 sdisk CLAIMED DEVICE HP A6188A
disk 47 1/0/6/0/0.8.0.108.0.1.0 sdisk CLAIMED DEVICE HP A6188A
disk 48 1/0/6/0/0.8.0.108.0.1.1 sdisk CLAIMED DEVICE HP A6188A
disk 49 1/0/6/0/0.8.0.108.0.1.2 sdisk CLAIMED DEVICE HP A6188A
disk 50 1/0/6/0/0.8.0.108.0.1.3 sdisk CLAIMED DEVICE HP A6188A
# ioscan -funC disk |grep "0/0/8/0/0.8"
disk 2 0/0/8/0/0.8.0.108.0.0.0 sdisk CLAIMED DEVICE HP A6188A
disk 7 0/0/8/0/0.8.0.108.0.0.1 sdisk CLAIMED DEVICE HP A6188A
disk 8 0/0/8/0/0.8.0.108.0.0.2 sdisk CLAIMED DEVICE HP A6188A
disk 9 0/0/8/0/0.8.0.108.0.0.3 sdisk CLAIMED DEVICE HP A6188A
disk 10 0/0/8/0/0.8.0.108.0.0.4 sdisk CLAIMED DEVICE HP A6188A
disk 11 0/0/8/0/0.8.0.108.0.0.5 sdisk CLAIMED DEVICE HP A6188A
disk 13 0/0/8/0/0.8.0.108.0.0.6 sdisk CLAIMED DEVICE HP A6188A
disk 19 0/0/8/0/0.8.0.108.0.0.7 sdisk CLAIMED DEVICE HP A6188A
disk 20 0/0/8/0/0.8.0.108.0.1.0 sdisk CLAIMED DEVICE HP A6188A
disk 21 0/0/8/0/0.8.0.108.0.1.1 sdisk CLAIMED DEVICE HP A6188A
disk 22 0/0/8/0/0.8.0.108.0.1.2 sdisk CLAIMED DEVICE HP A6188A
disk 23 0/0/8/0/0.8.0.108.0.1.3 sdisk CLAIMED DEVICE HP A6188A
disk 51 0/0/8/0/0.8.0.108.0.1.4 sdisk CLAIMED DEVICE HP A6188A
# ioscan -funC disk |grep "1/0/8/0/0.8"
disk 6 1/0/8/0/0.8.0.110.0.0.0 sdisk CLAIMED DEVICE HP A6188A
disk 12 1/0/8/0/0.8.0.110.0.0.1 sdisk CLAIMED DEVICE HP A6188A
disk 14 1/0/8/0/0.8.0.110.0.0.2 sdisk CLAIMED DEVICE HP A6188A
disk 15 1/0/8/0/0.8.0.110.0.0.3 sdisk CLAIMED DEVICE HP A6188A
disk 16 1/0/8/0/0.8.0.110.0.0.4 sdisk CLAIMED DEVICE HP A6188A
disk 17 1/0/8/0/0.8.0.110.0.0.5 sdisk CLAIMED DEVICE HP A6188A
disk 18 1/0/8/0/0.8.0.110.0.0.6 sdisk CLAIMED DEVICE HP A6188A
disk 24 1/0/8/0/0.8.0.110.0.0.7 sdisk CLAIMED DEVICE HP A6188A
disk 25 1/0/8/0/0.8.0.110.0.1.0 sdisk CLAIMED DEVICE HP A6188A
disk 26 1/0/8/0/0.8.0.110.0.1.1 sdisk CLAIMED DEVICE HP A6188A
disk 27 1/0/8/0/0.8.0.110.0.1.2 sdisk CLAIMED DEVICE HP A6188A
disk 28 1/0/8/0/0.8.0.110.0.1.3 sdisk CLAIMED DEVICE HP A6188A
disk 52 1/0/8/0/0.8.0.110.0.1.4 sdisk CLAIMED DEVICE HP A6188A
06-03-2009 08:26 AM
Solution: If you look at the LUN information for the arrays, you will see 12 LUNs on one array and 13 on the other.
As far as the SCSI errors go, that is somewhat surprising. I suppose it could be a controller issue on the array(s), but if that were the case you should see messages in CommandView that say there are problems with the array.
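In other words, each ioscan entry is a LUN presented by the array over a given path, not a physical disk, so 12 + 13 LUNs seen over two host paths gives the 50 entries for two arrays. A minimal, hedged sketch (plain Python, not HP-UX-specific; the sample lines are copied from the ioscan output above) that tallies entries per controller path:

```python
# Tally "ioscan -funC disk" entries per controller hardware path.
# Each entry is a LUN exposed by the array, not a physical disk, so a
# 15-disk VA7100 can legitimately show 12 or 13 entries per host path.
from collections import Counter

ioscan_lines = """\
disk 1 0/0/6/0/0.8.0.110.0.0.0 sdisk CLAIMED DEVICE HP A6188A
disk 29 0/0/6/0/0.8.0.110.0.0.1 sdisk CLAIMED DEVICE HP A6188A
disk 5 1/0/6/0/0.8.0.108.0.0.0 sdisk CLAIMED DEVICE HP A6188A
"""

def luns_per_controller(text):
    counts = Counter()
    for line in text.splitlines():
        fields = line.split()
        if len(fields) >= 3 and fields[0] == "disk":
            hw_path = fields[2]                       # e.g. 0/0/6/0/0.8.0.110.0.0.0
            controller = "/".join(hw_path.split("/")[:4])  # e.g. 0/0/6/0
            counts[controller] += 1
    return counts

print(luns_per_controller(ioscan_lines))
```

Run against the full ioscan output, this would report 12 and 13 entries for the two paths to each array, matching the LUN layout rather than the physical disk count.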
06-03-2009 08:41 AM
Re: rp7410 va7100 array high i/o wait and general question about va7100
Hope this helps!
Regards
Torsten.
__________________________________________________
There are only 10 types of people in the world -
those who understand binary, and those who don't.
__________________________________________________
No support by private messages. Please ask the forum!
If you feel this was helpful please click the KUDOS! thumb below!
06-03-2009 08:46 AM
Re: rp7410 va7100 array high i/o wait and general question about va7100
# armdsp -i
to get the aliases, then
# armdsp -a
to get all the details.
Hope this helps!
Regards
Torsten.
06-03-2009 11:49 AM
Re: rp7410 va7100 array high i/o wait and general question about va7100
Now why would I be getting high I/O waits and SCSI read errors? Could it be something as simple as a flaky optical cable? Or maybe a flaky controller?
06-03-2009 11:55 AM
Re: rp7410 va7100 array high i/o wait and general question about va7100
The SCSI read errors are another thing entirely, at least to me. They could be indicative of a problem; what exactly, I'm not sure.
06-04-2009 11:59 AM
Re: rp7410 va7100 array high i/o wait and general question about va7100
Jun 4 13:30
...
device 0x1f0a0500 (with priority: 1, and current flags: 0x0).
LVM: VG 64 0x010000: PVLink 31 0x020100 Failed! The PV is still accessible.
LVM: Performed a switch for Lun ID = 0 (pv = 0x0000000097902000), from raw device 0x1f020600 (with priority: 0, and current flags: 0x40) to raw device 0x1f0a0600 (with priority: 1, and current flags: 0x0).
LVM: Performed a switch for Lun ID = 0 (pv = 0x0000000097906000), from raw device 0x1f020700 (with priority: 0, and current flags: 0x40) to raw device 0x1f0a0700 (with priority: 1, and current flags: 0x0).
LVM: VG 64 0x040000: PVLink 31 0x020500 Failed! The PV is still accessible.
LVM: VG 64 0x040000: PVLink 31 0x020600 Failed! The PV is still accessible.
LVM: VG 64 0x040000: PVLink 31 0x020700 Failed! The PV is still accessible.
SCSI: Read error -- dev: b 31 0x020200, errno: 126, resid: 1024,
blkno: 8, sectno: 16, offset: 8192, bcount: 1024.
LVM: Performed a switch for Lun ID = 0 (pv = 0x000000009786a000), from raw device 0x1f020200 (with priority: 0, and current flags: 0x40) to raw device 0x1f0a0200 (with priority: 1, and current flags: 0x0).
LVM: VG 64 0x020000: PVLink 31 0x020200 Failed! The PV is still accessible.
LVM: VG 64 0x010000: PVLink 31 0x020000 Failed! The PV is not accessible.
LVM: Performed a switch for Lun ID = 0 (pv = 0x0000000097820000), from raw device 0x1f020000 (with priority: 0, and current flags: 0xc0) to raw device 0x1f0a0000 (with priority: 1, and current flags: 0x0).
LVM: Performed a switch for Lun ID = 0 (pv = 0x00000000978b6000), from raw device 0x1f020400 (with priority: 0, and current flags: 0x40) to raw device 0x1f0a0400 (with priority: 1, and current flags: 0x0).
LVM: Performed a switch for Lun ID = 0 (pv = 0x00000000978b2000), from raw device 0x1f020300 (with priority: 0, and current flags: 0x40) to raw device 0x1f0a0300 (with priority: 1, and current flags: 0x0).
LVM: VG 64 0x030000: PVLink 31 0x020300 Failed! The PV is not accessible.
LVM: VG 64 0x030000: PVLink 31 0x020400 Failed! The PV is still accessible.
LVM: Performed a switch for Lun ID = 0 (pv = 0x00000000978fc000), from raw device 0x1f0a0500 (with priority: 1, and current flags: 0x0) to raw device 0x1f020500 (with priority: 0, and current flags: 0x80).
LVM: Performed a switch for Lun ID = 0 (pv = 0x0000000097902000), from raw device 0x1f0a0600 (with priority: 1, and current flags: 0x0) to raw device 0x1f020600 (with priority: 0, and current flags: 0x80).
LVM: Performed a switch for Lun ID = 0 (pv = 0x0000000097906000), from raw device 0x1f0a0700 (with priority: 1, and current flags: 0x0) to raw device 0x1f020700 (with priority: 0, and current flags: 0x80).
LVM: VG 64 0x040000: PVLink 31 0x020500 Recovered.
LVM: VG 64 0x040000: PVLink 31 0x020600 Recovered.
LVM: VG 64 0x040000: PVLink 31 0x020700 Recovered.
LVM: Performed a switch for Lun ID = 0 (pv = 0x000000009786a000), from raw device 0x1f0a0200 (with priority: 1, and current flags: 0x0) to raw device 0x1f020200 (with priority: 0, and current flags: 0x80).
LVM: VG 64 0x020000: PVLink 31 0x020200 Recovered.
LVM: Performed a switch for Lun ID = 0 (pv = 0x00000000978b2000), from raw device 0x1f0a0300 (with priority: 1, and current flags: 0x0) to raw device 0x1f020300 (with priority: 0, and current flags: 0x80).
LVM: Performed a switch for Lun ID = 0 (pv = 0x00000000978b6000), from raw device 0x1f0a0400 (with priority: 1, and current flags: 0x0) to raw device 0x1f020400 (with priority: 0, and current flags: 0x80).
LVM: VG 64 0x030000: PVLink 31 0x020300 Recovered.
LVM: VG 64 0x030000: PVLink 31 0x020400 Recovered.
LVM: Performed a switch for Lun ID = 0 (pv = 0x0000000097820000), from raw device 0x1f0a0000 (with priority: 1, and current flags: 0x0) to raw device 0x1f020000 (with priority: 0, and current flags: 0x80).
LVM: Performed a switch for Lun ID = 0 (pv = 0x0000000097824000), from raw device 0x1f0a0100 (with priority: 1, and current flags: 0x0) to raw device 0x1f020100 (with priority: 0, and current flags: 0x80).
LVM: VG 64 0x010000: PVLink 31 0x020000 Recovered.
LVM: VG 64 0x010000: PVLink 31 0x020100 Recovered.
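The log above shows HP-UX LVM alternate-link (PV-link) behavior: I/O errors on the primary device files trigger a switch to the alternate path, and the failed links later recover and switch back. A minimal sketch (plain Python; the message format is assumed from the lines above) that pairs the Failed/Recovered messages to see which links ended up healthy:

```python
import re

# Pair LVM "PVLink ... Failed!" and "PVLink ... Recovered." syslog
# messages; the last event seen for each link is its final state.
FAIL = re.compile(r"PVLink (\S+ 0x\w+) Failed!")
RECOVER = re.compile(r"PVLink (\S+ 0x\w+) Recovered\.")

def pvlink_status(log_lines):
    status = {}
    for line in log_lines:
        if m := FAIL.search(line):
            status[m.group(1)] = "failed"
        elif m := RECOVER.search(line):
            status[m.group(1)] = "recovered"
    return status

log = [
    "LVM: VG 64 0x010000: PVLink 31 0x020100 Failed! The PV is still accessible.",
    "LVM: VG 64 0x010000: PVLink 31 0x020100 Recovered.",
]
print(pvlink_status(log))
```

In this thread's log every failed link shows a matching Recovered message, which points at transient path problems (cable, GBIC, or controller) rather than a dead disk.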