HPE EVA Storage

Performance issue between Qlogic and Emulex driver

New Member


I am using two c-Class blades with Mezzanine cards for fabric connectivity.

One full-height blade has a QLogic QMH2462 4Gb FC HBA for HP c-Class BladeSystem.

One half-height blade has an Emulex LPe1105-HP 4Gb FC HBA for HP c-Class BladeSystem.

Both servers are running Oracle Linux 5.2 (64-bit) and the PSP 8.20 x86_64 package, with the appropriate drivers required for failover. The modprobe.conf entries for the two servers are, respectively:

options qla2xxx ql2xmaxqdepth=16 qlport_down_retry=30 ql2xloginretrycount=30 ql2xfailover=1 ql2xlbType=1 ql2xautorestore=0xa0 ConfigRequired=0
remove qla2xxx /sbin/modprobe -r --first-time --ignore-remove qla2xxx && { /sbin/modprobe -r --ignore-remove qla2xxx_conf; }

options lpfc lpfc_nodev_tmo=10 lpfc_lun_queue_depth=16 lpfc_discovery_threads=32
options lpfcmpl mpl_hbeat_tmo_busy=0
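One thing worth noting with these settings: on OL5/RHEL5, changes to /etc/modprobe.conf only take effect when the module is next loaded, and if the HBA driver is loaded from the initrd at boot, the initrd must be rebuilt too. A sketch of the procedure on the QLogic server (standard OL5 tooling assumed; do not unload the driver while any filesystems on the SAN paths are in use):

```shell
# Reload the QLogic driver so the new module options are picked up.
# WARNING: only do this when nothing is mounted over the SAN paths.
modprobe -r qla2xxx
modprobe qla2xxx

# If qla2xxx is loaded from the initrd at boot, rebuild it so the
# options in /etc/modprobe.conf also apply at boot time:
mkinitrd -f /boot/initrd-$(uname -r).img $(uname -r)
```

The same applies to the lpfc options on the Emulex server.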

The multipath.conf file is the same on both servers as well:

defaults {
        user_friendly_names yes
}

multipaths {
        multipath {
                wwid ***********
                alias xxxxxx
                path_grouping_policy multibus
                path_checker readsector0
                path_selector "round-robin 0"
                failback manual
                rr_weight priorities
                no_path_retry 5
        }
}

devices {
        device {
                vendor "HP"
                product "EVA4400"
                path_grouping_policy multibus
                getuid_callout "/sbin/scsi_id -g -u -s /block/%n"
                path_checker readsector0
                path_selector "round-robin 0"
                hardware_handler "0"
                failback 15
                rr_weight priorities
                no_path_retry queue
        }
}
The issue is that whenever I present a vdisk to each of the servers, the server with the Emulex HBA detects all 8 paths available to it, whereas the server with the QLogic HBA detects just one path:


Server with QLogic (output of multipath -ll):

xxxxxxx (*******************) dm-0 HP,HSV300
[size=1.5T][features=1 queue_if_no_path][hwhandler=0][rw]
\_ round-robin 0 [prio=50][enabled]
\_ 0:0:0:1 sda 8:0 [active][ready]

Server with Emulex:

xxxx (**************************) dm-1 HP,HSV300
[size=100G][features=1 queue_if_no_path][hwhandler=0]
\_ round-robin 0 [prio=8][active]
\_ 0:0:0:2 sdb 8:16 [active][ready]
\_ 0:0:1:2 sdd 8:48 [active][ready]
\_ 0:0:2:2 sdf 8:80 [active][ready]
\_ 0:0:3:2 sdh 8:112 [active][ready]
\_ 1:0:0:2 sdj 8:144 [active][ready]
\_ 1:0:1:2 sdl 8:176 [active][ready]
\_ 1:0:2:2 sdn 8:208 [active][ready]
\_ 1:0:3:2 sdp 8:240 [active][ready]
xxxx (**************************) dm-0 HP,HSV300
[size=100G][features=1 queue_if_no_path][hwhandler=0]
\_ round-robin 0 [prio=8][active]
\_ 0:0:0:1 sda 8:0 [active][ready]
\_ 0:0:1:1 sdc 8:32 [active][ready]
\_ 0:0:2:1 sde 8:64 [active][ready]
\_ 0:0:3:1 sdg 8:96 [active][ready]
\_ 1:0:0:1 sdi 8:128 [active][ready]
\_ 1:0:1:1 sdk 8:160 [active][ready]
\_ 1:0:2:1 sdm 8:192 [active][ready]
\_ 1:0:3:1 sdo 8:224 [active][ready]

The server with the Emulex HBA achieves about 300 MB/s, whereas the one with the QLogic HBA manages only about 25 MB/s. I need urgent help sorting this out.
Rob Leadbeater
Honored Contributor

Re: Performance issue between Qlogic and Emulex driver


It looks like you've configured the QLogic driver's own failover (ql2xfailover=1) as well as the Red Hat device-mapper multipath options, and the two would appear to be conflicting with each other. With the QLogic failover driver active, the driver masks the redundant paths itself, which would explain why multipath -ll on that server shows only a single path.

I could be completely off the mark, but try changing ql2xfailover=1 to ql2xfailover=0 in modprobe.conf.
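If you do try that, a minimal sketch of the change (file name and option taken from your post above; on the real system this is /etc/modprobe.conf, so take a backup first, and remember to rebuild the initrd and reboot afterwards for it to take effect):

```shell
# Back up modprobe.conf, then flip ql2xfailover from 1 to 0 so that
# device-mapper multipath, not the QLogic failover driver, manages the paths.
cp modprobe.conf modprobe.conf.bak
sed -i 's/ql2xfailover=1/ql2xfailover=0/' modprobe.conf
grep ql2xfailover modprobe.conf
```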

Hope this helps,