<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: IOPS Tests in MSA Storage</title>
    <link>https://community.hpe.com/t5/msa-storage/iops-tests/m-p/7196557#M16796</link>
    <description>&lt;P&gt;&lt;a href="https://community.hpe.com/t5/user/viewprofilepage/user-id/2045220"&gt;@CRD&lt;/a&gt;&lt;BR /&gt;I don't know of any way to get a graph directly in IOmeter. The graph you are showing covers several days, with one averaged data point every 15 minutes, whereas IOmeter runs a workload as fast as possible for the time given and reports the average over that time.&lt;BR /&gt;To get multiple data points like the graph shows, you would need to create that many access specifications, or run that many variable tests from the 'Test Setup' tab. You would then export the data to your favorite spreadsheet app and build the graph there.&lt;BR /&gt;Some fun knobs to turn in IOmeter:&lt;BR /&gt;'Disk Targets' - # of Outstanding I/Os == the queue depth you are pushing; for a random workload, increase this to improve performance.&lt;BR /&gt;'Disk Targets' - Maximum Disk Size == set this small enough to fit in cache and watch the dial go really fast. Keep it inside your SSD capacity after writing data to the Pool and you should see performance steadily increase as the tiering engine moves that 'hot' data into the SSD tier.&lt;BR /&gt;'Test Setup' - Run Time == how long the test runs. Queue multiple copies of the same access specification, set each for 15 minutes, and you should get a steady stream of roughly the same performance output (once tiering/caching has stabilized).&lt;BR /&gt;'Test Setup' - Cycling Options -&amp;gt; Cycle # Outstanding I/Os == steadily increase the queue depth to find the saturation point for your workload.&lt;BR /&gt;All of that can show some fun numbers, but in the end IOmeter and the array may still report differing information. The array reports what it sees, which is the result of caching and coalescing between IOmeter and the array, so don't expect IOmeter to show 550 read IOPS while the array reports exactly 550 read IOPS.&lt;BR /&gt;Hope this helps.&lt;/P&gt;</description>
    <pubDate>Fri, 15 Sep 2023 00:43:56 GMT</pubDate>
    <dc:creator>JonPaul</dc:creator>
    <dc:date>2023-09-15T00:43:56Z</dc:date>
    <item>
      <title>IOPS Tests</title>
      <link>https://community.hpe.com/t5/msa-storage/iops-tests/m-p/7196440#M16790</link>
      <description>&lt;P&gt;Good day&lt;/P&gt;&lt;P&gt;I hope this is not a silly question, but does anyone have a known IOmeter setup for measuring MSA2060 IOPS?&lt;/P&gt;&lt;P&gt;2 x 1.9 TB SSD SAS disks for cache&lt;/P&gt;&lt;P&gt;42 x 12 TB 7K SAS MDL disks&lt;/P&gt;&lt;P&gt;If possible, I want to mimic the MSA GUI IOPS graph below in IOmeter.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="MSA IOPS Graph.png" style="width: 804px;"&gt;&lt;img src="https://community.hpe.com/t5/image/serverpage/image-id/137236iA726A1E4A572A472/image-size/large?v=v2&amp;amp;px=2000" role="button" title="MSA IOPS Graph.png" alt="MSA IOPS Graph.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Any advice will be much appreciated.&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Fri, 15 Sep 2023 03:53:01 GMT</pubDate>
      <guid>https://community.hpe.com/t5/msa-storage/iops-tests/m-p/7196440#M16790</guid>
      <dc:creator>CRD</dc:creator>
      <dc:date>2023-09-15T03:53:01Z</dc:date>
    </item>
    <item>
      <title>Re: IOPS Tests</title>
      <link>https://community.hpe.com/t5/msa-storage/iops-tests/m-p/7196557#M16796</link>
      <description>&lt;P&gt;&lt;a href="https://community.hpe.com/t5/user/viewprofilepage/user-id/2045220"&gt;@CRD&lt;/a&gt;&lt;BR /&gt;I don't know of any way to get a graph directly in IOmeter. The graph you are showing covers several days, with one averaged data point every 15 minutes, whereas IOmeter runs a workload as fast as possible for the time given and reports the average over that time.&lt;BR /&gt;To get multiple data points like the graph shows, you would need to create that many access specifications, or run that many variable tests from the 'Test Setup' tab. You would then export the data to your favorite spreadsheet app and build the graph there.&lt;BR /&gt;Some fun knobs to turn in IOmeter:&lt;BR /&gt;'Disk Targets' - # of Outstanding I/Os == the queue depth you are pushing; for a random workload, increase this to improve performance.&lt;BR /&gt;'Disk Targets' - Maximum Disk Size == set this small enough to fit in cache and watch the dial go really fast. Keep it inside your SSD capacity after writing data to the Pool and you should see performance steadily increase as the tiering engine moves that 'hot' data into the SSD tier.&lt;BR /&gt;'Test Setup' - Run Time == how long the test runs. Queue multiple copies of the same access specification, set each for 15 minutes, and you should get a steady stream of roughly the same performance output (once tiering/caching has stabilized).&lt;BR /&gt;'Test Setup' - Cycling Options -&amp;gt; Cycle # Outstanding I/Os == steadily increase the queue depth to find the saturation point for your workload.&lt;BR /&gt;All of that can show some fun numbers, but in the end IOmeter and the array may still report differing information. The array reports what it sees, which is the result of caching and coalescing between IOmeter and the array, so don't expect IOmeter to show 550 read IOPS while the array reports exactly 550 read IOPS.&lt;BR /&gt;Hope this helps.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Sep 2023 00:43:56 GMT</pubDate>
      <guid>https://community.hpe.com/t5/msa-storage/iops-tests/m-p/7196557#M16796</guid>
      <dc:creator>JonPaul</dc:creator>
      <dc:date>2023-09-15T00:43:56Z</dc:date>
    </item>
  </channel>
</rss>