
Don’t let thin-on-thin storage provisioning be the subprime crisis of your datacenter

VijayRamS on 07-12-2016 10:14 PM

Thin provisioning in storage arrays is widely used and is a proven technology; almost all of us use it at this point. The array has a limited amount of physical capacity, and thin provisioning allows us to present much more storage than is physically available. The array commits physical capacity to the different storage volumes only as data is actually written. The assumption is that consumers of storage will not use all of the requested capacity immediately. Those consumers end up competing for the physical storage that is actually available, marked in purple in Figure 1 below.
Figure 1 – Thin provisioning over allocation at array

There are different tools to monitor the over-allocation level (Figure 1, marked in orange) in storage arrays, and I have seen a range of numbers that customers consider safe. The most common figure is 400%: if you have 1 TB of physical storage, you allocate up to 4 TB to consumers.
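To put that number in concrete terms, the over-allocation percentage is simply allocated capacity divided by physical capacity. Here is a minimal sketch of that calculation (the function name and figures are illustrative, not taken from any monitoring tool):

```python
def over_allocation_pct(allocated_tb: float, physical_tb: float) -> float:
    """Over-allocation expressed as a percentage of physical capacity."""
    return allocated_tb / physical_tb * 100

# 1 TB of physical capacity presented to consumers as 4 TB
print(over_allocation_pct(allocated_tb=4.0, physical_tb=1.0))  # 400.0
```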

Now, let’s add thin provisioning at the server level. You are taking capacity that is not guaranteed at the array level and inflating it even further. Consider the following scenario:

 

Figure 2 - Thin-on-Thin over allocation at array and virtualization layer

You take the 4 TB and give it to four server admins, 1 TB each. In turn, each of them takes that 1 TB and creates thinly provisioned virtual machines totaling 2 TB. Your server admins are more conservative and have only allocated 200% of what they were given. A 200% allocation is not that big, so everything is fine, right?

If you actually do the math between physical capacity and what the guest VMs see, it is neither 200% nor 400%. You have taken 1 TB of physical capacity and presented it as 8 TB: a whopping 800% allocation. Does this whole scheme sound familiar? By taking assets that carry no guarantees and using them as the base for additional offerings, you are setting yourself up for a subprime crisis of your own with thin-on-thin.
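To see how the two layers compound, multiply the over-allocation ratios at each level rather than looking at either one in isolation. A quick sketch of that arithmetic, using the scenario from Figure 2 (the variable names and numbers are illustrative and do not come from SOM):

```python
# Effective over-allocation after two levels of thin provisioning.
# Ratios multiply: 400% at the array x 200% at the hypervisor = 800% overall.
physical_tb = 1.0          # real capacity in the array
array_allocated_tb = 4.0   # thinly provisioned to the server admins (400%)
vm_allocated_tb = 8.0      # four admins each turn 1 TB into 2 TB of thin VMs (200%)

array_ratio = array_allocated_tb / physical_tb            # 4.0
hypervisor_ratio = vm_allocated_tb / array_allocated_tb   # 2.0
effective_ratio = array_ratio * hypervisor_ratio          # 8.0

print(f"Effective over-allocation: {effective_ratio * 100:.0f}%")  # 800%
```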

The bigger problem I have seen with customers is that people often do not even know how much they have over-allocated.

Storage Operations Manager (SOM) provides out-of-the-box calculations of over-allocation across two levels of thin provisioning, presented for each data store. SOM also shows how much physical capacity is still available for expansion in the storage array.

Figure 3 - Virtual server analytics dashboard in SOM

Storage array vendors and hypervisor providers each have their own recommendations for using thin provisioning effectively. With SOM you can take control of thin provisioning, whether at the array level, the data store level, or thin-on-thin.

Do you know your over-allocation levels across arrays and data stores? Start your free trial today.


