Shifting to Software-Defined

Are software-defined data centers a cure-all for IT complexity?

GuestBloggerCDI

 

By guest blogger, Brian Knudtson, Senior Technical Marketing Manager

IT complexity is inevitable in the modern data center, and many IT professionals have built their careers on managing complex silos by specializing in particular areas of the IT stack (e.g., backup administrator, storage administrator). Today, many vendors and analysts tout the advantages of simplification; the reality, however, is that the IT industry is on a never-ending quest to hide complexity, either through process or through new technology. The problem is that as soon as one part of the environment is simplified, complexity creeps in elsewhere. As blogger Keith Townsend puts it, complexity drives IT professionals to “abstract away the abstraction, so that we normalize it.”

Will the industry ever find a “cure-all” for complexity? Maybe not, but the right approach may bring us closer to an answer. Many people believe this approach will be “software-defined.” Software-defined technologies offer many capabilities that make IT infrastructure simpler, including policy-based management, stretched networks for disaster recovery and workload mobility, and abstraction of management across data centers. As a whole, though, software-defined technologies are not always simple to implement or support.
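To make the first of those capabilities concrete: policy-based management replaces per-device configuration with declarative rules that the platform enforces on every workload attached to a policy. The sketch below is a minimal, hypothetical illustration of that idea; the class and field names are invented for this example and are not any vendor's actual API.

```python
from dataclasses import dataclass

# Hypothetical policy object; names are illustrative, not a real product API.
@dataclass
class StoragePolicy:
    name: str
    replicas: int                # copies of each workload's data to keep
    backup_interval_hours: int   # how often the platform backs the data up

@dataclass
class VirtualMachine:
    name: str
    policy: StoragePolicy

def desired_backup_jobs(vms):
    """Derive each VM's backup schedule from its attached policy,
    rather than from per-VM, per-device configuration."""
    return {vm.name: vm.policy.backup_interval_hours for vm in vms}

# One policy definition drives the configuration of every VM that uses it.
gold = StoragePolicy(name="gold", replicas=2, backup_interval_hours=4)
vms = [VirtualMachine(name=f"vm{i}", policy=gold) for i in range(3)]
```

Changing `backup_interval_hours` in one place changes the derived schedule for every attached VM, which is the simplification policy-based management promises.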

When software-defined technologies are delivered in a software-only model, complexity tends to be high. The vendor's development and support teams must account for every permutation of hardware and software that their customers might have in their data centers. The IT staff are then responsible for integrating all of those hardware and software components within their own unique data center, along with maintaining all of the cross-system interactions. All of this must be dealt with in the field, with whichever permutations apply to each environment.

However, “software-defined” doesn't necessarily mean “software-only,” or even based solely on commodity hardware. If the solution is tied to one specific set of hardware, variability diminishes and complexity tends to decrease. Converged and hyperconverged infrastructure, for instance, were initially brought to market as appliances precisely to reduce complexity and simplify both deployment and support. Conversely, a virtual desktop infrastructure is very much a software-defined infrastructure, yet it is far from simple to implement and regularly requires Graphics Processing Units (GPUs), which are not commodity components.

These software-defined data center solutions illustrate the modern proverb, “the software has to run somewhere,” which highlights how hardware decisions can make or break an architecture in the real world. Complexity can be driven out of the solution for both the vendor and the customer's IT staff by narrowing or prescribing the integration points and performing the integration in the factory instead of in the field. This creates a more stable and consistent experience for the end user.

In this way, hyperconverged technologies, like the HPE SimpliVity 380 powered by Intel®, reduce complexity by reducing the number of discrete devices needed in an environment and minimizing the integration and cross-system troubleshooting left to IT staff. Simplifying operational activities such as clone, backup, restore, and move, and providing pre-built integrations with existing management interfaces (vCenter, UCS Director, vRealize Automation), reduces graphical user interface and API complexity. By providing customers with an appliance that is pre-integrated with software, hyperconverged vendors strike a balance between the flexibility of software-only delivery and the simplicity of appliance-only delivery.

The industry is in a period of intense IT complexity, coupled with ever-increasing end-user demands. While not a given, implementing a software-defined data center strategy, like HPE SimpliVity hyperconverged infrastructure, can create a much simpler environment to manage. But simplicity usually depends on what is implemented and how it's managed. As consultant and contributing columnist Trevor Pott stated, “that's IT: every single aspect of it is a cyclical masking of complexity until we ultimately master it enough to truly commoditize it.” Hyperconverged infrastructure is a component of the software-defined data center and helps with this complexity by mastering the management of data and reducing the number of components needed in the data center.

To learn more about hyperconverged solutions, download the e-book: How Hyperconvergence Can Help IT.

Brian

Brian Knudtson is an IT industry expert and vExpert with 18 years of experience. He holds a BS in Computer Science from the University of Nebraska Omaha, and several certifications including VCAP-DCA, VCAP-DCD, VCP-DT, MCITP-Enterprise, and HP Master ASE.


