Servers: The Right Compute

To get the most out of big data, choose the right servers


Big data has been much on my mind recently. I keep seeing staggering numbers that illustrate the impact it’s having on businesses. According to a collection of surprising statistics about big data compiled by Baseline magazine, 90 percent of the world’s data has been created in the last two years, and enterprises store 80 percent of all data. For a typical Fortune 1000 company, just a 10 percent increase in the accessibility of data can produce an additional $65.7 million in net income.


Capitalizing on today’s rapidly growing data pools requires an infrastructure that’s up to the task, and data-driven organizations of all kinds are achieving remarkable results by upgrading their data centers. For a great example, see my recent blog about the impressive strides made by the National Renewable Energy Lab (NREL). An HP Apollo supercomputing platform delivered 1.2 petaflops of compute performance for NREL’s scientific efforts, and at the same time enabled the lab to achieve its sustainability goals for the project. What previously took a day now takes NREL an hour. (To learn more, watch this short video.)


Of course, you don’t have to be a world-renowned science lab to benefit from a server upgrade to power your big data initiatives. Whatever the needs of your data-driven organization – whether you’re dealing with mass content storage, structured and unstructured data, real-time analytics, or a multitude of databases – HP has an array of options to ensure your success. Choosing the right compute takes some thought; a great place to start is the video below:




Learn how other enterprises are using HP’s Optimized Compute portfolio here.

Rosemarie "Rosie" Chiovari
HPE Enterprise Security Solutions
About the Author


Many years in the computing world (I remember punch cards)