
Machine learning, AI, and the next wave of legacy systems


Guest blog post by Nicole Hemsoth, IT journalist and co-editor of The Next Platform


Artificial intelligence (AI), machine learning, and deep learning are the natural next step for Big Data. They're the evolution of what was considered revolutionary just a few years ago: gathering and making sense of large, previously silo-bound data volumes.

Prepping for the Big Data era

During the first wave of the Big Data era, companies were just getting a handle on the distribution, variety, and monetary potential of their many silos. The focus at the time was on integration and recognition—the simple understanding of what data resided where, and what parts of the organization were accountable for it. This was no small challenge, and it's still happening in companies with large, distributed infrastructures and global teams.


That first wave led to a host of new tools, including Hadoop, predictive analytics, and the many innovations up and down the stack designed to integrate, centralize, process, store, and analyze Big Data. From roughly 2013 until now, the entire enterprise data stack has been altered, from the hardware infrastructure to the various software underpinnings—well, almost all of it.


There are some radically new elements set to become part of the analytics stack for enterprises, but they're often matched with bulky legacy tools and frameworks. The fact is there's no way to capture the benefits of the new machine learning and AI capabilities without rethinking legacy infrastructure.


Many enterprise IT shops have already done the heavy lifting. The existing database, processing, and storage engines proved no match for the vast volumes of previously siloed data, so these organizations needed a new set of analytics tools. In many cases, those tools were designed with familiar analytical methods and interfaces in mind. Some shops made the SQL-to-NoSQL jump, others adopted Hadoop as the processing and storage framework, and still others migrated legacy databases into centralized stores and implemented more sophisticated visualization tools. But no matter which method of beefing up the infrastructure was chosen, if it was done in the name of becoming more "Big Data ready," chances are it made the organization "AI ready" as well.


Many analytics vendors are now scrambling to find ways to build the new wave of AI algorithms into their analytical packages. This is good news for system and IT managers who have already undertaken an extensive overhaul to prepare for Big Data tools and platforms. But some areas could still prove to be weak points—and some of them lie at the heart of the enterprise data strategy.

Shoring up the data warehouse for the machine learning revolution

As a representative example of legacy infrastructure facing new analysis methods, consider the beleaguered data warehouse.


The legacy beasts that keep enterprises tied to the past are numerous: from applications to storage systems to (gasp!) the system administrators who keep the status quo going because "Why fix what isn't broken?" For many organizations, the data warehouse is one of the forces maintaining the old guard. Sound familiar? After all, when the Big Data era kicked off in earnest, wasn't this same culprit to blame for many strategies being shot down on the first pass?


Many enterprises modernized their data warehouses to keep pace with analytics innovations. They looked to Hadoop, the range of new extract, transform, and load (ETL) tools that were flooding the market, and various other "silo-breaking" technologies. Overall, the data warehouse revisions were just enough to let organizations keep one foot planted firmly in the past, and the other tentatively in the future.


The problem is that the next generation of Big Data analysis and intelligence—the AI and machine learning revolution in the enterprise—requires another rethink of the data warehouse. The good news (and there is plenty) is that the potential ROI is far easier to trace than it was in the early, nebulous days of vague "Big Data gold mines." Organizations that took significant steps to shore up their data warehouses and other legacy infrastructures to meet the demands of the Big Data era will be better positioned to tackle the AI challenges in the years ahead.


But if you're stuck with a data warehouse that is more 2010 than 2015, you may have to go back and ride the Big Data wave of change before you can consider an AI initiative. And there's no better time to take the plunge, because the next big wave is about to break.


Read the Frost & Sullivan white paper to learn how the right infrastructure can prepare your data center for business disruptors.
