
Learning about learning: Artificial intelligence, machine learning & deep learning

How can artificial intelligence, machine learning, and deep learning transform our world? Experts discuss the differences between these technologies and where they see them headed long term.


Artificial intelligence (AI), machine learning (ML), and deep learning (DL): these terms crop up constantly online and in the popular and technical media. Depending on whom you ask, they either represent a tidal wave that will overwhelm every industry and company that doesn't rapidly adapt, or they're a more incremental innovation in IT. They may usher in new capabilities like speech recognition and image analysis, but they may also be a little overhyped today.

Which of these scenarios is closer to reality? And while we’re at it, what—if anything—are the differences between AI and machine learning and deep learning? Also, why does any of this matter to someone who’s otherwise interested in high-performance computing and IT at the cutting edge?

To answer the final question first: it matters a great deal if you lend any credence to the industry reports that see AI as a revolution in computing, arguably at least as important as parallel processing or even supercomputing. Some go further and compare AI to electricity, an utterly transformative technology that not only upends the way the world does business but also alters the very fabric of society.

In his book, The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity, futurist Byron Reese takes a bullish view on AI and its cousins machine learning and deep learning (whose differentiating features we’ll come to in a moment). “A few centuries ago, we systematized science and created unimagined prosperity,” he writes. “A few decades ago, we began making mechanical brains [via AI], and just a few years ago, we learned dramatic new ways to make those brains more powerful [via machine learning and deep learning]. It feels as if this very moment is the great inflection point of history.”

However, according to Paul Hahn, who was formerly with Cray and currently manages AI and analytics marketing at HPE, another approach might be to compare AI not to electricity but rather to the advent of the mobile phone.

“Over the long term, maybe ten to twenty years, AI is going to have a disruptive and transformative effect on work and cause displacement,” he says. “I’m hard-pressed to think of industries where it may not have some impact. But I think that in the nearer term, it’s going to change the tools people use for work in every industry — though maybe not the nature of work itself. I think it’s more akin to mobile networks. I think electricity changed the entire standard of living for people who had it. AI, I don’t think, will change standards of living.”

Of the three—artificial intelligence, machine learning, and deep learning—AI has the longest history, says Arti Garg, now head of advanced AI solutions & technologies at HPE.

“These terms get used really broadly,” she says of AI, ML, and DL. But a productive way of visualizing the field, she says, is to imagine a small circle nested inside a slightly larger circle, which is itself nested within an even larger circle.

The largest of those circles is AI. “AI is anything using a machine to do something that human cognition might do,” she says.

“I don’t think people should spend much time worrying about the term AI in general,” says Hahn. “It’s such a broad field. And there are lots of folks focused on advancing the state of the art in AI. Of more practical interest are two subsets of AI. One is machine learning — which picks up where statistics leaves off. It’s the notion of having a large body of data and using the machine to learn from the data to make predictions. The other is deep learning, which is a subset of machine learning. . . where you learn to make predictions based on experience.”

Think of machine learning, in other words, as a way of enabling a computer to see patterns in big data sets and then to apply those patterns to new data to draw inferences and act on those inferences.
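To make that concrete, here is a minimal sketch of that workflow in Python using scikit-learn (the article itself names no tools, and the customer data and churn labels below are invented purely for illustration):

```python
# A minimal sketch of the machine-learning workflow described above:
# learn patterns from existing data, then apply them to new data.
# Assumes scikit-learn is installed; the data is illustrative only.
from sklearn.ensemble import RandomForestClassifier

# Historical observations: [hours_used, support_tickets] -> churned? (1 = yes)
X_train = [[5, 0], [40, 1], [2, 3], [35, 0], [1, 4], [50, 2]]
y_train = [1, 0, 1, 0, 1, 0]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)          # "learn from the data"

# Apply the learned patterns to new, unseen data to draw an inference.
new_customer = [[3, 2]]
print(model.predict(new_customer))   # e.g., [1] -> likely to churn
```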

Deep learning, in turn, relies on simulations of the way human brain cells work (a.k.a. neural networks) as one way of implementing machine learning. In fact, the term "deep learning" is a little misleading: deep learning isn't so much deep as it is neural network-based.
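To show what "neural network-based" looks like in code, here is a tiny network sketched in PyTorch (an assumed framework; the article names none, and the layer sizes are arbitrary):

```python
# A minimal sketch of a neural network: layers of simulated "neurons"
# whose weighted connections are adjusted during training.
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Linear(4, 16),   # 4 input features -> 16 hidden units
    nn.ReLU(),          # nonlinear activation, loosely brain-inspired
    nn.Linear(16, 2),   # 16 hidden units -> 2 output classes
)

x = torch.randn(1, 4)   # one example with 4 input features
print(net(x))           # raw scores for each of the 2 classes
```

Stacking more such layers is what makes a network "deeper."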

At a practical level, Garg notes, deep learning is the most computationally intensive of the three, because deep learning is often used on extremely large data sets that people might not even think of as “data.” Images, video, sounds, and free-form text represent typical inputs in a deep learning setting.

Autonomous cars are a now-classic use case for deep learning. Recommender engines on e-commerce, online video, or music platforms, by contrast, are more typically machine learning, because the input to the algorithm already exists as traditional data records (e.g., Person A bought items X, Y, and Z; Person B bought others; and so on) and doesn't need as much neural network smarts to be processed.
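As a toy illustration of why that kind of input is already machine-friendly, here is a simple co-occurrence recommender in plain Python (the purchase data and the scoring scheme are invented for this sketch; real recommenders use far more sophisticated models):

```python
# A toy recommender: count which items are bought together, then
# suggest the most frequent co-purchases. Data is illustrative only.
from collections import Counter
from itertools import combinations

purchases = {
    "person_a": {"X", "Y", "Z"},
    "person_b": {"X", "Z"},
    "person_c": {"Y", "W"},
}

# Count how often each pair of items appears in the same basket.
pair_counts = Counter()
for basket in purchases.values():
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1

def recommend(item):
    """Suggest items most often co-purchased with `item`."""
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if item == a:
            scores[b] += n
        elif item == b:
            scores[a] += n
    return [i for i, _ in scores.most_common()]

print(recommend("X"))  # -> ['Z', 'Y']
```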

Deep learning systems, in particular, must be "trained." That means running the neural network simulations on a broad range of input — for example, teaching an autonomous driving program about traffic cones by feeding it millions of images and videos of cars driving around traffic cones in typical driving situations.
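A heavily simplified sketch of such a training loop, again in PyTorch (an assumption), might look like the following; the random tensors stand in for the millions of labeled images a real system would use:

```python
# A simplified training loop: show the network labeled examples,
# measure its mistakes, and nudge the weights to reduce them.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 2))  # cone / no cone
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Stand-in for a real dataset: batches of random 32x32 RGB "images" + labels.
loader = [(torch.randn(8, 3, 32, 32), torch.randint(0, 2, (8,)))
          for _ in range(10)]

for images, labels in loader:            # one pass over the data
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()                      # learn from the mistakes
    optimizer.step()
```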

“Preparing data is often the biggest bottleneck for organizations,” Garg says. “So we would like to provide a system that has compute and storage that allows you to prepare your data and then send it to training without having to physically move the data from one system to another. That’s how a lot of organizations have to do deep learning today.”

From making weather forecasts by the minute to finding smarter ways of discovering oil and gas deposits to streamlining the car insurance adjustment process via deep learning assistants, expect much more from these three nested realms of AI in the years ahead.

Hahn expects AI, machine learning, and deep learning to fan out further and more broadly across industries and use cases.

“It’s hard to imagine any application that won’t have some form of AI embedded in it,” Hahn says. “It’s going to be everywhere.”

 

This blog was originally published on cray.com and has been updated and republished here on HPE's Advantage EX blog.



Advantage EX Experts
Hewlett Packard Enterprise

twitter.com/hpe_hpc
linkedin.com/showcase/hpe-ai/
hpe.com/info/hpc

About the Author

AdvEXperts

Our team of Hewlett Packard Enterprise Advantage EX experts helps you dive deep into high performance computing and supercomputing topics.

Comments
Caroline Jeanmaire

Thank you for this thorough post on AI, machine learning, and deep learning. I'm sure many readers weren't clear on all the details you've shared here.

Ramesh Sampangi

A very clear and informative blog. I enjoyed reading it and appreciate the effort you put in. I hope you post more blogs again soon. Keep sharing!

datascience

The information shared in this post is informative and engaging. What I really liked about it is that the terms are explained in very simple language, so it's easy to understand even for beginners. Please keep posting such useful information regularly; it will benefit many more people like me.
