hpe blog uk & ireland

Artificial Intelligence, why now?

Mattab

I mentioned in my last article that Artificial Intelligence is a large field, and to have grown so large it must have been around for some time, which it has. The term Artificial Intelligence (AI) was coined in 1956 at the Dartmouth Conference in New Hampshire by John McCarthy, later a professor at Stanford but an assistant professor at Dartmouth at the time. You could argue that AI has deep roots in formal reasoning, but that would open up a huge philosophical debate and would probably take significantly more space and time to write and read!

So why now? If AI has been around for more than 60 years, what has changed in the past few years to propel it into the mainstream? The answer is that the perfect wave has hit: three things have come together to give us the adoption rates we see today. These three things are Big Data, better algorithms, and hardware that has finally caught up with the mathematics.

To understand why this has all happened at once, I will explore each of the forces that accelerated AI out of the AI Winter. Yes, AI has had its ups and downs over the years. Back in the late 70s, after much hype and research, the mountainous problems AI faced seemed insurmountable and things took a turn for the worse. As AI failed to live up to its promise, a frost descended: funding froze up, interest withered, and the first AI winter hit, and hit hard. AI was cast onto the pile of nice ideas that just couldn’t be achieved.

Then in the early 80s AI was back, and back with a vengeance; the sequel was going to be bigger, better, faster and stronger! Was summer finally here? Anyone who remembers ‘Expert Systems’ knows that, far from letting us feel the warmth of the sun on our skin, the second instalment in the franchise was significantly lacking: clunky, very expensive to implement and even more expensive to maintain. Winter was coming, again.

It looked like game over for AI, but thankfully it had reached cult status, and a number of dedicated followers kept the dream alive. AI played chess and watched TV, waiting quietly in the wings, but it wasn’t until 2012 that the waters thawed and AI got to the top of the perfect wave. Without a reboot, the franchise was back in the mainstream, and there are still plenty of instalments to come, because AI is a journey, not a destination.

What currents came together to create this wave? Big Data, better algorithms and affordable parallel computing power. Firstly, we have data and lots of it: we are generating an ever-increasing 2.5 quintillion bytes a day, but more importantly we have labelled data. Labelled data is data that we have described or tagged so that a machine can understand what it is. Secondly, we have better algorithms: academia has been working on AI for a long time, some fantastic work has been done to refine the thinking, and much of it is in the public domain. Lastly, we have better and more affordable hardware on which to run the mathematical models. All three of these things are connected.
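To make the labelled data idea concrete, here is a minimal sketch; the file names and tags are invented purely for illustration:

```python
# Labelled data: each raw input is paired with a human-assigned tag
# that a machine can learn from. (Records below are hypothetical.)
labelled_images = [
    {"file": "img_001.jpg", "label": "cat"},
    {"file": "img_002.jpg", "label": "dog"},
    {"file": "img_003.jpg", "label": "cat"},
]

# Unlabelled data is the same inputs without the tags -- far easier
# to collect, but far less useful for supervised learning.
unlabelled_images = [record["file"] for record in labelled_images]

print(labelled_images[0]["label"])  # the tag a model would learn to predict
```

The labels are what turn a pile of raw files into something an algorithm can be trained against.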

One of the main hardware breakthroughs came in 2012, when AlexNet, a neural network designed to recognise images, was first ported from a CPU to run on a GPU (Graphics Processing Unit). Since then, leaps and bounds have been made in the depth and complexity of the algorithms that can be run, and some of the largest names in the industry have been developing AI tools and techniques.


But it is important to remember that you need all three as a foundation on which to build an AI strategy. Start with your data, and lots of it: data that is correctly labelled, governed and controlled. More on data in a future blog, and no AI blog series would be complete without one on ethics too. The best place to source your data is from your business community: IT might store it, but it is the producers, consumers and owners of the data who understand it the most, and they are the ones you need to engage with. HPE, through our heritage from Hewlett Packard, has a huge amount of experience with data and storage going back to the 1970s. Getting this right is paramount to a successful AI journey. As the old African proverb goes, if you want to go quickly, go alone; if you want to go far, go together.

Not only do you need algorithms, you need the right algorithm, and there are a lot of decisions to make to be sure that your AI implementation achieves the best possible outcome. As I mentioned in my previous blog, AI is about complex mathematics, so your source data needs to be in the right format and in a machine-consumable form. Making AI a foundational part of a digital transformation is a great way of introducing this capability into your organisation. Building your own AI is challenging, so it helps to engage with an organisation, like HPE, that has the right skills, experience, tools and ecosystem to accelerate you on your AI journey. From delivering your AI projects to integrating this technology deeply into our own strategy, HPE has a long and deep heritage in AI.
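As a small illustration of what "machine consumable" can mean in practice, here is a sketch of one common preparation step, one-hot encoding, which turns human-readable categories into the numeric vectors the mathematics expects. The category names are invented for the example:

```python
def one_hot(value, categories):
    """Encode a category as a vector of 0s with a single 1."""
    return [1 if value == c else 0 for c in categories]

# Hypothetical categories: a model cannot consume the word "dog",
# but it can consume the numeric vector that represents it.
categories = ["cat", "dog", "bird"]
print(one_hot("dog", categories))  # [0, 1, 0]
```

This is only one of many such transformations; the broader point is that getting data into the shape the algorithm needs is a deliberate step, not an afterthought.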

Remember, buy or build, you need the appropriate infrastructure. At this level AI is more than hardware: it is a collection of frameworks, algorithms, platforms, libraries and languages that need to be integrated and managed in the right way to give the right outcome. At HPE we have done a lot of work building reference architectures, a benchmarking suite and performance guides to help you start your journey with the right capabilities in place.


Matt Armstrong-Barnes
Hewlett Packard Enterprise

twitter.com/HPE_UKI
linkedin.com/company/hewlett-packard-enterprise
hpe.com/uk

About the Author

Mattab

Matt is Chief Technologist for Artificial Intelligence in the UK&I and has a passion for helping customers understand how AI can be part of a wider digital transformation initiative.