Digital Transformation
"Analytics for the people" - Big Data 2020


Let’s jump to 2020 (our favourite year to jump to). And let’s follow Penny as she models a big data analysis.

Penny is the European Sales Operations manager for Smart Health Devices (SDH). She wants to create a solution that tells the European sales managers where help is needed on big deals. Sure, she could look in the sales force automation system and see what the sales reps think, but unfortunately, SDH’s sales reps are notoriously poor at setting an accurate probability of success for a deal.
The modelling system first presents Penny with a list of the possible data sources for the analysis. All the usual structured sources are there - orders, account visit reports, and sales reps’ deal predictions. Penny chooses to include these.
But Penny is also looking to mine sentiment about SDH in accounts with potential deals. She chooses to mine sentiment in SDH’s online support community (if a customer is bitching heavily about SDH in the support communities, it’s unlikely that that customer will place a large order).
She also chooses to mine tweets with various SDH-related hashtags.
She then calls a few of her “pet” customers and asks them where they would go to bitch about, and heaven forbid, praise, SDH’s products and services. She gets back a list of 15 sites, which she includes in the analysis.
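As an aside, the simplest possible form of sentiment mining is just counting positive and negative words. The sketch below is deliberately crude and entirely hypothetical - the word lists and posts are invented, and a real modelling system would use far more sophisticated natural language processing - but it shows the basic idea of turning a forum post into a score.

```python
# Toy lexicon-based sentiment scorer -- a crude stand-in for the NLP a
# real modelling system would apply to support-forum posts.
NEGATIVE = {"broken", "useless", "refund", "terrible", "crash"}
POSITIVE = {"love", "great", "reliable", "recommend", "excellent"}

def sentiment_score(post: str) -> int:
    """Return a crude score: positive words minus negative words."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

posts = [
    "The monitor keeps crashing. Useless. I want a refund!",
    "Great device, totally reliable. I'd recommend it.",
]
scores = [sentiment_score(p) for p in posts]  # negative, then positive
```

A strongly negative score across an account’s posts is exactly the early-warning signal Penny is after.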
Penny is no PhD data scientist, so she doesn’t know which analysis algorithms will give her the deal-close prediction she needs. Not a problem - she tells the modelling system the domain she is working in (sales ops) and what she is trying to achieve (deal-closure prediction). The modelling system lists the best algorithms. Penny has data from a couple of pursuits that actually resulted in real deals and a few that didn’t. She runs these through the different algorithms and chooses the analysis set that best matches what happened in reality.
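In spirit, Penny’s backtest looks something like the sketch below. Everything here is hypothetical - the deal features, the candidate “algorithms” (reduced to toy rules), and the data are stand-ins for whatever the modelling system would really offer - but it shows the principle: score each candidate against pursuits whose outcomes are already known, and keep the best.

```python
# Historical pursuits with known outcomes (features are invented).
history = [
    ({"visits": 9, "sentiment": 2},  True),   # deal won
    ({"visits": 2, "sentiment": -3}, False),  # deal lost
    ({"visits": 7, "sentiment": -1}, True),
    ({"visits": 1, "sentiment": 1},  False),
]

# Candidate "algorithms" -- toy decision rules standing in for whatever
# the modelling system would really propose.
candidates = {
    "visits_only":    lambda d: d["visits"] >= 5,
    "sentiment_only": lambda d: d["sentiment"] > 0,
    "combined":       lambda d: d["visits"] + d["sentiment"] >= 5,
}

def accuracy(predict, data):
    """Fraction of historical deals whose outcome the rule predicts."""
    return sum(predict(deal) == won for deal, won in data) / len(data)

# Keep the candidate that best matches what happened in reality.
best = max(candidates, key=lambda name: accuracy(candidates[name], history))
```

On this tiny history, the sentiment-only rule gets half the deals wrong, so the backtest steers Penny away from it - which is exactly the judgement she couldn’t have made from theory alone.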
Because Penny is not a data analysis expert, and because data analysis is far from an exact science, she is comforted to know that she can quickly go back into the analysis system and choose a different analysis set at a later time.
Penny’s “customers” are the European sales managers. She needs to ensure that her analysis results are presented in a way that is most helpful to her customers. The modelling system gives her a number of options. She chooses a map-based visualisation (see below).
[Screenshot: TITAN map-based visualisation]
Sales managers are rarely sitting at a laptop - they are either out on the road or in meetings. So Penny opts to create analysis clients for smartphones, tablets, and the web.
Penny “releases” her analysis to her sales managers. Over the next few weeks, she gets plenty of feedback. Much of it is positive, but she realises that the sentiment analysis modelling she did is missing some types of sentiment. She goes back and adjusts the analysis. She also realises that while the map visualisation is a great overview, she needs something more tabular for a country and deal double-click. She adds this.
These adjustments are very important. In a future blog, I shall go into a lot more detail on the idea that “everything is an experiment”. But for now, let’s just observe that Penny’s first release of analysis is not the final version. It’s an experiment, and the modelling system needs to allow Penny to quickly adjust her “experiment” based upon feedback. 
What happened here? Two things. Firstly, Penny created a “360 degree [big] data analysis”. And secondly, she did “analytics for the people” - the ability for subject matter experts to create their own 360-degree analyses. 
Let’s look at each of these two concepts in more detail. Let’s start with 360-degree big data analysis.
360-degree big data analysis
By 2020, we will routinely use traditional structured data, machine-to-machine data, and human interaction data in our data analyses. Let’s look at some examples of 360-degree analysis:
Banking Compliance: 360-degree big data analysis is already in use in banks. They look at structured information (trades versus stock movements and company announcements), and they analyse email and voice conversations between traders.
Retail: we might analyse structured sales data to see what’s trending up, what’s trending down (and thus may leave us with unsold stock), and for affinities - products that sell together. Affinities are important because they allow retailers to up-sell, cross-sell, and sell-with. Structured analysis also allows us to figure out “customer journeys” - the journeys customers take through a potentially multi-channel sales process before they buy from us.
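The affinity idea can be sketched very simply: count how often each pair of products appears in the same basket. The baskets below are invented, and a real retailer would compute support, confidence and lift over millions of transactions, but the principle is the same.

```python
from collections import Counter
from itertools import combinations

# Invented shopping baskets; each is a set of products bought together.
baskets = [
    {"razor", "blades", "foam"},
    {"razor", "blades"},
    {"blades", "foam"},
    {"razor", "blades", "gel"},
]

# Count co-occurring pairs (sorted so ("a","b") and ("b","a") collapse).
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

top_pair, count = pair_counts.most_common(1)[0]
```

The top pair is the strongest affinity - the natural candidate for a cross-sell or sell-with offer.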
[Diagram: 360-degree big data analysis]
But retail can also benefit from analysis of human interaction data. Structured data tells you about the products and services that you sell. It tells you nothing about those things that your competitors sell, but that you don’t. Social media can forewarn you: you’ll get a warning about “cool stuff to wear” in social media before you see it in structured data. As the proud owner of two teenage kids, I know that a lot of their fashion choices are based on trends created and fuelled in social media.
Social media can also tell you about the demographics of the customers who buy your product. For example, if you are aiming for 18 to 26-year-olds, and you get a ton of social media conversation of the form “I bought this item for my mum. She loved it and bought one for my grandmother too. My gran’s friend just loves it”, you know you are not hitting your target age group. (I don’t know of any retailers doing this, but the London Metropolitan Police use social media to test their community campaigns, and a car manufacturer analysed social media to understand the demographics of the people who bought their cars.)
Supply Chain: companies like Kokuba are already putting their structured data into big data analysis engines for faster analysis, allowing them to make “real-time” decisions. The increasing use of RFID tagging means that supply chain analysis will involve masses of machine-to-machine data processing.
But what about human interaction data? Successful companies have great supply chains in no small part because they work well with their supply chain partners. How do you know you are “working great with your partners”? How about analysing the sentiment of email and voice traffic between you and your partners? How about using “CRM augmentation” (see below) to improve your partner dialogues? How about using email and voice analysis of your partner conversations to ensure compliance on your and your partners’ part?
Police: structured analysis can be used to understand crime patterns and thus to target police resources and community programmes. I read the other day that thefts follow the same pattern as hunting animals - thefts occur around the same area until that area becomes “over-fished”, at which point the criminals move to another area until that, too, becomes over-fished. Apparently, criminals and hunting animals are both careful not to hunt their feeding areas dry.
But police forces are also using human interaction data, to understand what the public are annoyed about and to geographically isolate negative sentiment. (I’ve written about this in more detail in this blog post.)
Police also use social media sentiment analysis to gauge the effectiveness of the campaigns that they run. They might work with community leaders in a certain district and then mine the social media from that district to see if the campaign paid off. 
They also use social media to determine the real “community leaders”. Because of social media, you may have two different community leaders - one for the older people, who “talks to everyone”, and another for the younger people, who “tweets like crazy and has masses of followers”.
Augmented CRM: HP’s SaaS service desk, HP Service Anywhere, has really drunk the human interaction analysis Kool-Aid.
As a support engineer converses with a customer, Autonomy looks for meaning in the conversation and brings up related knowledge-base articles, incidents, and problems.
The system looks across all the incidents and problems for clusters around different topics. This allows the support desk to proactively attack issues that are “trending upwards”.
When a customer types text into the self-help system, that text is given to Autonomy and a set of keywords is returned. This allows customers to type in text that is close to, but doesn’t exactly match, the keywords the system has used to store information. It makes the self-help system much more useful.
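The near-miss matching idea can be illustrated in a few lines. To be clear, this is not how Autonomy works - it’s a toy sketch using Python’s standard-library difflib, with made-up keywords - but it shows why tolerating close-but-inexact input makes self-help more useful.

```python
import difflib

# Hypothetical keywords under which the self-help content is stored.
stored_keywords = ["password reset", "printer offline", "vpn access", "email sync"]

def match_keywords(user_text: str, cutoff: float = 0.6) -> list:
    """Return stored keywords that closely resemble the user's text,
    best match first, even when the spelling doesn't exactly match."""
    return difflib.get_close_matches(user_text, stored_keywords, n=3, cutoff=cutoff)

# A misspelled query still finds the right article.
hits = match_keywords("pasword resett")
```

With exact matching, “pasword resett” would return nothing; with similarity matching, the customer still lands on the right article.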
Why not use these same functions to augment any CRM system? For example, we are working with a social services agency to augment their CRM system in this way. Anyone with a CRM system that is important to them may want to follow suit.
IT fault and performance management: the latest generation of IT management systems takes all log, alert, performance, and service hierarchy data and uses it to better solve problems in today’s complex IT systems. Just as in retail, this is a huge step forward. (HP Operations Analytics is an example of such a system; I blogged about this product recently.)
In the future, however, these IT operations analytics solutions will also use human interaction data. They will derive meaning from the interaction between the testing team and the app dev team (Support desk : "This is a defect.” App dev team : "No, it’s a feature.”), between the service desk and application maintenance, and between the service desk and the customer. As a former quality manager, I know that these interactions often contain information that can help in quick diagnoses of customer problems.
Apparently, one of Mikhail Gorbachev’s favourite sayings was, “to see for yourself is to hear a thousand times”. In other words, if you really want to know what’s going on, you have to “get out there”. But with fast-moving, multi-national businesses, “getting out there” is very expensive and time-consuming. 360-degree analytics from structured, machine-to-machine and human interaction data is probably the next best thing.
Analytics for the people
Much of my life has been spent in product management. As a product manager, you need to answer all kinds of data-related questions. Many of these questions I couldn’t answer on my own - I needed IT to help me. Like all IT departments, ours was resource-constrained, and I thus became very familiar with the phrase, “you are on the list, but you are currently below the cut line”. If we want product managers, supply chain managers, campaign managers, police commissioners, sales ops managers, lab managers, and retail managers to mine and use all the data available to them, we must give them the ability to create their own big data analyses. Hence the term “analytics for the people”.
We have another problem that makes “analytics for the people” an imperative. In their excellent 2011 report, “Big data: The next frontier for innovation, competition and productivity”, McKinsey estimated that the US was as many as 190,000 data scientists short.
I like the phrase “analytics for the people”. I didn’t, however, coin it myself. It is the subtitle of a joint HP Labs / HP Software project whose objective is just that - to allow subject matter experts to create and play with their own big data analyses, in the way that Penny did at the top of this blog post. (A demo of the HP Labs prototype in action can be found here.)
The HP Labs / HP Software researchers believe that it’s not yet possible to create “analytics for the people” in a completely generic way. Instead, they believe we need to know what the subject matter expert is trying to achieve. So we may have one such system for chief marketing officers, another for sales ops managers, another for supply chain managers, and so on.
Analytics for the people and the cloud
Do Penny and her IT people really want to set up her analytics platform? Do they really want to worry about adding the latest analytics algorithm updates? Probably not - Penny’s company makes and sells medical analysis equipment, and data analysis is just a tool they use to achieve this.
I believe that we will see increasing use of cloud-based analytics platforms where bandwidth allows. When huge amounts of data need to be analysed (the telemetry from a fleet of trucks which are charged on a “thrust basis”, for example), the cloud may not be suitable because transferring all that data to the cloud is not worth it. But for analyses where the data sets aren’t huge, the cloud is a great solution.
For large data-set analyses, I think that we’ll see managed appliances - think of this as “on-site cloud”.
The product manager, sales ops manager, healthcare manager, retail manager, and supply chain manager of the future will routinely gain insight from the full 360-degrees of big data available to them. 
They will create the data analyses themselves. And because of this, they will treat their 360-degree big data analysis as an experiment - they will iterate on the data sources, the algorithms, and the visualisations until they have optimised the insights they get from the data.
My former product-manager self can’t wait for this future to arrive!
Want more?

For a lovely graphical listing of all Big Data 2020-related postings, please go to my Big Data 2020 web page.


To find out what HP Big Data can do for you today, please go to our HP HAVEn page.


Mike Shaw
Director Strategic Marketing


About the Author


Mike has been with HPE for 30 years. Half of that time was in research and development, mainly as an architect. The other 15 years have been spent in product management, product marketing, and now, strategic marketing.
