Transforming IT

Understanding Voracity: A Big Variable for Big Data

amosferrari on ‎02-08-2013 10:30 AM

In my previous blog, I wrote about the concept of the Big Data Refinery System, a key element in an IT transformation for Big Data. I would like to expand more on the concept of Big Data characteristics that are forcing the IT transformation to Big Data – and in particular, the concept of Voracity.


Big Data variables forcing Big Data technology adoption


Not all of the data we produce is “big data”. Forrester identified three main attributes that classify a data source as big data, later adopted by IDC, Gartner and others:


  • Velocity: the speed of data creation and the transport of streaming data, with milliseconds to seconds to respond; Velocity also covers ingestion and data cleaning.
  • Variety: data in many forms – structured, unstructured, internal/external, text, multimedia.
  • Volume: data quantity, scaling from terabytes to petabytes to zettabytes; volumes that traditional data management technologies cannot handle in time for consumption.

I want to stress this scenario and add to it a fourth variable: Voracity.


  • Voracity: the consumption, ingestion and processing of data in real time; velocity as it relates to consuming large amounts of data. Voracity reflects the information “appetite” of both the business side and the consumer side of the enterprise. The new capabilities that come from big data access and utilization will raise users’ requirements for information. For example, one possible future scenario might require users to access streaming video and related textual data to make better decisions.



Voracity: processing all data quickly


Voracity is the ability to process all data types quickly. As with Volume, Voracity is realized through scale-out or scale-up architectures, or a combination of the two. Hadoop runs MapReduce tasks across thousands of servers at once, allowing massive amounts of data to be processed in a short period of time. SAP HANA’s scale-out, in-memory architecture allows very fast response times. Similarly, Vertica’s columnar database is a scale-out architecture that allows real-time queries across data spread over tens or hundreds of servers. With its indexing and search capabilities across any data type, Autonomy provides extended capabilities for today’s information “consumerization”.
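To make the scale-out idea concrete, here is a minimal sketch of the map/reduce pattern that Hadoop distributes across thousands of servers. This is illustrative code only, not HP or Hadoop code, and it runs on a single machine; in a real cluster, each mapper and reducer would run on a separate node, which is what lets the volume of work grow with the number of servers:

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Each mapper emits (key, value) pairs independently, so documents
    # can be processed on different nodes in parallel.
    return [(word, 1) for word in document.split()]

def reduce_phase(pairs):
    # The shuffle step groups pairs by key; each reducer then
    # aggregates its own group (here, by summing the counts).
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

documents = ["big data needs big appetite", "big data moves fast"]
mapped = chain.from_iterable(map_phase(d) for d in documents)
print(reduce_phase(mapped))  # {'big': 3, 'data': 2, ...}
```

Because mappers never communicate with each other, adding servers adds throughput almost linearly, which is exactly the property a voracious consumption workload depends on.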


Voracity and the need for Big Data Services


As noted in a recent blog, Big Data services should start with users: with a description of service requests from the customers of IT. Just because I can ingest and process data rapidly does not mean I can consume all of that data rapidly, or at least rapidly enough.


What is rapid enough? That is defined by the use case. In this scenario, Voracity then becomes a key element in determining the characteristics of Big Data IT services, inserting users and data consumption into the Big Data scenario (see the figure below).


Voracity focuses on consumption, so it formally introduces the user or the functional organization into the equation; the traditional three V’s focus on production and processing, leaving a gap with regard to consumption.


I would like to start a discussion around the concept of Voracity as a key characteristic for a Big Data transformation of IT infrastructure – Voracity as a key variable when it comes to cloud and services.

What do you think of these ideas? Drop a comment in the box below and let me know your opinion. 



Key Takeaways


Today’s IT infrastructures that support data architecture need to evolve, and an IT transformation to Big Data services requires considering Voracity as a key transformation variable.


Enterprises’ IT departments need to understand the analytics requirements for Big Data and plan for the transformation of today’s infrastructure in order to provide Big Data services to the business.


Last December, at HP Discover in Frankfurt, HP announced some new choices within the HP Big Data Strategy Workshop. Click here to learn how the Workshop can help you build a roadmap for your Big Data infrastructure strategy, while reducing risk and accelerating decision-making.


Amos Ferrari

To learn how I can help your organization meet its growth objectives, see my HP Technology Expert profile.

About the Author


I’m a Global Strategist and a certified (PMI) Project Manager, specializing in business-to-IT alignment, agility consulting, infrastructure transformation, and strategic architecture for Big Data, Mobility, Private Cloud, and Unified Communications and Collaboration. I drive the strategy, vision and content of strategic consulting services in the Big Data IT infrastructure services area at HP. As part of this, I meet with senior-level customers to understand their challenges, conduct workshops to determine future vision and roadmaps, and present at industry and analyst events.
