The world is full of data, but as “Big Data” gets bigger, the task of quantifying it continues to spawn a multitude of solutions and complications.
The transformative potential of Big Data is now a well-worn trope, hence the furious paddling by organisations big and small to extract value from it.
So just how did we end up with so much data? And, more importantly, what can we do with it?
The building blocks of today’s data-enriched world rest squarely on the foundation of the internet. The connectivity that so many of us take for granted has forged a new pathway of interaction, with every action leaving a trail; every click and tweet is collected and dissected.
A glimpse of just how pervasive this data collection has become was provided by the current affairs program Four Corners this week. Big Data is about behaviour, and decoding the cues to that behaviour lies at the heart of realising its transformative potential.
However, a pell-mell race to adoption has its pitfalls.
When it comes to working with Big Data, most organisations don’t know which questions they need to ask, but Amazon Web Services’ lead data scientist Dr Matt Wood says that’s not as big a problem as one would imagine.
Dr Wood says that the static nature of traditional infrastructure has in the past forced organisations to limit the scope of their questioning. But as the walls of the data centre melt away, businesses can view the information at hand from a different perspective.
“What we found initially was that a lot of our customers scoped their questions based on the infrastructure they had. But in a utility-computing platform the data centre walls can disappear entirely.”
This allows organisations to take a far more granular approach – what information is useful, which questions drive a competitive advantage, and which information can be used to devise better services and tools.
When it comes to actionable insight, Dr Wood says that organisations need to feel comfortable exploring their data, and the tools now exist for them to make ad hoc queries.
“They don’t have to worry about the structure that they put around their data. They are able to quickly explore and more importantly quickly iterate on the questions that they ask of their data,” Dr Wood says.
The next step is to integrate customer information – purchasing data, social media traffic, clickstream and web log data. This tacit collaboration between the organisation and its customers is the crucible for developing viable solutions.
“Customers generate plenty of information and more often than not a lot of this data is just thrown away, logs get rotated out as you run out of storage and many organisations just don’t think about it.”
“That’s just leaving money on the table,” Dr Wood warns.
Pulling all the data sources together, implementing an ad hoc query process and having a suite of tools to turn queries into real time streams of data provides enormous potential for organisations. But there is a catch.
The quest for value requires wading through a deluge of metrics and there is an initial risk of getting bogged down in the detail. Breaking this analysis paralysis requires figuring out the usefulness of the metrics at hand.
Starting with a broad set of metrics is natural, but the trick is to move quickly past the nitty-gritty and focus on the metrics that provide actionable insight into how the business is performing and how its customers are behaving.
Beyond competitive advantage
Big Data and analytics, if used effectively, can provide a tangible competitive advantage. However, the trend is also providing a methodology to create brand new business opportunities, or to take the familiar competitive-advantage narrative to another level.
Dr Wood points to San Francisco–based Climate Corp, an AWS customer, as an example of a business that integrates its internal data with publicly available data and turns it into an actual product.
The company, founded in 2006 by former Google employees David Friedberg and Siraj Khaliq, crunches an enormous volume of data to provide deeply personalised policies to farmers to protect their crops.
We are talking about seriously big volumes. Decades of crop yield data and terabytes of weather and soil data are run through the analytics engine on a monthly basis. Tens of thousands of probable weather scenarios and trillions of data points are crunched constantly to recalculate the premiums.
What this enables the company to do is save a lot of money on claim validation: the modelling essentially reduces the need for a group of people to go out to the field and validate claims.
It keeps Climate Corp’s costs down, and the farmers share in the largesse. This isn’t about clicks; Climate Corp is putting Big Data and analytics at the core of its business model.
At some point, this type of number crunching is going to start delivering outcomes independent of a corporate point of view. The United Nations' Global Pulse team is using anonymous mobile phone data to track the movement of people in the case of a humanitarian crisis.
A recent example of this utility was the work done by the UN team during a cholera outbreak in Haiti following the earthquake last year. The team was able to use the mobile information to ascertain the ground zero of the outbreak and track the movement of people to target the aid response.
According to Dr Wood, the only limits to this sort of application are the data sets that haven’t yet been integrated. Data generation is no longer the constraint; a multitude of data points means more data sets to work with.
The advent of utility-based computing has taken the pressure off the infrastructure needed to store, compute and analyse the data. With those restrictions now lifting, organisations of every ilk are getting involved.
With the economies of the internet now favourable to data generation and data crunching, tethering “The Internet of Things” phenomenon to the Big Data equation could well deliver the promised transformation that we are all waiting for.