Computing the data tools of the future

Legacy IT systems just won't cut it when it comes to big data: the new age of computers will need to go beyond being glorified calculators and begin to learn and adapt from data.

In Australia, 16.2 million people are active in the digital universe, and 64 per cent of the population is predicted to own a smartphone in 2012. Globally, we create 2.5 quintillion bytes of data every day – so much that 90 per cent of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather weather information, social media posts, digital photos and videos uploaded online, transaction records of online purchases, and smartphone GPS signals, to name a few.

An estimated 294 billion emails traverse the internet each day, while around 48 hours of content is uploaded to YouTube in a single minute. According to market researcher IDC, the amount of global data in 2020 will be 50 times greater than current levels. We expect this growth to be even more pronounced in Australia as superfast broadband is rolled out to all corners of the country.

This data is big data. But the question is: are we ready for all this data? More to the point, are our systems equipped with the technology to manage it?

What is needed is technology that harnesses this flood of data, allowing businesses and consumers alike to make decisions based on quality analysis rather than experience and intuition alone. In doing so, we would take a more scientific approach to our businesses and lives, whether it's a doctor diagnosing a patient, a homeowner choosing the most efficient time to do their washing, or a meteorologist predicting the weather.

To this end, we need to build IT systems that can not only filter and store all this big data, but also make intelligent use of it. For more than 50 years, we have been operating with the same IT elements: processor, memory, storage, database, and programs. And we've designed IT systems to handle business process automation, long business cycles, and terabytes of largely structured data.

The problem is that this IT legacy won't cut it anymore. Data is only going to get bigger, and the only way technology will keep up is if computing gets smarter. Systems designed for transaction processing and structured data simply can't deliver the levels of performance both businesses and consumers are demanding now and will require in the very near future.

It is time to significantly shift the computing paradigm – from computers that calculate to computers that learn from and adapt to all data, structured and unstructured, such as emails, presentations, videos and more.

Last year, IBM's Watson, a high-powered question-answering system, showed the world what is possible when a finely tuned learning system tackles big data with advanced analytics, competing against and outsmarting two human champions on the Jeopardy! game show.

The success of Watson has given us a glimpse of the monumental shift in computing that will affect businesses in every industry and consumers around the world. Today, IBM and partners are putting Watson to work in industries from healthcare to banking. But that's just the beginning.

Future generations of optimised systems will benefit organisations across all industries as they deal with common and complex data centre issues. We are on the verge of expert integrated systems with built-in knowledge of how to perform complex tasks, based on proven best practices – systems that not only recognise changes in the computing environment, but also anticipate them. As workload demands spike, the systems respond. When new applications or upgrades are needed, they're deployed against best practices and integrated patterns.

In order to deal with the explosive growth of data, the systems that store it will have to become far more efficient and smart. They will do this by deploying advanced capabilities including universal storage virtualisation, compression, data de-duplication and automated tiering, which keeps data balanced for cost, speed and access patterns. In addition, next-generation integration technologies enable systems of integrated storage, networking and servers to make these capabilities easier to deploy.
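To make the de-duplication idea concrete, here is a minimal sketch in Python – a hypothetical toy, not any vendor's implementation. It splits files into chunks, identifies each chunk by its content hash, and never stores the same content twice. It assumes fixed-size chunks and an in-memory dictionary; production systems add content-defined chunking, compression and persistent indexes.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks; real systems often use variable-size (content-defined) chunking


class DedupStore:
    """Toy content-addressed store: identical chunks are kept only once."""

    def __init__(self):
        self.chunks = {}  # SHA-256 digest -> chunk bytes (each unique chunk stored once)
        self.files = {}   # file name -> ordered list of digests (the "recipe" to rebuild the file)

    def put(self, name: str, data: bytes) -> None:
        recipe = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            # Store the chunk only if its content has not been seen before
            self.chunks.setdefault(digest, chunk)
            recipe.append(digest)
        self.files[name] = recipe

    def get(self, name: str) -> bytes:
        # Rebuild the file by concatenating its chunks in recipe order
        return b"".join(self.chunks[d] for d in self.files[name])


store = DedupStore()
store.put("a.log", b"hello world " * 1000)
store.put("b.log", b"hello world " * 1000)  # identical content: no new chunks are stored
assert store.get("b.log") == b"hello world " * 1000
print(f"logical bytes: {2 * 12000}, unique chunks stored: {len(store.chunks)}")
```

Running it shows two identical 12,000-byte files sharing just three unique chunks between them – the kind of storage saving that de-duplication delivers at data-centre scale.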

As we rush to the future, generating, storing and managing ever more mountains of digital information along the way, it's time to start asking whether our systems can keep up with our insatiable demand for intelligence and convenience.

Francois Vazille is the general manager, systems and technology group, at IBM Australia and New Zealand.