Or the title could have been “Making accurate prophecies with the help of Big Data”… Well, although it seems a distant dream today, the computers of the future are expected to be able to foretell your fortune! And how will they do so? They say by studying Big Data, of course!
What is Big Data?
Big Data, as the name suggests, is a collection of an unimaginably huge amount of data in the form of text, videos, pictures, etc., which comes not out of nowhere but from every Facebook update we make, the pictures we upload to the net, or even the cookie transaction that takes place when we enter (almost) any site on the web. Plus, think of the innumerable news updates on bulletins, chats and conversations on forums, videos on streaming sites, blog posts, product reviews, GPS signals, climate readings, and the Google searches conducted each day.
An estimate by IBM says that about 2.5 quintillion bytes (that is, 2.5 × 10^18 bytes) of data are contributed to Big Data by us every day. About 90% of the world’s data has been created in the last two years alone.
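To get a feel for that number, here is a quick back-of-the-envelope check (the figures are just the IBM estimate above restated in more familiar units):

```python
# Scale check for "2.5 quintillion bytes a day".
bytes_per_day = 2.5e18                   # 2.5 x 10^18 bytes
exabytes = bytes_per_day / 1e18          # 1 exabyte = 10^18 bytes
terabyte_drives = bytes_per_day / 1e12   # 1 TB = 10^12 bytes

print(exabytes)               # 2.5 exabytes per day
print(int(terabyte_drives))   # 2500000 one-terabyte drives, filled daily
```

In other words, humanity fills roughly two and a half million one-terabyte hard drives every single day.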
How can Big Data help?
If you have managed to grasp how BIG the Big Data really is, then you must have guessed that such an enormous amount is practically impossible to process for the human mind, or even for the most monstrous PCs out there. So why is Big Data so important, and who is processing it to skim some cream off the mess?
These days, big corporations and research institutes are employing supercomputers on a large scale to chew through huge chunks of Big Data and draw out some of the most interesting patterns, patterns which tell a lot about the world and humanity in general. The conclusion drawn is that such patterns will not only yell out lesser-known facts about the present (because it’s you they want to keep track of), but also lift the curtain on what humanity has always longed to know about: the future!
For now, the predictions would cover more “prominent” events like natural calamities, social revolutions, wars or attacks (things that can be prepared for or prevented), and not pin-point occurrences like the score you will rack up in your next round of Angry Birds :-).
Do you know, attempts to predict future events by feeding many past years’ news to a supercomputer have already been surprisingly successful… (Did I mention that such attempts have been made since 2011?) Be it predicting the upsurge of revolutions in Tunisia and Libya, or the fall of Mubarak from power, all have already been successfully “predicted” by an 8.2 teraflops supercomputer called “Nautilus”, housed at the University of Tennessee, which made news back in 2011. (I admit, they were predicted retrospectively, but if the machine manages to draw conclusions about 2010–11 just by reading news headlines from up to 2009 only, it is no less than a miracle.)
Data scientist Kalev Leetaru is one of the foremost proponents of the emerging field of predictive supercomputing. His research helped usher in the era of “petascale humanities,” where computers can identify useful or interesting patterns if provided with sufficiently large data repositories.
Leetaru amassed a collection of over one hundred million articles from media outlets around the world, spanning 30 years, with each item translated and tagged for geography and tone. Leetaru then analyzed the data with the shared-memory supercomputer Nautilus, creating a network with ten billion items connected by one hundred trillion semantic relationships.
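The core of that preprocessing is tagging each article with a tone score and the places it mentions. A minimal sketch of the idea is below; the word lists, country list, and scoring rule are illustrative stand-ins I made up, not the dictionaries the actual study used:

```python
# Toy "tone and geography" tagging, loosely in the spirit of the pipeline
# described above. All word lists here are hypothetical examples.

POSITIVE = {"stable", "growth", "peace", "agreement", "progress"}
NEGATIVE = {"protest", "violence", "crisis", "unrest", "crackdown"}
COUNTRIES = {"egypt", "tunisia", "libya", "saudi arabia"}

def tag_article(text):
    """Return (tone, countries) for one article.

    Tone = (positive - negative word count) / total words,
    a crude version of dictionary-based tone scoring.
    """
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    tone = (pos - neg) / max(len(words), 1)
    lowered = text.lower()
    places = {c for c in COUNTRIES if c in lowered}
    return tone, places

tone, places = tag_article("Protest and unrest spread in Egypt amid the crisis")
print(round(tone, 3), places)  # prints a negative tone and {'egypt'}
```

Run over a hundred million articles, scores like these become a time series of “national mood” per country, which is what the forecasting step works with.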
The 30-year worldwide news archive was part of a 2011 study called Culturomics 2.0: Forecasting large-scale human behavior using global news media tone in time and space. The findings were impressive, pointing to a degree of predictive ability greater than chance would account for.
The events that could be predicted included the revolutions in Tunisia, Egypt and Libya, among them the removal of Egyptian President Mubarak. The corpus also correctly anticipated a period of stability for Saudi Arabia.
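The signal behind these predictions was, roughly, a country’s news tone dropping unusually far below its own historical baseline. Here is a minimal sketch of that kind of anomaly check, assuming a monthly tone series; the numbers are invented for illustration and this is not the study’s actual model:

```python
import statistics

# Flag periods where tone falls unusually far below its recent history,
# a toy version of the "tone dip" signal described in the text.

def flag_tone_dips(tone_series, window=6, k=2.0):
    """Return indices where tone drops more than k standard deviations
    below the mean of the preceding `window` values."""
    flags = []
    for i in range(window, len(tone_series)):
        hist = tone_series[i - window:i]
        mean = statistics.mean(hist)
        std = statistics.pstdev(hist)
        if std > 0 and tone_series[i] < mean - k * std:
            flags.append(i)
    return flags

# Hypothetical monthly tone values: roughly flat, then a sharp dip.
monthly_tone = [0.1, 0.12, 0.09, 0.11, 0.1, 0.13, 0.11, -0.4]
print(flag_tone_dips(monthly_tone))  # prints [7]: only the sharp dip is flagged
```

The point of a relative threshold like this is that every country has its own baseline level of negative coverage; what mattered in the study was a departure from that baseline, not negativity in absolute terms.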
Leetaru takes this to mean that it’s possible to predict major upheavals, like the Arab Spring, with some degree of confidence.
“It’s like a weather forecast,” he said in a Kernel article. “A 70 per cent chance of rain tomorrow means that maybe it does not rain, but it’s probably worth bringing an umbrella, because strong conditions for rain are there.”
While reading all this, did you notice that the genius supercomputer was not yet introduced to the actual “Big Data”, but only a part of it: the news entries? Imagine what would happen if it got the opportunity to read and process all the data that is online! If this happens, it will not merely be reading the internet, it will be reading each one of us. It will state facts and make prophecies that affect our day-to-day (nay, minute-to-minute) lives.
The obstacle? Well, Big Data is so fat that perhaps not even Nautilus can examine and crunch all of it at once. The idea that future-prediction will deepen its roots into our everyday lives is still quite a dream. But hasn’t technology, in all these years, proved itself by always pushing the limits, always redefining innovation? I think we can safely cross our fingers for the day when we step out and our smartphones tell us how the day ahead is going to be.