Google has formally launched its own entry into the big data market, officially releasing its cloud-based tool BigQuery. The search giant had been testing the tool in private beta since November, and it is now open to all comers, for a price, the company said in a blog post.
BigQuery can handle terabytes of data, and although Google is now charging for the service, it is letting users store up to 100 gigabytes for free.
This is particularly interesting because Google invented the techniques that led to the big data revolution. Years ago it published technical papers describing how it handles massive volumes of data so quickly. Others read those papers, applied those techniques, and built their own versions. The big data revolution was born.
Today big data is one of the hottest technologies around. Market research firm IDC predicts companies will spend $16.9 billion on big data products and services by 2015, compared to $3.2 billion in 2010. Google wants its share of those billions.
Big data refers to a combination of technologies that can search and analyse massive amounts of information nearly instantly, no matter what format it is in: tweets, posts, e-mails, documents, audio, video.
Google thinks BigQuery beats the alternatives because it’s so easy to use: by hooking into a single interface, users get access to Google’s powerful data centres. Alternatives like Hadoop take considerable expertise to set up, and you still have to run them on hardware somewhere.
BigQuery is priced affordably too, at least for a six-month “introductory” period: 12 cents per gigabyte stored per month, and 3.5 cents per gigabyte processed per day.
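At those rates, estimating a monthly bill is simple arithmetic. The sketch below uses hypothetical workload numbers (they are not from Google), and it assumes the free 100 gigabytes of storage is deducted before billing, which the announcement does not spell out:

```python
# Rough monthly cost estimate at BigQuery's introductory rates.
# Workload figures and the free-tier deduction are assumptions for illustration.

STORAGE_RATE = 0.12      # dollars per gigabyte stored, per month
PROCESSING_RATE = 0.035  # dollars per gigabyte processed, per day
FREE_STORAGE_GB = 100    # free storage allowance mentioned in the announcement

def monthly_cost(stored_gb, processed_gb_per_day, days=30):
    """Estimate one month's bill: billable storage plus daily query processing."""
    billable_storage = max(stored_gb - FREE_STORAGE_GB, 0)
    storage_cost = billable_storage * STORAGE_RATE
    processing_cost = processed_gb_per_day * PROCESSING_RATE * days
    return storage_cost + processing_cost

# Hypothetical workload: 500 GB stored, 40 GB of queries processed each day,
# which works out to roughly $90 for the month.
print(round(monthly_cost(500, 40), 2))
```

Under these assumptions, storage under the free allowance costs nothing, so a user keeping 50 gigabytes and running no queries would pay nothing at all.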