Big Data and Analytics

by Jennifer R
The author talks about how the business intelligence tools used to interpret data are changing as people continually work on adjusting databases to process massive amounts of data. Traditionally, the data to be analyzed was relational and stored in cubes, with retrieved information "delivered as standard reports". There is now a demand for tools with "data discovery" properties, which work with near-real-time data "to create ad hoc reports and graphs". The location of data storage is changing in conjunction with the analytic tools. Databases used to be stored on disk because their size made storage in RAM impractical. Improvements have made database storage in memory feasible; such databases are described by the author as in-memory databases. The article also talks about "predictive analytics" tools, which "try to anticipate what will happen based on trends they spot in the data".
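To get a feel for what "anticipating what will happen based on trends" could mean at the simplest level, here is a minimal sketch in Python: it fits a straight line to some hypothetical monthly sales numbers (the data and function names are my own, not from the article) and extrapolates one step ahead. Real predictive analytics tools are far more sophisticated, but the basic idea of projecting a spotted trend forward is the same.

```python
def linear_trend(ys):
    """Least-squares fit of y = slope*x + intercept over x = 0..n-1."""
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
    den = sum((x - mean_x) ** 2 for x in range(n))
    slope = num / den
    intercept = mean_y - slope * mean_x
    return slope, intercept

def predict_next(ys):
    """Anticipate the next value by extending the fitted trend line."""
    slope, intercept = linear_trend(ys)
    return slope * len(ys) + intercept

monthly_sales = [10, 12, 14, 16]  # hypothetical data with an upward trend
print(predict_next(monthly_sales))  # projects the trend one month ahead
```

With the perfectly linear data above, the fitted slope is 2 per month, so the projection continues the pattern.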

It is interesting that the article talks about how the tools used to analyze database information are changing, and not just the databases themselves. We focused on databases in class, but not much on the tools used in conjunction with them. The article also talked about storing information in RAM. As I mentioned in a previous blog, I would be worried about the volatile nature of memory such as RAM and would prefer to have a backup on disk somewhere.
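The worry about volatility and a disk backup can be illustrated with a small Python sketch using the standard-library sqlite3 module, which supports both in-memory databases and an online backup to a file. The table and filename here are my own invented examples, not anything from the article; production in-memory databases have their own persistence mechanisms.

```python
import sqlite3

# An in-memory database is fast, but its contents vanish if the process dies.
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE sales (region TEXT, amount REAL)")
mem.execute("INSERT INTO sales VALUES ('east', 100.0)")
mem.commit()

# Periodically copying the whole database to disk guards against RAM volatility.
disk = sqlite3.connect("sales_backup.db")  # hypothetical backup filename
mem.backup(disk)
disk.close()
mem.close()

# After a crash, the on-disk copy can be reopened and queried.
restored = sqlite3.connect("sales_backup.db")
rows = restored.execute("SELECT region, amount FROM sales").fetchall()
print(rows)
restored.close()
```

This is roughly the hybrid I had in mind: keep the working copy in memory for speed, but snapshot it to disk so a power loss does not wipe out the data.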

Source: Miller, M. (2012). The Challenge of Analytics in a Big Data World. PC Magazine. Retrieved June 3, 2012, from