by Alex L
The exponential growth of computer-generated data has become increasingly difficult to manage. Big data refers to collections of data so large, complex, and variable that traditional data management techniques cannot handle them reliably. The average person today generates large amounts of data daily through mobile devices, computers, and Internet activity, and all of these bits and details of data add up to a significant amount.

Big data challenges are characterized by three dimensions: volume, velocity, and variety. Volume is big data's greatest challenge: even when companies can store vast amounts of data, its sheer size prevents them from processing it into meaningful information. Velocity is the speed at which data flows, and some organizations' servers cannot handle the increasing demand. In addition, large amounts of unstructured data such as photos, audio, and video have begun to flood in from the multitude of social networking outlets such as Facebook, Twitter, and YouTube, streamed in real time to billions of users. Within the past decade, the proliferation of mobile devices such as cell phones and tablets has further increased the demand for data services. Lastly, the immense variety of data types makes organizing and interpreting such data cumbersome. Cisco forecasts that by 2017, annual global data center IP traffic will reach 7.7 zettabytes, or 7.7 billion terabytes.
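To put the scale of that forecast in perspective, the unit conversion can be checked directly. The sketch below (the 7.7 ZB figure comes from the Cisco forecast cited above; the conversion assumes decimal SI prefixes, where one zettabyte is 10^21 bytes and one terabyte is 10^12 bytes):

```python
# Sanity-check the Cisco forecast figure by converting zettabytes
# to terabytes, using decimal (SI) prefixes throughout.
ZETTABYTE = 10**21  # bytes (SI prefix "zetta")
TERABYTE = 10**12   # bytes (SI prefix "tera")

annual_traffic_zb = 7.7  # forecast annual data center IP traffic, in ZB
annual_traffic_tb = annual_traffic_zb * ZETTABYTE / TERABYTE

print(f"{annual_traffic_zb} ZB is about {annual_traffic_tb:.1e} TB")
# -> 7.7 ZB is about 7.7e+09 TB, i.e. 7.7 billion terabytes
```

Note that the result would differ if binary prefixes (powers of 1024) were mixed in, which is a common source of discrepancies when such figures are quoted.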