Data

Controversies over Wi-Fi Data {3}

by Jim J
 

The FCC recently cleared Google of wrongdoing regarding the data breach that occurred two years ago. Google was accused of harvesting data from Wi-Fi networks with its “Street View” cars. While gathering data about people’s wireless networks, Google also inadvertently captured private information such as passwords and email addresses. Though this was not the purpose of the collection, it happened because of the sheer volume of data being gathered. And to top it off, the collection was ruled not to be a crime because the data traveled over public airwaves that anyone can access; the argument is that it is similar to broadcasting sensitive information over a radio frequency that anyone can tune into. The final point is that the only way for personal information to be truly secure is through encryption and password protection. read more...
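That closing point about encryption is easy to make concrete. Below is a minimal sketch, assuming Python’s third-party cryptography package (which the article does not mention), of encrypting data before it travels over any network, open Wi-Fi included.

```python
# A minimal sketch (not from the article) of symmetric encryption with the
# third-party `cryptography` package: data encrypted like this is unreadable
# to anyone sniffing the airwaves without the key.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # shared secret; keep this off the air
cipher = Fernet(key)

token = cipher.encrypt(b"password=hunter2&email=user@example.com")
print(token)                         # ciphertext is safe to transmit over open Wi-Fi

plaintext = cipher.decrypt(token)    # only a holder of `key` can recover this
print(plaintext)
```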

Our Information’s Importance to Corporations {Comments Off on Our Information’s Importance to Corporations}

by Tyler K
 

In the Guardian article chosen, the author describes how Wal-Mart recently purchased Facebook’s popular “Social Calendar” application, and the implications of that purchase: the corporation now has access to the application’s millions of users and to all of the friends those users may refer to through it. The author goes on to describe the future role of information, and how data about people, as consumers and in other roles, is quickly becoming a massive component of the modern world. The article details how important information really is: “Facebook’s projected $100bn value is based on the data it offers people who want to exploit its social graph” (Krotoski, 2012). Essentially, the success of the social media site relies heavily on offering up all of the information users believe is only being shared with friends. This leads to a discussion of the future of individual privacy and what having information online might lead to; it even refers to my previous blog about Target analysts discovering a woman is pregnant before even her father knows! The author uses the phrases “aggregated data” and “Big Data” to describe what is happening to all of the information about us online: powerful organizations are collecting various facets of data about individuals, whether to market to them better or to decide whether to hire them (several recent cases involve companies asking for Facebook passwords, or simply collecting the information on their own), and one example even details using social media to gather information on a man who attempted murder! read more...
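To make the “aggregated data” idea concrete, here is a toy illustration (names, categories, and weights invented, not taken from the article) of how individually harmless purchase records, combined per person, start to support the kind of inference the Target example describes.

```python
# A toy illustration of data aggregation: individually harmless records,
# combined per person, begin to support inferences about that person.
from collections import defaultdict

purchases = [
    ("alice", "unscented lotion"),
    ("alice", "prenatal vitamins"),
    ("alice", "cotton balls"),
    ("bob",   "video game"),
]

# Hypothetical signal weights a retailer's analysts might assign.
signals = {"unscented lotion": 1, "prenatal vitamins": 3, "cotton balls": 1}

scores = defaultdict(int)
for person, item in purchases:
    scores[person] += signals.get(item, 0)

for person, score in scores.items():
    print(person, "pregnancy-prediction score:", score)
```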

Rising Data Breaches in Hospitals {Comments Off on Rising Data Breaches in Hospitals}

by Jim J
Figures on data breaches at hospitals continue to go up this year. Compared with 8% in 2008, compromises in data security have gone up to a staggering 31%. Few of these problems are caused by human error; instead, the reason for the rise in data security breaches is the growing use of portable devices, including both laptops and handheld devices, in the medical industry. With the plethora of devices and applications available nowadays, regulation is simply too slow to keep up with technology and to update policies that control the proper handling of data. And last, more companies outsource work to third parties, which further complicates the policies they follow to protect user data. read more...

Obtaining Data Faster Through Data Virtualization {1}

by Renee L
In his article “Using Data Virtualization to Get Data to Users Faster,” Ron Powell discusses the benefits of data virtualization and self-service reporting with Sanjay Bhatia, CEO and founder of Izenda. Izenda is the market-leading ad hoc reporting software company, providing real-time dashboards and data virtualization. The company’s software simply lets users create and customize their own reports. With data virtualization, Izenda allows users to connect to many data sources simultaneously and gives them access to real-time data. That means users can look at a dashboard, use the data to make changes, and see the result before anyone else does. The company also keeps its customers compliant, ensuring that data handling corresponds to the Sarbanes-Oxley and Dodd-Frank Acts. Because data volumes are getting much bigger than they used to be, security is very important. The software provides security by letting customers define business rules; once new data comes in, the business rules are applied automatically. In addition, Izenda uses Microsoft SQL Server, which allows users to “achieve 10 TB level databases that don’t require DBAs to optimize the database” (Powell, 2012). It also gives them the ability to create data stores without having to spend hours setting up the databases. Izenda is also “browser-based, HTML 5, works on tablets, smartphones, and iOS” (Powell, 2012). Overall, the software provides many benefits to its users and is really easy to use. read more...
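The core idea, querying several live sources in place instead of first copying everything into a warehouse, can be sketched in a few lines. The sources, file names, and schemas below are hypothetical and are not from Izenda or the article; the sketch only shows the shape of a federated, query-time join.

```python
# A rough sketch of the data-virtualization idea described above: query two
# live sources in place and join them on demand, instead of first copying
# everything into a separate warehouse.
import sqlite3
import pandas as pd

def virtual_sales_report():
    # Source 1: an operational SQLite database (stand-in for any SQL source).
    with sqlite3.connect("operations.db") as conn:
        orders = pd.read_sql_query(
            "SELECT customer_id, amount, order_date FROM orders", conn
        )

    # Source 2: a flat file maintained by another team.
    customers = pd.read_csv("customers.csv")  # columns: customer_id, region

    # The "virtual" view: joined at query time, so the dashboard always
    # reflects the current data in both sources.
    report = orders.merge(customers, on="customer_id")
    return report.groupby("region")["amount"].sum()

print(virtual_sales_report())
```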

CA Technologies-The Future of Data Modeling in the Cloud {1}

by Alexander H
John Heywood once said “two heads are better than one.” In our ever-growing world of technology, this quote is quickly becoming the norm for how products are created. From simple brainstorms to application development, thinking in groups allows a broader range of ideas to flourish. This concept is being implemented by CA Technologies, the developer of our ERwin Data Modeler, with the help of the cloud. The ERwin software offers a user-friendly approach to data modeling and is widely used for managing data structures. However, CA Technologies wants to take this tool one step further with three releases that aim to take data modeling into the cloud. The first update enabled “integration and management in Microsoft’s cloud environment,” which keeps the data in a single place when it is shared in the cloud. The second release in the trio is the CA ERwin Web Portal, an “interface that allows for data architects to build their database models” in such a way that business and IT users can access the material. Finally, ERwin Data Modeler r8.2 was altered to accommodate the sharing premise, for example through the availability of concurrent licensing agreements.
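For readers who have not used the tool, ERwin’s basic job is turning a logical model into a physical schema. The sketch below is only an illustration of that step, with hypothetical tables rather than anything from the article: a simple one-to-many model realized as DDL and executed against an in-memory SQLite database.

```python
# A hedged illustration (not from the article): the kind of physical schema a
# simple two-entity logical model might forward-engineer to.
import sqlite3

DDL = """
CREATE TABLE department (
    dept_id   INTEGER PRIMARY KEY,
    dept_name TEXT NOT NULL
);

CREATE TABLE employee (
    emp_id    INTEGER PRIMARY KEY,
    emp_name  TEXT NOT NULL,
    dept_id   INTEGER NOT NULL,
    FOREIGN KEY (dept_id) REFERENCES department (dept_id)
);
"""

with sqlite3.connect(":memory:") as conn:
    conn.executescript(DDL)  # one-to-many relationship realized as a foreign key
    print("schema created")
```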
Since this is one of the programs we use in class, it is worth noting that it is a professional tool that successful businesses rely on to model their databases. I believe this article highlights the importance of cloud computing and how it can revolutionize how a company models its data. A group of database architects may be able to build a great database on their own; however, if the development of the database were distributed and shared across various companies, whether they are sister companies or simply have a similar focus, ideas from all areas could benefit the company immensely. This also leads to more consistent data modeling across companies and helps break free from the confines of a single company in search of new and fresh ideas.
The main concern I have, as noted by senior director Donna Burbank, is the security of such information in the cloud. The models created could be potent in the hands of competitors if they ever got hold of the information. I believe that over time, the security issues with cloud computing will be addressed and the platforms made more secure. This key element will open the lines for worldwide collaboration, and I think CA Technologies is taking a step in the right direction as technology progresses. read more...

Psychic smartphones ??? {2}

by Abel R
For this week, I came across an article titled “Your smartphone knows you better than you do” on infoworld.com. As a smartphone owner and user, this caught my interest because it involves how companies can use what I presumed was private information to their advantage. In the article, researchers are studying how smartphones can be used to predict human behavior. They have accomplished this by studying patterns of how and when people use their smartphones. The article also reveals how this kind of research can be used in the business world; for example, wireless carriers can use this data analysis to determine who is more likely to jump carriers (Cringely, 2011). read more...
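As a rough idea of what the carrier example might look like in practice, here is a toy sketch with scikit-learn; the features, numbers, and labels are invented for illustration and are not from the article or the research it describes.

```python
# A toy sketch of how a carrier might model "likely to jump carriers"
# from smartphone usage patterns (all data invented for illustration).
from sklearn.linear_model import LogisticRegression

# Each row: [calls per day, average data use in MB, months on current plan]
usage = [
    [12, 350, 30],
    [3,  900,  4],
    [20, 150, 48],
    [5,  700,  6],
    [8,  400, 24],
    [2, 1100,  2],
]
churned = [0, 1, 0, 1, 0, 1]  # 1 = left the carrier

model = LogisticRegression().fit(usage, churned)

# Score a current subscriber: estimated probability they will switch carriers.
print(model.predict_proba([[4, 850, 5]])[0][1])
```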

What is Web Analytics {2}

by Jasmine C
The article I read talked about web analytics, the process people use to gather statistics about how their website is used. This kind of tool can help site owners understand their users, which can then help them decide what those users want and need. To learn more about what users think, their feedback can be documented with surveys, for example through SurveyMonkey. The article then goes on to discuss web analytics tools such as Google Analytics, Coremetrics, and WebTrends. Google Analytics came out in 2006 and has been doing well ever since its release. The article also stated that if you are using analytics, you should have a privacy policy that tells users what personal data they may be providing upon request, what information could possibly be collected without their knowledge, how the company may use this information, and so on. That way, users have an idea of what kind of information will be gathered from them, and if at some point they want to opt out of the analytics program, they should be able to. Google Analytics has good privacy policies in that they are “clear, direct, and intentional” (Marek 8). read more...
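Under the hood, the basic numbers such tools report, page views and unique visitors, can be computed from an ordinary web server access log. The sketch below assumes a common-log-format file at a hypothetical path; it illustrates the idea and is not how Google Analytics itself works.

```python
# A minimal sketch of the kind of statistics a web analytics tool gathers,
# computed from a common-log-format access log (path and format assumed).
from collections import Counter

page_views = Counter()
visitors = set()

with open("access.log") as log:
    for line in log:
        parts = line.split()
        if len(parts) < 7:
            continue
        ip, path = parts[0], parts[6]   # common log format: ip ... "GET /path ..."
        page_views[path] += 1
        visitors.add(ip)

print("unique visitors (by IP):", len(visitors))
print("top pages:", page_views.most_common(5))
```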

No More Data Modeling? {4}

by Renee L
With more advanced and faster database technologies, some people say there is no longer any need for data modeling. In addition, organizations with agile development environments often do not use data modeling; in fact, it is frequently not mentioned at all. However, data modeling expert and consultant Len Silverston argues that data modeling is more important now than ever. The purpose of data modeling is to understand the data requirements, build a foundation for the design, and allow data to be integrated. If organizations do not have a data model, how are they going to understand the data and the foundation of the database and the business? Silverston also recommends applying modeling techniques to agile development environments precisely because things happen so quickly there. Modeling fits agile development’s focus on delivering solutions quickly: a data model helps teams produce results faster and prevents a mess, because people understand how everything fits together, along with the specific data requirements and design of the database. read more...

The Data Modeling Approach {2}

by Anthony T
The article focuses on data modeling as a way to improve the effectiveness of businesses. It outlines strategies through which businesses can benefit: reduced marketing costs, higher revenue, better profitability, and competitive advantage. Predictive modeling tools, according to the article, can be especially useful for analyzing the factors that are directly related to product profitability. The author also mentions the reliability and accuracy benefits for supplier relationship management when building data models. The article also discusses the accessibility of modeling tools and how they are no longer only at the disposal of statisticians and experts; modeling tools and data analysis are becoming more and more available for marketers to use. read more...
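As a toy illustration of the profitability analysis the article mentions, the sketch below fits a simple regression over invented factors and numbers; none of the data or features come from the article.

```python
# A toy sketch of a predictive model relating marketing spend, price point,
# and competition to product profit (all data invented for illustration).
from sklearn.linear_model import LinearRegression

# Each row: [marketing spend in $k, unit price in $, number of competitors]
factors = [
    [50, 20, 3],
    [80, 25, 2],
    [30, 15, 5],
    [90, 30, 1],
    [60, 22, 4],
]
profit = [120, 210, 60, 260, 140]  # profit in $k

model = LinearRegression().fit(factors, profit)
print("factor weights:", model.coef_)           # which factors drive profitability
print("forecast:", model.predict([[70, 24, 2]]))
```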

Potentially Relevant Earthquake Data {1}

by Joshua L
Earthquakes occur where the boundaries of tectonic plates meet; in fact, 80% of earthquakes occur along the Pacific Plate boundary. The first accurate earthquake data was recorded in 1880, and it shows an average of twenty earthquakes occurring annually with a Richter rating of 7.0 or greater. This data is useful for developing the ability to track and predict the occurrence of earthquakes. The failure to move forward with earthquake prediction lies in the lack of access to earthquake information, as well as the lack of “infrastructure” for retrieving and accessing that information. Much of the data that does exist is of little use because it is unorganized, and no correlation exists to explain the links between the documented observations. The interpretation of the patterns, correlations, and trends across all of this recorded data does not exist, so the data is practically useless. With successful earthquake predictions, enormous amounts of life and property could be saved. The major issues in successfully predicting earthquakes are: reviewing massive amounts of data takes many man-hours, there are communication gaps between the observatories collecting the data, knowledge of what the data actually shows is limited, and earthquake results are inconsistent. Currently these issues prevent successful prediction; however, data measuring earthquake changes, combined with earthquake history, can help predict the next ‘big one.’ Using an ontology-based data warehouse to process and study the results is currently under study; this method is thought to better organize, and draw conclusions from, existing and future earthquake data. read more...
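A first pass over such a catalog is straightforward to sketch. The file name and columns below are hypothetical, but the aggregation, counting magnitude 7.0+ events per year, is the kind of basic summary the post says is missing from the scattered records.

```python
# A small sketch of a first-pass analysis over an earthquake catalog:
# how many magnitude >= 7.0 events occurred each year, echoing the
# roughly-twenty-per-year average mentioned above.
import pandas as pd

catalog = pd.read_csv("earthquake_catalog.csv",
                      parse_dates=["time"])   # columns: time, latitude, longitude, magnitude

large = catalog[catalog["magnitude"] >= 7.0]
per_year = large.groupby(large["time"].dt.year).size()

print(per_year)
print("average per year:", per_year.mean())
```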