Database Management

Techniques for Quick Access from a Data Warehouse

by Hongde H
The article I chose to read this week is about techniques for quick access from a data warehouse. In the journal, the author describes the common problems faced by data warehouse administrators and users. He outlines some query performance techniques that minimize response time and improve the overall efficiency of a data warehouse, particularly when the data warehouse is accessed and updated frequently. By and large, the performance of the system is improved without accessing the original information sources, which provides good strategies for building a better data warehouse. read more...

SQL Change Data Capture

by Ming X
The article I read for this week is called "Implementing SQL Server's Change Data Capture" by Gregory Larsen. He talks about a feature called Change Data Capture (CDC) that was introduced with SQL Server 2008. CDC provides the ability to set up and manage database data auditing without requiring custom auditing procedures and triggers; it captures DML operations (Insert, Update, and Delete statements) and makes the changed data available for later reporting. Users can use CDC to track the changes in a table or tables. CDC is a process implemented within a database that allows users to identify the SQL tables for which all changes will be tracked. Users can track changes to a single table or to multiple tables, and the changes to each table are tracked in a separate change table. These change tables are stored directly in the database where change data capture is enabled. When users implement change data capture, a process is automatically generated and scheduled to collect and manage the change data capture information. By default, change data capture information is kept for only 3 days. Users can read the change tracking tables directly to see what has changed, or use built-in functions to read the data. However, change data capture is available only on the Enterprise, Developer, and Evaluation editions of SQL Server. read more...
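The CDC pattern Larsen describes can be illustrated with a toy sketch. To be clear, this is not SQL Server's actual implementation (the real feature reads the transaction log and generates its change tables automatically); the class and field names below are invented purely to show the idea of a companion change table with a retention window:

```python
import datetime

class ChangeCapture:
    """Toy illustration of the CDC pattern: every DML operation on a
    tracked table is mirrored into a companion change table."""

    def __init__(self, retention_days=3):
        self.rows = {}            # the "real" table: primary key -> row
        self.changes = []         # companion change table
        self.retention = datetime.timedelta(days=retention_days)

    def _log(self, op, pk, row):
        # Record the operation type, the affected row, and a timestamp.
        self.changes.append({"op": op, "pk": pk, "row": row,
                             "at": datetime.datetime.now()})

    def insert(self, pk, row):
        self.rows[pk] = row
        self._log("INSERT", pk, row)

    def update(self, pk, row):
        self.rows[pk] = row
        self._log("UPDATE", pk, row)

    def delete(self, pk):
        row = self.rows.pop(pk)
        self._log("DELETE", pk, row)

    def cleanup(self):
        """Purge change rows older than the retention window (3 days
        by default, matching the article's description)."""
        cutoff = datetime.datetime.now() - self.retention
        self.changes = [c for c in self.changes if c["at"] >= cutoff]

t = ChangeCapture()
t.insert(1, {"name": "widget"})
t.update(1, {"name": "gadget"})
t.delete(1)
print([c["op"] for c in t.changes])   # ['INSERT', 'UPDATE', 'DELETE']
```

In real SQL Server, enabling CDC on a table makes the engine itself populate the change tables from the transaction log; the sketch only mimics the externally visible behavior of reading back a history of changes.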

The New Cloud Computing by Oracle

by Ming X
The article I read is about the new cloud computing service provided by Oracle. Oracle introduced its cloud computing service last year; it can be leased by businesses and kept behind their own security firewalls. Last month, Mr. Ellison, the founder and chief executive of Oracle, kicked off his Oracle Open World gathering and announced that Oracle has a new cloud database, called 12c, which is "a sharply different way of accessing a database in cloud computing." Mr. Ellison claims that it may indeed be "the first multi-tenant database in the world." The new hardware from Oracle could shift data twice as fast as machines from EMC, but costs only one-eighth as much as machines from I.B.M. Mr. Ellison also delivered three important truths about cloud computing: all of these big companies are taking cloud computing for business very seriously, because they know how much big business values it; companies are going further down the food chain, with Oracle looking into offerings as good products for small and medium-sized businesses; and lastly, companies are focusing on improving the speed of data processing. read more...

Improving Databases and Datacenters

by Anthony T
The article is about the current trends in IT and how they relate to database management. The author mentions the growth of e-commerce and how companies are deciding to use the cloud to meet those needs. A problem datacenter managers face is doing more with less funding. The author suggests advanced request distribution as a solution for datacenter managers. This is essentially a process that sends requests to the resources best fitted to execute them. This technique is being used in the web and application layers of the application architecture. In terms of the data layer, advanced distribution techniques are used primarily for transaction processing done over the web. The authors go on to outline their proposed cost-cutting technique. Their strategy basically assigns requests among a cluster of "off the shelf databases." The steps of their process consist of developing a logical approach and architecture, developing a working model, and evaluating the resulting artifacts. read more...
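The core idea of advanced request distribution can be sketched in a few lines: among the resources that are actually fitted to execute a request, route it to the least loaded one. The `Backend` class, the fitness rule, and the load metric below are hypothetical stand-ins, not taken from the paper:

```python
class Backend:
    """A hypothetical off-the-shelf database node in the cluster."""
    def __init__(self, name, supported_ops):
        self.name = name
        self.supported = set(supported_ops)  # operations this node can run
        self.load = 0                        # requests assigned so far

def dispatch(request_op, backends):
    """Advanced request distribution (sketch): among the backends able
    to execute this operation, pick the least loaded one."""
    candidates = [b for b in backends if request_op in b.supported]
    if not candidates:
        raise ValueError("no backend can serve " + request_op)
    best = min(candidates, key=lambda b: b.load)
    best.load += 1
    return best.name

cluster = [Backend("db1", {"read", "write"}),
           Backend("db2", {"read"}),
           Backend("db3", {"read"})]
# Reads spread across all three nodes; writes can only go to db1.
print([dispatch(op, cluster) for op in ["read", "write", "read", "read"]])
# ['db1', 'db1', 'db2', 'db3']
```

A production distributor would of course use richer fitness signals (data placement, replica freshness, measured latency) rather than a simple request counter.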

Database Usage in Food Warehouses

by Renee L
In their article "An RFRS that Combines RFID and CBR Technologies," Lao, Choy, Ho, and Yam propose a technology addressing global concerns in food safety management. This new technology is called the real-time food receiving operations management system (RFRS), and it combines the use of case-based reasoning (CBR) and radio frequency identification (RFID) technology. The CBR technology is used to retrieve and analyze data stored in a database and case library, and the RFID technology is used to gather real-time inventory data and oversee inventories, equipment, and operations. These technologies help improve food receiving operations in many ways. With the CBR technology, businesses can easily see the data flow and can therefore make better decisions; CBR easily captures and sorts data for better decision-making. Taken from the article, a picture of a company's database is shown below. This example shows that CBR allows them to easily see and grab information for each activity. read more...
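The retrieval step of case-based reasoning can be sketched simply: find the stored case most similar to the new situation, so its recorded solution can be reused. The similarity metric and the food-receiving cases below are invented for illustration and are not from the article:

```python
def similarity(case_a, case_b):
    """Fraction of shared attributes on which two cases agree (toy metric)."""
    keys = set(case_a) & set(case_b)
    if not keys:
        return 0.0
    return sum(case_a[k] == case_b[k] for k in keys) / len(keys)

def retrieve(case_library, new_case):
    """CBR retrieval (sketch): return the past case whose problem
    description best matches the new situation."""
    return max(case_library, key=lambda c: similarity(c["problem"], new_case))

# Hypothetical case library for a food receiving operation.
library = [
    {"problem": {"product": "frozen fish", "temp_alarm": True},
     "solution": "reject shipment, inspect cold chain"},
    {"problem": {"product": "frozen fish", "temp_alarm": False},
     "solution": "accept and move to cold storage"},
]

best = retrieve(library, {"product": "frozen fish", "temp_alarm": True})
print(best["solution"])   # reject shipment, inspect cold chain
```

A full CBR cycle would also revise the reused solution and retain the new case in the library; only the retrieve step is shown here.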

What to do With Too Much Data

by Tyler K

In the article, the author discusses how the modern database often extends beyond a few hundred entities; modern-day companies are regularly wading through terabytes of information, trying to drag useful and meaningful context out of massive loads of data. Several major problems are brought up: searching through the data is tedious and yields irrelevant results; metadata can vary in its usefulness, and its context might not be comprehended by others; attributes can mean the same thing but be sorted separately (e.g., Mac, Macintosh, Apple Computer, and iMac could all be different ways of describing the same product); and it is very difficult to standardize the data and determine who regulates and incorporates the standardization, or whether it is even worth the time to do so. Thus, the solution offered is simple: relax the standard. Allow a little differentiation, create unified product descriptions that can catch multiple ways of describing the same object, and determine who is responsible for ensuring data integrity. Even then, there is no hard solution, and the conclusion is that there must be a future implementation of database management systems that can form patterns and relationships with data, have well-documented information on where data originates, and develop a system to understand how much is being lost to inaccuracies in the data. read more...
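The "unified product descriptions" idea can be sketched as a simple alias table that maps many surface forms onto one canonical description. The alias entries below (and the choice to treat the article's four example spellings as one product) are purely illustrative:

```python
# Hypothetical alias table: many surface forms, one canonical description.
ALIASES = {
    "mac": "Apple Macintosh",
    "macintosh": "Apple Macintosh",
    "apple computer": "Apple Macintosh",
    "imac": "Apple Macintosh",
}

def canonicalize(attribute):
    """Map a free-form attribute to its unified product description,
    falling back to the cleaned-up original when no alias matches."""
    key = attribute.strip().lower()
    return ALIASES.get(key, attribute.strip())

raw = ["Mac", "Macintosh", " Apple Computer ", "iMac", "ThinkPad"]
print(sorted({canonicalize(r) for r in raw}))
# ['Apple Macintosh', 'ThinkPad']
```

This is the "relaxed standard" in miniature: the raw values are left as entered, and only the lookup layer enforces a shared vocabulary, which also makes it clear who owns data integrity (whoever maintains the alias table).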

Multimedia Databases and Security

by Jennifer R
The author first examines the different types of architecture available for constructing a multimedia database. The roles of managing both metadata and multimedia can be delegated to the database management software, or a separate file manager can be delegated to handle the multimedia alone. Metadata is discussed, as audio and video data can require a great quantity of metadata; the author says, "in the case of video data, one may need to maintain information about the various frames." Multimedia data mining is also of concern, as it differs from data mining in a traditional database. The author explains that "data mining models data as a collection of similar but independent entities. The goal of data mining is to search for patterns that are common to many of these entities." The subtle details involved in identifying the things we see in pictures and videos make it difficult to fit multimedia data mining into traditional data mining. The author demonstrates this with an example, saying "pictures and video of different buildings have some similarity—each represents a view of a building—but without clear structure such as 'these are pictures of the front of buildings' it is difficult to relate multimedia mining to traditional data mining." Due to the complexity of multimedia, the author suggests an end-to-end security approach, in which we ensure every component of the system is secured. read more...

Sexual Assault Database developed by Department of Justice

by Alexander H
As technology becomes more advanced, it is necessary to enhance and improve databases in order to maximize efficiency. Rick Maze, an author at the Army Times, wrote an article explaining the changes that a database is undergoing in order to improve its usability. The Department of Justice is planning to deploy a special database this summer that is aimed at sexual assault prevention. The Sexual Assault Database Management System, or SADMS, has been in the making for years and is nearing its final stages. This database will contain the names of victims and alleged offenders, the nature of the assaults, and the outcomes of any legal action. read more...

A Quick Look At MySQL 5.6

by Chris S
This article is a preview of the upcoming MySQL 5.6 release from Oracle. Of course, it will come with various new features, as well as experimental features that may or may not be included in the final version depending on feedback from testers and database administrators. Two of the improvements mentioned are data replication and the ability to bypass the SQL framework for faster access to data. The most notable feature will be an application programming interface (API) that will allow applications to access data directly from the InnoDB database engine, essentially skipping the process of having to go through the SQL-based interface. This API is believed to improve MySQL such that it will be on par with NoSQL databases, which are quickly gaining favor with web applications, as far as accessibility. A new set of ADD operations is said to be an additional experiment in the software; it will allow applications to input data into the database without blocking other operations while they write their own information to the database's index. read more...
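The idea of skipping the SQL layer can be illustrated with a toy engine that exposes both access paths. This is emphatically not the real InnoDB API (the actual feature is a memcached-style interface into the storage engine); the class and its mini "parser" are invented to show why a direct path is faster in principle:

```python
class ToyEngine:
    """Toy storage engine with two access paths: a 'SQL' path that
    parses a statement first, and a direct key-value path that skips
    parsing entirely, loosely mimicking the idea behind MySQL 5.6's
    direct-access API."""

    def __init__(self):
        self.store = {}

    def sql(self, statement):
        # Extremely simplified parsing for two statement shapes:
        #   INSERT <key> <value...>   and   SELECT <key>
        parts = statement.split()
        if parts[0] == "INSERT":
            self.store[parts[1]] = " ".join(parts[2:])
            return None
        if parts[0] == "SELECT":
            return self.store.get(parts[1])
        raise ValueError("unsupported statement")

    # Direct API: no parsing, no SQL layer, straight to storage.
    def get(self, key):
        return self.store.get(key)

    def put(self, key, value):
        self.store[key] = value

db = ToyEngine()
db.sql("INSERT user:1 alice")
print(db.get("user:1"))        # same row, fetched without the SQL layer
```

Both paths hit the same underlying store, which mirrors the article's point: the API gives NoSQL-style access to data that remains fully visible through SQL.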

Database Management in an Ad Hoc Network

by Vincent S
Today I am posting about an article I found in the ACM Digital Library.  The article was a peer-reviewed journal submission presenting a solution for how to manage a database in a mobile ad hoc network.  For those of you who are not familiar, an ad hoc network is a wireless network in which no network management device is present (switch, router, hub, bridge) and all host devices have wireless capabilities.  The reason for such a network to exist would be for circumstances in which a wired network would be difficult to set up and maintain.  Such a scenario could occur in battle or, more relevant to this class, in disaster recovery for a business in an emergency situation.  Possible challenges that might present themselves to a database administrator are easily intercepted and compromised data, lack of data integrity, and network collisions that might cause the loss of data due to the lack of network management.  The authors of the article proposed various solutions to these challenges. read more...
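One common defense against the interception and integrity challenges above (an illustration of the general technique, not necessarily the authors' specific solution) is to authenticate every database record with a keyed hash, so any tampering in transit is detectable. A minimal sketch using Python's standard `hmac` module, assuming a shared secret was distributed to the hosts out of band:

```python
import hmac
import hashlib

# Hypothetical shared secret, assumed to be distributed out of band.
SECRET = b"demo-shared-key"

def sign(payload: bytes) -> bytes:
    """Prepend an HMAC-SHA256 tag so receivers can detect tampering."""
    tag = hmac.new(SECRET, payload, hashlib.sha256).digest()
    return tag + payload

def verify(message: bytes):
    """Return the payload if the tag checks out, else None."""
    tag, payload = message[:32], message[32:]
    expected = hmac.new(SECRET, payload, hashlib.sha256).digest()
    return payload if hmac.compare_digest(tag, expected) else None

msg = sign(b"UPDATE inventory SET qty=5")
print(verify(msg))            # b'UPDATE inventory SET qty=5'
tampered = msg[:-1] + b"9"
print(verify(tampered))       # None
```

An HMAC only provides integrity and authenticity; to address eavesdropping on the wireless medium the payload would additionally need encryption, and lost packets from collisions would still require retransmission at the network layer.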