Database Design and Development


by Ymku

What is a database? A database is an organized collection of information. The ever-improving computer industry is transforming our lives in many ways: we no longer have to go out to shop, watch a movie, or even do our banking, because we can do all of these tasks at home on a computer. All of these tasks depend on databases. Companies hold a great deal of data about their business, and they need to organize it, because unorganized data is nearly useless when it is hard to work with. Organizing data is a key to business success. SQL (Structured Query Language) is a programming language for managing an RDBMS (relational database management system). SQL is the standard way of managing such databases, and I would like to introduce one of the most widely used SQL database systems: MySQL. read more...
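To make the idea concrete, here is a minimal sketch of the kind of SQL statements a system like MySQL runs. Python's built-in sqlite3 module is used so the example is self-contained; the table and column names are made up for illustration.

```python
import sqlite3

# An in-memory database stands in for a real MySQL server.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Organize business data into a table instead of an unstructured list.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.executemany(
    "INSERT INTO customers (name, city) VALUES (?, ?)",
    [("Alice", "Pomona"), ("Bob", "Ontario"), ("Carol", "Pomona")],
)

# Once the data is organized, SQL makes it easy to query.
cur.execute("SELECT name FROM customers WHERE city = ? ORDER BY name", ("Pomona",))
pomona_customers = [row[0] for row in cur.fetchall()]
```

The same `CREATE TABLE`, `INSERT`, and `SELECT` statements (with minor dialect differences) are what MySQL itself executes.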

Solid State Disks in Database Servers {16}

by Youngmin L
Solid State Disks (SSDs) are a data storage medium that uses flash memory; hard drives, on the other hand, store data on rotating platters. Compared to traditional hard disks, SSDs have faster data rates, are more resistant to physical shock, run cooler and quieter, and have lower access times. Because SSDs are a young technology, they are more expensive and offer less capacity than traditional hard disks. read more...

NoSQL cloud technology as an option? {3}

by Christopher J
Relational databases that use SQL, such as MySQL, have been the standard model for storing, retrieving, and managing data throughout the IT industry for the past several decades. They are now losing significance, however, primarily due to their fixed schema requirements and inability to scale (Arora & Aggarwal, 2013). Three prominent trends in the computer industry are Big Users, Big Data, and Cloud Computing, and relational databases are struggling to keep up with all three. This is where NoSQL cloud technology can provide a solution. read more...
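The schema flexibility that NoSQL stores offer can be sketched with the document model used by systems such as MongoDB: records are schema-free documents, so each one can carry different fields. The store below is just a Python dict standing in for a real database, and the document shapes are invented for illustration.

```python
# A toy key-value/document store: no table definition, no fixed schema.
store = {}

def put(doc_id, document):
    # Store any dict-shaped document under an id.
    store[doc_id] = document

def get(doc_id):
    # Retrieve a document by id (None if missing).
    return store.get(doc_id)

# Two "user" documents with different shapes -- a relational table
# would force both rows into one fixed set of columns.
put("u1", {"name": "Dana", "email": "dana@example.com"})
put("u2", {"name": "Eli", "followers": 1200, "tags": ["big-data", "cloud"]})
```

Because no schema change is needed to add a field, this model scales out more easily across cloud nodes, which is exactly the pain point the post identifies for relational systems.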

Using Database Technology to Fight Terrorism {9}

by Alii S
We all remember September 11th. We remember where we were, what we were doing, and most importantly what happened. After the events of that day, America went to war in Afghanistan and Iraq in order to catch the perpetrator, Osama bin Laden, the leader of Al-Qaeda at the time. After nearly ten years of searching, he was finally caught, and America saw some justice done at last. The elements that went into finding this man included a great deal of manpower in the form of soldiers, intelligence workers, sources, and so on. One of those elements was technology, more specifically, databases. read more...

Data Warehouse {2}

by Rosario E
Data warehousing has been evolving since 2008 into an analytical architecture that includes data marts, ETL, near-line storage, and exploration warehouses.

SQL Server has also evolved: it started out handling small amounts of data on a personal computer with basic functions, and it now serves mid-size and large data warehousing workloads and is one of the preferred technology platforms for advanced data warehouses. read more...
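The ETL step mentioned above can be sketched in a few lines. This is a toy extract-transform-load pipeline; the source rows and field names are hypothetical, not from any particular warehouse.

```python
def extract():
    # Extract: pull raw rows from an operational source system.
    return [
        {"sale_id": 1, "amount": "19.99", "region": " west "},
        {"sale_id": 2, "amount": "5.00", "region": "EAST"},
    ]

def transform(rows):
    # Transform: fix types and normalize values so the warehouse
    # holds clean, consistent data.
    return [
        {"sale_id": r["sale_id"],
         "amount": float(r["amount"]),
         "region": r["region"].strip().lower()}
        for r in rows
    ]

def load(rows, warehouse):
    # Load: append the cleaned rows into the warehouse table.
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
```

Real ETL tools (including SQL Server's own tooling) do the same three steps at scale, with scheduling and error handling on top.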

PostgreSQL is a way! {2}

by Yukit C
What is PostgreSQL?

Companies are always looking for a database system that can manage complex data infrastructure, solve specific business needs, lower operating costs, and reduce deployment time. Some case studies show that PostgreSQL is one such useful tool. PostgreSQL is an open-source object-relational database management system (DBMS). As an object-relational system, it is a relational database with an object-oriented database model that supports objects, classes, and inheritance. PostgreSQL runs on major operating systems such as Linux, UNIX (Mac OS X, Solaris), and Windows. It also supports foreign keys, joins, triggers, and stored procedures in multiple languages. read more...
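Foreign keys and joins, two of the PostgreSQL features listed above, can be sketched as follows. The example uses Python's built-in sqlite3 module (standing in for a PostgreSQL server) so it runs anywhere; the schema is made up for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity

conn.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT, "
    "author_id INTEGER REFERENCES authors(id))"  # foreign key
)
conn.execute("INSERT INTO authors VALUES (1, 'Stonebraker')")
conn.execute("INSERT INTO books VALUES (10, 'Object-Relational DBMSs', 1)")

# A join ties the two tables back together through the foreign key.
row = conn.execute(
    "SELECT a.name, b.title FROM books b JOIN authors a ON b.author_id = a.id"
).fetchone()
```

In PostgreSQL the foreign key would be declared the same way, and the database would reject any `books` row whose `author_id` does not exist in `authors`.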

Big Big Data {1}

by Edris B
Big Data is a loosely used term in the database industry. At first the term seems simple, but it is often used so generally that it becomes ineffective. Analyzing Big Data is essential for well-established businesses, helping to grow and shape the company. Big Data, as described by Forbes, is “a collection of data from traditional and digital sources inside and outside your company that represents a source for ongoing discovery and analysis” (What is Big Data). With the expansion of the information and technology age, this becomes more and more relevant every day. There are even organizations to which companies can outsource this type of analysis. Big Data is a relatively new idea and is becoming more efficient, but there is still room for improvement. read more...

Novelty Mining {1}

by KingWen F

The rapid growth of worldwide corporations leads to continuously increasing amounts of data. These corporations often create community spaces to share their information with the rest of the world. This also creates opportunities for competitors to find out what they are up to by reading through the information provided on their business blogs. However, when a new post is issued, its content is often repeated from older ones. “The current available search engine, like Google, can not tell whether a newly posted article contains fresh content or not, as compared to all the previous posted articles” (Tsai and Kwee, 2011). Therefore, when people try to find fresh content using search engines, they usually come across tons of posts with old or already-known content before they find the fresh ones. For decision makers in corporations, spending time reading old or known information is not desirable. Business decisions are time sensitive: if decision makers receive information late, they are likely to miss opportunities to outperform their competitors. With the huge amount of information added to the internet each day, the need to locate fresh and relevant content continues to grow. A novelty mining system can help filter out known information and identify the fresh content for its users. In this article we will first take a brief look at how novelty mining works, and then we will look at some business applications. read more...
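The core idea of novelty mining can be sketched very simply: score each incoming post against the posts already seen, and keep only those that are not too similar to any of them. This is a generic bag-of-words sketch, not the exact method of Tsai and Kwee; the 0.6 similarity threshold is an arbitrary choice for illustration.

```python
import math
from collections import Counter

def cosine(a, b):
    # Cosine similarity between two texts as bag-of-words vectors.
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def novel_posts(posts, threshold=0.6):
    seen, fresh = [], []
    for post in posts:
        if all(cosine(post, old) < threshold for old in seen):
            fresh.append(post)   # nothing similar seen before: novel
        seen.append(post)        # remember every post either way
    return fresh

posts = [
    "our company launches a new cloud database",
    "our company launches a new cloud database today",   # near-duplicate
    "quarterly earnings beat analyst expectations",      # genuinely new
]
fresh = novel_posts(posts)
```

The near-duplicate second post is filtered out, which is exactly the service a novelty mining system provides to a time-pressed decision maker.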

Database Optimization: Genetics {3}

by Austin P
As the need for databases increases, optimization is the natural next step in their evolution. Everyone wants better, faster, and more efficient technology, and that includes databases. Givon Zirkind is the author of an academic journal article about the optimization of databases that involve genetics. Zirkind writes that, with data storage growing in capacity at lower cost than before, optimization should be easy. Unfortunately, many programs are not coded as efficiently as they could be, and other minor alterations and factors lead to software bloat. According to Zirkind, “Software bloat is when a computer program has so many features, that a user cannot possibly know them all and use them all”. Zirkind describes a project he undertook to decrease the amount of bloat and excess data by laying out a specific software design and specification. Among the ideas Zirkind and his group used were indexing method selection criteria and programming language selection. The indexing method selection involved complex mathematics and used a B-tree to achieve superior access speed over a linked list; a B-tree is an organizational structure for storing information and retrieving it in the form of a tree. For the programming language, Zirkind chose C for its performance and portability. After the software design and specification phase, the next step was to optimize through key compression and index size reduction. Beyond key compression and index size reduction, what Zirkind calls “good engineering” is a huge factor in optimization. Zirkind clarifies that good engineering is simple engineering: code and data structures should be kept simple, because the more information used in code and other sources, the more memory is consumed.
In databases this means that load times become longer than needed. The practices that Zirkind and his group used made a significant improvement in the efficiency of their genetic database, achieving 7 to 9 times the original access speed of the databases they used for testing. Also, according to the article, the database normally needed 7 disk accesses to record all of its data; with the new optimization, they reduced this to a maximum of 2 disk accesses. This reduction was achieved by keeping data loaded in memory and by record blocking. read more...
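The key compression idea mentioned above can be sketched with generic front compression: sorted index keys often share prefixes, so each key can be stored as (shared-prefix length, remaining suffix). This is a standard scheme used to shrink B-tree index pages, not Zirkind's exact implementation; the gene-style keys are invented for illustration.

```python
def compress(sorted_keys):
    # Replace each key with (prefix length shared with previous key, suffix).
    out, prev = [], ""
    for key in sorted_keys:
        n = 0
        while n < min(len(prev), len(key)) and prev[n] == key[n]:
            n += 1
        out.append((n, key[n:]))
        prev = key
    return out

def decompress(entries):
    # Rebuild each key from the previous key's prefix plus the stored suffix.
    keys, prev = [], ""
    for n, suffix in entries:
        key = prev[:n] + suffix
        keys.append(key)
        prev = key
    return keys

keys = ["gene_aaa", "gene_aab", "gene_abc", "genome_x"]
packed = compress(keys)
```

Smaller keys mean more index entries per disk page, which is one way fewer disk accesses per lookup are achieved.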

2014, New Year For SQL Server {1}

by Patrick B
The next step in Microsoft’s relational database management system looks promising for efficiency and speed in transaction processing. Microsoft recently held a community technology preview for the massive project, and this is what was found. SQL Server 2014 will come with a new in-memory online transaction processing (OLTP) feature called Hekaton, a built-in part of the database system. Hekaton works by selecting data that is read or written frequently and moving it into the server’s working memory, allowing that priority data to be quickly accessed and ready for transactions or updates on the fly. When Hekaton optimization is enabled, it detects which data belongs in working memory and moves the specified data into main memory. Integrity of the data is maintained by writing transactions to a separate log file in case of system outages. Beyond the increased speed, companies can expect cost savings, as Hekaton reduces the computational requirements needed to get data processing done, which means fewer servers and less hardware. read more...
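The hot-data idea behind Hekaton can be sketched conceptually: rows that are accessed frequently get promoted into fast working memory, while everything else stays on disk. This is a toy illustration of the general principle, not Microsoft's implementation; the threshold of 3 accesses and the row names are arbitrary choices for this sketch.

```python
from collections import Counter

HOT_THRESHOLD = 3            # promote after this many accesses (arbitrary)

disk = {"row1": "cold data", "row2": "hot data"}
memory = {}                  # promoted rows live here
access_counts = Counter()

def read(key):
    access_counts[key] += 1
    if key in memory:                       # fast path: already in memory
        return memory[key]
    value = disk[key]                       # slow path: fetch from disk
    if access_counts[key] >= HOT_THRESHOLD:
        memory[key] = value                 # promote the frequently used row
    return value

for _ in range(5):
    read("row2")   # row2 crosses the threshold and is promoted
read("row1")       # row1 is read once and stays on disk
```

A real engine would also write each change to a durable log before acknowledging it, which is how in-memory data survives an outage.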