Denormalization: Intermediate Step

by Jasmine C
The article I read is very informative.  A quick synopsis: it discusses denormalization techniques and the pros and cons of normalization versus denormalization.  Today, normalization is the standard way to design a relational database.  With normalization, data is organized so that updating is minimal and data is easily accessible.  However, the biggest disadvantage of normalization is that system performance can be very poor.  At the moment, denormalization techniques do not have concrete guidelines to guide the process, but denormalization shows a positive effect on a database's performance.  It has been proposed that denormalization be used, in addition to normalization, as a middle step to help with system performance.  This article describes three approaches that are used to review denormalization strategies.  Each of the approaches shows how denormalization positively affects databases.
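The article itself doesn't include schemas, but the trade-off it describes can be sketched with a small, hypothetical customers/orders example in SQLite: a normalized design keeps each fact in one place and pays for it with a join on every read, while a denormalized copy duplicates the customer's name into the orders table so the same read becomes a single-table lookup. The table and column names here are my own illustration, not from the article.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized design: customer attributes live in exactly one place,
# so updates touch one row, but reads need a join.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, "
    "customer_id INTEGER REFERENCES customers(id), total REAL)"
)
cur.execute("INSERT INTO customers VALUES (1, 'Ada')")
cur.execute("INSERT INTO orders VALUES (10, 1, 25.0)")

# Reading an order together with its customer name requires a join.
normalized = cur.execute(
    "SELECT o.id, c.name, o.total "
    "FROM orders o JOIN customers c ON o.customer_id = c.id"
).fetchall()

# Denormalized design: the customer name is copied into each order row,
# so the same read is a single-table scan -- the price is that a name
# change must now be propagated to every matching order row.
cur.execute(
    "CREATE TABLE orders_denorm (id INTEGER PRIMARY KEY, "
    "customer_name TEXT, total REAL)"
)
cur.execute("INSERT INTO orders_denorm VALUES (10, 'Ada', 25.0)")
denormalized = cur.execute(
    "SELECT id, customer_name, total FROM orders_denorm"
).fetchall()

print(normalized)    # [(10, 'Ada', 25.0)]
print(denormalized)  # [(10, 'Ada', 25.0)]
```

Both queries return the same row; the difference is where the cost lands — on reads (the join) in the normalized case, or on writes (keeping the duplicate in sync) in the denormalized one, which is exactly the performance compromise the article weighs.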

I like this article because it takes compromise into consideration.  Since normalization isn't perfect, people are finding ways to make it more efficient, hence denormalization.  With denormalization, system performance can be more efficient, and that is a plus for everyone.  With denormalization, a person needs to know when to use it; if they use it when it is not needed, then instead of helping their system it might just hinder it.  I like the advantages and disadvantages this article provides in regards to normalized versus denormalized databases.

This article relates to class because it shows us as students how to make sure that our databases work efficiently.  When we work on our projects, we now know that we are not restricted to using only normalized data.  We can also use denormalized data or a combination of the two.

Shin, S. (2002). Denormalization effects on performance of relational database for data warehouse. Retrieved February 5, 2012, from ProQuest Dissertations and Theses.

5 thoughts on “Denormalization: Intermediate Step”

  1. Great article. I agree that denormalization is very efficient for system performance. Thank you for posting this article.

  2. I enjoyed reading your response; it gave me great insight on the pros and cons of using denormalization and normalization. It seems to be in the early stages with no real guidelines, but as time goes on, we will see a mixture of both methods being implemented into database systems.

  3. Compromise… a thing quarreling experts rarely reach. But I suppose with a field as fundamental and information-focused as database systems, a common goal would be to create the most efficient system for all the parties involved. With normalization, as you point out, the arguments appear to come down to system performance. Perhaps this could lead to a new hybrid methodology for semi-normalization? Without expertise in this field I could not tell you what such an idea would entail, but I also assume that I am not the first to make such a suggestion. Either way, it's articles like the one reviewed here that always lead to practically endless future research opportunities.

    Good find. Cheers.

  4. If normalization slows down system performance, but denormalization shows a positive effect, why wouldn't you want to include both? Then again, if the user doesn't know how to use them properly, it might create problems. Thanks for the article.

  5. This article was interesting because it shows that what some people thought was the best way to organize data may not be, and so they have to find different strategies to make the system perform tasks faster. I never really thought that denormalization would make the system more efficient. Good article.
