A new tool for scientific databases

by Edwin T
The article I chose discussed large science databases. According to the authors, large and complex SQL Server databases are very difficult to migrate to current cloud services such as Amazon EC2 and Microsoft SQL Azure. The authors found it hard to move a database, small or large, into the cloud without making changes to the schema, and those changes ultimately hurt performance and usability. They also mention that they are developing a tool called “Data-Scope,” designed specifically for scientific data analysis; it is meant to maximize data throughput while minimizing the power consumption these large databases require.
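To give a sense of the kind of schema change the authors are describing, here is a minimal sketch that is not from the article itself: SQL Azure at the time required every table to have a clustered index, so a plain “heap” table copied from an on-premise SQL Server database had to be altered before rows could be loaded. The server, database, table, and column names below are made up for illustration, and the script assumes the pyodbc library is available.

import pyodbc  # assumption: pyodbc is installed; all names below are hypothetical

# Connect to a hypothetical SQL Azure database that the data was migrated into.
conn = pyodbc.connect(
    "DRIVER={SQL Server Native Client 10.0};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=SkyServerCopy;UID=dbadmin;PWD=secret"
)
cursor = conn.cursor()

# SQL Azure rejected inserts into tables without a clustered index, so a heap
# table from the original schema needed a change like this before loading data.
cursor.execute(
    "CREATE CLUSTERED INDEX IX_PhotoObj_objID ON PhotoObj (objID)"
)
conn.commit()
conn.close()

On a table with billions of rows, building an index like that is a slow, expensive operation, which is exactly the sort of change the authors say ends up affecting performance and usability.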

Based on all of the articles discussed in class and the ones I’ve personally researched, this is the first one that addresses scientific data. Astronomers, for example, collect petabytes (1 petabyte = 1,024 terabytes) of information, and I can only imagine what the schema of such a database looks like. Every bit of data is as important as the next for them, and having to make schema changes, or even risk having to make them, when moving to the cloud is a huge drawback. If the new “Data-Scope” tool manages to fix this issue, it will make an impact on the cloud industry by opening up a whole new market of data.

Citation:

Brandic, I., Raicu, I., Thakar, A., Szalay, A., Church, K., & Terzis, A. (2011). Large science databases - are cloud services ready for them? Scientific Programming, 19(2/3), 147-159. Retrieved from http://0-web.ebscohost.com.opac.library.csupomona.edu/ehost/detail?vid=11&hid=112&sid=39bce05e-9881-406b-b7f1-24dc1f8ea7d2%40sessionmgr115&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#db=aph&AN=66692030