by Stefan S
This article discusses the giant social networking site Facebook.com. Facebook's power derives from what Jeff Rothschild, its VP of technology, calls the "social graph." Facebook hosts one of the largest user populations of any site in the world and handles some of the highest network traffic. Its data centers are built on three tiers of x86 servers loaded up with open-source software.
According to the article, Facebook maintains data centers in Santa Clara, CA; San Francisco; and Northern Virginia. The top tier of the network is made up of the Web servers that create the pages users see, most with eight cores running 64-bit Linux and Apache. Facebook develops its complex core applications in a variety of full-featured languages, including C++, Java, Python, and Ruby. To manage the complexity of this approach, the company created Thrift, an application framework that lets code compiled from different languages work together. The middle tier consists of caching servers; as the site keeps growing, the 800 database servers alone cannot serve up all the needed data. Facebook receives 15 million requests per second for both data and connections. Facebook also discovered that interrupts on the servers' Ethernet controllers were slowing the servers' processing, so it rewrote the controllers' drivers to scale on multicore systems. Facebook is also experimenting with solid-state drives, which could speed the performance of the MySQL database tier by a factor of 100 (Zeichick, 2008).
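The middle caching tier described above typically works on a cache-aside pattern: a request checks the cache first and only falls through to the database tier on a miss, so the 800 database servers see only a fraction of the 15 million requests per second. Below is a minimal sketch of that idea; the class and key names are illustrative, not Facebook's actual code, and a Python dict stands in for a memcached-style cache and for MySQL.

```python
# Cache-aside sketch: the caching tier absorbs repeated reads so the
# database tier only sees misses. All names here are illustrative.

class CacheAside:
    def __init__(self, db):
        self.db = db          # backing store (stands in for the MySQL tier)
        self.cache = {}       # stands in for a memcached-style cache tier
        self.db_reads = 0     # counts how many requests reach the database

    def get(self, key):
        if key in self.cache:      # cache hit: the database is untouched
            return self.cache[key]
        self.db_reads += 1         # cache miss: load from the database
        value = self.db[key]
        self.cache[key] = value    # populate the cache for the next request
        return value

# Two requests for the same key: only the first reaches the "database".
db = {"user:42": {"name": "Alice"}}
tier = CacheAside(db)
tier.get("user:42")
tier.get("user:42")
print(tier.db_reads)  # only the first request hit the database
```

At Facebook's scale the same idea is spread across many cache servers, but the hit/miss logic per request is essentially this simple.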
We can relate this article to our class in how we use databases almost every day without noticing it. It is not just our project with SQL Server; blogging, visiting Facebook, and most other websites these days are database driven.
In my opinion, Facebook is growing very rapidly; if it had to accommodate 15 million requests per second in 2008, by now the load could have doubled to 30 million per second. Instead of running 800 database servers, they may now have more than 1,000 just to accommodate the database tier. I wonder how they accommodate all the photos and videos that are uploaded to their servers every day, and what would happen if a natural disaster struck one of their data centers in San Francisco, Virginia, or Santa Clara.
Zeichick, A. (2008). Facebook's combinatorial challenge. Technology Review, 111(4), 47. Retrieved from http://search.proquest.com/docview/195345429?accountid=10357