Optimization

Using Database Technology to Fight Terrorism {9}

by Alii S
We all remember September 11th. We remember where we were, what we were doing, and most importantly what happened. After the events of that day, America went to war in Afghanistan and Iraq in order to catch the perpetrator, Osama bin Laden, the leader of Al-Qaeda at the time. After nearly a decade of searching, he was finally found, and America at last saw some justice done. Finding this man obviously took an enormous amount of manpower: soldiers, intelligence workers, sources, and the list goes on. But one of the elements was technology, more specifically, databases. read more...

Database Optimization: Genetics {3}

by Austin P
As the need for databases grows, optimization is the natural next step in their evolution. Everyone wants better, faster, and more efficient technology, and databases are no exception. Givon Zirkind is the author of an academic article about optimizing a genetics database. Zirkind writes that, with data storage getting larger and cheaper than ever, optimization should be easy. Unfortunately, many programs are not coded as efficiently as they could be, and this, along with other factors, leads to software bloat. According to Zirkind, “Software bloat is when a computer program has so many features, that a user cannot possibly know them all and use them all”. Zirkind describes a project he undertook to decrease the amount of bloat and excess data by laying out a specific software design and set of specifications. Among the ideas Zirkind and his group used were indexing method selection criteria and programming language selection. The indexing method selection relied on mathematics showing that a B-tree gives superior access speed over a linked list. A B-tree is a tree-shaped structure for storing information and retrieving it efficiently. As for the programming language, Zirkind chose C for its performance and portability. After the software design and specification phase, the next step was to optimize through key compression and index size reduction. Beyond key compression and index size reduction, what Zirkind calls “good engineering” is a huge factor in optimization. Zirkind argues that good engineering is simple engineering: code and data structures should be kept simple, because every extra feature and every extra byte consumes more memory, and in a database that translates into longer load times than necessary. The practices Zirkind and his group applied significantly increased the efficiency of their genetics database, yielding 7 to 9 times the access speed of the databases they used for testing. Also, according to the article, the database normally used 7 disk accesses to record all the data within the database; with the new optimizations, they reduced disk accesses to a maximum of 2. This reduction was achieved by keeping data loaded in memory and by record blocking. read more...
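
The excerpt above mentions key compression and index size reduction without showing what that looks like in practice, so here is a minimal sketch of one common technique, prefix (front) compression, applied to a sorted key index. This is only an illustration under my own assumptions: the gene-style key names are made up, and Zirkind's actual implementation was written in C and may use a different scheme entirely.

    # Illustrative sketch only: the article does not spell out Zirkind's exact
    # compression scheme, so this shows one common technique, prefix (front)
    # compression, applied to a sorted index of keys.

    def compress_keys(sorted_keys):
        """Store each key as (shared_prefix_len, suffix) relative to the previous key."""
        compressed = []
        prev = ""
        for key in sorted_keys:
            # Length of the prefix this key shares with the previous key.
            shared = 0
            while shared < min(len(prev), len(key)) and prev[shared] == key[shared]:
                shared += 1
            compressed.append((shared, key[shared:]))
            prev = key
        return compressed

    def decompress_keys(compressed):
        """Rebuild the full keys from the (shared_prefix_len, suffix) pairs."""
        keys = []
        prev = ""
        for shared, suffix in compressed:
            key = prev[:shared] + suffix
            keys.append(key)
            prev = key
        return keys

    # Example with gene-like identifiers: long shared prefixes compress well.
    keys = ["BRCA1_exon01", "BRCA1_exon02", "BRCA1_exon03", "BRCA2_exon01"]
    packed = compress_keys(keys)
    print(packed)   # [(0, 'BRCA1_exon01'), (11, '2'), (11, '3'), (4, '2_exon01')]
    assert decompress_keys(packed) == keys

Because neighboring keys in a sorted genetics index often share long prefixes, storing only the differing suffix can shrink the index substantially, which in turn lets more of it fit in memory and cuts disk accesses.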

2014, New Year For SQL Server {1}

by Patrick B
The next step in Microsoft’s Relational Database Management System looks promising for efficiency and speed in transaction processing. Microsoft recently released a community technology preview of the massive project, and this is what was found. SQL Server 2014 will come with a new In-Memory Online Transaction Processing (OLTP) feature called Hekaton that is built into the database system. Hekaton works by selecting data that is read or written most frequently and moving it into the server’s working memory, so that this priority data can be quickly accessed and is ready for transactions or updates on the fly. When the Hekaton optimization is enabled, the engine detects which data belongs in working memory and keeps that data in main memory. Integrity of the data is maintained by writing every transaction to a separate log file in case of system outages. Beyond the increased speed, companies can expect to see cost savings, as Hekaton reduces the computational resources needed to get data processing done, which means fewer servers and less hardware. read more...
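
As a rough conceptual sketch (an assumed toy design, not SQL Server's actual internals or API), the two ideas the post describes, memory-resident hot data plus a durable transaction log, can be illustrated in a few lines of Python:

    # Toy illustration only (assumed design, not Hekaton's real mechanism):
    # (1) keep hot rows in main memory and (2) preserve durability by
    # appending every write to a log before applying it.

    import json

    class InMemoryTable:
        def __init__(self, log_path):
            self.rows = {}                      # hot data lives in RAM
            self.log_path = log_path

        def write(self, key, value):
            # Write-ahead: record the change in the durable log first...
            with open(self.log_path, "a", encoding="utf-8") as log:
                log.write(json.dumps({"key": key, "value": value}) + "\n")
            # ...then apply it to the in-memory copy.
            self.rows[key] = value

        def read(self, key):
            return self.rows.get(key)           # served straight from memory

        def recover(self):
            # After an outage, replay the log to rebuild the in-memory state.
            self.rows.clear()
            try:
                with open(self.log_path, encoding="utf-8") as log:
                    for line in log:
                        entry = json.loads(line)
                        self.rows[entry["key"]] = entry["value"]
            except FileNotFoundError:
                pass                            # no log yet means nothing to replay

    orders = InMemoryTable("orders.log")
    orders.write("order-1001", {"item": "widget", "qty": 3})
    print(orders.read("order-1001"))

A real in-memory OLTP engine is of course far more sophisticated, but the durability idea is the same in spirit: every write reaches the log, so the memory-resident data can be rebuilt after a system outage.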

A/B Split and Multivariate Testing: Statistics/Evidence Based Optimization {4}

by Ruben S
For the past few weeks we have been talking about pay-per-click, conversion rates, and other topics related to what we need our sites to do. How do we go about accomplishing our goals of increased conversion rate optimization, or getting more people to visit our sites? Well, we could go about this in two ways. First, we could just go willy-nilly changing items on our site and hoping that this accomplishes our task. Although some people might get lucky guesses, this is no way to systematically measure what our changes actually accomplish. The second method is statistical, or evidence-based, change. Companies no longer guess at which changes should be kept or scrapped; they now rely on statistical evidence to make and keep changes on their sites. There are several methods a company can use to test its changes, but the two I will cover today, which happen to be the most common, are A/B (split) testing and multivariate testing. With this type of testing, we can implement different changes to our site and let our customers use the site naturally. Customers actually using our site give us feedback on which changes work and which do not. Each method has its advantages and disadvantages, but both are far better than not doing any controlled testing at all. read more...
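
The post stops short of showing the arithmetic behind a split test, so here is a minimal sketch of the statistics that usually sit under A/B testing: a two-proportion z-test on conversion counts. The traffic numbers are hypothetical, and commercial testing tools add sample-size planning and multivariate designs on top of this.

    # Minimal sketch (not any specific vendor's tool): a two-proportion z-test
    # comparing the conversion rates of page variant A and page variant B.

    from math import sqrt, erf

    def ab_test(conversions_a, visitors_a, conversions_b, visitors_b):
        p_a = conversions_a / visitors_a
        p_b = conversions_b / visitors_b
        # Pooled rate under the null hypothesis "A and B convert equally".
        p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the standard normal distribution.
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return p_a, p_b, z, p_value

    # Hypothetical numbers: 48/1000 conversions on A, 73/1000 on the changed page B.
    p_a, p_b, z, p_value = ab_test(48, 1000, 73, 1000)
    print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p_value:.4f}")

Roughly speaking, if the p-value comes in below the significance level chosen up front (0.05 is common), the difference between the two variants is unlikely to be a lucky guess, and the change can be kept with some confidence.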

Load Time Optimization {3}

by Jenifer W

A website’s page load time plays a very important role in the user and consumer experience. The faster the page responds and loads, the better the experience will be for the user. In one study, “nearly one-third (32%) of consumers will start abandoning slow sites after one to five seconds” (“Seconds count”, 2010). This means that loading performance is crucial, and there are many ways to optimize page load time. One method that has reduced latency in recent years is AJAX, which retrieves small pieces of web content to alter parts of a page rather than re-creating a whole new page with similar content. However, this method alone does not provide the best optimization of page loading, as many sites reference tons of external objects, mostly HTTP requests for images, JavaScript, and stylesheets (Hopkins). Using AJAX does not guarantee that the user will not have to wait (“Best practices for,” ). For this reason, we can look at additional alternatives for optimizing page load time to improve a site’s usability and SEO. Many developers focus on the back end of the website to reduce costs, but that does little to reduce latency. “It seems almost as if the highspeed storage and optimized application code on the back end have little impact on the end user’s response time. Therefore, to account for these slowly loading pages we must focus on something other than the back end: we must focus on the front end” (Souders 2008). Below are suggestions from Hongkiat Lim’s article “Ultimate Guide To Web Optimization (Tips & Best Practices)”: read more...
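
To make the point about external objects concrete, here is a small sketch (my own illustration, not taken from Hopkins or Lim) that counts the images, scripts, and stylesheets a page references using Python's standard-library HTML parser; on a first visit, each of these typically costs the browser an extra HTTP request.

    # Rough illustration (assumed helper, not from the cited articles): count the
    # external objects a page references, since each image, script, and stylesheet
    # typically costs an extra HTTP request on first load.

    from html.parser import HTMLParser

    class ExternalObjectCounter(HTMLParser):
        def __init__(self):
            super().__init__()
            self.counts = {"images": 0, "scripts": 0, "stylesheets": 0}

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "img" and attrs.get("src"):
                self.counts["images"] += 1
            elif tag == "script" and attrs.get("src"):
                self.counts["scripts"] += 1
            elif tag == "link" and (attrs.get("rel") or "").lower() == "stylesheet":
                self.counts["stylesheets"] += 1

    sample_page = """
    <html><head>
      <link rel="stylesheet" href="site.css">
      <script src="app.js"></script>
    </head><body>
      <img src="logo.png"><img src="banner.jpg">
    </body></html>
    """

    counter = ExternalObjectCounter()
    counter.feed(sample_page)
    print(counter.counts)   # {'images': 2, 'scripts': 1, 'stylesheets': 1}

Running a count like this against your own pages is a quick way to see why front-end fixes such as combining files, spriting images, and caching often pay off more than back-end tuning.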