Optimizations For Improving Page Load Times

Cody Lindley | June 11, 2010

 

The issue at hand today is simply this: what are the weightiest optimizations that can be implemented when constructing web pages?  This area of web development is best categorized as load-time optimizations.  This article is going to examine the most critical load-time optimizations for crafting a speedy web page.

In another article I will be covering run-time optimizations.  Run-time optimizations focus on the performance of a web page once it has already been downloaded by the client (e.g. a web browser).  Specifically, they are concerned with the interaction and experience the user engages in once the web page is already downloaded.  In this article, however, we will concern ourselves with load-time optimizations.  Load-time optimizations are specifically geared toward shaving time off the initial load of a web page and all subsequent page loads.

Before looking at some load-time optimizations I would like to make a recommendation.  Use a measuring stick to test load times!  I know that it might seem silly to point out such an obvious recommendation, but measuring is extremely important.  In my experience, far too many developers forgo measuring page performance.  So, very simply put, make measuring web page performance a part of your daily routine as a web developer.

My measuring stick of choice is found at http://www.webpagetest.org.  As a quick preview of this tool, have a look at a "Web Page Test" done on the homepage of this site.

The "Web Page Test" is used to measure the loading time of a web page.  Above and beyond just measuring load times, it can be used to diagnose why a web page might be running slowly.  In a nutshell, it will provide the details required to understand the most costly parts of your HTML page architecture as they pertain to file sizes, HTTP requests, and document rendering.

Specifically, the “Web Page Test” online tool takes a detailed look at the HTTP activity that occurs during the loading of a web page.  It then takes this activity and provides several charts, graphs, visuals, and reports for understanding what is occurring during each HTTP request.  In addition, it also provides insight into how HTTP requests affect each other as a whole and how this pertains to the bottom line, which is total load time.

Honestly, grokking this tool is as simple as just visiting the site and using it.  If you are reading this article and seeking performance-based answers then the functionality and information provided by this tool will make perfect sense once you just start using it.  I encourage you to open a browser window now and use the “Web Page Test” tool to test a web page.  If you require a bit of a primer on how to leverage this tool you might check out the screencast from Dave Artz.  Dave is the Director of Website Optimization at AOL.  So, he might know a thing or two that could be helpful.

Now, just because this is my favorite measuring stick does not mean that it will be yours.  There are several tools available that provide very similar, if not identical, metrics.

Below is a list of some well-known alternative tools for measuring web page loading times.  I find myself leveraging many of these tools in combination with the aforementioned “Web Page Test” tool from AOL.

  • YSlow from Yahoo (Firefox Add-on)
  • Page Speed from Google (Firefox Add-on)
  • Pingdom Tools - Full page test (online tool)
  • FireBug net tab (Firefox Add-on)
  • Web Developer Tools (Built into Safari and Chrome)

Now that we have a measuring stick and we can test optimizations and locate optimization opportunities, let's move on to looking at some actual page optimizations.

I like to keep things simple and digestible.  So, we are only going to focus on five optimization categories in a sea of optimizations.  These categories are:

  • Reducing HTTP requests
  • Reducing file size
  • Ordering includes for speed
  • Distributing the load
  • Checking server response times

Reducing HTTP Requests

This optimization simply comes down to reducing the number of files that a web page requires.  By reducing dependencies for images, stylesheets, and scripts we can avoid the bottleneck that occurs when the browser gets tied up downloading assets (making HTTP requests).

When building web pages we should strive to reduce dependencies and thus HTTP requests.  This can be done by:

  1. Using CSS sprites.  Using a CSS sprite entails embedding more than one graphical element in a single image file.  Doing this avoids having one HTTP request fetch only one image element.  Using a sprite, we can download several image elements with a single HTTP request.
  2. Consolidate similar external dependencies into a single file for production code.  When developing CSS and JavaScript files it might be logical to keep files in a modular architecture.  However, once code moves from development to production you should be combining these modules into a single file (or as few files as possible) to reduce HTTP requests.  Many of the tools used to minify CSS and JavaScript also provide the ability to concatenate files together.  Consolidation can, of course, be done by hand.  However, in my projects I have leveraged a tool called minify to handle automated consolidation.
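
To make the sprite idea concrete, here is a minimal sketch.  The file name icons.png and the pixel offsets are hypothetical; the sprite is assumed to contain two 16x16 icons stacked vertically:

```html
<!-- One image file (icons.png, hypothetical) holds both icons, so the
     browser makes a single HTTP request instead of two.  Each class
     shifts the background to reveal the right slice of the sprite. -->
<style type="text/css">
  .icon       { display: inline-block; width: 16px; height: 16px;
                background-image: url(icons.png); }
  .icon-home  { background-position: 0 0; }      /* top 16px slice    */
  .icon-email { background-position: 0 -16px; }  /* bottom 16px slice */
</style>

<span class="icon icon-home"></span>
<span class="icon icon-email"></span>
```

Both elements are painted from the same downloaded image; adding a third icon to the sprite would still cost zero additional requests.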

Reducing File Size

This will come as no surprise, I hope.  Small files are loaded and parsed faster than bigger files.  Knowing this we should always be striving to shrink file sizes (images, css, js, html) down to the minimal acceptable size given our desired outcome.  Below we examine five ways in which we can cut down file size.

  1. Minify CSS and JavaScript files.  Once CSS and JavaScript files are combined, they should be minified.  Minifying involves the removal of comments and white space and the shortening of variable names.  Various tools provide various degrees of minification, but the key take-away here is that minification at some level should be leveraged to reduce the size of CSS and JavaScript files.  Personally I have used minify, YUI Compressor, ShrinkSafe, and Closure Compiler.  These tools are rather sophisticated and, while I do recommend their usage, a simple tool can often do the trick too.  For example, many of these tools are offered as online solutions (e.g. ShrinkSafe & YUI Compressor).
  2. Choose appropriate image formats and compress these images to the smallest acceptable size.  Not all image formats are created equal, and not all formats are the best solution for a given type of image.  Knowing when to use a .gif, .jpeg, or .png can be a bit of an art form.  Add in the complexities of dealing with images that contain transparency, combined with browsers that support transparent images differently, and you have a lot of complex decisions to make.  The depth of knowledge required to navigate the complexities of creating images and choosing a format for a web browser is beyond the scope of this article.  I suggest doing a Google search on the topic and becoming educated on when, why, and how to use a particular format for web pages.  Once you have decided upon a format to use, the next step is to reduce the file down to its smallest acceptable size.  To do this I have leveraged tools like pngcrush and smush.it.  Of course, Photoshop and Fireworks will also do the trick.  In my experience, Fireworks actually does a better job at compressing images.  The key here is to make sure images are compressed and the format chosen is the most ideal format given the usage.
  3. Don’t Flash unless you have to.  Flash is a great tool when it is crystal clear that its usage is justified.  Using Flash just for the sake of using Flash can be a costly decision, given the overhead of requiring a browser plugin (the Flash player) to run .swf files.  In the right situation Flash is a savior.  Keep in mind, however, that Flash does have an overhead cost.  My observation has been that .swf files are typically rather large in file size when compared to a page's other dependencies.  Also, it is typical that embedding Flash on a web page requires a bit of code overhead to deal with those users who do not have the proper Flash player.  Let's be clear, I am not a Flash hater.  I am just recommending exhausting other solutions (e.g. JavaScript solutions) that are smaller in file size before leveraging Flash.
  4. Don’t overcomplicate the DOM.  If you are still rolling out HTML with table layouts, this topic is directed at you.  I am not trying to be dogmatic about anything here.  However, the size of the HTML document does have an effect on load times.  Reducing the number of actual HTML elements is not something that should be overlooked.  This reduction mainly comes in the form of keeping the DOM simple.  So, use the least amount of markup required to remain semantic and SEO friendly, but still get the job done.
  5. Remove unnecessary whitespace from HTML documents.  Yes, believe it or not, I am going to mention this.  I have seen HTML pages that contain so much white space that removing it reduced the size of the HTML page by close to 90k.  Do not add unnecessary bloat to HTML pages; remove whitespace or, at the very least, be aware of its weighted effect on page size.  We minify CSS and JavaScript documents to optimize; why not also minify HTML documents by removing whitespace?
  6. Gzip all text files.  When you see the term Gzip, all that we are talking about is compressing the content contained in text files so that less data is transmitted.  HTML, CSS, and JavaScript are all text files.  These files can be compressed on the server and served encoded in a Gzip format to be unpackaged by the web browser.  Using Gzip requires a bit of server-side knowledge.  For more details check out this “Better Explained” article or, as usual, feel free to use Google to search for Gzip topics.
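
As one example of the server-side piece, on Apache the Gzip step can be as simple as a few lines of mod_deflate configuration in an .htaccess file.  This is just a sketch assuming mod_deflate is installed and enabled; other web servers have their own equivalents:

```apache
# Minimal mod_deflate sketch for Apache: compress the text-based content
# types (HTML, CSS, JavaScript) before they go over the wire.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript text/javascript
</IfModule>
```

Images should be left out of this list; formats like .jpeg and .png are already compressed and gain little from Gzip.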

Ordering Includes for Speed

Believe it or not, the order in which dependencies are included in a web page can directly affect how long it takes for a browser to render and completely load a web page.  This is due to the way in which browsers manage HTTP requests and, specifically, how a browser handles requests for the various types (.css vs. .js) of dependencies.  Below, I detail two techniques for including dependencies that can improve page load time.

  1. Include JavaScript at the bottom of the page.  A web browser will stop the rendering of a page while it waits for JavaScript files to download and do their thing.  This can cause some dramatic slowdowns as the page is being rendered by the browser.  By moving JavaScript to the bottom of the page, the user will typically begin to see the page render faster.  As such, we’re hoping that the perception by the user will be that our web page is stinkin' fast!
  2. Include CSS at the top of the page (in the header).  Getting the user as much visual information as possible as soon as possible is key.  Keeping CSS files at the top of a web page allows the page to render progressively, i.e. display content as soon as possible.
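
Putting both techniques together, a page skeleton ends up looking something like the sketch below (the file names styles.css and site.js are hypothetical):

```html
<!-- Stylesheets in the head so the page can render progressively;
     scripts just before the closing body tag so they do not block
     rendering.  File names here are hypothetical. -->
<html>
  <head>
    <title>Example page</title>
    <link rel="stylesheet" type="text/css" href="styles.css" />
  </head>
  <body>
    <p>Content starts rendering while scripts are still downloading...</p>
    <script type="text/javascript" src="site.js"></script>
  </body>
</html>
```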

Distributing the Load

Distributing the load can be a very complex topic.  In general, a web browser will only open so many connections to a single host name.  However, if we distribute our dependencies across 3 or 4 host names, the browser can open simultaneous connections, pulling dependencies from different servers at the same time.  Of course, there are limits.  Have a look at Browserscope for a good overview of which browsers support what.  To distribute the load and get simultaneous connections working in our favor we can leverage the following solutions.

  1. Use a CDN.  To distribute the load taken on by the web browser we can leverage a content delivery network (CDN).  This not only distributes the load, it also provides a collection of web servers distributed across multiple locations, serving your content to each user from the server closest to their geographic location.
  2. Use sub-domains to fake a CDN.  An alternative to using a CDN or purchasing more hosting is to fake a CDN using sub-domains. Using sub-domains provides the needed unique URL for allowing simultaneous connections.
  3. Leverage free tools from a free CDN.  When available, leverage dependencies hosted on free CDNs.  For example, the people at Google have been kind enough to provide CDN-hosted versions of many popular JavaScript solutions.  When it is logical, one should take advantage of these free services to help distribute the load.
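
For instance, instead of serving jQuery from your own host, a single script tag can pull it from Google's CDN (the 1.4.2 version shown was current at the time of writing):

```html
<!-- jQuery served from Google's free CDN rather than our own server.
     This moves the download to another host name, and the user may
     already have the file cached from visiting another site that
     references the same URL. -->
<script type="text/javascript"
        src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
```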

Checking Server Response Times

Not everything comes down to how fast a client (e.g. a web browser) can load and parse an HTML page.  The server plays a role as well.  Before you optimize for the client, it might make sense to verify there are no bottlenecks on the server side.  These issues can run the gamut from complicated SQL queries, poor database models, and overloaded servers to poor application architecture, to name just a few.

In general, if a web page is loading slowly, make sure to verify the slowness is not coming from the server.  I have been in a situation where, no matter what optimizations were applied, the server simply was not responding in an acceptable time.  This was due to issues on the server.  Our optimizations on the client side helped loading times to a degree, but when the server was taking 7 to 8 seconds to respond to an HTTP request, our client-side optimizations did little to improve loading times.  A lot of time on that project was wasted optimizing the wrong problem.  After that project, when faced with page-loading performance issues, I always first check to see how fast the server is actually responding to HTTP requests.
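
One quick and dirty way to take that measurement is curl's --write-out timing variables; the command below reports the time to first byte for a URL (swap in the page you are testing):

```shell
# Rough time-to-first-byte check: how long after the request is sent
# does the server begin responding?  Replace the URL with your own page.
curl -s -o /dev/null -w "TTFB: %{time_starttransfer}s\n" "http://www.example.com/"
```

A TTFB of several seconds is a strong hint that the fix belongs on the server, not in the page.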

Web page performance has become a rather large and complex field of study.  This article has only scratched the surface of this field of knowledge.  In reality, being an optimization engineer is a job role in and of itself.  In fact, companies these days are hiring optimization engineers, as well as creating entire departments based simply on the practice of optimizing code for delivery on computers and mobile devices.

If you want to dig deeper than what was discussed here and continue your understanding of load-time optimizations I highly recommend investigating the following resources:

 

About the Author

Cody Lindley is a client-side engineer (aka front-end developer) and recovering Flash developer. He has an extensive background working professionally (11+ years) with HTML, CSS, JavaScript, Flash, and client-side performance techniques as they pertain to web development.

If he is not wielding client-side code he is likely toying with interface/interaction design, PHP, MVC frameworks, iPhone development, jQuery, Dojo, or authoring material and speaking at various conferences. When not sitting in front of a computer, it's a sure bet he is hanging out with his wife & kids in Boise, Idaho, training for triathlons, skiing, mountain biking, road biking, alpine climbing, reading, watching movies, or debating the rational evidence for a Christian worldview. Currently he is working as a contractor/freelancer.

Find Cody on: