Whether a Web site is fast or slow depends on many different things. I’ve been very disappointed with the speed of some of our sites lately and I am working on addressing the issues I can change. This is important to those visiting the sites and is also a factor in Google’s ranking algorithm for search. To fully understand, I thought it would be good to document some of the main elements that make up the speed of a site.
Let’s start with the issues that are outside the Web site’s control. A visitor has to connect to the Internet to see a site, and the speed of that connection greatly affects their perception of the site’s speed. A slow connection can make even a well-optimized site feel slow, and a heavy site will feel slower still.
Another issue can be the number of “hops” between a visitor and the Web server. More hops generally mean higher latency. If a visitor is on one side of the world and the site’s server is on the opposite side, the connection may pass through many intermediate routers (the “hops”). A site owner can mitigate this by using a content delivery network (CDN) to cache content on servers in different parts of the world, so visitors are served from a location closer to them.
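To make the idea of hops concrete, here is a small Python sketch that counts the hops in traceroute-style output. The hostname, addresses, and timings below are made-up sample data, not a real trace:

```python
# Count the hops in traceroute-style output. Everything in this
# sample is illustrative, not a real network trace.
sample_trace = """\
traceroute to example.com (93.184.216.34), 30 hops max
 1  192.168.1.1  1.2 ms
 2  10.0.0.1  8.4 ms
 3  203.0.113.5  22.7 ms
 4  198.51.100.9  41.3 ms
 5  93.184.216.34  63.0 ms
"""

def count_hops(trace_output: str) -> int:
    # Hop lines begin with a hop number; the header line does not.
    hops = 0
    for line in trace_output.splitlines():
        parts = line.split()
        if parts and parts[0].isdigit():
            hops += 1
    return hops

print(count_hops(sample_trace))  # 5 hops between visitor and server
```

Running a real `traceroute` (or `tracert` on Windows) against your own site from different locations is a quick way to see how far away your visitors really are.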
Now that we’ve covered the biggest issues outside of a Web site’s control, let’s talk about the issues the site owner can control. The first element is the speed of the server itself. A server is simply a computer, and there may be many Web sites on a single server. Since a computer can only run so fast, a crowded server can lead to slow speeds. While it is possible to move a site to a different server, this is one of the more difficult elements to change, so the goal is to choose a good hosting company for your site from the very beginning.
The size of the page plays a big part. The more data you have to send to a visitor, the longer it takes. Therefore you want to optimize your graphics to keep their size to a minimum and not include anything that isn’t necessary on a page. These days the magic number is to keep the total page size under 3MB. This isn’t always possible, but at least it gives you a good number as a goal.
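A simple way to think about this is as a page-weight budget. Here’s a short Python sketch that totals up a page’s assets and checks them against the 3MB goal; the file names and sizes are made-up placeholders, not measurements from a real site:

```python
# Check a page's total weight against a ~3MB budget.
# Asset names and sizes below are illustrative placeholders.
PAGE_BUDGET_BYTES = 3 * 1024 * 1024  # the 3MB goal mentioned above

assets = {
    "index.html": 45_000,
    "styles.css": 80_000,
    "app.js": 250_000,
    "hero.jpg": 1_400_000,
    "logo.png": 30_000,
}

total = sum(assets.values())
print(f"Total page weight: {total / 1024:.0f} KB")
print("Within budget" if total <= PAGE_BUDGET_BYTES else "Over budget")
```

Notice how a single large image dominates the total, which is why optimizing graphics is usually the first place to look.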
You can have a single 100KB graphic and it will load much faster than 100 separate 1KB graphics. This is due to “page requests”: each element on a Web page must be requested separately, and every request carries its own overhead, so having too many elements can slow a site down. And if some of those elements are being requested from another site, the speed of that request could be out of your control. For example, we have a “Facebook box” that pulls the latest posts from our Facebook page (and thus the Facebook server) and displays them on our site. It isn’t uncommon for a site to be pulling data from a number of different servers. Keeping the requests to 30 or fewer is ideal, though often it can be difficult to keep it under 100.
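You can get a rough request count from a page’s HTML itself. Here’s a small Python sketch using the standard-library HTML parser to count the asset references a browser would fetch; the sample page below is made up for illustration:

```python
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    """Collect the src/href asset references a browser would request."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script", "iframe") and "src" in attrs:
            self.urls.append(attrs["src"])
        elif tag == "link" and "href" in attrs:
            self.urls.append(attrs["href"])

# A made-up sample page, including a third-party element like
# the "Facebook box" described above.
sample_html = """
<html><head>
<link href="styles.css" rel="stylesheet">
<script src="app.js"></script>
</head><body>
<img src="hero.jpg"><img src="logo.png">
<iframe src="https://www.facebook.com/plugins/page.php"></iframe>
</body></html>
"""

counter = RequestCounter()
counter.feed(sample_html)
print(len(counter.urls) + 1)  # +1 for the HTML page itself -> 6 requests
```

This undercounts a real page (CSS can pull in fonts and background images, and scripts can trigger further requests), but it’s a quick first look at how many elements you’re asking the browser to fetch.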
How can you reduce the number of requests? Obviously the biggest answer is simply not to include something on the page. But another method is to combine elements wherever possible so that a single request fulfills several needs. We’re definitely looking at ways we can do this to speed up our sites.
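Combining stylesheets is one of the easiest versions of this. Here’s a Python sketch of the idea: several small CSS files merged into one bundle, so one request replaces three. The file names and rules are made up for illustration:

```python
# Merge several small CSS files into one bundle so a single request
# replaces many. Names and contents below are illustrative.
css_files = {
    "reset.css": "body { margin: 0; }",
    "layout.css": ".wrap { max-width: 960px; }",
    "theme.css": "a { color: #0645ad; }",
}

# Label each section with its source file so the bundle stays debuggable.
bundle = "\n".join(
    f"/* {name} */\n{content}" for name, content in css_files.items()
)
print(bundle)
```

The same idea applies to JavaScript files, and to images via CSS sprites, where many small graphics are packed into one file and displayed individually with CSS positioning.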