Site Optimization Tutorial – Lesson 3

So far, we’ve learned how to shrink page layout code and how to effectively compress images. Still, there are a few more techniques you can apply to optimize your pages, and most of them spring from smart design sense. Follow these helpful design tips and your page load time will be about as minuscule as humanly possible – short of running your code through a Frinkian Debigulator.


  1. HTTP Compression
  2. Link Prefetching
  3. Heavy Baggage
  4. Cache In
  5. URL Abbreviation

HTTP Compression

HTTP compression isn’t something you put in your code. Instead, check with your sysadmin to see if it’s installed (or can be) on the server that’s dishing out your site’s pages.

What does it do? HTTP compression accelerates the transmission of pages from server to surfer. It allows for server-side HTML compression, so server apps (like Apache or Microsoft’s IIS) can compress the source code of your page before sending it out over the wires.

HTML compression works in almost every browser these days, and the server is savvy enough to dish out uncompressed files to the few browsers that don’t support it. Compression alone can cut the download size of a page’s HTML by a factor of three or more. (Exclamation mark.)

Other nuggets in the HTTP compression nougat include persistent connections (between server and client) and pipelining (which allows the server to rattle off files without waiting for a client’s “uh-huh, got that one, next please” response). Crackin’ good stuff, all of it. The only negative thing we can say about HTTP compression is that it’s probably already installed on your server, seeing how it’s well-adopted now (for obvious reasons). Still, if you suspect your hosting company is slow-with-the-program, it couldn’t hurt to ask.
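If your server runs Apache 2, for instance, enabling compression can be as simple as a few lines in the server config or an .htaccess file. Here’s a minimal sketch assuming the mod_deflate module is available (older Apache 1.3 setups used the third-party mod_gzip instead); the MIME types listed are just the usual text-based suspects:

```apache
# Compress text-based responses before they hit the wire
# (assumes mod_deflate is compiled in or loaded)
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript

# A few ancient browsers choke on compressed content --
# serve them compressed HTML only, or nothing compressed at all
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
```

Once it’s on, the server handles the negotiation automatically – browsers that announce gzip support get the small version, everyone else gets plain HTML.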

Link Prefetching

Link prefetching is a feature in some browsers which makes good thrift of a browser’s idle time by downloading files that aren’t on the current page but might be needed a page or two down the road. Don’t worry – link prefetching doesn’t slow pages down. The extra downloads don’t kick in until after the current page has finished loading and the browser doesn’t have any pressing matters to attend to.

If you had a photo gallery, for example, with Previous and Next navigation links by each picture, you could add prefetch link tags pointing to both of those destinations. That way, while the user is staring at one amazing travel picture, two other pages are downloading in the background. If the user does click either the Previous or Next links, those pages will already have been downloaded, and the content will display instantaneously. More technically speaking, “Guy clicks the link, see, Bada-bing, Bada-boom, it’s the next page, already. Aow!”

Link prefetching isn’t automatic – it relies on you, the developer, to encode hints as to what files are likely to be needed next. (Browsers aren’t prescient, but good webmasters understand a site’s traffic flow.) You can encode prefetch hints in either the HTTP header or the page’s HTML. There are a number of allowed variations on how to code a prefetch, but a simple HTML <link> tag with a relation type of “next” is small, easy, and our favorite. Like these:

<link rel="next" href="/images/next_big_picture.jpeg">

<link rel="prefetch" href="/images/previous_big_picture.jpeg">

You can include multiple prefetch hints on a page, though you don’t want to bog down your source code with too many links. Read Mozilla’s official link prefetching FAQ for the full picture.

Heavy Baggage

Here’s another tip to help keep your pages loading faster: Keep your scripts from loading until as late in the game as possible. Reason being, whenever a browser encounters a <script> tag, the HTML parser is brought to a grinding halt. Meanwhile, the browser has to run and fetch the script from the network, parse it diligently, then execute it. Then, and only then, will the browser fire up the ol’ HTML parser again.

So, if you have a script that’s not really needed (a virtual kitten with an unseemly interest in the user’s cursor is one fine example), and it’s fetched externally, eliminate it. If the script is a must-have, but can be deferred until onLoad, try moving the load to the end of the source code to get the page’s content displayed faster. You might pay a small penalty in the page’s overall load time, though, since the script got a late start.
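To make the placement concrete, here’s a bare-bones sketch (the script name is made up for illustration):

```html
<html>
<head>
  <title>Photo Gallery</title>
  <!-- A non-essential script here would stall the HTML parser
       while the browser fetches, parses, and executes it. -->
</head>
<body>
  <p>All of the page's real content loads and displays first...</p>

  <!-- Loading the nice-to-have script down here lets the content
       above render before the browser goes to fetch it. -->
  <script src="kitten_chaser.js"></script>
</body>
</html>
```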

As to optimizing scripts themselves: abbreviate the names of variables and functions with a vengeance, and always take time to see if somebody else hasn’t already coded something similar, smaller, and better.
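A quick before-and-after sketch of what abbreviation looks like in practice (the function itself is a made-up example); both versions behave identically, but the second costs fewer bytes every time it’s downloaded:

```javascript
// Before: descriptive names -- friendly to read, heavier on the wire
function calculateDiscountedPrice(originalPrice, discountPercent) {
  return originalPrice - (originalPrice * discountPercent / 100);
}

// After: same logic, names abbreviated with a vengeance
function cdp(p, d) {
  return p - (p * d / 100);
}
```

The trade-off, of course, is readability, so many developers keep a verbose working copy and only abbreviate the version they publish.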


Java? Don’t get me started. Don’t even get me started!

Seriously. Java applets are suitable for extremely targeted applications — such as a workflow manager on a corporate intranet — and that’s about it. Initializing Java applets is disastrous for performance and should only be done if the applet’s specific functionality is the whole point and purpose of the page.

We’re looking the 21st century square in the eyes, people. If you’re still of the opinion that a Java headline ticker or banner-rotation applet makes for a delightful web accoutrement, well, I can also get you a real nice deal on some preferred Webvan shares.


I’m not a Flash whiz, but thankfully, I know a couple. Here’s a quick checklist of optimizations for Flash designers trying to keep animations slim:

  • Use Flash where it counts. Avoid it whenever there’s a reasonable alternative using HTML and JavaScript.
  • Use symbols and keyframes strategically.
  • Minimize the use of bitmap images, sounds, and video.
  • Re-use bitmap images, sounds and video rather than using many unique ones.
  • Use the compression settings for individual images and sounds; when possible, compress them beforehand in another program like Fireworks.
  • Simplify vector images using the Modify -> Optimize menu.
  • Pace the download so that everything doesn’t have to load all at once.
  • When all else fails, build a preloader and display a “loading” message.

Cache In

Network (or Proxy) Caching

We previously discussed how browser-side caches store commonly-used images on users’ hard drives, but it’s important to note that similar caches exist all along the highways and byways of the internet. These “network caches” make websites appear more responsive because information doesn’t have to travel nearly as far to reach the user’s computer.

Some webmasters are leery of network caches. They worry that remote caches might serve out-of-date versions of their site – an understandable concern, especially for sites like blogs that update frequently. But even with a constantly-updating site, there are images and other pieces of content which don’t change all that often. Said content would download a lot faster from a nearby network cache than it would from your server.

Thankfully, you can get a site “dialed in” pretty nicely with just a basic knowledge of cache-controls. You can force certain elements to be cached for days on end while keeping other elements from being stored at all. META tags in your document won’t cut it; you’ll need to configure some HTTP settings to make caching really work for you. (Improved cache-controls, incidentally, are another benefit of using HTTP compression.) Anyhow, if all this piques your interest, start with Mark Nottingham’s excellent Caching Tutorial For Web Authors.
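As a taste of what those HTTP settings look like, here’s a sketch using Apache’s mod_expires module (assuming it’s installed; the lifetimes shown are arbitrary examples, not recommendations):

```apache
# Let images ride in caches for a week; keep HTML fresher
ExpiresActive On
ExpiresByType image/gif  "access plus 7 days"
ExpiresByType image/jpeg "access plus 7 days"
ExpiresByType text/html  "access plus 1 hour"
```

The server then stamps each response with the appropriate Expires header, and both browser caches and network caches along the way will honor it.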

Every Bit Counts

Alrighty. So you’ve done the big stuff – dropped the bit depth on your every PNG, cranked up the HTTP compression, and taken a (metaphorical) weedwhacker to your old, convoluted table layouts. Yet you’re still obsessed with how small, how fast, and how modem-user-friendly you can make your site. Ready to jump into some seriously obsessive-compulsive optimization?

You know those TV commercials where they zoom in on a supposedly “clean” kitchen counter, only to reveal wee anthropomorphic germ-creatures at play?

Well, you can similarly clean every extraneous detail from a site’s layout, and still have some nasty, nasty cruft living in the source code. What’s the point of novel-length meta keyword lists and content tags? C’mon, do you still believe that search engines care about that stuff? Not in this millennium. You’ll get better search referrals by thinking carefully about the real content on your pages and building an authoritative site that’s linked to widely.

Streamlining the <head> section of unneeded meta keyword/author/description content, and likewise junking giant scripts, makes a bigger impact, kilobyte-for-kilobyte, than sacrifices made elsewhere on the page. Keeping the <head> of your document short ensures the initial chunks of data the user receives contain some “real” content, which gets displayed immediately. That’s another notch for “perceived speed” improvements.

Of course, there are plenty of regular <body> bytes still worth tossing. Start with HTML comments, redundant white space, and returns. Stripping all these invisible items from your source code yields extra kilobytes of savings on average. You can do this manually if you like, or check out utilities like iWebTools’ HTML Optimizer and Dave Raggett’s HTML Tidy that can batch-process the drudgery of culling extraneous spaces, tabs, comments, line breaks, and the like.
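If you’d rather see the guts of the job than reach for a utility, the core of it is just a few pattern replacements. Here’s a rough JavaScript sketch (a hypothetical helper, not a substitute for a real tool like HTML Tidy – naive regexes like these will mangle whitespace inside <pre> blocks and scripts):

```javascript
// Strip HTML comments, collapse runs of spaces/tabs, drop blank lines
function stripCruft(html) {
  return html
    .replace(/<!--[\s\S]*?-->/g, '')  // remove HTML comments
    .replace(/[ \t]+/g, ' ')          // collapse whitespace runs
    .replace(/\n\s*\n+/g, '\n')       // squeeze out blank lines
    .trim();
}

var src = '<p>Hello</p>   <!-- navigation begins -->\n\n   <p>World</p>';
var out = stripCruft(src);
```

Run that across a bloated page and the markup the user sees is unchanged, but the invisible padding is gone.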

URL Abbreviation

Ever spot how links on the Yahoo frontdoor are generally just a few characters long? Go to the site and move your mouse over some of the news links near the top. You’ll see each is just a short path ending in a string of six numbers. Links generally run on much longer than that, especially if they include redirect codes or CGI variables. Put enough normal-length links on a page (viz. the Yahoo frontdoor, again) and a sprinkling of kilobytes – and seconds of download time – gets added to the code.

So what’s Yahoo doing with those funny links, anyway? They’re abbreviating their URLs using the mod_rewrite Apache module, so that a short link like “/s/882142” redirects to the story’s full-length URL. Implementing this requires getting your hands dirty with some server configuration. Specifically, you need to get mod_rewrite installed and poke around with the srm.conf file. Dirty work for many of us, but the payoff is worth several solid kilobytes on a link-heavy page.
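For the curious, a rewrite rule of that flavor might look something like this (the destination path here is invented for illustration; consult the mod_rewrite documentation for the real recipe):

```apache
# Map short links like /s/882142 to their full-length story URLs
RewriteEngine On
RewriteRule ^/s/([0-9]+)$ /news/stories/full_story_$1.html [R,L]
```

The parenthesized pattern captures the story number, and the $1 on the right side drops it into the long URL the server actually redirects to.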

One important question to ask yourself before leaping into URL abbreviation, though: Why should your homepage require hard-core server-side optimizations used by the major portals? If your frontdoor is farming links on the same Stalinist scale as Yahoo, your problem may well be “too many links”, comrade.

Of course you want every page of your site to be accessible. And if you have advertisers to please, they’ll demand navigation which brings traffic to their sites. Stuffing a lot of links on your page won’t solve these challenges.

It’s a zero-sum game – the more links on a page, the less likely any single link will be clicked. If your page has clumps and columns of links surrounding the content, readers will just tune them out altogether and focus on the good stuff in the middle of the page. There’s little point in having that happen.

Paul Boutin previously covered this topic for Webmonkey, and his parting advice puts it pretty succinctly:

“Study your server logs to find out what people are clicking on most and least. Cut what your readers aren’t reading, and replace it with the information that your log data proves they want.”

Understanding your site’s traffic flow helps you deal with advertisers, too. There’s no better way to upsell a sponsorship than with solid metrics showing how poorly their latest “great idea for a link” performs when compared against a sensible placement with tried-and-tested traffic.