JavaScript image pipelining
We were talking the other day about ways to speed up loading of the microPledge website, and we paused a while on just how much overhead HTTP headers and latency add, especially for tiny little images.
(Yeah, we know about HTTP/1.1 pipelining, but in Firefox it’s turned off by default, and IE doesn’t support it.)
We have a number of tiny images on our upcoming site — rounded corners, shadow-borders, and other things you can’t do with plain old CSS. On average, they’re about 200-300 bytes. That’s tiny even in ancient dialup terms.
But if you take a look at the request/response headers normally required to fetch an image, they add up to about 800 bytes: three to four times the size of the image itself. All that effort spent creating tiny images, wasted. What if you could glue them all together into one big long file, zip it up and send it over the wire? One single request.
That bit is relatively easy: you just need to write a quick-n-dirty client-side JavaScript function to get the file data and split it back up into individual pieces. But how do you get the browser to display each piece as an image on the page?
That’s where we got stuck.
After looking around a bit, we discovered somebody else has already thought of it, and Firefox implements the idea nicely with the “data:” URL scheme. Cool, but not cool enough, because Internet Explorer doesn’t support it. If you’re lucky enough to have an old version of IE6, there’s an alternative hack you can do, but you’re limited to black-and-white XBMs. Pretty.
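For the record, here’s the kind of thing we had in mind. A sketch only: it assumes the server glues the base64-encoded images into one text file, one image per line, in the same order as the img elements you pass in (the function name and file layout are our own invention):

    // Sketch of the splitting idea, not production code. Assumes the
    // server concatenates the base64-encoded images into one text file,
    // one image per line, in the same order as the imgs passed in.
    function loadImageBundle(url, imgs) {
        var xhr = new XMLHttpRequest(); // IE6 needs new ActiveXObject("Microsoft.XMLHTTP")
        xhr.open("GET", url, true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState !== 4 || xhr.status !== 200) return;
            var chunks = xhr.responseText.split("\n");
            for (var i = 0; i < imgs.length && i < chunks.length; i++) {
                // Works in Firefox and friends; IE6 ignores data: URIs,
                // which is exactly the sticking point above.
                imgs[i].src = "data:image/png;base64," + chunks[i];
            }
        };
        xhr.send(null);
    }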
David Shea writes about something related he calls CSS Sprites. Almost useful, but not quite right for what we want. :-)
Does anyone else have war stories about something like this?
8 June 2007 by Bryan
9 comments (oldest first)
Well, I’m no expert on this, but what happens if someone doesn’t have JavaScript? I’ve heard estimates that this is as high as 10%.
Have you tried that HTTP compression stuff?
How about hosting the images somewhere else, like that Amazon E3-whatever place?
I guess the important thing is to state your problem. Is the site too slow loading, or are you trying to save bandwidth?
Good point about people with JavaScript off. Ideally there’d be a non-JS fallback (though I suspect the figure’s much lower than 10% these days; there are too many Ajax apps now that’d be unusable without JS).
The problem is not that our site’s too slow, but that we want it even snappier. Snappy site = happy user. Bandwidth isn’t the issue, it’s latency. And Amazon S3, nice as it is, doesn’t solve the latency problem with lots of small images.
HTTP pipelining would definitely help, but it only works in Firefox, and even then you have to explicitly turn it on.
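For anyone who wants to experiment with it, the switches live in Firefox’s about:config (or a user.js file); something like the following should do it:

    // Firefox only: pipelining is off by default, so you have to opt in.
    // Set these in about:config or a user.js file.
    user_pref("network.http.pipelining", true);
    user_pref("network.http.pipelining.maxrequests", 8);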
You should check out this article:
http://borkweb.com/story/faster-page-loads-with-image-concatenation
Well, all of your images would get cached by the browser, right? Doesn’t that help?
Thanks, Josh — yeah, I saw that article when doing some research into what other people had done. Similar to CSS sprites in that it gets tricky and hard to maintain for lots of differently-sized images.
Greg, browsers definitely cache images, which is great, but that doesn’t help first page loads, and first impressions are pretty important. Then again, we may just be trying too hard. :-)
I’ve seen Google Maps using a technique similar to what games do with materials and/or sprites: use a large image containing many small textures/sprites (see this image). Can’t say exactly how they do that but I guess they crop the large image with small divs by offsetting the background image.
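Something like this, I imagine; pure guesswork on my part, and the file name and numbers are invented:

    // Guess at the technique, not Google's actual code: show one tile
    // out of a big sprite sheet by shifting the background image behind
    // a fixed-size element.
    function showTile(el, x, y, w, h) {
        el.style.width = w + "px";
        el.style.height = h + "px";
        el.style.backgroundImage = "url(sprites.png)";
        // Negative offsets slide the sheet so the wanted tile lands
        // inside the element's visible box.
        el.style.backgroundPosition = (-x) + "px " + (-y) + "px";
    }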
Hope it helps
I think even the Google search results page does this: the Goooooogle at the bottom is various crops into one image.
Here’s a new tool I just saw. Perhaps give it a try.
Thanks Greg!
That looks like quite a cool idea. I’ll have to give it a go. The trick will be how to accurately determine in advance when the user is likely to view the image.