Quick wins in web performance
Following on from a workshop by Andy Davies that I attended at Port80 this year, I'm forever looking at how newly launched sites perform. Most of them, invariably, have things that could do with tightening up: making the overall file size smaller and making them download and run quicker.
Running your site through something like webpagetest.org or gtmetrix.com will help show you (through their summary pages) where you could improve your site's performance by pointing out where it's currently failing.
Below are some 'quick wins': some easy things to do to make your sites better, smaller and more performant.
Optimising Images
Optimising images is something that's quite easy to do, but it's also something that's often missed. Images are also, generally, the biggest part of a web page in terms of file size. There are a couple of simple things we can do to reduce a page's file size and improve its performance.
Are you using the correct file format? It's easy to export a photo as a PNG-24 in Photoshop, but would it be better as a JPEG? Would it be better to use an SVG instead of some GIFs for your social media icons? Thinking about which file type you actually need will help reduce the file size before you've even left Photoshop.
Once you've got the correct file format, are the files as optimised as possible? You can only do so much with Fireworks or Photoshop. You'll need to use a compression tool like ImageOptim (lossless compression), JPEGmini (lossy compression) or SVGO (for compressing SVG files), or something integrated into an application, to squeeze even more kilobytes out of the files, again reducing the overall file size of the page.
If there are several icons on the site you're building, rather than giving each one a separate file which you would call in the CSS (for example), create a sprite sheet. Making one file out of several helps reduce the number of calls to the server for the images required.
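As a minimal sketch of the CSS side of this (the sprite file name, icon names and sizes here are all hypothetical), each icon shares one background image and just shifts its position:

```css
/* One HTTP request covers every icon (hypothetical file and sizes) */
.icon {
  background-image: url("images/social-sprite.png");
  background-repeat: no-repeat;
  display: inline-block;
  width: 32px;
  height: 32px;
}

/* Each icon simply moves the shared image into view */
.icon-twitter  { background-position: 0 0; }
.icon-facebook { background-position: -32px 0; }
.icon-rss      { background-position: -64px 0; }
```

The offsets depend entirely on how the icons are laid out in the sprite image, so they'd need adjusting to match your own file.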
Removing Duplication and Cruft
As I mentioned in a previous blogpost, I made a quick win by spotting duplicate code for Google Analytics. Going through your site and removing any duplication you find is a very quick win.
As well as duplicate code, there may be code that started out being used in the project but is now redundant. Removing any of this redundant code (with something like UnCSS) will help reduce your file size.
We're all human; sometimes we mistype things, sometimes we forget to delete references to things we've deleted. Looking at what gets downloaded by your browser will tell you if, somewhere in the page, you're referencing something that doesn't exist.
Browsers are triers. If a browser fails to find something you've included, an image or font file for example, it will keep looking for it, just in case it failed to grab it from the server the first time. If that something is a font, your site's visitors could be faced with a page with no text for some time. Checking for this during development, and before you push live, will help stop it happening.
Moving JavaScript to the Bottom
JavaScript in the head of the page will block the page rendering, giving the site's visitor a blank page until it's finished loading. Putting it at the bottom will help, as the page will render the HTML and CSS without being blocked.
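A minimal sketch of moving scripts to just before the closing body tag (the file names here are hypothetical):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- CSS stays in the head so the page can be styled as it renders -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <p>The content renders without waiting for any scripts.</p>
  <!-- Scripts come last so they don't block the rendering above -->
  <script src="scripts.js"></script>
</body>
</html>
```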
Reducing HTTP requests
You don't need a separate file for your reset, and your print stylesheet is so small you can add it to the end. You really only need to send one CSS file down the pipe.
Now, with responsive web design, you might have separate mobile, tablet and desktop CSS files. You really don't need to split them up: one file to rule them all. It cuts down the number of HTTP requests, meaning your page might actually perform a little better.
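As a sketch, the separate breakpoint files can simply become media queries in a single stylesheet (the breakpoint widths and class name here are just illustrative):

```css
/* Base (mobile-first) styles */
.container { width: 100%; }

/* Tablet styles: same file, no extra request */
@media (min-width: 600px) {
  .container { width: 90%; }
}

/* Desktop styles */
@media (min-width: 1024px) {
  .container { width: 960px; margin: 0 auto; }
}
```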
For this a pre-processor really helps. You can easily import all of your Sass/LESS/Stylus files into one, so when it compiles you get one gorgeous CSS file. That way you can still have the separate 'bits' of CSS to work on, but only one file on the server. If you're doing this with plain CSS, you just have one file and write all of your CSS in it.
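In Sass, for example, that might look like this (the partial names are hypothetical):

```scss
// main.scss — compiles down to a single main.css
// Each partial (_reset.scss, _typography.scss, etc.) stays
// separate to edit, but never reaches the server on its own
@import "reset";
@import "typography";
@import "layout";
@import "print";
```

The browser only ever sees the one compiled file, while you keep the tidy separation in your source.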
You could do the same for your JS. I don't, though; I like to keep certain bits in their own files. If I use jQuery it gets its own file, with its own HTTP request, as do Modernizr and Google Analytics.
Minification
If you wanted to, you could minify the HTML of your webpages. To be honest, I've never done this. I'm sure a quick google will find something that'll do it, but when you bring in the various CMSs available today, for me it's just easier to leave the HTML as it is. Do remove your comments, though; they don't need to be there in your production code.
When you're writing CSS, I'm sure that, like me, you want to be able to quickly understand what each bit is doing, adding comments so you can easily scan and find the relevant CSS you need to edit.
The browser doesn't care about readability. It doesn't matter how nicely you've laid out your CSS or what comments you've added. A quick win here is to minify the CSS you've concatenated so that it strips out any whitespace and any comments you've added. This will shave kilobytes off what's being pulled down the pipe, meaning a faster download.
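As an illustration (the rule itself is made up), the readable source and its minified output mean exactly the same thing to the browser:

```css
/* Readable, commented source — easy for you to scan and edit */
.site-header {
  background-color: #ffffff;
  padding: 10px 20px; /* breathing room around the logo */
}

/* The same rule after minification — same meaning, fewer bytes:
   .site-header{background-color:#fff;padding:10px 20px}       */
```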
If you use a pre-processor you can easily add all the comments you want and then minify the outputted CSS when compiling. There are several apps that allow you to do this for LESS, Stylus and Sass. If you're still writing CSS without one of them, you can still minify it with various applications.
Caching
You can tell the browser to cache your files by setting far-future Expires headers, so repeat visitors don't have to download everything again. To enable this you must be able to access the server, as you'd need to edit (or create) the .htaccess file and add some configuration. The HTML5 Boilerplate includes a ready-made snippet for this.
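A minimal sketch of that sort of .htaccess configuration, assuming Apache with mod_expires enabled (the file types and lifetimes below are just examples; the HTML5 Boilerplate's version is far more thorough):

```apache
<IfModule mod_expires.c>
  ExpiresActive On

  # HTML changes often, so don't cache it for long
  ExpiresByType text/html "access plus 0 seconds"

  # CSS and JavaScript
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"

  # Images
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/svg+xml "access plus 1 month"
</IfModule>
```

If you set long lifetimes like these, you'll need a way to bust the cache (renaming or versioning files) when you actually change them.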
The browser will cache and keep the files for as long as specified, if possible. I say 'if possible' because every browser has a different amount of space for caching, and on a mobile browser it's less still. Doing this helps to speed up future visits, and that's worth doing, I reckon.
LESS, Sass, Stylus
- LESS.app (OSX) - hasn't been updated since March 2013
- Procssor (OSX)
Hopefully this has helped put fire in your belly. Making your site as lean as possible is not only good for your visitors, it's also good for your server costs. Smaller file sizes and fewer files mean less storage space required and less bandwidth needed for the site to be accessed.
Everyone's a winner baby, that's the truth