Getting Your PageSpeed Score Up


Have you ever had one of those moments where you’re staring at your phone waiting for a web page to load, you’ve got full signal, and nothing’s happening? This kind of sucks, right? I mean, this site could be the best thing ever, but if it takes too long to load, I’m going to get bored, and I’m just going to go do something else. Now get this: you have about one second before a user starts to get distracted. One.

So what causes a site to have slow load time, and how do you fix it? My name is Matt Gaunt, I’m a developer advocate, and I wanted to talk to you about PageSpeed Insights, a tool which can be used to find issues affecting your site’s load time, and give you some tips on how you can fix them. When you visit PageSpeed Insights, you bash in your URL, and it’ll give you a score out of 100 and highlight any problem areas with your site. The focus of this tool is to point out issues which, if fixed, would improve your site’s overall load time and make your users happier.

The big question is, what do you do with this information? Well, the most common problems
fall into three buckets. First, problems that can be solved with build tools. For example, you could minify your JavaScript files and send fewer bytes down the wire, or merge multiple files so that, rather than making four requests, you only make one. The second set of issues is focused on the structure of your page, for instance where you link to your JavaScript files. And thirdly, problems which need server-side tweaks to enable things like caching of your page assets.

Let’s start off with
looking at build tools.

Many of you may already be using some kind of build tool today, but if you aren’t, then I recommend checking out Grunt or gulp.js. These build tools are picking up a lot of steam within the web development community, and they’re fairly easy to get to grips with. The main reason you’d want to use a build tool is to automate any set of tasks you run on your project on a regular basis. And most build tools will allow you to pull in third-party modules, which means you get a lot of stuff for free.

So let’s look at
a basic example, like JavaScript minification and concatenation. In other words, take some JavaScript files, strip out any whitespace, replace variable names with shorter ones, and then squish all the files together. This way our JavaScript file is smaller, making it faster to download, and the browser also has fewer files to fetch. To achieve this, you can use the uglify task: you define which files to minify and merge, and it just takes care of it for you. cssmin and htmlmin will do the same thing for CSS and HTML.
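As a rough sketch of what that build step might look like with gulp (the gulp-concat and gulp-uglify plugins are real, but the file paths here are just illustrative):

```javascript
// gulpfile.js — a minimal sketch, assuming gulp, gulp-concat and
// gulp-uglify are installed and your scripts live in src/js/.
const gulp = require('gulp');
const concat = require('gulp-concat');
const uglify = require('gulp-uglify');

// Merge every JavaScript file into one, then minify the result,
// so four requests become one and the payload shrinks.
function scripts() {
  return gulp.src('src/js/*.js')
    .pipe(concat('app.min.js'))
    .pipe(uglify())
    .pipe(gulp.dest('dist/js'));
}

exports.scripts = scripts;
```

You’d then run this with `gulp scripts` whenever you build.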
Another common problem area is image optimization. When you run PageSpeed Insights, it looks at your site and basically checks whether any of the images are larger than they need to be, and in the guidelines it will recommend jpegoptim, which is a command-line tool that strips out any metadata from the image files and can lead to some really impressive reductions in file size. And when I say it strips out metadata, I mean things like which camera was used to take the photo. It’s not needed for the browser to render the image, so just get rid of it. To add this to your build process, use imagemin for raster images and svgmin for vectors. svgmin will strip out whitespace and any unnecessary tags and attributes from your SVG files.
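In a gulp setup, that step might look something like this (a sketch assuming the gulp-imagemin plugin; the paths are illustrative, and note that newer versions of the plugin are ESM-only):

```javascript
// gulpfile.js fragment — a sketch, not a definitive setup.
const gulp = require('gulp');
const imagemin = require('gulp-imagemin'); // handles JPEG, PNG, GIF and SVG

function images() {
  return gulp.src('src/images/**/*')
    .pipe(imagemin())                      // strip metadata, recompress
    .pipe(gulp.dest('dist/images'));
}

exports.images = images;
```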
The second set of problems falls into the category of page structure.

Now, I don’t know about you, but I’ve definitely put a JavaScript file in the head of my page before. The problem with this is that the browser will start to parse the document, and as it reads the HTML tag, then the head tag, it will eventually reach your script tag, and this is where you’ll see the screen of bad times: nothing. The reason the page stops rendering is that JavaScript can manipulate the DOM and CSS. So the browser needs to go and fetch the JavaScript from the server and execute it once it’s downloaded, just in case the script changes the DOM or CSS. And it’s only after the JavaScript has finished executing that the page can continue rendering.

To avoid this delay, simply move your JavaScript to the bottom of the body element. This way the browser can parse and render the page up to that point, and then it’ll handle fetching the JavaScript at the very end.
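In markup terms, the change is just where the tag lives (file names are illustrative):

```html
<!-- Instead of loading scripts in the head… -->
<head>
  <script src="app.js"></script> <!-- blocks rendering here -->
</head>

<!-- …move them to the end of the body: -->
<body>
  <p>Page content renders first…</p>
  <script src="app.js"></script> <!-- fetched after the page is parsed -->
</body>
```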
An alternative and slightly more performant technique is to use the async attribute. What async does is tell the browser to download and execute the JavaScript asynchronously, avoiding the render block. The reason it’s slightly more performant is that the browser doesn’t need to wait for the DOM and CSS to be loaded before it can execute the JavaScript. But you need to be careful, because it means you can’t rely on the DOM, CSS, or any other JavaScript file being loaded by the time your JavaScript actually executes. Essentially, it might require you to think a little differently about how you use JavaScript, depending on what you’re doing.
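The attribute itself is just one word on the script tag (file name illustrative):

```html
<head>
  <!-- Downloaded and executed asynchronously; doesn't block rendering.
       Don't assume the DOM, CSS, or other scripts are ready when it runs. -->
  <script async src="analytics.js"></script>
</head>
```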
The second page structure woe is CSS. When we add a CSS link tag, the rendering of the page will block much like it did with JavaScript. We’ll see the browser step through the HTML tag, into the head, reach the CSS, and boom: we hit the sad screen of nothing. The browser has no choice but to do this. It has to download the CSS before it can start rendering anything, otherwise the user would see unstyled content followed by a flash as the CSS loads and everything is restyled.
The fix for this can be quite involved, and there are a number of approaches, each with their own pros and cons. If you have a relatively small amount of CSS for a page, you can inline it directly into the HTML, removing any requests for a CSS file and allowing the page to be rendered as soon as the HTML is loaded. If you have a large amount of CSS, however, you could inline just the above-the-fold CSS so the browser can render the most important content, and then load the rest of your CSS using JavaScript. The obvious downside of this approach is that it requires JavaScript, meaning if it’s switched off, you won’t get the additional styles. With both inlining techniques you also lose the ability to cache the CSS files, which could be a big loss if you have styles shared across multiple pages, in which case it might be worth splitting the shared styles out into CSS files so you can cache them, and inlining only the styles unique to each page.
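A bare-bones sketch of the inline-plus-async-load pattern (the styles, file name, and loader are all illustrative; real projects often use a helper like loadCSS rather than rolling their own):

```html
<head>
  <!-- Above-the-fold styles inlined so the first paint needs no CSS request -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .hero { background: #3367d6; color: #fff; }
  </style>
  <script>
    // Load the full stylesheet without blocking rendering.
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = 'full-styles.css';
    document.head.appendChild(link);
  </script>
</head>
```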
Ultimately, test it and see what works best for your site.

The final set of page load gotchas is server-side tweaks. Jumping straight in, let’s look at caching. When the browser makes a request for a file, your server should always return a Cache-Control max-age header. This is what tells the browser how long to cache a file for, meaning the next time a user visits your page and it needs that file, it doesn’t need to download it again. It’s just good to go from the cache.
When it comes to caching, there are a few things you need to ask yourself. First, how long can I cache my files for? Second, how do I actually get my server to return this Cache-Control max-age header? And thirdly, what do I do when I need to change a file?

Answering the first question, how long do I cache my files for, I’ve personally found HTML5 Boilerplate to be a good source of guidance. If you look at HTML5 Boilerplate’s .htaccess file, there are a number of caching rules defining the caching period depending on the type of file. Here’s a snippet of the file.
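Something along these lines (a condensed sketch in the style of HTML5 Boilerplate’s .htaccess, not a verbatim copy):

```apache
# Requires Apache's mod_expires to be enabled.
<IfModule mod_expires.c>
  ExpiresActive on
  ExpiresByType text/css               "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
  ExpiresByType image/jpeg             "access plus 1 month"
  ExpiresByType image/png              "access plus 1 month"
  ExpiresByType text/html              "access plus 0 seconds"
</IfModule>
```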
We’ve got CSS and JavaScript cached for a year, images cached for a month, and HTML not cached at all. Now, this is just a guideline. You may want to cache your images for a year, or cache your HTML for 30 minutes to an hour, just so that any user hopping backwards and forwards between pages gets a smooth experience.

If you’re using Apache to host your site, then you can just use this .htaccess file to implement the caching rules. If you’re using a different server, check out its documentation on how to add these headers.
Now, chances are you’re going to want to change some of these files before the cache expires, in which case consider adding a build step to revision your file names. This way there won’t be any issue with older versions of files being used. filerev is an example task designed specifically for this use case, and it can even be combined with other modules to replace the file names in your HTML and CSS so they point to the revisioned files.
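The configuration for this is tiny (a sketch assuming the grunt-filerev plugin; paths are illustrative):

```javascript
// Gruntfile.js fragment — a sketch, not a definitive setup.
module.exports = function (grunt) {
  grunt.initConfig({
    filerev: {
      assets: {
        // e.g. dist/js/app.js becomes dist/js/app.d41d8cd9.js
        src: ['dist/js/*.js', 'dist/css/*.css']
      }
    }
  });
  grunt.loadNpmTasks('grunt-filerev');
};
```

Pair it with something like grunt-usemin to rewrite the references in your HTML and CSS.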
The final thing to enable on your server is gzip. If you’ve not come across this before, it’s supported by all browsers, and it’s an incredibly powerful tool for compressing any text-based assets. There are some huge wins to be had, with compression rates of up to 80%. Now, how you enable this will depend on your server, but the gains are big for a relatively small amount of work. I recently got this working on my Nginx server, and it was as simple as adding gzip on.
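For reference, the relevant Nginx configuration looks something like this (the extra MIME types are optional; HTML is compressed by default once gzip is on):

```nginx
# nginx.conf fragment
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
```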
So those are the three most common problem areas raised by PageSpeed Insights: build tool fixes, page structure, and server-side tweaks. Now, these aren’t the only tools to help with performance, so if you’re looking for more tips, or you’re just curious about what other tools exist, I strongly recommend checking out this blog post by Addy Osmani. It covers many of the tools mentioned in this video in much more detail.

Hopefully this has been a useful insight into how you can improve your site’s load time performance. I challenge you to get 100 points on your PageSpeed score. Cheers.
