We all know by now that the speed at which your site loads is a search ranking factor. It matters in two ways. First, it matters because Google says it does. A few years ago, Google announced that they were going to start considering load times as one of the many factors involved in their search rankings.
Secondly, it matters because people say it does. When you go to load a website, how long do you wait before you’re frustrated, go back to the search results, and try something else? It might feel like a long time, and you might guess 5-10 seconds, but you’d be wrong. Most of the time, people will bail on a slow-loading site in under two seconds.
So if people leave when your site is slow, and Google bumps your search ranking down a slot or two for the same reason, it’s obvious that you want to solve the problem. Thankfully, Google provides a tool to help you do so: PageSpeed Insights.
What is PageSpeed Insights?
PageSpeed Insights is Google’s site speed analysis tool. You can visit the tool here and see what it says. When you plug in a URL, Google will analyze the page and check how it loads on both simulated mobile and desktop devices.
I say simulated devices because Google does not load the page on actual hardware. Instead, it uses the Blink rendering engine (the same engine behind Chrome) to emulate both a desktop and a mobile browser and see how the page performs.
PageSpeed Insights analyzes a site and gives it a score on a 100-point scale. The score is based on a mixture of web development best practices, Google’s own guidelines, and technical measurements. A score under 50 is poor; the higher above that you climb, the better. You’re generally shooting for something above 80.
Here are some considerations that Google will look at and suggest changes to, if your site needs those changes:
- Server response time. The longer it takes your web host’s server to respond to an incoming request, the worse your rating here will be. Improving this is generally a matter of moving web hosts, though; some data centers are just slower than others, and shared hosting is slower than dedicated hosting.
- Render-blocking scripts. Some JavaScript and CSS files have to finish loading before the browser can render the page. The solution is generally to load those styles or scripts asynchronously (or defer them), so the page can keep rendering while the files load in the background (see the sketch after this list).
- Browser caching. Caching allows a browser to save some elements of a page locally when the page loads, so that when the user returns later or navigates to another page on the site, they don’t have to download those files again. Shared elements like your main stylesheet or your logo image should be cached to speed up subsequent loads.
- Image optimization. Smushing images compresses the files without any visible loss of quality. No human eye would be able to see the difference, but file sizes can be reduced by 10-50%. Smaller files download much more quickly, which decreases load times.
- Redirects. If you redirect a user from one URL to another, that redirect takes time. Minimizing redirects speeds up loads.
- General file compression. Using something like GZip compression allows your server to send much smaller files; the browser decompresses them on arrival, which is far faster than downloading the uncompressed versions would be.
- Minified code. A minifier removes extraneous formatting, whitespace, and comments from your HTML, CSS, and script files, making them much smaller. Remember, the spacing and line breaks that make code easier for you to read are still bytes the browser has to download; any formatting the renderer doesn’t actually need is just padding out the file size.
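To make the render-blocking point from the list above concrete, here’s a minimal browser-side sketch of loading a script without blocking the initial render. It’s TypeScript you could drop into a page; the /assets/widget.js path is just a placeholder, not a real file.

```typescript
// Inject a script without blocking the initial HTML parse and render.
// "/assets/widget.js" is a placeholder path, not an actual file.
function loadScriptAsync(src: string): void {
  const script = document.createElement("script");
  script.src = src;
  script.async = true; // fetch in parallel, execute when ready
  document.head.appendChild(script);
}

loadScriptAsync("/assets/widget.js");

// The markup-only equivalent is to mark the tag itself as deferred:
// <script src="/assets/widget.js" defer></script>
```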
There are, of course, other elements that go into the load times of a page. You can actually see the entire process of loading your site if you go to a tool like Pingdom. Plug in your URL and then scroll down once it’s done analyzing. You’ll see the file requests and load order shown in a “waterfall”, showing exactly what is taking up the time and where.
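If you’d rather poke at this yourself without a third-party tool, modern browsers expose much of the same information through the Resource Timing API. Here’s a quick sketch you can paste into the browser console:

```typescript
// Log each resource the page requested, with when it started and how long
// it took - a rough, do-it-yourself version of the waterfall view.
const resources = performance.getEntriesByType(
  "resource"
) as PerformanceResourceTiming[];

for (const entry of resources) {
  console.log(
    `${entry.name}: started at ${entry.startTime.toFixed(0)}ms, ` +
      `took ${entry.duration.toFixed(0)}ms`
  );
}
```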
The WordPress Solution
Now, many of us use WordPress as a basic framework for our websites because it’s easy. You don’t have to code a framework. You don’t have to code a theme. You don’t have to code scripts to do things for you. You can install all of that as developed by someone else, and focus on writing content and running your business.
Of course, WordPress itself is a pretty well developed and fast platform. The web servers that host it may or may not be great, but that’s an individual issue. The rest of the stack is another story: plugin developers generally focus on function over speed, and when you have half a dozen or more plugins running at once, it’s entirely possible for all the fancy features to slow your site way down.
Thankfully, WordPress developers are also aware of the importance of speed. Many modern plugins are designed with speed in mind, and there are yet more plugins that will help you speed up your site. Here are some of my favorites.
Merge Minify Refresh
This is perhaps my favorite new plugin I’ve discovered recently. If you look up at the page speed factors, one of them is the size of your code and script files. Having too many script files and CSS files that take too long to load will definitely put a dent in your page speeds.
What this plugin does is exactly what the name says. Part 1 is the merge. Any script files are merged together when possible, and the same goes for CSS files. Groups of files for different purposes are bunched together: everything that loads on a page is grouped into one file, anything that loads independently of any particular page is grouped into another, and so forth.
Part 2 is to then minify those grouped files, smashing down all the unnecessary spacing and symbols and removing comments. It uses Minify for CSS and Google’s Closure Compiler for JavaScript to generate new, smaller files. When a user visits your page, they download the smushed and minified files instead of all of the individual scripts.
It’s worth noting that the original script files are left untouched. If you need to go in and make a change, you can do so easily; your comments are intact and your code is human-readable, if that’s the way you left it. You can edit the source files and run the plugin again, and it will push the changes into the smushed files. That’s part 3, the refresh.
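To give you a feel for the general merge-and-minify technique (this is not the plugin’s actual code; the file paths are made up, and the use of the terser package is my own assumption for illustration), a build step like this is roughly what’s happening behind the scenes:

```typescript
import { readFileSync, writeFileSync } from "fs";
import { minify } from "terser"; // assumed dependency: npm install terser

// Hypothetical list of scripts your theme and plugins would otherwise load
// individually - each one a separate HTTP request.
const sources = ["js/slider.js", "js/analytics.js", "js/menu.js"];

async function buildBundle(): Promise<void> {
  // Part 1: merge - concatenate everything into a single file.
  const merged = sources
    .map((path) => readFileSync(path, "utf8"))
    .join("\n;\n");

  // Part 2: minify - strip comments, whitespace, and shorten names.
  const result = await minify(merged);

  // The originals stay untouched; only the generated bundle gets served.
  writeFileSync("js/bundle.min.js", result.code ?? merged);
}

buildBundle();
```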
Let me tell you, this plugin is incredibly effective. On one site I installed it on, I went from a nearly two-second load time to under half a second, reducing requests from around 80 to around 30, and it reduced my bounce rate, which directly improved some of my search rankings. All in one plugin; not too bad at all.
Optimus
Optimus is one of many image optimizers. Just like how smashing down your script files makes them smaller, allowing them to download faster, smashing your image files does the same thing.
Images are chock full of unnecessary data. Some of it is metadata you don’t need when you share the image online, like the original author of the file, the EXIF data of a photograph, and so forth. If you want to keep all of that intact, you can do so, but generally it’s not useful outside of a photographer’s portfolio website.
More importantly, images are full of insanely minute details that the human eye can’t see. Most of you probably don’t remember a time when web images were capped out at 32 colors or what have you, or when it was a big deal that a computer monitor could handle the “millions of colors” setting. These days, that’s the norm for everything from a PC to a smartwatch.
The result of having access to millions of colors – every possible hex code for colors, in fact – is that computers don’t hesitate to use them. However, every variation in a color is a bit more data the image needs to contain to render itself properly.
The fact is, you can scrunch out a ton of those color variations and never see the difference in the final image. If two pixels next to each other are both sky blue, but one is #87ceeb and one is #87ceec, they look identical to your naked eye. I doubt you’d even be able to see the difference between the two if they were next to each other and you were zoomed in to see the dividing line.
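If you want to put a number on just how small that difference is, here’s a tiny TypeScript sketch comparing the two values from the example:

```typescript
// Compare the two "sky blue" values mentioned above, channel by channel.
function hexToRgb(hex: string): [number, number, number] {
  const n = parseInt(hex.replace("#", ""), 16);
  return [(n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff];
}

const a = hexToRgb("#87ceeb"); // [135, 206, 235]
const b = hexToRgb("#87ceec"); // [135, 206, 236]

// Only the blue channel differs, by a single step out of 255 - far below
// anything the eye can distinguish between two adjacent pixels.
console.log(a, b);
```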
Image smushing using a plugin like Optimus removes a lot of this extraneous data. It’ll turn that ceec pixel into a ceeb pixel, reducing the data in the image file, but keeping the image itself functionally unchanged. The only functional data you might lose is if the image has a steganographic message in it.
When you upload an image to WordPress and you have Optimus installed, it simultaneously uploads the image to the Optimus servers. Optimus smushes down the image file and sends it back, and the smaller image is the one added to your file library in WordPress. You save a ton of space using an automatic process you never see.
WP Super Cache
Caching, as I described above, is the ability for a browser to save static bits and pieces of a website locally so it doesn’t have to download the same file multiple times. This is fine and easy for things like images.
One of the strengths of WordPress is how modular it can be, thanks to its ability to dynamically generate pages from their component parts. The downside is that there’s no finished “page” file sitting around that can simply be downloaded and cached. That’s where WP Super Cache comes into play.
WP Super Cache generates static HTML files out of your dynamic WordPress pages. Those HTML files are then stored on your server, saving it the work of assembling the page on the fly for every visitor.
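Conceptually, server-side page caching looks something like the Node/TypeScript sketch below. WP Super Cache itself is a PHP plugin and works differently under the hood (it writes real HTML files, for one), so treat this purely as an illustration of the idea:

```typescript
import { createServer } from "http";

// Hypothetical renderer standing in for WordPress assembling a page from
// its theme, posts, widgets, and so on - the slow, dynamic part.
async function renderPage(url: string): Promise<string> {
  return `<html><body><h1>Rendered ${url}</h1></body></html>`;
}

// In-memory stand-in for the static HTML files the plugin would write.
const cache = new Map<string, string>();

createServer(async (req, res) => {
  const url = req.url ?? "/";
  let html = cache.get(url);
  if (!html) {
    html = await renderPage(url); // expensive work happens once...
    cache.set(url, html);         // ...then the result is reused for later visitors
  }
  res.setHeader("Content-Type", "text/html");
  res.end(html);
}).listen(8080);
```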
This static file can also be cached locally for the user, to save even more time. The whole thing is highly configurable and very well trusted amongst WordPress site owners. In order to make sure the files are stored locally by the user, I recommend supplementing this plugin with a related one. In my case, I prefer:
Leverage Browser Caching
Leverage Browser Caching does pretty much what the name implies: it tells visiting browsers to hold onto your static files by setting longer expiration times on them, so returning visitors reuse their local copies instead of downloading everything again.
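In practice, “longer expiration times” just means sending longer-lived caching headers along with your static files. The plugin handles that for you on the WordPress side, but as a rough illustration in TypeScript (a hypothetical helper, not the plugin’s code), the headers look like this:

```typescript
import type { ServerResponse } from "http";

// Attach long-lived caching headers to a static asset response so the
// browser keeps its local copy instead of re-downloading it every visit.
function addCachingHeaders(res: ServerResponse): void {
  const thirtyDays = 30 * 24 * 60 * 60; // lifetime in seconds
  res.setHeader("Cache-Control", `public, max-age=${thirtyDays}`);
  res.setHeader(
    "Expires",
    new Date(Date.now() + thirtyDays * 1000).toUTCString()
  );
}
```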
Scripts to Footer
Lazy loading is the process of telling certain files, usually scripts, that they can load later, when the user actually needs them. It addresses the render-blocking issue from the PageSpeed factors above: if the rest of the page can finish loading before the scripts, nothing is left waiting on them, and the page feels faster. Putting the scripts in the footer is a simple, approximate way to get that behavior.
Scripts to Footer basically just puts any non-essential scripts into the footer of your site. Essential elements like site design CSS load at the top, but things like analytics and slide-ins or exit intent pop-overs load later.
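A rough sketch of what “load it later” means in practice (not the plugin’s own code, and the analytics path is a placeholder): wait until the page itself has finished loading, then pull in the extras.

```typescript
// Once the page has finished loading, pull in non-essential extras
// like analytics or pop-over scripts.
window.addEventListener("load", () => {
  const script = document.createElement("script");
  script.src = "/assets/analytics.js"; // placeholder path
  document.body.appendChild(script);
});
```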
Disqus Conditional Load
Disqus is tricky. In the past, I’ve avoided recommending it because it’s not necessarily SEO friendly. You want Google to be able to read the comments, after all. On the other hand, it’s a good comments system and I use it myself.
Disqus Conditional Load is lazy loading specifically for the Disqus comment system. It makes sure the bloated comments scripts load later, only when the user scrolls far enough down to see them. It speeds up site load times, at the expense of visibility on page load.
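The underlying technique is worth seeing once. This is a conceptual sketch, not the plugin’s actual code; the element ID and script URL are placeholders:

```typescript
// Load the comment embed script only once the comments container
// actually scrolls into view.
const container = document.getElementById("comments"); // placeholder ID

if (container) {
  const observer = new IntersectionObserver((entries) => {
    if (entries.some((entry) => entry.isIntersecting)) {
      const script = document.createElement("script");
      script.src = "https://example.disqus.com/embed.js"; // placeholder URL
      document.body.appendChild(script);
      observer.disconnect(); // only needs to happen once
    }
  });
  observer.observe(container);
}
```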
The only reason I’m recommending this now is that it has been made SEO-friendly. Google can see the comments now, which you can see if you use the “fetch as Google” function in webmaster tools. You get the best of both worlds!