Blog speed optimization

Motivation


I like pitting my blog against website speed testers in hopes of getting a good score. petrusmanner.com usually does pretty well, on account of it being what is commonly referred to as a "static website". This means I have no backend or database, and all the content is precompiled from Markdown into a pile of HTML pages.

Let's compare some internet averages and see how they stack up against my blog.
I will be using this blog post for testing instead of the front page, since the latter has almost no content.

Averages are from the HTTP Archive, and this blog was tested using Google Lighthouse.

Desktop

Category         Average             My blog
Page size        2.1 MB (2124 KB)    0.05 MB (50.6 KB)
Total requests   73                  21
FCP              2.1s                0.7s

Mobile

Category         Average             My blog
Page size        1.9 MB (1914 KB)    0.05 MB (52 KB)
Total requests   69                  21
FCP              5.2s                2.3s

Obviously, since my blog is static, it would be somewhat unfair to compare it to large, dynamic websites like Facebook or Twitter. Keep in mind, however, that people still largely use bulky CMSes like WordPress to run their blogs, so in my mind I am competing against those people.

Identifying the problems


Let's start by running my blog through Google PageSpeed and going through the most significant problems.

Eliminate render-blocking resources

Pagespeed list of render blocking resources

While the Google Fonts request is well optimized for size (it fetches only certain weights and a limited character set), the time it takes to complete is abysmal.

I can also inline the CSS rules into the main HTML document, saving time and eliminating a request.

Reduce unused JavaScript

Pagespeed list of unused javascript

My blog only has about 40 lines of JavaScript, so rewriting it in vanilla JS instead of jQuery will be trivial; I should have done that to begin with.

Additionally, the script itself can be inlined, once again eliminating an extra request.

Enable text compression

Pagespeed list of uncompressed resources

I honestly thought I already had file compression turned on in Nginx. Regardless, this is also trivial to fix.

Serve images in next-gen formats

Pagespeed list of images not in webp format

Potential savings from this are pretty slim, but let's leave it as a bonus step. Maybe there exists some CLI tool to mass convert images to webp?

Optimizing the site


Now that the most significant problems have been identified, the optimization can start.

Eliminate render-blocking resources

One simple way to prevent CSS from render blocking would be to use asynchronous loading with rel="preload".

This works, but is not supported in IE or Firefox. I don't care about IE at all, but the incompatibility with Firefox makes this strategy a no-go.

There is, however, a really cool hack to make this work that I found here, and it involves using the media attribute available to link tags:

<link rel="stylesheet" type="text/css" href="[long url]" media="print" onload="this.media='all'">

You can read the full explanation in the linked post, but essentially it works by setting the link tag's media attribute to print, which means the stylesheet should only be applied to print media, i.e. when the page is printed out.

This makes the request non-render-blocking; once the stylesheet has loaded, the onload handler sets the media type to all, at which point the CSS is applied to the whole page.
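One caveat worth noting (my addition, not part of the snippet above): with JavaScript disabled, the onload handler never fires, so the stylesheet would stay print-only. A common companion to this hack is a noscript fallback:

```html
<!-- Fallback: load the stylesheet normally when JavaScript is disabled.
     [long url] stands for the same stylesheet URL as in the link tag above. -->
<noscript><link rel="stylesheet" type="text/css" href="[long url]"></noscript>
```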

I eventually just ended up inlining all the CSS by combining the files into a Hugo partial. Modifying one giant CSS file is a bit cumbersome, but it will have to do.
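As a rough sketch, an inlining partial in Hugo can look something like this (the file names and layout here are my assumptions, not necessarily how this site is organized):

```html
<!-- layouts/partials/inline-css.html (hypothetical path) -->
{{ $css := resources.Get "css/main.css" | minify }}
<style>{{ $css.Content | safeCSS }}</style>
```

The partial would then be included from the base template's head with {{ partial "inline-css.html" . }}.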

As for Google Fonts, I simply stopped using it by swapping to a web-safe classic: Arial.

Reduce unused JavaScript

I threw out jQuery and wrote the scripts again with vanilla JS:

// example of an old jQuery handler
$('#menu-button').click(function () {
  toggleMenuIcon();
  $('#menu').toggleClass('showMenu');
});

// new vanilla JS version; note that unlike jQuery's .click(),
// the handler also has to be attached explicitly
const menuButtonClicked = function () {
  toggleMenuIcon();
  const menu = document.getElementById('menu');
  menu.classList.toggle('showMenu');
};

document.getElementById('menu-button').addEventListener('click', menuButtonClicked);

I also inlined the script and placed it in the footer.

Enable text compression

This was probably the easiest one to fix. I added the following directives to Nginx:

gzip on;
gzip_static on;
gzip_types text/plain text/css text/javascript;
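One detail worth knowing: gzip on; compresses responses on the fly, while gzip_static on; only takes effect when a precompressed .gz file already exists next to the original. Since a static site is rebuilt on every deploy anyway, the output can be precompressed at build time, along these lines (public is Hugo's default output directory, an assumption on my part):

```shell
#!/bin/sh
# Precompress text assets so nginx's gzip_static can serve the .gz
# files directly instead of compressing on every request.
out=public          # Hugo's default output directory (assumed)
mkdir -p "$out"     # no-op on a real build; keeps this sketch runnable
find "$out" -type f \( -name '*.html' -o -name '*.css' -o -name '*.js' \) \
  -exec gzip -k -9 -f {} \;   # -k keeps originals, -f overwrites stale .gz
```

Nginx will then pick the .gz variant whenever the client sends Accept-Encoding: gzip.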

Serve images in next-gen formats

There is an apt package for a WebP CLI tool (cwebp), so I used it in conjunction with this script I found to mass-convert my image files to WebP:

#!/bin/bash

# convert JPEG images (lossy, quality 90)
find "$1" -type f \( -iname "*.jpg" -o -iname "*.jpeg" \) -exec bash -c '
  webp_path="${0%.*}.webp"
  if [ ! -f "$webp_path" ]; then
    cwebp -quiet -q 90 "$0" -o "$webp_path"
  fi' {} \;

# convert PNG images (lossless)
find "$1" -type f -iname "*.png" -exec bash -c '
  webp_path="${0%.*}.webp"
  if [ ! -f "$webp_path" ]; then
    cwebp -quiet -lossless "$0" -o "$webp_path"
  fi' {} \;
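For reference, the path rewrite the script performs (replacing a file's extension with .webp) looks like this in isolation; the example filename is made up:

```shell
# Swap a file's extension for .webp: ${src%.*} strips the last extension.
src="static/img/photo.png"
webp_path="${src%.*}.webp"
echo "$webp_path"   # prints static/img/photo.webp
```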

Speed gains and conclusions


The optimizations are done; time to look at how the site performs compared to its older version:

Desktop

Category         Old                 New                 Improvement
Page size        0.05 MB (50.6 KB)   0.02 MB (28.1 KB)   44% (22.5 KB)
Total requests   21                  12                  42% (9)
FCP              0.7s                0.3s                57% (0.4s)

Mobile

Category         Old                 New                 Improvement
Page size        0.05 MB (50.6 KB)   0.02 MB (28.3 KB)   44% (22.3 KB)
Total requests   21                  12                  42% (9)
FCP              2.3s                1.0s                56% (1.3s)

I was not only able to reduce the size of my blog; inlining the CSS and JS files also cut down the number of requests.

Google PageSpeed now gives me a perfect speed score:

Pagespeed screenshot showing 100 points

Next post in this series will be about optimizing for SEO. Stay tuned.