Blog Speed Optimization

15 July, 2021

Motivation

I like pitting my blog against various performance analyzers in hopes of getting a good score.

Usually I do pretty well, mostly on account of this blog being a completely static website.

Let’s compare some internet averages for various performance metrics and see how they stack up against my blog.
This is the blog post I will be using for the benchmarks.

Averages are from the HTTP Archive. Performance data for my blog was gathered using Google Lighthouse.

Desktop

Category          Average              My blog
Page size         2.1 MB (2124 KB)     0.05 MB (50.6 KB)
Total requests    73                   21
FCP               2.1s                 0.7s

Mobile

Category          Average              My blog
Page size         1.9 MB (1914 KB)     0.05 MB (52 KB)
Total requests    69                   21
FCP               5.2s                 2.3s

Disclaimer: I understand that the static nature of this blog makes the above comparison somewhat unfair, but this was the best data source I could find, so I’m using it regardless.

Identifying the problems

Let’s start by running my blog through Google PageSpeed and identifying the most glaring problems.

Eliminate render-blocking resources

Pagespeed list of render blocking resources

The Google Fonts request is well optimized for size, as it only fetches certain weights and character sets, but the time it takes to complete could be improved.
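The request in question looks roughly like this (the family, weights and subset shown here are placeholders rather than what this blog actually loads):

<!-- only the listed weights and the latin subset are requested, keeping the CSS payload small -->
<link rel="stylesheet"
      href="https://fonts.googleapis.com/css?family=Open+Sans:400,700&subset=latin&display=swap">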

As for the CSS, I could inline it in the main HTML file, saving time and eliminating a few extra requests.

Reduce unused JavaScript

Pagespeed list of unused javascript

At the time of writing, this blog only has about 40 lines of JavaScript. Rewriting those in vanilla JS instead of jQuery should be trivial. Frankly, I should have done it in plain JS to begin with.

Additionally, the script itself can be inlined, once again saving time by eliminating an extra request.

Enable text compression

Pagespeed list of uncompressed resources

I honestly thought I already had file compression enabled in Nginx, but apparently this is not the case. Easy fix.

Serve images in next-gen formats

Pagespeed list of images not in webp format

Potential savings from this are pretty slim, but let’s leave it in as a bonus step. Maybe there exists some CLI tool to mass-convert images to WebP?

Optimizing the site

Now that the most glaring problems have been identified, the optimization can begin.

Eliminate render-blocking resources

One way to prevent CSS from blocking rendering would be to load it asynchronously with rel="preload".
I considered this approach, but it turns out it does not work in Firefox, which makes this strategy a no-go.
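For completeness, the rejected pattern would have looked roughly like this (the href is just a placeholder for this blog’s stylesheet):

<!-- fetch the stylesheet without blocking rendering, then apply it once it has loaded -->
<link rel="preload" href="/css/style.css" as="style"
      onload="this.onload=null; this.rel='stylesheet'">
<!-- fallback for visitors with JavaScript disabled -->
<noscript><link rel="stylesheet" href="/css/style.css"></noscript>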

There is, however, a really cool hack described here that achieves the same effect. It relies on the media attribute available on link tags:

<link rel="stylesheet" type="text/css" href="[long url]" media="print" onload="this.media='all'">

Essentially this works by setting the link tag’s media attribute to print, which tells the browser the stylesheet only applies to print media, i.e. when the page is printed out. Print stylesheets don’t block rendering, so the request becomes asynchronous; once the stylesheet has loaded, the onload handler sets media back to all, applying the CSS to the page. While this is a cool trick, I eventually just ended up inlining all the CSS into the main HTML file.

The Google Fonts request also got eliminated entirely. I decided to stop using an externally hosted font from Google and instead swapped to a web-safe classic, Arial.
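Put together, the head of each page now just carries the styles inline and falls back to a system font stack, roughly like this (a simplified sketch, not the blog’s actual rules):

<head>
  <!-- no external stylesheet or Google Fonts link, so no render-blocking requests -->
  <style>
    body {
      font-family: Arial, Helvetica, sans-serif;
    }
    /* ...rest of the site's CSS, inlined... */
  </style>
</head>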

Reduce unused JavaScript

I threw out jQuery and rewrote the scripts in vanilla JS:

// example of an old jQuery handler
$('#menu-button').click(function () {
  toggleMenuIcon();
  $('#menu').toggleClass('showMenu');
});

// new vanilla JS version
const menuButtonClicked = function () {
  toggleMenuIcon();
  const menu = document.getElementById('menu');
  menu.classList.toggle('showMenu');
};

// bind the handler, replacing jQuery's .click()
document.getElementById('menu-button').addEventListener('click', menuButtonClicked);

I also inlined the script and placed it in the footer.
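In practice that just means the handful of lines above sit at the very end of the body, something like this sketch:

  <!-- inlined at the end of <body>: no extra request, and the elements it touches are already parsed -->
  <script>
    // ...the vanilla JS menu code shown above...
  </script>
</body>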

Enable text compression

This was probably the easiest one to fix. I added the following gzip directives to Nginx, so files are now compressed before transit, reducing transfer sizes.

# enable on-the-fly gzip compression
gzip on;
# serve pre-compressed .gz files when they already exist on disk
gzip_static on;
# compress these MIME types in addition to text/html, which is always compressed
gzip_types text/plain text/css text/javascript;

Serve images in next-gen formats

There is an apt package for a WebP CLI tool, so I used that in conjunction with this script I found to mass-convert all of my images to WebP:

#!/bin/bash
# Mass-converts the images under the directory given as the first argument.
# Requires the cwebp binary from the "webp" package.

# converting JPEG images (lossy, quality 90)
find "$1" -type f -and \( -iname "*.jpg" -o -iname "*.jpeg" \) \
-exec bash -c '
webp_path=$(sed "s/\.[^.]*$/.webp/" <<< "$0");
if [ ! -f "$webp_path" ]; then
  cwebp -quiet -q 90 "$0" -o "$webp_path";
fi;' {} \;

# converting PNG images (lossless)
find "$1" -type f -and -iname "*.png" \
-exec bash -c '
webp_path=$(sed "s/\.[^.]*$/.webp/" <<< "$0");
if [ ! -f "$webp_path" ]; then
  cwebp -quiet -lossless "$0" -o "$webp_path";
fi;' {} \;
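The converted files still need to be referenced from the pages. One way to do that, assuming no server-side content negotiation, is a picture element that falls back to the original image (the file names here are placeholders):

<!-- browsers that support WebP pick the first source; everything else falls back to the PNG -->
<picture>
  <source srcset="/img/example.webp" type="image/webp">
  <img src="/img/example.png" alt="Example image">
</picture>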

Speed gains and conclusions

The optimizations are done, so it’s time to compare the performance:

Desktop

Category          Old                  New                  Improvement
Page size         0.05 MB (50.6 KB)    0.02 MB (28.1 KB)    44% (22.5 KB)
Total requests    21                   12                   42% (9)
FCP               0.7s                 0.3s                 57% (0.4s)

Mobile

Category          Old                  New                  Improvement
Page size         0.05 MB (50.6 KB)    0.02 MB (28.3 KB)    44% (22.3 KB)
Total requests    21                   12                   42% (9)
FCP               2.3s                 1.0s                 56% (1.3s)

After these optimizations, Google PageSpeed now gives me a perfect score for speed:

Pagespeed screenshot showing 100 points