Optimizing My Blog Part 1 - Speed

Motivation


I like running my blog through those “website speed tester” applications in hopes of a good score. My website (petrusmanner.com) already does pretty well on both speed and size, on account of being what is commonly referred to as a “static website”. This means I have no backend or database, and all the content is compiled from Markdown into HTML.

Let's compare some internet averages and see how they stack up against my blog.
For the sake of fairness I will be using an actual blog post instead of the front page, which has almost no content.

Averages are from the HTTP Archive; this blog was tested with Google Lighthouse.

Desktop

Category       | Average          | My blog
Page size      | 2.1 MB (2124 KB) | 0.05 MB (50.6 KB)
Total requests | 73               | 21
FCP            | 2.1 s            | 0.7 s

Mobile

Category       | Average          | My blog
Page size      | 1.9 MB (1914 KB) | 0.05 MB (52 KB)
Total requests | 69               | 21
FCP            | 5.2 s            | 2.3 s

I feel obligated to mention that pitting a static blog against an average website is not exactly a fair comparison. However, people still largely use complex CMSs like WordPress to run their simple little blogging sites, so in my mind I'm competing against those people, not Facebook or Twitter.

Identifying the problems


Let's start by running my blog through Google PageSpeed and going through the most outstanding problems.

Eliminate render-blocking resources

Pagespeed list of render blocking resources

While the Google Fonts request is well optimized for size (it only fetches certain weights and a limited character set), the time it takes to complete is abysmal. Definitely something that needs to be fixed.

The CSS files are a bit trickier, but I think I can at least inline the style.css rules into the main HTML document.

Reduce unused JavaScript

Pagespeed list of unused javascript

jQuery is completely useless here. My blog only has about 40 lines of JavaScript, so rewriting those in vanilla JS won't be a problem – I probably should have done it that way to begin with.

Additionally, the script itself could also be inlined.

Enable text compression

Pagespeed list of uncompressed resources

I honestly thought I already had nginx compressing the files, but apparently not. This is simple to fix.

Serve images in next-gen formats

Pagespeed list of images not in webp format

Potential savings from this are pretty slim, but let's leave it as a bonus step. Maybe there exists some CLI tool to mass convert images to webp?

Optimizing the site


Now that the most outstanding problems have been identified, the optimization can start.

Eliminate render-blocking resources

One simple way to prevent Google Fonts from render blocking would be to use asynchronous loading with rel="preload". This works fine, but is not supported in IE or Firefox. While IE can happily be ignored, the incompatibility with Firefox makes this strategy a no-go.
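For reference, the preload pattern looks roughly like this (a sketch; `[long url]` stands in for the Google Fonts stylesheet URL):

```html
<link rel="preload" href="[long url]" as="style" onload="this.rel='stylesheet'">
```

The browser fetches the stylesheet with low priority without blocking rendering, and the onload handler flips the tag into a regular stylesheet once it arrives.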

There is, however, a really cool hack to make this work that I found here, and it involves the media attribute available to link tags:

<link rel="stylesheet" type="text/css" href="[long url]" media="print" onload="this.media='all'">

You can read the full breakdown from the link provided, but let's quickly go through what exactly is going on here.
We are setting the link tag's media attribute to print, which means the stylesheet should only be applied to print-based media, i.e. when printing the page out.

Since we are using the browser and not a printer to view the page, this download now becomes asynchronous. After the request has completed, the onload attribute is used to set the media type to all, at which point it gets applied to the whole page.
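One caveat worth noting: if JavaScript is disabled, the onload handler never fires and the stylesheet stays print-only. A common companion to this hack is a noscript fallback (a sketch, again with the URL elided):

```html
<link rel="stylesheet" href="[long url]" media="print" onload="this.media='all'">
<noscript><link rel="stylesheet" href="[long url]"></noscript>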

While this strategy worked OK, I decided to go even further and switch to a web-safe font in order to eliminate the request altogether. Bye bye Source Sans Pro, welcome Arial.

As for the other CSS files, there is quite a bit more work involved. Back when I built this site, I cobbled the CSS rules together in about an hour, so they are extremely spaghetti-like and annoying to work with. Refactoring them is absolutely worth it just for the maintainability, but the main point of the optimization is to deliver critical CSS inline and load the less important styles later.

I ended up inlining all the CSS by combining it into a Hugo partial. Modifying one giant CSS file is a bit cumbersome, but it will do while I figure out something better.

# inline-css.html
<style>
  [css goes here]
</style>

# head.html
{{ partial "inline-css.html" . }}

Reduce unused JavaScript

This one was easy. I simply threw out the old jQuery-based script (as well as jQuery itself) and rewrote the whole thing in vanilla JS. I also inlined the script rather than fetching it separately.

// example of an old jQuery function
$('#menu-button').click(function(){
  toggleMenuIcon();
  $('#menu').toggleClass('showMenu');
});

// new vanilla JS version
const menuButtonClicked = function() {
  toggleMenuIcon();
  const menu = document.getElementById('menu');
  menu.classList.toggle('showMenu');
};

// wire the handler up, since there is no jQuery .click() anymore
document.getElementById('menu-button').addEventListener('click', menuButtonClicked);

Enable text compression

Another easy one. I added these gzip directives to my main server block in nginx:

gzip on;
gzip_static on;
# nginx's default mime.types serves .js as application/javascript, so include both
gzip_types text/plain text/css text/javascript application/javascript;
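To get a feel for what gzip buys on CSS-like text, here is a rough local check (sample.css is a made-up file, not anything from the site):

```shell
# generate some repetitive CSS-like text, gzip it, and compare sizes
for i in $(seq 1 200); do echo 'body { margin: 0; padding: 0; }'; done > sample.css
gzip -9 -c sample.css > sample.css.gz
echo "raw: $(wc -c < sample.css) bytes, gzipped: $(wc -c < sample.css.gz) bytes"
```

On the live site, the effect can be verified with curl -sI -H 'Accept-Encoding: gzip' https://petrusmanner.com/ and checking that the response includes a Content-Encoding: gzip header.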

Serve images in next-gen formats

There is an apt package for a webp CLI tool (cwebp), so I used that in conjunction with a script I found here to mass-convert all of my PNG and JPEG files to webp:

#!/bin/bash

# converting JPEG images
find "$1" -type f \( -iname "*.jpg" -o -iname "*.jpeg" \) \
-exec bash -c '
# strip the extension with parameter expansion (avoids nesting quotes around sed)
webp_path="${0%.*}.webp";
if [ ! -f "$webp_path" ]; then
cwebp -quiet -q 90 "$0" -o "$webp_path";
fi;' {} \;

# converting PNG images
find "$1" -type f -iname "*.png" \
-exec bash -c '
webp_path="${0%.*}.webp";
if [ ! -f "$webp_path" ]; then
cwebp -quiet -lossless "$0" -o "$webp_path";
fi;' {} \;
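Converting the files is only half the job – browsers without webp support still need the original, so the markup has to offer both. A sketch using the picture element (the file names here are hypothetical):

```html
<picture>
  <source srcset="/images/screenshot.webp" type="image/webp">
  <img src="/images/screenshot.png" alt="Screenshot">
</picture>
```

The browser picks the first source it supports and falls back to the plain img tag otherwise.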

Speed gains and conclusions


The optimizations are done, so it's time to look at how the site performs compared to its older version:

Desktop

Category       | Old               | New               | Improvement
Page size      | 0.05 MB (50.6 KB) | 0.02 MB (28.1 KB) | 44% (22.5 KB)
Total requests | 21                | 12                | 42% (9)
FCP            | 0.7 s             | 0.3 s             | 57% (0.4 s)

Mobile

Category       | Old               | New               | Improvement
Page size      | 0.05 MB (50.6 KB) | 0.02 MB (28.3 KB) | 44% (22.3 KB)
Total requests | 21                | 12                | 42% (9)
FCP            | 2.3 s             | 1.0 s             | 56% (1.3 s)

As we can see from the data, the improvement both in speed and in total page size is quite noticeable.
The goal has been achieved: Google PageSpeed now gives my blog 100 points for both mobile and desktop:

Pagespeed screenshot showing 100 points

Next post will be about SEO optimization – coming soon.