Special thanks to Matthias Ott, whose post about enjoying meta-updates about personal sites was the encouragement I needed to go ahead and blog about some work I'd done.

Introduction

Last night, I was on a kick. I wasn't happy with this site's web performance. Lighthouse scans were scoring my homepage 48 out of a possible 100 points on performance. The Website Carbon Calculator was reporting that each load of my homepage emitted 7.8 grams of carbon dioxide, making it dirtier than 97% of the pages they had tested before. These scores seemed really, really surprising to me, especially given that this site doesn't use any client-side frameworks, and is instead wholly pregenerated static assets built with Eleventy — it shouldn't be that heavy or slow. There had to be some things I could do to improve the site's performance.

Optimizing Fonts

The first thing I chose to tackle was web font performance, since I hadn't had any experience with it yet. Specifically, I figured I could improve the performance around my brand font Nexa, which I'm self-hosting.

To start, I looked for each variant of Nexa that was present on my site, as well as how much they weighed. Here's what I found:

Fonts Downloaded Before Optimization
Font              | Downloaded Size
Nexa 300 Regular  | 58.9 kB
Nexa 300 Italic   | 61.6 kB
Nexa 400 Regular  | 59.4 kB
Nexa 700 Regular  | 62.0 kB
Nexa 800 Regular  | 61.8 kB
Nexa 800 Italic   | 63.8 kB
Nexa 900 Regular  | 59.9 kB
Total: 7 versions | 427.4 kB

Each variant cost about 60 kilobytes. The browser was only downloading the variants actually used on a given page, but in practice that usually meant only one or maybe two variants were skipped on any given page load.

I found that one variant, Nexa 400 Regular, was used for only one thing: the date-posted stamps on each article. I opted to replace it with Nexa 300 Regular, which was a quick 60 kilobytes saved.

To really cut down on font bundle sizes, I turned to subsetting, or removing unused glyphs from the font files themselves to serve only the glyphs you need. This is especially helpful to cut out alphabets of languages you aren't writing in, as well as a bunch of other Unicode characters for things like arrows or mathematical symbols. Markos Konstantopoulos's article on creating font subsets was hugely helpful here, and it guided me through using Zach Leatherman's glyphhanger project to identify the character ranges I needed to include in my subsets, and through using pyftsubset to create subsets of my font files.
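For reference, the process boils down to two commands along these lines. This is a rough sketch rather than my exact invocations: the font file name, local URL, and Unicode range are just examples, and the WOFF2 output is an assumption.

# glyphhanger crawls a page and prints a unicode-range of the characters it actually uses
npx glyphhanger http://localhost:8080

# pyftsubset keeps only the glyphs in the range you give it and can compress the result as WOFF2
pyftsubset Nexa-300.ttf \
	--unicodes="U+0020-007E" \
	--flavor=woff2 \
	--output-file=Nexa-300-subset.woff2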

I chose a bare-minimum subset (pretty much just the ASCII range) for the light Nexa 300 Regular and Italic variants, since those are used in very particular pieces of the page. I chose more lenient subsets for the 700-and-above weights, since those are used in headings, link text, and more; I could eke out even more savings with more aggressive ranges there. As it stands, here are the current sizes with the subsets I've gone with:

Fonts Downloaded After Optimization
Font              | Downloaded Size
Nexa 300 Regular  | 11.7 kB
Nexa 300 Italic   | 12.7 kB
Nexa 700 Regular  | 40.5 kB
Nexa 800 Regular  | 40.6 kB
Nexa 800 Italic   | 41.9 kB
Nexa 900 Regular  | 39.4 kB
Total: 6 versions | 186.8 kB

All in all, this means the total size of Nexa downloads is now about 44% the size of the original, which I'm pretty happy with.

However, playing with font bundles alone didn't seem to move the needle all that much for Lighthouse. It was a good learning experience, but more drastic improvements needed to be made.

Refactoring the YouTube Embed

When run against my homepage, the Lighthouse scans were pointing towards one major culprit: the YouTube playlist embed for my Some Antics streams. Lighthouse warned me that this embed alone seemed to be bringing my Time to Interactive up quite a bit, as well as messing with my contentful paint metrics.

I decided to check out Paul Irish's lite-youtube-embed project. This project introduces a <lite-youtube> web component which renders an embed that looks an awful lot like the YouTube player, but "approximately 224× faster" by cutting out a lot of unnecessary and invasive scripts.
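Wiring it up is mostly a matter of loading the component's script and stylesheet, then using the custom element. Something like this, assuming you've copied lite-yt-embed.js and lite-yt-embed.css out of the package into your own assets folder (the paths and the video ID here are placeholders):

<link rel="stylesheet" href="/assets/lite-yt-embed.css">
<script src="/assets/lite-yt-embed.js" defer></script>

<lite-youtube videoid="YOUR_VIDEO_ID" playlabel="Play: Your video title"></lite-youtube>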

There was one hiccup: I had been using YouTube's playlist embed so that I was always highlighting the latest Some Antics stream. However, lite-youtube-embed doesn't currently support playlists. I had to do some finagling with Eleventy global data (code below if you're interested!) to grab the latest video in the playlist, and set up a GitHub Action to rebuild my site daily (also sketched below) so the embed stays reasonably current. This meant I was making a bit of a tradeoff between performance and immediately showing the latest stream. Since I only update the latest stream at most once per week, though, this was worth it to me.

Eleventy code for displaying the latest stream in a playlist

_data/latestSomeAntics.js:

const ytfps = require('ytfps');

module.exports = async function() {
	// Supply your playlist ID to ytfps to fetch the playlist's videos
	const {videos} = await ytfps('PLZluKlEc91YzYor_ItAax4d2iXTXbFAFF');
	// The first video in the playlist is the most recent stream
	const [latest] = videos;
	return latest;
};

In your templates:

<lite-youtube videoid="{{ latestSomeAntics.id }}" playlabel="Play {{ latestSomeAntics.title }}"></lite-youtube>
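As for the daily rebuild, a scheduled GitHub Actions workflow does the trick. Here's a minimal sketch, assuming your host exposes a build hook you can POST to and that you've stored its URL in a BUILD_HOOK_URL repository secret (the secret name, schedule, and file name are all placeholders):

.github/workflows/daily-rebuild.yml:

name: Daily rebuild

on:
  # Run once a day so the latest stream stays reasonably current
  schedule:
    - cron: '0 6 * * *'
  # Allow manual runs too
  workflow_dispatch:

jobs:
  rebuild:
    runs-on: ubuntu-latest
    steps:
      - name: Trigger a fresh build
        run: curl -X POST "${{ secrets.BUILD_HOOK_URL }}"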

This move away from YouTube's proprietary embeds ended up being a huge lift for my performance, bringing my homepage's Lighthouse performance score from a poor 48 up to a respectable 81 out of 100. I'd heard just how devastating third-party scripts and trackers can be for a site's performance, and this change really hammered that home for me.

But there was more to do before I really felt comfortable with the score…

Optimizing Images with Cloudinary Transforms

As it stands, my homepage has a lot of images, and none of them were particularly tiny. In one particularly embarrassing case, I was serving a raw image hotlinked from Pexels that on its own cost an astounding 10.7 megabytes (oops… 😅). With time, maybe I'll find a way to highlight my blogposts that doesn't involve me needing to list out everything I've ever written on one page, cover images and all. In the meantime, however, I wanted to tackle getting those cover images down to an acceptable size.

I have been using the Cloudinary image CDN to host some of my images. As an image CDN, it can serve images in whichever format is optimal for the user's device and browser, delivering more performant image types to browsers that support them. Additionally, its URL transformations API lets you adjust how an image is processed and rendered just by changing the image's URL itself.

I opted for two URL-based transformations for the card cover images:

  1. Specify f_auto to allow Cloudinary to choose the optimal image format for the user's browser.
  2. Specify q_20 to set the image's quality level at 20 out of a possible 100. This does degrade the image quality, but from what I could tell, when applied to my own cover images as they appear in the cards, the effect wasn't noticeable unless you were looking for it.
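As a concrete example, this takes a cover image URL from something like this (the cloud name and file name here are placeholders):

https://res.cloudinary.com/my-cloud/image/upload/v1234567890/cover.jpg

to this:

https://res.cloudinary.com/my-cloud/image/upload/f_auto/q_20/v1234567890/cover.jpg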

I wrote an Eleventy filter to insert those URL parameters for any cover image sources that pointed to Cloudinary (code below!), and migrated any cover images that weren't already in Cloudinary into my Cloudinary media library.

Eleventy filter for adding Cloudinary transformations

.eleventy.js:

module.exports = function (eleventyConfig) {
	eleventyConfig.addFilter('applyCloudinaryTransformations', (url) => {
		// Only rewrite URLs that are actually served from Cloudinary
		if (url && url.includes('res.cloudinary.com')) {
			// Ask Cloudinary for an automatic format and quality level 20
			return url.replace(
				'/image/upload/',
				'/image/upload/f_auto/q_20/'
			);
		} else {
			return url;
		}
	});
};
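Then, in a template, the filter gets piped onto whatever holds the cover image's URL. A sketch (post.data.cover is a placeholder for however your front matter exposes the cover image, and the alt text is elided here):

<img src="{{ post.data.cover | applyCloudinaryTransformations }}" alt="">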

Just like refactoring the YouTube playlist embed, this change proved fruitful. My Lighthouse score on the homepage went from 81 to 97 (and even 100 in some tests!).

Looking at my Network tab now, I'm being served WebP images in Chrome. Here's how a handful of images compare before and after. With the delightful exception of the Pexels outlier, most images came in at about 16% of their original size, with acceptable image degradation.

Comparison of Image Sizes
Cover Image                       | Original Size | New Size | Percent of Original
PodRocket                         | 226 kB        | 39.1 kB  | 17.3%
Algolia Twitch bot post           | 296 kB        | 15.7 kB  | 5.3%
ARIA labels and descriptions post | 108 kB        | 17.5 kB  | 16.2%
Takeaways from #ComicsA11y post   | 69.9 kB       | 11.1 kB  | 15.9%
Aforementioned Pexels image       | 10.7 MB 😅    | 163 kB   | 1.5% (!!!)

Where Are We Now?

After applying these optimizations, the site is in a much healthier place. Depending on the environment and the particular page, I'm scoring 97 to 100 on my Lighthouse performance scores, and Lighthouse is starting to frame its recommendations as helpful suggestions rather than critical issues.

My carbon footprint is doing much better, too! Previously, loading the homepage produced about 7.8 grams of carbon dioxide, faring worse than 97% of sites measured by the Website Carbon Calculator. Now, homepage loads only produce about 0.18 grams, and the Website Carbon Calculator says the site is doing better than 83% of the sites it's tested. Check out the latest Website Carbon Calculator results for yourself!