Improving Next.JS File Performance

24th June 2021

One thing static site generators can lure us into is not caring about build performance. If code works and the build passes quickly enough, it gets overlooked.

I was building a new Code Component, and having to wait a minute for every refresh made the feedback loop extremely slow. Couple that with having to restart the development server every few minutes, and it was unbearable.

I did discover after getting most of the work finished that Firefox was the cause of the slow page loads. However, the improvements in this post are still improvements.

Read all the Files.

I started with the assumption that I was interacting with the disk too much. This felt like the right place to start, and I knew one place that was an issue.

So the first problem area I want to look at is getPostBySlug.

```ts
export const getPostBySlug = async <K extends (keyof Post)[]>(
  slug: string[],
  fields: K
): Promise<Pick<Post, ArrayElement<K>>> => {
  const basePosts = await getPostFiles()

  const index = indexedBy('slugString', basePosts)

  const basePost = index[slug.join('-')]

  const post = await getPost(basePost)

  return pick(post, fields)
}
```

This function works and does get a post from its slug. However, it does it in about the worst way it could.

getPostFiles uses fs.readdir to get a list of all the folders in the posts directory. Which begs the question: why do I need to list every folder to get one post?

Once it has every post, it uses indexedBy from my utils library to create an object of posts indexed by their slug.

That means two loops through every post on the site to get one post returned.
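To make that cost concrete, here is a minimal sketch of what an indexedBy-style helper could look like (the real utils-library version may differ):

```typescript
// Sketch of an indexedBy-style helper (an assumption; the real utils-library
// version may differ): one full pass to build a lookup keyed by a property.
const indexedBy = <T extends Record<string, unknown>>(
  key: keyof T,
  items: T[]
): Record<string, T> =>
  items.reduce<Record<string, T>>((index, item) => {
    index[String(item[key])] = item
    return index
  }, {})

// Every post gets visited just to look up one of them.
const posts = [
  {slugString: 'first-post', title: 'First'},
  {slugString: 'second-post', title: 'Second'}
]
const index = indexedBy('slugString', posts)
```

Combined with the earlier pass in getPostFiles, that is the two loops per lookup.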

getPost then actually reads in the file's contents and returns the MDX and the frontmatter.

To solve this, I need to ask the question, why are posts not directly retrievable?

The answer is that in the folder names I put the day of the post as well as the year and month. The day is never used, and it would not be possible to use it in slugs or anything like that without invalidating all my current permalinks. So, after a quick rename of every folder to year-month-slug, getPostBySlug now looks a lot better.

```ts
export const getPostFromSlug = (year: string, month: string, slug: string) => {
  const filePath = path.join(
    [year, month, slug].join('-'),
    'index.mdx'
  )

  return getPost(filePath)
}
```

I can now calculate the post's file path entirely from the data in the URL. No more loops, just a direct return of a file object.

A File Proxy

Speaking of file objects, there is a lot of duplication between each type of content on this site. The main reason for this is to provide very accurate typings. I decided that if I was going to try and improve file performance, the last thing I wanted to do was get it working for one content type and then have to copy it to the others.

To that end I created a File object that handles all interaction with the disk.

A File takes 2 type parameters: Frontmatter, which is the data returned from parsing the frontmatter, and Properties, which is the primitive data I can get by parsing the file path.

As an example of Properties, these are the properties of this post.

```js
{
  year: '2021',
  month: '06',
  slug: '2021-06-improving-nextjs-performance',
  href: '/2021/06/improving-nextjs-performance'
}
```

It doesn't really provide enough data to be useful to React, but it is very useful for a File's internals.

A File has getters for content, data and bundle, which all return a promise for a string or, in the case of data, an object of Frontmatter & Properties.
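Sketched as a type, that surface might look something like this (a sketch based on the description above, not the exact implementation):

```typescript
// A sketch of the File surface described above. The getter bodies are
// elided; the real object lazily reads and caches behind these properties.
interface File<Frontmatter, Properties> {
  content: Promise<string>                 // the raw MDX source
  bundle: Promise<string>                  // the compiled, ready-to-render MDX
  data: Promise<Frontmatter & Properties>  // frontmatter merged with path data
}
```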

This means that the getStaticProps for a post is now very clean: get the post, bundle the post, return the post.

```ts
export const getStaticProps = async ({
  params
}: GetStaticPropsContext<{year: string; month: string; slug: string}>) => {
  if (params?.year && params?.month && params?.slug) {
    const post = getPostFromSlug(params.year, params.month, params.slug)

    const source = await post.bundle

    return {
      props: {
        post: replaceProperty(
          pick(await post.data, [
            'slug',
            'title',
            'lead',
            'href',
            'tags',
            'year',
            'month',
            'date'
          ]),
          'date',
          date => date.toISOString()
        ),
        source
      }
    }
  }
}
```

Again pick and replaceProperty come from my utils library.


I wanted to look at caching, which, given my data is already on disk, meant working with an in-memory cache.

When you do any research into Next.JS caching, it is all geared towards keeping your CMS data on disk to reduce the number of fetches. As I said, my content is already on the disk, so that isn't a performance gain for me.

An in-memory cache has 2 problems in Next.JS.

  1. Next.JS runs its build in multiple global scopes, so a simple cache won't persist between build runs.
  2. The development server is a single process, but the cache won't automatically be invalidated by file changes.

Ignoring both issues, I created a simple global-scope cache to hold File objects.

```ts
const fileCache: Record<string, File<any, any>> = {}

export const file = <Frontmatter extends {}, Properties extends {}>(
  filePath: string,
  properties: Properties
): File<Frontmatter, Properties> => {
  if (!fileCache[filePath]) {
    fileCache[filePath] = createFile(filePath, properties)
  }

  return fileCache[filePath] as File<Frontmatter, Properties>
}
```

With this, builds are slightly more efficient, but it's not really noticeable.

Development is where this shines: being a single process, the cache persists. File objects cache their contents and bundle within themselves, so once the development site has read a file it doesn't have to again.
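That internal caching can be sketched with memoised getters. This createFile is a simplified stand-in of my own making; the real version also parses frontmatter and bundles MDX.

```typescript
import {promises as fs} from 'fs'

// Simplified stand-in for createFile (an assumption, not the real code):
// the first access to .content reads the disk; every later access reuses
// the stored promise until it is cleared.
const createFile = (filePath: string) => {
  let contentPromise: Promise<string> | undefined

  return {
    get content() {
      contentPromise ??= fs.readFile(filePath, 'utf8')
      return contentPromise
    },
    // Called when the file changes on disk, forcing a fresh read next time.
    clearCache() {
      contentPromise = undefined
    }
  }
}
```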

A quick file watcher can then be used to clear a File's internal cache if the file on disk changes.
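A sketch of that watcher, assuming a clearCache method on cached File objects and a fileCache keyed by path (both assumptions carried over from the snippets above; the real wiring may differ):

```typescript
import {watch} from 'fs'
import * as path from 'path'

// Assumed shape: cached File objects that can drop their internal cache.
type Clearable = {clearCache: () => void}
const fileCache: Record<string, Clearable> = {}

// Invalidate a single cache entry; separated out so it is easy to test.
const invalidate = (filePath: string) => {
  fileCache[filePath]?.clearCache()
}

// Watch the posts directory and clear the matching File when it changes.
// (Recursive fs.watch is supported on macOS and Windows, and on Linux in
// recent Node versions.)
export const watchPosts = (postsDirectory: string) =>
  watch(postsDirectory, {recursive: true}, (_event, fileName) => {
    if (fileName) invalidate(path.join(postsDirectory, fileName))
  })
```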

Any Better?

The build performance is about the same. Vercel did build the new site quicker, but one result isn't enough to go on.

Development feels a lot better: pages compile faster, and although the cache isn't used every time, it is used enough to notice an improvement.

As always, the updated source code is on GitHub if you're interested.