At a high level, this site is 99% static content with a small bit of content dynamically fetched from some APIs (literally just the activity section of the home page). I’m taking advantage of SSG with a revalidation endpoint to avoid any on-demand data fetching, while still letting me easily update the content as frequently as needed.
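In Next.js App Router terms, that setup can be sketched roughly like this (the URL, tag name, and function are illustrative, not the site's actual code):

```typescript
// Sketch of the one dynamic fetch on an otherwise fully static page.
// The route renders at build time; this fetch is cached under a tag and is
// only refreshed when a revalidation endpoint calls revalidateTag("activity"),
// so no visitor ever triggers an on-demand API call.
export async function getActivity(): Promise<unknown> {
  const res = await fetch("https://api.example.com/activity", {
    next: { tags: ["activity"] }, // cached until revalidateTag("activity")
  });
  return res.json();
}
```

The nice part of tag-based caching is that the rest of the page never rebuilds; only this one piece of data gets swapped out.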
The design of this site has gone through countless iterations, with earlier versions looking more like an iOS app. I never felt like it quite aligned with my taste and personality, as it always felt derivative of someone else’s work. This version is definitely not perfect, but I’m at least happy enough to put the pencil down, so to speak (I say that knowing full well I’ll probably do a complete redesign in a few months).
The ultimate goal was to make the site cater to the strengths of both desktop and mobile environments, while maintaining a consistent design system that doesn’t contain wildly divergent layouts/components. The only area where I chose to stray from this philosophy was with the navbar/drawer, as I really dislike navigation placed at the top of the screen on mobile. Aside from that, pretty much everything is just shifted around at a breakpoint.
One of my top priorities when researching content management solutions was being able to easily add content no matter where I am without having to commit anything and wait for the site to rebuild. While there are many traditional CMS solutions that address this, I wanted something that would cater to several different types of content, not just blog posts. This is not just a personal site with a blog; it is my personal knowledge base for all things related to development. So, logically, I landed on Notion, the king of knowledge bases.
“But Zach!”, you may be saying to yourself. “Why aren’t you hosting your data internally so it exists until the end of time?! What if Notion’s servers go down or they kill the business entirely? Won’t your site be totally bricked?” Well, for one, I am not the only one that will get screwed by that. Seemingly half of their business model is basically making a glorified database that external sources can consume. And secondly, the features I’m able to utilize in the context of this site make my workflow so seamless that it would take a very skilled team of engineers to make anything that comes anywhere close to it. So I’m fine taking that risk (YOLO, right fellow teens?).
All of the content pages on this site pull data from tables/pages hosted in Notion under a dedicated space. Even shorter-form content like the colophon page and the about section on the home page is hosted there so I can easily update it without touching the source code. I’m using the Klippper Chrome extension to make saving website links a breeze, as it automatically scrapes the page title, description, and favicon and stores them in a table of my choosing.
The last thing I want to touch on here is how I actually update the site when content in Notion has changed, as the only major drawback to this setup is that Notion doesn’t have any integrated webhooks to do so automatically. Instead, I came up with the next best thing: a button in the Notion space that hits an endpoint I set up to revalidate all the data in the background. This endpoint is secured with a secret key that must be passed for it to do anything and, just in case that wasn’t enough, I added Arcjet rate-limiting to deter any determined hackers.
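A minimal sketch of what such an endpoint could look like as an App Router route handler. The header scheme, env var name, and `revalidateAll` helper are all illustrative assumptions, and the Arcjet rate-limiting layer is omitted here:

```typescript
// Returns true only when the request carries the expected bearer secret.
// An empty secret never authorizes, so a misconfigured env var fails closed.
export function isAuthorized(request: Request, secret: string): boolean {
  return (
    secret.length > 0 &&
    request.headers.get("authorization") === `Bearer ${secret}`
  );
}

// Placeholder for the real work; in the app this would call revalidateTag()
// or revalidatePath() from "next/cache" for each content source.
async function revalidateAll(): Promise<void> {}

// app/api/revalidate/route.ts (illustrative path)
export async function POST(request: Request): Promise<Response> {
  if (!isAuthorized(request, process.env.REVALIDATION_SECRET ?? "")) {
    return new Response("Unauthorized", { status: 401 });
  }
  await revalidateAll();
  return new Response("Revalidation triggered", { status: 200 });
}
```

Since route handlers use the standard `Request`/`Response` Web APIs, the auth check stays a plain function that's easy to test in isolation.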
To actually fetch the data from Notion and make it look pretty, I’m making use of @notionhq/client + notion-to-md for Notion content processing and next-mdx-remote + rehype-pretty-code for basic markdown rendering. Leerob's blog post on managing markdown content is a great starting point for this, though he went a little too barebones for my liking.
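The fetching half of that pipeline looks roughly like this (the packages and their calls are real; the env var name and function are placeholder sketches, not the site's actual code):

```typescript
import { Client } from "@notionhq/client";
import { NotionToMarkdown } from "notion-to-md";

// Authenticated Notion API client; the integration token comes from the env.
const notion = new Client({ auth: process.env.NOTION_TOKEN });
const n2m = new NotionToMarkdown({ notionClient: notion });

// Fetch a page's blocks from Notion and flatten them into one markdown
// string, ready to hand off to next-mdx-remote for rendering.
export async function getPageMarkdown(pageId: string): Promise<string> {
  const blocks = await n2m.pageToMarkdown(pageId);
  return n2m.toMarkdownString(blocks).parent;
}
```

From there, the markdown string can be passed to next-mdx-remote's `MDXRemote` with rehype-pretty-code registered as a rehype plugin to get syntax-highlighted code blocks.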
If you're curious about any details I didn't cover here or want to delve deeper, the site is fully open source, so feel free to take a look around.
Hopefully there will be more posts to come, but I can’t promise anything :)