This is how I made the Museum of Websites (MOW), and the lessons I learned along the way. #BuildInPublic
Tech Stack
When I started this project in early 2025, I had human devs and AI alike recommend React and Next.js, because that's where all the jobs are these days. However, I chose to avoid Next.js because it seems designed to favor Vercel's hosting ecosystem and to eventually lock you into it. I also dislike that React components lack support for style tags and that there's no built-in CSS scoping like in Vue, Svelte, or Astro.
Astro
I chose Astro because…
✅ The content for this website is mostly static, and Astro allows you to start static-only and add dynamic functionality if/when needed. Jason from CodeTV has the best explanation of Astro that I've seen.
✅ I love the flexibility. You can make some pages pre-rendered while others are server-side rendered. And even pre-rendered pages can have live, dynamic content thanks to dynamic islands / partial hydration (there's a small sketch of an island at the end of this list).
✅ The dynamic content that I did have planned (user views, likes, and comments) could theoretically be added as dynamic islands / partial hydration, and I was curious to see how far I could push this.
✅ I love the .astro file structure. It's basically the same as a .html file with embedded `<style>` and `<script>` sections, and this is how I originally learned webdev as a 90s kid. The only difference is that a .astro file includes a frontmatter section where you can write JavaScript that executes only at build time (for pre-rendered pages) or on the server (for server-side rendered pages).
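For anyone who hasn't seen one, a minimal .astro file looks something like this (a generic sketch, not actual MOW code):

```astro
---
// Frontmatter: JavaScript that runs only at build time (pre-rendered)
// or on the server (SSR). It never ships to the browser.
const title = "Museum of Websites";
---
<h1>{title}</h1>

<style>
  /* Scoped to this component by default */
  h1 { font-family: Georgia, serif; }
</style>

<script>
  // Runs in the browser, like a classic inline script
  console.log("page loaded");
</script>
```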
✅ I love that it ships with no JavaScript by default. This allows you to make the site very lean.
✅ I love that you can choose to use client-side components from React, Vue, or Svelte, or go entirely vanilla JS, or do a combination of all four (though I would not recommend this). This allowed me to dabble with these other frameworks while working in Astro, and to realize that I hate the structure of .jsx files, that Svelte has the most similar structure to Astro, and that I can just implement the component using vanilla JS and an endpoint running server-side on Astro.
✅ I heard a ton of good things about Astro from comments and blogs while catching up on the JavaScript framework scene in 2024 and 2025. There's a level of love, passion, and excitement in the Astro community that I haven't seen matched elsewhere.
✅ I love that Astro allows you to use markdown files as a flat-file content management system (CMS), so you can avoid having to set up and run a database for the core content. Another benefit is that you can use GitHub for free version control and backups.
There are downsides to a flat-file approach though, the biggest being ‘schema duplication’. Each website featured on MOW has a folder in the /src/content/websites/ directory, and within each website’s folder there is an index.md file along with image and video files for screenshots and screen recordings. The frontmatter in each index.md file contains all the fields used in filters and page templates. This makes it quick and easy to edit the values; however, if you need to add, edit, or remove a field name, the change has to be repeated for every index.md file in the collection, either manually file by file or with a custom migration script. In contrast, with a database you can rename the column once and it applies to all entries.
When I started building MOW, I had to edit the schema and index.md files frequently while ironing out the design. So to minimize the update costs, I started with only a few website entries and only added more once I was confident I had finalized a schema.
In hindsight, it might have been easier to use a database (or a Database As A Service like Supabase or Baserow) for most of the website fields instead of using the frontmatter in markdown files.
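One thing that softens the schema-duplication pain is Astro's content collections: they won't migrate old files for you, but you declare the frontmatter schema once in code and the build fails loudly whenever a file drifts from it. A rough sketch, with hypothetical field names rather than MOW's actual schema:

```ts
// src/content/config.ts
import { defineCollection, z } from "astro:content";

const websites = defineCollection({
  type: "content", // markdown entries like /src/content/websites/<site>/index.md
  schema: z.object({
    name: z.string(),
    url: z.string().url(),
    yearLaunched: z.number(),
    categories: z.array(z.string()).default([]),
  }),
});

export const collections = { websites };
```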
✅ Lastly, I love that Astro allows you to use a regular database if you want to, and that’s what I did for the dynamic content.
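To make the island idea from earlier concrete, here's roughly what one looks like. This is a hypothetical example; it assumes a LikeButton.svelte component and the Svelte integration:

```astro
---
// A single interactive component inside an otherwise pre-rendered page
import LikeButton from "../components/LikeButton.svelte";
---
<article>
  <h2>Pre-rendered content, zero JavaScript</h2>
  <!-- client:visible hydrates this island only when it scrolls into view -->
  <LikeButton client:visible />
</article>
```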
Supabase
For dynamic features like user authentication, user likes, user comments, and page view counters, a database is needed.
For this project, I opted for a Database as a Service (DBaaS) so that I could focus on launching quickly and on learning Astro.
The two most popular options are Supabase (PostgreSQL) and Google’s Firebase (NoSQL). Appwrite, Neon, Sentry, Turso, and Xata are also mentioned by Astro. And others have brought up PocketBase (SQLite) as an alternative solution that you can self-host.
I chose Supabase (PostgreSQL) because…
- I already had SQL experience.
- Supabase has a decent free tier.
- Supabase includes auth services.
- Supabase is open-source.
- Supabase has a good reputation from users.
- Supabase has a large amount of tutorial content and documentation, including by Astro itself.
- Supabase has automatic security warnings.
I avoided Firebase (NoSQL) because…
- I have fatigue with Google’s declining quality, horrible support, and seeming disdain for its own users. For example, I’ve had a Google Business listing suddenly get taken down with no explanation, and support was only able to tell me to read the Terms of Service. I read the Terms of Service line by line, couldn’t find any violation, and support continued to refuse to clarify what the issue was. Experiences like this have killed my trust in anything Google. And that’s before even considering the risk of Google killing off its own products.
- Firebase allows for easy misconfiguration of security rules with zero warnings.
- Firebase uses a Domain Specific Language (DSL) for security rules, database rules, and storage rules. Unlike SQL, this DSL is not useful or transferable anywhere outside of Firebase.
- Firebase is not open-source.
I avoided PocketBase (SQLite) because…
- PocketBase does not support Row Level Security (RLS).
- PocketBase is not even on v1 yet.
- The PocketBase team has stopped being active on Twitter / X.com.
Things I learned using Supabase:
- Make sure to pay attention to the security warnings on the dashboard. There was a viral post on X.com from a vibe coder whose Supabase got hacked. It turned out he forgot to limit his views and functions to security invoker (vs security definer), which is something I’ve seen Supabase warn about directly in the dashboard and via email.
- For any table tracking private user activity, make sure to enable Row Level Security (RLS) and then set policies for it. If you have RLS enabled on a table but haven’t set any RLS policies yet, you won’t be able to read or write anything to the table.
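Here's the minimal setup, sketched against a hypothetical likes table keyed by user_id:

```sql
-- Hypothetical table name; adjust to your schema.
alter table public.likes enable row level security;

-- Without policies like these, RLS silently blocks every read and write.
create policy "Users can read their own likes"
  on public.likes for select
  using (auth.uid() = user_id);

create policy "Users can insert their own likes"
  on public.likes for insert
  with check (auth.uid() = user_id);
```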
- Know when to ignore Supabase warnings. For example, if you want to make aggregate data available to everyone from a table that tracks user activity and has RLS policies, you will have to create a view that runs as security definer. In Museum of Websites, this came up when I wanted each user’s likes to be private but the total like count to be public.
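In SQL terms, that looks something like the sketch below (hypothetical columns, same aggregate_likes name as the GRANT snippet further down). Running as security definer, which is Postgres's default for views, lets it bypass RLS on the underlying table; that's exactly what the Supabase linter flags, and here it's intentional because the view exposes only counts, never who liked what:

```sql
create view public.aggregate_likes as
  select website_id, count(*) as like_count
  from public.likes
  group by website_id;
```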
- By default, for views, Supabase grants all privileges to every user for some odd reason. So make sure to revoke everything from the anon and authenticated roles, and then grant them just SELECT privileges:
```sql
REVOKE ALL ON public.aggregate_likes FROM anon, authenticated;
GRANT SELECT ON public.aggregate_likes TO anon, authenticated;
```
- The alternative to running a view as security definer would be a “materialized view”. But when I tried to set up materialized views and have them immediately update with each change to the underlying table, I found them to be a pain to work with for my use case. They are mostly suitable for data that doesn’t change often and/or that is ok to be a bit stale.
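For comparison, the materialized-view route looks like this (same hypothetical names as above); the refresh step is where it gets awkward:

```sql
create materialized view public.aggregate_likes_mat as
  select website_id, count(*) as like_count
  from public.likes
  group by website_id;

-- A materialized view is a snapshot: it only changes when you explicitly
-- refresh it, via a schedule or triggers. Wiring that up was the painful part.
refresh materialized view public.aggregate_likes_mat;
```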
- The default “public” schema doesn’t mean that every table within the schema is public. “Public” is just the schema’s name; access is still controlled by RLS and the privileges you grant.
- When using Supabase with Astro, you can run the Supabase client on server-side rendered (SSR) pages, or within pre-rendered pages using client-side components (Svelte, Vue, or React). However, when using Supabase from client-side components, you have to expose the anon key to the public, which requires taking extra security precautions. To avoid that, I chose to run Supabase only in SSR API endpoints in Astro and then access those endpoints from pre-rendered pages using vanilla JavaScript.
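The pattern looks roughly like this (assumed env var names and the hypothetical aggregate_likes view from earlier; not MOW's actual endpoint):

```ts
// src/pages/api/likes.ts
import type { APIRoute } from "astro";
import { createClient } from "@supabase/supabase-js";

export const prerender = false; // force this route to run server-side

// Non-PUBLIC_ env vars stay on the server; nothing ships to the browser.
const supabase = createClient(
  import.meta.env.SUPABASE_URL,
  import.meta.env.SUPABASE_ANON_KEY
);

export const GET: APIRoute = async () => {
  const { data, error } = await supabase
    .from("aggregate_likes")
    .select("website_id, like_count");
  if (error) return new Response(error.message, { status: 500 });
  return new Response(JSON.stringify(data), {
    headers: { "Content-Type": "application/json" },
  });
};
```

A pre-rendered page can then call this with a plain `fetch("/api/likes")` in vanilla JavaScript, no framework component needed.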
- Supabase’s AI Assistant is not very useful (as of June 2025) and will sometimes even give you invalid code. I found ChatGPT 4o to be more helpful and accurate.
Cursor & ChatGPT
After trying all the various “no code” tools and AI site builders, I don’t think AI will replace webdevs. But webdevs who know how to properly use AI will replace webdevs who don’t because AI in the right hands provides a huge productivity boost.
The most popular AI-assisted code editor in 2025 continues to be Cursor. While I do sometimes enjoy the auto-complete that Cursor offers, I’ve had mixed results with it creating new features from scratch or fixing issues in existing features. Though I did find it to be better than Windsurf, which I briefly tried after I heard it was acquired by OpenAI.
Overall, I prefer discussing features and bugs with ChatGPT 4o. I find it performs better as both a mentor (for discussing the pros and cons of different possible approaches) and as a junior dev (for doing grunt work). I think it also does a great job of taking on the role of rubber duck debugging.
However, it’s important to note that AI tools can and do give bad code sometimes, or completely misunderstand context and requirements. I’ve found my previous webdev, compsci, and product manager backgrounds to be critical for successfully working with AI and being able to challenge it when necessary. The more you understand the fundamentals of your work, the more productive AI can make you.
Github
This one is obvious to anyone with webdev experience, but it seems to be forgotten by vibe coders and new devs, so it’s worth bringing up.
GitHub is a place to store your code and track the changes made to it (via Git). It’s pretty much unavoidable when working in teams, but it’s also super helpful when working solo. Whenever I add a new feature or fix a bug, I make sure to save it as a commit. Think of it like making checkpoints in a video game.
This is extra important if you’re letting AI make changes to your code, or if you’re refactoring working code to make it more efficient and accidentally break it.
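In practice, the checkpoint loop is just a couple of commands (the commit message here is only an example):

```bash
# Save a checkpoint once a feature or fix is working
git add -A
git commit -m "Add like button to website pages"

# If an AI edit or a refactor breaks things, roll back
git restore .            # discard uncommitted changes
git revert <commit-hash> # or undo a specific bad commit
```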
Media
Every website featured on the Museum of Websites needs a screenshot and a video. Here are the processes I follow to make sure they look consistent.
Screenshots
For screenshots, I was pleasantly surprised to learn that the built-in DevTools in Chrome browser include a tool for this. The tool even has a full-size option that automatically captures the entire length of the site without the scrollbar being included. This is something that even premium, paid tools like CleanShotX aren’t able to do. The “scrolling capture” in CleanShotX leaves behind scrollbar artefacts that need to be cleaned up in another app like Photoshop.
Instructions on Chrome browser…
- Enter full-screen mode on the browser window (Command + Control + F in MacOS) to get rid of the window’s rounded corners,
- Open Inspect mode in Chrome’s DevTools (Command + Option + C in MacOS),
- Set the viewport size to 1587px x 992px for an aspect ratio of 16:10,
- Open the Command Menu (Command + Shift + P in MacOS) and type in “screenshot”,
- Select “Capture full size screenshot” to capture the entire page, or “Capture screenshot” to capture only the visible part of the page.
Why 1587px x 992px? Because 992px is the viewport height in full-screen mode on my screen (Espresso 17 Pro), and I want the screenshot to have a 16:10 aspect ratio. So 992px x 1.6 ≈ 1587px.
And because my screen is 4K HiDPI (aka Retina), the screenshot comes out as 3174px x 1984px. So then in Photoshop I save a copy as 1600px x 1000px. This is my subjective choice that I feel best balances quality vs file size for web. Depending on how the website performs, I might decrease the size further to 1440px x 900px or 1280px x 800px.
Video
For videos, the process starts the same as for screenshots but then requires an additional tool (like CleanShotX).
On Chrome browser…
- Enter full-screen mode on the browser window (Command + Control + F in MacOS) to get rid of the window’s rounded corners,
- Open Inspect mode in Chrome’s DevTools (Command + Option + C in MacOS),
- Check that the viewport size is still 1587px x 992px (as explained earlier).
On CleanShotX…
- In Settings > Recording > General,
- turn OFF “Scale Retina videos to 1x”
- turn ON “‘Do Not Disturb’ while recording”
- turn OFF “Show cursor”
- turn OFF “Highlight clicks”
- turn ON “Remember last selection”
- In Settings > Recording > Video,
- set “Video FPS” to “60” (you can lower it to 30 if you want to save space, but I find 60 to be the best balance between quality and file size).
- In Settings > Shortcuts,
- for “Screen Recording” > “Record Screen / Stop Recording”, set Control + Q (this will allow you to quickly stop recording with your left hand while keeping your right hand on the trackpad or arrow keys for scrolling).
Then I press Control + Q to start recording, scroll down using the down arrow key (unless the site has some particularly cool functionality related to mouse tracking), and press Control + Q again to stop the recording.
CleanShotX saves the video output in .mp4 format which is ideal for web.
But when I tried to reduce the video from 3168px x 1984px to 1600px x 1000px, I ran into some issues.
- QuickTime wouldn’t allow me to set custom dimensions; the closest option was 1080p. Also, it only exports as .mov, which can have playback issues on the web on non-Mac devices.
- VLC Player would only give me files in .m4v format even when selecting MP4, and I ran into playback issues with this format on my website when used within modals.
So then I found ffmpeg, and it worked beautifully! Note that it requires you to be comfortable using the Terminal / command line. Once I had it installed, I’d run this command for the conversion:
```bash
# scale=1600:-2 sets the width to 1600px and picks the matching even height
# (H.264 needs even dimensions); -c:a copy passes the audio through untouched.
ffmpeg -i input.mp4 -vf "scale=1600:-2" -c:a copy output.mp4
```
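And if you end up with a whole folder of recordings to convert, a simple shell loop handles it (same settings, hypothetical file naming):

```bash
# Resize every .mp4 in the current folder
for f in *.mp4; do
  ffmpeg -i "$f" -vf "scale=1600:-2" -c:a copy "resized-$f"
done
```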