Git static pages

2025-Aug-20

I’ve reached my wit’s end with web crawlers eating 100% of my CPU to download absolutely everything from Forgejo. These things don’t honor robots.txt.

I’m rolling out a static site generator for git. My hope is that by giving web crawlers a glimpse of the code, and everyone the ability to git clone the entire repo, I can get a nice, snappy site again that serves everybody:

Humans
will be able to quickly peruse the code, and to download it if they want more.
Web Crawlers
will be able to fetch a couple versions of all the files, for whatever use they have in mind.
Code Contributors
will be able to push commits over SSH.
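
The core of the generator can be quite small. Below is a minimal sketch of the idea in Python, assuming it runs from inside the repository: render one HTML page per tracked file, then run `git update-server-info` so the repo can also be cloned over git’s dumb HTTP transport from the same static host. The paths, page layout, and transport choice here are my assumptions for illustration, not a description of the actual tool.

```python
#!/usr/bin/env python3
"""Minimal sketch of a static page generator for a git repository.

The repo path, output directory, and page layout are assumptions for
illustration, not the actual generator described in this post.
"""
import html
import pathlib
import subprocess

REPO = pathlib.Path(".")      # assumed: run from inside the repository
OUT = pathlib.Path("public")  # assumed output directory


def git(*args: str) -> str:
    """Run a git command in REPO and return its stdout as text."""
    return subprocess.run(
        ["git", *args], cwd=REPO, check=True,
        capture_output=True, text=True, errors="replace",
    ).stdout


def main() -> None:
    OUT.mkdir(parents=True, exist_ok=True)

    # One static HTML page per file tracked at HEAD.
    for path in git("ls-files").splitlines():
        blob = git("show", f"HEAD:{path}")
        page = OUT / (path + ".html")
        page.parent.mkdir(parents=True, exist_ok=True)
        page.write_text(
            f"<!doctype html>\n<title>{html.escape(path)}</title>\n"
            f"<pre>{html.escape(blob)}</pre>\n"
        )

    # Refresh the metadata git's "dumb" HTTP transport needs
    # (info/refs, objects/info/packs) so the repository can be cloned
    # straight off the same static web server.
    git("update-server-info")


if __name__ == "__main__":
    main()
```

With the .git directory (or a bare mirror) exposed alongside the generated pages, something like `git clone https://example.com/repo.git` needs nothing dynamic on the server, and contributors keep pushing over SSH as before.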