Re: automated workflows for websites - Seirdy






This is so similar to my setup! I run Stylelint and v.Nu too. I send v.Nu output through a jq filter to remove false positives (after reporting them upstream); you might eventually do something similar, since there are a lot of these.

Your blog post reminds me that I need something better than regex substitutions for customizing footnote and section links; Hugo's parallel nature prevents it from doing post-processing of fully assembled pages.

Other tools I use:

- xmllint, to validate that the markup is well-formed XHTML5 syntax; it runs much more quickly than v.Nu and does light auto-formatting, but is also more limited.
- The W3C feed validator, written in Python, is worth checking out; I send my Atom feeds through it.
- axe-core, the IBM Equal Access checker, and Webhint, which I run on every page with headless browsers.

In the future, I'll need to figure out a good workflow for easily validating JSON according to a schema, and I want to add some Microformats + Microdata validation too (maybe using Schemara?).

The whole thing takes several minutes to run, so I don't run it on every commit. Just building my site (no linting or validation) requires only a tarball with some statically-linked binaries; that's more in line with the "built to last" philosophy. I'm curious if you have any thoughts about it.
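For readers curious what the v.Nu filtering step might look like: v.Nu's `--format json` mode wraps its findings in a `messages` array, and the filter just drops entries matching known false-positive patterns. The comment describes a jq filter; here is the same idea sketched in Python instead (not the author's actual filter, and the patterns below are illustrative placeholders, not a vetted list):

```python
import re

# Illustrative false-positive patterns only; a real list would track
# issues already reported upstream to the Nu Html Checker project.
FALSE_POSITIVE_PATTERNS = [
    re.compile(r"Trailing slash on void elements"),
    re.compile(r"Possible misuse of .aria-label."),
]

def filter_messages(report: dict) -> dict:
    """Drop v.Nu JSON messages whose text matches a known false positive."""
    kept = [
        m for m in report.get("messages", [])
        if not any(p.search(m.get("message", "")) for p in FALSE_POSITIVE_PATTERNS)
    ]
    return {"messages": kept}
```

A jq equivalent would use `del(.messages[] | select(.message | test(...)))`; either way, the build can then fail only when the filtered `messages` list is non-empty.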