i wanna add syntax highlighting, but the easiest way to do that would involve adding javascript to the site, which i wanna avoid as much as possible

#

i wanna finally finish Eightfold but god it's been almost two years(!!) and i gotta figure out what the hell i was doing when i last touched it

at least it's almost done lol

EDIT: oh my god, everything is commented as fuck, i'm such a big brain smartgirl for doing that, thank you past me

like, look at this, these are some ridiculously informative comments (this is from some sample code, though; the comments in the actual library are a bit less thorough):

/// Convert a `glTF` [Transform](gltf::scene::Transform) to a [nalgebra]
/// [Affine3].
///
/// An affine transformation is, in order, a non-uniform scaling, a rotation, and then a
/// translation.
///
/// A `glTF` transformation is stored either as an affine transformation matrix, or as separate
/// translation, rotation, and scale components. Therefore, the most general possible kind of
/// transformation is affine, which means that the *least* general kind of transformation we can
/// return is an [Affine3].
///
/// ## See Also
///
/// * [nalgebra's explanation of transformations](https://www.nalgebra.org/docs/user_guide/points_and_transformations/#transformations)
pub fn gltf_to_nalgebra(g: &gltf::scene::Transform) -> Affine3<f32> {
    match g {
        // the Matrix variant is stored as a column-major [[f32; 4]; 4], so we can just transmute
        // that into an [f32; 16] and use that directly.
        gltf::scene::Transform::Matrix { matrix: ref m } => {
            // the glTF spec states that matrix transformations *must* be decomposable to their
            // translation, rotation, and scale components. Therefore, a matrix from a compliant
            // glTF file can be converted directly to an Affine3.
            Affine3::<f32>::from_matrix_unchecked(Matrix4::from_column_slice(
                // arrays are stored contiguously, so, in memory, an [[f32; 4]; 4] is identical to
                // an [f32; 16], which means we can safely reinterpret one as the other.
                //
                // `std::mem::transmute` tells Rust's compiler that we want to interpret something of
                // type `A` as, instead, something of type `B`. It doesn't actually do anything at
                // runtime.
                unsafe { std::mem::transmute::<&[[f32; 4]; 4], &[f32; 16]>(m) }.as_slice(),
            ))
        }
        // this is a bit more complicated, because we have to convert these three components into
        // a single Affine3.
        gltf::scene::Transform::Decomposed {
            translation: ref trans, // [x, y, z]
            rotation: ref rot,      // unit quaternion, [x, y, z, w]
            scale: ref scl,         // [x, y, z]
        } => {
            // Store the resulting homogeneous Matrix4 as an Affine3.
            // We don't have to check for correctness, because we already know
            // that the matrix we're storing represents an affine transformation.
            Affine3::from_matrix_unchecked(
                // construct an Isometry (a rotation followed by a
                // translation) from `trans` and `rot`
                Isometry3::from_parts(
                    Translation3::from(*trans), // <- we can convert `trans` directly
                    Unit::new_unchecked(Quaternion::from(*rot)), // <- same with `rot`. The glTF spec
                                                                 // requires rotations to be stored as
                                                                 // unit quaternions, so we don't need
                                                                 // to validate that here.
                                                                 // Conveniently, nalgebra and glTF
                                                                 // use the same format for
                                                                 // quaternions.
                )
                // convert the Isometry3 to a homogeneous Matrix4, so we can
                // apply the scaling (remember, an isometry is a rotation
                // followed by a translation; it, by definition, cannot have a
                // scaling, and the Isometry3 struct reflects that.)
                .to_homogeneous()
                // apply the scaling, resulting in a matrix M = Translation * Rotation * Scale.
                //
                // Reminder: when transforming a point using a matrix, the transformations
                // are applied to the point in the reverse of the order they were applied to the
                // matrix. So, a point transformed by a `TRS` (`Translation * Rotation * Scale`)
                // matrix is first scaled, then rotated, then translated. This is important because
                // applying those transformations in another order would produce a different end
                // result.
                .prepend_nonuniform_scaling(&Vector3::from(*scl)),
            )
        }
    }
}
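
(side note: the transmute trick in the Matrix arm is easy to check standalone. here's a little sketch -- the `flatten` helper is just for illustration, not part of the library -- showing that the nested array and the flat array really do line up in memory:)

```rust
/// Reinterpret a nested `[[f32; 4]; 4]` as a flat `[f32; 16]`.
/// Arrays have no padding and are stored contiguously, so the two
/// types are layout-compatible.
fn flatten(m: &[[f32; 4]; 4]) -> &[f32; 16] {
    // SAFETY: `[[f32; 4]; 4]` and `[f32; 16]` have identical size,
    // alignment, and layout.
    unsafe { std::mem::transmute::<&[[f32; 4]; 4], &[f32; 16]>(m) }
}

fn main() {
    // a column-major matrix with a translation of (5, 6, 7)
    let m = [
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [5.0, 6.0, 7.0, 1.0],
    ];
    let flat = flatten(&m);
    // the fourth column starts at index 12 of the flat array
    assert_eq!(&flat[12..15], &[5.0, 6.0, 7.0]);
}
```

(on newer Rust versions you can reportedly do the same thing without `unsafe` via the `as_flattened` slice method, though i haven't double-checked when that was stabilized)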
#
88x31 button for Signal Garden

i made a button hurray

#

been listening to friend from the internet by laryssa okada; it's really good and it's thematically appropriate for messing with websites

i was gonna embed the bandcamp player for it but i don't wanna have that automatically load whatever tracker nonsense they have going on so you're just gonna have to click the link instead

#

i should probably move most of the 88x31s into their own page so i'm not reuploading every page every time i change the button set

#

every day my urge to write an app for ao3 grows stronger

mostly just because scrolling through the "Trans Female Character" tag all the time is finicky and i wanna be able to just have a nice feed of new stories to read without having to remember all the ones i've already seen and decided not to read

#

i should finish flyover chapter 2 or something so the most recent post isn't the "how to block scrapers" one because it's a little too nerd emoji

#

I went out to look for more buttons and I'm begging everyone who has 88x31s or other pixel art on their website: please use image-rendering: crisp-edges

#

Posted the first chapter of another story I've been writing (it's backdated to when I published it on Ao3, so it should appear down near the bottom of the main post list).

I should probably add a "Read More" button to long posts when they're displayed on the main post feed. I also need to figure out how to get Eleventy to collect chapters / post series into single pages. Hmmmmmmmm.

#

Buttons

Added some buttons to the bottom of each page, since it was free real estate and I want to link to other sites that I like.

I'll probably make a dedicated page for links to other sites at some point? I might do that immediately after posting this; who knows.

Maybe next I'll write a guide to CSS grids? Since, you know, that seems like something people might want.

Also, does this note block thing look good? The one floating to the side of this post. (Or, maybe not floating to the side, if you're reading this in a future where I changed how notes look.) Maybe I should also add a comments section, somehow.

I'm also considering including some Javascript to randomize the order of buttons when you load the page, just so there's no bias towards any specific site. Maybe I'm overthinking it? Hm.

#

How to Block Scrapers on Every Nginx Virtualhost in NixOS

Because my bandwidth usage is already too high.

A couple months ago I realized that a lot of my home bandwidth was being eaten by AI scrapers constantly refreshing the login screen of the Jellyfin instance I host for my friends on my home server. Regardless of one's opinions about the ethicality of LLMs, the scrapers gathering training data for them are bad for the ecosystem and they're making me pay extra money to Comcast, so: here's how to block them in Nginx (as long as you're using NixOS).

First, here is a robots.txt file containing a list of user agent strings for common scrapers. If you want to add that to a vhost, you can add this to the vhost config:

location =/robots.txt {
  alias /path/to/the/robots.txt/file;
}
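
For reference, with the three agents used in the examples below, the file would look something like this (consecutive User-agent lines form one group that shares the rules under it):

User-agent: GPTBot
User-agent: Amazonbot
User-agent: Bytespider
Disallow: /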

Web crawlers are supposed to respect rules set in robots.txt files, but they sometimes ignore them (either through malice or by mistake), so it's also useful to block them entirely.

All you have to do to block a specific user agent in Nginx is to add something like this to the server config, where "GPTBot", "Amazonbot" and "Bytespider" are the user agent strings you want to block:

if ($http_user_agent ~* "(GPTBot|Amazonbot|Bytespider)") {
  return 444;
}

("444" isn't a real HTTP status code; Nginx uses it internally to signal that it should drop the connection without a response.)

Nginx, as far as I know, doesn't let you set common configuration settings shared by all vhosts, so, if you've got more than one vhost, you'll have to do a lot of copy-and-pasting. Nix, however, makes that (relatively) simple.

The naive way to do this in NixOS would be something like this:

services.nginx.virtualHosts = let
  robots = ["GPTBot" "Amazonbot" "Bytespider"];
  rules = lib.concatStringsSep "|" robots;
  robotsTxt = let
    agentsStr = lib.concatStringsSep "\n" (map (agent: "User-agent: ${agent}") robots);
  in pkgs.writeText "robots.txt" ''
    ${agentsStr}
    Disallow: /
  '';
in {
  "vhost-A" = {
    # ... other config ...
    locations."=/robots.txt".alias = robotsTxt;
    extraConfig = ''
      if ($http_user_agent ~* "(${rules})") {
        return 444;
      }
    '';
  };
  "vhost-B" = {
    # ... other config ...
    locations."=/robots.txt".alias = robotsTxt;
    extraConfig = ''
      if ($http_user_agent ~* "(${rules})") {
        return 444;
      }
    '';
  };
  # ... and so on
};

But that gets tedious, and it's easy to forget to add the rules to a specific vhost. Instead, you can extend the submodule type behind services.nginx.virtualHosts to automatically apply the rules for you:

{ lib, pkgs, ... }:
let
  robots = ["GPTBot" "Amazonbot" "Bytespider"];
  rules = lib.concatStringsSep "|" robots;
  robotsTxt = let
    agentsStr = lib.concatStringsSep "\n" (map (agent: "User-agent: ${agent}") robots);
  in pkgs.writeText "robots.txt" ''
    ${agentsStr}
    Disallow: /
  '';
in {
  options = with lib; {
    services.nginx.virtualHosts = mkOption {
      type = types.attrsOf (types.submodule {
        config = {
          locations."=/robots.txt" = lib.mkDefault {
            alias = robotsTxt;
          };
          extraConfig = ''
            if ($http_user_agent ~* "(${rules})") {
              return 444;
            }
          '';
        };
      });
    };
  };
  config = {
    # normal nginx vhost config goes here
  };
}

Because that overrides the submodule used by virtualHosts.<name>, this configuration will automatically apply to every vhost, including ones defined by external modules.

Addendum, 2024-09-24

I wrote a NixOS module implementing this, including automatically getting the block list from ai-robots-txt.

Apparently, the NixOS manual does actually obliquely reference that you can type-merge submodules, in the documentation for types.deferredModule.

#

Neocities

A multi-channel website.

I've been updating my website, which has been... somewhat? mirrored? to Neocities for a while, so now I'm updating the Neocities version, and I'm adding ~Neocities exclusive content~ (i.e. this post, so far).

Not sure how much I want the Neocities version to be different from the main version. I'm planning on posting fiction that I've written here, and I'm a little too embarrassed about that to have it just right up there on the front page of the main site (since that's also gonna be what, like, employers and recruiters and stuff see), but I haven't decided yet whether I want that to be Neocities-exclusive or just displayed more prominently on Neocities than on the main site.

Probably the latter? Kinda depends on how hard it is to wrangle Eleventy (the site generator I'm using) into doing it that way.

The RSS feed actually is different for the Neocities version, since the Neocities version has exclusive posts.

Anyway, since this is Neocities (or maybe you're reading this directly from the Git repo, for some reason), here's a ~CSS Secret~: all the CSS on this site is based on the CSS I wrote for an instant messenger I'm working on called Troposphere, and there's still some leftover artifacts from that -- for example, most of the variables are called things like --channel-shadow or --msg-bg, and there are still some rules defined for classes that don't actually appear in the HTML, because I haven't cleaned those out yet.

Feel free to steal the CSS, by the way, as long as you link back here.

#