The Bots – Past & Present
I’ve created a couple of social media bots over the whims & over the years—you can find the full listing here. Since I recently had to refactor & migrate some of them, I thought I’d take a moment to go over my previous setup & contrast that with the new and (in my mind) sleeker approach. The bots affected are (at the time of writing):
- The Chambers Pot – rejected entries in the Monk & Robot series
- Ethan Mars – Edvard Munch’s Scream by way of Heavy Rain
- The Herbalist’s Primer – a field guide to fantastical flora inspired by the book of the same name
- The Lytton Tribune – a reworking of the map tiles from Police Quest into a community newsletter (more here)
- Not Podracing – Anakin calling his shot
Prior work
Ethan Mars & the Herbalist are both basically pure Tracery—essentially mad-libs—with a limited vocabulary drawn from a fixed dictionary; as such, they were ideal candidates for the Cheap Bots, Done Quick (CBDQ) line of products.
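Tracery grammars are, at their core, recursive template substitution: pick a random expansion for a symbol, then fill in any `#symbol#` references inside it. A minimal sketch of that idea in Python (the grammar below is invented for illustration, not taken from the actual bots):

```python
import random
import re

# A toy Tracery-style grammar: each key maps to a list of possible
# expansions, and #symbol# references are replaced recursively.
GRAMMAR = {
    "origin": ["The #plant# is prized for its #property# leaves."],
    "plant": ["moonwort", "ember fern", "glass thistle"],
    "property": ["luminous", "bitter", "whispering"],
}

def expand(symbol, grammar, rng=random):
    """Pick a random expansion for `symbol` and recursively fill in #refs#."""
    text = rng.choice(grammar[symbol])
    return re.sub(r"#(\w+)#", lambda m: expand(m.group(1), grammar, rng), text)

print(expand("origin", GRAMMAR))
```

With a big enough dictionary behind each symbol, this is all a mad-libs bot needs; CBDQ/CBTS handled the scheduling and posting around it.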
Of course, at the time I started working on them, around 2018, I was already pulling back from Twitter in revulsion, and hesitant to give that company more content to scrape—in the algorithmic advertising volume sense, not the genAI sense, though I suppose there are equivalencies—and so I limited my bots to Mastodon only, using Cheap Bots, Toot Suite, a member of the CBDQ EU/CU. Hosting & posting those two was straightforward.
The others—Chambers, Lytton, & Podracing—were a little more roundabout.
First, I’m a firm believer that you should turn your computer off every day—for environmental, emotional, and spiritual reasons—so I was necessarily looking to host them In The Cloud (where computers are on indefinitely, I know, hush). Somehow—maybe from example, or from recommendation, or maybe sheer contingent history—I ended up hosting them on Glitch: The friendly community where everyone builds the web, hacking together some loose JavaScript samples I’d found online.
Now, on a high level, I don’t know how to use Glitch: The friendly community where everyone builds the web. What I managed to cobble together was: I would use Express to expose a (secret) HTTP endpoint on Glitch: The friendly community where everyone builds the web. Every time that endpoint was booped, according to my stolen code, my bot(s) would post to Mastodon! Easy.
Of course, actually scheduling those boops and those posts was tricky. I wasn’t entirely sure how often my Glitch: The friendly community where everyone builds the web app would refresh, how it might store variables between runs, and so on. What I needed was a cheap script that could run regularly, indefinitely, in the cloud, and store variables between runs. I ended up choosing Google App Script. Google is objectively a net harm to the planet at this point, but I figured I was costing them money, so everybody wins.
The idea was:
- Google script runs every hour. It increments one variable.
- When that variable hits a certain threshold (i.e., hours-between-posts), it boops the endpoint on Glitch: The friendly community where everyone builds the web.
- Then the app behind that endpoint sends out the post!
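The counter logic above can be sketched in a few lines of Python (the original lived in Google Apps Script, so this is a translation of the logic with illustrative names, not the actual code):

```python
def hourly_tick(count, interval, boop):
    """One hourly run: bump the stored counter; when it hits the
    threshold, boop the bot's endpoint and reset the counter."""
    count += 1
    if count >= interval:
        boop()  # in the real setup: an HTTP GET to the secret Glitch URL
        return 0
    return count

# Simulating a day at a 7-hour interval (the counter would persist
# between runs in Apps Script; here it's just a local variable):
boops = []
count = 0
for _ in range(24):
    count = hourly_tick(count, 7, lambda: boops.append(1))
# The bot posts three times: at hours 7, 14, and 21.
```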
I just had to figure out a good interval for posting. My heuristic was: how annoying I thought the bot would be to an end user (or rather: how much visual & mental attention it would demand). But also, I wanted the intervals to be prime numbers, or at the very least not a clean divisor of 24, so that the postings would “drift” over time. I morally object to bots posting at the same time, every single day.
So, for instance, in the case of Podracing, after the Google script ran seven times (i.e., after seven hours), it would trip a threshold, and poke the Express endpoint, whose backing app responded by sending out a new post. Simple as that.
It wasn’t simple, and I’m still embarrassed at the convoluted workaround. I still don’t understand Node. I think NPM stands for National Public Madio. Anyway, the important thing was that it technically worked, even though I knew I was under the looming threat of eventual, recurring (sur)charges from either Google or Glitch: The friendly community where everyone builds the web.
But for a time: it was Okay.
As I haven’t mentioned, the bots were all hosted on the botsin.space instance. This was the hip new spot for bots in the fediverse, at the time of implementation; not a Wuher in sight. This is important, because…
RIP botsin.space
In October 2024, the maintainer of botsin.space announced that they were shutting down the server. After a brief grace period, and some time for mourning, I started considering what to do with my automated friends.
As I hinted at elsewhere, I offloaded the mental workload of choosing another instance to other, smarter people, and chose mastodon.social as my bots’ new fediverse home. Hopefully, this one won’t go anywhere for a long while, vagaries of open-source notwithstanding.
At the same time, I decided I might also port those bots to Bluesky (i.e., copy-paste some more code to mirror the already-generated posts to a different API). A lot of people don’t like Bluesky but are there anyway; I have no strong feelings about Bluesky but I’m there anyway. Mirroring the posts seemed like minimally-disruptive* fun.
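Mirroring reduces to "send the same generated text through two API clients," ideally without one network's outage blocking the other. A minimal sketch, with the clients injected as plain callables (in practice these might wrap something like Mastodon.py's `status_post` and the atproto SDK's `send_post`, but that's an assumption about libraries, not a description of the actual bot code):

```python
def mirror_post(text, posters):
    """Send the same generated text to every network; collect per-network
    errors instead of letting one failed API call block the others."""
    errors = {}
    for name, post in posters.items():
        try:
            post(text)
        except Exception as exc:  # report the failure, don't crash the run
            errors[name] = exc
    return errors

# Hypothetical wiring (client objects and method names are assumptions):
#   mirror_post(generated_text, {
#       "mastodon": lambda t: masto.status_post(t),
#       "bluesky":  lambda t: bsky.send_post(text=t),
#   })
```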
(* Not entirely true, as setting up multiple accounts with custom domains & maintaining dev keys &c. is actually kind of tedious!)
The refactor
While the sunsetting of botsin.space engendered (so gender) a bit of extra work, I was at least grateful for the opportunity to simplify my deployment(s).
I was never comfortable with all of my (non-CBDQ) bots sitting behind separate, heavyweight app frontends on Glitch: The friendly community where everyone builds the web; there are ways to connect on the backend via git, but it felt (to me) a little fragile; maintenance required (for me) loading the apps in the browser anyway; and I didn’t like the idea of the Express HTTP endpoints being technically public (if technically obfuscated via weird URLs).
I’ve been gradually doing more devops work at work, and enjoying it, and screwing it up, and learning things. Among the list of those things, are such things as: Python, and GitHub Actions. Eventually, an idea coalesced, that I could rewrite my bots, all in one place, in a (to me) more interesting language.
So I did that.
I’m already getting tired of writing & hearing myself talk (“talk”), but I managed to port all of my (non-CBDQ) bots over to Python, and host them on GitHub, where posting is automated via GitHub Actions.
You can find the source code here!
I won’t go into the technical details (you can read the code), but I do want to shout out this blog post for pointing out how to schedule posts via (no points for guessing) the GitHub Actions schedule trigger.
All that remained was to figure out how to (re)schedule these actions to run/post on the kind of interval that would “drift” over the days, as mentioned above. My solution was:
- Schedule the GitHub Actions to run hourly
- Modulo the current run number (monotonically increasing by 1 every run/hour) by my interval
- …and only run the “posting” actions if that modulo is 0 (zero).
There are small tweaks to these steps, but that’s the overview. You can see an example GitHub Action workflow here; note the “Evaluate timing” step, which calculates the modulo/interval; and the “if” line on every subsequent step, which only allows for posting on our given interval.
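The gate itself lives in the workflow YAML, but the check boils down to one modulo. A Python rendering of the same logic, using the `GITHUB_RUN_NUMBER` environment variable that Actions sets on every run (the function name here is mine):

```python
import os

def should_post(interval_hours):
    """Gate an hourly workflow run: only post when the monotonically
    increasing run number lands on a multiple of the interval."""
    run_number = int(os.environ["GITHUB_RUN_NUMBER"])  # set by Actions per run
    return run_number % interval_hours == 0
```

Every subsequent step in the workflow is then conditioned on this result, so the hourly schedule fires 24 times a day but the bot only posts on its own (drifting) interval.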
And there, as a favourite professor would say, you have it.
The bots are all listed here, with their Masto & Bsky links; and, once again, you can study the source code here.
I’m happy to answer any & all questions wherever good questions are sold.
Addendum
At some point, Mastodon introduced functionality to more explicitly migrate user accounts… e.g., check out my own archived account on masto.social! Unfortunately, I got to this migration work late, and can (afaik) no longer sign into the botsin.space originals to perform that redirect, so you’ll have to follow those Masto bots anew. Apologies!