
Not Standing Still: Jupyter AI

Previously I’ve noted my appreciation for marimo notebooks, especially how their reactive cell model differs from Jupyter notebooks. Marimo has also been developing an interesting narrative around integration with agentic coding.

In this blog we explain why agentic coding tools like Claude work exceptionally well with marimo, especially when compared to other notebooks such as Jupyter. We also share tips on how to best use Claude when working with marimo. While this blog focuses on Claude Code, you don’t have to use your terminal if you don’t want to: in a future blog post we’ll describe how marimo provides a batteries-included AI-native editor, with a best-in-class experience for working with LLMs and your data in a single development environment.

However, Jupyter is an established, robust, and large ecosystem, with a lot of smart people at the forefront of data science and machine learning. So I should have known the Jupyter team would not stand still in the face of AI advances.

Enter Jupyter AI

read more ...


Carrot Weather, Meatbags

Daring Fireball’s John Gruber conducted an amusing interview with Brian Mueller, the developer behind Carrot Weather.

Carrot tells me my first install was a little under 4 years ago, on November 12th, 2021. Feels like it’s been much longer. I’ve probably opened it over 95% of the days since then.

Awaiting the unofficial transcript for some interesting pull quotes.

The fact that the app has personality is one of its best features. I first heard of the Will Smith / Chris Rock contretemps from Carrot.

It’s an extremely well designed, well featured, and non-intrusive app. I highly recommend it to all meatbags.


Cursor CLI

I’ve heard a lot of good things about Cursor, but thought it was another IDE with AI inside. And I am not an IDE guy.

Turns out Cursor has a CLI! From the docs:

Cursor CLI lets you interact with AI agents directly from your terminal to write, review, and modify code. Whether you prefer an interactive terminal interface or print automation for scripts and CI pipelines, the CLI provides powerful coding assistance right where you work.

Claude Code, Codex CLI, aider, Gemini CLI, and Cursor CLI. The quiver is getting mighty full. Time to get back to work.


Side Project Sage Advice

Ned Batchelder is a wise elder of the Python community. He recently offered this advice on side projects: “Forgive yourself.”

My advice is: forgive yourself. It’s OK to rewrite the whole thing. It’s OK to not rewrite the whole thing. It’s OK to ignore it for months at a time. It’s OK to stop in the middle of a project and never come back to it. It’s OK to obsess about “irrelevant” details.

The great thing about a side project is that you are the only person who decides what and how it should be.

As someone who’s using side projects as “auditionware”, I need to keep this philosophy in mind.

Batchelder also offers plenty of other good bits of advice in that post, especially the nugget about how “nice” becomes easier the more you do it. Gotta get the reps.

Meanwhile, just go read the whole thing.


Marimo Acquired

I haven’t extolled it too much here, but I’m a fan of what marimo has been doing in the computational notebook space. As a small, venture-backed startup, it was relatively safe to root for them. Of course, venture backing implies a certain space of outcomes, and one of them came to pass. Marimo joined CoreWeave.

TL;DR Marimo is joining CoreWeave. We’re continuing to build the open-source marimo notebook, while also leveling up molab with serious compute. Our long-term mission remains the same: to build the world’s best open-source programming environment for working with data.

marimo is, and always will be, free, open-source, and permissively licensed.

I’m hopeful that this won’t severely impact marimo for at least two to three years. There’s a fair bit of underlying robustness and stability that could be added alongside the strong drive for features. Given similar acquisitions I’ve seen, beyond that timescale it’s a crapshoot. Blitzscaling startups and existing Big Tech organizations purposefully make their acquisitions fundamentally different as they become further integrated. It’s sort of the point.

Of course some folks will be bent out of shape and declare “back to Jupyter!!” That’s reasonable, except from what I understand you can’t get to marimo from Jupyter.

Simon Willison puts the proper nuance on the current state of play:

Give (sic) CoreWeave’s buying spree only really started this year it’s impossible to say how well these acquisitions are likely to play out - they haven’t yet established a track record.


Impressed With Claude Code Web

It’s early days, but I had a really nice initial experience using Claude Code Web.

Claude Code on the web runs Claude Code tasks remotely, working with code from your GitHub repositories. This article explains how it works, when to use it instead of running Claude Code in your terminal or IDE, and what workflows it enables.

Simon Willison has a post with a much more detailed interaction, but overall he seems to align with how I felt.

I used it to modernize an old repo, lastfm-to-sqlite, which ingests Last.fm scrobbles into an sqlite database. The workflow just naturally progressed from using Claude Code on the CLI. Maybe I went overboard with the summaries, but I found it extremely helpful throughout the development process. My hope is that the summaries can either become archival documentation or be rolled into release notes.

As I worked, the integration of Claude Code, on the web and the command line, with the GitHub CLI made for a nice combination. My next experiment will be to see if I can sketch out some requirements in a GitHub issue, have Claude Code Web fetch it, and then churn out a solid PR. Leveraging GitHub issues and PRs as a system of record for agentic engagement with a codebase might be a best practice.

Also on my TODO list is giving Codex on the web a try.


Pygame GUI

Link parkin’: Pygame GUI

Pygame GUI is a module to help you make graphical user interfaces for games written in pygame. The module is firmly forward looking and is designed to work on Pygame 2 and Python 3.

The “Quick Start Guide” in the documentation leans on pygame CE.

GitHub source repo


cargo-update

Link parkin’: cargo-update

A cargo subcommand for checking and applying updates to installed executables

Handy if you cargo install tools like starship and atuin, amongst others. Was sort of surprised cargo didn’t have something like that baked in.


In Good Company: Agentic Generative Art

I was poking around in Claude’s configuration menus and noticed this:

A screen capture of the Claude desktop config menu showing a skill for algorithmic art

Well, well, well. Claude really does have a skill for generative art. Here’s a quick quote from the skill’s markdown:

Creating algorithmic art using p5.js with seeded randomness and interactive parameter exploration. Use this when users request creating art using code, generative art, algorithmic art, flow fields, or particle systems. Create original algorithmic art rather than copying existing artists’ work to avoid copyright violations.

A skill for Claude extends the system sort of like MCP and tools but with less code:

Claude can now use Skills to improve how it performs specific tasks. Skills are folders that include instructions, scripts, and resources that Claude can load when needed.

Claude will only access a skill when it’s relevant to the task at hand. When used, skills make Claude better at specialized tasks like working with Excel or following your organization’s brand guidelines.

The Anthropic engineering blog has a deeper dive into the overall design and intent of skills:

Building a skill for an agent is like putting together an onboarding guide for a new hire. Instead of building fragmented, custom-designed agents for each use case, anyone can now specialize their agents with composable capabilities by capturing and sharing their procedural knowledge. In this article, we explain what Skills are, show how they work, and share best practices for building your own.

And if that isn’t enough for you, Simon Willison jumps in with his trademark enthusiasm, Claude Skills are awesome, maybe a bigger deal than MCP

The algorithmic-art skill relies on p5.js to implement requested pieces. This makes sense given Claude Code’s TypeScript foundation. What’s crazy to me is that the Skills mechanism adds the entirety of that previously linked markdown to the LLM context if the skill is needed. That’s a lot of tokens consumed! On the flip side, it’s a tremendous example of actual prompting for generative art in the wild.

Just more motivation to pursue new dreams of agentic generative art using Python. Oft attributed to Picasso, “good artists copy, great artists steal.” I might not be either, but I’ll definitely be examining that skill for inspiration.


Diggin’ On: Tinzo and HYPERHOUSE

I’ve had a recent bout of chasing suggestions in Apple Music DJ Mix compilations. Basically following some of those “You Might Also Like” connections at the bottom of playlists. Two new artist discoveries are Tinzo and Anna Lunoe.

Tinzo has an attractive list of compilations, which I landed on through her Jazzy House Mix. It was, as advertised, a well done mix of jazz-flavored house cuts. Definitely off my beaten path, but with multiple head-nodding tracks, especially Feels Good (Yeah) [Kelly G. Little Louie Parti Mix].

Siblings Tinzo and Jojo are the co-founders and resident DJs behind the New York based YouTube channel and party, Book Club Radio.

As a Queer Filipino-American, Tinzo’s music is rooted in the LGBTQIA+ experience, with inspiration drawn from house, jazz, and trance. With a diverse skill set in performance, digital marketing, and event production, Tinzo is dedicated to promoting unapologetic joy and inclusivity within the music and live events industry.

HYPERHOUSE Radio is an Apple Music residency for Anna Lunoe. According to Wikipedia, she was in on the ground floor of Apple Music!

Hosted by magnetic tastemaker and underground-club killer Anna Lunoe, Season 3 of HYPERHOUSE is all about the Sydney-born DJ’s prowess. Twice a month on Apple Music 1, Lunoe will share a 60-minute mix exploring the best in new dance and electronic music.

HYPERHOUSE: Episode 72 was just refreshingly new to me. Didn’t feature anyone I was really aware of other than TSHA (goddess).

Between these two series and Beats in Space, there is an endless trove of DJ mixes to consume on Apple Music.


soco-scribbler: Agentic Development

I made some recent forward progress on my side quest project soco-scribbler. This is an effort to revise the sonos-lastfm project, which uses SoCo under the covers, to log track info from Sonos speakers.

The screen capture below illustrates monitoring of three speakers and track logging from one that’s actively playing. The ultimate goal is to emit the track info to other systems such as an sqlite database or a message streaming platform like nats. But just handing off to Python logging is a good start.

Screen capture of soco-scribbler running in a terminal console

The interesting bit is that I used OpenAI’s Codex CLI to flesh out and revise my broken handwritten first cut. Here’s the task list that I prompted Codex to create.

  • [x] Rework the scribble CLI options in src/soco_scribbler/soco_scribbler.py:18 to remove Last.fm credential flow, keep interval settings, and add logging parameters.
  • [x] Convert SocoScribbler into a subclass of SonosScrobbler that sets placeholder Last.fm env vars, disables the Last.fm network, and accepts logging configuration.
  • [x] Implement helpers to ensure log storage, format timestamped entries, append to file/console, and override scrobble_track to log locally while updating history.
  • [x] Update the command body to use the new options, set interval env vars, drop credential/setup handling, and keep the monitoring loop via run().
  • [x] Validate the new CLI surface with uv run python -m soco_scribbler.soco_scribbler --help and manually confirm logging output without Last.fm submission.
  • [x] Introduce platformdirs to resolve per-user config/data directories in an OS-aware way and refactor existing hardcoded paths.
  • [x] Update documentation/help text to describe the new config directory behavior powered by platformdirs.

Not only did I subclass the SonosScrobbler class to hijack track scrobbling, but I also specced out a bunch of other necessary but rote tasks. After approving the plan, I just told Codex to “make it so”. Thus the x’s checking off task completion. The result worked right out of the gate. There are some subsequent human-written adjustments to handle a bit of an edge case, but I think I could have just thrown the error message at Codex and it would have been fine fixing it.
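
To make the subclassing gambit concrete, here’s a stripped-down sketch of the shape that hijack takes. The import path and signatures are my approximation from the task list above, not the actual soco-scribbler code:

from datetime import datetime, timezone
from pathlib import Path

from loguru import logger
from sonos_lastfm import SonosScrobbler  # assumed import path, per the task list


class SocoScribbler(SonosScrobbler):
    """Hijack scrobbling: log track info locally instead of hitting Last.fm."""

    def __init__(self, log_path: Path, **kwargs):
        super().__init__(**kwargs)
        self.log_path = log_path
        self.log_path.parent.mkdir(parents=True, exist_ok=True)
        logger.add(self.log_path, rotation="1 week")

    def scrobble_track(self, track_info: dict) -> None:
        # No Last.fm submission -- just emit a timestamped entry locally
        stamp = datetime.now(timezone.utc).isoformat()
        artist = track_info.get("artist", "?")
        title = track_info.get("title", "?")
        logger.info(f"{stamp} | {artist} - {title}")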

Once I got up and rolling this was a quite pleasant experience. One that I might not have even given a go if not for agentic coding. As I’ve said in other venues, I’m skeptically optimistic but this decreases the level of skepticism a bit.


Pygame Community Edition

Searching my archives, looks like I first encountered pygame over 16 years ago 😲! pygame subsequently served me well for experimenting with generative art in Python.

Back in 2023, a post from Diego Crespo, about drama in the pygame community, came across my transom.

At over 20 years old, Pygame is one of the most widely used Python libraries for game development. It has inspired many beginners and hobbyists to learn Python and create their own games, including myself. It’s also used in schools for educational purposes, for quick prototyping of game mechanics, and multimedia applications. But recently it was forked. At first this shouldn’t seem unusual, because as of the time of writing, it has over 2.6k forks. People fork open source projects all the time, to contribute code, to make their own experimental changes, or just for fun. But this fork is different, as it is led by many of the core maintainers of the original project. The forked version of Pygame is called Pygame Community Edition (Pygame CE).

At the time, I duly noted the occurrence but was off doing other things and didn’t take any action.

However, with the benefit of improvements in the Python ecosystem and my own abilities, I’m contemplating revisiting my ancient peyote repository for generative art noodling. pygame Community Edition (pygame-ce) seems to have a lively community that’s keeping pygame up to date with the changes in Python.

Pygame is a free and open-source cross-platform library for the development of multimedia applications like video games using Python. It uses the Simple DirectMedia Layer library and several other popular libraries to abstract the most common functions, making writing these programs a more intuitive task.

This distribution is called ‘pygame - Community Edition’ (‘pygame-ce’ for short).

It is a fork of the upstream pygame project by its former core developers, and was created after impossible challenges prevented them from continuing development upstream. The new distribution aims to offer more frequent releases, continuous bugfixes and enhancements, and a more democratic governance model.
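
As a warm-up for that revisit, here’s a minimal sketch of the old pygame-plus-bit-manipulation approach, updated for pygame-ce (which installs as pygame-ce but still imports as pygame). The seeded noise field is just a placeholder for actual generative logic:

import numpy as np
import pygame

WIDTH, HEIGHT = 640, 480
rng = np.random.default_rng(42)  # seeded so the piece is reproducible

pygame.init()
screen = pygame.display.set_mode((WIDTH, HEIGHT))
clock = pygame.time.Clock()

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    # Push a NumPy pixel buffer straight to the window; surfarray is x-major
    pixels = rng.integers(0, 256, size=(WIDTH, HEIGHT, 3), dtype=np.uint8)
    pygame.surfarray.blit_array(screen, pixels)

    pygame.display.flip()
    clock.tick(30)

pygame.quit()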

It’ll definitely have to be part of the next steps.


Agentic Coding and Generative Art

Following up on my prior musings about Python and generative art, agentic coding could of course be part of building a Processing-like platform for Python. At the end of the day though, the actual generative code is just code.

What would agentic coding support for generative art pieces look like? On one hand, you could start very “vibe codey” with almost purely natural language expressions of what an artist wants. Spit out some code. Run a visual display. Iterate in natural language.

On the other hand, the agentic framework could assist proficient coders with leveraging underlying code features of the platform, even helping with domain-specific mathematical, vector graphic, and low-level pixel array manipulation (bit blitting) languages to support artistic flair.

Time for some research.


The CLAUDECODE Environment Variable

Across my personal gear, the bash login startup has a pretty florid banner generated by hyfetch. This is fine for the occasional human-paced terminal shell instantiation. However, when using Claude Code for agentic coding there’s a Bash tool which seems to use my login profile. The banner is just junk that eats up the context token budget, so I was casting about for a means to disable it when Claude was doing the invocation. Claude swore it wasn’t using a login shell, despite evidence to the contrary. I figured that with all the agentic coding excitement, surfacing a discussion of this point would be relatively straightforward. Sadly, I ended up a bit disappointed.

Turns out there’s a CLAUDECODE environment variable that is set to 1 for the Bash tool. I couldn’t really find any documentation other than this acknowledgment in a GitHub issue. Thought I’d publish a note on the Web in case someone else is struggling here as well. In any event, the following bash snippet conserves a few tokens.

# Skip the banner when Claude Code's Bash tool is driving the shell
if [[ -z "$CLAUDECODE" ]]; then
   # Only run the fetch banner if it's actually installed
   if [[ -x $HOME/.local/bin/neowofetch ]]; then
      neowofetch --package_managers off --package_minimal
   fi
fi

Alternatively, the Claude Code documentation indicates the ability to customize the environment through a settings.json file, thereby allowing you to define your own flag if you’d like.


More On Prek

From Hugo van Kemenade, some benchmarking of the performance of prek for managing git commit hooks:

prek is noticeably quicker at installing hooks.

⚡ About 10x faster than pre-commit and uses only a third of disk space. This 10x is a comparison of installing hooks using the excellent hyperfine benchmarking tool.

Here’s my own comparison.

Van Kemenade doesn’t see 10x, but definitely a significant performance boost. He also provides some handy shell aliases for pre-commit and prek.


TSDProxy

Link parkin’: TSDProxy

TSDProxy is an application that automatically creates a proxy to virtual addresses in your Tailscale network. Easy to configure and deploy, based on Docker container labels or a simple proxy list file. It simplifies traffic redirection to services running inside Docker containers, without the need for a separate Tailscale container for each service.

I landed on TSDProxy via exploration of incorporating dokku into my tailnet. On cursory examination, here be dragons. The tricky bit is if you want to serve SSL traffic from a container, which then requires some DNS and cert jujitsu. There is also a dokku tailscale plugin but the same configuration caveats apply.

Seems like an adventure worth diving into.


Modernizing Python Generative Art

Over 15 years ago, I completed a generative art hack in Python. Effectively I transliterated a work from Processing into a combination of pygame and some lower level bit manipulation libraries. Further work never got all that far into making a robust framework or re-implementing other works.

Fast forward to the current era and maybe it’s time for a revisit. Especially as I’m looking for some new themes for this blog. First off, there’s likely a free order of magnitude or two speedup that’s been gained just through processor and GPU improvements. Second, thanks to the AI bubble, the software layers for programming with accelerators in Python have vastly improved. Third, it looks like OpenGL isn’t the only way to do fast bit-level graphics anymore.

Just for grins, I’ve been tasking Gemini Deep Research to create reports on how this could be done. Here’s a sample:

I. Executive Synthesis: Framework Recommendations and Performance Benchmarks

A. The Architecture of Choice: Optimal Stack for Real-Time Generative Art

The investigation into Python frameworks suitable for real-time generative art, specifically those handling large bitmap images represented by NumPy arrays, identifies an optimal stack that minimizes CPU overhead and leverages modern hardware capabilities. The most architecturally sound and performance-oriented solution centers on the use of pygfx and fastplotlib, relying on the WGPU graphics backend.

This combination is strategically superior because it addresses the constraints inherent in legacy visualization tools. Frameworks such as VisPy and Pyglet, while mature, are typically built upon OpenGL. In contrast, WGPU represents a crucial evolutionary step in graphics APIs, serving as a high-level abstraction layer that translates uniformly across modern, low-level APIs, including Vulkan, Metal, and Direct3D. Since WGPU itself is a Rust implementation with C bindings, adopting a WGPU-based library inherently satisfies the requirement for low-level language bindings while maintaining a high-level Python application interface. This architectural choice is not merely an incremental performance improvement but a foundational necessity to fully exploit the parallel processing power of modern GPUs, ensuring greater long-term stability and maximizing performance gains compared to frameworks constrained by the legacy overhead of OpenGL.

This could also be an interesting exploratory application of agentic coding. I know a bit about graphics and generative art, but I’m not an expert. Maybe I could coding-centaur my way into a useful framework in a reasonable amount of time.


Diggin’ On Mëstiza

Lately I’ve been diggin’ on DJ mixes from the fabulous duo known as Mëstiza (warning: hella glitz on their site; here’s a more accessible biography and a Wikipedia entry). Blurb directly from their site, all caps included:

MËSTIZA HAS SOLIDIFIED THEMSELVES AS A DYNAMIC FORCE IN THE GLOBAL MUSIC SCENE, SEAMLESSLY AND UNIQUELY INTERTWINING ELECTRONIC MUSIC WITH THE RICH TRADITIONS OF FLAMENCO. THEIR INNOVATIVE ARTISTRY CELEBRATES THEIR SPANISH HERITAGE WHILE CHAMPIONING FEMALE EMPOWERMENT, CREATING A VIBRANT FUSION OF MUSIC, FASHION, AND CULTURAL STORYTELLING.

Rummaging around on Apple Music I discovered their Ushuaïa Ibiza mix sets. Outstanding blending of house and Spanish vibes. Love the stage garb as well. Now on the lookout for other live collections. Highly recommended.


prek

Link parkin’: prek

pre-commit is a framework to run hooks written in many languages, and it manages the language toolchain and dependencies for running the hooks.

prek is a reimagined version of pre-commit, built in Rust. It is designed to be a faster, dependency-free and drop-in alternative for it, while also providing some additional long-requested features.

I’ve plugged prek into a few repos and it feels like a winner.


Hopper: Python Developer Tooling Handbook

Link parkin’: Python Developer Tooling Handbook, by Tim Hopper.

This is not a book about programming Python. Instead, the goal of this book is to help you understand the ecosystem of tools used to make Python development easier and more productive. For example, this book will help you make sense of the complex world of building Python packages: what exactly are uv, Poetry, Flit, Setuptools, and Hatch? What are the pros and cons of each? How do they compare to each other? It also covers tools for linting, formatting, and managing dependencies.

Hopper’s handbook is a really rich resource. Despite the mention of other Python packaging frameworks and tools, clearly the uv wave (really the Astral wave if you add in ruff and ty) landed on those shores. There’s a lot of good actionable advice. And the Explanation section has a bunch of foundational, non-Astral, Python packaging wisdom. The attendant blog looks great as well.

Tim’s excellent PyBites Podcast interview episode tipped me off to the handbook.


Bootstrapping Python CLI Packages

As an avowed command line interface (CLI) guy, my default approach to building new Python functionality is to write a package that’s embedded within a CLI right from the get-go. Fortunately, Python is blessed with many packages to support this. click and typer are my go-tos. I so admire click that I believe the package should be a part of the Python standard library.

I also have a few opinions regarding packaging (uv please), logging (use loguru), and configuration (platform user directories + TOML files). If you do enough of these CLIs within a certain period of time, you start to yearn for some bootstrapping automation. Recently I landed on a couple of packages that align with my preferences and could really help here.
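
To give a sense of the target, here’s a hedged sketch of the skeleton I keep rebuilding by hand: typer for the CLI, loguru for logging, platformdirs plus tomllib for config. The app name and file layout are purely illustrative:

import tomllib
from pathlib import Path

import typer
from loguru import logger
from platformdirs import user_config_dir, user_data_dir

APP_NAME = "mytool"  # hypothetical name for illustration
app = typer.Typer(help="Example CLI skeleton.")


def load_config() -> dict:
    # Read TOML config from the per-user, OS-appropriate config directory
    config_file = Path(user_config_dir(APP_NAME)) / "config.toml"
    if config_file.exists():
        return tomllib.loads(config_file.read_text())
    return {}


@app.command()
def run(verbose: bool = typer.Option(False, "--verbose", "-v")):
    """Do the thing, logging to the per-user data directory."""
    log_dir = Path(user_data_dir(APP_NAME)) / "logs"
    log_dir.mkdir(parents=True, exist_ok=True)
    logger.add(log_dir / "mytool.log", rotation="1 week")

    config = load_config()
    logger.info("Starting with config keys: {}", list(config))
    if verbose:
        typer.echo("Running verbosely")


if __name__ == "__main__":
    app()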

First off is a batteries included cookiecutter for new Python packages.

There are many cookiecutter templates, but this one is mine and I’m sharing it with you. Create complete Python packages with zero configuration - including CLI, testing, documentation, and automated PyPI publishing via GitHub Actions.

Second is the typerdrive package (background from Tucker Beck)

During my time as an engineer working primarily with Python, I’ve written a fair number of CLIs powered by Typer. One type of project that has been popping up for me a lot lately involves writing CLI programs that interface with RESTful APIs. These are pretty common these days with so many service companies offering fully operational battlestations…I mean, platforms that can be accessed via API.

I’ve established some pretty useful and re-usable patterns in building these kinds of apps, and I keep finding new ways to improve both the developer experience and the user experience. Every time I go about porting some feature across to a new or old CLI, I wish there was a library that wrapped them all up in a nice package. Now, there is typerdrive:

These are the challenges I found myself facing repeatedly when building CLIs that talk to APIs:

  • Settings management: so you’re not providing the same values as arguments over and over
  • Cache management: to store auth tokens you use to access a secure API
  • Handling errors: repackaging ugly errors and stack traces into nice user output
  • Client management: serializing data into and out of your requests
  • Logging management: storing and reviewing logs to diagnose errors

typerdrive wraps all this up into an interface that’s easy for users and developers as well.

I could see myself creating a combination of these two into a new cookiecutter with a few tweaks of my own for AI engineered CLIs and REPLs. My thanks to the fine gentlemen who authored these packages and made them publicly available.


Python 3.14 Released

Python 3.14 got released recently. The team at Astral has a good feature overview amongst the many floating around on the ’Net. The overview is admittedly tinged with a focus on uv and ruff, so getting a few differing takes is a good idea. There’s nothing in this particular Python release I’m in a rush to try out, but the progress on exploiting processor concurrency is heartening.


marimo and quarto

Link parkin’: marimo + quarto

This repository provides a framework for integrating Quarto with marimo, enabling markdown documents to be executed in a marimo environment, and reactive in page.

Previously I’ve written about how marimo is an interesting project that’s advancing the state of the art in the Python computational notebook space. One of quarto’s claims to fame is straightforward incorporation of Jupyter notebooks in scientific publishing. As I’m getting up to speed with quarto, I ventured out to see how well marimo was integrated.

After giving it a quick test drive, the linked extension looks promising, but is a tad glitchy. Apparently a JavaScript support library for marimo that’s a few point releases behind main is necessary to get embedded interactivity working. Not that I desperately need that feature, but it’s mildly annoying.

If the extension sees continued support and improvement, I’ll be putting it to good use.


The Next Era

Typically I’m a “meta is murder” blogger. I prefer delivering content to talking about delivering content. Today I’m making a minor exception. Mainly for posterity.

Most of the action in the technology space writ large is driven by AI. It’s close to inescapable. Being an active technologist I’m swept up in it as well. Even if I wasn’t planning on making it a big part of the next day job (🤞), I’d likely be diving in out of pure curiosity.

The topic is big enough, and more career oriented, that I’m going to break my work in the space out into another site. memexponent.net will house all of my work and thoughts on AI engineering, hopefully building up a useful portfolio over time.

Where does that leave Mass Programming Resistance (MPR)? To Be Determined. Here’s a quick laundry list of areas I might take this site back to in depth:

  • Popular Media: Books, Music, Podcasts, Movies, Sports, Episodic Series (is it really TV anymore?)
  • Generative Art
  • Programming Language Design and Implementation
  • Data Management, Data Engineering, and Analytics
  • Non-AI Technology

Me being me, there has to be a tech angle. I can’t go pure culture and criticism though. Just need to find a proper balance.

More to come…


Pelican YAML Metadata

Link parkin’: a Pelican plugin that enables YAML formatted front matter

This Pelican plugin allows articles written in Markdown to define their metadata using a YAML header. This format is compatible with other popular static site generators like Jekyll or Hugo.

It is fully backwards-compatible with the default metadata parsing.

I’m also working up another blog that uses quarto. Quarto Markdown is Pandoc Markdown, extended to use YAML for its metadata. Eventually I’ll do some agentic coding to build a CLI tool to assist in creating new posts for either style of blog. So getting them lined up on the same format is a good thing.


uv and .env

From Daniel Roy Greenfeld, “TIL: Loading .env files with uv run”:

We don’t need python-dotenv, use uv run with --env-file, and your env vars from .env get loaded.

Good to know, even though I’m all in on direnv to auto load .env files. Also, handy to make Poe the Poet tasks that invoke uv underneath the covers really explicit.


Trey Hunner Cheatsheets

Link parkin’, from Trey Hunner’s site: Python Articles on Cheat Sheets

A collection of the many Python cheat sheets within Python Morsels articles and screencasts.

I especially like the cheatsheet on the pathlib module:

I now use pathlib for nearly all file-related code in Python, especially when I need to construct or deconstruct file paths or ask questions of file paths.

I’d like to make the case for Python’s pathlib module… but first let’s look at a cheat sheet of common path operations.
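
A taste of what the cheat sheet covers, constructing, deconstructing, and asking questions of paths with nothing but the standard library (paths here are illustrative):

from pathlib import Path

config = Path.home() / ".config" / "atuin" / "config.toml"  # construct

print(config.name)    # 'config.toml'  (deconstruct)
print(config.suffix)  # '.toml'
print(config.parent)  # the containing directory

if config.exists():   # ask questions of the path
    text = config.read_text()

# Globbing instead of os.listdir plus string munging
for md_file in Path("content").glob("**/*.md"):
    print(md_file.stem)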


Digging On Amapiano

Now that I have working search on this here blog I can ask questions like, “Have I ever mentioned amapiano music?” And as of this moment, the answer is “no”.

Let’s fix that.

Here’s the intro paragraph on amapiano from Wikipedia:

Amapiano is a genre of music from South Africa that became popular in mid-2012 with an earlier regular occurrence on South African radio stations in the early 2000s. It is a hybrid of kwaito, deep house, gqom, jazz, soul, and lounge music characterized by synths and wide, percussive basslines. The word “amapiano” derives from the IsiZulu word for “pianos”.

I can’t pinpoint an exact moment, but it most likely was a few years ago, coming out of the pandemic, when I first bumped into the genre. YouTube tags me as streaming this video of TxC playing Boiler Room London four years ago. I’m pretty sure that was the first hit because I was also somewhat astounded that I’d sit through an hour-long video of a DJ session. Or at least have it on in the background. Also, TxC are definitely a hot look on stage. Last but not least, that was likely my intro to Boiler Room, which will get a whole post of its own someday.

I’m still a House and DnB guy in the main, but I’m always game to toss in an amapiano mix discovery on Apple Music, which does some curation and promotion of the form, or anything eye catching from Boiler Room. Since amapiano partially derives from House this makes complete sense. Connecting with the African continent is a cherry on top.

Highly recommended.


Litestar Lookin’

I enjoyed a relatively recent James Bennett, erm, broadside, “Litestar is worth a look”. Broadside is probably too strong a term, but it gives you a sense of the tone. His post discusses why one should consider an alternative Python-based HTTP serving engine, Litestar, as a productive modern framework. In particular, he got in a few healthy shots at a couple of my faves, FastAPI and pydantic.

You save this as app.py, run with litestar run or hand it directly to the ASGI server of your choice, and it launches a web application. You go to /greet?name=Bob and it replies “Hi, Bob!”. Leave out the name parameter and it responds with an HTTP 400 telling you the name parameter is required.

So what. Big deal. The FastAPI Evangelism Strike Force will be along soon to bury you under rocket-ship emoji while explaining that FastAPI does the same thing but a million times better. And if you’re a Java person used to Spring, or a .NET person used to ASP.NET MVC, well, there’s nothing here that’s new to you; you’ve had this style of annotation/signature-driven framework for years (and in fact one thing I like about Litestar is how often it reminds me of the good parts of those frameworks). And did anyone tell you FastAPI does this, too! 🚀🚀🚀🚀🚀🚀🚀🚀🚀

But there are a lot of things that make Litestar stand out to me in the Python world. I’m going to pick out three to talk about today, and one of them is hiding in plain sight in that simple example application.
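
For reference, here’s my reconstruction of the kind of app.py Bennett describes, written against Litestar 2.x. It’s a sketch, not his exact code:

from litestar import Litestar, get


@get("/greet")
async def greet(name: str) -> str:
    # `name` is a required query parameter; leaving it off yields an HTTP 400
    return f"Hi, {name}!"


app = Litestar(route_handlers=[greet])

Save it as app.py, launch with litestar run, and /greet?name=Bob answers “Hi, Bob!” just as the post says.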

Here’s my summarization of his three points:

  1. Scalable management of route specification and organization
  2. Decoupling from Pydantic for schema validation and serialization/deserialization which enables …
  3. Application of SQLAlchemy, best of breed in the Python ecosystem, for database integration

The entire blog post is well worth reading and reasonably argued. Litestar won’t immediately become the first thing I reach for when building an HTTP backend. However, Bennett succeeded in provoking me to at least consider exploring Litestar for some future projects just to understand the tradeoffs and the developer experience. Robust alternatives are always good to know about. His closing graf captures the intent and outcome.

I could go on for a lot longer listing things I like about Litestar, and probably wind up way too far into my own subjective preferences, but hopefully I’ve given you enough of a realistic taste of what it offers that, next time you’re about to build a Python web app, you might decide to reach for 💡⭐ to carry you to the moon 🚀🚀🚀.


zsa cards

Link parkin’, just because they’re so beautiful, zsa cards

The tagline is “A deck of inspiration and connection.” I’ve bought both the Original and Premium versions and have a deck within easy reach right on my desk. When I get stuck in a monotonous monthly status update Zoom call that doesn’t really need my participation, I just flip through and admire the cards, letting my mind wander.

As advertised, they are quite attractive, entrancing objects. Highly recommended.


The marimo moment

In my previous full time gig, I did a bit of work implementing a set of APIs using FastAPI and deploying them into AWS. I needed to get a picture of API usage from external partners and AWS didn’t have anything to easily use straight out of the box. So I set about doing some dashboard development and initially thought about using Jupyter but decided to take a sidequest into marimo which had been popping up quite a bit on my podcast radar. Worked like a charm.

Also as part of my job, I built a little NLP model training platform on top of Coiled running in AWS. Highly recommend Coiled if you need to scale compute on AWS but have minimal in-house cloud and ops expertise and staffing. Especially coiled batch, which let us effectively use AWS GPU nodes. We were running super lean and didn’t have time to really drill down on all that AWS had to offer.

And now Coiled has illustrated running marimo notebooks on coiled.

My spider sense tells me marimo is having a moment and building towards escape velocity. It won’t dislodge Jupyter so much as provide a complement in the notebook ecosystem, similar to how polars complements pandas in the dataframe ecosystem.

Here are some data points about marimo being on my radar. These are all from podcasts I subscribe to and where I consume episodes regularly:

A Listen Notes search would seem to confirm my intuition that marimo is making a push to increase visibility, possibly due to a round of venture funding at the end of last year. Podcasts aren’t the only place marimo has been popping up. The founder, Akshay Agrawal, did a PyCon US 2025 talk which is available on YouTube, and the tech features prominently in the TalkPython course, “LLM Building Blocks in Python”.

I’ve found Agrawal to be pretty engaging and thoughtful in all of these conversations. He seems to be coming from a pragmatic place of hard won experience. It’s giving me confidence that this project might have legs.

My limited experimentation with marimo has shown promising shoots, although it’s a fast-moving target. I really like the reactive execution design choice and the underlying usage of plain Python as the notebook storage format. They’re integrating agentic AI features, of course, but there are interesting possibilities for agentic co-development of an interactive computational artifact along with the user. Probably worthy of doing some digging into the CS research literature for comps.
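
For the curious, this is roughly what a marimo notebook looks like on disk: plain Python, with each cell’s parameters and return tuple encoding the dependency graph that drives reactive execution. A sketch from memory, so details may vary by marimo version:

import marimo

app = marimo.App()


@app.cell
def _():
    base = 10
    return (base,)


@app.cell
def _(base):
    # Re-runs automatically whenever `base` changes upstream
    doubled = base * 2
    print(doubled)
    return (doubled,)


if __name__ == "__main__":
    app.run()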

Here’s to marimo finding traction and enduring.


ffmpeg, homebrew, and aac

Link parkin’ this Stack Overflow solution for personal reference:

Homebrew v2.0 dropped all of the extra options that are not explicitly enabled in each formulae. So the --with options no longer work if you use the core Homebrew formulae.

Instead you can use a third-party repository (or “tap”) such as homebrew-ffmpeg. This tap was created in response to the removal of the options from the core formulae.

$ brew tap homebrew-ffmpeg/ffmpeg
$ brew install homebrew-ffmpeg/ffmpeg/ffmpeg --with-fdk-aac
# or
$ brew install homebrew-ffmpeg/ffmpeg/ffmpeg --with-fdk-aac --HEAD

I have a script that uses ffmpeg to convert WAV files to m4a files for Apple Music. It needs the non-free Fraunhofer FDK AAC to properly encode the data and write out in the correct file format. It’s not often I use the script, though, and a skoosh of bitrot had settled into my Homebrew installation. This solution fixed things up in a pinch.


Blogaversary

What a momentous day!! According to my math, this blog is heading into its 17th year of existence. It all started way back when with a white plastic MacBook. That thing is still kicking as well 😲. An Intel Mac, I’ve had it happily running Ubuntu Linux for a few years. Despite only having 2 CPU cores and 4GB of RAM it still comes in handy for varied experiments.

But a downer as well. Today I got laid off from my current gig. I had a pretty good streak of picking my exits, but this came out of the blue. Won’t define me though. Just more time to really figure out what’s real and what’s BS in the agentic tool space.

Onwards to new adventures 🏴‍☠️ & 🥷!

Addendum: here’s the first post


Terminal Craft Time Sink

Hola peeps! It’s been a minute. Time for an update.

Nothing major on the life front. In fact, a bit of summer relief from the tyranny of kid activity. I’ve actually been able to sleep in a few days. And work slogs on per usual.

On the personal tech front though, I’ve been spending some quality time seriously reworking my terminal lifestyle. The last couple of posts hint at what’s been going on. Those were just initial steps and much more buffing, waxing, polishing, and refinement were needed. Let’s dive in …

read more ...


Adapting Atuin

As previously mentioned, atuin has been something of a godsend. Interacting with the bash command history is a complete joy. There’s a minor configuration tweak, illustrated below, that I want to mention in case it helps someone else out. I prioritized session and directory history ahead of global for lookup. Tab sprawl is the name of the game for me, each tab like an individual context. So session is the right place to start, even if it’s initially empty. global is just a few C-r hits away if needed.

[search]
## The list of enabled filter modes, in order of priority.
## The "workspace" mode is skipped when not in a workspace or workspaces = false.
## Default filter mode can be overridden with the filter_mode setting.
# filters = [ "global", "host", "session", "workspace", "directory" ]
filters = [ "session", "directory", "global", "host", "workspace" ]

The above TOML goes in ~/.config/atuin/config.toml

Meanwhile, I spent some quality time working on direnv configuration. Direnv is less of an immediate win because there’s some effort needed to add it to existing projects and use it for initiating new ones. Also, it works more as implicit magic than explicit navigation. Trey Hunner provided a good starting point but I’m molding his approach for my workflow. I’m working on a bash function to properly setup .direnv for my pre-existing uv based Python projects. pyenv is going the way of the dodo. Viva la Frank Wiles.

zoxide however, is going to take some getting used to. The directory teleportation mental model needs some burn in.


Adopting Atuin

For the longest time, like a decade or more, I’ve been really irritated by the behavior of the bash C-r key binding. It’s supposed to be a reverse history search by default. It has some non-obvious behavior though if you decide you’ve gone too far in your history.

The fix to bash is probably straightforward and I could have got on with my life. But sometimes I’m a dope. Anyway, I’m retooling my shell life, prompted by Frank Wiles’ post, My CLI World.

Enter atuin:

Atuin CLI

The magical shell history tool loved by developers worldwide. Sync your commands across machines, search everything instantly, and keep your data encrypted. Open source.

Yeah, atuin is what I needed. And being the dope I am, I’ve known about it for a while now. The cross-host synching seems cool, but the default synch server being someone else’s host put me off a little.

The initial experience is looking great and exceeding my expectations.

Yup. I really am a dope. Onwards to integrating direnv and zoxide.


Man In The AIrena

Let it be known that I made my first foray into agentic AI coding, two days ago on July 6th, 2025. I worked with Claude Code to start prototyping a little tool to build m3u playlists. This will come in handy for making my local collection of music files exposable as part of my OwnTone project.

It wasn’t quite vibe coding, as I didn’t let Claude off the leash to make changes on its own. All of its requests were reviewed personally. At the same time, I haven’t written a line of code in the repo. In fact, I’ve barely looked at any of the source code. Meantime, my initial runs for building playlists look pretty good. To top it off, now a bunch of ideas for extending the tool are cooking in my head.

The experience hasn’t been mindblowingly life altering, but it has been vaguely satisfying. This is a project I would have likely procrastinated on interminably. With an hour of effort, give or take, at least I’m started. Lots of boilerplate and mind numbing testing avoided. So there’s a path forward. And thoughts of many other deferred projects that could now be within reach. Credit to Andrew Ng, writing in The Batch, for the nudge to finally push out from the dock.

I’m completely sympathetic to all the folks who are apprehensive to pessimistic on where this is going and what it will ultimately cost. For me, guarded experimentation is the right path forward. It feels like there is some there there. Maybe (probably) not enough to match the huckster BS and snake oil, but possibly a useful normal technology. YMMV.


Owntone Dialtone

For the longest time, I’ve been dreaming of a hackable solution to drive my multiple Sonos speaker setup. A while back, I thought I could cobble something together based upon Music Player Daemon (mpd). Didn’t really have time to dig deep into setting things up. The other bit is that getting Sonos to pick up streams from mpd seemed a bit kludgey.

I kept plugging away doing background research and discovered OwnTone

OwnTone is a media server that lets you play audio sources such as local files, Spotify, pipe input or internet radio to AirPlay 1 and 2 receivers, Chromecast receivers, Roku Soundbridge, a browser or the server’s own sound system. Or you can listen to your music via any client that supports mp3 streaming.

You control the server via a web interface, Apple Remote, an Android remote (e.g. Retune), an MPD client, json API or DACP.

OwnTone also serves local files via the Digital Audio Access Protocol (DAAP) to iTunes (Windows), Apple Music (macOS) and Rhythmbox (Linux), and via the Roku Server Protocol (RSP) to Roku devices.

Did some holiday hunkering down and got an OwnTone server deployed on my homeLAN. Had some challenges doing a build under Linux on a mini-PC but I found a community package that worked out well. Turns out I had some Linuxbrew stuff causing conflicts. After that it was a little bit of firewall configuration aaaand …

crossjam’s OwnTone server dashboard

The frontend is nice, and OwnTone also provides an mpd facade that’s controllable with mpd clients like mpc. There’s also a good looking JSON API. Plus it supports Last.fm integration, although I’ll keep on with my soco-scribbler work. Even more beautiful is the fact that it’s reachable over my Tailscale network. All in all, a side project hacker’s delight.
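
Since the facade speaks the standard mpd protocol, any mpd client library should work against it. Here’s a quick hedged sketch with python-mpd2, assuming the facade is listening on the default mpd port; the hostname is illustrative:

from mpd import MPDClient

client = MPDClient()
client.connect("owntone.local", 6600)  # assumed host; 6600 is the mpd default

print(client.status())       # player state, volume, etc.
print(client.currentsong())  # whatever OwnTone is serving right now

client.close()
client.disconnect()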

Some burn-in needs to happen with a full playlist run, but this is promising. Given the protocols that OwnTone is designed to service (DAAP and AirPlay), it shouldn’t be a surprise it does an excellent job of finding my AirPlay 2 speakers as output targets. I also have a pair of HomePod Minis that are only lightly used. Maybe this will give them renewed life.


On Tyranny

Today’s a good day to start a reread of Timothy Snyder’s On Tyranny. Might even complete it in one sitting.

Don’t Be A Bystander


1Password CLI

Just have to throw a shout out to the 1Password CLI tool. It’s actually a really elegant way to plug your friendly neighborhood secrets manager into the terminal/shell life. And the integration with Apple’s Touch ID is well done. Highly recommended.

© 2008-2025 C. Ross Jam. Built using Pelican. Theme based upon Giulio Fidente’s original svbhack, and slightly modified by crossjam.