- Artificial Intelligence (LLMs, AI agents, and the future of human expertise)
- Blockchain (Decentralized infrastructure, networks, and ecosystem evolution)
- Data Engineering (Building data infrastructure that actually scales)
- DevOps (Infrastructure, automation, and operational philosophy)
- General (Culture, science, and miscellany)
- Retro Computing (The machines and culture that shaped computing)
- Music Production (Gear, sound design, and creative workflow)
- Personal Development (Expertise, craft, and the engineering mindset)
- Space (Infrastructure and vision for human expansion beyond Earth)
Physical Modeling Synthesis: The Underrated Future of Sound Design
If you’ve spent any time with Pianoteq or the Audio Modeling SWAM instruments, you’ve felt something different. Not the crisp accuracy of a sampled library, not the flexibility of wavetable synthesis - but something that responds like an instrument. Strings that vibrate with sympathetic resonance. Piano keys with wooden resistance. A cello that sings differently when you bow it hard versus soft. This is physical modeling: mathematics as an instrument, not just a sampler or synth engine. ...
MPE Deep Dive: Why Expressive MIDI Changes Everything
If you have spent any time around electronic music in the last decade, you have probably seen the letters MPE written on the side of a controller and not thought too much about them. The acronym sounds like a feature bullet. It is not. It is a quiet but fundamental reframing of what an electronic instrument can do, and once you have spent serious time playing one, going back to a fixed-velocity keyboard feels like trading a touch screen for a number pad. ...
Hardware Sequencers in 2026: When Physical Beats Software
By mid-2026, the “in-the-box” vs “out-of-the-box” debate has fundamentally shifted. We no longer argue about analog warmth or filter aliasing - neural synthesis has made those distinctions almost invisible to the ear. The new battleground is cognitive load, and that is where dedicated hardware sequencers are quietly winning ground back. As I argued in The Automation Paradox, once AI can generate a passable 16-bar loop in seconds, the human’s job shifts to curation and intent. A hardware sequencer is the most direct tool we have for enforcing that intent. ...
Music Production: Mobile Apps for iPad and iPhone
iOS has quietly become one of the most interesting platforms for music making. The combination of multi-touch, low-latency audio, AUv3 plug-in hosting, and instant-on hardware means an iPad can sit somewhere between a notebook for ideas and a fully credible studio instrument. The list below is what I keep installed and reach for - synths I actually open, grooveboxes that have made it into finished tracks, and a couple of learning tools that are genuinely useful. Apps are grouped by developer so you can find related instruments quickly. Where an app is iPhone-only or particularly suited to one device, I’ve called that out. ...
The Best MPE Controllers in 2026
MIDI Polyphonic Expression (MPE) is an extension of MIDI that gives every note its own continuous control over pitch, pressure, and timbre. Where standard MIDI shares pitch bend and aftertouch across the whole channel, MPE spreads notes across multiple channels so each voice in a chord can be bent, swelled, or shaped independently. In practice this means you can hold a chord and bend a single note up, slide between voicings without re-triggering, add vibrato to the top line of a phrase while the rest sustains cleanly, and play electronic instruments with the kind of per-note nuance that string and wind players have always taken for granted. ...
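The per-note channel idea above can be sketched in a few lines of Python. This is a hypothetical helper, not a real library API, and it assumes an MPE lower zone (channel 1 as the master channel, channels 2-16 as member channels); the `MpeAllocator` name and its methods are made up for illustration:

```python
# Minimal sketch of MPE per-note channel allocation. Each sounding note is
# assigned its own member channel, so pitch bend sent on that channel
# affects only that note - the core trick MPE uses to get per-note expression
# out of ordinary MIDI 1.0 messages.

class MpeAllocator:
    def __init__(self, member_channels=range(2, 17)):
        self.free = list(member_channels)   # member channels available for new notes
        self.active = {}                    # note number -> assigned member channel

    def note_on(self, note, velocity):
        channel = self.free.pop(0)          # take the oldest free channel
        self.active[note] = channel
        # 0x90 | (channel - 1) is a MIDI note-on status byte on that channel
        return bytes([0x90 | (channel - 1), note, velocity])

    def bend(self, note, bend_14bit):
        # Per-note pitch bend: emitted only on the note's own member channel,
        # so the other notes in a held chord are untouched.
        channel = self.active[note]
        lsb, msb = bend_14bit & 0x7F, (bend_14bit >> 7) & 0x7F
        return bytes([0xE0 | (channel - 1), lsb, msb])

    def note_off(self, note):
        channel = self.active.pop(note)
        self.free.append(channel)           # recycle the channel for reuse
        return bytes([0x80 | (channel - 1), note, 0])

alloc = MpeAllocator()
# Hold a C major chord: three notes land on three different channels (2, 3, 4)
chord = [alloc.note_on(n, 100) for n in (60, 64, 67)]
# Bend only the E (note 64); 8192 is centre, so this bends it sharp
bend = alloc.bend(64, 9216)
```

A compliant synth receiving this stream applies the bend only to the voice playing on channel 3, which is exactly the "bend a single note inside a chord" behaviour described above.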
Suno in May 2026: where the platform actually is
TL;DR - Suno v5.5 (March 2026) is the most expressive model yet, and three personalisation features finally make the platform usable as a real workflow: Voices (clone your own verified singing voice), Custom Models (fine-tune v5.5 on your own catalogue), and My Taste (lightweight preference learning for everyone). The Warner Music deal is now visible in the product - older models are being deprecated, free accounts have lost commercial download rights, and the ownership language has softened from “you own this” to “you have commercial rights.” Best used for demos, stem libraries, and personal sound signatures; still risky for releases that need clean copyright provenance. ...
AI Agents That Actually Work: Patterns From Real Projects
I have spent the last eighteen months building, reviewing, or operating systems that some marketing department somewhere has called “agents”. The definition has been so thoroughly stretched that it now means anything from a chatbot with a calculator tool to a long-running autonomous workflow that touches production infrastructure. Underneath the noise there is a real engineering discipline emerging, and the patterns that separate the systems that survive contact with real users from the ones that demo well and fall over are becoming legible. ...
My Tracks - April 2026
A selection of my music production work from April 2026. I move freely between funky house, chillsynth, ballads, techno, hard house and instrumental soundscapes. I build tracks around rhythm, mood and tiny sparks of emotion that grow into something bigger. Some tunes hit hard, some float, some just wander in and make themselves at home. Many of the tracks have been remastered and almost all album art has been updated, so the tracks have been republished. ...
Connecting Claude to Ableton: Why the New Knowledge Connector Matters
On 28 April 2026 Anthropic shipped a batch of nine creative-tool connectors for Claude, and one of them is the Ableton Knowledge connector. It is a small thing on the surface and a big thing underneath. Here is what it does, what it does not do, and why it matters if you spend your evenings inside Live or staring at a Push.

What the Connector Actually Does

The official Ableton connector grounds Claude’s answers in Ableton’s own product documentation for Live and Push. That is the whole pitch, and it is more useful than it sounds. ...
A Year of Agents, and What is Coming Next
A year ago, in April 2025, “AI in your workflow” mostly meant a chat window in a browser tab and an autocomplete plugin in your editor. You typed, it suggested, you accepted or rejected. The interaction model was small. The blast radius was small. The verb was “ask”. In April 2026, the verb is “delegate”. That is the headline, and it is not subtle once you go looking for it. The tools you use day to day no longer wait for prompts. They run for minutes at a time, open files, edit them, run shells, spin up sub-agents, browse the web, and come back with a result that is either roughly right or visibly wrong. You are no longer in the loop on every keystroke. You are in the loop on the outcome. ...