• Artificial Intelligence (LLMs, AI agents, and the future of human expertise)
  • Blockchain (Decentralized infrastructure, networks, and ecosystem evolution)
  • Data Engineering (Building data infrastructure that actually scales)
  • DevOps (Infrastructure, automation, and operational philosophy)
  • General (Culture, science, and the miscellaneous)
  • Retro Computing (The machines and culture that shaped computing)
  • Music Production (Gear, sound design, and creative workflow)
  • Personal Development (Expertise, craft, and the engineering mindset)
  • Space (Infrastructure and vision for human expansion beyond Earth)

Scott Galloway on AI: The Marketing Professor's Case That the Rich Don't Need You Anymore

Scott Galloway is the kind of commentator the AI conversation rarely produces: not a researcher, not a founder, not a doomer, not a booster. He is a marketing professor and a serial entrepreneur with a record of correctly reading the corporate stories of the last two decades, and he has spent the last two years pointing at the AI story with increasing concern. The headline of his pitch - that AI was not built for ordinary people and that the rich no longer need them - is provocative on purpose. The argument underneath is more careful, and worth pulling apart on its own terms. ...

May 4, 2026 · 14 min · James M

Hybrid Systems: Montage + MC-707 Architecture and Workflow

The Yamaha Montage M and the Roland MC-707 are both, on paper, complete instruments. The Montage is a flagship synth workstation with three distinct sound engines and the kind of polyphony and DSP headroom that makes most studio plugins look slow. The MC-707 is a compact groovebox with eight tracks, an internal sequencer, sample playback, and the kind of immediate hands-on workflow that makes laptop production feel laborious by comparison. ...

May 4, 2026 · 9 min · James M

ETL Tools & Data Integration Platforms

What is ETL?

ETL is a foundational data engineering process that powers modern analytics:

  • Extract - Retrieve data from various sources (databases, APIs, files, cloud services, streaming platforms)
  • Transform - Clean, validate, deduplicate, and reshape data into required data models
  • Load - Move processed data into data warehouses, data lakes, or analytical systems

ETL ensures data quality, consistency, and accessibility for analytics and reporting. In 2026 the dominant pattern is ELT (Extract-Load-Transform), which leverages cloud data warehouse compute for transformation, and increasingly EtLT (adding lightweight pre-load transforms for streaming and schema drift). See the Fundamentals of Data Engineering book for a deeper framing. ...

May 4, 2026 · 9 min · James M
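The Extract-Transform-Load steps described above can be sketched in miniature. This is an illustrative toy, not any particular tool's API: the function names, record shapes, and the in-memory "warehouse" list are all invented stand-ins for real sources and targets.

```python
# Minimal ETL sketch: extract raw records, transform (clean, validate,
# deduplicate), load into a target store. All names are illustrative.

def extract():
    # Stand-in for pulling rows from a database, API, or file.
    return [
        {"id": 1, "email": " Alice@Example.com "},
        {"id": 1, "email": " Alice@Example.com "},  # duplicate row
        {"id": 2, "email": "bob@example.com"},
        {"id": 3, "email": None},                   # fails validation
    ]

def transform(rows):
    # Clean: normalise emails. Validate: drop rows with no email.
    # Deduplicate: keep the first row seen per id.
    seen, out = set(), []
    for row in rows:
        if not row["email"] or row["id"] in seen:
            continue
        seen.add(row["id"])
        out.append({"id": row["id"], "email": row["email"].strip().lower()})
    return out

def load(rows, warehouse):
    # Stand-in for a bulk insert into a warehouse table.
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```

In the ELT variant mentioned above, the `transform` step would instead run as SQL inside the warehouse after loading the raw rows.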

The State of Blockchain in 2026

TL;DR

The blockchain industry in 2026 is no longer arguing about whether it has a future. The arguments are about which layers do which jobs.

  • Bitcoin remains the reserve asset and the most credible neutral settlement layer.
  • Ethereum is the dominant smart-contract base layer, with most activity now happening on its Layer 2s.
  • Solana has taken the high-throughput application crown.
  • Polkadot is mid-pivot from infrastructure to applications.
  • The two structural shifts that define 2026 are modular blockchains (Celestia, EigenLayer) and the stablecoin economy, whose annual settlement volume now exceeds Visa’s.
  • Real-world asset tokenization has gone from a slide-deck thesis to a $30B+ live market, led by BlackRock’s BUIDL and tokenized US treasuries.

The destination for the next two years is clear: payments, treasuries, and AI agents using crypto rails - and most users will not know they are using a blockchain.

What Actually Survived

It is worth saying out loud: most of the things that called themselves “the future of finance” in 2021 are gone. The 2022-2023 unwind cleared out the projects that had no users, no revenue, and no reason to exist. What remains in 2026 is a much smaller, much more boring, and much more useful set of networks. ...

May 4, 2026 · 15 min · James M

The Physics and Philosophy of Interstellar

There are not many films where the visual effects pipeline produces a peer-reviewed physics paper. Christopher Nolan’s Interstellar is one of them. The visualisation of the supermassive black hole Gargantua was rigorous enough that it ended up in Classical and Quantum Gravity, co-authored by the visual effects team and Nobel laureate Kip Thorne. That single fact captures what makes the film unusual. It is, on the surface, a story about love, time, and survival. Underneath, it is a serious attempt to take Einstein’s general relativity and put it on a 70mm IMAX screen with as little fudging as Hollywood would allow. ...

May 4, 2026 · 14 min · James M

Space Debris Is a Tragedy of the Commons - Here's the Math

TL;DR

  • Low Earth orbit (LEO) in 2026 contains roughly 40,000 tracked objects larger than 10cm and an estimated 1 million pieces of debris between 1cm and 10cm. Most of it is dead satellites, spent stages, and fragments from past collisions or anti-satellite tests.
  • The economics are a textbook tragedy of the commons. Each launch operator captures the upside of putting hardware in orbit. The cost of debris is shared across every other operator and every future mission. There is no global price on creating debris.
  • The risk is non-linear. Kessler syndrome - a cascade where collisions create more debris that triggers more collisions - is not a hypothetical. We are already in the early stages in some altitude bands.
  • The fix is also a textbook commons solution: price the externality. End-of-life deorbit requirements, debris remediation funds, transparent collision-avoidance markets, and active debris removal services. Some of this exists. Most of it is undersupplied.
  • The realistic 2026 picture: not yet a crisis, on a trajectory that becomes one within a decade if nothing changes, with the most useful policy interventions being the ones that price debris creation directly rather than relying on goodwill.

The Numbers

Order-of-magnitude figures, drawn from ESA’s space debris office and LeoLabs tracking data, as of 2026: ...

May 3, 2026 · 9 min · James M
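The non-linearity claimed above comes from collisions being pairwise events: the expected collision count in a band scales with the square of the object count. A toy calculation makes the shape of that feedback visible. Every parameter here (`k`, `fragments_per_collision`) is invented for illustration; these are not ESA or LeoLabs figures.

```python
# Toy sketch of why debris risk is non-linear. Collisions are pairwise
# events, so expected collisions per year scale with the square of the
# object count. All parameters below are hypothetical placeholders.

def expected_new_fragments(n_objects, k=1e-11, fragments_per_collision=500):
    """Expected debris fragments created per year in one altitude band.

    k is a made-up per-pair collision probability bundling orbital
    density, relative velocity, and cross-section into one number.
    """
    expected_collisions = k * n_objects ** 2
    return expected_collisions * fragments_per_collision

base = expected_new_fragments(40_000)
dense = expected_new_fragments(80_000)  # same band, twice the objects

# Doubling the population roughly quadruples expected debris creation.
# That quadratic feedback is the mechanism behind Kessler syndrome:
# each fragment added raises the collision rate for everything else.
print(base, dense)
```

The commons framing follows directly: each operator adds to `n_objects` and captures the benefit, while the quadratic growth in collision risk is paid by everyone in the band.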

China's Space Programme in 2026 - Tiangong, Chang'e, Lunar Plans

TL;DR

China’s space programme in 2026 is one of the most consistently executed national space efforts in history. Where Western programmes have lurched between budgets and political cycles, China’s CNSA has shipped roughly what it announced, on roughly the timelines it announced.

  • The Tiangong space station is fully operational, continuously crewed, and has hosted both domestic and international experiments.
  • The Chang’e lunar series has progressed from sample return (Chang’e 5, 6) to the precursors of a crewed lunar landing programme planned before 2030. China has now returned samples from both the near and far sides of the Moon - the only nation to have done so.
  • The lunar plan centres on the International Lunar Research Station (ILRS) - a long-term, China-led, multinational lunar surface base, with crewed landings as a milestone rather than the goal.
  • Mars sample return, deep-space exploration, and a permanent lunar presence are all on a credible timeline.

The realistic 2030 picture is two distinct, durable lunar architectures - American and Chinese - running in parallel.

Why It Is Worth Looking Carefully

It is easy in Western coverage to treat China’s space programme as a backdrop to the Artemis story. That undersells what is actually happening. ...

May 3, 2026 · 9 min · James M

Self-Hosted vs Managed in 2026 - The Cost Math Has Changed Again

TL;DR

The self-hosted vs managed decision in 2026 is genuinely different from the same decision in 2022. The math has shifted in three directions: cloud egress costs, AI workload economics, and self-hosted tooling maturity.

  • Managed remains the right default for most teams. What has changed is that the threshold at which self-hosting becomes worth considering has dropped. Workloads that were obviously managed in 2022 are genuine 50/50 calls in 2026.
  • The most important shift is that self-hosting is no longer synonymous with on-premises. Modern self-hosting often means renting bare-metal in a colocation, running your own clusters in a hyperscaler, or using sovereign cloud providers - all with different economics.
  • For specific categories - AI inference at scale, data egress-heavy workloads, predictable steady-state compute, regulated environments - self-hosting now wins on cost more often than people assume.

The honest framing: managed is the right default; self-hosting is the right minority case; the minority is bigger than it used to be.

Why This Decision Got Harder

For most of the 2010s the answer was easy. Managed services were cheaper than self-hosting once you priced in operational overhead. The cloud providers competed aggressively. Self-hosting was for the regulated, the eccentric, and the very large. ...

May 3, 2026 · 9 min · James M
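The cost math above can be made concrete with a back-of-envelope comparison. Every figure below is a hypothetical placeholder (plug in your own quotes); the point is the shape of the calculation: managed bills are dominated by compute plus egress, while self-hosting trades egress fees for rack rent, amortised hardware, and engineering time.

```python
# Back-of-envelope monthly cost sketch for managed vs self-hosted.
# All dollar figures are invented placeholders, not real price lists.

def monthly_cost_managed(compute_usd, egress_gb, egress_usd_per_gb):
    # Managed bill: metered compute plus per-GB data transfer out.
    return compute_usd + egress_gb * egress_usd_per_gb

def monthly_cost_self_hosted(colo_usd, hardware_usd_amortised,
                             ops_hours, ops_usd_per_hour):
    # Self-hosted bill: rack rent, hardware amortised over its life,
    # and the engineering time it takes to run the thing.
    return colo_usd + hardware_usd_amortised + ops_hours * ops_usd_per_hour

managed = monthly_cost_managed(
    compute_usd=6_000, egress_gb=50_000, egress_usd_per_gb=0.08)
self_hosted = monthly_cost_self_hosted(
    colo_usd=1_500, hardware_usd_amortised=2_000,
    ops_hours=40, ops_usd_per_hour=90)

# With heavy egress, transfer fees dominate the managed bill - exactly
# the regime where the self-hosting threshold has dropped.
print(managed, self_hosted)
```

Shrink `egress_gb` toward zero and the managed option wins again, which is the "managed is the right default" half of the framing.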

The eBPF Revolution - What Every Platform Engineer Should Know

TL;DR

eBPF is the technology that lets you run safe, sandboxed programs inside the Linux kernel without writing kernel modules. In 2026 it is the foundation under most serious observability, networking, and runtime security tools.

  • The interesting story is not the technology itself - it is the wave of products built on top of it: Cilium for networking, Tetragon for runtime security, Pixie, Parca, and Coroot for observability, plus a long tail of vendor offerings using eBPF under the hood.
  • For platform engineers, eBPF is not “a thing you have to learn to write.” It is a thing you have to know about so you can choose tools intelligently and understand what is happening on your nodes when those tools cause problems.
  • The most important shift eBPF has enabled is observability without instrumentation. You can see what is happening on a system without modifying the application, without restarting it, and with low overhead. That is genuinely new.

What eBPF Actually Is

eBPF stands for “extended Berkeley Packet Filter,” which is historical and confusing because eBPF has long since outgrown packet filtering. The simple version: ...

May 3, 2026 · 9 min · James M

AI-Native Pipelines - What Changes When Your Consumer Is an LLM, Not a Dashboard

TL;DR

Data pipelines were optimised for human consumers - dashboards, BI tools, analysts. In 2026 a growing share of pipeline output flows directly to language models, agents, and retrieval systems. That changes the design constraints in ways that catch teams off guard.

  • Aggregation matters less. Context fidelity matters more. Freshness behaves differently. Schema moves from rigid to negotiated. Cost shifts from compute to tokens.
  • The biggest mistake is treating an LLM consumer as if it were just another dashboard. It is not. It does not skim, it does not interpret charts, it does not have working memory across rows. It needs to be fed.
  • The new patterns - retrieval-aware partitioning, embedding pipelines, structured-document outputs, prompt-shaped views, evaluation harnesses for data quality - are the actual subject of “AI-native data engineering” in 2026.

The Underlying Shift

For thirty years the implicit consumer of every data pipeline was a human looking at a screen. Even when the pipeline ended in an API or a CSV, the conceptual end-user was someone who would interpret the output with judgement, context, and skim-reading. ...

May 3, 2026 · 9 min · James M
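The "structured-document output" pattern mentioned above can be sketched in a few lines: instead of aggregating rows for a chart, the pipeline emits self-contained text documents that a retrieval system or LLM can consume directly. The field names, table name, and document shape here are illustrative inventions, not a standard.

```python
# Sketch of a structured-document output for an LLM consumer. Because
# a model has no working memory across rows, each emitted document
# carries its own context inline: provenance, freshness, and fields.
# All names below (billing.accounts, mrr_usd, ...) are hypothetical.

def row_to_document(row, source_table, as_of):
    # Provenance and freshness first, then one "key: value" line per
    # field, so the document is meaningful when retrieved in isolation.
    lines = [f"source: {source_table}", f"as_of: {as_of}"]
    for key, value in row.items():
        lines.append(f"{key}: {value}")
    return "\n".join(lines)

row = {"customer_id": 4217, "plan": "enterprise",
       "mrr_usd": 12_500, "churn_risk": "low"}
doc = row_to_document(row, source_table="billing.accounts",
                      as_of="2026-05-01")
print(doc)
```

A dashboard-oriented pipeline would have aggregated thousands of such rows into one MRR number; the LLM-oriented pipeline preserves each row's context so retrieval can feed the model exactly the records a question touches.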