There is a lot of research showing that the type of calorie you consume determines, to some extent, the next calorie you want to consume. You are more likely to be "sated" (i.e. not want to eat more calories) if you eat protein than if you eat ultra-processed carbohydrates; low-calorie soda will leave your body craving sugar; and so on.
When you couple this with the motivations of industrial food companies (some of whom are now owned by tobacco companies), and the research they do into the neurological effects of flavour, texture, even the packaging of food, you'll start to see why a push for "Real Food" - food that is less processed and more inclined towards protein - is more likely to result in overall calorie reduction.
One of the things that isn't cutting through on this program is that "eat protein" gets heard as "eat meat", which some then assume means you can eat burgers. Nope. Healthy protein is not red meat that has been fried - that's going to take a bit more education, I expect.
It is a chain of very cheap pubs, often known by the abbreviated name "spoons".
It's the sort of place you can go at 9am and see people having a full English breakfast with a large glass of wine. It's people who want to drink a lot of alcohol for not a lot of money, but not quite at the point where they're buying very cheap cider (which is always alcoholic in the UK), and sitting in the park with it. There's a veneer of high-functioning about it.
They do vary a bit (the "posh pub" in central Hull is the 'spoons, one of the roughest pubs I've been to in West London is also a 'spoons), but the clientele are typically white, working class, pro-Brexit (the founder is very anti-EU and publishes an in-house propaganda mag to that effect), pretty right wing, heavy drinkers.
It's not my preferred crowd, I'd rather spend a bit more and go to a pub where there's a chance somebody is reading something other than the Daily Mail or The Sun, but each to their own.
> pro-Brexit (the founder is very anti-EU and publishes an in-house propaganda mag to that effect), pretty right wing, heavy drinkers
That's a massive stretch. In my experience, the common denominator with Wetherspoons is that it's somewhere people go for the cheap drinks and food. You get people of different backgrounds, age ranges and political beliefs going to Wetherspoons pubs (including plenty of apolitical people). The only undeniably true statements are that Tim Martin was pro-Brexit and that there was anti-EU material in the Wetherspoons magazine around the time of the Brexit referendum. Beyond that, it's not an issue that's particularly high profile anymore; it's not part of daily conversation like it once was, and many people have moved on from discussing it.
> Matz: But as a programmer, the language I want to use might more often be C. I'm a C programmer, have been for many years. A C programmer for decades.
He continues: "C++. Well, I've programmed in it but it's kind of like, please spare me - so I prefer C."
I also liked this part:
> ..From a specification perspective, Lisp and SmallTalk are like seniors to me. There's a lot I can reference from them. They're like a treasure trove of ideas. If anything, I often go to them, get ideas, and incorporate them into Ruby.
I agree with you on all of this, and found myself wondering if the author had actually studied the Industrial Revolution at all.
The Industrial Revolution created a flywheel: you built machines that could build lots of things better and for less cost than before, including the parts to make better machines that could build things even better and for less cost than before, including the parts to make better machines... and on and on.
The key part of industrialisation, in the 19th-century framing, is that you have in-built iterative improvement: by driving down cost, you increase demand (the author covers this), which increases investment in driving down costs, which increases demand, and so on.
Critically, this flywheel has exponential outputs, not linear ones. The author shows Jevons paradox, and the curve is right there - note the lack of a straight line.
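To make the shape of that concrete, here's a toy sketch (entirely made-up numbers, not anything from the article): if each turn of the flywheel reinvests to cut unit cost a little, and cheaper goods pull in a little more demand, output compounds instead of growing by a fixed step per cycle.

```python
# Toy flywheel: cheaper production -> more demand -> more reinvestment -> cheaper production.
# The 10% and 15% figures are arbitrary assumptions, purely for illustration.
unit_cost, output = 1.00, 100.0

for cycle in range(1, 6):
    unit_cost *= 0.90   # reinvestment drives unit cost down 10% per cycle
    output *= 1.15      # lower prices pull in 15% more demand per cycle
    print(f"cycle {cycle}: unit cost {unit_cost:.2f}, output {output:.0f}")

# Prints roughly 115, 132, 152, 175, 201 - compare with a linear +15 per cycle
# (115, 130, 145, 160, 175): the gap keeps widening, which is the point.
```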
I'm not sure we're seeing this in AI software generation yet.
Costs are shifting in people's minds, from developer salaries to spending on tokens, so there's a feeling of cost reduction - but a great deal of that spend seems to be heavily subsidised today.
It's also not clear that these AI tools are being used to produce exponentially better AI tools - despite the jump we saw around GPT-3.5, quantitative improvement in output seems to remain linear as a function of cost, not exponential. Yet investment input seems to be exponential (which makes this feel more like a bubble).
I'm not saying that industrialisation of the type the author refers to isn't possible (I'd even say most industrialisation of software happened back in the 1960s/70s), or that the flywheel can't pick up with AI, just that we're not quite where the author thinks we are.
I'd also argue it's not a given that we're going to see the output of "industrialisation" drive us towards "junk" as a natural order of things - if anything we'll know it's not a junk bubble when we do in fact see the opposite, which is what optimists are betting on being just around the corner.
The parent comment was, I think, specifically talking about sitcoms.
Sitcoms are - and I know this is a little condescending to point out - comedies contrived to exist in a particular situation: situation comedy → sitcom.
In the old days, the "situation" needed to be relatively relatable and static to accommodate drop-in viewers channel surfing, or the casual viewer the parent described.
Soap operas and other long-running drama series are built differently: they are meant to have long story arcs that keep people engaged over many weeks, months or years. There are throwbacks to old storylines, there are twists and turns to keep you watching, and if you miss an episode you get lost - so you don't ever miss an episode, or the soap adverts within them (their reason for being, and the source of the name), lest you fall behind on everything.
You'll find sports networks try to build a story arc around games too - creating a sense of "missing out" if you don't watch the big game live.
I think the general point is that in the streaming-subscription era, everything has converged on this "don't miss out" form, doubling down on the need to see everything from the beginning and become a completist.
You can't easily have a comedy show like Cheers or Married... With Children, in 2026, because there's nothing to keep you in the "next episode" loop in the structure, so you end up with comedies with long-running arcs like Schitt's Creek.
The last set of sitcoms that were immune to this were probably of the Brooklyn 99, Cougartown and Modern Family era - there were in-jokes for the devotees, but you could pick up an episode easily mid-series and just dive in and not be totally lost.
Interesting exception: Tim Allen has managed to get recommissioned with an old style format a couple of times, but he's had to make sure he's skewing to an older audience (read: it's a story of Republican guys who love hunting), for any of it to make sense to TV execs.
However... scripting requires (in my experience) different ergonomics to shippable software. I can't quite put my finger on it, but bash feels very scriptable, Go feels very shippable, Python is somewhere in the middle, Ruby is closer to bash, and Rust is up near Go on the shippable end.
Good scripting is a mixture of having OS-level constructs available in the syntax I'm using (bash is obviously just OS commands with syntactic sugar for conditionals, loops and variables), and the kinds of problems where I don't feel I need a whole lot of tooling: LSPs, test coverage, whatever. It's languages that encourage quick, dirty, throwaway code and let me get that one-off job done that the guy in sales needs on a Thursday so we can close the month out.
Go doesn't feel like that. If I'm building something in Go I want to bring tests along for the ride, I want to build a proper build pipeline somewhere, I want a release process.
I don't think I've thought about language ergonomics quite like this before; I'm curious what others think.
Talking about Python being "somewhere in the middle" - I had a demo of a simple webview GTK app I wanted to run on a vanilla Debian setup last night, so I did the canonical-thing-of-the-month and used uv to instantiate a venv and pull the dependencies. Then I attempted to run the code... mayhem. Errors indicating that the right things were in place but that the code still couldn't run (?), and finally Python core dumped. OK. This is (in some shape or form) what happens every single time I give Python a fresh go for an idea. Granted, Go is more verbose (and I don't particularly like the go.mod system either), but once things compile, they run. They don't half-run or require some OS-specific hack.
Gtk makes that simple python program way more complex since it'll need more than pure-python dependencies.
It's really a huge pain point in Python. Pure-Python dependencies are amazingly easy to use, but there are a lot of packages that depend on either C extensions that need to be built or on OS-level dependencies. It's gotten better with wheels and manylinux builds, but you can still shoot your foot off pretty easily.
Python is near the top in languages that have given me trouble in other peoples' production software. Everything can be working fine and then one day the planets fall out of alignment or something and the Python portion of the software breaks and the fix is as clear as mud.
I'm pretty sure the gtk dependencies weren't built by Astral, which, yes, unfortunately means that it won't always just work, as they streamline their Python builds in... unusual ways. A few months ago I had a similar issue running a Tkinter project with uv, then all was well when I used conda instead.
Yeah, this is exactly the overall reality of the ecosystem, isn't it? That said, I do hope uv succeeds in its unification effort; there's nothing worse than relying on a smattering of different package managers and build streams to get basic stuff working. It's like a messy workshop: it works, but there's an implicit cost in terms of the lack of clarity and focus for the user. It's a cost I'm not willingly paying.
It may not be the grand unifier if they aren't willing to compromise. Currently I'd say conda is the "grand unifier", giving users 100% what they ask for artifacts-wise, albeit rather slowly. On the other hand, uv provides things super fast, but those things may break 5% of the time in unusual ways on unusual configs. I have no issue using both for the fullest experience.
I've had similar issues with anaconda, once upon a time. I've hit a critical roadblock that ruined my day with every single Python dependency/environment tool except basic venv + requirements.txt, I think. That gets in the way the least but it's also not very helpful, you're stuck with requirements.txt which tends to be error-prone to manage.
For me, the dividing line is how compact the language representation is, specifically if you can get the job done in one file or not.
I have no doubt that there are a lot of Go jobs that will fit in a 500-line script, no problem. But the language is much more geared towards modules of many files that work together to define user-defined types, handle multi-threading, and more. None of that is a concern for bash, and Python ships enough native types to do most jobs without needing custom ones.
If you need a whole directory of code to make your bang-line-equipped Go script work, you may as well compile that down and install it to /usr/local/bin.
Also, the lack of bang-line support in native Go suggests that everyone is kinda "doing it wrong". The fact that `go run` just compiles your code to a temporary binary anyway points in that direction.
Have to disagree, "technically" yes, both are interpreted languages, but the ergonomics and mental overhead of doing certain things are wildly different:
In python, doing math or complex string or collection operations is usually a simple one-liner, but calling shell commands or other OS processes requires fiddling with the subprocess module, writing ad-hoc streaming loops, etc - don't even start with piping several commands together.
Bash is the opposite: As long as your task can be structured as a series of shell commands, it absolutely shines - but as soon as you require custom data manipulation in any form, you'll run into awkward edge cases and arbitrary restrictions - even for things that are absolutely basic in other languages.
> In python, ..., calling shell commands or other OS processes requires fiddling with the subprocess module, writing ad-hoc streaming loops, etc - don't even start with piping several commands together.
The subprocess module is horrendous, but even if it were great, bash would be simpler. Just think about trying to create a pipe of processes in Python without the danger of blocking.
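For anyone who hasn't had the pleasure, here's a minimal sketch of the bash one-liner `ps aux | grep python` done with subprocess (assuming `ps` and `grep` are on the box) - it's the pattern from the standard library docs, and the manual stdout plumbing is exactly the fiddliness being described:

```python
import subprocess

# Equivalent of: ps aux | grep python
ps = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE)
grep = subprocess.Popen(["grep", "python"], stdin=ps.stdout, stdout=subprocess.PIPE)
ps.stdout.close()  # drop our copy so ps gets SIGPIPE if grep exits early, instead of blocking
output, _ = grep.communicate()
print(output.decode())
```

Several lines of wiring (plus the close() gotcha) for something bash expresses with a single pipe character.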
bash to me is the C++ of scripting. There are a bunch of arcane rules which all have to be followed; forget a single one of them and you've got a vulnerability.
Maybe the ergonomics of writing code is less of a problem if you have a quick way of asking an LLM to do the edits? We can optimize for readability instead.
More specifically, for the readability of code written by an LLM.
git on the command line is the power tool. The VS Code plugin is the training wheels version.
The tools TFA discusses are powerful, but they have a learning curve. VS Code has a flatter learning curve, but it's up to you to decide whether it's abstracting away things you actually need to understand.
All abstractions serve a purpose, but they're also always leaky. To extend your assembler analogy: sometimes you actually do need to understand what's happening on the stack and the heap; most people never do.
Use what works for you, but don't think that abstractions will be enough for every scenario. Sometimes you're going to need to get under the hood and go deep, and if all you ever do is hand-wave it away and hope your IDE/framework/chosen programming language interpreter or compiler is going to just "sort it out" for you... well, good luck with that, and I hope it works out for you. It's just not my lived experience after 30+ years in the industry.
> git on the command line is the power tool. The VS Code plugin is the training wheels version.
I don't disagree with your underlying point, but git is perhaps the worst example. It is a horrible tool in several ways. Powerful yes, but it pays too large a price in UX.
I've only ever used git through the CLI as well, but having switched to jujutsu (also CLI) I am not going back. It is quite eye-opening how much simpler git should be, for the average user (I realize "average user" is doing some heavy-lifting here -- git covers an enormous number of diverse use cases).
What jujutsu-CLI is for me (version control UX that just works) might be VSCode's GUI git integration for other people, or magit, or GitButler, or whatever other GUI or TUI.
Who cares about training wheels? If the real deal is a unicycle with one pedal and no saddle, I will keep using training wheels, thank you very much.
Sometimes you need to debug the observability stuff a little.
As a general rule, ssh'ing into prod is a terrible idea. Getting into a pre-prod box to figure out why metrics aren't getting pushed and trying something quickly before you go back to making the changes you need to push into the repo, less so.
Part of me thinks that in 2026 (and onwards), I should make the most of the M4 processor I have in my hands, the GPUs and superb screen I will always have access to, and have a programming environment that is absolutely dripping in luxury baubles compared to what I started out with in the 1990s.
And then I frequently throw open a terminal and start up tmux with neovim in one window and a command line for git and the like in another.
There is something to understanding the tools and the process - the author is kind of right there - but there is also just something about the ergonomics and speed of it all. There is more flexibility to extend easily, to customise, to make my environment fit using these more "primitive" tools than there is in taking an opinionated stack somebody else has designed.