Two months ago I would have written the same thing.
I have 50 years experience programming. I have adapted to change over time to stay employable. And I have cultivated programming as a craft, taking pride in my experience and expertise and knowing how to write working code "by hand."
Then a couple of months ago my employer adopted AI, and I saw almost immediately that I couldn't keep up with it. I could mock it, criticize, point out the silly mistakes it makes, but I found it hard to argue with the results. The programmers using AI (Claude Code in our case) got their work done faster, and I couldn't honestly say their work looked any worse than it had before AI -- in fact I noticed more unit tests, fewer regressions, and abilities enhanced even from the more junior programmers. I had to get on the bus or get off, so I learned how to use AI and have seen my own productivity increase at least 3x.
I think we need to distinguish between programming as a craft -- the thing the author says he enjoys and won't give up -- and programming as labor someone else pays for. Anyone who has worked in the software development business for very long understands that our employers and customers don't care about our craft. They don't care about readability, maintainability, technical debt, best practices. They care about getting things done that address the business problems they have, or think they have.
For a long time we -- programmers or whatever euphemism you prefer -- have held the upper hand. Our bosses and customers had no alternative but to pay us to write code for them. They have had to put up with shockingly unpredictable processes that lead to chronic schedule and budget overruns. They have paid for low-quality software, then paid us to do it over. Only a fraction of software projects succeed (go into production and/or result in profit or cost savings), and an even smaller fraction get delivered on time and within budget. I don't mean to imply that we have done that on purpose, but programmers do like to pat themselves on the back and talk about best practices and clean code and every other method and tool "stack" we present as silver bullets, but have little to show for it, for decades.
Now AI comes along and the curtain gets pulled back, and we're indignant, threatened, defensive. A mere bot can't possibly write code as good as I can! The AI companies reek of fraud, corruption, environmental destruction.
No matter what happens to the current crop of AI companies, or how much money gets wasted or grifted, or how much pollution they cause, the LLMs and the coding tools they enable won't go away. They work, regardless of their owners and the damage they cause. Programming will look like this from now on whether we like it or not.
We can retreat into our craft, like the guy with hand tools carving tables in his garage. But I know I can't feed myself or my family with my software craftsmanship, because no one will pay for that anymore. Faced with this reality I had to decide to either leave the business (I am at retirement age anyway) or adapt and continue to get paid. We will all have to make that choice.
In my so-far limited but overall good experience with AI programming I think knowing how to program, and having a lot of experience, gives me a significant advantage over a non-technical manager or a newb programmer. I know how to tell the tool what I want it to do in clear unambiguous terms, and I know how to decide among alternative approaches, and how to judge the result. I won't call myself a "prompt engineer" anytime soon but that describes what I do now. The author can wait for this all to blow over and for programming to go back to hand-crafted code, but I don't think that will happen.
The warp drives in Star Trek seem awesome too, we're just behind schedule.
Starship excels at funneling tax money into Musk's various enterprises. Whether it actually reaches orbit, much less the moon or Mars, is merely incidental, like the sexy marketing photos for an imaginary island resort.
By the time Starship does actually achieve orbit it will likely get damaged by all of the debris SpaceX has parked around the planet.
In The Art Of Computer Programming, one of the most influential and comprehensive series of books on the subject, Knuth uses a fictional assembly language called MIX in the examples. The reader does "just run the program in their head."
In Software Tools Brian Kernighan and P.J. Plauger describe a pseudo-language called RATFOR (Rational Fortran), and then throughout the book implement RATFOR in itself.
Getting feedback while learning to program has a lot of value, but so does learning to think through code in your head. People old enough to remember when you had to wait a day to run your program and get results back (very slow turnaround) know the value of that skill, we used to call it "desk checking" -- reading through your code and running it in your head and on paper.
You don't say what kind of work you would look for, where in the large geography called the Bay Area, or what kind of living standard you expect. So no way anyone can give a meaningful answer, as reflected in the lack of responses.
I have been in the software business for over 40 years. I never thought serious programmers would use words like "describing the vibe" and "watching it manifest," or that I'd see pseudo-meaningful phrases like "dopamine hit" and "hyper-flow" in an article about programming.
The humblebrags alone turn me off: You think too fast and have too many ideas pouring out of your "speed of thought" brain. Mere software design and coding impose "drag" and "friction," like a shark forced to swim in mud with the rest of us with less-hyperactive minds. Lay off the Adderall.
I raised three kids. Children go through a period of intense curiosity where they try to make sense of the world, ask what feels like a hundred questions an hour, and present random thoughts and ideas and theories at the "speed of thought," or a speed faster than an adult can pay attention. I think of that as fun and charming with children, not aware of what they don't know, and interpreting their random theories as novel ideas. Then they grow up and learn to focus their mental faculties, and with some luck and skill discern signal from noise.
@dang I appreciate the tireless and thankless work you do on HN, sincerely, but I don't always agree.
> Don't be curmudgeonly.
I feel flattered to get identified as a curmudgeon in company with Socrates, Samuel Johnson, Mark Twain, and George Carlin. I might take offense at the implicit ageism but at my age I roll with it. HN teems with unchallenged insults directed at the elderly, grating on us old people, but in line with the HN demographic.
> Thoughtful criticism is fine, but please don't be rigidly or generically negative.
No one can "be" those things since that implies an identity. One can write in a negative tone. Accusations of rigidity and genericity would require a large sample. No one who knows me would describe me as "rigid or generically negative" so I will let that go as an ignorant judgment.
> please don't cross into personal attack.
Refuting the OP's claims can't count as personal attack, unless we hollow out all argument and rhetoric. I apologize for the Adderall comment, should have left that out.
> That is in no way allowed here.
Ironic given the personal nature of the moderator scolding, attacking my age and identity by telling me what not to "be."
I realize it's a distinction without a difference in this case, but the reason that guideline says "Don't be curmudgeonly" as opposed to "Don't be a curmudgeon" is precisely to avoid giving the impression of labeling the person themselves. It's a transient quality that anyone can have. But I get that it didn't land that way and I'm sorry.
Actually you put it quite nicely when you say: "No one can "be" those things since that implies an identity" - I quite agree, and that's exactly what that guideline was trying (but evidently failing) to avoid. To my ear it sounds analogous to the "Don't be snarky" guideline. If I say I was "being" snarky at a certain moment (or impatient or rude or what have you), it doesn't follow that I "am" a snarky (etc.) person. That's how I meant it anyhow - I hear your point and do not mean to persuade you out of it.
The Adderall comment was the worst bit, but 'You think too fast and have too many ideas pouring out of your "speed of thought" brain' was also crossing into personal attack, and so were the last two sentences comparing the other person to a child that failed to grow up. The trouble is that these sorts of swipes accrue like mercury in the bloodstream and the ecosystem can only handle so much.
@dang Thanks for the thoughtful reply. I abandoned all other social media years ago, I stick around on HN largely because of the moderation.
You probably know that I did not actually feel insulted or attacked. One of the few advantages of getting older: I care less and less what people appear to think about me, or what they say. And I don't think you intended insult. I alert at language using forms of "to be," to the annoyance of people who argue with me.
I understand how my comment can read like a personal attack, and I could have interpreted the OP more generously, or kept my mouth shut. I will try to do better. Something about the "I have too many ideas popping into my head" and "I think too fast" -- posted daily in one form or another, or spouted in co-working spaces -- sets me off. My problem, which I will blame on cognitive decline and a general feeling that I have reached the end of my road in the tech industry.
You’re fair to call out the wording. I agree some of it reads more buzzword-y than intended.
My point wasn’t that thinking fast is inherently good or that coding is “drag.” Quite the opposite: the friction of implementation used to be a form of thinking time for me. Typing forced pacing and reflection.
What I’m noticing now is that when iteration becomes extremely cheap, the bottleneck shifts from “can I build this?” to “should I build this?” That’s not about hyperactivity. It’s about decision quality.
The “dopamine” part wasn’t meant as a brag but as a caution. Fast feedback loops can encourage shallow iteration instead of deeper design if you’re not careful.
So if anything, I’m arguing for more deliberate thinking, not less.
Grift of the month. How can anyone take this seriously year after year? Didn't we get promised robots building colonies on Mars by now? Or at least Starship carrying a payload and not exploding?
I don't understand why anyone finds it interesting that a machine, or chatbot, never tires or gets demoralized. You have to anthropomorphize the LLM before you can even think of those possibilities. A tractor never tires or gets demoralized either, because it can't. Chatbots don't "dive into a rabbit hole ... and then keep digging" because they have superhuman tenacity; they do it because that's what software does. If I ask my laptop to compute the millionth Fibonacci number it doesn't sigh and complain, and I don't think it shows any special qualities unless I compare it to a person given the same job.
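To make the Fibonacci aside concrete: here's a minimal sketch (in Python, my choice of language; the original comment names no implementation) showing a machine grinding out even the millionth Fibonacci number with no "tenacity" involved, just the fast-doubling identities.

```python
def fib_pair(n: int) -> tuple[int, int]:
    """Return (F(n), F(n+1)) using the fast-doubling identities:
    F(2k)   = F(k) * (2*F(k+1) - F(k))
    F(2k+1) = F(k)^2 + F(k+1)^2
    """
    if n == 0:
        return (0, 1)
    f, g = fib_pair(n // 2)   # recursion depth is only ~log2(n)
    c = f * (2 * g - f)       # F(2k)
    d = f * f + g * g         # F(2k+1)
    return (c, d) if n % 2 == 0 else (d, c + d)

def fib(n: int) -> int:
    """The n-th Fibonacci number, with F(0) = 0, F(1) = 1."""
    return fib_pair(n)[0]

# fib(1_000_000) returns a ~200,000-digit integer in well under a second,
# and the interpreter neither sighs nor complains.
```

The point stands either way: the program has no attitude toward the work; it just runs.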
You're a machine. You're literally a wet, analog device converting some forms of energy into other forms just like any other machine as you work, rest, type out HN comments, etc. There is nothing special about the carbon atoms in your body -- there's no metadata attached to them marking them out as belonging to a Living Person. Other living-person-machines treat "you" differently than other clusters of atoms only because evolution has taught us that doing so is a mutually beneficial social convention.
So, since you're just a machine, any text you generate should be uninteresting to me -- correct?
Alternatively, could it be that a sufficiently complex and intricate machine can be interesting to observe in its own right?
If humans are machines, they are still a subset of machines, and they (among other animals) are the only machines that can be demotivated, so it is still a mistake to assume an entirely different kind of machine would have those properties.
>Other living-person-machines treat "you" differently than other clusters of atoms only because evolution has taught us that doing so is a mutually beneficial social convention
Evolution doesn't "teach" anything. It's just an emergent property of the fact that life reproduces (and sometimes doesn't). If you're going to have this radically reductionist view of humanity, you can't also treat evolution as having any kind of agency.
"If humans are machines, they are still a subset of machines and they (among other animals) are the only ones who can be demotivated and so it is still a mistake to assume an entirely different kind of machine would have those properties."
Wrong level of abstraction. And not the definition of machine.
I might feel awe or amazement at what human-made machines can do -- the reason I got into programming. But I don't attribute human qualities to computers or software, a category error. No computer ever looked at me as interesting or tenacious.