In my case it's coding real-world apps that people use and pay money for. I no longer personally type most of my code. Instead, I describe stuff or write pseudocode that LLMs end up converting into the real thing.
It's very good at handling the BS part of coding, but it's also very good at knowing things that I don't know. I recently used it to hack a small Bluetooth printer that requires its own iOS app to print: using DeepSeek and ChatGPT, I was able to reverse engineer the printer's communication and then create an app that will print whatever I want from my macOS laptop.
Before AI I would have had to study how Bluetooth works; now I don't have to. Instead, I use my general knowledge of protocols and communications, describe the problem to the machine, and ask for ideas. Then I try things and ask about the stuff I noticed but don't understand, figure out how this particular device works, describe that to the machine, and ask it to generate code that does the thing I discovered. LLMs are amazing at filling the gaps in patchy knowledge, like my knowledge of Bluetooth. Because I don't know much about Bluetooth, I ended up creating a CRUD for Bluetooth, because that's what I needed when trying to communicate with and control my Bluetooth devices (it's also what I'm used to from web tech). I'm a bit embarrassed about it, but I think I will release it commercially anyway.
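That CRUD-for-Bluetooth idea maps fairly naturally onto GATT, where a device exposes characteristics you read and write by UUID. A minimal sketch of what such a wrapper could look like (both classes here are hypothetical; a real version would sit on top of a BLE library such as bleak, whose `read_gatt_char`/`write_gatt_char` calls this mirrors):

```python
class FakeGattClient:
    """Stands in for a real BLE client; stores characteristic values by UUID."""
    def __init__(self):
        self._chars = {}
    def read_gatt_char(self, uuid):
        return self._chars[uuid]
    def write_gatt_char(self, uuid, data):
        self._chars[uuid] = bytes(data)

class GattCrud:
    """Web-style CRUD verbs expressed as GATT characteristic reads/writes."""
    def __init__(self, client):
        self.client = client
    def create(self, uuid, payload):   # "create" and "update" are both writes
        self.client.write_gatt_char(uuid, payload)
    def read(self, uuid):
        return self.client.read_gatt_char(uuid)
    def update(self, uuid, payload):
        self.client.write_gatt_char(uuid, payload)
    def delete(self, uuid):            # GATT has no real delete; write empty
        self.client.write_gatt_char(uuid, b"")

crud = GattCrud(FakeGattClient())
crud.create("2a00", b"\x1b\x40hello")  # e.g. an init sequence plus text
print(crud.read("2a00"))  # b'\x1b@hello'
```

The web-style verbs are a leaky fit (GATT has no delete, and notifications don't map at all), but as a working abstraction for "poke this characteristic, read that one" it gets you surprisingly far.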
If I have a good LLM at hand, I don't need specialised knowledge of frameworks or tools. A general understanding of how things work, and building up from there, is all I need.
I see; for single-operator, no-customers products it works nicely. You may find you use it less and less, and that you actually require that Bluetooth knowledge eventually as you grow the product.
LLMs so far seem to be good at developing prototype apps. But most of my projects already have codegen and scaffolding tools, so I guess I don't get that use out of them.
I predict that once you release your embarrassing app, you will find all the corner cases and domain secrets come rearing out, with the LLM having little ability to help you (especially with Bluetooth).
The Bluetooth app thing is just an example of LLMs helping me build something I don't have beyond-basics knowledge of.
For other stuff, I still find it very useful because why would I bother to code something non-novel when I can just tell the LLM what I need?
For example, if I need code that finds the device a given characteristic belongs to (Bluetooth stuff, again), I can just tell the LLM to write it for me. It doesn't take a genius to write such code; it's elemental stuff, and I would rather not spend my working memory on remembering the structures and variable names. I copy-paste the current class that handles the Bluetooth comms, tell it that I need a function for sending data to the printer, and it gives me back the result. There's no art in writing such code; it's standard code against an API, and I would prefer not to bother with it.
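As a concrete (invented) instance of that kind of throwaway code: given a discovery snapshot mapping device addresses to the characteristic UUIDs they expose, finding which device a characteristic belongs to is a few lines. The addresses and the `ff02` vendor UUID below are made up; `2a00` and `2a19` are the standard Device Name and Battery Level characteristic UUIDs.

```python
def find_device_with_char(devices, char_uuid):
    """Return the address of the first device exposing char_uuid, else None."""
    for address, chars in devices.items():
        if char_uuid in chars:
            return address
    return None

# Snapshot as you might collect it while enumerating services during discovery.
devices = {
    "AA:BB:CC:DD:EE:01": {"2a00", "2a19"},  # generic device: name, battery level
    "AA:BB:CC:DD:EE:02": {"2a00", "ff02"},  # printer with a vendor write characteristic
}
print(find_device_with_char(devices, "ff02"))  # AA:BB:CC:DD:EE:02
```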
“Before AI I would have to study how Bluetooth works now I don't have to.”
And
“It's very good at handling the BS part of coding…”
This is the part that I think is difficult in a team situation.
Learning and understanding is the important part, and certainly isn’t BS.
I understand that it really can make it seem like velocity has increased when you really are shipping things that more or less “work”, but it’s really a good practice to understand your code.
I’ve had to spend a significant amount of time fixing work that was admittedly generated using AI by other engineers, and I really fear engineers are beginning to trade deep understanding for the high of getting something that “works” with little effort.
It might “work” but you might be ignoring the work everyone around you is doing to clean up your brittle code that doesn’t scale and wasn’t thought through at inception.
You have an entirely valid worry, and I get a bit scared at my own use of AI because of this. I fear that dev jobs might go away or become third-world-only jobs like electronics manufacturing, but in the meantime it's scary how much it atrophies your mind. At the same time, it has opened up a universe of answers to questions I wouldn't normally ask because the bar was too high. Everyone seems to have their own unique stories.
For example, just today I dumped a crash log from the Mac version of Microsoft Remote Desktop into it. This damn app locks up 10 times a day for me, causing a "Force Quit" event and a subsequent crash dump. Normally, what can I do with that crash dump other than send it off to Apple/Microsoft? It identified where it thought the crash was coming from: excessive right-clicking causing some sort of foundational error in their logic. Avoiding right-clicking has solved the issue for me. Now that I write this out: I could have spent hours upon hours finding a needle in a haystack, and that would probably have made me a better developer, but the bar is too high; there is too much other work I have to get done to chase this. Instead I would have just lived with it. Now I have some closure at least.
Again, it seems like everyone has their own unique stories. Is AI taking everything over? Not yet. Can I go back to pre-AI? No; it's like going back to Windows 95.
It is effective because you can spend your mental energy on the things that matter, things that make a difference.
Code quality actually doesn't matter when you remove the human from the loop as long as it works correctly because it becomes something made by a machine to be interpreted by a machine.
Code isn’t a binary scale of works or doesn’t - there is inefficient code and insecure code and everything else in between that still technically “works” - but a lack of understanding will eventually cause these “working” solutions to catch up to you.
You can always revisit that part of the code if it doesn't perform. For the vast majority of code running on consumer devices, there's no difference between a smart implementation and a mediocre implementation. LLMs are great at being mediocre by default.
As for security, that mostly stems from the architecture. LLMs' mediocrity also helps with following industry conventions and best practices.
In my case I never have the code written all at once; instead I make LLMs write pieces that I put together myself. I never got used to Copilot or Cursor; I only feel in control with the chat interface.
Not understanding how Bluetooth works while building a Bluetooth thing seems like… a problem, though. Like, there are going to be bugs, and you’re going to have to deal with them, and that is where the “just ask the magic robot” approach tends to break down.
Funny enough, you already don't have access to the low-level radio, so building a "Bluetooth thing" is just about dealing with some libraries and APIs.
Bugs happen, but they're not that different from any other type of bug. Also, you end up learning about Bluetooth as bugs and other unexpected behavior happen. The great thing about LLMs is that they are interactive; for example, when collecting Bluetooth packets for analysis, I ended up learning that communicating over Bluetooth is a bit like talking through a middleman: some packet types only give instructions to the Bluetooth chip, while others actually carry communication with the connected device.
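That middleman split is visible right in the packet framing. In the standard HCI H4 (UART) transport, the first byte of every packet indicates whether it is an instruction to the local chip (command/event) or data exchanged with the remote device (ACL). A tiny classifier using those spec-defined type indicators:

```python
def h4_packet_kind(packet: bytes) -> str:
    """Classify an HCI H4 transport packet by its leading type-indicator byte."""
    kinds = {
        0x01: "command",   # host -> controller: instructions to the Bluetooth chip
        0x02: "acl-data",  # data actually exchanged with the connected device
        0x03: "sco-data",  # synchronous audio data
        0x04: "event",     # controller -> host: replies/notifications from the chip
    }
    return kinds.get(packet[0], "unknown")

# HCI_Reset command: type 0x01, opcode 0x0C03 (little-endian), no parameters.
print(h4_packet_kind(bytes([0x01, 0x03, 0x0C, 0x00])))  # command
# An ACL data packet starts with 0x02 -- this is the "real" device traffic.
print(h4_packet_kind(bytes([0x02, 0x40, 0x00, 0x05, 0x00])))  # acl-data
```

When eyeballing a packet capture, filtering on that first byte is often enough to separate chip housekeeping from the traffic you're trying to reverse engineer.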
Using an LLM to code something you don't understand is much different from Googling something and copy-pasting a snippet from Stack Overflow, because you can ask for an instant explanation, and for modifications to test edge cases and other ideas.
The only part I would quibble with is the fear that superficial AI-generated code becomes widespread. It's not that I think this won't happen, and I wouldn't want it on my team, but I think it could actually increase demand for competent software engineers.
I got into coding about a decade ago when cheap outsourcing had been all the rage for a number of years. A lot of my early experience was taking over very poorly written apps that had started off with fast development and then quickly slowed down as all of the sloppy shortcuts built up and eventually ground development to a halt. There's a decent chance LLMs lead to another boom in that kind of work.
For mass production/scalability, I absolutely agree with you.
For products that won't be scaled, I imagine it becomes just another abstraction layer, with the cost of human input outweighing the cost of the additional infrastructure / beefing up hardware to support the inefficiencies created.
Oh come on, I'm not an "AI believer", but it regularly does things for me like write complex SQL queries that I can then verify are correct. Just something like that will often save me 20-40 minutes over doing it manually. There is _something_ there, even if it's not going to replace the workforce anytime soon.
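For what it's worth, that verification step is easy to make concrete: run the generated query against a tiny fixture where you can compute the answer by hand. A sketch with an in-memory SQLite database (the schema and query here are invented for illustration):

```python
import sqlite3

# Hypothetical schema and fixture data small enough to check by hand.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, total REAL);
    INSERT INTO orders VALUES (1, 'ana', 10.0), (2, 'ana', 5.0), (3, 'bo', 7.5);
""")

# The kind of query an LLM might hand back: per-customer totals above a threshold.
generated_sql = """
    SELECT customer, SUM(total) AS spent
    FROM orders
    GROUP BY customer
    HAVING spent > 6
    ORDER BY spent DESC
"""
print(conn.execute(generated_sql).fetchall())  # [('ana', 15.0), ('bo', 7.5)]
```

If the result matches what you worked out on paper, you've verified the query's logic far faster than writing it from scratch.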
It's like a CNC machine but for coding.