
Isn’t this kind of thing the story of tech though?

Languages like Python and Java come around, and old-school C engineers grouse that the kids these days don’t really understand how things work, because they’re not managing memory.

Modern web-dev comes around and now the old Java hands are annoyed that these new kids are just slamming NPM packages together and polyfills everywhere and no one understands Real Software Design.

I actually sort of agree with the old C hands to some extent. I think people don’t understand how a lot of things actually work. And it also doesn’t really seem to matter 95% of the time.




I don't think the value of senior developers is so much in knowing how more things work, but rather that they've learnt (over many projects of increasing complexity) how to design and build larger, more complex systems, and this knowledge mostly isn't documented for LLMs to learn from. An LLM can do the LLM thing and copy designs it has seen, but this is cargo-cult behavior: copying the surface form of something without understanding why it was built that way, or recognizing when a different design would have been better for any of a myriad of reasons.

This is really an issue for all jobs with a large planning and reasoning component, not just software development. Most of the artifacts available to train an LLM on are the end result of reasoning, not the reasoning process itself (the day-by-day, hour-by-hour diary of the thought process of someone exercising their journeyman skills). As far as software is concerned, even the end result of reasoning is going to have very limited availability when it comes to large projects, since there are relatively few large projects that are open source (things like Linux, gcc, etc). Most large software projects are commercial and proprietary.

This is really one of the major weaknesses of LLM-as-AGI, or LLM-as-human-worker-replacement - their lack of ability to learn on the job and pick up a skill for themselves as opposed to needing to have been pre-trained on it (with the corresponding need for training data). In-context learning is ephemeral and anyways no substitute for weight updates where new knowledge and capabilities have been integrated with existing knowledge into a consistent whole.


Just because these abstraction layers emerged in the past does not mean it will continue to happen that way. For example, many no-code tools promised just that, but they never caught on.

I believe that there's an "optimal" level of abstraction, which, for the web, seems to be something like the modern web stack of HTML, JavaScript and some server-side language like Python, Ruby, Java, or JavaScript.

Now, there might be tools that make a developer's life easier, like a nice IDE, debugging tools, linters, autocomplete and also LLMs to a certain degree (which, for me, still is a fancy autocomplete), but they are not abstraction layers in that sense.


I love that you brought no-code tools into this, because I think it's interesting that they never really worked.

My guess is: on one side, things like Squarespace and Wix get really, really good at building sites that don't feel like Squarespace and Wix (I'm not sure I'd want to be a pure "website dev" right now, although I think Squarespace squashed a lot of that long ago), and on the other side, very nice tooling for "real engineers" (whatever that means).

I'm pretty handy with tech - I mean, the last time I built anything real was the '90s, but I know how most things work pretty well. I sat down to ship an app last weekend; with no sleep and Monday rolling around, GCP was giving me errors, and I hadn't realized that one of the files the LLMs wrote looked like code but was all placeholder.

I think this is basically what the Anthropic report says: automation issues happen via displacement, and displacement is typically fine, except that the displacement this time is happening very rapidly (I read in a different report that roughly 80 years' worth of traditional displacement is expected to happen in ~10 years with AI).


Excel is a "no-code" system and people seem to like it. Of course, sometimes it tampers with your data in horrifying ways because something you entered (or imported into the system from elsewhere) just happened to look kinda like a date, even though it was intended to be something completely different. So there's that.


> Excel is a "no-code" system and people seem to like it.

If you've found an Excel guru who doesn't spend most of their time in VBA, you've had a really unusual experience.


I've worked in finance for 20 years and this is the complete opposite of my experience. Excel is ubiquitous and drives all sorts of business processes in various departments. I've seen people I would consider Excel gurus, in that they are able to use Excel much more productively than normal users, but I've almost never seen anyone use VBA.


Huge numbers of accountants and lawyers use Excel heavily knowing only the built-in formula language. They will have a few "gurus" sprinkled around who can write macros, but these are used sparingly because the macros are a black box and make it harder to audit the financial models.


Excel is a programming system with pure functions, imperative code (VBA/Python recently), database (cell grid, sheets etc.) and visualization tools.

So, not really "no-code".


That’s technically correct but it’s also wrong.

The no-code part of Excel is that most functions are implemented for the user, and the user doesn't have to know anything about software development to create what they need, nor do they need a software developer to do it for them.


Excel is hardly "no-code". Any heavy use of Excel I've seen uses formulas, which are straight-up code.


But any heavy use of "no-code" apps also ends up looking this way, with "straight-up code" behind many of the wysiwyg boxes.


Right, but "no-code" implies something: programming without code. Excel is not that in any fashion. It's either programming with code or an ordinary spreadsheet application without code. You'd really have to stretch your definitions to consider it "no-code" in a way that wouldn't apply to pretty much any office application.


I would disagree. Every formula you enter into a cell is "code". Moreover, more complex worksheets require VBA.


> Modern web-dev comes around and now the old Java hands are annoyed that these new kids are just slamming NPM packages together and polyfills everywhere and no one understands Real Software Design.

The real issue here is that a lot of the modern tech stacks are crap, but won for other reasons, e.g. JavaScript is a terrible language but became popular because it was the only one available in browsers. Then you got a lot of people who knew JavaScript so they started putting it in places outside the browser because they didn't want to learn another language.

You get a similar story with Python. It's essentially a scripting language and poorly suited to large projects, but sometimes large projects start out as small ones, or people (especially e.g. mathematicians in machine learning) choose a language for their initial small projects and then lean on it again because it's what they know even when the project size exceeds what the language is suitable for.

To slay these beasts we need languages that are actually good in general but also good at the things that cause languages to become popular, e.g. something better than JavaScript that can run in browsers, and languages with good support for large projects that are also easy to use for novices and small projects, so people don't keep starting out in a place they don't want to end up.


With WebAssembly you can write in any language, even C++.

Unfortunately it doesn't expose the DOM, so you still need JavaScript.
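
Concretely, the usual workaround is for the compiled code to call back into a small piece of JavaScript glue that touches the DOM on its behalf. A rough sketch of what that can look like with Emscripten's EM_JS macro (the set_heading name and the h1 selector are just for illustration; depending on the Emscripten version you may also need -sEXPORTED_RUNTIME_METHODS=UTF8ToString):

    #include <emscripten.h>

    /* C compiled to WebAssembly can't reach the DOM directly, so EM_JS
       declares a JavaScript snippet that C code can call like a normal
       function. UTF8ToString copies the C string out of linear memory. */
    EM_JS(void, set_heading, (const char *msg), {
      document.querySelector("h1").textContent = UTF8ToString(msg);
    });

    int main(void) {
      set_heading("Hello from C via WebAssembly");
      return 0;
    }

Built with something like "emcc page.c -o page.html", the DOM manipulation is still written in JavaScript; it's just declared right next to the C that drives it.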


My son is a CS major right now, and since I've been programming my whole adult life, I've been keeping an eye on his curriculum. They do still teach CS majors from the "ground up" - he took system architecture, assembly language and operating systems classes. While I kind of get the sense that most of them memorize enough to pass the tests and get their degree, I have to believe that they end up retaining some of it.


I think this is still true of a solid CS curriculum.

But it’s also true that your son will probably end up working with boot camp grads who didn’t have that education. Your son will have a deeper understanding of the world he’s operating in, but what I’m saying is that from what I’ve seen it largely hasn’t mattered all that much. The bootcampers seem to do just fine for the most part.


Yes, they remember the concepts, mostly. Not the details. But that's often enough to help with reasoning about higher-level problems.


I always considered my education to be a "Table of Contents" listing for what I'd actually learn later


And also these old C hands don't seem to get paid (significantly) more than a regular web-dev who doesn't care about hardware, memory, performance etc. Go figure.


Pay is determined by the supply and demand for labor, which encompass many factors beyond the difficulty of the work.

Being a game developer is harder than being an enterprise web services developer. Who gets paid more, especially per hour?


They do where I'm from, and spend most of their time cleaning up the messes that the regular web-devs create...


The real hardcore experts should be writing libraries anyway, to fully take advantage of their expertise in a tiny niche and to amortize the cost of studying their subproblem across many projects. It has never been easier to get people to call your C library, right? As long as somebody can write the Python interface…

Numpy has delivered so many FLOPs for BLAS libraries to work on.

Does anyone really care if you call their optimized library from C or Python? It seems like a sophomoric concern.
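
The whole point of an interface like BLAS is that the same optimized routine sits underneath every caller. A rough sketch of hitting it directly from C via the CBLAS interface (e.g. as shipped by OpenBLAS; NumPy's float64 matmul typically bottoms out in the same dgemm):

    #include <stdio.h>
    #include <cblas.h>   /* e.g. from OpenBLAS; link with -lopenblas */

    int main(void) {
        /* C = alpha*A*B + beta*C, row-major 2x2 matrices for brevity */
        double A[4] = {1, 2, 3, 4};
        double B[4] = {5, 6, 7, 8};
        double C[4] = {0, 0, 0, 0};

        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                    2, 2, 2,     /* M, N, K */
                    1.0, A, 2,   /* alpha, A, lda */
                    B, 2,        /* B, ldb */
                    0.0, C, 2);  /* beta, C, ldc */

        printf("%g %g\n%g %g\n", C[0], C[1], C[2], C[3]); /* 19 22 / 43 50 */
        return 0;
    }

Whether that call arrives from a C program or from a NumPy array expression, the hot loop is the same hand-tuned kernel, which is rather the point.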


I think the problem is that, with the over-reliance on LLMs, the expertise to write the foundational libraries that even other languages rely on is going away. That is exactly the problem.


Yeah, every programmer should write at least a CPU emulator in their language of choice; it's such an undervalued exercise that will teach you so much about how stuff really works.
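
It doesn't have to be a real ISA, either; even a toy accumulator machine gets the fetch-decode-execute idea across. A minimal sketch in C (the opcodes and the little program are made up for illustration):

    #include <stdio.h>

    /* A toy accumulator machine: one accumulator, a made-up instruction
       set, and a flat memory of integers. */
    enum { LOAD, ADD, STORE, JNZ, HALT };

    struct insn { int op, arg; };

    static void run(const struct insn *prog, int *mem) {
        int acc = 0, pc = 0;
        for (;;) {
            struct insn i = prog[pc++];
            switch (i.op) {
            case LOAD:  acc = mem[i.arg];    break;  /* acc <- mem[arg]    */
            case ADD:   acc += mem[i.arg];   break;  /* acc += mem[arg]    */
            case STORE: mem[i.arg] = acc;    break;  /* mem[arg] <- acc    */
            case JNZ:   if (acc) pc = i.arg; break;  /* branch if acc != 0 */
            case HALT:  return;
            }
        }
    }

    int main(void) {
        /* Multiply by repeated addition: mem[2] = mem[0] * mem[1] = 3 * 4 */
        int mem[] = {3, 4, 0, -1};
        struct insn prog[] = {
            {LOAD, 2}, {ADD, 1}, {STORE, 2},   /* total += 4       */
            {LOAD, 0}, {ADD, 3}, {STORE, 0},   /* counter -= 1     */
            {JNZ, 0},                          /* loop while != 0  */
            {HALT, 0},
        };
        run(prog, mem);
        printf("%d\n", mem[2]);                /* prints 12 */
        return 0;
    }

Once that loop makes sense, assembly listings, pointers, and even pipelining stop feeling like magic.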


You can go to the next step. I studied computer engineering, not computer science, in college. We designed our own CPU and then implemented it in an FPGA.

You can go further and design it out of discrete logic gates. Then write it in Verilog. Compare the differences and which made you think more about optimizations.


"in order to bake a pie you must first create the universe", at some point, reaching to lower and lower levels stops being useful.


Sure.

Older people are always going to complain about younger people not learning something that they did. When I graduated in 1997 and started working, I remember there were some topics that hadn't been taught, and the older engineers were shocked I didn't know them from college.

We keep creating new knowledge. It is impossible to fit everything into a 4 year curriculum without deemphasizing some other topic.

I learned Motorola 68000 assembly language in college. I talked to a recent computer science graduate and he had never seen assembly before. I also showed him how I write static HTML in vi the same way I did in 1994 for my simple web site and he laughed. He showed me the back end to their web site and how it interacts with all their databases to generate all the HTML dynamically.


The universe underneath the pie is mostly made up of invariant laws that must be followed.

The OS, libraries, web browser, runtime, and JavaScript framework underneath your website are absolutely riddled with bugs, and knowing how to identify and fix them makes you a better engineer. Many junior developers get hung up on the assumption that the function they're calling is perfect, and are incapable of investigating whether that's the truth.

This is true of many of the shoulders-of-giants we're standing on, including the stack beneath python, rust, whatever...


In fairness, creating a universe is pretty useful.


When I was a kid I "wrote" (mostly copied from a programming magazine) a 4-bit CPU emulator on my TI-99/4a. Simple as it was, it was the light bulb coming on for me about how CPUs actually worked. I could then understand the assembly language books that had been impenetrable to me before. In college when I first started using "C", pointers made intuitive sense. It's a very valuable exercise.


Notably, I don't think there was a mass disemployment of "old C hands". They just work on different things.


I wonder about this too - and also wonder what the difference of order is between the historical shifts you mention and the one we're seeing now (or will see soon).

Is it 10 times the "abstracting away complexity and understanding"? 100, 1000, [...]?

This seems important.

There must be some threshold beyond which (assuming most new developers are learning using these tools) fundamental ability to understand how the machine works and thus ability to "dive in and figure things out" when something goes wrong is pretty much completely lost.


> There must be some threshold beyond which (assuming most new developers are learning using these tools) fundamental ability to understand how the machine works and thus ability to "dive in and figure things out" when something goes wrong is pretty much completely lost.

For me this happened when working on some Spring Boot codebase thrown together by people who obviously had no idea what they were doing (which maybe is the point of Spring Boot; it seems to encourage slopping a bunch of annotations together in the hope that it will do something useful). I used to be able to fix things when they went wrong, but this thing is just so mysterious and broken in such ridiculous ways that I can never seem to get to the bottom of it.


> Languages like Python and Java come around, and old-school C engineers grouse that the kids these days don’t really understand how things work

Everything has a place: you most likely wouldn't write an HPC database in Python, and you wouldn't write a simple CRUD recipe app in C.

I think the same thing applies to using LLMs: you don't use the code they generate to control a power plant or fly an airplane. You use it for building the simple CRUD recipe app where the stakes are essentially zero.


$1 for the pencil, $1000 for the line.

That’s the 5% when it does matter.


Yes this is what people like to think. It’s not really true in practice.


And that last 5% is what you're paying for


But not really. Looking around my shop, I’m probably the only one around who used to write a lot of C code. No one is coming to ask me about obscure memory bugs. I’m certainly not getting paid better than my peers.

The knowledge I have is personally gratifying to me because I like having a deeper understanding of things. But I have to tell you I thought knowing more would give me a deeper advantage than it has in actual practice.


You're providing value every time you kill a bad idea "because things don't actually work that way" or shave a loop; you're just not tracking the value, and neither is your boss.

To your employer, hiring people who know things (i.e. you) has given them a deeper advantage in actual practice.


I would argue that your advantage right now is that YOU are in the one position they can't replace with LLMs, because your work requires exact, fine-grained expertise about pointers and everything around them. You might be getting the same pay as your peers, but you also carry additional stability.


Is that because the languages being used at your shop have largely abstracted away memory bug issues? If you were to get a job writing embedded systems, or compilers, or OSes, wouldn't your knowledge be more highly valued and sought after (assuming you were one of the more senior devs)?


If you have genuine systems programming knowledge, usually the answer is to innovate on a particular toolchain or ship your own product (I understand you may not like business stuff though.)


LLMs are a much bigger jump in productivity than moving to a high level language.


Lately, I've been asking ChatGPT for answers to problems that I've gotten stuck on. I have yet to receive a correct answer from it that actually increases my productivity.


I don't know what to say.

I've been able to get code working in libraries that I'm wholly unfamiliar with pretty rapidly by asking the LLM what to do.

As an example, this weekend I got a new mechanical keyboard. I like to use caps+hjkl as arrows and don't want to remap in software because I'll connect to multiple computers. Turns out there's a whole open source system for this called QMK that requires one to write C to configure the keyboard.

It's been over a decade since I touched a Makefile and I never really knew C anyway, but I was able to get the keyboard configured, and also put some custom RGB lighting on it, pretty easily by going back and forth with the LLM.


It is just very random. LLMs help me write a synthesizer using an odd synth technique in an obscure musical programming language with no problem, and fix my broken Linux system, no problem, but then they can't do anything right with the Python library pyro. I think that is why people have such different experiences. It all depends, somewhat randomly, on how what you want to solve lines up with what the models are good at.


At least for the type of coding I do, if someone gave me the choice between continuing to work in a modern high-level language (such as C#) without LLM assistance, or switching to a low-level language like C with LLM assistance, I know which one I would choose.


Likewise, under no circumstances would I trade C for LLM-aided assembly programming. That sounds hellish. Of course this could (probably will?) change at some point. Innovations in higher-level languages aren't returning productivity improvements at anywhere close to the same rate as LLMs are, and in any case LLMs probably benefit from improvements to higher-level languages as well.


Programming is an interface to the machine. AI, even what we have now (LLMs, agents, RAG), will absorb all of that. It has many flaws but is still much better than most programmers.

All future programmers will be using it.

As for the programmers that don't want to use it: I think there will be literally billions of lines of unbelievably bad code, generated by these generation-1-through-100 AIs and by junior programmers, that will need to be corrected and fixed.


> It has many flaws but is still much better than most programmers.

This says more about most programmers than about any given LLM.



