"code is extremely fragile in a way human language is not"
Very well put. Change a single character in a working complex program and it may start doing something completely different, or, much worse, subtly different.
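
To make that concrete, here's a toy C sketch (not from the thread, just an illustration): the two loops below differ by a single '=' character, and the second one quietly reads past the end of the array instead of failing loudly.

    #include <stdio.h>

    /* Two versions of the same loop, differing by one character. */
    int sum(const int *xs, int n) {
        int total = 0;
        for (int i = 0; i < n; i++)    /* correct bound */
            total += xs[i];
        return total;
    }

    int sum_subtle(const int *xs, int n) {
        int total = 0;
        for (int i = 0; i <= n; i++)   /* one extra '=': reads one past the end */
            total += xs[i];
        return total;
    }

    int main(void) {
        int data[] = {1, 2, 3};
        printf("%d\n", sum(data, 3));         /* 6 */
        printf("%d\n", sum_subtle(data, 3));  /* 6 plus whatever garbage follows the array */
        return 0;
    }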
Another thing we all learned the hard way when people tried "model-driven development" (that is, tools that generated code from UML diagrams and flowcharts) is that writing code the first time is one thing; modifying it later is something else entirely.
In fairness, this does apply to natural language too.
"Let's eat, grandma!"
"Let's eat grandma!"
The difference is that we have no notion of getting people to blindly and literally follow instructions generated by AI. Having people in the execution loop creates an implicit layer of sanity-checking; plus, language is inherently ambiguous, and the reader will tend to interpret things in sensible ways even if the writer didn't fully understand what they were asking for.
Isn't that because compilers aren't written to cope with variations, though? That rigour is necessary because humans can't deal with ambiguity. A compiler written using AI could happily understand what 'int', 'itn', 'it', 'integer', 'IntyMcintyFace', and every conceivable variation mean, and still compile them all to the same machine code. Humans don't want that in a language because it makes it hard to use. AI doesn't care.
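
As a rough sketch of what such a forgiving front end might look like (the alias table and function name are invented for illustration, not taken from any real compiler):

    #include <stdio.h>
    #include <string.h>

    /* A "forgiving" lexer fragment that maps several spellings to the same keyword. */
    static const char *int_aliases[] = { "int", "itn", "it", "integer", "IntyMcintyFace" };

    int is_int_keyword(const char *token) {
        for (size_t i = 0; i < sizeof int_aliases / sizeof int_aliases[0]; i++)
            if (strcmp(token, int_aliases[i]) == 0)
                return 1;
        return 0;
    }

    int main(void) {
        printf("%d %d %d\n", is_int_keyword("itn"),
                             is_int_keyword("integer"),
                             is_int_keyword("float"));  /* prints: 1 1 0 */
        return 0;
    }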
I disagree with this. I think humans excel at ambiguity (they also excel at getting the intended meaning wrong, of course). Computers, on the other hand, take instructions literally. You could train them to probabilistically guess what a misspelling means, but whether they'll be better at it than a human remains to be seen (I personally doubt it, but this can be tested).
What irks me about the assertion that "code is text" is that it's false. Code has a textual representation, which some people (not me!) argue is not even the best one; what's clear is that text is just a representation, not the only one, and it's not the code itself. Having an AI learn to "type" code as a string of words and characters seems obtuse if the goal is AI-generated software. AI could operate at a different level; why bother with typing characters? It seems to me the wrong level of abstraction, akin to designing a robot hand and driving it with an AI to physically use a keyboard as a way to write code.
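
To illustrate what "operating at a different level" could mean, here's a toy C sketch (the names and types are invented for this example): the expression a programmer would type as "1 + 2", built directly as the kind of tree a compiler holds internally, with no text involved.

    #include <stdio.h>

    typedef enum { NODE_NUM, NODE_ADD } NodeKind;

    typedef struct Node {
        NodeKind kind;
        int value;                  /* used when kind == NODE_NUM */
        struct Node *left, *right;  /* used when kind == NODE_ADD */
    } Node;

    int eval(const Node *n) {
        return n->kind == NODE_NUM ? n->value : eval(n->left) + eval(n->right);
    }

    int main(void) {
        /* The same program as "1 + 2", built as a structure instead of typed as text. */
        Node one = { NODE_NUM, 1, NULL, NULL };
        Node two = { NODE_NUM, 2, NULL, NULL };
        Node add = { NODE_ADD, 0, &one, &two };
        printf("%d\n", eval(&add));  /* prints: 3 */
        return 0;
    }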
Actually, I think this illustrates what is wrong with the idea of AI-generated code.
If you feel uneasy about AI-generated binary code ("I want to be able to debug it if something goes wrong!") you should feel equally uneasy about AI-generated high-level language. The chances that it'll be broken in subtle ways are likely to be very similar, and I don't see good reason to believe that debugging AI-generated Haskell is going to be that much easier than debugging AI-generated executables.
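
For what it's worth, here's a small C illustration (not from the thread) of how readable high-level source can still hide a subtle bug: the classic midpoint computation below looks fine in review and only breaks for very large index values.

    #include <stdio.h>

    /* Overflows once lo + hi exceeds the range of int. */
    int midpoint_subtle(int lo, int hi) {
        return (lo + hi) / 2;
    }

    /* Same intent, no overflow. */
    int midpoint_safe(int lo, int hi) {
        return lo + (hi - lo) / 2;
    }

    int main(void) {
        int lo = 2000000000, hi = 2100000000;
        /* The first result is garbage because lo + hi overflows. */
        printf("%d vs %d\n", midpoint_subtle(lo, hi), midpoint_safe(lo, hi));
        return 0;
    }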