
Just yesterday I was thinking to myself "I wonder how long it will be until we start generating prompts using code and then we're just writing code again"


There probably is a cool mix of both that is better than either one separately. I'm thinking something like

  func = llm('function that sorts two input argument lists')
where llm calls OpenAI or a local LLM (cached for later use). This way you don't lose the benefits of a coding interface (versus generating all the code with OpenAI and the maintainability mess that can come with that). And you get readability through the prompt, etc. (This project is sort of in that direction already.) It's basically like writing code with a framework that is 'filled out' automatically by a coworker.
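A minimal sketch of that idea: an `llm` helper that asks a model for source code, compiles it once, and caches it by prompt. The `complete` callable and the convention that the model names its function `func` are assumptions for the demo; a stub stands in for the real API call so the example runs offline.

```python
import hashlib

_cache = {}  # prompt hash -> compiled function

def llm(prompt, complete):
    """Return a function generated from `prompt`, cached for later use."""
    key = hashlib.sha256(prompt.encode()).hexdigest()
    if key not in _cache:
        source = complete(prompt)   # the real version would call OpenAI or a local model
        namespace = {}
        exec(source, namespace)     # sketch only: a real version would sandbox/review this
        _cache[key] = namespace["func"]  # assumed convention: model names its function `func`
    return _cache[key]

# Stub standing in for the model: returns hand-written code for the demo.
def fake_complete(prompt):
    return "def func(a, b):\n    return sorted(a), sorted(b)\n"

sort_lists = llm("function that sorts two input argument lists", fake_complete)
print(sort_lists([3, 1, 2], [9, 7]))  # ([1, 2, 3], [7, 9])
```

The cache is what makes it feel like code rather than a chat: the second call with the same prompt returns the same compiled function without hitting the model.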

Worth being creative with ideas like this at least.


You're not the first to think that to yourself :)

> It may be illuminating to try to imagine what would have happened if, right from the start our native tongue would have been the only vehicle for the input into and the output from our information processing equipment. My considered guess is that history would, in a sense, have repeated itself, and that computer science would consist mainly of the indeed black art how to bootstrap from there to a sufficiently well-defined formal system.

https://www.cs.utexas.edu/users/EWD/transcriptions/EWD06xx/E...


I welcome programming English++ with open arms, so long as I can scold it when it makes mistakes and it doesn't require move semantics.

I'm joking, of course, but I do think LLMs will become part of programming language lexers in some form, if that isn't already being looked into.


Introducing non-deterministic inputs into programs is already wild.

I think once we get past "make the LLM generate #$%!#@ JSON" we will start seeing a lot more of this, since we will then be able to constrain the code-paths that are followed.

I could absolutely see LLM-powered operators being introduced at some point, that we use just like an if statement.

Imagine "if (input like LLM(greeting))" or "while (LLM(input) like 'interested')",

essentially distilling unstructured inputs into a canonical form to be used in control flows:

  switch (LLM(input)):
    case "request": ...
    case "question": ...
    case "query": ...


Why not the other way around? An LLM interfaced with a controller that recognizes commands in the text stream and can insert data.

If it's too dark, call 'HAL.lights_on' with the current room as a parameter.

"Tell me how old Napoleon would be today." "Today Napoleon, assuming he is alive, would be calc(today - data.Napoleon.birthdate).year"
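A sketch of that controller direction: the model emits text containing command placeholders, and a controller recognizes them and splices in computed data. The `calc(today - data.Napoleon.birthdate)` from the example is modeled here as a simpler `calc(age:NAME)` placeholder; that syntax and the `data` table are assumptions for the demo.

```python
import re
from datetime import date

data = {"Napoleon": date(1769, 8, 15)}  # birthdate lookup table (stub)

def run_command(match):
    """Compute a rough age in years for the named person."""
    name = match.group(1)
    return str((date.today() - data[name]).days // 365)

def controller(llm_output):
    """Replace calc(age:NAME) placeholders in the model's text with real values."""
    return re.sub(r"calc\(age:(\w+)\)", run_command, llm_output)

reply = "Today Napoleon, assuming he is alive, would be calc(age:Napoleon) years old."
print(controller(reply))
```

The model never needs to do arithmetic; it only has to emit a recognizable command, which the controller executes deterministically.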





