
I'll add that programmers don't produce code. They produce business guarantees. And their main tools for doing so are:

* Understanding the value proposition as a human, often through a conversation, and often with hidden criteria that you'll have to uncover

* Making sure the code respects the value proposition. Since reading other people's code takes more time than writing it, ChatGPT won't help that much here.

----

One development I'm anticipating, though, is connecting ChatGPT to formal proof systems, and making sure it has sufficient coverage with 'internal thoughts' that are logic-verified (as opposed to brute-forcing the problem big-data-style). Sort of like the internal monologue that we sometimes have.
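
A rough sketch of what I'd like to see (hand-waving heavily: generate() is just a placeholder for an LLM call, and I'm assuming a Lean toolchain is installed so that `lean` is on the PATH and exits non-zero when a proof doesn't check):

    import subprocess, tempfile

    def generate(prompt: str) -> str:
        raise NotImplementedError("placeholder for an LLM completion call")

    def verified_attempt(statement: str, max_tries: int = 5) -> str | None:
        """Ask the model for a proof; keep only attempts the checker accepts."""
        for _ in range(max_tries):
            candidate = generate(f"Write a Lean proof of: {statement}")
            with tempfile.NamedTemporaryFile("w", suffix=".lean", delete=False) as f:
                f.write(candidate)
                path = f.name
            # The proof assistant, not the model, decides what counts as correct.
            if subprocess.run(["lean", path]).returncode == 0:
                return candidate
        return None

The point is that the verification step is external and symbolic, so the model's "internal monologue" can't simply be confidently wrong.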

I'd say in the short and medium term we're sort of OK, but I don't know about 10 years from now.



Chain-of-thought reasoning is being explored with some success. It's better for the reasoning to be explicit output instead of internal; remember that the next token is predicted based on the previous ones.
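
Something like this toy illustration, where complete() is only a stand-in for an autoregressive completion API (not a real library call): because each token is predicted from everything already emitted, asking for the reasoning in the output lets the final answer condition on it.

    def complete(prompt: str) -> str:
        raise NotImplementedError("placeholder for an LLM completion call")

    question = ("A bat and a ball cost $1.10 together; the bat costs "
                "$1.00 more than the ball. How much is the ball?")

    # Direct answer: the model has to jump straight to the final tokens.
    direct_answer = complete(question)

    # Chain of thought: the intermediate reasoning becomes context that the
    # final answer is conditioned on.
    chain_answer = complete(question + "\nLet's think step by step.")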

Your two points may not be as sound as you think. The second one can be defeated now, since ChatGPT is quite good at generating descriptions of what code is doing. The first point is more subtle: you can iterate on code with ChatGPT, but right now you need to know how to run the code and get the results before providing feedback. Once tools are developed for ChatGPT to do that itself, it could just show the results, and a non-technical person could ask for follow-up changes.
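
Roughly the kind of loop I mean (a sketch only: generate() is again a placeholder for an LLM call, and sandboxing is waved away by shelling out to a local Python):

    import subprocess

    def generate(prompt: str) -> str:
        raise NotImplementedError("placeholder for an LLM completion call")

    def run_code(source: str) -> str:
        # Execute the model's code and capture whatever it prints or raises.
        proc = subprocess.run(["python", "-c", source],
                              capture_output=True, text=True, timeout=30)
        return proc.stdout + proc.stderr

    def iterate(request: str, rounds: int = 3) -> str:
        code = generate(request)
        for _ in range(rounds):
            output = run_code(code)
            # A non-technical user would only look at `output` and describe
            # what they want changed; here the output is fed straight back.
            code = generate(f"{request}\n\nCurrent code:\n{code}\n\n"
                            f"Output:\n{output}\n\nRevise the code if needed.")
        return code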


Clearly most programmers do produce code, although I’ve worked with a few who seem to do so only once all other options have been exhausted.



