
This is the case with all LLMs as far as I know.

With the ChatGPT API you just send everything up to that point plus the new input to get the new output.

I think the benefit for the service is that it's stateless: requests come in and responses go out, and the server doesn't have to track any per-conversation state.
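A minimal sketch of that pattern, assuming the OpenAI-style message format where each request carries the full conversation (the `fake_llm` stub here stands in for a real API call so the example runs offline):

```python
# The server is stateless: the client resends the entire message list on
# every request, and the model conditions on all of it to produce a reply.

def fake_llm(messages):
    # Stand-in for a real chat-completion API call (hypothetical stub).
    return f"echo: {messages[-1]['content']}"

class Conversation:
    def __init__(self):
        self.messages = []  # full history, kept client-side

    def send(self, user_input):
        self.messages.append({"role": "user", "content": user_input})
        reply = fake_llm(self.messages)  # whole history goes up each time
        self.messages.append({"role": "assistant", "content": reply})
        return reply

chat = Conversation()
chat.send("hello")
chat.send("how are you?")
print(len(chat.messages))  # two user turns + two replies, all held by the client
```

The trade-off is that the payload grows with every turn, which is also why long conversations cost more tokens per request.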
