
I just had ChatGPT write me a JMAP client (for Fastmail) that'd create a draft. Then I asked it to:
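For context, the draft-creation part of a JMAP client is essentially one `Email/set` call. A rough sketch of the request shape, per RFC 8621 — the account and mailbox IDs here are made up; a real client would fetch them from the Fastmail session resource and a `Mailbox/query`:

```python
import json

# Hypothetical IDs for illustration; a real client fetches these from
# the JMAP session resource and Mailbox/query.
ACCOUNT_ID = "u123"
DRAFTS_MAILBOX_ID = "mb-drafts"

def build_create_draft_request(subject, body, to_addr):
    """Build a JMAP request body that creates a plain-text draft."""
    return {
        "using": [
            "urn:ietf:params:jmap:core",
            "urn:ietf:params:jmap:mail",
        ],
        "methodCalls": [
            ["Email/set", {
                "accountId": ACCOUNT_ID,
                "create": {
                    "draft1": {
                        # Placing the email in Drafts + the $draft keyword
                        # is what makes it show up as a draft.
                        "mailboxIds": {DRAFTS_MAILBOX_ID: True},
                        "keywords": {"$draft": True},
                        "to": [{"email": to_addr}],
                        "subject": subject,
                        "bodyValues": {"body1": {"value": body}},
                        "textBody": [{"partId": "body1",
                                      "type": "text/plain"}],
                    }
                },
            }, "0"],
        ],
    }

req = build_create_draft_request("Re: hello", "Thanks!", "alice@example.com")
print(json.dumps(req, indent=2))
```

This dict would then be POSTed to the API URL from the session object with your API token.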

"Write an OpenAI API client that would take "new_message" from the above, feed it to the API with a prompt that asks the model to either indicate that a reply is needed by outputting the string "REPLY_123123_123123" and the message to send, or give a summary. If a reply is needed, create a draft with the suggested response in the Draft mailbox."

It truncated the "REPLY_123123_123123" bit to "REPLY_", and the prompt it suggested was entirely unusable, but the rest was fine.
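The sentinel scheme itself is trivial to check on the client side. Something like this (the sentinel string comes from my prompt above; the function name and split-on-first-line convention are just one way to do it):

```python
# Sentinel the model is asked to emit when a reply is warranted.
REPLY_SENTINEL = "REPLY_123123_123123"

def classify_model_output(text):
    """Interpret the model's output as either a reply draft or a summary.

    If the output starts with the sentinel, the remainder is the
    suggested reply body (to be saved as a JMAP draft); otherwise the
    whole output is treated as a summary of the message.
    """
    text = text.strip()
    if text.startswith(REPLY_SENTINEL):
        return ("reply", text[len(REPLY_SENTINEL):].strip())
    return ("summary", text)

print(classify_model_output("REPLY_123123_123123\nSure, Tuesday works."))
print(classify_model_output("Newsletter; no action needed."))
```

The fragility the truncation exposed is exactly why you'd want the check to be a plain `startswith` on a fixed string rather than anything the model can paraphrase.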

I tried a couple of times to get it to generate a better prompt, but that was interestingly tricky - it kept woefully underspecifying the prompts. Presumably it has seen few examples of LLM prompts and results in its training data so far.

But overall it got close enough that I'm tempted to hook this up to my actual mailbox.



