We are trying to add a chat feature to our language-learning software. One idea is to practice situational language, with situations taken from the table of contents of a phrasebook. Initially I was writing detailed situations by hand, but figured GPT could do that just as well as me.
This seems to work nicely in the ChatGPT web UI, producing a different situation each time:
"We will engage in a role-playing dialogue. The dialogue will take place in turns, starting with you. Always wait for my response. Use a conversational, informal, colloquial style. Try to use simple English, so that a learner of English can understand.
You will pretend to be the owner of an apartment that I am renting in Mexico City. Pretend to be an unpleasant and unreasonable person. Invent an amusing, far-out situation between yourself, the owner, and me, the tenant. First explain the situation and then allow me to respond."
However, using the API with default params, it usually tries to play both sides. There seems to be a difference between the two; any ideas?
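One workaround I've been considering (untested sketch, not a confirmed fix): move the role instructions into a system message that explicitly pins the model to one character, and pass stop sequences so the completion gets cut off if it starts writing my lines anyway. The model name and the speaker labels ("Owner:", "Tenant:") here are my own assumptions:

```python
# Sketch: pin the character in a system message and use stop sequences
# so a reply is truncated if the model starts writing the tenant's side.

SYSTEM_PROMPT = (
    "We will engage in a role-playing dialogue. You play exactly one "
    "character: the owner of an apartment I am renting in Mexico City. "
    "Speak only as the owner, prefixing your lines with 'Owner:'. "
    "Never write the tenant's lines. After each of your turns, stop "
    "and wait for my reply."
)

def build_request(history):
    """Assemble kwargs for a chat-completion call."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "system", "content": SYSTEM_PROMPT}] + history,
        # If the model starts a tenant line anyway, the API cuts it off here.
        "stop": ["Tenant:", "\nMe:"],
    }

request = build_request([
    {"role": "user", "content": "Start by explaining the situation."}
])
# To actually send it (needs OPENAI_API_KEY set in the environment):
# from openai import OpenAI
# reply = OpenAI().chat.completions.create(**request)
```

No idea if this fully closes the gap with the web UI, but the system role plus `stop` seemed like the two obvious knobs to try first.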
Also, has anyone had any success reducing/condensing the prompt history to cut cost? For example, sending only the previous user prompts and the latest GPT response? Or using GPT to summarize the earlier dialogue?
ChatGPT can work as a cheap translation service, about $2/million characters, but it often refuses to translate due to moral sensibilities. :D