
You declared a huge problem and then followed it up with an "if."

The DeepSeek API supports caching; stop manufacturing problems where there are none.

https://api-docs.deepseek.com/guides/kv_cache
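
For what it's worth, the caching in that guide is automatic on DeepSeek's own endpoint. Here is a minimal sketch, assuming the OpenAI-compatible API and the cache-hit usage fields named in the docs (field names and behavior should be verified against the guide):

    # Minimal sketch, assuming DeepSeek's OpenAI-compatible endpoint and the
    # usage fields described in the linked caching guide (verify field names).
    from openai import OpenAI

    client = OpenAI(
        api_key="YOUR_DEEPSEEK_API_KEY",   # placeholder
        base_url="https://api.deepseek.com",
    )

    # The server-side KV cache reuses a long repeated prefix (e.g. a big
    # system prompt) across requests; there is no opt-in flag to set.
    messages = [
        {"role": "system", "content": "You are a helpful assistant. " * 50},
        {"role": "user", "content": "Summarize the caching guide."},
    ]

    resp = client.chat.completions.create(model="deepseek-chat", messages=messages)

    # Per the guide, the usage block reports how much of the prompt was
    # served from cache vs. computed fresh.
    usage = resp.usage
    print("cache hit tokens:", getattr(usage, "prompt_cache_hit_tokens", None))
    print("cache miss tokens:", getattr(usage, "prompt_cache_miss_tokens", None))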



Sure. But there is no way I'm going to use the DeepSeek endpoint.

OpenRouter says they might use your data for training.


First you complained about the lack of caching. When you were informed that the model supports caching, instead of admitting your error you switched to an unrelated complaint. I hope you do not use similar strategies for discussion in your personal and work life.


Your broad attack on me as a person is unnecessary.

If you read my post carefully, you will realize that I did not make any contradictory statements.


Not a broad attack; it is specifically targeted at your proud xenophobia.


Absolutely ridiculous.

My wife is Chinese.


Caching is not a function of the model but of the provider; any model's prompts can be cached, and the provider serving the model decides whether to do so. OpenRouter is not a provider but a middleman between providers, so some of its DeepSeek providers may offer caching and some may not. If you let it route to just any of them, you might run into the issue, and some providers may use your data for training while others won't. You have to look at the provider list and cherry-pick ones that won't train on your data and that also offer caching.
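
If it helps, here is a hedged sketch of how that cherry-picking might look via OpenRouter's provider-routing options. The "provider" object and its "order" / "allow_fallbacks" / "data_collection" fields are my reading of their docs, and the provider names are hypothetical placeholders to replace with ones you have vetted:

    # Hedged sketch of pinning OpenRouter to vetted providers. The "provider"
    # routing object and its fields reflect my reading of OpenRouter's docs;
    # the provider names below are hypothetical picks, not recommendations.
    import requests

    payload = {
        "model": "deepseek/deepseek-chat",
        "messages": [{"role": "user", "content": "Hello"}],
        "provider": {
            "order": ["SomeProviderA", "SomeProviderB"],  # hypothetical allowlist
            "allow_fallbacks": False,        # don't silently route elsewhere
            "data_collection": "deny",       # skip providers that train on prompts
        },
    }

    resp = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers={"Authorization": "Bearer YOUR_OPENROUTER_API_KEY"},
        json=payload,
        timeout=60,
    )
    print(resp.json())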



