Hacker News

I use GPT via the API in Emacs, and it's wonderful. gptel is the package.

Although API access to Llama 3 (8B and 70B) through Groq is so much faster that I can't stand how slow GPT is anymore. It's slow; still a very capable model, but only marginally better than the open-source alternatives.
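For anyone curious what hitting Groq looks like: it exposes an OpenAI-compatible chat completions endpoint, so the request shape is the familiar one. A minimal sketch below just builds the JSON payload; the endpoint path and the model name `llama3-70b-8192` are assumptions based on Groq's public docs, so check them before use.

```python
# Sketch of a chat request for Groq's OpenAI-compatible API.
# Endpoint path and model name are assumptions; verify against Groq's docs.
import json

GROQ_ENDPOINT = "https://api.groq.com/openai/v1/chat/completions"  # assumed path

def build_chat_request(prompt, model="llama3-70b-8192"):
    """Build the JSON body for a single-turn chat completion."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# This payload would be POSTed to GROQ_ENDPOINT with an Authorization header.
payload = build_chat_request("Why is the sky blue?")
print(json.dumps(payload, indent=2))
```

Since the wire format matches OpenAI's, existing clients (gptel included) typically just need the base URL and API key swapped.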



You should try GPT-4o. It's incredibly fast.


Have you tried groq.com? Because I don't think GPT-4o is "incredibly" fast. I've been frustrated at how slow GPT-4-Turbo has been lately, and GPT-4o just seems "acceptably" fast now, which is a big improvement, but still not Groq-level.


Yes, of course; probably sometime in the coming days. Some people mention it already works in the playground.

I was wondering why OpenAI didn't release a smaller but faster model. 175 billion parameters work well, but speed is sometimes crucial. A 20B-parameter model, say, could compute roughly 10x faster.
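The "roughly 10x" intuition can be made explicit: autoregressive decoding is largely memory-bandwidth bound, so at the same precision and hardware, tokens per second scale about inversely with parameter count. A back-of-envelope sketch (the inverse-scaling assumption is a simplification that ignores batching, attention overhead, and quantization):

```python
# Back-of-envelope: decoding speed ~ 1 / parameter count
# (assumes memory-bandwidth-bound generation, same precision and hardware).
def relative_speedup(params_big_billions, params_small_billions):
    """Approximate tokens/sec speedup of the smaller model over the larger one."""
    return params_big_billions / params_small_billions

print(relative_speedup(175, 20))  # 175B vs 20B -> 8.75
```

So a 20B model would land at roughly 8.75x the decode speed of a 175B one under these assumptions, close to the 10x guess above.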


True. At least right now, though, it generates at around the same speed as 3.5 Turbo.




