
You should try gpt-4o. It's incredibly fast.


Have you tried groq.com? Because I don't think gpt-4o is "incredibly" fast. I've been frustrated at how slow gpt-4-turbo has been lately, and gpt-4o just seems "acceptably" fast now, which is a big improvement, but still not groq-level.


Yes, of course, probably sometime in the coming days. Some people mention that it already works in the playground.

I was wondering why OpenAI didn't release a smaller but faster model. 175 billion parameters works well, but speed is sometimes crucial. A 20B-parameter model, for instance, could compute roughly 10x faster.
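
As a rough back-of-envelope sketch only: if per-token decode cost scaled roughly linearly with parameter count (a simplifying assumption; real throughput also depends on hardware, batching, and memory bandwidth), 175B vs. 20B works out to about a 9x speedup, i.e. roughly the 10x claimed above.

  # Back-of-envelope estimate, assuming decode cost scales ~linearly
  # with parameter count. This is a simplification, not a benchmark.

  def relative_speedup(params_large_b: float, params_small_b: float) -> float:
      """Estimated speedup of the smaller model under linear scaling."""
      return params_large_b / params_small_b

  if __name__ == "__main__":
      speedup = relative_speedup(175, 20)
      print(f"Estimated speedup: {speedup:.1f}x")  # ~8.8x, i.e. roughly 10x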


True. At least right now, though, it generates tokens at around the same speed as 3.5 Turbo.



