I might be wrong, but aren't embedding models usually bidirectional rather than causal, so the attention mechanism itself is more expensive?


It depends on the architecture (you can very well convert a decoder-only causal model to an embeddings model, e.g. Qwen/Mistral), but it is true that traditional embedding models such as BERT-based ones are bidirectional, although it's unclear how much more compute that inherently requires.
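To make the conversion concrete, here's a minimal sketch of pulling sentence embeddings out of a decoder-only checkpoint via last-token pooling (the pattern used by Mistral/Qwen-derived embedding models). The model name and the pooling choice are illustrative assumptions on my part; real conversions also fine-tune with a contrastive objective:

    # Sketch only: embeddings from a decoder-only (causal) model via
    # last-token pooling. The checkpoint name is an arbitrary example.
    import torch
    from transformers import AutoModel, AutoTokenizer

    name = "Qwen/Qwen2-0.5B"  # assumption: any decoder-only checkpoint
    tok = AutoTokenizer.from_pretrained(name)
    if tok.pad_token is None:
        tok.pad_token = tok.eos_token
    tok.padding_side = "right"  # so the last non-pad token is easy to find
    model = AutoModel.from_pretrained(name)  # base model, no LM head

    batch = tok(["a query about attention", "an unrelated sentence"],
                padding=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state  # (batch, seq, dim)

    # With causal attention only the final position has seen the whole
    # sequence, so pool the last non-padding token of each row.
    last = batch["attention_mask"].sum(dim=1) - 1
    emb = hidden[torch.arange(hidden.size(0)), last]
    emb = torch.nn.functional.normalize(emb, dim=-1)
    print(emb @ emb.T)  # cosine similarities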

Compare with ModernBERT, which uses more modern techniques and is still bidirectional, yet remains very speedy: https://huggingface.co/blog/modernbert
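For comparison, a similarly rough sketch of getting embeddings from a bidirectional encoder like ModernBERT with mean pooling. Note that answerdotai/ModernBERT-base is a base checkpoint, not a tuned sentence-embedding model, and it needs a recent transformers release, so this only illustrates the forward pass and pooling, not a production setup:

    # Sketch only: mean-pooled embeddings from a bidirectional encoder.
    import torch
    from transformers import AutoModel, AutoTokenizer

    name = "answerdotai/ModernBERT-base"
    tok = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)

    batch = tok(["bidirectional attention sees both directions"],
                padding=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state  # (batch, seq, dim)

    # Every position attends in both directions, so averaging the
    # non-padding token states is a reasonable default pooling.
    mask = batch["attention_mask"].unsqueeze(-1).float()
    emb = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
    print(emb.shape)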


Yes, exactly.



