
I noticed this "thank you" today: "GGML

Thank you to the GGML team for the tensor library that powers Ollama’s inference – accessing GGML directly from Go has given a portable way to design custom inference graphs and tackle harder model architectures not available before in Ollama."

Source: https://ollama.com/blog/multimodal-models




Thanks for the linked article! I was looking for a local vision model to recognize my handwritten notes, and this article provided a good TLDR about doing this in Ollama.
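For anyone wanting to try the same thing, a minimal sketch of querying Ollama's local REST API (default port 11434) with an image follows. Ollama's `/api/generate` endpoint accepts an `images` field containing base64-encoded image data; the model name `llama3.2-vision` here is an assumption, so substitute whichever vision model you have pulled.

```python
import base64
import json

def build_ollama_vision_request(prompt: str, image_bytes: bytes,
                                model: str = "llama3.2-vision") -> str:
    """Build the JSON body for POST http://localhost:11434/api/generate.

    NOTE: the model name is a placeholder assumption; use any vision-capable
    model available in your local Ollama installation.
    """
    payload = {
        "model": model,
        "prompt": prompt,
        # Ollama expects images as a list of base64-encoded strings.
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,  # return one complete response instead of a stream
    }
    return json.dumps(payload)

# Usage sketch (assumes a local Ollama server and a file "note.png"):
#   body = build_ollama_vision_request("Transcribe this handwritten note.",
#                                      open("note.png", "rb").read())
#   then POST `body` to http://localhost:11434/api/generate with
#   Content-Type: application/json (e.g. via urllib.request).
```

The payload can be sent with any HTTP client; keeping the request-building separate makes it easy to inspect or log what is sent to the model.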

I think Ollama could improve that TL;DR and add more attribution to llama.cpp in their README. I don't understand why there's been no reply from the Ollama maintainers for so long.



