
Oh, one of the worst forms of torture is definitely trying to get a random Python AI project from GitHub running locally. There's almost always a conflict between versions of Python, CUDA, PyTorch, and a hodgepodge of pip and conda packages. Publishing a requirements.txt is the bare minimum everybody usually does, but that's usually not enough to reconstruct the environment. The ecosystem should just standardize on declaratively prebuilt container environments or something.
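To make the "declarative container" idea concrete: something like the Dockerfile sketch below, where the CUDA base image and every package version are pinned. All of the specific tags and versions here are illustrative assumptions, not a known-good combination you should copy blindly.

```dockerfile
# Hypothetical sketch of a pinned, reproducible AI project environment.
# Base image tag and package versions are assumptions for illustration.
FROM nvidia/cuda:12.1.1-cudnn8-runtime-ubuntu22.04

RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip \
    && rm -rf /var/lib/apt/lists/*

# Pin PyTorch against a matching CUDA wheel index so the GPU stack
# and the Python package agree on the CUDA version.
RUN pip3 install --no-cache-dir \
        torch==2.1.0 --index-url https://download.pytorch.org/whl/cu121

# Remaining dependencies come from a fully pinned requirements file
# (exact versions, ideally with hashes), not loose ranges.
COPY requirements.txt .
RUN pip3 install --no-cache-dir -r requirements.txt
```

The point is that the whole stack (OS, CUDA, Python, packages) is rebuilt from one file, so "works on my machine" becomes "works in this image."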

Granted, my experience is mostly from the GPT-2 era, so I'm not sure if it's still this painful.



Don’t know if this would help your case or not, but jart’s llamafile seems like it would be useful

[6] https://github.com/Mozilla-Ocho/llamafile



