Hacker News | habedi0's comments

This is pretty cool!


Thanks! You can also ask any questions on the Git repo.


I think they want to add lambdas to Zig at some point. There are a lot of changes between releases.


Unless something has changed since this comment (https://github.com/ziglang/zig/issues/1717#issuecomment-1627...), that is unlikely. The irony is that Zig's new async/await strategy is to use event loops with passed-in functions (https://www.youtube.com/watch?v=x3hOiOcbgeA&t=3643s).


That's kinda weird and sad. Lambdas are very useful TBH.


Wouldn't surprise me whatsoever if someone creates zigplus, which is just some TypeScript-style superset that adds lambdas and interfaces to Zig as syntax sugar.


I created this library to help me verify the correctness of my Zig code in another project, but I wanted the option to strip the checks once the code becomes more mature. For efficiency, I mainly care about speed, which is why the checks are only compiled out in `ReleaseFast`.


Thanks.

I'm new to Zig myself. I had trouble verifying the correctness of the code I wrote for a set of complex data structures in another, larger Zig project (this one: https://github.com/habedi/ordered), and I'm still experimenting with how to use DbC in Zig; that experimentation is what led to this library.


Looks like a great (offline) use case for small language models. Are you going to share the source code for your implementation?



Thanks for the share.


The Story of Mel is all you need?


Hi, sure.


This is pretty cool!


Thanks.


Lol. No. Not heretical at all.

I might add support for other model providers (like Google and Azure) in the future, although I'm trying to keep the project's scope very small because that makes it easier for me to maintain.

Anyway, adding a new LLM provider is pretty straightforward if you want to do it yourself: implement the `BaseLLM` API (see the `cogitator/model/base.py` file) for your provider, and then use it the same way you use `OllamaLLM` or `OpenAILLM`. A rough sketch is below.
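For illustration only, here is a minimal sketch of what a custom provider could look like. The `generate` method name and the constructor arguments are assumptions on my part, not necessarily Cogitator's actual interface; check `cogitator/model/base.py` for the real abstract methods you need to implement.

```python
# Hypothetical sketch of a custom provider; the method name `generate` and the
# constructor are assumptions, not Cogitator's actual BaseLLM interface
# (see cogitator/model/base.py for the real one).
from cogitator.model.base import BaseLLM


class MyProviderLLM(BaseLLM):
    """Adapter for some other hosted LLM API (hypothetical)."""

    def __init__(self, api_key: str, model: str = "my-model"):
        self.api_key = api_key
        self.model = model

    def generate(self, prompt: str, **kwargs) -> str:
        # Call your provider's HTTP API here and return the completion text.
        raise NotImplementedError("wire up your provider's client here")
```

After that, an instance of `MyProviderLLM` should drop in wherever an `OllamaLLM` or `OpenAILLM` instance is used.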

