> Sounds like your LLM guy just isn’t very good.

That's the central idea here. Most people available to hire aren't. That's why they get constrained into a framework that limits the damage they can cause. In other areas of software development the frameworks are quite mature at this point, so it works well enough.

This AI/LLM/whatever-you-want-to-call-it area of development, however, hadn't garnered much interest until recently, so there isn't much in the way of frameworks to lean on. But business is trying to ramp up around it, which means hiring those who aren't good just to fill seats. Like the parent says, LangChain may not be the framework we want, but it is the one we have, and that beats letting the not-very-good developers create some unconstrained mess.

If you win the lottery by snagging one of the few genuinely good developers out there, then certainly you can let them run wild engineering a much better solution. But not everyone is so fortunate.



LLMs are, at least at present, exactly the kind of thing where using an abstraction without understanding what it actually does is what creates a mess in the long run.


> Most guys available to hire aren't

Sounds like your hiring team just isn’t very good.

There are plenty of skilled people working in LLM land.


Some hiring teams just don't operate in unlimited-venture-capital land and have tight boundaries in terms of compensation. There's someone good at anything if you can throw enough money at the problem.



