Hacker News

The obsession with "AI" is particularly fascinating.

10 years ago, before it was a household term, people could mention without blushing that their production implementation ended up being a simple if-statement. If anything, being able to come up with an insight that was clear and well-focused enough to be implemented in such a straightforward way was a feather in one's cap.
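The sort of "simple if-statement" production solution described above might look like this minimal sketch (the domain, field names, and thresholds are all invented for illustration):

```python
# Hypothetical example: after analysis, a single threshold rule turned out
# to capture the signal that a model was originally meant to learn.
def is_suspicious(transaction: dict) -> bool:
    # Illustrative rule: flag large transfers from very new accounts.
    return (
        transaction["amount"] > 10_000
        and transaction["account_age_days"] < 30
    )
```

The point isn't that rules beat models in general, but that an insight clear enough to be stated this plainly is easy to review, test, and keep in production.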

Nowadays, unless you are building a model with at least a few million parameters (though billions would be better) - one so complex that the only hope of productionizing it is to stick it into a container and ask the engineering and ops folks not to peek inside too carefully, sort of like a parent leaving the door to their teenager's bedroom closed - nobody wants to take your work seriously.

Call me a curmudgeon, but it doesn't half remind me of a while back when I was being directed to process 12MB datasets on a distributed computing cluster. At some point the tools stopped being a means to an end and started being the only thing anybody cared about.




It is funny: what you are usually supposed to be replacing or automating with ML is the nasty 100-line conditional statement that forms some bit of "business logic" that nobody really understands - usually the time-pressured fever dream of some dev who left the company years ago. Now it gets replaced by something even more incomprehensible. If it is done well, at least you get performance guarantees and the ability to retrain. If it is done badly, you don't know what the features are, which model belongs with which dataset, how to regenerate training sets, where the original training set went, how to set the hyperparameters to get reasonable training performance, etc.



