
In my mind, LLMs are lowering the barrier to searching the same way Google did in the early 2000s. Back then, you had to tailor your search keywords very specifically and avoid words such as "the," "a," and so on. Google eventually managed to turn queries such as "what's the population of Ghana" into ready-made answers.

LLMs do exactly that for more complex queries, with the downside of possible hallucinations. Suddenly, instead of doing research on the topic, a person looking to become "a programmer" asks ChatGPT to create a syllabus for their situation, and possibly even to generate the contents of that syllabus. ChatGPT then "searches the internet" and composes the response.

I have become confident that LLMs won't be much more (at least for the next couple of years) than search engines with the upside of handling complex queries and the downside of hallucinations. And for that, I find LLMs quite useful.



The problem is that the investors forking over the money that fuels the research, development, and maintenance of this tech are doing so expecting huge returns that are unlikely to come during their lifetimes.

Once the AI winter comes again, investor money will dry up as the realisation sets in that LLM evolution has peaked and further investment in LLMs yields only declining marginal utility.

Once the snow settles, only the open source models from big companies will survive, and they will likely be treated as just another egg in the basket of opportunities. Companies like OpenAI will be the most affected, since their reason for existing depends on extracting value from LLMs several orders of magnitude greater than they do today.



