As soon as that's my baseline assumption, I think I'm done with the internet. I can get LLM slop on my own.


I thought the article was well written. I'm assuming the author did most of the writing because it didn't sound like AI slop. I also assume he meant he uses AI to assist, not as the main driver.


It really wasn't well written. I contains factual errors that stand out like lighthouses, showing the author had an idea for an article but doesn't actually know the material.


> I contains (sic) factual errors that stand out like lighthouses, showing the author had an idea for an article but doesn't actually know the material.

Whoops ^ To be fair, technically, I also contain some factual errors, if you consider the rare genetic mutation or botched DNA transcription.

So far, I haven't found anything that I would consider to be a glaring factual error. What did I miss?

I'm not talking merely about a difference in how one imagines the past might have unfolded. If you read this as alternative history, I think the author made a plausible case. It's certainly not the only way it could have gone; reasonable people can disagree.


Sorry about that 't'. It was (very) late.


I meant it was readable. It's speculative, but it's well-informed speculation, not clueless nonsense. I agree that fact-checking becomes more important because LLMs hallucinate. I feel the same about vibe coding: if you don't know much about programming, then running vibe code is a risky bet (depending on the criticality of the problem).


Author here. In fact, all the words you read, I wrote; LLMs are not very good at writing.


That matches my experience too.



