Hacker News

I hope agents.txt becomes a standard and websites actually start to build agent-specific interfaces (or just put API docs in their agents.txt). In my mind it's different from robots.txt, which is meant to apply rules to broad web-scraping tools.
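There is no ratified agents.txt specification, so as a sketch of what the commenter describes, a file like this might live at a site's root, assuming it borrows robots.txt-style directives plus a pointer to API documentation (the `Api-docs` and `Rate-limit` fields here are purely illustrative, not part of any standard):

```
# Hypothetical agents.txt -- format assumed, no standard exists yet.
# Syntax mirrors robots.txt; extra fields are illustrative only.
User-agent: *
Allow: /

# Where an agent should look for a machine-readable interface
# instead of scraping the human-facing pages.
Api-docs: https://example.com/docs/api
Rate-limit: 10/minute
```

The appeal over robots.txt is that robots.txt only says what a crawler may fetch, while something like this could tell an interactive agent how to accomplish tasks on the site.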


I hope they don't build agent-specific interfaces. I want my agent to have the same interface I do. And even more importantly, I want to have the same interface my agent does. It would be a bad future if the capabilities of human and agent interfaces drift apart and certain things are only possible to do in the agent interface.


I think the word you're looking for is "apartheid," and I think you're right.



