
If-Modified-Since and ETag are nice and everyone should implement them, but IME the implementation status is much better on the reader side than on the feed side. Trim your (main) feed to only recent posts and use Atom's pagination to link to the rest for new subscribers, and the difference in data transferred becomes much smaller.
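For what it's worth, the reader side of this is only a few lines. A minimal sketch of a conditional fetch using only the Python standard library (the `cache` dict and feed URL are hypothetical; a real reader would persist the validators to disk):

```python
# Sketch: a feed poller that honors ETag and Last-Modified, so an
# unchanged feed costs a 304 response instead of a full download.
import urllib.request
import urllib.error


def build_request(url, cache):
    """Build a request carrying the validators remembered for `url`."""
    req = urllib.request.Request(url)
    entry = cache.get(url, {})
    if "etag" in entry:
        req.add_header("If-None-Match", entry["etag"])
    if "last_modified" in entry:
        req.add_header("If-Modified-Since", entry["last_modified"])
    return req


def fetch_feed(url, cache):
    """Return the feed body, or None if the server said 304 Not Modified."""
    req = build_request(url, cache)
    try:
        with urllib.request.urlopen(req) as resp:
            # Remember the new validators for the next poll.
            cache[url] = {
                key: value
                for key, value in (
                    ("etag", resp.headers.get("ETag")),
                    ("last_modified", resp.headers.get("Last-Modified")),
                )
                if value
            }
            return resp.read()
    except urllib.error.HTTPError as e:
        if e.code == 304:
            return None  # Unchanged since the last poll; nothing transferred.
        raise
```

The point of the complaint upthread is that many feed *servers* never bother to emit the `ETag`/`Last-Modified` headers in the first place, so even a well-behaved client like this gets a full 200 every time.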

> Besides that, a well-behaved feed will have the same content as what you will get on the actual web site. The HTML might be slightly different to account for any number of failings in stupid feed readers in order to save the people using those programs from themselves, but the actual content should be the same. Given that, there's an important thing to take away from this: there is no reason to request every single $(&^$(&^@#* post that's mentioned in the feed.

> If you pull the feed, don't pull the posts. If you pull the posts, don't pull the feed. If you pull both, you're missing the whole point of having an aggregated feed!

Unfortunately there are too many feeds that don't include the full content for this to work. And a reader won't know whether the feed has the full content before fetching the HTML page. This can also vary from post to post, so it can't just be determined once at subscription time.
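A sketch of the per-entry heuristic a reader is forced into: check whether each entry actually ships a body (Atom `<content>`, or RSS `<content:encoded>`) or only a summary. The sample feed below is made up for illustration:

```python
# Heuristic: per entry, does the feed carry full content or just a teaser?
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"
RSS_CONTENT = "{http://purl.org/rss/1.0/modules/content/}"


def entry_has_full_content(entry):
    """True if this entry appears to include its full body in the feed."""
    content = entry.find(f"{ATOM}content")
    if content is None:
        content = entry.find(f"{RSS_CONTENT}encoded")
    text = "".join(content.itertext()) if content is not None else ""
    return len(text.strip()) > 0


feed = ET.fromstring("""\
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry>
    <title>Full post</title>
    <content type="html">&lt;p&gt;The whole article body.&lt;/p&gt;</content>
  </entry>
  <entry>
    <title>Teaser only</title>
    <summary>Read more on the site...</summary>
  </entry>
</feed>""")

results = [entry_has_full_content(e) for e in feed.iter(f"{ATOM}entry")]
# First entry has a body, second is summary-only.
```

Even this is only a guess: some feeds put a truncated excerpt inside `<content>`, which no heuristic can distinguish from a genuinely short post.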

> Then there are the user-agents who lie about who they are or where they are coming from because they think it's going to get them special treatment somehow.

These exist because of misbehaved web servers that block based on User-Agent or serve different content. And since you are complaining about faked user agents, that probably includes you.

> Sending referrers which make no sense is just bad manners.

HTTP Referer should not exist; it has been abused by spammers for ages.



> These exist because of misbehaved web servers that block based on User-Agent or serve different content. And since you are complaining about faked user agents, that probably includes you.

That's a niche. It's about 1 million percent more likely a fake request is coming from an overzealous AI scraper nowadays. I have blocked hundreds of them and I'm on the verge of giving up and handing over money to Cloudflare just for their AI scraping protection.



