
Pure speculation: With so many subs now private, they're hitting a performance wall searching for enough content to fill the front page. If I'm logged in and go to the site root, it throws an error. If logged out, it works. If I go to a specific (not privatized) sub, it works.


Rule number one for high-traffic sites with a lot of pageloads coming from search engines: aggressively cache content shown to unauthenticated users. Since they don't really interact with the site, they probably won't even notice if it's a bit stale.

They used to have a really good blog post about their caching infra which for some reason was deleted. Archive link: https://web.archive.org/web/20210205121832/https://redditblo...
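
Roughly this shape, if you sketch it out (a made-up Flask-style example, not how Reddit actually does it; the cookie name and TTL are invented):

    import time
    from flask import Flask, request

    app = Flask(__name__)
    CACHE = {}           # path -> (expires_at, rendered_html)
    ANON_TTL = 60        # logged-out users won't notice a minute of staleness

    def render_listing(path):
        return "<html>listing for %s</html>" % path   # stand-in for the expensive feed query

    @app.route("/r/<sub>")
    def subreddit(sub):
        logged_in = request.cookies.get("session_id") is not None
        if logged_in:
            return render_listing(request.path)       # personalized, skip the cache
        hit = CACHE.get(request.path)
        if hit and hit[0] > time.time():
            return hit[1]                             # stale-but-fine for anonymous traffic
        html = render_listing(request.path)
        CACHE[request.path] = (time.time() + ANON_TTL, html)
        return html

Logged-in users hit the real render path; everyone else gets whatever was rendered in the last minute.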


One thing that could be happening is many clients/bots/crawlers pinging pages that have now gone private, and that is overwhelming the servers.


Wouldn't that be easier on the server load? You're just displaying the error page instead of actually grabbing the feed of the subreddit.


That's if the bot is coded well. I've cut off bots from sites and services before, only to have them start to DoS us, because their response to an error was to just try again, immediately and forever.


I'd draw people's attention to https://www.reddit.com/r/redditdev/comments/13wsiks/comment/...

> On March 14th, Apollo made nearly 1 billion requests against our API in a single day, triggered in part by our system outage. After the outage, Apollo started making 53% fewer calls per day. If the app can operate with half the daily request volume, can it operate with fewer?

That's a backend server making requests as soon as it can, again and again, and those requests don't count against its rate limit.

The push notification backend is described as making a request every 6 seconds for each user. If it isn't backing off correctly and delaying the requeueing of the next job, that can drastically increase the request rate.
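
Something like this is the failure mode I mean (a made-up Python sketch; check_inbox is just a stand-in for one API call):

    import random
    import time

    POLL_INTERVAL = 6.0   # seconds, per the description above

    class ApiError(Exception):
        pass

    def check_inbox(user):
        pass              # stand-in for one API request

    def poll_forever(user):
        attempt = 0
        while True:
            try:
                check_inbox(user)
                attempt = 0
                time.sleep(POLL_INTERVAL)
            except ApiError:
                # The suspected bug is effectively sleeping 0s here:
                # retry immediately, forever, multiplying the request rate.
                delay = min(POLL_INTERVAL * 2 ** attempt, 300)
                time.sleep(delay + random.uniform(0, 1))   # jitter
                attempt += 1

Drop the except branch's delay and every error turns a 6-second poll into a tight retry loop.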


Depends...

If a multitude of bots are written to immediately retry upon receiving an error without backing off, and are now stuck in loops flooding the servers with requests, could the number of requests become a greater factor than the simplicity of the responses?


You never know. Maybe their code throws an Exception (expensive in some languages) or logs something extra/differently if you access a private sub.


I would think so. Bots are already rate-limited when accessing the API so it's hard to see how this has increased server load, unless there's something funky in the backend involving private/dark subreddits.


this is Reddit, they only hire junior devs.


I'm experiencing the same. /r/games is not private and it works.


Another anecdote: top posts of all time load for me, even top posts this week. But front page and r/popular don't.


I would expect most of the 'top' categories are cached with a long-ish TTL. If Reddit's DB is down, that would explain why those sorts are still functioning.
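
Basically the "stale-if-error" pattern. A made-up sketch, not Reddit's actual code:

    import time

    CACHE = {}                     # key -> (cached_at, value)
    TOP_ALL_TIME_TTL = 6 * 3600    # these sorts barely change

    def get_top(sub, query_db):
        key = "top:%s:all" % sub
        now = time.time()
        hit = CACHE.get(key)
        if hit and now - hit[0] < TOP_ALL_TIME_TTL:
            return hit[1]          # fresh enough, no DB touched
        try:
            value = query_db(sub)
            CACHE[key] = (now, value)
            return value
        except Exception:
            if hit:
                return hit[1]      # DB is down: serve the stale copy
            raise                  # nothing cached: error page

The long-TTL sorts keep working off the cache even while anything that needs a live query falls over.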


Yeah :) Front page was a for loop and too many subreddits went dark



