The article goes into some details beyond scaling that resonate with me. I ran a few forums and IRC servers in the past that grew rather large. I eventually shut them down, not because of scalability but because of legal liability and the myriad personality issues that put my domains at risk. Scaling a forum or IRC network to hundreds of thousands or even millions of people is not hard, especially nowadays with cloud scaling and the current state of modern kernels and hardware.
What I found far more challenging was moderating the content and finding moderators who could be trusted to remove illegal content in a timely manner. Worse, there were trolls who would use bots to post highly illegal material and then automatically report those same posts to my registrars, server providers and government. The bots somehow even grabbed screenshots right after they posted the content. I say bots because there was no way a human could perform those actions so quickly. It was a losing battle, and I had neither the legal resources to deal with it nor the development resources to play the cat-and-mouse arms race 24/7. I have my own conspiracy theories as to who those bot owners were, but that doesn't matter any more. Nowadays I could probably block more of those bots with techniques I have since learned, but I simply have no desire to get back into that quagmire.
I suspect some of the Mastodon admins will learn this lesson with time. They, like me, will probably start in a state of denial and dismiss the risk until it gets real. And it certainly gets real.
The only technical workaround I could find was to make all forum posts moderator-approved, meaning only the poster can see their post until a moderator approves it. That does not scale, and people want their posts to be instantly visible. With IRC I had to constantly add new file-sharing domains to word filters to block links to illegal material, and that was also a losing battle.
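For what it's worth, the IRC word-filter approach I describe above amounts to a domain blocklist checked against every message. A minimal sketch of that idea (the domain names here are made up for illustration):

```python
import re

# Hypothetical blocklist of file-sharing domains (illustrative names only).
BLOCKED_DOMAINS = {"badshare.example", "evilfiles.example"}

# Capture the host portion of any http(s) URL in a message.
URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

def contains_blocked_link(message: str) -> bool:
    """Return True if the message links to a blocked domain or a subdomain of one."""
    for host in URL_RE.findall(message):
        host = host.lower().split(":")[0]  # strip any port number
        if any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS):
            return True
    return False
```

The subdomain check matters, because the spammers simply move their links to `cdn.` or `mirror.` hosts the moment you block the bare domain; even so, as I said, they register new domains faster than you can add them.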
[Edit] BeefWellington brings up a good point. I should add that I am referring to public instances of forums and IRC servers that anyone may join. Private servers are at much lower risk, assuming the trusted members set strong passwords, static content is not accessible at all without an account, and Mastodon servers are not linked to less trusted or non-private instances.
> I suspect some of the Mastodon admins will learn this lesson with time.
I doubt it. A lot of the new instances are invitation-only, and the point of federation is that I can just run my own instance and seek out the content I desire. I don't have to let anyone else onto my instance.
I can see that working. Private instances that only invite truly trustworthy people are probably much lower risk; if the only remaining risk is account takeover, and static files are not accessible by bots, then the bar is set much higher.
I should clarify that I was referring to forums and IRC servers that anyone could join. The Mastodon model in this case would be public instances that are not strictly private and are linked to other instances. Private instances would be much safer. The risk of linked instances would map to the weakest link.
I completely understand and agree with their incentives. Those running public instances will be playing a lottery, where losing means failing to clean up the bot-driven bad content fast enough. I encourage anyone taking on this challenge to first and foremost recruit trustworthy, non-toxic, non-power-tripping moderators around the world for "follow the sun" management of the instance.
Some of those will exist no matter what (Twitter accounts are unlimited and still get sold), but invite-only doesn't have to mean scarce: there's no reason to share an account when you can just invite the person instead.
I've managed a closed Fb group for ten years now. For new members we have a voting system in place. Invites wouldn't work, because if members could invite whomever they chose, sooner or later we'd end up with members that not everybody is comfortable with. A friend of my friend is not necessarily my friend. It creates friction and leads to all sorts of interpersonal problems, and soon enough it's not a peaceful community anymore; people start to block each other or lash out in comments just because their personalities or beliefs clash. I don't believe in invites anymore.
Google Wave and Google+ also had invite systems, and they didn't work out well. Gmail is the exception to the rule, I'd say.
That's ONE point of federation, but the other is cross-instance discovery and communication. If that part goes unused (blocked, disabled), there's very little point in using Mastodon. You can recreate that setup anywhere, including on Twitter, Reddit or Discord.